Sample records for probabilistic causal analysis

  1. A Tutorial in Bayesian Potential Outcomes Mediation Analysis.

    PubMed

    Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P

    2018-01-01

    Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.
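The core computation in such an analysis, drawing from the posterior of each regression path and multiplying the draws to get a posterior over the indirect effect, can be sketched in Python. This is a minimal illustration with simulated data and flat-prior normal approximations to the posteriors; the variable names and numbers are invented, not the tutorial's own procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated example data for X -> M -> Y (all names and effects illustrative).
n = 500
x = rng.binomial(1, 0.5, n).astype(float)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)

def posterior_slope_draws(design, target, coef_index, n_draws=10_000):
    """Approximate posterior draws of one regression coefficient under a
    flat prior: Normal(OLS estimate, OLS standard error)."""
    coefs, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ coefs
    sigma2 = resid @ resid / (len(target) - design.shape[1])
    cov = sigma2 * np.linalg.inv(design.T @ design)
    return rng.normal(coefs[coef_index], np.sqrt(cov[coef_index, coef_index]), n_draws)

ones = np.ones(n)
a = posterior_slope_draws(np.column_stack([ones, x]), m, coef_index=1)     # X -> M path
b = posterior_slope_draws(np.column_stack([ones, x, m]), y, coef_index=2)  # M -> Y path
indirect = a * b  # posterior draws of the mediated (indirect) effect

lo, hi = np.percentile(indirect, [2.5, 97.5])
print(f"P(indirect effect > 0) = {np.mean(indirect > 0):.3f}, 95% interval = [{lo:.3f}, {hi:.3f}]")
```

The product of the two path posteriors gives the indirect effect a directly probabilistic reading, e.g. "the probability that the indirect effect is positive", which frequentist p-values do not provide.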

  2. Foundational perspectives on causality in large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. 
Typically, these techniques quantify the statistical likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain.

  3. Foundational perspectives on causality in large-scale brain networks.

    PubMed

    Mannino, Michael; Bressler, Steven L

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. 
Typically, these techniques quantify the statistical likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’

    PubMed Central

    2017-01-01

    ABSTRACT Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. Contents: 1 Introduction; 2 Preemption; 3 Structural Equation Models; 4 The Halpern and Pearl Definition of ‘Actual Cause’; 5 Preemption Again; 6 The Probabilistic Case; 7 Probabilistic Causal Models; 8 A Proposed Probabilistic Extension of Halpern and Pearl’s Definition; 9 Twardy and Korb’s Account; 10 Probabilistic Fizzling; 11 Conclusion. PMID:29593362
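The structural-equation reasoning behind an 'actual cause' analysis can be illustrated with the classic preemption example. The sketch below applies a simplified but-for test under a fixed contingency, not Halpern and Pearl's full definition; all names are illustrative:

```python
def shatters(suzy_throws, billy_throws, do=None):
    """Tiny structural model of the classic rock-throwing preemption story.
    `do` forces (intervenes on) intermediate variables, as in an intervention."""
    do = do or {}
    suzy_hits = do.get("suzy_hits", suzy_throws)
    billy_hits = do.get("billy_hits", billy_throws and not suzy_hits)
    return suzy_hits or billy_hits

# Actual world: both throw, the bottle shatters.
assert shatters(True, True)

# A plain but-for test misses Suzy's causal role: without her throw,
# Billy's rock shatters the bottle anyway.
print(shatters(False, True))  # True

# Holding the contingency "Billy's rock doesn't hit" fixed, Suzy's throw
# makes the difference -- the hallmark of an actual cause under preemption.
print(shatters(True, True, do={"billy_hits": False}),
      shatters(False, True, do={"billy_hits": False}))  # True False
```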

  5. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    ERIC Educational Resources Information Center

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  6. Temporal Causality Analysis of Sentiment Change in a Cancer Survivor Network.

    PubMed

    Bui, Ngot; Yen, John; Honavar, Vasant

    2016-06-01

    Online health communities constitute a useful source of information and social support for patients. American Cancer Society's Cancer Survivor Network (CSN), a 173,000-member community, is the largest online network for cancer patients, survivors, and caregivers. A discussion thread in CSN is often initiated by a cancer survivor seeking support from other members of CSN. Discussion threads are multi-party conversations that often provide a source of social support, e.g., by bringing about a change of sentiment from negative to positive on the part of the thread originator. While previous studies regarding cancer survivors have shown that members of an online health community derive benefits from their participation in such communities, causal accounts of the factors that contribute to the observed benefits have been lacking. We introduce a novel framework to examine the temporal causality of sentiment dynamics in the CSN. We construct a Probabilistic Computation Tree Logic representation and a corresponding probabilistic Kripke structure to represent and reason about the changes in sentiments of posts in a thread over time. We use a sentiment classifier trained using machine learning on a set of posts manually tagged with sentiment labels to classify posts as expressing either positive or negative sentiment. We analyze the probabilistic Kripke structure to identify the prima facie causes of sentiment change on the part of the thread originators in the CSN forum and their significance. We find that the sentiment of replies appears to causally influence the sentiment of the thread originator. Our experiments also show that the conclusions are robust with respect to (i) the choice of the classification threshold of the sentiment classifier and (ii) the choice of the specific sentiment classifier used.
We also extend the basic framework for temporal causality analysis to incorporate the uncertainty in the states of the probabilistic Kripke structure resulting from the use of an imperfect state transducer (in our case, the sentiment classifier). Our analysis of temporal causality of CSN sentiment dynamics offers new insights that the designers, managers and moderators of an online community such as CSN can utilize to facilitate and enhance the interactions so as to better meet the social support needs of the CSN participants. The proposed methodology for analysis of temporal causality has broad applicability in a variety of settings where the dynamics of the underlying system can be modeled in terms of state variables that change in response to internal or external inputs.
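The prima facie cause test underlying such an analysis, under which a cause must raise the probability of its later effect, can be sketched on toy thread data. The data and encoding below are invented for illustration and do not reproduce the paper's PCTL machinery:

```python
# Each thread is the time-ordered sequence of post sentiments (toy data):
# 'O-' / 'O+' = originator negative/positive post, 'R+' = positive reply.
threads = [
    ["O-", "R+", "O+"], ["O-", "R+", "R+", "O+"], ["O-", "O-"],
    ["O-", "R+", "O-"], ["O-", "O+"], ["O-", "R+", "O+"],
]

def ends_positive(thread):
    return thread[-1] == "O+"

def prob(pred, population):
    population = list(population)
    return sum(map(pred, population)) / len(population)

# Prima facie test: does a positive reply (c) raise the probability that the
# originator's final post is positive (e)?
p_e = prob(ends_positive, threads)
p_e_given_c = prob(ends_positive, (t for t in threads if "R+" in t))
print(f"P(e) = {p_e:.2f}, P(e | c) = {p_e_given_c:.2f}")
assert p_e_given_c > p_e  # c is a prima facie cause of e in this toy data
```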

  7. Temporal Causality Analysis of Sentiment Change in a Cancer Survivor Network

    PubMed Central

    Bui, Ngot; Yen, John; Honavar, Vasant

    2017-01-01

    Online health communities constitute a useful source of information and social support for patients. American Cancer Society’s Cancer Survivor Network (CSN), a 173,000-member community, is the largest online network for cancer patients, survivors, and caregivers. A discussion thread in CSN is often initiated by a cancer survivor seeking support from other members of CSN. Discussion threads are multi-party conversations that often provide a source of social support, e.g., by bringing about a change of sentiment from negative to positive on the part of the thread originator. While previous studies regarding cancer survivors have shown that members of an online health community derive benefits from their participation in such communities, causal accounts of the factors that contribute to the observed benefits have been lacking. We introduce a novel framework to examine the temporal causality of sentiment dynamics in the CSN. We construct a Probabilistic Computation Tree Logic representation and a corresponding probabilistic Kripke structure to represent and reason about the changes in sentiments of posts in a thread over time. We use a sentiment classifier trained using machine learning on a set of posts manually tagged with sentiment labels to classify posts as expressing either positive or negative sentiment. We analyze the probabilistic Kripke structure to identify the prima facie causes of sentiment change on the part of the thread originators in the CSN forum and their significance. We find that the sentiment of replies appears to causally influence the sentiment of the thread originator. Our experiments also show that the conclusions are robust with respect to (i) the choice of the classification threshold of the sentiment classifier and (ii) the choice of the specific sentiment classifier used.
We also extend the basic framework for temporal causality analysis to incorporate the uncertainty in the states of the probabilistic Kripke structure resulting from the use of an imperfect state transducer (in our case, the sentiment classifier). Our analysis of temporal causality of CSN sentiment dynamics offers new insights that the designers, managers and moderators of an online community such as CSN can utilize to facilitate and enhance the interactions so as to better meet the social support needs of the CSN participants. The proposed methodology for analysis of temporal causality has broad applicability in a variety of settings where the dynamics of the underlying system can be modeled in terms of state variables that change in response to internal or external inputs. PMID:29399599

  8. A Historical Overview and Contemporary Expansion of Psychological Theories of Determinism, Probabilistic Causality, Indeterminate Free Will, and Moral and Legal Responsibility

    ERIC Educational Resources Information Center

    Wilks, Duffy; Ratheal, Juli D'Ann

    2009-01-01

    The authors provide a historical overview of the development of contemporary theories of counseling and psychology in relation to determinism, probabilistic causality, indeterminate free will, and moral and legal responsibility. They propose a unique model of behavioral causality that incorporates a theory of indeterminate free will, a concept…

  9. Bayesian networks improve causal environmental ...

    EPA Pesticide Factsheets

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value

  10. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.
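The probabilistic combination of multiple lines of evidence described here can be sketched with a minimal two-evidence Bayesian network evaluated by enumeration. The variable names and conditional probabilities below are illustrative assumptions, not values from the assessment:

```python
# Hypothetical weight-of-evidence network: a stressor S causally influences
# two observable lines of evidence, E1 (lab toxicity) and E2 (field survey).
p_s = 0.30                        # prior P(stressor present)
p_e1 = {True: 0.80, False: 0.10}  # P(E1 positive | S)
p_e2 = {True: 0.70, False: 0.20}  # P(E2 positive | S)

def posterior_stressor(e1, e2):
    """P(S | E1=e1, E2=e2) by enumeration; E1 and E2 independent given S."""
    def joint(s):
        prior = p_s if s else 1 - p_s
        like1 = p_e1[s] if e1 else 1 - p_e1[s]
        like2 = p_e2[s] if e2 else 1 - p_e2[s]
        return prior * like1 * like2
    return joint(True) / (joint(True) + joint(False))

# Two concordant positive lines of evidence sharpen the inference;
# conflicting evidence leaves more residual uncertainty.
print(f"{posterior_stressor(True, True):.3f}")   # both lines positive
print(f"{posterior_stressor(True, False):.3f}")  # conflicting evidence
```

Unlike a rule-based tally, the posterior carries the uncertainty of each line of evidence through to the final causal judgment.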

  11. A Study of Students' Reasoning about Probabilistic Causality: Implications for Understanding Complex Systems and for Instructional Design

    ERIC Educational Resources Information Center

    Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell

    2017-01-01

    Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships and complex interactions can result in non-deterministic relationships between causes and effects that are best…

  12. What is the nature of causality in the brain? - Inherently probabilistic. Comment on "Foundational perspectives on causality in large-scale brain networks" by M. Mannino and S.L. Bressler

    NASA Astrophysics Data System (ADS)

    Dhamala, Mukesh

    2015-12-01

    Understanding cause-and-effect (causal) relations from observations concerns all sciences including neuroscience. Appropriately defining causality and its nature, though, has been a topic of active discussion for philosophers and scientists for centuries. Although brain research, particularly functional neuroimaging research, is now moving rapidly beyond identification of brain regional activations towards uncovering causal relations between regions, the nature of causality has not yet been thoroughly described and resolved. In the current review article [1], Mannino and Bressler take us on a beautiful journey into the history of the work on causality and make a well-reasoned argument that causality in the brain is inherently probabilistic. This notion is consistent with brain anatomy and functions, and is also inclusive of deterministic cases of inputs leading to outputs in the brain.

  13. Learning about causes from people and about people as causes: probabilistic models and social causal reasoning.

    PubMed

    Buchsbaum, Daphna; Seiver, Elizabeth; Bridgers, Sophie; Gopnik, Alison

    2012-01-01

    A major challenge children face is uncovering the causal structure of the world around them. Previous research on children's causal inference has demonstrated their ability to learn about causal relationships in the physical environment using probabilistic evidence. However, children must also learn about causal relationships in the social environment, including discovering the causes of other people's behavior, and understanding the causal relationships between others' goal-directed actions and the outcomes of those actions. In this chapter, we argue that social reasoning and causal reasoning are deeply linked, both in the real world and in children's minds. Children use both types of information together and in fact reason about both physical and social causation in fundamentally similar ways. We suggest that children jointly construct and update causal theories about their social and physical environment and that this process is best captured by probabilistic models of cognition. We first present studies showing that adults are able to jointly infer causal structure and human action structure from videos of unsegmented human motion. Next, we describe how children use social information to make inferences about physical causes. We show that the pedagogical nature of a demonstrator influences children's choices of which actions to imitate from within a causal sequence and that this social information interacts with statistical causal evidence. We then discuss how children combine evidence from an informant's testimony and expressed confidence with evidence from their own causal observations to infer the efficacy of different potential causes. We also discuss how children use these same causal observations to make inferences about the knowledge state of the social informant. Finally, we suggest that psychological causation and attribution are part of the same causal system as physical causation. 
We present evidence that just as children use covariation between physical causes and their effects to learn physical causal relationships, they also use covariation between people's actions and the environment to make inferences about the causes of human behavior.

  14. Causality

    NASA Astrophysics Data System (ADS)

    Pearl, Judea

    2000-03-01

    Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.

  15. Which causal structures might support a quantum-classical gap?

    NASA Astrophysics Data System (ADS)

    Pienaar, Jacques

    2017-04-01

    A causal scenario is a graph that describes the cause and effect relationships between all relevant variables in an experiment. A scenario is deemed ‘not interesting’ if there is no device-independent way to distinguish the predictions of classical physics from any generalised probabilistic theory (including quantum mechanics). Conversely, an interesting scenario is one in which there exists a gap between the predictions of different operational probabilistic theories, as occurs for example in Bell-type experiments. Henson, Lal and Pusey (HLP) recently proposed a sufficient condition for a causal scenario to not be interesting. In this paper we supplement their analysis with some new techniques and results. We first show that existing graphical techniques due to Evans can be used to confirm by inspection that many graphs are interesting without having to explicitly search for inequality violations. For three exceptional cases, the graphs numbered #15, 16, and 20 in HLP, we show that there exist non-Shannon-type entropic inequalities that imply these graphs are interesting. In doing so, we find that existing methods of entropic inequalities can be greatly enhanced by conditioning on the specific values of certain variables.

  16. Learning to make things happen: Infants' observational learning of social and physical causal events.

    PubMed

    Waismeyer, Anna; Meltzoff, Andrew N

    2017-10-01

    Infants learn about cause and effect through hands-on experience; however, they also can learn about causality simply from observation. Such observational causal learning is a central mechanism by which infants learn from and about other people. Across three experiments, we tested infants' observational causal learning of both social and physical causal events. Experiment 1 assessed infants' learning of a physical event in the absence of visible spatial contact between the causes and effects. Experiment 2 developed a novel paradigm to assess whether infants could learn about a social causal event from third-party observation of a social interaction between two people. Experiment 3 compared learning of physical and social events when the outcomes occurred probabilistically (happening some, but not all, of the time). Infants demonstrated significant learning in all three experiments, although learning about probabilistic cause-effect relations was most difficult. These findings about infant observational causal learning have implications for children's rapid nonverbal learning about people, things, and their causal relations. Copyright © 2017 Elsevier Inc. All rights reserved.
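The probabilistic cause-effect contingencies used in such experiments are commonly summarized by the Delta-P statistic, P(effect | cause) − P(effect | no cause). A toy sketch with invented observation data:

```python
# Toy observation log: (cause_present, effect_occurred) pairs, as a learner
# might observe a device that works only some of the time (made-up data).
trials = [(True, True), (True, True), (True, False), (True, True),
          (False, False), (False, True), (False, False), (False, False)]

def delta_p(trials):
    """Contingency Delta-P = P(effect | cause) - P(effect | no cause)."""
    with_cause = [effect for cause, effect in trials if cause]
    without_cause = [effect for cause, effect in trials if not cause]
    return sum(with_cause) / len(with_cause) - sum(without_cause) / len(without_cause)

print(delta_p(trials))  # positive: the probabilistic cause raises P(effect)
```

A Delta-P well above zero signals a probabilistic causal relation even though the effect does not follow the cause on every trial, which is exactly the harder learning condition in Experiment 3.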

  17. Bayesian networks improve causal environmental assessments for evidence-based policy

    EPA Science Inventory

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the p...

  18. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  19. Application of dynamic uncertain causality graph in spacecraft fault diagnosis: Logic cycle

    NASA Astrophysics Data System (ADS)

    Yao, Quanying; Zhang, Qin; Liu, Peng; Yang, Ping; Zhu, Ma; Wang, Xiaochen

    2017-04-01

    Intelligent diagnosis systems are applied to fault diagnosis in spacecraft. The Dynamic Uncertain Causality Graph (DUCG) is a new probabilistic graphical model with many advantages. In the knowledge representation of spacecraft fault diagnosis, feedback among variables is frequently encountered, which may produce directed cyclic graphs (DCGs). Probabilistic graphical models (PGMs) such as Bayesian networks (BNs) have been widely applied to uncertain causality representation and probabilistic reasoning, but BNs do not allow DCGs. In this paper, DUCG is applied to fault diagnosis in spacecraft, introducing an inference algorithm for the DUCG that handles feedback. DUCG has now been tested on 16 typical faults with 100% diagnosis accuracy.

  20. Knowledge Representation Standards and Interchange Formats for Causal Graphs

    NASA Technical Reports Server (NTRS)

    Throop, David R.; Malin, Jane T.; Fleming, Land

    2005-01-01

    In many domains, automated reasoning tools must represent graphs of causally linked events. These include fault-tree analysis, probabilistic risk assessment (PRA), planning, procedures, medical reasoning about disease progression, and functional architectures. Each of these fields has its own requirements for the representation of causation, events, actors and conditions. The representations include ontologies of function and cause, data dictionaries for causal dependency, failure and hazard, and interchange formats between some existing tools. In none of the domains has a generally accepted interchange format emerged. The paper makes progress towards interoperability across the wide range of causal analysis methodologies. We survey existing practice and emerging interchange formats in each of these fields. Setting forth a set of terms and concepts that are broadly shared across the domains, we examine the several ways in which current practice represents them. Some phenomena are difficult to represent or to analyze in several domains. These include mode transitions, reachability analysis, positive and negative feedback loops, conditions correlated but not causally linked and bimodal probability distributions. We work through examples and contrast the differing methods for addressing them. We detail recent work in knowledge interchange formats for causal trees in aerospace analysis applications in early design, safety and reliability. Several examples are discussed, with a particular focus on reachability analysis and mode transitions. We generalize the aerospace analysis work across the several other domains. We also recommend features and capabilities for the next generation of causal knowledge representation standards.
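Reachability analysis over a causal-event graph, one of the capabilities discussed above, reduces to graph traversal from a given event. A minimal sketch over a hypothetical fault-propagation graph (event names invented for illustration):

```python
from collections import deque

# Hypothetical causal-event graph for a fault-propagation analysis:
# edges point from cause to effect.
edges = {
    "pump_fails": ["coolant_loss"],
    "coolant_loss": ["overheat"],
    "overheat": ["shutdown", "sensor_alarm"],
    "sensor_alarm": [],
    "shutdown": [],
}

def reachable_effects(graph, event):
    """Breadth-first traversal: every effect causally downstream of `event`."""
    seen, queue = set(), deque([event])
    while queue:
        node = queue.popleft()
        for effect in graph.get(node, []):
            if effect not in seen:
                seen.add(effect)
                queue.append(effect)
    return seen

print(sorted(reachable_effects(edges, "pump_fails")))
# ['coolant_loss', 'overheat', 'sensor_alarm', 'shutdown']
```

An interchange format for causal graphs needs, at minimum, to preserve this cause-to-effect edge structure so that reachability results agree across tools; mode transitions and feedback loops add further representational requirements on top of it.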

  1. Right external globus pallidus changes are associated with altered causal awareness in youth with depression

    PubMed Central

    Griffiths, K R; Lagopoulos, J; Hermens, D F; Hickie, I B; Balleine, B W

    2015-01-01

    Cognitive impairment is a functionally disabling feature of depression contributing to maladaptive decision-making, a loss of behavioral control and an increased disease burden. The ability to calculate the causal efficacy of one's actions in achieving specific goals is critical to normal decision-making and, in this study, we combined voxel-based morphometry (VBM), shape analysis and diffusion tensor tractography to investigate the relationship between cortical–basal ganglia structural integrity and such causal awareness in 43 young subjects with depression and 21 demographically similar healthy controls. Volumetric analysis determined a relationship between right pallidal size and sensitivity to the causal status of specific actions. More specifically, shape analysis identified dorsolateral surface vertices where an inward location was correlated with reduced levels of causal awareness. Probabilistic tractography revealed that affected parts of the pallidum were primarily connected with the striatum, dorsal thalamus and hippocampus. VBM did not reveal any whole-brain gray matter regions that correlated with causal awareness. We conclude that volumetric reduction within the indirect pathway involving the right dorsolateral pallidum is associated with reduced awareness of the causal efficacy of goal-directed actions in young depressed individuals. This causal awareness task allows for the identification of a functionally and biologically relevant subgroup to which more targeted cognitive interventions could be applied, potentially enhancing the long-term outcomes for these individuals. PMID:26440541

  2. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation and control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled “Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment”. The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
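The core difficulty the report names, developing Bayesian Networks from sparse data, is commonly handled by placing a Dirichlet prior on each conditional probability table so that rarely observed parent configurations still yield well-defined probabilities. The sketch below shows that generic technique for a single two-state node; it is not the report's actual HRA network, and all variable names and counts are illustrative.

```python
# Sketch: estimating one conditional probability table (CPT) of a
# Bayesian network from sparse records, using a symmetric Dirichlet
# prior (Laplace smoothing with pseudo-count alpha).
from collections import Counter

def estimate_cpt(records, child_states, alpha=1.0):
    """P(child | parent) from (parent_value, child_value) records."""
    counts = Counter(records)
    parent_values = {p for p, _ in records}
    cpt = {}
    for pv in parent_values:
        total = sum(counts[(pv, cs)] for cs in child_states)
        cpt[pv] = {
            cs: (counts[(pv, cs)] + alpha) / (total + alpha * len(child_states))
            for cs in child_states
        }
    return cpt

# Ten sparse observations of (operator_stress, error_occurred).
data = [("high", True), ("high", True), ("high", False),
        ("low", False), ("low", False), ("low", False),
        ("low", True), ("low", False), ("high", True), ("low", False)]
cpt = estimate_cpt(data, [True, False])
```

With only four "high" observations, the smoothed estimate P(error | high) = (3+1)/(4+2) ≈ 0.67 is pulled toward uniform, which is exactly the behavior wanted when data are limited.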

  3. Inductive Reasoning about Causally Transmitted Properties

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D.; Tenenbaum, Joshua B.

    2008-01-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates'…

  4. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  5. A breakthrough in neuroscience needs a "Nebulous Cartesian System" Oscillations, quantum dynamics and chaos in the brain and vegetative system.

    PubMed

    Başar, Erol; Güntekin, Bahar

    2007-04-01

The Cartesian System is a fundamental conceptual and analytical framework related to and interwoven with the concept and applications of Newtonian Dynamics. In order to analyze quantum processes, physicists moved to a Probabilistic Cartesian System in which the causality principle became a probabilistic one. This means the trajectories of particles (obeying quantum rules) can be described only with the concept of cloudy wave packets. The approach to the brain-body-mind problem requires more than the prerequisites of modern physics and quantum dynamics. In the analysis of the brain-body-mind construct we have to include uncertain causalities, and consequently multiple uncertain causalities. These multiple causalities originate from (1) nonlinear properties of the vegetative system (e.g. irregularities in biochemical transmitters, cardiac output, turbulences in the vascular system, respiratory apnea, nonlinear oscillatory interactions in peristalsis); (2) nonlinear behavior of the neuronal electricity (e.g. chaotic behavior measured by EEG); (3) genetic modulations; and (4), in addition to these physiological entities, nonlinear properties of physical processes in the body. The brain shows deterministic chaos with a correlation dimension of approximately D(2)=6, the smooth muscles of approximately D(2)=3. According to these facts, we propose a hyper-probabilistic approach, or a hyper-probabilistic Cartesian System, to describe and analyze the processes in the brain-body-mind system. If we add such aspects as our sentiments, emotions and creativity to this already hyper-probabilistic construct, this "New Cartesian System" is more than hyper-probabilistic: it is a nebulous system, and we can predict the future only in a nebulous way; despite this chain of reasoning, however, we can still provide predictions on brain-body-mind incorporations. 
We tentatively assume that the processes or mechanisms of the brain-body-mind system can be analyzed and predicted by analogy with the metaphor of "finding the walking path on a cloudy or foggy day". This is what is meant by "The Nebulous Cartesian System" (NCS). Descartes, in taking his ingenious step, could not have possessed the knowledge of today's physiology and modern physics; we think the time has come to consider such a New Cartesian System. To this end, we propose the utilization of the Heisenberg S-Matrix and a modified version of the Feynman Diagrams, which we call "Brain Feynman Diagrams". Another metaphor to consider within the oscillatory approach of the NCS is string theory. We also emphasize that fundamental steps should be undertaken to create a dynamical framework proper to the brain-body-mind incorporation; suggestions or metaphors from physics and mathematics are useful, but the grammar of the brain's intrinsic language must be understood with the help of a new, biologically founded, adaptive-probabilistic Cartesian system. This new Cartesian System will undergo mutations and transcend to the philosophy of Henri Bergson, in parallel with the evolution theory of Charles Darwin, to open gateways for approaching the brain-body-mind problem.

  6. Computational Everyday Life Human Behavior Model as Serviceable Knowledge

    NASA Astrophysics Data System (ADS)

    Motomura, Yoichi; Nishida, Yoshifumi

    A project called `Open life matrix' is not only a research activity but also real problem solving in the form of action research. This concept is realized through large-scale data collection, probabilistic causal structure model construction, and the provision of information services using the model. One concrete outcome of this project is a childhood injury prevention activity carried out by a new team consisting of a hospital, government, and researchers from many fields. The main result of the project is a general methodology for applying probabilistic causal structure models as serviceable knowledge for action research. In this paper, we summarize the project and discuss future directions that emphasize action research driven by artificial intelligence technology.

  7. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    PubMed Central

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  8. The Role of Probability and Intentionality in Preschoolers' Causal Generalizations

    ERIC Educational Resources Information Center

    Sobel, David M.; Sommerville, Jessica A.; Travers, Lea V.; Blumenthal, Emily J.; Stoddard, Emily

    2009-01-01

    Three experiments examined whether preschoolers recognize that the causal properties of objects generalize to new members of the same set given either deterministic or probabilistic data. Experiment 1 found that 3- and 4-year-olds were able to make such a generalization given deterministic data but were at chance when they observed probabilistic…

  9. Missing Data as a Causal and Probabilistic Problem

    DTIC Science & Technology

    2015-07-01

    for causal effects identification [18] seems promising. That is, use MID as a guide for constructing a “zoo” of structures where recoverability does not seem to be possible, and then construct a general method for showing non-recoverability for this “zoo.” Some results on non-recoverability do

  10. Conceptual Tools for Understanding Nature - Proceedings of the 3rd International Symposium

    NASA Astrophysics Data System (ADS)

    Costa, G.; Calucci, M.

    1997-04-01

    The Table of Contents for the full book PDF is as follows: * Foreword * Some Limits of Science and Scientists * Three Limits of Scientific Knowledge * On Features and Meaning of Scientific Knowledge * How Science Approaches the World: Risky Truths versus Misleading Certitudes * On Discovery and Justification * Thought Experiments: A Philosophical Analysis * Causality: Epistemological Questions and Cognitive Answers * Scientific Inquiry via Rational Hypothesis Revision * Probabilistic Epistemology * The Transferable Belief Model for Uncertainty Representation * Chemistry and Complexity * The Difficult Epistemology of Medicine * Epidemiology, Causality and Medical Anthropology * Conceptual Tools for Transdisciplinary Unified Theory * Evolution and Learning in Economic Organizations * The Possible Role of Symmetry in Physics and Cosmology * Observational Cosmology and/or other Imaginable Models of the Universe

  11. Dynamic Uncertain Causality Graph for Knowledge Representation and Probabilistic Reasoning: Directed Cyclic Graph and Joint Probability Distribution.

    PubMed

    Zhang, Qin

    2015-07-01

    Probabilistic graphical models (PGMs) such as Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning. Dynamic uncertain causality graph (DUCG) is a newly presented model of PGMs, which can be applied to fault diagnosis of large and complex industrial systems, disease diagnosis, and so on. The basic methodology of DUCG has been previously presented, in which only the directed acyclic graph (DAG) was addressed. However, the mathematical meaning of DUCG was not discussed. In this paper, the DUCG with directed cyclic graphs (DCGs) is addressed. In contrast, BN does not allow DCGs, as otherwise the conditional independence will not be satisfied. The inference algorithm for the DUCG with DCGs is presented, which not only extends the capabilities of DUCG from DAGs to DCGs but also enables users to decompose a large and complex DUCG into a set of small, simple sub-DUCGs, so that a large and complex knowledge base can be easily constructed, understood, and maintained. The basic mathematical definition of a complete DUCG with or without DCGs is proved to be a joint probability distribution (JPD) over a set of random variables. The incomplete DUCG as a part of a complete DUCG may represent a part of JPD. Examples are provided to illustrate the methodology.

  12. Systems and methods for modeling and analyzing networks

    DOEpatents

    Hill, Colin C; Church, Bruce W; McDonagh, Paul D; Khalil, Iya G; Neyarapally, Thomas A; Pitluk, Zachary W

    2013-10-29

    The systems and methods described herein utilize a probabilistic modeling framework for reverse engineering an ensemble of causal models from data and then forward simulating the ensemble of models to analyze and predict the behavior of the network. In certain embodiments, the systems and methods described herein include data-driven techniques for developing causal models for biological networks. Causal network models include computational representations of the causal relationships between independent variables such as a compound of interest and dependent variables such as measured DNA alterations, changes in mRNA, protein, and metabolites to phenotypic readouts of efficacy and toxicity.

  13. A self-agency bias in preschoolers' causal inferences

    PubMed Central

    Kushnir, Tamar; Wellman, Henry M.; Gelman, Susan A.

    2013-01-01

    Preschoolers' causal learning from intentional actions – causal interventions – is subject to a self-agency bias. We propose that this bias is evidence-based; it is responsive to causal uncertainty. In the current studies, two causes (one child-controlled, one experimenter-controlled) were associated with one or two effects, first independently, then simultaneously. When initial independent effects were probabilistic, and thus subsequent simultaneous actions were causally ambiguous, children showed a self-agency bias. Children showed no bias when initial effects were deterministic. Further controls establish that children's self-agency bias is not a wholesale preference but rather is influenced by uncertainty in causal evidence. These results demonstrate that children's own experience of action influences their causal learning, and suggest possible benefits in uncertain and ambiguous everyday learning contexts. PMID:19271843

  14. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. 
A practicing analyst can explore the literature and find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
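Of the tools this thesis compares, Granger causality is the easiest to sketch: x is said to Granger-cause y if x's past reduces the residual variance of an autoregressive model for y. The code below implements that textbook construction (not the thesis's PAI or leaning measures) on synthetic data where x drives y by design; it assumes numpy is available.

```python
# Bivariate Granger causality via lagged least squares: compare the
# residual variance of y regressed on its own lags (restricted model)
# against y regressed on its own lags plus x's lags (full model).
import numpy as np

def granger_stat(x, y, lag=2):
    """Log ratio of restricted vs. full residual variances for predicting y."""
    rows_r, rows_f, target = [], [], []
    for t in range(lag, len(y)):
        y_lags = [y[t - k] for k in range(1, lag + 1)]
        x_lags = [x[t - k] for k in range(1, lag + 1)]
        rows_r.append(y_lags + [1.0])           # y's own past only
        rows_f.append(y_lags + x_lags + [1.0])  # plus x's past
        target.append(y[t])

    def resid_var(rows):
        a, b = np.array(rows), np.array(target)
        coef, *_ = np.linalg.lstsq(a, b, rcond=None)
        return np.var(b - a @ coef)

    return float(np.log(resid_var(rows_r) / resid_var(rows_f)))

# Synthetic system: y is driven by lagged x, so the statistic should be
# large for x -> y and near zero for y -> x.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()
```

Here `granger_stat(x, y)` comes out strongly positive while `granger_stat(y, x)` stays near zero, matching the intuitive driving direction; the thesis's point is that different tools can disagree on less cooperative systems.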

  15. Causation in epidemiology

    PubMed Central

    Parascandola, M; Weed, D

    2001-01-01

    Causation is an essential concept in epidemiology, yet there is no single, clearly articulated definition for the discipline. From a systematic review of the literature, five categories can be delineated: production, necessary and sufficient, sufficient-component, counterfactual, and probabilistic. Strengths and weaknesses of these categories are examined in terms of proposed characteristics of a useful scientific definition of causation: it must be specific enough to distinguish causation from mere correlation, but not so narrow as to eliminate apparent causal phenomena from consideration. Two categories—production and counterfactual—are present in any definition of causation but are not themselves sufficient as definitions. The necessary and sufficient cause definition assumes that all causes are deterministic. The sufficient-component cause definition attempts to explain probabilistic phenomena via unknown component causes. Thus, on both of these views, heavy smoking can be cited as a cause of lung cancer only when the existence of unknown deterministic variables is assumed. The probabilistic definition, however, avoids these assumptions and appears to best fit the characteristics of a useful definition of causation. It is also concluded that the probabilistic definition is consistent with scientific and public health goals of epidemiology. In debates in the literature over these goals, proponents of epidemiology as pure science tend to favour a narrower deterministic notion of causation, while proponents of epidemiology as public health tend to favour a probabilistic view. The authors argue that a single definition of causation for the discipline should be and is consistent with both of these aims. 
It is concluded that a counterfactually-based probabilistic definition is more amenable to the quantitative tools of epidemiology, is consistent with both deterministic and probabilistic phenomena, and serves equally well for the acquisition and the application of scientific knowledge.


Keywords: causality; counterfactual; philosophy PMID:11707485

  16. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    NASA Technical Reports Server (NTRS)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  17. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.

  18. Bayes and blickets: Effects of knowledge on causal induction in children and adults

    PubMed Central

    Griffiths, Thomas L.; Sobel, David M.; Tenenbaum, Joshua B.; Gopnik, Alison

    2011-01-01

    People are adept at inferring novel causal relations, even from only a few observations. Prior knowledge about the probability of encountering causal relations of various types and the nature of the mechanisms relating causes and effects plays a crucial role in these inferences. We test a formal account of how this knowledge can be used and acquired, based on analyzing causal induction as Bayesian inference. Five studies explored the predictions of this account with adults and 4-year-olds, using tasks in which participants learned about the causal properties of a set of objects. The studies varied the two factors that our Bayesian approach predicted should be relevant to causal induction: the prior probability with which causal relations exist, and the assumption of a deterministic or a probabilistic relation between cause and effect. Adults’ judgments (Experiments 1, 2, and 4) were in close correspondence with the quantitative predictions of the model, and children’s judgments (Experiments 3 and 5) agreed qualitatively with this account. PMID:21972897

  19. Whose statistical reasoning is facilitated by a causal structure intervention?

    PubMed

    McNair, Simon; Feeney, Aidan

    2015-02-01

    People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for peoples' errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
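The "classic statistical reasoning problem" used in this line of work is a base-rate problem of the mammogram type, where the posterior follows from a single application of Bayes' rule. The numbers below are illustrative, not the ones used in the cited experiments.

```python
# Worked base-rate example: prior P(disease) = 0.01, hit rate
# P(positive | disease) = 0.8, false-positive rate
# P(positive | no disease) = 0.096. Bayes' rule gives the posterior
# probability of disease after a positive test.

def posterior(prior, p_pos_given_d, p_pos_given_not_d):
    evidence = prior * p_pos_given_d + (1 - prior) * p_pos_given_not_d
    return prior * p_pos_given_d / evidence

p = posterior(0.01, 0.8, 0.096)
```

The posterior is only about 0.08: most positives come from the large healthy group. People's characteristic error is to answer something close to the 0.8 hit rate instead, and the causal-structure manipulation tests whether explaining *why* false positives arise repairs that neglect of the base rate.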

  20. Probability misjudgment, cognitive ability, and belief in the paranormal.

    PubMed

    Musch, Jochen; Ehrenberg, Katja

    2002-05-01

    According to the probability misjudgment account of paranormal belief (Blackmore & Troscianko, 1985), believers in the paranormal tend to wrongly attribute remarkable coincidences to paranormal causes rather than chance. Previous studies have shown that belief in the paranormal is indeed positively related to error rates in probabilistic reasoning. General cognitive ability could account for a relationship between these two variables without assuming a causal role of probabilistic reasoning in the forming of paranormal beliefs, however. To test this alternative explanation, a belief in the paranormal scale (BPS) and a battery of probabilistic reasoning tasks were administered to 123 university students. Confirming previous findings, a significant correlation between BPS scores and error rates in probabilistic reasoning was observed. This relationship disappeared, however, when cognitive ability as measured by final examination grades was controlled for. Lower cognitive ability correlated substantially with belief in the paranormal. This finding suggests that differences in general cognitive performance rather than specific probabilistic reasoning skills provide the basis for paranormal beliefs.

  1. Beyond a series of security nets: Applying STAMP & STPA to port security

    DOE PAGES

    Williams, Adam D.

    2015-11-17

    Port security is an increasing concern considering the significant role of ports in global commerce and today’s increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- ‘a series of security nets’ based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The ‘System-Theoretic Accident Model and Process (STAMP)’ is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP’s broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.

  2. Beyond a series of security nets: Applying STAMP & STPA to port security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Adam D.

    Port security is an increasing concern considering the significant role of ports in global commerce and today’s increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- ‘a series of security nets’ based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The ‘System-Theoretic Accident Model and Process (STAMP)’ is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP’s broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.

  3. Causal learning and inference as a rational process: the new synthesis.

    PubMed

    Holyoak, Keith J; Cheng, Patricia W

    2011-01-01

    Over the past decade, an active line of research within the field of human causal learning and inference has converged on a general representational framework: causal models integrated with Bayesian probabilistic inference. We describe this new synthesis, which views causal learning and inference as a fundamentally rational process, and review a sample of the empirical findings that support the causal framework over associative alternatives. Causal events, like all events in the distal world as opposed to our proximal perceptual input, are inherently unobservable. A central assumption of the causal approach is that humans (and potentially nonhuman animals) have been designed in such a way as to infer the most invariant causal relations for achieving their goals based on observed events. In contrast, the associative approach assumes that learners only acquire associations among important observed events, omitting the representation of the distal relations. By incorporating Bayesian inference over distributions of causal strength and causal structures, along with noisy-logical (i.e., causal) functions for integrating the influences of multiple causes on a single effect, human judgments about causal strength and structure can be predicted accurately for relatively simple causal structures. Dynamic models of learning based on the causal framework can explain patterns of acquisition observed with serial presentation of contingency data and are consistent with available neuroimaging data. The approach has been extended to a diverse range of inductive tasks, including category-based and analogical inferences.
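The "noisy-logical" function referenced above is, for a generative cause, the noisy-OR: the effect occurs if the background acts OR the candidate cause acts, so P(e|c) = b + q(1 - b), where b = P(e|¬c) and q is the cause's generative power. Solving for q gives Cheng's causal-power estimator, sketched below with illustrative contingency counts.

```python
# Cheng's generative causal power under a noisy-OR combination of a
# candidate cause c with an independent background cause.

def causal_power(p_e_given_c, p_e_given_not_c):
    """Generative power q of c, assuming noisy-OR with the background."""
    b = p_e_given_not_c
    return (p_e_given_c - b) / (1.0 - b)

# Contingency data: effect in 18/24 trials with c present, 6/24 without.
q = causal_power(18 / 24, 6 / 24)
```

Here q = 2/3, larger than the raw contrast P(e|c) - P(e|¬c) = 0.5, because the noisy-OR model credits c for effects it would have produced even when the background happened to act too.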

  4. Globally conditioned Granger causality in brain–brain and brain–heart interactions: a combined heart rate variability/ultra-high-field (7 T) functional magnetic resonance imaging study

    PubMed Central

    Passamonti, Luca; Wald, Lawrence L.; Barbieri, Riccardo

    2016-01-01

    The causal, directed interactions between brain regions at rest (brain–brain networks) and between resting-state brain activity and autonomic nervous system (ANS) outflow (brain–heart links) have not been completely elucidated. We collected 7 T resting-state functional magnetic resonance imaging (fMRI) data with simultaneous respiration and heartbeat recordings in nine healthy volunteers to investigate (i) the causal interactions between cortical and subcortical brain regions at rest and (ii) the causal interactions between resting-state brain activity and the ANS as quantified through a probabilistic, point-process-based heartbeat model which generates dynamical estimates for sympathetic and parasympathetic activity as well as sympathovagal balance. Given the high amount of information shared between brain-derived signals, we compared the results of traditional bivariate Granger causality (GC) with a globally conditioned approach which evaluated the additional influence of each brain region on the causal target while factoring out effects concomitantly mediated by other brain regions. The bivariate approach resulted in a large number of possibly spurious causal brain–brain links, while, using the globally conditioned approach, we demonstrated the existence of significant selective causal links between cortical/subcortical brain regions and sympathetic and parasympathetic modulation as well as sympathovagal balance. In particular, we demonstrated a causal role of the amygdala, hypothalamus, brainstem and, among others, medial, middle and superior frontal gyri, superior temporal pole, paracentral lobule and cerebellar regions in modulating the so-called central autonomic network (CAN). 
In summary, we show that, provided proper conditioning is employed to eliminate spurious causalities, ultra-high-field functional imaging coupled with physiological signal acquisition and GC analysis is able to quantify directed brain–brain and brain–heart interactions reflecting central modulation of ANS outflow. PMID:27044985
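
    The bivariate-versus-globally-conditioned distinction at the heart of this record can be illustrated with a minimal least-squares Granger causality sketch. The toy data, the one-lag coupling, and the log variance-ratio statistic below are illustrative assumptions, not the study's fMRI/heartbeat pipeline:

```python
import numpy as np

def var_residual(y, predictors, order):
    """Residual variance from regressing y[t] on `order` lags of every
    series in `predictors` (plus an intercept), via least squares."""
    n = len(y)
    X = np.column_stack(
        [p[order - k - 1 : n - k - 1] for p in predictors for k in range(order)]
        + [np.ones(n - order)])
    target = y[order:]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return ((target - X @ beta) ** 2).mean()

def granger(y, x, conditioning=(), order=2):
    """Log variance-ratio Granger statistic from x to y.  With a
    non-empty `conditioning` list this is the globally conditioned
    variant: x must improve the prediction of y beyond both y's own
    past and the past of every conditioning series."""
    base = [y, *conditioning]
    return np.log(var_residual(y, base, order) /
                  var_residual(y, base + [x], order))

# Hypothetical toy system: x drives y with a one-sample lag.
rng = np.random.default_rng(0)
n = 4000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * x[t - 1] + 0.2 * rng.standard_normal()

gc_xy = granger(y, x)   # clearly positive: x's past predicts y
gc_yx = granger(x, y)   # near zero: y's past adds nothing about x
```

    In-sample the full model nests the restricted one, so the statistic is non-negative by construction; real analyses assess significance with F-tests or permutation nulls rather than raw ratios.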

  5. Drug-disease association and drug-repositioning predictions in complex diseases using causal inference-probabilistic matrix factorization.

    PubMed

    Yang, Jihong; Li, Zheng; Fan, Xiaohui; Cheng, Yiyu

    2014-09-22

    The high incidence of complex diseases has become a worldwide threat to human health. Multiple targets and pathways are perturbed during the pathological process of complex diseases. Systematic investigation of the complex relationships between drugs and diseases is necessary for new association discovery and drug repurposing. For this purpose, three causal networks were constructed herein for cardiovascular diseases, diabetes mellitus, and neoplasms, respectively. A causal inference-probabilistic matrix factorization (CI-PMF) approach was proposed to predict and classify drug-disease associations, and further used for drug-repositioning predictions. First, multilevel systematic relations between drugs and diseases were integrated from heterogeneous databases to construct causal networks connecting drug-target-pathway-gene-disease. Then, the association scores between drugs and diseases were assessed by evaluating a drug's effects on multiple targets and pathways. Furthermore, PMF models were learned based on known interactions, and associations were then classified into three types by trained models. Finally, therapeutic associations were predicted based upon the ranking of association scores and predicted association types. In terms of drug-disease association prediction, modified causal inference included in CI-PMF outperformed existing causal inference with a higher AUC (area under receiver operating characteristic curve) score and greater precision. Moreover, CI-PMF performed better than single modified causal inference in predicting therapeutic drug-disease associations. In the top 30% of predicted associations, 58.6% (136/232), 50.8% (31/61), and 39.8% (140/352) hit known therapeutic associations, while the precisions obtained by single modified causal inference were only 10.2% (231/2264), 8.8% (36/411), and 9.7% (189/1948). Clinical verifications were further conducted for the top 100 newly predicted therapeutic associations. 
As a result, 21, 12, and 32 associations have been studied and many treatment effects of drugs on diseases were investigated for cardiovascular diseases, diabetes mellitus, and neoplasms, respectively. Related chains in causal networks were extracted for these 65 clinical-verified associations, and we further illustrated the therapeutic role of etodolac in breast cancer by inferred chains. Overall, CI-PMF is a useful approach for associating drugs with complex diseases and provides potential values for drug repositioning.
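
    As a rough illustration of the matrix-factorization half of CI-PMF only (the causal-network scoring is not reproduced here), a minimal MAP-style probabilistic matrix factorization on hypothetical drug-by-disease association scores might look like:

```python
import numpy as np

def pmf_fit(R, mask, rank=2, lam=0.1, lr=0.02, epochs=3000, seed=0):
    """MAP estimate for probabilistic matrix factorization: gradient
    descent on the squared error over observed entries of R ~ U V^T,
    with Gaussian priors on the factors (the L2 penalties)."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(epochs):
        E = mask * (R - U @ V.T)        # error on observed entries only
        U, V = U + lr * (E @ V - lam * U), V + lr * (E.T @ U - lam * V)
    return U, V

# Hypothetical association scores; one drug-disease entry held out.
R = np.array([[5., 4., 0., 1.],
              [4., 5., 1., 0.],
              [0., 1., 5., 4.],
              [1., 0., 4., 5.]])
mask = np.ones_like(R)
mask[0, 2] = 0.0                        # pretend this score is unknown
U, V = pmf_fit(R, mask)
pred = (U @ V.T)[0, 2]                  # low predicted association
rmse = np.sqrt(((mask * (R - U @ V.T)) ** 2).sum() / mask.sum())
```

    The held-out entry is filled in from the low-rank structure learned on the observed scores; CI-PMF additionally feeds causal-network-derived scores into this step.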

  6. Representing Causation

    ERIC Educational Resources Information Center

    Wolff, Phillip

    2007-01-01

    The dynamics model, which is based on L. Talmy's (1988) theory of force dynamics, characterizes causation as a pattern of forces and a position vector. In contrast to counterfactual and probabilistic models, the dynamics model naturally distinguishes between different cause-related concepts and explains the induction of causal relationships from…

  7. Guidelines for a graph-theoretic implementation of structural equation modeling

    USGS Publications Warehouse

    Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William

    2012-01-01

    Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions. 
The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.

  8. Bayesian networks and information theory for audio-visual perception modeling.

    PubMed

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple kinds of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.

  9. Sex differences in the inference and perception of causal relations within a video game

    PubMed Central

    Young, Michael E.

    2014-01-01

    The learning of immediate causation within a dynamic environment was examined. Participants encountered seven decision points in which they needed to choose which of three possible candidates was the cause of explosions in the environment. Each candidate was firing a weapon at random every few seconds, but only one of them produced an immediate effect. Some participants showed little learning, but most demonstrated increases in accuracy across time. On average, men showed higher accuracy and shorter latencies that were not explained by differences in self-reported prior video game experience. This result suggests that prior reports of sex differences in causal choice in the game are not specific to situations involving delayed or probabilistic causal relations. PMID:25202293

  10. Sex differences in the inference and perception of causal relations within a video game.

    PubMed

    Young, Michael E

    2014-01-01

    The learning of immediate causation within a dynamic environment was examined. Participants encountered seven decision points in which they needed to choose which of three possible candidates was the cause of explosions in the environment. Each candidate was firing a weapon at random every few seconds, but only one of them produced an immediate effect. Some participants showed little learning, but most demonstrated increases in accuracy across time. On average, men showed higher accuracy and shorter latencies that were not explained by differences in self-reported prior video game experience. This result suggests that prior reports of sex differences in causal choice in the game are not specific to situations involving delayed or probabilistic causal relations.

  11. Probabilistic Reversal Learning in Schizophrenia: Stability of Deficits and Potential Causal Mechanisms.

    PubMed

    Reddy, Lena Felice; Waltz, James A; Green, Michael F; Wynn, Jonathan K; Horan, William P

    2016-07-01

    Although individuals with schizophrenia show impaired feedback-driven learning on probabilistic reversal learning (PRL) tasks, the specific factors that contribute to these deficits remain unknown. Recent work has suggested several potential causes including neurocognitive impairments, clinical symptoms, and specific types of feedback-related errors. To examine this issue, we administered a PRL task to 126 stable schizophrenia outpatients and 72 matched controls, and patients were retested 4 weeks later. The task involved an initial probabilistic discrimination learning phase and subsequent reversal phases in which subjects had to adjust their responses to sudden shifts in the reinforcement contingencies. Patients showed poorer performance than controls for both the initial discrimination and reversal learning phases of the task, and performance overall showed good test-retest reliability among patients. A subgroup analysis of patients (n = 64) and controls (n = 49) with good initial discrimination learning revealed no between-group differences in reversal learning, indicating that the patients who were able to achieve all of the initial probabilistic discriminations were not impaired in reversal learning. Regarding potential contributors to impaired discrimination learning, several factors were associated with poor PRL, including higher levels of neurocognitive impairment, poor learning from both positive and negative feedback, and higher levels of indiscriminate response shifting. The results suggest that poor PRL performance in schizophrenia can be the product of multiple mechanisms. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  12. Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.

    PubMed

    Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone

    2017-05-31

    Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. 
By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update prior beliefs with TMS delivered at 300 ms after target onset. Copyright © 2017 the authors 0270-6474/17/375419-10$15.00/0.
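
    The Rescorla-Wagner model named above is a simple delta rule. As a sketch of how a lowered learning rate (as reported under TMS at 300 ms) slows the updating of a believed cue validity, with a hypothetical trial sequence and parameter values:

```python
def rescorla_wagner(outcomes, alpha, v0=0.5):
    """Delta-rule update of the believed cue validity v after each
    trial: v <- v + alpha * (outcome - v), outcome = 1 if the cue
    was valid on that trial."""
    v = v0
    for outcome in outcomes:
        v += alpha * (outcome - v)
    return v

# Hypothetical sequence with ~90% cue validity.
outcomes = [0] + [1] * 9 + [0] + [1] * 9
fast = rescorla_wagner(outcomes, alpha=0.5)    # intact updating
slow = rescorla_wagner(outcomes, alpha=0.05)   # TMS-like reduced rate
# `slow` ends far closer to the 0.5 prior than `fast` does.
```

    A reduced alpha leaves the belief anchored near its prior, which is the behavioral signature the modeling attributes to rTPJ disruption.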

  13. The Bogus Science.

    ERIC Educational Resources Information Center

    Anderson, C. C.

    1981-01-01

    Discusses three sorts of psychological science: Science 1, a natural science with conditional (causal) laws; Science 2, with probabilistic laws; and Science 3, a "human science" trying to capture the complexity of human experience and behavior. Argues that Science 2 and Science 3 should be treated as belief systems which people may find useful for…

  14. The good, the bad, and the timely: how temporal order and moral judgment influence causal selection

    PubMed Central

    Reuter, Kevin; Kirfel, Lara; van Riel, Raphael; Barlassina, Luca

    2014-01-01

    Causal selection is the cognitive process through which one or more elements in a complex causal structure are singled out as actual causes of a certain effect. In this paper, we report on an experiment in which we investigated the role of moral and temporal factors in causal selection. Our results are as follows. First, when presented with a temporal chain in which two human agents perform the same action one after the other, subjects tend to judge the later agent to be the actual cause. Second, the impact of temporal location on causal selection is almost canceled out if the later agent did not violate a norm while the former did. We argue that this is due to the impact that judgments of norm violation have on causal selection—even if the violated norm has nothing to do with the obtaining effect. Third, moral judgments about the effect influence causal selection even in the case in which agents could not have foreseen the effect and did not intend to bring it about. We discuss our findings in connection to recent theories of the role of moral judgment in causal reasoning, on the one hand, and to probabilistic models of temporal location, on the other. PMID:25477851

  15. Hemispheric Division of Function Is the Result of Independent Probabilistic Biases

    ERIC Educational Resources Information Center

    Whitehouse, Andrew J. O.; Bishop, Dorothy V. M.

    2009-01-01

    Verbal and visuospatial abilities are typically subserved by different cerebral hemispheres: the left hemisphere for the former and the right hemisphere for the latter. However little is known of the origin of this division of function. Causal theories propose that functional asymmetry is an obligatory pattern of organisation, while statistical…

  16. Epidemiological causality.

    PubMed

    Morabia, Alfredo

    2005-01-01

    Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? Etc. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality, devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of Koch bacillus and tuberculosis). This is incompatible with epidemiological causality which has to deal with probabilistic effects (e.g., smoking and lung cancer), and is therefore invariant only for the population.

  17. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.
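
    The causal-transmission intuition here (diseases spread along food web links, so predators inherit risk from prey) can be sketched as a noisy-OR model over a tiny hypothetical web. This illustrates the model class, not the authors' implementation:

```python
import itertools

def joint_prob(states, prey_of, base, transmit):
    """Probability of one joint infection pattern under a noisy-OR
    transmission model: a species is infected via a background cause
    (prob `base`) or via each infected prey species it eats (prob
    `transmit` per infected prey)."""
    p = 1.0
    for species, prey in prey_of.items():
        k = sum(states[q] for q in prey)                 # infected prey
        p_inf = 1 - (1 - base) * (1 - transmit) ** k
        p *= p_inf if states[species] else (1 - p_inf)
    return p

def marginal(species, prey_of, base=0.1, transmit=0.5):
    """Exact marginal infection probability by enumerating all joint
    states (fine for tiny webs)."""
    names = list(prey_of)
    total = 0.0
    for bits in itertools.product([0, 1], repeat=len(names)):
        states = dict(zip(names, bits))
        if states[species]:
            total += joint_prob(states, prey_of, base, transmit)
    return total

# Hypothetical three-species chain: tuna eat herring, herring eat plankton.
web = {"plankton": [], "herring": ["plankton"], "tuna": ["herring"]}
probs = {s: marginal(s, web) for s in web}
# Risk accumulates up the chain: plankton < herring < tuna.
```

    The resulting asymmetry (predators at higher risk than their prey) is exactly what lets a causal-transmission model make different predictions than a purely taxonomic one.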

  18. Conservative forgetful scholars: How people learn causal structure through sequences of interventions.

    PubMed

    Bramley, Neil R; Lagnado, David A; Speekenbrink, Maarten

    2015-05-01

    Interacting with a system is key to uncovering its causal structure. A computational framework for interventional causal learning has been developed over the last decade, but how real causal learners might achieve or approximate the computations entailed by this framework is still poorly understood. Here we describe an interactive computer task in which participants were incentivized to learn the structure of probabilistic causal systems through free selection of multiple interventions. We develop models of participants' intervention choices and online structure judgments, using expected utility gain, probability gain, and information gain and introducing plausible memory and processing constraints. We find that successful participants are best described by a model that acts to maximize information (rather than expected score or probability of being correct); that forgets much of the evidence received in earlier trials; but that mitigates this by being conservative, preferring structures consistent with earlier stated beliefs. We explore 2 heuristics that partly explain how participants might be approximating these models without explicitly representing or updating a hypothesis space. (c) 2015 APA, all rights reserved.
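
    The information-gain quantity such models maximize is the expected entropy reduction over causal hypotheses. A minimal sketch with a hypothetical two-hypothesis space (A causes B vs. B causes A, causal strength 0.8, background rate 0.1) and a single intervention:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, float)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def expected_info_gain(prior, likelihoods):
    """Expected entropy reduction over causal hypotheses from one
    intervention, where likelihoods[h][o] = P(observe outcome o |
    hypothesis h, intervention)."""
    prior = np.asarray(prior, float)
    L = np.asarray(likelihoods, float)
    p_out = prior @ L                    # marginal outcome probabilities
    gain = entropy(prior)
    for o, po in enumerate(p_out):
        if po > 0:
            gain -= po * entropy(prior * L[:, o] / po)  # posterior entropy
    return gain

prior = [0.5, 0.5]          # h1 = "A causes B", h2 = "B causes A"
do_A = [[0.2, 0.8],         # h1: do(A=1) makes B likely
        [0.9, 0.1]]         # h2: B unaffected, stays at base rate
gain_A = expected_info_gain(prior, do_A)    # informative intervention
null = expected_info_gain(prior, [[0.9, 0.1], [0.9, 0.1]])  # useless probe
```

    An intervention whose outcome distribution is identical under every hypothesis yields zero expected gain, which is why information-maximizing learners avoid such probes.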

  19. Tractography-Based Score for Learning Effective Connectivity From Multimodal Imaging Data Using Dynamic Bayesian Networks.

    PubMed

    Dang, Shilpa; Chaudhury, Santanu; Lall, Brejesh; Roy, Prasun K

    2018-05-01

    Effective connectivity (EC) is the methodology for determining functional-integration among the functionally active segregated regions of the brain. By definition EC is "the causal influence exerted by one neuronal group on another" which is constrained by anatomical connectivity (AC) (axonal connections). AC is necessary for EC but does not fully determine it, because synaptic communication occurs dynamically in a context-dependent fashion. Although there is vast emerging evidence of structure-function relationships from multimodal imaging studies, to date only a few studies have jointly modeled the two modalities: functional MRI (fMRI) and diffusion tensor imaging (DTI). We aim to propose a unified probabilistic framework that combines information from both sources to learn EC using dynamic Bayesian networks (DBNs). DBNs are probabilistic graphical temporal models that learn EC in an exploratory fashion. Specifically, we propose a novel anatomically informed (AI) score that evaluates fitness of a given connectivity structure to both DTI and fMRI data simultaneously. The AI score is employed in structure learning of DBN given the data. Experiments with synthetic-data demonstrate the face validity of structure learning with our AI score over the anatomically uninformed counterpart. Moreover, real-data results are cross-validated by performing classification-experiments. EC inferred on real fMRI-DTI datasets is found to be consistent with previous literature and shows promising results in light of the AC present as compared to other classically used techniques such as Granger causality. Multimodal analyses provide a more reliable basis for differentiating brain under abnormal/diseased conditions than the single modality analysis.

  20. Quantum Mechanics predicts evolutionary biology.

    PubMed

    Torday, J S

    2018-07-01

    Nowhere are the shortcomings of conventional descriptive biology more evident than in the literature on Quantum Biology. In the on-going effort to apply Quantum Mechanics to evolutionary biology, merging Quantum Mechanics with the fundamentals of evolution as the First Principles of Physiology, namely negentropy, chemiosmosis, and homeostasis, offers an authentic opportunity to understand how and why physics constitutes the basic principles of biology. Negentropy and chemiosmosis confer determinism on the unicell, whereas homeostasis constitutes Free Will because it offers a probabilistic range of physiologic set points. Similarly, on this basis several principles of Quantum Mechanics also apply directly to biology. The Pauli Exclusion Principle is both deterministic and probabilistic, whereas non-localization and the Heisenberg Uncertainty Principle are both probabilistic, providing the long-sought-after ontologic and causal continuum from physics to biology and evolution as the holistic integration recognized as consciousness for the first time. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. A Bayesian network coding scheme for annotating biomedical information presented to genetic counseling clients.

    PubMed

    Green, Nancy

    2005-04-01

    We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.

  2. Conditional spectrum computation incorporating multiple causal earthquakes and ground-motion prediction models

    USGS Publications Warehouse

    Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas

    2013-01-01

    The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations. Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper.
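
    The exact combination this record describes amounts, at each spectral period, to the laws of total expectation and total variance across causal earthquakes and GMPMs. A sketch with hypothetical deaggregation weights and per-scenario lognormal parameters:

```python
import numpy as np

def combined_cs(weights, means, stds):
    """Exact conditional spectrum at one period across multiple causal
    earthquakes/GMPMs: laws of total expectation and total variance,
    with deaggregation weights w_j (summing to 1) and per-scenario
    conditional means mu_j and standard deviations sigma_j, all in
    ln(SA) units."""
    w, mu, sd = (np.asarray(a, float) for a in (weights, means, stds))
    m = w @ mu                                  # combined conditional mean
    var = w @ (sd ** 2 + (mu - m) ** 2)         # within- + between-scenario
    return m, np.sqrt(var)

# Hypothetical deaggregation: two causal scenarios at one period.
m, s = combined_cs(weights=[0.7, 0.3], means=[-1.0, -0.5], stds=[0.3, 0.4])
# The between-scenario spread inflates s above any single-scenario sigma.
```

    The approximate single-scenario calculation drops the between-scenario term, which is why it can understate the conditional standard deviation when deaggregation is spread across dissimilar earthquakes.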

  3. Dynamic Uncertain Causality Graph for Knowledge Representation and Reasoning: Utilization of Statistical Data and Domain Knowledge in Complex Cases.

    PubMed

    Zhang, Qin; Yao, Quanying

    2018-05-01

    The dynamic uncertain causality graph (DUCG) is a newly presented framework for uncertain causality representation and probabilistic reasoning. It has been successfully applied to online fault diagnoses of large, complex industrial systems, and disease diagnoses. This paper extends the DUCG to model more complex cases than what could be previously modeled, e.g., the case in which statistical data are in different groups with or without overlap, and some domain knowledge and actions (new variables with uncertain causalities) are introduced. In other words, this paper proposes to use -mode, -mode, and -mode of the DUCG to model such complex cases and then transform them into either the standard -mode or the standard -mode. In the former situation, if no directed cyclic graph is involved, the transformed result is simply a Bayesian network (BN), and existing inference methods for BNs can be applied. In the latter situation, an inference method based on the DUCG is proposed. Examples are provided to illustrate the methodology.

  4. The probabilistic convolution tree: efficient exact Bayesian inference for faster LC-MS/MS protein inference.

    PubMed

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called "causal independence"). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustrative example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to O(k log(k)²) and the space to O(k log(k)) where k is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions.

  5. The Probabilistic Convolution Tree: Efficient Exact Bayesian Inference for Faster LC-MS/MS Protein Inference

    PubMed Central

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called “causal independence”). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustrative example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to O(k log(k)²) and the space to O(k log(k)), where k is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions. PMID:24626234
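
    The core trick in the two records above is combining count distributions pairwise in a tree of convolutions. A minimal sketch using direct convolution (the subquadratic runtime the abstract cites comes from FFT-based convolution at each node, but the tree structure is the same):

```python
import numpy as np

def convolve_tree(pmfs):
    """Distribution of a sum of independent count variables, obtained by
    convolving their probability mass functions pairwise in a balanced
    tree (the 'probabilistic adder' combination)."""
    layer = [np.asarray(p, float) for p in pmfs]
    while len(layer) > 1:
        nxt = [np.convolve(layer[i], layer[i + 1])
               for i in range(0, len(layer) - 1, 2)]
        if len(layer) % 2:              # odd leftover passes up unchanged
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]

# Four hypothetical present/absent indicators, each with P(present)=0.5:
dist = convolve_tree([[0.5, 0.5]] * 4)
# dist[k] = P(exactly k of the 4 are present), i.e. Binomial(4, 0.5)
```

    Keeping the intermediate node distributions is what lets dynamic programming reuse them for the exact posteriors of individual variables.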

  6. Who is susceptible to conjunction fallacies in category-based induction?

    PubMed

    Feeney, Aidan; Shafto, Patrick; Dunning, Darren

    2007-10-01

    Recent evidence suggests that the conjunction fallacy observed in people's probabilistic reasoning is also to be found in their evaluations of inductive argument strength. We presented 130 participants with materials likely to produce a conjunction fallacy either by virtue of a shared categorical or a causal relationship between the categories in the argument. We also took a measure of participants' cognitive ability. We observed conjunction fallacies overall with both sets of materials but found an association with ability for the categorical materials only. Our results have implications for accounts of individual differences in reasoning, for the relevance theory of induction, and for the recent claim that causal knowledge is important in inductive reasoning.

  7. Inference in the brain: Statistics flowing in redundant population codes

    PubMed Central

    Pitkow, Xaq; Angelaki, Dora E

    2017-01-01

    It is widely believed that the brain performs approximate probabilistic inference to estimate causal variables in the world from ambiguous sensory data. To understand these computations, we need to analyze how information is represented and transformed by the actions of nonlinear recurrent neural networks. We propose that these probabilistic computations function by a message-passing algorithm operating at the level of redundant neural populations. To explain this framework, we review its underlying concepts, including graphical models, sufficient statistics, and message-passing, and then describe how these concepts could be implemented by recurrently connected probabilistic population codes. The relevant information flow in these networks will be most interpretable at the population level, particularly for redundant neural codes. We therefore outline a general approach to identify the essential features of a neural message-passing algorithm. Finally, we argue that to reveal the most important aspects of these neural computations, we must study large-scale activity patterns during moderately complex, naturalistic behaviors. PMID:28595050

  8. Causality re-established.

    PubMed

    D'Ariano, Giacomo Mauro

    2018-07-13

    Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory, or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  9. Reasoning about Causal Relationships: Inferences on Causal Networks

    PubMed Central

    Rottman, Benjamin Margolin; Hastie, Reid

    2013-01-01

    Over the last decade, a normative framework for making causal inferences, Bayesian Probabilistic Causal Networks, has come to dominate psychological studies of inference based on causal relationships. The following causal networks—[X→Y→Z, X←Y→Z, X→Y←Z]—supply answers for questions like, “Suppose both X and Y occur, what is the probability Z occurs?” or “Suppose you intervene and make Y occur, what is the probability Z occurs?” In this review, we provide a tutorial for how normatively to calculate these inferences. Then, we systematically detail the results of behavioral studies comparing human qualitative and quantitative judgments to the normative calculations for many network structures and for several types of inferences on those networks. Overall, when the normative calculations imply that an inference should increase, judgments usually go up; when calculations imply a decrease, judgments usually go down. However, two systematic deviations appear. First, people’s inferences violate the Markov assumption. For example, when inferring Z from the structure X→Y→Z, people think that X is relevant even when Y completely mediates the relationship between X and Z. Second, even when people’s inferences are directionally consistent with the normative calculations, they are often not as sensitive to the parameters and the structure of the network as they should be. We conclude with a discussion of productive directions for future research. PMID:23544658
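
    A minimal sketch of the normative calculation for the chain X→Y→Z, with invented conditional probability tables; it also exhibits the Markov property that, per the review, people's judgments tend to violate:

    ```python
    # Binary chain X -> Y -> Z with illustrative (not the review's) parameters.
    pY1_given_X = {0: 0.2, 1: 0.8}   # P(Y=1 | X=x)
    pZ1_given_Y = {0: 0.1, 1: 0.9}   # P(Z=1 | Y=y)

    def p_z1_given_x(x):
        # Sum over the mediator: P(Z=1 | X=x) = sum_y P(Z=1 | y) P(y | x).
        p_y1 = pY1_given_X[x]
        return p_y1 * pZ1_given_Y[1] + (1 - p_y1) * pZ1_given_Y[0]

    def p_z1_given_x_and_y(x, y):
        # Markov assumption: once Y is known, X is irrelevant to Z.
        return pZ1_given_Y[y]
    ```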

  10. Data-driven confounder selection via Markov and Bayesian networks.

    PubMed

    Häggström, Jenny

    2018-06-01

    To unbiasedly estimate a causal effect on an outcome, unconfoundedness is often assumed. If there is sufficient knowledge on the underlying causal structure, then existing confounder selection criteria can be used to select subsets of the observed pretreatment covariates, X, sufficient for unconfoundedness, if such subsets exist. Here, estimation of these target subsets is considered when the underlying causal structure is unknown. The proposed method is to model the causal structure by a probabilistic graphical model, for example, a Markov or Bayesian network, estimate this graph from observed data and select the target subsets given the estimated graph. The approach is evaluated by simulation both in a high-dimensional setting where unconfoundedness holds given X and in a setting where unconfoundedness only holds given subsets of X. Several common target subsets are investigated and the selected subsets are compared with respect to accuracy in estimating the average causal effect. The proposed method is implemented with existing software that can easily handle high-dimensional data, in terms of large samples and large numbers of covariates. The results from the simulation study show that, if unconfoundedness holds given X, this approach is very successful in selecting the target subsets, outperforming alternative approaches based on random forests and LASSO, and that the subset estimating the target subset containing all causes of the outcome yields the smallest MSE in the average causal effect estimation. © 2017, The International Biometric Society.

  11. Statistical analysis of life history calendar data.

    PubMed

    Eerola, Mervi; Helske, Satu

    2016-04-01

    The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare the model-based probabilistic event history analysis and the model-free data mining method, sequence analysis. In event history analysis, instead of transition hazards we estimate the cumulative prediction probabilities of life events over the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns while event history analysis is needed for causal inquiries. © The Author(s) 2012.

  12. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwood, Caroline S

    The goal of this project is to identify gene networks that are critical for efficient biohydrogen production by leveraging variation in gene content and gene expression in independently isolated Rhodopseudomonas palustris strains. Coexpression methods were applied to large data sets that we have collected to define probabilistic causal gene networks. To our knowledge this is the first systems-level approach that takes advantage of strain-to-strain variability to computationally define networks critical for a particular bacterial phenotypic trait.

  14. Causal Inference and Explaining Away in a Spiking Network

    PubMed Central

    Moreno-Bote, Rubén; Drugowitsch, Jan

    2015-01-01

    While the brain uses spiking neurons for communication, theoretical research on brain computations has mostly focused on non-spiking networks. The nature of spike-based algorithms that achieve complex computations, such as object probabilistic inference, is largely unknown. Here we demonstrate that a family of high-dimensional quadratic optimization problems with non-negativity constraints can be solved exactly and efficiently by a network of spiking neurons. The network naturally imposes the non-negativity of causal contributions that is fundamental to causal inference, and uses simple operations, such as linear synapses with realistic time constants, and neural spike generation and reset non-linearities. The network infers the set of most likely causes from an observation using explaining away, which is dynamically implemented by spike-based, tuned inhibition. The algorithm performs remarkably well even when the network intrinsically generates variable spike trains, the timing of spikes is scrambled by external sources of noise, or the network is mistuned. This type of network might underlie tasks such as odor identification and classification. PMID:26621426
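
    A rate-based (non-spiking) sketch of the underlying optimization problem, solved here by projected gradient descent rather than the paper's spiking dynamics; the matrix and observation are illustrative:

    ```python
    import numpy as np

    def explain_away_nnls(A, y, iters=1000):
        """Non-negative quadratic optimization: find x >= 0 minimizing
        ||y - A x||^2. Columns of A are candidate causes, y is the
        observation; the non-negativity constraint plus the coupling
        through A^T A implements explaining away between causes."""
        x = np.zeros(A.shape[1])
        lr = 1.0 / np.linalg.norm(A.T @ A, 2)   # step from largest eigenvalue
        for _ in range(iters):
            x = np.maximum(0.0, x - lr * (A.T @ (A @ x - y)))
        return x

    # Two overlapping causes; the observation is exactly cause1 + cause2.
    A = np.array([[1.0, 1.0],
                  [1.0, 0.0]])
    x = explain_away_nnls(A, np.array([2.0, 1.0]))
    ```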

  16. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
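
    A minimal sketch of the Equation versus the truth-functional (material) reading, using invented truth-table frequencies:

    ```python
    from fractions import Fraction

    # Illustrative frequencies for the four truth-table cases of
    # "if p then q" (not the study's stimuli).
    counts = {("p", "q"): 3, ("p", "~q"): 1, ("~p", "q"): 2, ("~p", "~q"): 4}

    def p_conditional(counts):
        # The Equation: P(if p then q) = P(q | p) = P(p & q) / P(p).
        return Fraction(counts[("p", "q")],
                        counts[("p", "q")] + counts[("p", "~q")])

    def p_material(counts):
        # Truth-functional reading: false only in the p & ~q case.
        total = sum(counts.values())
        return Fraction(total - counts[("p", "~q")], total)
    ```

    The two readings diverge whenever not-p cases exist, which is what the probabilistic truth-table task exploits.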

  17. Cost-Effectiveness of a Model Infection Control Program for Preventing Multi-Drug-Resistant Organism Infections in Critically Ill Surgical Patients.

    PubMed

    Jayaraman, Sudha P; Jiang, Yushan; Resch, Stephen; Askari, Reza; Klompas, Michael

    2016-10-01

    Interventions to contain two multi-drug-resistant Acinetobacter (MDRA) outbreaks reduced the incidence of multi-drug-resistant (MDR) organisms, specifically methicillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus, and Clostridium difficile, in the general surgery intensive care unit (ICU) of our hospital. We therefore conducted a cost-effectiveness analysis of a proactive model infection-control program to reduce transmission of MDR organisms based on the practices used to control the MDRA outbreak. We created a model of a proactive infection control program based on the 2011 MDRA outbreak response. We built a decision analysis model and performed univariable and probabilistic sensitivity analyses to evaluate the cost-effectiveness of the proposed program compared with standard infection control practices to reduce transmission of these MDR organisms. The cost of a proactive infection control program would be $68,509 per year. The incremental cost-effectiveness ratio (ICER) was calculated to be $3,804 per aversion of transmission of MDR organisms in a one-year period compared with standard infection control. On the basis of probabilistic sensitivity analysis, a willingness-to-pay (WTP) threshold of $14,000 per transmission averted would have a 42% probability of being cost-effective, rising to 100% at $22,000 per transmission averted. This analysis gives an estimated ICER for implementing a proactive program to prevent transmission of MDR organisms in the general surgery ICU. To better understand the causal relations between the critical steps in the program and the rate reductions, a randomized study of a package of interventions to prevent healthcare-associated infections should be considered.
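
    The ICER is the incremental cost divided by the incremental effect. A sketch with hypothetical inputs (the abstract reports only the program cost and the ratio; the comparator cost and transmission counts below are invented so that the ratio reproduces the reported $3,804):

    ```python
    def icer(cost_new, effect_new, cost_std, effect_std):
        # Incremental cost-effectiveness ratio: extra cost per extra unit
        # of effect, here dollars per transmission of an MDR organism averted.
        return (cost_new - cost_std) / (effect_new - effect_std)

    # Hypothetical inputs: the proactive program costs $68,509/yr; the
    # comparator cost and the transmissions-averted counts are invented.
    ratio = icer(cost_new=68_509, effect_new=28, cost_std=30_469, effect_std=18)
    ```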

  18. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
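
    A minimal Monte Carlo sketch of probabilistic bias analysis for nondifferential exposure misclassification; the 2x2 counts and the uniform priors on sensitivity and specificity are illustrative, not recommendations:

    ```python
    import random

    def corrected_count(observed, total, se, sp):
        # Invert E[observed] = se*true + (1 - sp)*(total - true)
        # for the true number of exposed subjects.
        return (observed - (1 - sp) * total) / (se + sp - 1)

    def probabilistic_bias_analysis(a1, n1, a0, n0, sims=20_000, seed=7):
        """Monte Carlo bias analysis of an odds ratio. a1/n1 are
        exposed/total cases, a0/n0 exposed/total controls; each draw
        samples sensitivity and specificity and back-corrects the counts."""
        rng = random.Random(seed)
        adjusted = []
        for _ in range(sims):
            se = rng.uniform(0.80, 0.99)
            sp = rng.uniform(0.90, 0.99)
            t1 = corrected_count(a1, n1, se, sp)
            t0 = corrected_count(a0, n0, se, sp)
            if 0 < t1 < n1 and 0 < t0 < n0:      # discard impossible draws
                adjusted.append((t1 * (n0 - t0)) / (t0 * (n1 - t1)))
        return adjusted

    # Observed (misclassified) crude odds ratio here is (60*60)/(40*40) = 2.25;
    # nondifferential misclassification biases toward the null, so the
    # adjusted odds ratios all lie above it.
    ors = probabilistic_bias_analysis(a1=60, n1=100, a0=40, n0=100)
    ```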

  19. Using Structured Knowledge Representation for Context-Sensitive Probabilistic Modeling

    DTIC Science & Technology

    2008-01-01

    [No abstract recovered; the extracted text contains only report documentation fields and reference-list fragments, e.g., J. Pearl, Causality: Models, Reasoning, and Inference, Cambridge University Press, 2000.]

  20. Dispositional Employability and Online Training Purchase. Evidence from Employees' Behavior in Spain.

    PubMed

    Torrent-Sellens, Joan; Ficapal-Cusí, Pilar; Boada-Grau, Joan

    2016-01-01

    This article explores the relationship between dispositional employability and online training purchase. Through a sample of 883 employees working for enterprises in Spain, and using principal component analysis and binomial logit probabilistic models, the research revealed two main results. First, it was found that dispositional employability is characterized by five factors: "openness to changes at work," "career motivation and work resilience," "work and career proactivity," "optimism and engagement at work," and "work identity." Second, the research also found a double causality in the relationship analysis between dispositional employability and online training purchase. However, this causality is not direct. In explaining dispositional employability, certain motivations and types of behavior of employees participating in online training are significant. In particular, greater sensitivity toward career-related personal empowerment, a greater predisposition toward developing new experiences at work, and a greater awareness of the fact that positive job outcomes are related to preparation conscientiousness. In explaining online training purchase, employees who are more motivated and who better identify with their jobs are more likely to pay. Moreover, employees who spend more time on training and have less contact with new trends in their jobs, find it hard to keep calm in difficult situations, and have a greater predisposition toward effort, and preference for novelty, variety and challenges at work are more likely to purchase online training.

  2. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  3. NESTOR: A Computer-Based Medical Diagnostic Aid That Integrates Causal and Probabilistic Knowledge.

    DTIC Science & Technology

    1984-11-01

    …individual conditional probabilities between one cause node and its effect node, but less common to know a joint conditional probability between a… [abstract fragment; report documentation fields stripped]

  4. Combining Human and Machine Intelligence to Derive Agents' Behavioral Rules for Groundwater Irrigation

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Quinn, C.; Cai, X.

    2015-12-01

    One major challenge of agent-based modeling is to derive agents' behavioral rules due to behavioral uncertainty and data scarcity. This study proposes a new approach to combine a data-driven modeling based on the directed information (i.e., machine intelligence) with expert domain knowledge (i.e., human intelligence) to derive the behavioral rules of agents considering behavioral uncertainty. A directed information graph algorithm is applied to identifying the causal relationships between agents' decisions (i.e., groundwater irrigation depth) and time-series of environmental, socioeconomic and institutional factors. A case study is conducted for the High Plains aquifer hydrological observatory (HO) area, U.S. Preliminary results show that four factors, corn price (CP), underlying groundwater level (GWL), monthly mean temperature (T) and precipitation (P) have causal influences on agents' decisions on groundwater irrigation depth (GWID) to various extents. Based on the similarity of the directed information graph for each agent, five clusters of graphs are further identified to represent all the agents' behaviors in the study area as shown in Figure 1. Using these five representative graphs, agents' monthly optimal groundwater pumping rates are derived through probabilistic inference. Such data-driven relationships and probabilistic quantifications are then coupled with a physically-based groundwater model to investigate the interactions between agents' pumping behaviors and the underlying groundwater system in the context of coupled human and natural systems.

  5. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    PubMed

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception-expectation, emotion, memory, similarity, segmentation, and meter-can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
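
    A toy illustration of statistical learning of sequential regularities; this fixed-order bigram model is far simpler than the variable-order models IDyOM uses:

    ```python
    from collections import Counter, defaultdict

    class BigramModel:
        """Learn transition statistics from exposure, then assign
        conditional probabilities to upcoming events (here, note names)."""
        def __init__(self):
            self.counts = defaultdict(Counter)

        def train(self, melody):
            for context, event in zip(melody, melody[1:]):
                self.counts[context][event] += 1

        def prob(self, context, event):
            total = sum(self.counts[context].values())
            return self.counts[context][event] / total if total else 0.0

    model = BigramModel()
    model.train(["C", "D", "E", "C", "D", "G", "C", "D", "E"])
    ```

    After exposure, "D" after "C" is fully expected, while "E" after "D" is only probable; such graded expectations are the raw material for prediction-based accounts of musical expectation.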

  6. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  7. Bayesian network modelling of upper gastrointestinal bleeding

    NASA Astrophysics Data System (ADS)

    Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri

    2013-09-01

    Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism in building medical support systems and we learn a tree augmented naive Bayes Network (TAN) from gastrointestinal bleeding data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding into upper or lower source is obtained. The TAN achieves a high classification accuracy of 86% and an area under curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.

  8. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.

  9. Dynamic test input generation for multiple-fault isolation

    NASA Technical Reports Server (NTRS)

    Schaefer, Phil

    1990-01-01

    Recent work in Causal Reasoning has provided practical techniques for multiple-fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.
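
    The measurement/test-selection mathematics can be sketched as choosing the input that minimizes expected posterior entropy over fault hypotheses; the hypotheses and outcome models below are invented:

    ```python
    import math

    def entropy(dist):
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def expected_posterior_entropy(prior, outcome_model):
        """outcome_model[h][o] = P(observe o | hypothesis h) for one
        candidate test input; returns the fault-hypothesis entropy
        expected after observing the test's outcome."""
        outcomes = {o for dist in outcome_model.values() for o in dist}
        expected = 0.0
        for o in outcomes:
            p_o = sum(prior[h] * outcome_model[h].get(o, 0.0) for h in prior)
            if p_o == 0.0:
                continue
            posterior = {h: prior[h] * outcome_model[h].get(o, 0.0) / p_o
                         for h in prior}
            expected += p_o * entropy(posterior)
        return expected

    def best_test_input(prior, tests):
        # Choose the input that most reduces expected fault uncertainty.
        return min(tests, key=lambda t: expected_posterior_entropy(prior, tests[t]))

    prior = {"fault_A": 0.5, "fault_B": 0.5}
    tests = {
        "discriminating": {"fault_A": {"pass": 1.0}, "fault_B": {"fail": 1.0}},
        "uninformative":  {"fault_A": {"pass": 0.5, "fail": 0.5},
                           "fault_B": {"pass": 0.5, "fail": 0.5}},
    }
    choice = best_test_input(prior, tests)
    ```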

  10. "It Can Bust at Any Seam": Lessons for Deep Space Flight from Mawson's 1911-1914 Australasian Antarctic Expedition

    NASA Astrophysics Data System (ADS)

    Wallace, Phillip Scott

    2010-09-01

    Lessons useful for manned space flight can be gained by looking at exploring expeditions of the past. An aviation-accident style investigation was conducted on two fatalities that occurred on an Antarctic expedition in 1912-13. The causal factors of the accidents were determined; and lessons for future missions beyond LEO gleaned from both the causal factors and from looking at the expedition as a whole. The investigation highlighted, among other things, that probabilistic hazards can eventually take a life and that factors of terrain can and will damage equipment and kill men; that consumables should be segregated such that one mishap does not reduce margins to below those needed for survival, and that manned missions need to be able to jury-rig equipment in the field.

  11. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  12. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
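The probability-of-failure computation that codes like NESSUS accelerate with methods such as the advanced mean value and adaptive importance sampling can be illustrated in its plainest Monte Carlo form with a toy limit-state function. The strength and stress distributions below are assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical limit state g = strength - stress; failure occurs when g < 0.
# Both inputs are taken to be normal solely for this sketch.
strength = rng.normal(400.0, 25.0, n)   # MPa
stress   = rng.normal(300.0, 30.0, n)   # MPa
g = strength - stress

p_fail = np.mean(g < 0.0)
print(f"estimated probability of failure: {p_fail:.4f}")
```

For the small failure probabilities typical of aerospace hardware, plain sampling like this needs very many samples, which is exactly why variance-reduction and approximate-reliability methods exist.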

  13. Probabilistic delay differential equation modeling of event-related potentials.

    PubMed

    Ostwald, Dirk; Starke, Ludger

    2016-08-01

"Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach.

  14. Graphical Models for Recovering Probabilistic and Causal Queries from Missing Data

    DTIC Science & Technology

    2014-11-01

…we can apply our results to problems of attrition in which missingness is a severe obstacle to sound inferences. Related works are discussed in… (due to the collider path between Y and Ry). …Related Work: deletion-based methods, such as listwise deletion, that are easy to understand as well as…

  15. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
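The "cumulative probability of exceedance distribution (CDF) and confidence bounds" that these packages are to produce can be sketched empirically from sampled responses. The lognormal max-stress samples below are purely hypothetical stand-ins for the output of a probabilistic structural run:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical max-stress samples (MPa) from repeated probabilistic runs
samples = rng.lognormal(mean=5.0, sigma=0.1, size=10_000)

def exceedance_with_bounds(samples, x, z=1.96):
    """Empirical exceedance probability P(response > x) with an
    approximate 95% binomial (normal-theory) confidence bound."""
    n = len(samples)
    p = np.mean(samples > x)
    half = z * np.sqrt(p * (1.0 - p) / n)
    return p, p - half, p + half

# Evaluate at the true median of the assumed lognormal, exp(5.0):
p, lo, hi = exceedance_with_bounds(samples, np.exp(5.0))
print(f"P(stress > median): {p:.3f} in [{lo:.3f}, {hi:.3f}]")
```

Evaluating this over a grid of thresholds x yields the full exceedance curve with pointwise bounds.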

  16. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  17. Probabilistic Structural Analysis of the Solid Rocket Booster Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, Jeff; Ayala, Samuel

    2000-01-01

    NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.

  18. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    NASA Astrophysics Data System (ADS)

    Baer, P.; Mastrandrea, M.

    2006-12-01

Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question: what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs.
We suggest further that where there is insufficient evidence to clearly favor one range of probabilistic projections over another, the choice of results on which to base policy must necessarily involve ethical considerations, as they have inevitable consequences for the distribution of risk. In particular, the choice to use a more "optimistic" PDF for climate sensitivity (or other components of the causal chain) leads to the allowance of higher emissions consistent with any specified goal for risk reduction, and thus leads to higher climate impacts, in exchange for lower mitigation costs.
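The core point above — that different input PDFs for the same uncertain parameter yield different output distributions from the same scenario — can be shown with a deliberately toy sensitivity-times-forcing model. All distributions and numbers here are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
forcing_doublings = 1.0  # hypothetical: one effective CO2 doubling

# Two illustrative PDFs for climate sensitivity (deg C per doubling):
s_optimistic = rng.normal(2.5, 0.5, n)             # narrow, thin upper tail
s_heavytail  = rng.lognormal(np.log(3.0), 0.4, n)  # skewed, fat upper tail

for name, s in [("optimistic", s_optimistic), ("heavy-tailed", s_heavytail)]:
    t = s * forcing_doublings  # resulting temperature-change distribution
    print(f"{name}: median {np.median(t):.2f} C, "
          f"95th pct {np.percentile(t, 95):.2f} C")
```

The two runs disagree most in the upper tail, which is exactly where risk-reduction goals bite, illustrating why the choice between input PDFs carries the ethical weight the abstract describes.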

  19. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method is used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of probabilistic approximate analysis in determining efficient solution strategies.

  20. A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    2016-07-18

    This paper describes a framework of incorporating smart sampling techniques in a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.
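The paper does not spell out which smart sampling techniques it incorporates, but a common choice in probabilistic contingency analysis is Latin hypercube sampling, which stratifies each uncertain dimension (e.g., wind output, load) so that fewer samples cover the input space evenly. A generic sketch, not tied to this framework:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sample on the unit cube: one point per
    equal-probability stratum in each dimension, with strata
    randomly paired across dimensions."""
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])
    return u  # uniform(0,1) marginals, stratified per dimension

rng = np.random.default_rng(3)
pts = latin_hypercube(100, 2, rng)
# Each dimension has exactly one point per 1/100-wide stratum:
print(np.all(np.sort(np.floor(pts[:, 0] * 100)) == np.arange(100)))
```

Mapping the uniform columns through inverse CDFs of the forecast-error distributions then yields stratified scenarios for the look-ahead contingency runs.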

  1. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  2. Brain resting-state networks in adolescents with high-functioning autism: Analysis of spatial connectivity and temporal neurodynamics.

    PubMed

    Bernas, Antoine; Barendse, Evelien M; Aldenkamp, Albert P; Backes, Walter H; Hofman, Paul A M; Hendriks, Marc P H; Kessels, Roy P C; Willems, Frans M J; de With, Peter H N; Zinger, Svitlana; Jansen, Jacobus F A

    2018-02-01

Autism spectrum disorder (ASD) is mainly characterized by functional and communication impairments as well as restrictive and repetitive behavior. The leading hypothesis for the neural basis of autism postulates globally abnormal brain connectivity, which can be assessed using functional magnetic resonance imaging (fMRI). Even in the absence of a task, the brain exhibits a high degree of functional connectivity, known as intrinsic, or resting-state, connectivity. Global default connectivity in individuals with autism versus controls is not well characterized, especially for a high-functioning young population. The aim of this study is to test whether high-functioning adolescents with ASD (HFA) have an abnormal resting-state functional connectivity. We performed spatial and temporal analyses on resting-state networks (RSNs) in 13 HFA adolescents and 13 IQ- and age-matched controls. For the spatial analysis, we used probabilistic independent component analysis (ICA) and a permutation statistical method to reveal the RSN differences between the groups. For the temporal analysis, we applied Granger causality to find differences in temporal neurodynamics. Controls and HFA displayed very similar patterns and strengths of resting-state connectivity. We did not find any significant differences between HFA adolescents and controls in the spatial resting-state connectivity. However, in the temporal dynamics of this connectivity, we did find differences in the causal effect properties of RSNs originating in temporal and prefrontal cortices. The results show a difference between HFA and controls in the temporal neurodynamics from the ventral attention network to the salience-executive network: a pathway involving cognitive, executive, and emotion-related cortices. We hypothesized that this weaker dynamic pathway is due to a subtle trigger challenging the cognitive state prior to the resting state.
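Granger causality, as used in the temporal analysis above, tests whether the past of one signal improves prediction of another beyond that signal's own past. A minimal one-lag version via ordinary least squares on synthetic data — a sketch of the idea, not the study's fMRI pipeline:

```python
import numpy as np

def granger_f(x, y, lag=1):
    """F-statistic testing whether x's past improves prediction of y
    beyond y's own past (one additional lagged regressor)."""
    yt, ylag, xlag = y[lag:], y[:-lag], x[:-lag]
    ones = np.ones_like(yt)
    # Restricted model: y_t ~ const + y_{t-lag}
    Xr = np.column_stack([ones, ylag])
    rss_r = np.sum((yt - Xr @ np.linalg.lstsq(Xr, yt, rcond=None)[0]) ** 2)
    # Full model: y_t ~ const + y_{t-lag} + x_{t-lag}
    Xf = np.column_stack([ones, ylag, xlag])
    rss_f = np.sum((yt - Xf @ np.linalg.lstsq(Xf, yt, rcond=None)[0]) ** 2)
    df = len(yt) - Xf.shape[1]
    return (rss_r - rss_f) / (rss_f / df)

# Synthetic ground truth: x drives y at lag 1, with no reverse influence.
rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)
noise = rng.normal(scale=0.5, size=n)
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + noise[t]

print(f"F(x -> y) = {granger_f(x, y):.1f}")  # large: x Granger-causes y
print(f"F(y -> x) = {granger_f(y, x):.1f}")  # small: no reverse influence
```

The asymmetry of the two F-statistics is what lets Granger analysis assign a direction to the temporal neurodynamics between networks.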

  3. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  4. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  5. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, such as probabilistic thinking. Probabilistic thinking is needed by students to make decisions, and elementary school students are expected to develop it as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had previously been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural, and relational probabilistic thinking. This study analyzed probability task completion based on this taxonomy. The subjects were two fifth-grade students, a boy and a girl, selected by administering a test of mathematical ability and choosing students with high ability. Subjects were given probability tasks covering sample space, probability of an event, and probability comparison. The data analysis consisted of categorization, reduction, interpretation, and conclusion; data credibility was established through time triangulation. The boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's probabilistic thinking was at a higher level than the girl's. These results could help curriculum developers formulate probability learning goals for elementary school students; indeed, teachers could teach probability with regard to gender differences.

  6. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.

  7. Causal discovery in the geosciences-Using synthetic data to learn how to interpret results

    NASA Astrophysics Data System (ADS)

    Ebert-Uphoff, Imme; Deng, Yi

    2017-02-01

Causal discovery algorithms based on probabilistic graphical models have recently emerged in geoscience applications for the identification and visualization of dynamical processes. The key idea is to learn the structure of a graphical model from observed spatio-temporal data, thus finding pathways of interactions in the observed physical system. Studying those pathways allows geoscientists to learn subtle details about the underlying dynamical mechanisms governing our planet. Initial studies using this approach on real-world atmospheric data have shown great potential for scientific discovery. However, in these initial studies no ground truth was available, so the resulting graphs could be evaluated only by whether a domain expert judged them physically plausible. The lack of ground truth is a typical problem when using causal discovery in the geosciences. Furthermore, while most of the connections found by this method match domain knowledge, we encountered one type of connection for which no explanation was found. To address both of these issues we developed a simulation framework that generates synthetic data of typical atmospheric processes (advection and diffusion). Applying the causal discovery algorithm to the synthetic data allowed us (1) to develop a better understanding of how these physical processes appear in the resulting connectivity graphs, and thus how to better interpret such connectivity graphs when obtained from real-world data; and (2) to solve the mystery of the previously unexplained connections.

  8. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  9. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
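The response surface method mentioned above replaces expensive component analyses with a cheap fitted surrogate that can then be sampled heavily. A sketch with a stand-in "expensive" model; the pressurized-shell stress formula, design ranges, and input distributions are all hypothetical, chosen only to make the example self-contained:

```python
import numpy as np

def expensive_model(r, t):
    """Stand-in for an FEA run: hoop-stress-like response 2*r/t,
    purely illustrative (r = radius, t = thickness)."""
    return 2.0 * r / t

def quad_features(r, t):
    """Full quadratic basis in (r, t) for the response surface."""
    return np.column_stack([np.ones_like(r), r, t, r * t, r**2, t**2])

rng = np.random.default_rng(5)

# 1. A few "expensive" runs at sampled design points fit the surrogate.
r_d = rng.uniform(0.9, 1.1, 30)
t_d = rng.uniform(0.009, 0.011, 30)
coef, *_ = np.linalg.lstsq(quad_features(r_d, t_d),
                           expensive_model(r_d, t_d), rcond=None)

# 2. Cheap Monte Carlo on the fitted response surface.
r_mc = rng.normal(1.0, 0.02, 100_000)
t_mc = rng.normal(0.010, 0.0003, 100_000)
stress = quad_features(r_mc, t_mc) @ coef

print(f"mean stress: {stress.mean():.1f}, "
      f"P(stress > 210): {np.mean(stress > 210):.4f}")
```

Thirty surrogate-building runs stand in for thirty FEA solves; the hundred thousand surrogate evaluations would be prohibitively expensive on the real model.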

  10. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.

  11. Questions on causality and responsibility arising from an outbreak of Pseudomonas aeruginosa infections in Norway

    PubMed Central

    Iversen, Bjørn G; Hofmann, Bjørn; Aavitsland, Preben

    2008-01-01

In 2002, Norway experienced a large outbreak of Pseudomonas aeruginosa infections in hospitals, with 231 confirmed cases. This fuelled intense public and professional debates on what the causes were and who was responsible. In epidemiology, other sciences, philosophy, and law, there is a long tradition of discussing the concept of causality. We use this outbreak as a case and apply various theories of causality from different disciplines to discuss the roles and responsibilities of some of the parties involved. Mackie's concept of INUS conditions, Hill's nine viewpoints for studying association in order to claim causation, and deterministic and probabilistic ways of reasoning all shed light on the issues of causality in this outbreak. Moreover, applying legal theories of causation (counterfactual reasoning with the "but-for" test, and the NESS test) proved especially useful, but the case also illustrated the weaknesses of the various theories of causation. We conclude that many factors contributed to causing the outbreak, but that contamination of a medical device in the production facility was the major necessary condition. The reuse of the medical device in hospitals contributed primarily to the size of the outbreak. The unintended error by its producer – and to a minor extent by the hospital practice – was mainly due to non-application of relevant knowledge and skills, and appears to constitute professional negligence. Due to criminal procedure laws and other factors outside the discourse of causality, no one was criminally charged for the outbreak, which caused much suffering and shortened the lives of at least 34 people. PMID:18947429

  12. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools that synthesize multidisciplinary integration, probabilistic analysis, and optimization are needed to facilitate design decisions that allow trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high-fidelity model of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  13. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  14. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis forms the basis for constructing a probabilistic timed automaton that is reachability-equivalent to the original FPTA, to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.
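The final step described above, probabilistic model checking of the derived automaton, ultimately reduces reachability properties to numeric computation. For a plain finite Markov chain (a simplification: probabilistic timed automata additionally carry clocks and nondeterminism), reachability probabilities come from solving a linear system over the transient states. The chain below is hypothetical:

```python
import numpy as np

# Hypothetical 4-state Markov chain; state 3 is the target, state 2 a trap.
# P[i, j] = probability of moving from state i to state j in one step.
P = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.4, 0.0, 0.1, 0.5],
    [0.0, 0.0, 1.0, 0.0],   # absorbing non-target state (trap)
    [0.0, 0.0, 0.0, 1.0],   # absorbing target state
])
target = 3
transient = [0, 1]

# For transient states, reach probabilities satisfy x = A x + b,
# i.e. (I - A) x = b, where b collects one-step jumps into the target.
A = P[np.ix_(transient, transient)]
b = P[np.ix_(transient, [target])].ravel()
x = np.linalg.solve(np.eye(len(transient)) - A, b)

for s, p in zip(transient, x):
    print(f"P(reach state {target} from state {s}) = {p:.4f}")
```

Model checkers such as those applied to the constructed PTA solve essentially this system (or its min/max variants over nondeterministic choices) at much larger scale.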

  15. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  16. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle Main Engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.

  17. Data-driven Modeling of Metal-oxide Sensors with Dynamic Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Gosangi, Rakesh; Gutierrez-Osuna, Ricardo

    2011-09-01

    We present a data-driven probabilistic framework to model the transient response of MOX sensors modulated with a sequence of voltage steps. Analytical models of MOX sensors are usually built based on the physico-chemical properties of the sensing materials. Although these models provide insight into sensor behavior, building them requires a thorough understanding of the underlying operating principles. Here we propose a data-driven approach to characterize the dynamical relationship between sensor inputs and outputs. Namely, we use dynamic Bayesian networks (DBNs), probabilistic models that represent temporal relations between a set of random variables. We identify a set of control variables that influence the sensor responses, create a graphical representation that captures the causal relations between these variables, and finally train the model with experimental data. We validated the approach on experimental data in terms of predictive accuracy and classification performance. Our results show that DBNs can accurately predict the dynamic response of MOX sensors, as well as capture the discriminatory information present in the sensor transients.
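
    The idea of modeling sensor dynamics with a DBN can be sketched with a toy example. This is not the authors' model: the two-state surface chemistry, the voltage-dependent transition matrices, and the discretized conductance readings below are all invented for illustration, and inference is a plain forward (filtering) pass.

```python
import numpy as np

# Hypothetical two-state surface model: 0 = oxidized, 1 = reduced.
# The transition matrix depends on the applied voltage step (control input).
T = {  # T[v][i, j] = P(state_t = j | state_{t-1} = i, voltage = v)
    "low":  np.array([[0.9, 0.1],
                      [0.4, 0.6]]),
    "high": np.array([[0.3, 0.7],
                      [0.1, 0.9]]),
}
# Observation model: P(reading | state) over 3 discretized conductance levels.
O = np.array([[0.8, 0.15, 0.05],   # oxidized
              [0.1, 0.30, 0.60]])  # reduced

def filter_dbn(controls, readings, prior=np.array([0.5, 0.5])):
    """Forward (filtering) pass: P(state_t | readings and controls up to t)."""
    belief = prior
    for v, y in zip(controls, readings):
        belief = belief @ T[v]          # predict through the transition
        belief = belief * O[:, y]       # weight by observation likelihood
        belief = belief / belief.sum()  # normalize
    return belief

b = filter_dbn(["high", "high", "low"], [2, 2, 0])
```

    Exact inference of this kind scales with the number of discrete states; a realistic MOX model would need a richer state space or approximate (e.g. particle) inference.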

  18. Quantum Enhanced Inference in Markov Logic Networks

    NASA Astrophysics Data System (ADS)

    Wittek, Peter; Gogolin, Christian

    2017-04-01

    Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template to generate Markov networks. Inference in MLNs is probabilistic and it is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited at both first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review different such approaches, discuss their advantages, theoretical limitations, and their appeal to implementations. We find that a straightforward application of a recent result yields exponential speedup compared to classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning.
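
    The Gibbs-sampling inference mentioned here can be illustrated on the smallest possible grounded network. The two weighted formulas below are a made-up stand-in for a ground MLN (in the spirit of the classic smokers example), not anything from the paper; the sampler alternately resamples each ground atom from its conditional distribution.

```python
import math
import random

# Two weighted ground formulas standing in for a grounded MLN (invented):
#   1.5 : Smokes(A) <-> Smokes(B)    (friends tend to agree)
#   2.0 : Smokes(A)                  (unary, evidence-like formula)
def score(a, b):
    """Sum of weights of the ground formulas satisfied by state (a, b)."""
    s = 1.5 if a == b else 0.0
    if a:
        s += 2.0
    return s

def gibbs(n_sweeps=20000, seed=0):
    """Estimate P(Smokes(B)) by alternately resampling each ground atom."""
    rng = random.Random(seed)
    a = b = False
    hits = 0
    for _ in range(n_sweeps):
        # P(A = 1 | B) is proportional to exp(score(1, B)).
        w1, w0 = math.exp(score(True, b)), math.exp(score(False, b))
        a = rng.random() < w1 / (w1 + w0)
        w1, w0 = math.exp(score(a, True)), math.exp(score(a, False))
        b = rng.random() < w1 / (w1 + w0)
        hits += b
    return hits / n_sweeps

p_b = gibbs()
```

    For this two-atom network the marginal can also be computed exactly by summing exp(score) over the four states, which is a useful check on the sampler; it is exactly this exponential state space that lifting and the proposed quantum protocols aim to tame.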

  19. Quantum Enhanced Inference in Markov Logic Networks.

    PubMed

    Wittek, Peter; Gogolin, Christian

    2017-04-19

    Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template to generate Markov networks. Inference in MLNs is probabilistic and it is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited at both first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review different such approaches, discuss their advantages, theoretical limitations, and their appeal to implementations. We find that a straightforward application of a recent result yields exponential speedup compared to classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning.

  20. Quantum Enhanced Inference in Markov Logic Networks

    PubMed Central

    Wittek, Peter; Gogolin, Christian

    2017-01-01

    Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template to generate Markov networks. Inference in MLNs is probabilistic and it is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited at both first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review different such approaches, discuss their advantages, theoretical limitations, and their appeal to implementations. We find that a straightforward application of a recent result yields exponential speedup compared to classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning. PMID:28422093

  1. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.

  3. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  4. Probabilistic QoS Analysis In Wireless Sensor Networks

    DTIC Science & Technology

    2012-04-01

    Wang, Yunbo, "Probabilistic QoS Analysis in Wireless Sensor Networks" (2012). Computer Science and Engineering: Theses, Dissertations, and Student...

  5. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
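
    A minimal sketch of the perturbation idea, under invented details: four mesh nodes, a hand-picked unit displacement direction per node, and a single normally distributed amplitude per realization. The patent's mean-value-coordinate displacement fields are considerably more involved; this only shows the mechanics of shifting nominal geometry along precomputed vectors.

```python
import numpy as np

rng = np.random.default_rng(42)

# Nominal node coordinates of a hypothetical region of interest (a unit square).
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
# One unit displacement direction per node (here: outward corner directions).
directions = np.array([[-1, -1], [1, -1], [1, 1], [-1, 1]]) / np.sqrt(2)

def perturb(nodes, directions, sigma=0.01, rng=rng):
    """Shift each node along its displacement vector by a random amplitude,
    producing one perturbed realization of the nominal geometry."""
    amplitude = rng.normal(0.0, sigma)   # one random variable per realization
    return nodes + amplitude * directions

realization = perturb(nodes, directions)
```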

  6. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
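
    The efficiency claim is easiest to see on a problem where the fast-probability-integration answer is available in closed form. The linear limit state g = R - S with normal capacity R and demand S below is a textbook stand-in, not the paper's plate examples; for it the reliability index gives the exact failure probability that brute-force Monte Carlo must estimate from many samples.

```python
import math
import numpy as np

# Limit state g = R - S (capacity minus demand); failure when g < 0.
# All numbers are illustrative.
muR, sdR = 10.0, 1.0
muS, sdS = 7.0, 1.5

# Closed-form result for this linear/normal case (what FORM-style fast
# probability integration recovers): reliability index beta, then Phi(-beta).
beta = (muR - muS) / math.hypot(sdR, sdS)
pf_analytic = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))

# Brute-force Monte Carlo for comparison.
rng = np.random.default_rng(1)
n = 1_000_000
g = rng.normal(muR, sdR, n) - rng.normal(muS, sdS, n)
pf_mc = (g < 0).mean()
```

    One function evaluation per sample is cheap here; when each "sample" is a full finite element solve, the advantage of probability integration over simulation becomes the orders-of-magnitude difference the abstract reports.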

  7. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize their decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes, new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. 
Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the relevance of new information, relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most of the variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
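
    The expected value of information invoked above can be made concrete with a deliberately small decision problem. The two actions, loss numbers, and prior below are hypothetical; EVPI is the gap between the best expected loss under current information and the expected loss if the hazard state were revealed before choosing.

```python
# Hypothetical two-action precautionary decision: "act" (costly intervention)
# versus "wait", with an uncertain hazard state.
p_hazard = 0.2                       # prior probability the hazard is real
loss = {                             # losses by (action, hazard-is-real)
    ("act",  True):  10, ("act",  False): 10,  # intervention cost regardless
    ("wait", True): 100, ("wait", False):  0,
}

def expected_loss(action, p):
    return p * loss[(action, True)] + (1 - p) * loss[(action, False)]

# Best expected loss under the prior:
prior_best = min(expected_loss(a, p_hazard) for a in ("act", "wait"))
# With perfect information we would pay the smaller loss in each state:
with_info = p_hazard * min(loss[("act", True)], loss[("wait", True)]) \
    + (1 - p_hazard) * min(loss[("act", False)], loss[("wait", False)])
evpi = prior_best - with_info        # expected value of perfect information
```

    A positive EVPI quantifies how much it is worth paying for further study before committing, which is exactly the link the paper draws between precautionary interim measures and decision analysis.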

  8. Probabilistic Multiple-Bias Modeling Applied to the Canadian Data From the Interphone Study of Mobile Phone Use and Risk of Glioma, Meningioma, Acoustic Neuroma, and Parotid Gland Tumors.

    PubMed

    Momoli, F; Siemiatycki, J; McBride, M L; Parent, M-É; Richardson, L; Bedard, D; Platt, R; Vrijheid, M; Cardis, E; Krewski, D

    2017-10-01

    We undertook a re-analysis of the Canadian data from the 13-country case-control Interphone Study (2001-2004), in which researchers evaluated the associations of mobile phone use with the risks of brain, acoustic neuroma, and parotid gland tumors. In the main publication of the multinational Interphone Study, investigators concluded that biases and errors prevented a causal interpretation. We applied a probabilistic multiple-bias model to address possible biases simultaneously, using validation data from billing records and nonparticipant questionnaires as information on recall error and selective participation. In our modeling, we sought to adjust for these sources of uncertainty and to facilitate interpretation. For glioma, when comparing those in the highest quartile of use (>558 lifetime hours) to those who were not regular users, the odds ratio was 2.0 (95% confidence interval: 1.2, 3.4). After adjustment for selection and recall biases, the odds ratio was 2.2 (95% limits: 1.3, 4.1). There was little evidence of an increase in the risk of meningioma, acoustic neuroma, or parotid gland tumors in relation to mobile phone use. Adjustments for selection and recall biases did not materially affect interpretation in our results from Canadian data. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
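
    A schematic version of probabilistic multiple-bias modeling, with invented bias priors rather than the study's validated ones: multiplicative bias factors for selection and recall are sampled from assumed distributions, divided out of the observed odds ratio, and conventional random error is added on the log scale before summarizing the simulation interval.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

or_observed = 2.0   # observed odds ratio (highest-use quartile, from the text)

# Hypothetical bias priors (for illustration only, not the study's values):
sel = rng.lognormal(mean=np.log(0.9), sigma=0.10, size=n)   # selection bias
rec = rng.lognormal(mean=np.log(1.0), sigma=0.15, size=n)   # recall bias

# Bias-adjusted OR: divide out sampled bias factors, then add random error
# on the log scale (illustrative standard error of 0.27).
or_adj = or_observed / (sel * rec) * rng.lognormal(0.0, 0.27, size=n)

lo, mid, hi = np.percentile(or_adj, [2.5, 50, 97.5])
```

    The resulting interval blends systematic and random uncertainty, which is why the adjusted limits in such analyses are wider than conventional confidence intervals.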

  9. Directional connectivity of resting state human fMRI data using cascaded ICA-PDC analysis.

    PubMed

    Silfverhuth, Minna J; Remes, Jukka; Starck, Tuomo; Nikkinen, Juha; Veijola, Juha; Tervonen, Osmo; Kiviniemi, Vesa

    2011-11-01

    Directional connectivity measures, such as partial directed coherence (PDC), give us means to explore effective connectivity in the human brain. By utilizing independent component analysis (ICA), the original data-set reduction was performed for further PDC analysis. The aim was to test this cascaded ICA-PDC approach in causality studies of human functional magnetic resonance imaging (fMRI) data. Resting state group data was imaged from 55 subjects using a 1.5 T scanner (TR 1800 ms, 250 volumes). Temporal concatenation group ICA within a probabilistic ICA framework and further repeatability runs (n = 200) were undertaken. The reduced data-set included the time series presentation of the following nine ICA components: secondary somatosensory cortex, inferior temporal gyrus, intracalcarine cortex, primary auditory cortex, amygdala, putamen and the frontal medial cortex, posterior cingulate cortex and precuneus, comprising the default mode network components. Re-normalized PDC (rPDC) values were computed to determine directional connectivity at the group level at each frequency. An integrative role was suggested for the precuneus, while the primary auditory cortex and amygdala may be proposed as major divergence regions. This study demonstrates the potential of the cascaded ICA-PDC approach in directional connectivity studies of human fMRI.

  10. Generalized mutual information and Tsirelson's bound

    NASA Astrophysics Data System (ADS)

    Wakakuwa, Eyuri; Murao, Mio

    2014-12-01

    We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the "no-supersignalling condition" (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.
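
    For reference, the Tsirelson bound discussed here limits the CHSH combination of bipartite correlators. Local hidden-variable theories obey S ≤ 2, while the quantum bound, which the paper recovers from information causality via the chain rule of the GMI, is:

```latex
S_{\mathrm{CHSH}} \;=\; \bigl|\,E(a,b) + E(a,b') + E(a',b) - E(a',b')\,\bigr| \;\le\; 2\sqrt{2}.
```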

  11. Generalized mutual information and Tsirelson's bound

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wakakuwa, Eyuri; Murao, Mio

    2014-12-04

    We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the 'no-supersignalling condition' (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.

  12. Fundamentals of clinical methodology: 2. Etiology.

    PubMed

    Sadegh-Zadeh, K

    1998-03-01

    The concept of etiology is analyzed and the possibilities and limitations of deterministic, probabilistic, and fuzzy etiology are explored. Different kinds of formal structures for the relation of causation are introduced which enable us to explicate the notion of cause on qualitative, comparative, and quantitative levels. The conceptual framework developed is an approach to a theory of causality that may be useful in etiologic research, in building nosological systems, and in differential diagnosis, therapeutic decision-making, and controlled clinical trials. The bearings of the theory are exemplified by examining the current Chlamydia pneumoniae hypothesis on the incidence of myocardial infarction.

  13. Scientific thinking in young children: theoretical advances, empirical research, and policy implications.

    PubMed

    Gopnik, Alison

    2012-09-28

    New theoretical ideas and empirical research show that very young children's learning and thinking are strikingly similar to much learning and thinking in science. Preschoolers test hypotheses against data and make causal inferences; they learn from statistics and informal experimentation, and from watching and listening to others. The mathematical framework of probabilistic models and Bayesian inference can describe this learning in precise ways. These discoveries have implications for early childhood education and policy. In particular, they suggest both that early childhood experience is extremely important and that the trend toward more structured and academic early childhood programs is misguided.
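
    The probabilistic-models framework referenced here is, at its core, Bayesian updating over causal hypotheses. A toy version with invented numbers: a learner weighs two hypotheses about which block activates a toy after seeing three demonstrations in which block A was present each time the toy activated.

```python
# Two causal hypotheses with equal priors (all numbers hypothetical).
priors = {"block A causes it": 0.5, "block B causes it": 0.5}

# Likelihood of three successful activations under each hypothesis:
likelihood = {"block A causes it": 0.9 ** 3,   # A almost always works
              "block B causes it": 0.1 ** 3}   # B rarely would

# Bayes' rule: posterior proportional to prior times likelihood.
post = {h: priors[h] * likelihood[h] for h in priors}
z = sum(post.values())
post = {h: p / z for h, p in post.items()}
```

    Three observations are enough to concentrate nearly all posterior mass on one hypothesis, which is the formal counterpart of the rapid causal inference the article attributes to preschoolers.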

  14. Alternate Methods in Refining the SLS Nozzle Plug Loads

    NASA Technical Reports Server (NTRS)

    Burbank, Scott; Allen, Andrew

    2013-01-01

    Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: (1) the main engines' startup pressure and (2) an environmentally induced pressure. Efforts to reduce load conservatism included a dynamic analysis, which showed a 31% higher safety factor than the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatism and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.
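
    The probabilistic load calculation can be sketched generically. Everything below is illustrative (the distributions, the load model, and the units are invented, not SLS data): sample the environmental variables, push them through a load model, and read off the 1-in-370 load, which sits well below the worst case of stacking extreme pressure with extreme temperature.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Hypothetical distributions standing in for historical pad data:
dp = rng.normal(3.0, 0.8, n)       # pressure differential (arbitrary units)
temp = rng.normal(295.0, 8.0, n)   # temperature (K)

# Illustrative load model: scales with dp, weakly with temperature.
load = dp * (1.0 + 0.002 * (temp - 295.0))

# A 1-in-370 (3-sigma) condition corresponds to the 99.73rd percentile
# of the simulated load distribution.
design_load = np.percentile(load, 100 * (1 - 1 / 370))

# Deterministic worst case: stack the sampled extremes of both variables.
worst_case = dp.max() * (1.0 + 0.002 * (temp.max() - 295.0))
```

    The gap between `design_load` and `worst_case` is the conservatism the probabilistic approach removes: the joint occurrence of both extremes is far rarer than 1 in 370.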

  15. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    The oil pipeline network is one of the most important facilities of energy transportation, but an oil pipeline network accident may result in serious disasters. Some analysis models for these accidents have been established mainly based on three methods: event trees, accident simulation, and Bayesian networks. Among these methods, Bayesian networks are suitable for probabilistic analysis. However, existing models do not consider all of the important influencing factors, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of the oil pipeline network based on a Bayesian network. Most of the important influencing factors, including the key environment conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
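
    A minimal Bayesian-network accident model in this spirit, with invented structure and probabilities: corrosion and third-party damage combine (via a noisy-OR) to cause a leak, and emergency response moderates whether a leak becomes a severe accident. The marginal accident probability then follows by enumeration over all variable assignments.

```python
from itertools import product

# Hypothetical CPTs for a toy pipeline-accident network (illustrative only):
# Corrosion (C) and third-party damage (D) -> Leak (L) -> severe Accident (A),
# with emergency response (E) reducing severity.
P_C = {True: 0.05, False: 0.95}
P_D = {True: 0.02, False: 0.98}
P_E = {True: 0.90, False: 0.10}

def P_L(l, c, d):
    """Noisy-OR combination of the two leak causes."""
    p = 0.3 if c else 0.001
    p = 1 - (1 - p) * (1 - (0.4 if d else 0.0))
    return p if l else 1 - p

def P_A(a, l, e):
    """Severe accident requires a leak; response lowers the probability."""
    p = (0.05 if e else 0.5) if l else 0.0
    return p if a else 1 - p

def marginal_accident():
    """P(A = true) by brute-force enumeration of the joint distribution."""
    total = 0.0
    for c, d, e, l, a in product([True, False], repeat=5):
        if not a:
            continue
        total += P_C[c] * P_D[d] * P_E[e] * P_L(l, c, d) * P_A(a, l, e)
    return total

p_acc = marginal_accident()
```

    Enumeration is exponential in the number of variables; a realistic pipeline model would rely on the factored inference that Bayesian-network engines provide, but the deployment question (which factors enter, and where) is the same.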

  16. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment, risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  17. Probabilistic Sensitivity Analysis with Respect to Bounds of Truncated Distributions (PREPRINT)

    DTIC Science & Technology

    2010-04-01

    Millwater, H. and Feng, Y., "Probabilistic Sensitivity Analysis with Respect to Bounds of Truncated Distributions" (preprint), AFRL-RX-WP-TP-2010-4147, Department of Mechanical...

  18. Automated interviews on clinical case reports to elicit directed acyclic graphs.

    PubMed

    Luciani, Davide; Stefanini, Federico M

    2012-05-01

    Setting up clinical reports within hospital information systems makes it possible to record a variety of clinical presentations. Directed acyclic graphs (Dags) offer a useful way of representing causal relations in clinical problem domains and are at the core of many probabilistic models described in the medical literature, like Bayesian networks. However, medical practitioners are not usually trained to elicit Dag features. Part of the difficulty lies in the application of the concept of direct causality before selecting all the causal variables of interest for a specific patient. We designed an automated interview to tutor medical doctors in the development of Dags to represent their understanding of clinical reports. Medical notions were analyzed to find patterns in medical reasoning that can be followed by algorithms supporting the elicitation of causal Dags. Clinical relevance was defined to help formulate only relevant questions by driving an expert's attention towards variables causally related to nodes already inserted in the graph. Key procedural features of the proposed interview are described by four algorithms. The automated interview comprises questions on medical notions, phrased in medical terms. The first elicitation session produces questions concerning the patient's chief complaints and the outcomes related to diseases serving as diagnostic hypotheses, their observable manifestations and risk factors. The second session focuses on questions that refine the initial causal paths by considering syndromes, dysfunctions, pathogenic anomalies, biases and effect modifiers. A case study concerning a gastro-enterological problem and one dealing with an infected patient illustrate the output produced by the algorithms, depending on the answers provided by the doctor. The proposed elicitation framework is characterized by strong consistency with medical background and by a progressive introduction of relevant medical topics. 
Revision and testing of the subjectively elicited Dag is performed by matching the collected answers with the evidence included in accepted sources of biomedical knowledge. Copyright © 2011 Elsevier B.V. All rights reserved.
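
    The structural constraint such an interview must enforce is acyclicity: each proposed cause-effect edge is admitted only if it closes no directed cycle. A minimal sketch follows; the node names are placeholders, and a real elicitation tool would track far more than edges.

```python
class Dag:
    """Directed acyclic graph that refuses cycle-creating edges."""

    def __init__(self):
        self.children = {}   # node -> set of direct effects

    def add_node(self, n):
        self.children.setdefault(n, set())

    def _has_path(self, a, b):
        """Depth-first search: is there a directed path from a to b?"""
        stack, seen = [a], set()
        while stack:
            n = stack.pop()
            if n == b:
                return True
            if n in seen:
                continue
            seen.add(n)
            stack.extend(self.children.get(n, ()))
        return False

    def add_edge(self, cause, effect):
        """Insert cause -> effect unless it would close a directed cycle."""
        self.add_node(cause)
        self.add_node(effect)
        if self._has_path(effect, cause):
            return False   # rejected: the interview must re-ask
        self.children[cause].add(effect)
        return True

g = Dag()
assert g.add_edge("risk factor", "disease")
assert g.add_edge("disease", "symptom")
assert not g.add_edge("symptom", "risk factor")   # would create a cycle
```

    When an edge is rejected, the interview can surface the offending path to the expert, turning the acyclicity check itself into a prompt for clarifying the causal direction.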

  19. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  20. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  1. Specifying design conservatism: Worst case versus probabilistic analysis

    NASA Technical Reports Server (NTRS)

    Miles, Ralph F., Jr.

    1993-01-01

    Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
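
    The tradeoff can be illustrated with a hedged sketch (not one of the paper's two aerospace examples): stacking two hypothetical 3-sigma error sources the worst-case way versus bounding their combination probabilistically.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two hypothetical independent unit-normal error sources
e1 = rng.normal(0.0, 1.0, n)
e2 = rng.normal(0.0, 1.0, n)
total = e1 + e2

# Worst-case analysis stacks the individual 3-sigma bounds linearly: 3 + 3 = 6
worst_case_bound = 6.0
p_exceed_worst = np.mean(np.abs(total) > worst_case_bound)

# Probabilistic analysis: total ~ N(0, sqrt(2)), so a 3-sigma bound on the
# combined error is 3*sqrt(2) ~ 4.24, with a quantified exceedance probability
prob_bound = 3.0 * np.sqrt(2.0)
p_exceed_prob = np.mean(np.abs(total) > prob_bound)  # ~ 0.0027
```

    The worst-case bound is about 40 percent wider than the probabilistic 3-sigma bound, and it says nothing about how unlikely it is to be reached; the probabilistic analysis attaches an explicit exceedance probability to whichever bound is chosen.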

  2. CAUSAL ANALYSIS AND PROBABILITY DATA: EXAMPLES FOR IMPAIRED AQUATIC CONDITION

    EPA Science Inventory

    Causal analysis is plausible reasoning applied to diagnosing observed effect(s), for example, diagnosing the cause of biological impairment in a stream. Sir Bradford Hill essentially defined the application of causal analysis when he enumerated the elements of causality f...

  3. Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan

    2005-01-01

    Analytical techniques have progressively become more sophisticated, and now we can consider the probabilistic nature of the entire space of random input variables on the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading--such as that found in a turbine engine hot-section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component for fully transient thermomechanical loading and cyclic loading.

  4. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks.

    PubMed

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. 
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
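
    The probabilistic clustering step can be sketched with a minimal Fuzzy C-means implementation (a generic FCM, not the authors' pipeline), run here on toy 2-D data rather than EEG topographies at GFP peaks:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=50, seed=0):
    """Minimal Fuzzy C-means: soft memberships U (n, c) and centers (c, d)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))         # random soft start
    for _ in range(n_iter):
        W = U ** m                                     # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = dist ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)       # standard FCM update
    return U, centers

# Toy demonstration on two well-separated 2-D clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (100, 2)), rng.normal(10, 0.5, (100, 2))])
U, centers = fuzzy_c_means(X, c=2)
```

    Unlike K-means, each sample retains a graded membership in every cluster, which is the uncertainty the authors propagate into the subsequent microstate labeling.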

  6. A Case Study for Probabilistic Methods Validation (MSFC Center Director's Discretionary Fund, Project No. 94-26)

    NASA Technical Reports Server (NTRS)

    Price, J. M.; Ortega, R.

    1998-01-01

    Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure-mode level.

  7. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  8. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
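
    The bootstrap step can be sketched in a few lines. The outcome data, eradication rates, and costs below are illustrative assumptions, not the paper's data; the idea is to resample observed patient outcomes rather than assume a theoretical distribution for the efficacy inputs:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-patient eradication outcomes for two regimens (1 = eradicated)
outcomes_a = rng.binomial(1, 0.90, size=200)
outcomes_b = rng.binomial(1, 0.70, size=200)
cost_a, cost_b = 220.0, 150.0   # assumed per-course costs

# Bootstrap the efficacy inputs instead of assuming a theoretical distribution
n_boot = 5000
icers = np.empty(n_boot)
for i in range(n_boot):
    pa = rng.choice(outcomes_a, size=len(outcomes_a)).mean()
    pb = rng.choice(outcomes_b, size=len(outcomes_b)).mean()
    icers[i] = (cost_a - cost_b) / (pa - pb)  # incremental cost per eradication

lo, hi = np.percentile(icers, [2.5, 97.5])   # probabilistic sensitivity interval
```

    A base-case analysis would report the single point estimate; the bootstrap distribution of the incremental cost-effectiveness ratio shows how much that conclusion can move under sampling uncertainty in the inputs.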

  9. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

    PubMed

    Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

    2016-05-01

    This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not have any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite-time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both the simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC to quantify the overall causal relationship.
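
    For already-digitized sequences, the probability-estimation step can be sketched with a plug-in conditional mutual information, an order-1 proxy for the DI rate (this is a generic estimator on toy binary data, not the paper's procedure):

```python
import numpy as np
from collections import Counter

def cond_mutual_info(x, y, z):
    """Plug-in estimate of I(X; Y | Z) in bits from discrete samples."""
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz = Counter(zip(x, z))
    pyz = Counter(zip(y, z))
    pz = Counter(z)
    mi = 0.0
    for (a, b, c), k in pxyz.items():
        # p(x,y,z) * log2[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ], counts cancel to:
        mi += (k / n) * np.log2(k * pz[c] / (pxz[(a, c)] * pyz[(b, c)]))
    return mi

# Toy binary sequences: y copies x with a one-step delay and 10% bit flips
rng = np.random.default_rng(3)
T = 5000
x = rng.integers(0, 2, T)
flips = (rng.random(T) < 0.1).astype(int)
y = np.zeros(T, dtype=int)
y[1:] = x[:-1] ^ flips[1:]

# Order-1 directed-information terms I(X_{t-1}; Y_t | Y_{t-1}) and the reverse
di_xy = cond_mutual_info(x[:-1], y[1:], y[:-1])
di_yx = cond_mutual_info(y[:-1], x[1:], x[:-1])
```

    Here the forward term is large (about 1 minus the binary entropy of the flip rate) while the reverse term is near zero, reflecting the one-directional coupling built into the toy system.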

  10. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. 
The six parameters are space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow with Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density function of predicted aerodynamic damping and frequency for flutter and the response amplitudes for forced response.

  11. Global/local methods for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.

    1993-01-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined than the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computational savings with minimal loss in accuracy.
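
    The MPP idea can be made concrete in a standard-normal toy problem (an illustrative linear limit state, not the paper's NESSUS example): the MPP is the point on the failure surface closest to the origin, and its distance beta gives a first-order failure probability that a full Monte Carlo run confirms.

```python
import numpy as np
from math import erf, sqrt

# Limit state in standard normal space: g(u) = 3 - (u1 + u2)/sqrt(2),
# with failure when g <= 0. For this linear g the most probable point (MPP)
# is the closest point on g = 0 to the origin, at distance beta = 3.
beta = 3.0
u_star = np.array([beta / sqrt(2.0), beta / sqrt(2.0)])  # the MPP

# First-order failure probability from the MPP distance
Phi = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))  # standard normal CDF
p_form = Phi(-beta)  # ~ 1.35e-3

# "Global" Monte Carlo check of the same limit state
rng = np.random.default_rng(4)
u = rng.standard_normal((2_000_000, 2))
p_mc = np.mean(3.0 - u.sum(axis=1) / sqrt(2.0) <= 0.0)
```

    In the global/local scheme, a coarse model locates an approximate MPP like u_star cheaply, and only the refined local model is evaluated near that point to correct the probability estimate.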

  13. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  15. Probabilistic Structural Analysis Methods for select space propulsion system components (PSAM). Volume 3: Literature surveys and technical reports

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.

  16. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1], we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  17. Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint)

    DTIC Science & Technology

    2009-04-01

    AFRL-RX-WP-TP-2009-4091, Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint). Patrick J. Golden, Air Force Research Laboratory, Wright-Patterson AFB, OH 45433; Harry R. Millwater and ...

  18. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, while probabilistic finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  19. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  20. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogous to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only, thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  1. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 5: Auxiliary shuttle risk analyses

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.

  2. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
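
    The closed-form Weibull case the authors highlight can be sketched directly, with a Monte Carlo cross-check of the kind the later sections introduce; the Weibull parameters and applied stress below are illustrative assumptions:

```python
import numpy as np

# Illustrative two-parameter Weibull strength model for a brittle component
m_weibull = 10.0      # Weibull modulus (scatter of strength)
sigma_0 = 400.0       # characteristic strength, MPa
applied = 300.0       # applied stress, MPa

# Closed-form failure probability -- the rare closed-form case noted in the article
p_closed = 1.0 - np.exp(-(applied / sigma_0) ** m_weibull)

# Monte Carlo check: sample strengths, count components weaker than the load
rng = np.random.default_rng(5)
strengths = sigma_0 * rng.weibull(m_weibull, size=1_000_000)
p_mc = np.mean(strengths < applied)
```

    For this simple stress state the two estimates agree; for real components with nonuniform stress fields, the closed form disappears and numerical methods such as Monte Carlo or FPI take over.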

  3. Extending earthquakes' reach through cascading.

    PubMed

    Marsan, David; Lengliné, Olivier

    2008-02-22

    Earthquakes, whatever their size, can trigger other earthquakes. Mainshocks cause aftershocks to occur, which in turn activate their own local aftershock sequences, resulting in a cascade of triggering that extends the reach of the initial mainshock. A long-lasting difficulty is to determine which earthquakes are connected, either directly or indirectly. Here we show that this causal structure can be found probabilistically, with no a priori model or parameterization. Large regional earthquakes are found to have a short direct influence in comparison to the overall aftershock sequence duration. Relative to these large mainshocks, small earthquakes collectively have a greater effect on triggering. Hence, cascade triggering is a key component in earthquake interactions.

  4. Entropic Nonsignaling Correlations.

    PubMed

    Chaves, Rafael; Budroni, Costantino

    2016-06-17

    We introduce the concept of entropic nonsignaling correlations, i.e., entropies arising from probabilistic theories that are compatible with the fact that we cannot transmit information instantaneously. We characterize and show the relevance of these entropic correlations in a variety of different scenarios, ranging from typical Bell experiments to more refined descriptions such as bilocality and information causality. In particular, we apply the framework to derive the first entropic inequality testing genuine tripartite nonlocality in quantum systems of arbitrary dimension and also prove the first known monogamy relation for entropic Bell inequalities. Further, within the context of complex Bell networks, we show that entropic nonlocal correlations can be activated.

  5. Behavioral genetics and criminal responsibility at the courtroom.

    PubMed

    Tatarelli, Roberto; Del Casale, Antonio; Tatarelli, Caterina; Serata, Daniele; Rapinesi, Chiara; Sani, Gabriele; Kotzalidis, Georgios D; Girardi, Paolo

    2014-04-01

    Several questions arise from the recent use of behavioral genetic research data in the courtroom. Ethical issues concerning the influence of biological factors on human free will, must be considered when specific gene patterns are advocated to constrain court's judgment, especially regarding violent crimes. Aggression genetics studies are both difficult to interpret and inconsistent, hence, in the absence of a psychiatric diagnosis, genetic data are currently difficult to prioritize in the courtroom. The judge's probabilistic considerations in formulating a sentence must take into account causality, and the latter cannot be currently ensured by genetic data. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Entropic Nonsignaling Correlations

    NASA Astrophysics Data System (ADS)

    Chaves, Rafael; Budroni, Costantino

    2016-06-01

    We introduce the concept of entropic nonsignaling correlations, i.e., entropies arising from probabilistic theories that are compatible with the fact that we cannot transmit information instantaneously. We characterize and show the relevance of these entropic correlations in a variety of different scenarios, ranging from typical Bell experiments to more refined descriptions such as bilocality and information causality. In particular, we apply the framework to derive the first entropic inequality testing genuine tripartite nonlocality in quantum systems of arbitrary dimension and also prove the first known monogamy relation for entropic Bell inequalities. Further, within the context of complex Bell networks, we show that entropic nonlocal correlations can be activated.

  7. New Insights into Signed Path Coefficient Granger Causality Analysis.

    PubMed

    Zhang, Jian; Li, Chong; Jiang, Tianzi

    2016-01-01

    Granger causality analysis, as a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of "signed path coefficient Granger causality," a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among the time series by an order-1 autoregression, and defines a positive or negative coefficient as an "excitatory" or "inhibitory" influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which would inevitably lead to erroneous conclusions. Overall our findings suggest that the applicability of this kind of causality analysis is rather limited, and researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation.
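    The mechanics under criticism can be sketched as follows: fit an order-1 autoregression by least squares and read the sign of the cross coefficient (a minimal, hypothetical simulation; the paper's point is precisely that in realistic settings this sign need not reflect the true excitatory or inhibitory relation):

```python
import random

def simulate(n=20_000, a=0.3, b=0.5, seed=1):
    """Toy system in which x drives y with coefficient b; y has memory a."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1) for _ in range(n)]
    y = [0.0] * n
    for t in range(1, n):
        y[t] = a * y[t - 1] + b * x[t - 1] + rng.gauss(0, 1)
    return x, y

def order1_path_coefficients(x, y):
    """Least-squares fit of y_t = a*y_{t-1} + b*x_{t-1} via the 2x2 normal
    equations; the signed-path-coefficient method interprets sign(b_hat)."""
    S11 = S12 = S22 = r1 = r2 = 0.0
    for t in range(1, len(y)):
        S11 += y[t - 1] * y[t - 1]
        S12 += y[t - 1] * x[t - 1]
        S22 += x[t - 1] * x[t - 1]
        r1 += y[t - 1] * y[t]
        r2 += x[t - 1] * y[t]
    det = S11 * S22 - S12 * S12
    return (S22 * r1 - S12 * r2) / det, (S11 * r2 - S12 * r1) / det

x, y = simulate()
a_hat, b_hat = order1_path_coefficients(x, y)
```

    In this deliberately benign one-lag system the estimate recovers the true signs; the study's simulations show that with richer dynamics the order-1 coefficients diverge from the true causal relations, which is why the interpretation is untenable in general.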

  8. Electromagnetic Compatibility (EMC) in Microelectronics.

    DTIC Science & Technology

    1983-02-01

    Fault Tree Analysis", System Safety Symposium, June 8-9, 1965, Seattle: The Boeing Company. 12. Fussell, J.B., "Fault Tree Analysis - Concepts and...procedure for assessing EMC in microelectronics and for applying...CRITERIA 2.1 Background 2.2 The Probabilistic Nature of EMC 2.3 The Probabilistic Approach 2.4 The Compatibility Factor 3 APPLYING PROBABILISTIC

  9. NESSUS/EXPERT - An expert system for probabilistic structural analysis methods

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Palmer, K.; Fink, P.

    1988-01-01

    An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.

  10. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of constituent materials (fiber and matrix) to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load and environment dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.

  11. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distributed collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and its probabilistic design concept is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is carried out to verify the proposed DCSRM. The analysis results show that an optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC and for improving the performance and reliability of the aeroengine. A comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.

  12. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architectures are preferable to shared-memory architectures for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.

  13. Causal inference in nonlinear systems: Granger causality versus time-delayed mutual information

    NASA Astrophysics Data System (ADS)

    Li, Songting; Xiao, Yanyang; Zhou, Douglas; Cai, David

    2018-05-01

    The Granger causality (GC) analysis has been extensively applied to infer causal interactions in dynamical systems arising from economy and finance, physics, bioinformatics, neuroscience, social science, and many other fields. In the presence of potential nonlinearity in these systems, the validity of the GC analysis in general is questionable. To illustrate this, here we first construct minimal nonlinear systems and show that the GC analysis fails to infer causal relations in these systems—it gives rise to all types of incorrect causal directions. In contrast, we show that the time-delayed mutual information (TDMI) analysis is able to successfully identify the direction of interactions underlying these nonlinear systems. We then apply both methods to neuroscience data collected from experiments and demonstrate that the TDMI analysis but not the GC analysis can identify the direction of interactions among neuronal signals. Our work exemplifies inference hazards in the GC analysis in nonlinear systems and suggests that the TDMI analysis can be an appropriate tool in such a case.
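    The TDMI side of the comparison can be sketched with a plug-in mutual information estimate on sign-binarized signals (a minimal illustration under stated assumptions: the binary binning, the coupling, and the noise level below are hypothetical, not the paper's estimator):

```python
import math
import random

def mutual_information_binary(a, b):
    """Plug-in mutual information (nats) between two binary sequences."""
    n = len(a)
    p = [[0.0, 0.0], [0.0, 0.0]]
    for ai, bi in zip(a, b):
        p[ai][bi] += 1.0 / n
    pa = [p[0][0] + p[0][1], p[1][0] + p[1][1]]
    pb = [p[0][0] + p[1][0], p[0][1] + p[1][1]]
    return sum(
        p[i][j] * math.log(p[i][j] / (pa[i] * pb[j]))
        for i in range(2) for j in range(2) if p[i][j] > 0
    )

def tdmi(x, y, tau):
    """Time-delayed MI between sign(x_t) and sign(y_{t+tau})."""
    a = [1 if v > 0 else 0 for v in x[:len(x) - tau] if True]
    b = [1 if v > 0 else 0 for v in y[tau:]]
    return mutual_information_binary(a, b)

# Hypothetical one-step-delayed coupling x -> y.
rng = random.Random(7)
x = [rng.gauss(0, 1) for _ in range(10_000)]
y = [0.0] + [x[t - 1] + 0.3 * rng.gauss(0, 1) for t in range(1, 10_000)]
mi_lag1 = tdmi(x, y, 1)   # large: y lags x by one step
mi_lag0 = tdmi(x, y, 0)   # near zero: no instantaneous dependence
```

    Scanning tau and locating the peak of the TDMI curve identifies the direction and delay of the interaction, which is the inference the paper shows GC can get wrong in nonlinear systems.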

  14. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds for definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods of μ-analysis can lead to overly conservative controller design. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
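    The contrast with interval bounds can be sketched by Monte Carlo propagation of a parameter distribution through a classical response metric (the second-order overshoot formula is standard; the damping-ratio distribution is a hypothetical assumption, and this is not the paper's hybrid reliability method):

```python
import math
import random

def overshoot(zeta):
    """Percent overshoot (as a fraction) of a standard second-order step
    response with damping ratio zeta in (0, 1)."""
    return math.exp(-math.pi * zeta / math.sqrt(1.0 - zeta * zeta))

def mc_overshoot_stats(mean_zeta=0.5, sd_zeta=0.05, n=50_000, seed=3):
    """Propagate a normal damping-ratio uncertainty through the overshoot
    formula; interval analysis would report only the min/max response."""
    rng = random.Random(seed)
    samples = []
    while len(samples) < n:
        z = rng.gauss(mean_zeta, sd_zeta)
        if 0.0 < z < 1.0:               # keep physically valid damping ratios
            samples.append(overshoot(z))
    m = sum(samples) / n
    var = sum((s - m) ** 2 for s in samples) / (n - 1)
    return m, var

mean_mp, var_mp = mc_overshoot_stats()
```

    Unlike the interval-bound answer, the mean and variance (or the full empirical distribution of the samples) say how likely each level of overshoot actually is.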

  15. Probabilistic structural analysis methods for space transportation propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.

    1991-01-01

    Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.

  16. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.

  17. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  18. Detectability of Granger causality for subsampled continuous-time neurophysiological processes.

    PubMed

    Barnett, Lionel; Seth, Anil K

    2017-01-01

    Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist. We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities. We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability "black spots" and "sweet spots", and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data. Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling. 
On the basis of a very general model that captures the structure of neurophysiological processes, we are able to help identify confounds, and offer practical insights, for successful detection of causal connectivity from neurophysiological recordings. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. New Insights into Signed Path Coefficient Granger Causality Analysis

    PubMed Central

    Zhang, Jian; Li, Chong; Jiang, Tianzi

    2016-01-01

    Granger causality analysis, as a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of “signed path coefficient Granger causality,” a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among the time series by an order-1 autoregression, and defines a positive or negative coefficient as an “excitatory” or “inhibitory” influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which would inevitably lead to erroneous conclusions. Overall our findings suggest that the applicability of this kind of causality analysis is rather limited, and researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation. PMID:27833547

  20. Probabilistic Structural Analysis Methods for select space propulsion system components (PSAM). Volume 2: Literature surveys of critical Space Shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Rajagopal, K. R.

    1992-01-01

    The technical effort and computer code development is summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA is described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.

  1. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
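    In the simplest setting the FPI-style reliability computation reduces to a closed form; a hedged sketch for a linear limit state g = R - S with independent normal capacity R and demand S (for which the Hasofer-Lind first-order result is exact; the parameter values are hypothetical):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def reliability_index_linear(mu_r, sd_r, mu_s, sd_s):
    """Hasofer-Lind reliability index for g = R - S with independent normal
    R (capacity) and S (demand), and the corresponding failure probability."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    return beta, phi(-beta)

# Hypothetical component: capacity 500 +/- 40, demand 350 +/- 30 (same units).
beta, pf = reliability_index_linear(mu_r=500.0, sd_r=40.0, mu_s=350.0, sd_s=30.0)
```

    For nonlinear limit states or non-normal variables, FPI-type algorithms iterate transformations and linearizations (Rackwitz-Fiessler) to recover an equivalent index, which is what makes them so much cheaper than brute-force sampling.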

  2. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
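    The Markovian-arrival material in the lecture notes can be sketched in a few lines: exponential waiting times generate Poisson counts (a minimal illustration; the rate and horizon are arbitrary, not from the resource):

```python
import math
import random

def poisson_pmf(k, lam):
    """Probability of exactly k arrivals when the mean count is lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def simulate_arrival_count(rate, horizon, rng):
    """Count memoryless (Markovian) arrivals in [0, horizon): waiting times
    between arrivals are exponential with the given rate."""
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(rate)
        if t >= horizon:
            return count
        count += 1

rng = random.Random(0)
counts = [simulate_arrival_count(rate=2.0, horizon=1.0, rng=rng) for _ in range(20_000)]
frac_zero = counts.count(0) / len(counts)   # should approach poisson_pmf(0, 2.0) = e**-2
```

    The empirical count histogram converges to the Poisson pmf, which is the link between waiting times and count distributions the lecture emphasizes.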

  3. 75 FR 35457 - Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-22

    ... Causal Analysis/Diagnosis Decision Information System (CADDIS) AGENCY: Environmental Protection Agency... site, ``2010 release of the Causal Analysis/Diagnosis Decision Information System (CADDIS).'' The... analyses, downloadable software tools, and links to outside information sources. II. How to Submit Comments...

  4. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  5. Structure and Strength in Causal Induction

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2005-01-01

    We present a framework for the rational analysis of elemental causal induction--learning about the existence of a relationship between a single cause and effect--based upon causal graphical models. This framework makes precise the distinction between causal structure and causal strength: the difference between asking whether a causal relationship…

  6. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... 2010 version of the Causal Analysis/Diagnosis Decision Information System (CADDIS). This Web site was... methods; information on basic and advanced data analyses; downloadable software tools; and an online... ENVIRONMENTAL PROTECTION AGENCY [FRL-9206-7] 2010 Release of CADDIS (Causal Analysis/Diagnosis...

  7. Using Propensity Score Analysis for Making Causal Claims in Research Articles

    ERIC Educational Resources Information Center

    Bai, Haiyan

    2011-01-01

    The central role of the propensity score analysis (PSA) in observational studies is for causal inference; as such, PSA is often used for making causal claims in research articles. However, there are still some issues for researchers to consider when making claims of causality using PSA results. This summary first briefly reviews PSA, followed by…

  8. Causal Analysis After Haavelmo

    PubMed Central

    Heckman, James; Pinto, Rodrigo

    2014-01-01

    Haavelmo's seminal 1943 and 1944 papers are the first rigorous treatment of causality. In them, he distinguished the definition of causal parameters from their identification. He showed that causal parameters are defined using hypothetical models that assign variation to some of the inputs determining outcomes while holding all other inputs fixed. He thus formalized and made operational Marshall's (1890) ceteris paribus analysis. We embed Haavelmo's framework into the recursive framework of Directed Acyclic Graphs (DAGs) used in one influential recent approach to causality (Pearl, 2000) and in the related literature on Bayesian nets (Lauritzen, 1996). We compare the simplicity of an analysis of causality based on Haavelmo's methodology with the complex and nonintuitive approach used in the causal literature of DAGs—the “do-calculus” of Pearl (2009). We discuss the severe limitations of DAGs and in particular of the do-calculus of Pearl in securing identification of economic models. We extend our framework to consider models for simultaneous causality, a central contribution of Haavelmo. In general cases, DAGs cannot be used to analyze models for simultaneous causality, but Haavelmo's approach naturally generalizes to cover them. PMID:25729123

  9. Causal Analysis/Diagnosis Decision Information System (CADDIS)

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems to determine the cause of contamination.

  10. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  11. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  12. Commercialization of NESSUS: Status

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state of the art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.

  13. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
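    The probabilistic side of the fault tree comparison can be sketched with the standard gate formulas for independent basic events (the engine tree and failure probabilities below are hypothetical, not from the study; the fuzzy variant would replace these point probabilities with membership functions):

```python
def or_gate(probs):
    """Top-event probability under an OR gate: fails if any input fails."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """AND gate: fails only if all independent inputs fail."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical discrete-engine tree: the engine fails if the turbopump fails
# OR both redundant controllers fail.
p_turbopump = 1e-3
p_controller = 1e-2
p_engine = or_gate([p_turbopump, and_gate([p_controller, p_controller])])
```

    Propagating point probabilities up the tree captures randomness; it cannot capture vagueness in the inputs themselves, which is the gap the fuzzy logic analysis addresses.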

  14. The analysis of the possibility of using 10-minute rainfall series to determine the maximum rainfall amount with 5 minutes duration

    NASA Astrophysics Data System (ADS)

    Kaźmierczak, Bartosz; Wartalska, Katarzyna; Wdowikowski, Marcin; Kotowski, Andrzej

    2017-11-01

    Modern scientific research on heavy rainfall analysis for sewerage design indicates the need to develop and use probabilistic rain models. One unresolved issue is the shortest rainfall duration to be analyzed: it is commonly held that 5 minutes is best, while the shortest duration measured by national meteorological services is often 10 or even 15 minutes. The main aim of this paper is to present the differences between probabilistic rainfall models derived from rainfall time series that include or exclude the 5-minute rainfall duration. Analyses were carried out for the long period 1961-2010 at the Polish meteorological station Legnica. Four probability distributions were used to develop the probabilistic model best fitted to the rainfall measurements. The results clearly indicate that models including the 5-minute rainfall duration are more appropriate to use.
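    Fitting a candidate distribution to short-duration rainfall maxima can be sketched as follows (a hedged illustration: the Gumbel distribution, the method-of-moments fit, and the data are all assumptions for the sketch; the paper compares four distributions against real measurements):

```python
import math

def fit_gumbel(maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in maxima) / (n - 1))
    beta = sd * math.sqrt(6.0) / math.pi          # scale parameter
    mu = mean - 0.5772156649 * beta               # location (Euler-Mascheroni)
    return mu, beta

def gumbel_quantile(mu, beta, p):
    """Depth not exceeded with probability p (p=0.9 ~ the 10-year event)."""
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual-maximum 5-minute rainfall depths, in mm.
maxima = [12.1, 15.3, 9.8, 20.4, 14.2, 11.7, 18.9, 13.5, 16.0, 10.6]
mu, beta = fit_gumbel(maxima)
depth_10yr = gumbel_quantile(mu, beta, 0.9)
```

    Running the same fit on series with and without the 5-minute duration makes the paper's comparison concrete: the design quantiles shift when the shortest duration is dropped.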

  15. Time Alignment as a Necessary Step in the Analysis of Sleep Probabilistic Curves

    NASA Astrophysics Data System (ADS)

    Rošt'áková, Zuzana; Rosipal, Roman

    2018-02-01

    Sleep can be characterised as a dynamic process that has a finite set of sleep stages during the night. The standard Rechtschaffen and Kales sleep model produces a discrete representation of sleep and does not take into account its dynamic structure. In contrast, the continuous sleep representation provided by the probabilistic sleep model accounts for the dynamics of the sleep process. However, analysis of the sleep probabilistic curves is problematic when time misalignment is present. In this study, we highlight the necessity of curve synchronisation before further analysis. Original and time-aligned sleep probabilistic curves were transformed into a finite-dimensional vector space, and their ability to predict subjects' age or daily measures is evaluated. We conclude that curve alignment significantly improves the prediction of the daily measures, especially in the case of the S2-related sleep states or slow wave sleep.

  16. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

    For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated or at least not “anticonservative” using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
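    The well-calibratedness criterion can be illustrated with a simple binned reliability check, comparing the mean predicted probability with the observed event frequency in each bin (a generic sketch, not the evaluation measures developed in the paper):

    ```python
    import numpy as np

    def calibration_table(pred_prob, outcome, n_bins=5):
        """Bin predicted probabilities and compare each bin's mean
        prediction with the observed event frequency. For a
        well-calibrated classifier the two columns should match."""
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        idx = np.clip(np.digitize(pred_prob, bins) - 1, 0, n_bins - 1)
        rows = []
        for b in range(n_bins):
            mask = idx == b
            if mask.any():
                rows.append((pred_prob[mask].mean(),
                             outcome[mask].mean(),
                             int(mask.sum())))
        return rows

    # Simulated, perfectly calibrated predictions: outcomes are drawn
    # from the stated probabilities themselves.
    rng = np.random.default_rng(0)
    p = rng.uniform(size=10000)
    y = (rng.uniform(size=10000) < p).astype(float)
    for mean_p, freq, n in calibration_table(p, y):
        print(f"predicted {mean_p:.2f}  observed {freq:.2f}  n={n}")
    ```

    An anticonservative classifier would show observed frequencies systematically less extreme than its predictions in the outer bins.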

  17. Bounded-Degree Approximations of Stochastic Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Christopher J.; Pinar, Ali; Kiyavash, Negar

    2017-06-01

    We propose algorithms to approximate directed information graphs. Directed information graphs are probabilistic graphical models that depict causal dependencies between stochastic processes in a network. The proposed algorithms identify optimal and near-optimal approximations in terms of Kullback-Leibler divergence. The user-chosen sparsity trades off the quality of the approximation against visual conciseness and computational tractability. One class of approximations contains graphs with specified in-degrees. Another class additionally requires that the graph is connected. For both classes, we propose algorithms to identify the optimal approximations and also near-optimal approximations, using a novel relaxation of submodularity. We also propose algorithms to identify the r-best approximations among these classes, enabling robust decision making.
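    As a much-simplified analogue of a bounded in-degree approximation, the sketch below assigns each discrete process a single parent: the process whose past shares the most empirical mutual information with its present. This is only illustrative; it does not reproduce the paper's directed-information machinery or its optimality guarantees.

    ```python
    import numpy as np
    from itertools import product

    def mutual_info(x, y):
        """Empirical mutual information (nats) between two discrete arrays."""
        mi = 0.0
        for a, b in product(np.unique(x), np.unique(y)):
            pxy = np.mean((x == a) & (y == b))
            px, py = np.mean(x == a), np.mean(y == b)
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * py))
        return mi

    def best_single_parents(series):
        """For each process i, pick the parent j whose lagged values
        share the most information with i's present value -- an
        in-degree-1 approximation of the network."""
        k = len(series)
        parents = {}
        for i in range(k):
            scores = {j: mutual_info(series[j][:-1], series[i][1:])
                      for j in range(k) if j != i}
            parents[i] = max(scores, key=scores.get)
        return parents

    # Toy chain: process 1 copies process 0's past (with 5% bit flips),
    # process 2 copies process 1's past.
    rng = np.random.default_rng(1)
    x0 = rng.integers(0, 2, 5000)
    x1 = np.roll(x0, 1) ^ (rng.uniform(size=5000) < 0.05)
    x2 = np.roll(x1, 1) ^ (rng.uniform(size=5000) < 0.05)
    print(best_single_parents([x0, x1, x2]))
    ```

    On this chain the recovered parents of processes 1 and 2 are 0 and 1 respectively; the parent reported for the autonomous process 0 is arbitrary, since both candidate scores are near zero.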

  18. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of tolerances in the geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are expected to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables will be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  19. Quantifying time-varying cellular secretions with local linear models.

    PubMed

    Byers, Jeff M; Christodoulides, Joseph A; Delehanty, James B; Raghu, Deepa; Raphael, Marc P

    2017-07-01

    Extracellular protein concentrations and gradients initiate a wide range of cellular responses, such as cell motility, growth, proliferation and death. Understanding inter-cellular communication requires spatio-temporal knowledge of these secreted factors and their causal relationship with cell phenotype. Techniques which can detect cellular secretions in real time are becoming more common but generalizable data analysis methodologies which can quantify concentration from these measurements are still lacking. Here we introduce a probabilistic approach in which local-linear models and the law of mass action are applied to obtain time-varying secreted concentrations from affinity-based biosensor data. We first highlight the general features of this approach using simulated data which contains both static and time-varying concentration profiles. Next we apply the technique to determine concentration of secreted antibodies from 9E10 hybridoma cells as detected using nanoplasmonic biosensors. A broad range of time-dependent concentrations was observed: from steady-state secretions of 230 pM near the cell surface to large transients which reached as high as 56 nM over several minutes and then dissipated.
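    The inversion described above, from a binding signal to a time-varying concentration, can be sketched with local linear slope estimates and the mass-action rate law db/dt = k_on·C·(B_max − b) − k_off·b. The rate constants and the constant-concentration test case below are assumed for illustration; they are not the paper's sensor calibration.

    ```python
    import numpy as np

    # Assumed (hypothetical) binding constants for the sensor surface.
    K_ON, K_OFF, B_MAX = 1e5, 1e-3, 1.0   # 1/(M s), 1/s, normalized sites

    def concentration_from_binding(t, b, window=5):
        """Estimate analyte concentration C(t) from a binding signal b(t):
        a local linear fit gives db/dt, then the mass-action relation
        db/dt = K_ON*C*(B_MAX - b) - K_OFF*b is solved for C."""
        c = np.full(len(t), np.nan)
        h = window // 2
        for i in range(h, len(t) - h):
            sl = slice(i - h, i + h + 1)
            slope = np.polyfit(t[sl], b[sl], 1)[0]        # local db/dt
            c[i] = (slope + K_OFF * b[i]) / (K_ON * (B_MAX - b[i]))
        return c

    # Synthetic check: simulate binding at a constant 1 nM concentration,
    # then recover it from the signal alone.
    true_c = 1e-9
    t = np.linspace(0, 600, 601)
    b = np.zeros_like(t)
    for i in range(1, len(t)):
        db = K_ON * true_c * (B_MAX - b[i-1]) - K_OFF * b[i-1]
        b[i] = b[i-1] + db * (t[i] - t[i-1])
    est = concentration_from_binding(t, b)
    print(np.nanmedian(est))   # close to the true 1e-9
    ```

    With noisy sensor data the window width trades bias against variance, which is where a probabilistic treatment of the local-linear fit becomes valuable.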

  20. Analyzing system safety in lithium-ion grid energy storage

    NASA Astrophysics Data System (ADS)

    Rosewater, David; Williams, Adam

    2015-12-01

    As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known) thereby promoting a healthy skepticism of design assumptions. We conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.

  1. Implementation and reporting of causal mediation analysis in 2015: a systematic review in epidemiological studies.

    PubMed

    Liu, Shao-Hsien; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-07-20

    Causal mediation analysis is often used to understand the impact of variables along the causal pathway of an occurrence relation. How well studies apply and report the elements of causal mediation analysis remains unknown. We systematically reviewed epidemiological studies published in 2015 that employed causal mediation analysis to estimate direct and indirect effects of observed associations between an exposure and an outcome. We identified potential epidemiological studies through conducting a citation search within Web of Science and a keyword search within PubMed. Two reviewers independently screened studies for eligibility. For eligible studies, one reviewer performed data extraction, and a senior epidemiologist confirmed the extracted information. Empirical application and methodological details of the technique were extracted and summarized. Thirteen studies were eligible for data extraction. While the majority of studies reported and identified the effects of measures, most studies lacked sufficient details on the extent to which identifiability assumptions were satisfied. Although most studies addressed issues of unmeasured confounders either from empirical approaches or sensitivity analyses, the majority did not examine the potential bias arising from the measurement error of the mediator. Some studies allowed for exposure-mediator interaction, and only a few presented results from models both with and without interactions. Power calculations were scarce. Reporting of causal mediation analysis is varied and suboptimal. Given that the application of causal mediation analysis will likely continue to increase, developing standards of reporting of causal mediation analysis in epidemiological research would be prudent.

  2. 75 FR 18205 - Notice of Peer Review Meeting for the External Peer Review Drafts of Two Documents on Using...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Role of Risk Analysis in Decision-Making AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... documents entitled, ``Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision- Making... Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples'' and...

  3. ProbCD: enrichment analysis accounting for categorization uncertainty.

    PubMed

    Vêncio, Ricardo Z N; Shmulevich, Ilya

    2007-10-12

    As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. We developed an open-source R-based software package, ProbCD, for probabilistic categorical data analysis that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning the enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of the high-throughput experimental techniques and (ii) probabilistic gene annotation.
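    The expected contingency table described above can be sketched directly: with probabilistic category membership, each cell becomes a sum of membership probabilities rather than a hard count. The annotation probabilities below are hypothetical, and this is not ProbCD's actual interface.

    ```python
    import numpy as np

    def expected_contingency(p_category, selected):
        """Expected 2x2 enrichment table when category membership is
        probabilistic: gene i belongs to the category with probability
        p_category[i]; `selected` flags the gene list of interest."""
        p = np.asarray(p_category, float)
        s = np.asarray(selected, bool)
        n11 = p[s].sum()            # selected & in category (expected)
        n10 = (1 - p)[s].sum()      # selected & not in category
        n01 = p[~s].sum()           # not selected & in category
        n00 = (1 - p)[~s].sum()     # not selected & not in category
        return np.array([[n11, n10], [n01, n00]])

    # Hypothetical annotation probabilities for 6 genes; first 3 selected.
    p = [0.9, 0.8, 0.7, 0.2, 0.1, 0.1]
    sel = [True, True, True, False, False, False]
    print(expected_contingency(p, sel))
    ```

    With hard 0/1 probabilities this reduces to the ordinary contingency table used by the Fisher Exact Test, which is the sense in which the probabilistic table generalizes the classical one.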

  4. Study on localization of epileptic focus based on causality analysis

    NASA Astrophysics Data System (ADS)

    Shan, Shaojie; Li, Hanjun; Tang, Xiaoying

    2018-05-01

    In this paper, we consider that the ECoG signal contains abundant pathological information, which can be used to localize the epileptic focus 1-2 min before seizure onset. To validate this hypothesis, we cut the ECoG into three stages (before seizure, during seizure and after seizure) and applied the Granger causality, PSI causality and transfer entropy algorithms to the ECoG data at each stage. The results show a significant difference in the causality values of the epileptic focus area before, during and after seizure. The causality value of each channel increases during the seizure; after the seizure, the causality between channels shows a downward trend, although the difference is not obvious. These differences in causality provide a reliable technical method to assist the clinical diagnosis of the epileptic focus.
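    A minimal pairwise Granger-style computation of the kind applied to the channels can be sketched as follows. The data are synthetic rather than ECoG, and the lag order and log variance-ratio statistic are illustrative choices, not the paper's exact pipeline:

    ```python
    import numpy as np

    def granger_stat(x, y, lag=2):
        """Pairwise Granger statistic: log ratio of residual variances
        of y predicted from its own past alone vs. its own past plus
        x's past. Larger values mean x's past helps predict y."""
        rows_r, rows_f, target = [], [], []
        for t in range(lag, len(y)):
            own, cross = y[t - lag:t], x[t - lag:t]
            rows_r.append(np.r_[1.0, own])          # restricted model
            rows_f.append(np.r_[1.0, own, cross])   # full model
            target.append(y[t])
        A_r, A_f, b = np.array(rows_r), np.array(rows_f), np.array(target)
        res_r = b - A_r @ np.linalg.lstsq(A_r, b, rcond=None)[0]
        res_f = b - A_f @ np.linalg.lstsq(A_f, b, rcond=None)[0]
        return float(np.log(res_r.var() / res_f.var()))

    # Toy system: y is driven by past x; x is autonomous noise.
    rng = np.random.default_rng(2)
    x = rng.normal(size=2000)
    y = np.zeros(2000)
    for t in range(1, 2000):
        y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()
    print(granger_stat(x, y), granger_stat(y, x))  # first is much larger
    ```

    Comparing such statistics across the before/during/after seizure stages is the kind of contrast the study relies on; transfer entropy and PSI replace the linear regression with information-theoretic and spectral measures.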

  5. The Probability Heuristics Model of Syllogistic Reasoning.

    ERIC Educational Resources Information Center

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  6. An Instructional Module on Mokken Scale Analysis

    ERIC Educational Resources Information Center

    Wind, Stefanie A.

    2017-01-01

    Mokken scale analysis (MSA) is a probabilistic-nonparametric approach to item response theory (IRT) that can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. This instructional module provides an introduction to MSA as a probabilistic-nonparametric framework in which to explore…

  7. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  8. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems.This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  9. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  10. Natural selection. VII. History and interpretation of kin selection theory.

    PubMed

    Frank, S A

    2013-06-01

    Kin selection theory is a kind of causal analysis. The initial form of kin selection ascribed cause to costs, benefits and genetic relatedness. The theory then slowly developed a deeper and more sophisticated approach to partitioning the causes of social evolution. Controversy followed because causal analysis inevitably attracts opposing views. It is always possible to separate total effects into different component causes. Alternative causal schemes emphasize different aspects of a problem, reflecting the distinct goals, interests and biases of different perspectives. For example, group selection is a particular causal scheme with certain advantages and significant limitations. Ultimately, to use kin selection theory to analyse natural patterns and to understand the history of debates over different approaches, one must follow the underlying history of causal analysis. This article describes the history of kin selection theory, with emphasis on how the causal perspective improved through the study of key patterns of natural history, such as dispersal and sex ratio, and through a unified approach to demographic and social processes. Independent historical developments in the multivariate analysis of quantitative traits merged with the causal analysis of social evolution by kin selection. © 2013 The Author. Journal of Evolutionary Biology © 2013 European Society For Evolutionary Biology.

  11. EXPERIENCES WITH USING PROBABILISTIC EXPOSURE ANALYSIS METHODS IN THE U.S. EPA

    EPA Science Inventory

    Over the past decade various Offices and Programs within the U.S. EPA have either initiated or increased the development and application of probabilistic exposure analysis models. These models have been applied to a broad range of research or regulatory problems in EPA, such as e...

  12. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
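    A stripped-down Monte Carlo version of such a CDF computation can be sketched for the elastic Lamé radial stress only. The NESSUS run includes plasticity and the AMV procedure, neither of which is reproduced here, and the geometry and load statistics below are assumed for illustration:

    ```python
    import numpy as np

    def radial_stress(p, a=1.0, b=2.0, r=1.5):
        """Elastic Lame radial stress in a thick-walled cylinder under
        internal pressure p, evaluated at radius r (compressive, hence
        negative, for a < r < b). The paper's problem adds plasticity;
        this sketch stays elastic."""
        return p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

    # Random internal pressure (assumed normal, MPa); yield stress is
    # not needed in the purely elastic sketch.
    rng = np.random.default_rng(3)
    pressure = rng.normal(100.0, 10.0, 100_000)
    stress = radial_stress(pressure)

    # Empirical CDF of the radial stress at a few levels:
    for s in (-30.0, -26.0, -22.0):
        print(f"P(sigma_r <= {s}) = {np.mean(stress <= s):.3f}")
    ```

    The value of methods like AMV is that they approximate this CDF with a handful of finite element solutions instead of the many thousands a crude Monte Carlo loop would require.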

  13. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.

  14. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness in the experimental data. The material properties of Kevlar® 49 are modeled from the experimental data and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS runs of the model are performed to observe the failure patterns and exit velocities, and the solutions are compared with NASA experimental tests and with the deterministic results. MCS with probabilistic material data gives a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering design, where the main aim is to obtain optimal solutions. Even when a deterministic optimization yields cost-effective structures, they can be highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented in the solution process. A reliable and optimal solution can be obtained by performing reliability analysis alongside the deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to the structural performance constraints.
This part of the research starts with an introduction to reliability analysis, covering first-order and second-order reliability methods, followed by simulation techniques used to obtain the probability of failure and the reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation incorporating sensitivity analysis, which removes the highly reliable constraints from the RBDO, thereby reducing computational time and function evaluations. Finally, the reliability analysis concepts and RBDO are applied to 2D finite element truss problems and a planar beam problem, and the results are presented and discussed.
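For the simplest limit state, g = R − S with independent normal resistance R and load S, the first-order reliability index mentioned above is exact and easy to cross-check by simulation. The numbers below are assumed for illustration:

```python
import math
import numpy as np

def reliability_linear_normal(mu_r, sd_r, mu_s, sd_s):
    """First-order reliability for the linear limit state g = R - S
    with independent normal R (resistance) and S (load). For this
    case the index beta and pf = Phi(-beta) are exact, not merely
    first-order approximations."""
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # standard normal CDF at -beta
    return beta, pf

beta, pf = reliability_linear_normal(500.0, 50.0, 300.0, 40.0)
print(beta, pf)

# Cross-check with crude Monte Carlo:
rng = np.random.default_rng(4)
r = rng.normal(500.0, 50.0, 1_000_000)
s = rng.normal(300.0, 40.0, 1_000_000)
print(np.mean(r - s < 0.0))
```

For nonlinear limit states or non-normal variables the closed form disappears, which is where iterative FORM/SORM searches and the decoupled RBDO procedure described above come in.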

  15. Probabilistic interpretation of Peelle's pertinent puzzle and its resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Kenneth M.; Kawano, T.; Talou, P.

    2004-01-01

    Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.

  16. Probabilistic Interpretation of Peelle's Pertinent Puzzle and its Resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Kenneth M.; Kawano, Toshihiko; Talou, Patrick

    2005-05-24

    Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.

  17. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  18. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  19. Distributed collaborative probabilistic design of multi-failure structure with fluid-structure interaction using fuzzy neural network of regression

    NASA Astrophysics Data System (ADS)

    Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen

    2018-05-01

    To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (FR), called DCFRM, is proposed by integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind it is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method, considering fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode of the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. Comparison with other methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while maintaining acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby enriches the theory and method of mechanical reliability design.

  20. Probabilistic analysis of the efficiency of the damping devices against nuclear fuel container falling

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2017-07-01

    The paper presents the probabilistic and sensitivity analysis of the efficiency of the damping devices protecting the nuclear power plant cover against the drop of a TK C30 nuclear fuel container. A three-dimensional finite element idealization of the nuclear power plant structure is used. A steel pipe damper system is proposed to dissipate the kinetic energy of the container's free fall. Experimental results on the behaviour of the shock-damper's basic element under impact loads are presented. The Newmark integration method is used to solve the dynamic equations. The sensitivity and probabilistic analyses of the damping devices were carried out in the AntHILL and ANSYS software.
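    The Newmark integration step used for the dynamic equations can be sketched for a single degree of freedom (average-acceleration variant, β = 1/4, γ = 1/2); the paper's full finite element model of the cover and damper is not reproduced:

    ```python
    import numpy as np

    def newmark_sdof(m, c, k, f, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
        """Newmark (average-acceleration) time integration of a
        single-DOF system m*u'' + c*u' + k*u = f(t); a crude stand-in
        for a full finite element impact model."""
        n = len(f)
        u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
        u[0], v[0] = u0, v0
        a[0] = (f[0] - c * v[0] - k * u[0]) / m
        keff = m / (beta * dt**2) + gamma * c / (beta * dt) + k
        for i in range(n - 1):
            rhs = (f[i + 1]
                   + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                          + (1 / (2 * beta) - 1) * a[i])
                   + c * (gamma * u[i] / (beta * dt)
                          + (gamma / beta - 1) * v[i]
                          + dt * (gamma / (2 * beta) - 1) * a[i]))
            u[i + 1] = rhs / keff
            a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                        - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
            v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
        return u, v, a

    # Sanity check: undamped free vibration, released from rest at u = 1.
    # Natural period T = 1 s, so u should return to 1 after each period.
    m, k, dt = 1.0, 4 * np.pi**2, 0.001
    t = np.arange(0.0, 2.0, dt)
    u, v, a = newmark_sdof(m, 0.0, k, np.zeros_like(t), dt, u0=1.0)
    print(u[1000])  # displacement after one period, close to 1.0
    ```

    The average-acceleration scheme is unconditionally stable and introduces no numerical damping, which is why it is a common choice for impact problems of this kind.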

  1. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  2. Probabilistic safety assessment of the design of a tall building under extreme load

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk

    2016-06-08

    The paper describes experience from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394 and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The MONTE CARLO, LHS and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probabilistic structural design using the finite element method is demonstrated on an example probability analysis of the safety of tall buildings.

  3. Probabilistic safety assessment of the design of a tall building under extreme load

    NASA Astrophysics Data System (ADS)

    Králik, Juraj

    2016-06-01

    The paper describes experience from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394 and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The MONTE CARLO, LHS and RSM probabilistic methods are compared with the deterministic results. The effectiveness of probabilistic structural design using the finite element method is demonstrated on an example probability analysis of the safety of tall buildings.

  4. Observational Research Opportunities and Limitations

    PubMed Central

    Boyko, Edward J.

    2013-01-01

Medical research continues to progress in its ability to identify treatments and characteristics associated with benefits and adverse outcomes. The principal engine for the evaluation of treatment efficacy is the randomized controlled trial (RCT). Due to the cost and other considerations, RCTs cannot address all clinically important decisions. Observational research often is used to address issues not addressed or not addressable by RCTs. This article provides an overview of the benefits and limitations of observational research to serve as a guide to the interpretation of this category of research designs in diabetes investigations. The potential for bias is higher in observational research but there are design and analysis features that can address these concerns although not completely eliminate them. Pharmacoepidemiologic research may provide important information regarding relative safety and effectiveness of diabetes pharmaceuticals. Such research must effectively address the important issue of confounding by indication in order to produce clinically meaningful results. Other methods such as instrumental variable analysis are being employed to enable stronger causal inference but these methods also require fulfillment of several key assumptions that may or may not be realistic. Nearly all clinical decisions involve probabilistic reasoning and confronting uncertainty, so a realistic goal for observational research may not be the high standard set by RCTs but instead the level of certainty needed to influence a diagnostic or treatment decision. PMID:24055326

  5. Observational research--opportunities and limitations.

    PubMed

    Boyko, Edward J

    2013-01-01

Medical research continues to progress in its ability to identify treatments and characteristics associated with benefits and adverse outcomes. The principal engine for the evaluation of treatment efficacy is the randomized controlled trial (RCT). Due to the cost and other considerations, RCTs cannot address all clinically important decisions. Observational research often is used to address issues not addressed or not addressable by RCTs. This article provides an overview of the benefits and limitations of observational research to serve as a guide to the interpretation of this category of research designs in diabetes investigations. The potential for bias is higher in observational research but there are design and analysis features that can address these concerns although not completely eliminate them. Pharmacoepidemiologic research may provide important information regarding relative safety and effectiveness of diabetes pharmaceuticals. Such research must effectively address the important issue of confounding by indication in order to produce clinically meaningful results. Other methods such as instrumental variable analysis are being employed to enable stronger causal inference but these methods also require fulfillment of several key assumptions that may or may not be realistic. Nearly all clinical decisions involve probabilistic reasoning and confronting uncertainty, so a realistic goal for observational research may not be the high standard set by RCTs but instead the level of certainty needed to influence a diagnostic or treatment decision. © 2013.

  6. Probabilistic sensitivity analysis for decision trees with multiple branches: use of the Dirichlet distribution in a Bayesian framework.

    PubMed

    Briggs, Andrew H; Ades, A E; Price, Martin J

    2003-01-01

    In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes using probabilistic sensitivity analysis, and a method is required to provide probabilistic probabilities over multiple branches that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities should sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
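
The Dirichlet construction the authors recommend can be sketched in a few lines of standard-library Python. The transition counts below are hypothetical; a Dirichlet draw is generated by normalizing independent gamma variates (a standard construction), and adding a Dirichlet(1, ..., 1) prior to the counts handles the zero-count problem in the Bayesian way described in the abstract.

```python
import random

def sample_dirichlet(alphas, rng=random):
    """Draw one sample from a Dirichlet distribution by normalizing
    independent Gamma(alpha_i, 1) draws."""
    draws = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(draws)
    return [d / total for d in draws]

# Hypothetical transition counts out of one Markov state (3 branches).
# The zero count would break a naive frequency estimate; the uniform
# Dirichlet(1, 1, 1) prior keeps every branch probability positive.
counts = [45, 30, 0]
posterior = [c + 1 for c in counts]

random.seed(0)
probs = sample_dirichlet(posterior)
```

Each draw of `probs` is one plausible set of branching probabilities; repeating the draw inside a probabilistic sensitivity analysis guarantees that the mutually exclusive branch probabilities sum to 1 on every iteration.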

  7. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
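
The quantitative estimate described above can be sketched as follows. The failure modes and their frequencies are hypothetical (they are not taken from the paper), and the independence assumption across modes is ours:

```python
import math

# Hypothetical failure modes for an analytical procedure:
# (name, estimated relative frequency of occurrence,
#  estimated probability the failure is detected when it occurs).
# The categorical severity scoring of FMEA is kept separate and omitted here.
failure_modes = [
    ("wrong sample placement",       0.02, 0.90),
    ("outdated spectral library",    0.01, 0.50),
    ("operator transcription error", 0.05, 0.95),
]

def undetected_frequency(occurrence, detection):
    """Frequency with which a failure mode occurs AND escapes detection."""
    return occurrence * (1.0 - detection)

per_mode = {name: undetected_frequency(o, d) for name, o, d in failure_modes}

# Frequency of at least one undetected failure over the full procedure,
# assuming the failure modes are mutually independent.
p_any_undetected = 1.0 - math.prod(1.0 - f for f in per_mode.values())
```

Replacing the categorical occurrence/detection scores with relative frequencies is what makes this product meaningful; a categorical RPN cannot be aggregated over the whole procedure in this way.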

  8. Development of a probabilistic analysis methodology for structural reliability estimation

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.

    1991-01-01

The novel probabilistic analysis method presented for the assessment of structural reliability combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, it establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.

  9. The role of awareness campaigns in the improvement of separate collection rates of municipal waste among university students: A Causal Chain Approach.

    PubMed

    Saladié, Òscar; Santos-Lacueva, Raquel

    2016-02-01

One of the main objectives of municipal waste management policies is to improve separate collection, both quantitatively and qualitatively. Several factors influence people's recycling behavior and, consequently, play an important role in achieving the goals proposed in management policies. People can improve separate collection rates for a wide range of causes with different weights. Here, we have determined the uplift in the probability of improving separate collection of municipal waste created by awareness campaigns among 806 undergraduate students at Universitat Rovira i Virgili (Catalonia) by means of the Causal Chain Approach, a probabilistic method. 73.2% of respondents state having improved separate collection in recent years, and most of them (75.4%) remember some awareness campaign. The results show that the uplift in the probability of improving separate collection attributable to the awareness campaigns is 17.9%. These findings should be taken into account by policy makers in charge of municipal waste management. Nevertheless, it must be assumed that an awareness campaign alone will never be sufficient to achieve the objectives defined in municipal waste management programmes. Copyright © 2015 Elsevier Ltd. All rights reserved.
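
The chain-of-probabilities logic behind such an uplift figure can be sketched as a product of conditional probabilities. Only the first two figures below appear in the abstract; the final attribution step is hypothetical, back-calculated so the product reproduces the reported 17.9%, and should not be read as the paper's actual intermediate result:

```python
import math

# Causal-chain sketch: the uplift attributable to the campaigns is the
# product of conditional probabilities along the chain.
chain = [
    0.732,  # P(improved separate collection in recent years)      -- from abstract
    0.754,  # P(remembers some awareness campaign | improved)      -- from abstract
    0.324,  # P(credits the campaign for the change | remembers)   -- hypothetical
]
uplift = math.prod(chain)   # close to the reported 0.179
```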

  10. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error in accuracy.

  11. Causal Mediation Analysis: Warning! Assumptions Ahead

    ERIC Educational Resources Information Center

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  12. A General Approach to Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke; Keele, Luke; Tingley, Dustin

    2010-01-01

    Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…

  13. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  14. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  15. Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)

    NASA Astrophysics Data System (ADS)

    Rahmani, E.; Hense, A.

    2017-12-01

Are there deterministic relations between phenological and climate parameters? The answer is surely `No'. This answer motivated us to approach the problem through probabilistic theories. Thus, we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to a statistical method named survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered as an event; by event, in this research we mean the ripening date of wheat, and we will assume only one event in this special case. By time, we mean the growing duration from sowing to ripening as the lifetime of wheat, which is a function of GDD. To be more precise, we will try to perform a probabilistic forecast of wheat ripening. The probability value will change between 0 and 1. Here, the survivor function gives the probability that the not-yet-ripened wheat survives longer than a specific time, or will survive to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to GDD as a function of growth duration. Verification of the models obtained is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
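
The survivor function described above, a normal distribution fitted to accumulated GDD, can be sketched with the standard library. The station observations below are invented for illustration:

```python
import math
import statistics

# Hypothetical ripening observations at one station: total accumulated
# growing degree days (GDD, deg C day) at which wheat reached ripening.
ripening_gdd = [2100, 2180, 2230, 2150, 2260, 2205, 2120, 2190]

mu = statistics.mean(ripening_gdd)      # normal fit, as in the abstract
sigma = statistics.stdev(ripening_gdd)

def survivor(gdd):
    """P(wheat is not yet ripe when this much GDD has accumulated),
    i.e. 1 minus the CDF of the fitted normal distribution."""
    z = (gdd - mu) / sigma
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Probability the crop is still unripe once 2200 GDD have accumulated
p_unripe = survivor(2200.0)
```

The forecast is thus a full probability curve over GDD rather than a single predicted ripening date, which is exactly the extra uncertainty information the deterministic models lack.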

  16. Reflections on Heckman and Pinto’s Causal Analysis After Haavelmo

    DTIC Science & Technology

    2013-11-01

Reflections on Heckman and Pinto's "Causal Analysis After Haavelmo." Judea Pearl, University of California, Los Angeles, Computer Science Department.

  17. Application of a stochastic snowmelt model for probabilistic decisionmaking

    NASA Technical Reports Server (NTRS)

    Mccuen, R. H.

    1983-01-01

    A stochastic form of the snowmelt runoff model that can be used for probabilistic decision-making was developed. The use of probabilistic streamflow predictions instead of single valued deterministic predictions leads to greater accuracy in decisions. While the accuracy of the output function is important in decisionmaking, it is also important to understand the relative importance of the coefficients. Therefore, a sensitivity analysis was made for each of the coefficients.

  18. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic FEM analysis code.

  19. Influence analysis for high-dimensional time series with an application to epileptic seizure onset zone detection

    PubMed Central

    Flamm, Christoph; Graef, Andreas; Pirker, Susanne; Baumgartner, Christoph; Deistler, Manfred

    2013-01-01

    Granger causality is a useful concept for studying causal relations in networks. However, numerical problems occur when applying the corresponding methodology to high-dimensional time series showing co-movement, e.g. EEG recordings or economic data. In order to deal with these shortcomings, we propose a novel method for the causal analysis of such multivariate time series based on Granger causality and factor models. We present the theoretical background, successfully assess our methodology with the help of simulated data and show a potential application in EEG analysis of epileptic seizures. PMID:23354014
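
The core Granger test underlying the method, comparing prediction error with and without the past of the candidate cause, can be sketched on simulated bivariate data (the factor-model extension for high-dimensional EEG is beyond this sketch; series and coefficients below are synthetic):

```python
import random

def ols_sse(y, X):
    """Sum of squared residuals of y regressed on the columns of X
    (no intercept), solving the normal equations by Gaussian elimination."""
    k = len(X)
    A = [[sum(a * b for a, b in zip(X[i], X[j])) for j in range(k)] for i in range(k)]
    b = [sum(a * c for a, c in zip(X[i], y)) for i in range(k)]
    for i in range(k):                       # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0] * k                         # back substitution
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return sum((yi - sum(beta[j] * X[j][t] for j in range(k))) ** 2
               for t, yi in enumerate(y))

# Simulate two zero-mean series where x drives y with a one-step lag
random.seed(1)
n = 400
x, y = [random.gauss(0, 1)], [random.gauss(0, 1)]
for t in range(1, n):
    x.append(0.5 * x[t - 1] + random.gauss(0, 1))
    y.append(0.5 * y[t - 1] + 0.8 * x[t - 1] + random.gauss(0, 1))

y_now, y_lag, x_lag = y[1:], y[:-1], x[:-1]
sse_r = ols_sse(y_now, [y_lag])         # restricted: past of y only
sse_u = ols_sse(y_now, [y_lag, x_lag])  # unrestricted: plus past of x
# x "Granger-causes" y when its past materially reduces the prediction error
```

In practice the reduction is assessed with an F test rather than by inspection, and the co-movement problem the paper addresses arises when many such channels are tested jointly.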

  20. Incorporating psychological influences in probabilistic cost analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and that projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @RISK and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the scope and magnitude of the cost-overrun problem, the benefits are likely to be significant.
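
The MAIMS effect on a Monte Carlo cost roll-up can be sketched as follows. The tasks and Weibull parameters are hypothetical, uncorrelated two-parameter Weibulls stand in for the paper's correlated three-parameter fits, and the point is only the asymmetry: underruns are absorbed while overruns propagate.

```python
import random

random.seed(42)

# Hypothetical tasks: (allocated budget, Weibull shape k, Weibull scale lam).
tasks = [(100.0, 2.0, 110.0), (60.0, 1.5, 55.0), (80.0, 3.0, 85.0)]

def project_cost(tasks, rng):
    total = 0.0
    for budget, k, lam in tasks:
        cost = rng.weibullvariate(lam, k)   # weibullvariate(scale, shape)
        # MAIMS: money allocated is money spent -- a task underrun is
        # absorbed, while a task overrun is passed on to the project
        total += max(budget, cost)
    return total

samples = [project_cost(tasks, random) for _ in range(20000)]
mean_cost = sum(samples) / len(samples)
baseline = sum(b for b, _, _ in tasks)   # 240: sum of allocated budgets
```

Because each task contributes at least its budget, the expected project cost always exceeds the allocated baseline, which is why an "ideal" analysis that nets underruns against overruns underestimates cost.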

  1. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
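
A purely probabilistic version of such a propagation can be sketched by sampling the epistemic uncertainty on the basic-event probabilities and pushing each sample through the fault tree. The tree structure and intervals below are invented, and interval (uniform) sampling stands in for the possibilistic and hybrid representations discussed in the article:

```python
import random

random.seed(7)

# Hypothetical epistemic ranges for the basic-event (limiting relative
# frequency) probabilities, sampled uniformly for illustration.
intervals = {"A": (1e-3, 5e-3), "B": (1e-2, 3e-2), "C": (2e-3, 4e-3)}

def top_event(p):
    """Small illustrative fault tree: TOP = (A AND B) OR C,
    with independent basic events."""
    p_and = p["A"] * p["B"]
    return 1.0 - (1.0 - p_and) * (1.0 - p["C"])

# Propagate the epistemic uncertainty by Monte Carlo sampling
draws = sorted(
    top_event({e: random.uniform(lo, hi) for e, (lo, hi) in intervals.items()})
    for _ in range(10000)
)
p05, p95 = draws[500], draws[9500]  # 90% epistemic interval on the TOP probability
```

The output is a distribution over the top-event probability itself, the probabilistic counterpart of the possibility distribution on the top event that the hybrid framework produces.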

  2. CADDIS Volume 4. Data Analysis: Getting Started

    EPA Pesticide Factsheets

    Assembling data for an ecological causal analysis, matching biological and environmental samples in time and space, organizing data along conceptual causal pathways, data quality and quantity requirements, Data Analysis references.

  3. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  4. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  5. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, the causality models were found to be insufficiently established, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts using 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology were developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very little work has been done.

  6. Weighting-Based Sensitivity Analysis in Causal Mediation Studies

    ERIC Educational Resources Information Center

    Hong, Guanglei; Qin, Xu; Yang, Fan

    2018-01-01

    Through a sensitivity analysis, the analyst attempts to determine whether a conclusion of causal inference could be easily reversed by a plausible violation of an identification assumption. Analytic conclusions that are harder to alter by such a violation are expected to add a higher value to scientific knowledge about causality. This article…

  7. Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.

  8. Probabilistic evaluation of SSME structural components

    NASA Astrophysics Data System (ADS)

    Rajagopal, K. R.; Newell, J. F.; Ho, H.

    1991-05-01

    The application is described of Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have been typically used in the past in deterministic analysis studies.

  9. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
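
The first two program elements, propagating primitive-variable uncertainties to a response distribution and evaluating a failure probability, can be illustrated with a deliberately simple Monte Carlo sketch (all distributions and numbers below are invented, and a one-line stress model stands in for a probabilistic finite element analysis):

```python
import random

random.seed(11)

# Hypothetical primitive variables: axial load P ~ N(10 kN, 1 kN),
# cross-section area A ~ N(2 cm^2, 0.1 cm^2),
# material strength S ~ N(6 kN/cm^2, 0.4 kN/cm^2).
n = 100_000
fails = 0
stresses = []
for _ in range(n):
    P = random.gauss(10.0, 1.0)
    A = random.gauss(2.0, 0.1)
    S = random.gauss(6.0, 0.4)
    stress = P / A              # structural response variable
    stresses.append(stress)
    fails += stress > S         # failure event: demand exceeds capacity

pf = fails / n                  # estimated failure probability
stresses.sort()                 # sorted samples give the empirical response CDF
```

The sorted `stresses` list is the empirical cumulative distribution function of the response, which is exactly the form of output ("CDFs for structural response variables based on uncertainties in primitive variables") that the abstract describes.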

  10. [Causal analysis approaches in epidemiology].

    PubMed

    Dumas, O; Siroux, V; Le Moual, N; Varraso, R

    2014-02-01

    Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper presents an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem; they serve as qualitative support in the study of causal relationships. The sufficient-component cause model was developed to deal with the multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible sources of confounding in a study. Structural equation models, the main extension of path analysis, combine a system of equations with a path diagram representing a set of possible causal relationships. They make it possible to quantify direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes the role of time into account. The counterfactual approach defines causality by comparing the observed event with the counterfactual event (the event that would have been observed if, contrary to fact, the subject had received a different exposure from the one actually received). This theoretical approach has exposed the limits of traditional methods in addressing some causality questions. In particular, in longitudinal studies with time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though developed partly independently, rest on equivalent logical foundations.
    A crucial step in applying these models is the formulation of causal hypotheses, which underpins all subsequent methodological choices. Beyond this step, recently developed statistical analysis tools offer new possibilities for delineating complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
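Of the methods surveyed above, marginal structural models estimate causal effects by reweighting rather than by regression adjustment. The following is a minimal sketch of their core ingredient, inverse-probability-of-treatment weighting, at a single time point; all coefficients, distributions, and the assumption of a known treatment model are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Confounder L affects both treatment A and outcome Y.
L = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-(0.8 * L)))       # treatment probability depends on L
A = rng.binomial(1, p_treat)
Y = 2.0 * A + 1.5 * L + rng.normal(size=n)   # true causal effect of A is 2.0

# Naive group contrast is confounded by L.
naive = Y[A == 1].mean() - Y[A == 0].mean()

# Inverse-probability-of-treatment weights from the (here, known) treatment model.
w = A / p_treat + (1 - A) / (1 - p_treat)

# Weighted means estimate the marginal causal contrast E[Y(1)] - E[Y(0)].
ipw = (np.sum(w * Y * A) / np.sum(w * A)
       - np.sum(w * Y * (1 - A)) / np.sum(w * (1 - A)))

print(round(naive, 2), round(ipw, 2))
```

With time-varying treatment and confounding, the weights become products of such terms over time points, which is where marginal structural models depart from ordinary regression adjustment.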

  11. Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design

    NASA Technical Reports Server (NTRS)

    Kuguoglu, Latife; Ludwiczak, Damian

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) space system is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) was evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables for the structural responses of the radiator panel; the intent was to let the sensitivity information identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed, assessing the panel's structural performance in the presence of uncertainties in the loading, fabrication process variables, and material properties. A deterministic structural analysis at the mean values of the random variables was performed first, and its stress and displacement contours are presented. This was followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel's structural performance. Based on uncertainties in material properties, structural geometry, and loading, the results of the displacement and stress analysis were used as input for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as maximum displacement, maximum tensile and compressive stresses of the facesheet in the x and y directions, and maximum von Mises stresses of the tube, to the loading and design variables was determined under the boundary condition in which all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the sandwich panel were identified.

  12. What Do Patients Think about the Cause of Their Mental Disorder? A Qualitative and Quantitative Analysis of Causal Beliefs of Mental Disorder in Inpatients in Psychosomatic Rehabilitation.

    PubMed

    Magaard, Julia Luise; Schulz, Holger; Brütt, Anna Levke

    2017-01-01

    Patients' causal beliefs about their mental disorders are important for treatment because they affect illness-related behaviours. However, few studies have explored patients' causal beliefs about their mental disorder. The aims were: (a) to qualitatively explore patients' causal beliefs about their mental disorder, (b) to explore how frequently patients stated particular causal beliefs, and (c) to investigate differences in causal beliefs according to patients' primary diagnoses. Inpatients in psychosomatic rehabilitation were asked an open-ended question about their three most important causal beliefs about their mental illness. Answers were obtained from 678 patients, with primary diagnoses of depression (N = 341), adjustment disorder (N = 75), reaction to severe stress (N = 57), and anxiety disorders (N = 40). Two researchers inductively developed a category system and categorised the reported causal beliefs. The qualitative analysis was supplemented by logistic regression analyses. The causal beliefs were organized into twelve content-related categories. Causal beliefs referring to "problems at work" (47%) and "problems in the social environment" (46%) were mentioned most frequently; 35% of patients indicated causal beliefs related to "self/internal states". Patients with depression and patients with anxiety disorders stated similar causal beliefs, whereas patients with reactions to severe stress and adjustment disorders stated different causal beliefs in comparison to patients with depression. There was no opportunity for further exploration because written documents were analysed. These results add detailed insight into mentally ill patients' causal beliefs to the illness perception literature. Additionally, the evidence about differences in the frequencies of causal beliefs between illness groups complements previous findings. For future research it is important to clarify the relation between patients' causal beliefs and the chosen treatment.

  14. Paradoxical Behavior of Granger Causality

    NASA Astrophysics Data System (ADS)

    Witt, Annette; Battaglia, Demian; Gail, Alexander

    2013-03-01

    Granger causality is a standard tool for describing directed interactions among network components and is popular in many scientific fields, including econometrics, neuroscience, and climate science. For time series that can be modeled as bivariate autoregressive processes, we analytically derive an expression for spectrally decomposed Granger causality (SDGC) and show that this quantity depends on only two of the four groups of model parameters. We then present examples of such processes whose SDGC exhibits paradoxical behavior, in the sense that causality is high in frequency ranges with low spectral power. To avoid misinterpretation of Granger causality analyses, we propose complementing them with partial spectral analysis. Our findings are illustrated with an example from brain electrophysiology. Finally, we draw implications for the conventional definition of Granger causality. (Bernstein Center for Computational Neuroscience Goettingen)
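For readers unfamiliar with the time-domain quantity that SDGC decomposes, the following minimal sketch computes Granger causality as the log ratio of restricted to unrestricted residual variances for a bivariate AR(1) pair. The AR coefficients are illustrative and unrelated to the paper's analytical examples.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 20_000

# Bivariate AR(1): x drives y, but y does not drive x.
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()

def var_resid(target, lags):
    """Residual variance of an OLS fit of target[t] on the given lag-1 regressors."""
    X = np.column_stack([s[:-1] for s in lags])
    X = np.column_stack([np.ones(len(X)), X])
    b, *_ = np.linalg.lstsq(X, target[1:], rcond=None)
    r = target[1:] - X @ b
    return r.var()

# Time-domain Granger causality: log ratio of restricted vs full residual variance.
gc_x_to_y = np.log(var_resid(y, [y]) / var_resid(y, [y, x]))
gc_y_to_x = np.log(var_resid(x, [x]) / var_resid(x, [x, y]))

print(gc_x_to_y, gc_y_to_x)
```

In-sample, the restricted residual variance can never fall below the unrestricted one, so the estimate is non-negative by construction; the spurious direction merely shrinks toward zero as the series lengthens, which is one reason significance testing matters.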

  15. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast. The

  16. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  17. Application of Probabilistic Analysis to Aircraft Impact Dynamics

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.

    2003-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behavior, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantifying the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
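The response surface method named above replaces the expensive simulation with a cheap fitted surrogate before propagating uncertainty. In the minimal sketch below, a closed-form toy function stands in for the crash code; all coefficients, distributions, and the exceedance limit are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for an expensive crash simulation: peak response as a function
# of two uncertain inputs (say, impact velocity and material stiffness).
def simulation(v, k):
    return 10 + 3.0 * v - 2.0 * k + 0.5 * v * k

# 1) Run the "simulation" at a small design of experiments.
V, K = np.meshgrid(np.linspace(-2, 2, 5), np.linspace(-2, 2, 5))
v_s, k_s = V.ravel(), K.ravel()
r_s = simulation(v_s, k_s)

# 2) Fit a quadratic response surface by least squares.
X = np.column_stack([np.ones_like(v_s), v_s, k_s, v_s * k_s, v_s**2, k_s**2])
beta, *_ = np.linalg.lstsq(X, r_s, rcond=None)

# 3) Propagate input uncertainty through the cheap surrogate by Monte Carlo.
v_mc = rng.normal(0, 1, 200_000)
k_mc = rng.normal(0, 1, 200_000)
Xmc = np.column_stack([np.ones_like(v_mc), v_mc, k_mc, v_mc * k_mc, v_mc**2, k_mc**2])
r_mc = Xmc @ beta

p_exceed = (r_mc > 18).mean()   # probability the response exceeds a limit
print(p_exceed)
```

Here the toy function lies inside the quadratic model space, so the surrogate is exact; with a real crash code the surrogate's fitting error would itself need to be checked before trusting the tail probability.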

  18. Articles Published in Four School Psychology Journals from 2000 to 2005: An Analysis of Experimental/Intervention Research

    ERIC Educational Resources Information Center

    Bliss, Stacy L.; Skinner, Christopher H.; Hautau, Briana; Carroll, Erin E.

    2008-01-01

    Using an experimenter-developed system, articles from four school psychology journals for the years 2000-2005 (n = 929) were classified. Results showed that 40% of the articles were narrative, 29% correlational, 16% descriptive, 8% causal-experimental, 4% causal-comparative, and 2% were meta-analytic. Further analysis of the causal-experimental…

  19. Diagnostic causal reasoning with verbal information.

    PubMed

    Meder, Björn; Mayrhofer, Ralf

    2017-08-01

    In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework. Copyright © 2017 Elsevier Inc. All rights reserved.
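The core computation in such a sequential diagnostic task is repeated application of Bayes' rule over the candidate causes. The sketch below uses two hypothetical causes and three effects, with effects treated as conditionally independent given the cause; the prior and likelihood numbers are invented for illustration (they are not the study's numerical equivalents), and the temporal-weighting extension is omitted.

```python
import numpy as np

# Two candidate causes with prior probabilities.
prior = np.array([0.3, 0.7])          # P(cause_1), P(cause_2)

# Likelihood of each observable effect under each cause; verbal terms like
# "occasionally causes" would be mapped to such numbers.
# rows: causes, columns: effects (columns need not sum to 1)
lik = np.array([[0.8, 0.6, 0.1],
                [0.2, 0.3, 0.7]])

def update(posterior, effect):
    """One step of Bayes' rule after observing the given effect index."""
    p = posterior * lik[:, effect]
    return p / p.sum()

post = prior.copy()
for e in [0, 1, 0]:                   # evidence observed one piece at a time
    post = update(post, e)

print(post)
```

A primacy or recency effect of the kind the paper models could be introduced by raising each likelihood term to an evidence-position-dependent exponent before normalizing.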

  20. Probabilistic and structural reliability analysis of laminated composite structures based on the IPACS code

    NASA Technical Reports Server (NTRS)

    Sobel, Larry; Buttitta, Claudio; Suarez, James

    1993-01-01

    Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) of an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables on which the SCF depends, and of having accurate scatter values for these variables.

  1. CADDIS Volume 5. Causal Databases: Home page

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  2. CADDIS Volume 5. Causal Databases: Interactive Conceptual Diagrams (ICDs) User Guide

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  3. Analyzing system safety in lithium-ion grid energy storage

    DOE PAGES

    Rosewater, David; Williams, Adam

    2015-10-08

    As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known), thereby promoting a healthy skepticism of design assumptions. Lastly, we conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.

  4. Probabilistic finite elements for fracture and fatigue analysis

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. The performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.
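The stated performance function, fatigue life exceeding service life, can be illustrated with plain Monte Carlo; the paper itself couples PFEM with reliability methods precisely to avoid such brute-force sampling on an expensive model. The lognormal parameters below are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Illustrative distributions: fatigue life (capacity) and service life
# (demand), both in load cycles and both uncertain.
fatigue_life = rng.lognormal(mean=np.log(2e6), sigma=0.4, size=n)
service_life = rng.lognormal(mean=np.log(5e5), sigma=0.2, size=n)

# Limit-state (performance) function: failure when g < 0.
g = fatigue_life - service_life
p_failure = (g < 0).mean()

print(p_failure)
```

First- and second-order reliability methods (FORM/SORM) approximate this same probability from the geometry of the limit-state surface instead of sampling, which is what makes the PFEM-plus-reliability coupling tractable.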

  5. A Causality Analysis of the Link between Higher Education and Economic Development.

    ERIC Educational Resources Information Center

    De Meulemeester, Jean-Luc; Rochat, Denis

    1995-01-01

    Summarizes a study exploring the relationship between higher education and economic development, using cointegration and Granger-causality tests. Results show a significant causality from higher education efforts in Sweden, United Kingdom, Japan, and France. However, a similar causality link has not been found for Italy or Australia. (68…

  6. Pathway Analysis and the Search for Causal Mechanisms

    ERIC Educational Resources Information Center

    Weller, Nicholas; Barnes, Jeb

    2016-01-01

    The study of causal mechanisms interests scholars across the social sciences. Case studies can be a valuable tool in developing knowledge and hypotheses about how causal mechanisms function. The usefulness of case studies in the search for causal mechanisms depends on effective case selection, and there are few existing guidelines for selecting…

  7. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants, and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed as a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method over traditional PSHA are (1) its flexibility, allowing the use of different probabilistic models of earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) its mathematically consistent treatment of uncertainties, and (3) its explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
    The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.

  8. Diagram-based Analysis of Causal Systems (DACS): elucidating inter-relationships between determinants of acute lower respiratory infections among children in sub-Saharan Africa.

    PubMed

    Rehfuess, Eva A; Best, Nicky; Briggs, David J; Joffe, Mike

    2013-12-06

    Effective interventions require evidence on how individual causal pathways jointly determine disease. Based on the concept of systems epidemiology, this paper develops Diagram-based Analysis of Causal Systems (DACS) as an approach to analyzing complex systems, and applies it by examining the contributions of proximal and distal determinants of childhood acute lower respiratory infections (ALRI) in sub-Saharan Africa. Diagram-based Analysis of Causal Systems combines the use of causal diagrams with multiple routinely available data sources, using a variety of statistical techniques. In a step-by-step process, the causal diagram evolves from conceptual, based on a priori knowledge and assumptions, through operational, informed by data availability and subjected to empirical testing, to integrated, which synthesizes information from multiple datasets. In our application, we apply different regression techniques to Demographic and Health Survey (DHS) datasets for Benin, Ethiopia, Kenya and Namibia and a pooled World Health Survey (WHS) dataset for sixteen African countries. Explicit strategies are employed to make transparent the decisions about the inclusion/omission of arrows, the sign and strength of the relationships, and homogeneity/heterogeneity across settings. Findings about the current state of evidence on the complex web of socio-economic, environmental, behavioral and healthcare factors influencing childhood ALRI, based on DHS and WHS data, are summarized in an integrated causal diagram. Notably, solid fuel use is structured by socio-economic factors and increases the risk of childhood ALRI mortality. Diagram-based Analysis of Causal Systems is a means of organizing the current state of knowledge about a specific area of research, and a framework for integrating statistical analyses across a whole system.
    This partly a priori approach is explicit about the causal assumptions guiding the analysis and about researcher judgment, and wrong assumptions can be reversed following empirical testing. The approach is well suited to dealing with complex systems, particularly where data are scarce.

  10. Tutorial in Biostatistics: Instrumental Variable Methods for Causal Inference*

    PubMed Central

    Baiocchi, Michael; Cheng, Jing; Small, Dylan S.

    2014-01-01

    A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment and instead an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. PMID:24599889
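The instrumental-variable logic described above can be sketched on simulated data using the Wald ratio form of two-stage least squares for a single binary instrument; all structural coefficients below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# U is an unmeasured confounder; Z is a valid instrument: it affects the
# treatment, is independent of U, and touches Y only through the treatment.
U = rng.normal(size=n)
Z = rng.binomial(1, 0.5, size=n).astype(float)
treat = 0.7 * Z + 0.9 * U + rng.normal(size=n)
Y = 1.5 * treat + 1.2 * U + rng.normal(size=n)   # true causal effect: 1.5

# Naive OLS of Y on treatment is biased by the unmeasured confounder U.
ols = np.cov(Y, treat)[0, 1] / np.var(treat)

# Wald / two-stage least squares estimate: cov(Y, Z) / cov(treat, Z).
iv = np.cov(Y, Z)[0, 1] / np.cov(treat, Z)[0, 1]

print(round(ols, 2), round(iv, 2))
```

The IV estimate recovers the true effect because Z's association with Y flows entirely through the treatment, while OLS absorbs the U-driven confounding; with a weak instrument (small cov(treat, Z)) the ratio becomes unstable, which is why instrument strength is checked in practice.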

  11. CADDIS Volume 5. Causal Databases: Interactive Conceptual Diagrams (ICDs) Quick Start Instructions

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  12. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This is the final report covering all the work done on this project. The goal of the project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.

  13. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades, and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  14. Probabilistic structural analysis of space propulsion system LOX post

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.

    1990-01-01

    The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.

  15. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
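A sketch of what such a probabilistic cycle evaluation can look like, using a textbook Brayton-cycle efficiency model with component efficiencies; the input distributions and the use of input-output correlations as sensitivity factors are illustrative simplifications, not the study's method.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
gamma = 1.4
T1 = 288.0                                   # compressor inlet temperature, K

# Uncertain performance parameters (all distributions illustrative).
T3 = rng.normal(1400.0, 30.0, n)             # turbine inlet temperature, K
rp = rng.normal(14.0, 0.5, n)                # pressure ratio
eta_c = rng.normal(0.85, 0.01, n)            # compressor isentropic efficiency
eta_t = rng.normal(0.88, 0.01, n)            # turbine isentropic efficiency

# Simple Brayton-cycle model of overall thermal efficiency.
c = rp ** ((gamma - 1.0) / gamma)
w_c = T1 * (c - 1.0) / eta_c                 # compressor work per unit cp
w_t = eta_t * T3 * (1.0 - 1.0 / c)           # turbine work per unit cp
q_in = T3 - T1 * (1.0 + (c - 1.0) / eta_c)   # heat added per unit cp
eta = (w_t - w_c) / q_in                     # overall thermal efficiency

# Sensitivity factors: correlation of each random input with the output.
inputs = {"T3": T3, "rp": rp, "eta_c": eta_c, "eta_t": eta_t}
sens = {k: float(np.corrcoef(v, eta)[0, 1]) for k, v in inputs.items()}

print(float(np.mean(eta)), sens)
```

Sorting `eta` gives an empirical cumulative distribution function of the efficiency, and the `sens` entries rank which measured parameters most constrain the health estimate under this model.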

  16. Causality Analysis of Neural Connectivity: Critical Examination of Existing Methods and Advances of New Methods

    PubMed Central

    Hu, Sanqing; Dai, Guojun; Worrell, Gregory A.; Dai, Qionghai; Liang, Hualou

    2012-01-01

    Granger causality (GC) is one of the most popular measures to reveal causality influence of time series and has been widely applied in economics and neuroscience. Especially, its counterpart in frequency domain, spectral GC, as well as other Granger-like causality measures have recently been applied to study causal interactions between brain areas in different frequency ranges during cognitive and perceptual tasks. In this paper, we show that: 1) GC in time domain cannot correctly determine how strongly one time series influences the other when there is directional causality between two time series, and 2) spectral GC and other Granger-like causality measures have inherent shortcomings and/or limitations because of the use of the transfer function (or its inverse matrix) and partial information of the linear regression model. On the other hand, we propose two novel causality measures (in time and frequency domains) for the linear regression model, called new causality and new spectral causality, respectively, which are more reasonable and understandable than GC or Granger-like measures. Especially, from one simple example, we point out that, in time domain, both new causality and GC adopt the concept of proportion, but they are defined on two different equations where one equation (for GC) is only part of the other (for new causality), thus the new causality is a natural extension of GC and has a sound conceptual/theoretical basis, and GC is not the desired causal influence at all. By several examples, we confirm that new causality measures have distinct advantages over GC or Granger-like measures. 
Finally, we conduct event-related potential causality analysis for a subject with intracranial depth electrodes undergoing evaluation for epilepsy surgery, and show that, in the frequency domain, all measures reveal significant directional event-related causality, but the result from new spectral causality is consistent with event-related time–frequency power spectrum activity. The spectral GC as well as other Granger-like measures are shown to generate misleading results. The proposed new causality measures may have wide potential applications in economics and neuroscience. PMID:21511564
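For readers unfamiliar with the baseline measure being critiqued: time-domain GC compares the prediction error of a restricted autoregression (the target series' own past) against a full autoregression that also includes the source series' past. A minimal self-contained sketch (the simulated data and AR order are illustrative, and this is standard GC, not the authors' "new causality"):

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Time-domain Granger causality of x on y with AR order p.

    GC = ln(var(restricted residuals) / var(full residuals)):
    how much the past of x improves prediction of y beyond y's own past.
    """
    n = len(y)
    Y = y[p:]
    # Lagged design matrices: columns are y_{t-1..t-p} (and x_{t-1..t-p})
    restricted = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    full = np.column_stack(
        [y[p - k - 1:n - k - 1] for k in range(p)]
        + [x[p - k - 1:n - k - 1] for k in range(p)]
    )
    res_r = Y - restricted @ np.linalg.lstsq(restricted, Y, rcond=None)[0]
    res_f = Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]
    return np.log(np.var(res_r) / np.var(res_f))

# Simulate a unidirectional influence: x drives y, not vice versa
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_causality(x, y))  # large: x Granger-causes y
print(granger_causality(y, x))  # near zero: no influence of y on x
```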

  17. Loop Analysis of Causal Feedback in Epidemiology: An Illustration Relating To Urban Neighborhoods and Resident Depressive Experiences

    PubMed Central

    2008-01-01

    The causal feedback implied by urban neighborhood conditions that shape human health experiences, which in turn shape neighborhood conditions through a complex causal web, poses a challenge for traditional epidemiological causal analyses. This article introduces the loop analysis method and builds on a core loop model linking neighborhood property vacancy rate, resident depressive symptoms, rate of neighborhood death, and rate of neighborhood exit in a feedback network. I justify and apply loop analysis to the specific example of depressive symptoms and abandoned urban residential property to show how inquiries into the behavior of causal systems can answer different kinds of hypotheses, and thereby complement those of causal modeling using statistical models. Neighborhood physical conditions that are only indirectly influenced by depressive symptoms may nevertheless manifest in the mental health experiences of their residents; conversely, neighborhood physical conditions may be a significant mental health risk for the population of neighborhood residents. I find that participatory greenspace programs are likely to produce adaptive responses in depressive symptoms and different neighborhood conditions, which are different in character from those of non-participatory greenspace interventions. PMID:17706851

  18. Comparison of causality analysis on simultaneously measured fMRI and NIRS signals during motor tasks.

    PubMed

    Anwar, Abdul Rauf; Muthalib, Makii; Perrey, Stephane; Galka, Andreas; Granert, Oliver; Wolff, Stephan; Deuschl, Guenther; Raethjen, Jan; Heute, Ulrich; Muthuraman, Muthuraman

    2013-01-01

    Brain activity can be measured using different modalities. Since most of the modalities tend to complement each other, it seems promising to measure them simultaneously. In the present research, data recorded simultaneously from Functional Magnetic Resonance Imaging (fMRI) and Near Infrared Spectroscopy (NIRS) are subjected to causality analysis using time-resolved partial directed coherence (tPDC). Time-resolved partial directed coherence uses the principle of state space modelling to estimate Multivariate Autoregressive (MVAR) coefficients. This method is useful for visualizing both the frequency and time dynamics of causality between time series. Afterwards, causality results from the different modalities are compared by estimating the Spearman correlation. In the present study, we used directionality vectors to analyze correlation, rather than the actual signal vectors. Results show that causality analysis of the fMRI correlates more closely with the causality results of oxy-NIRS than with those of deoxy-NIRS in the case of a finger sequencing task. However, in the case of simple finger tapping, no clear difference between the fMRI-oxy-NIRS and fMRI-deoxy-NIRS correlations is identified.

  19. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
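The Monte-Carlo flavor of such hazard/fragility evaluations can be illustrated in a few lines; the load and capacity distributions below are invented placeholders, not the paper's wind-field or missile models:

```python
import numpy as np

# Minimal Monte Carlo hazard/fragility sketch (illustrative parameters):
# sample a lognormal wind load and a normally distributed structural
# capacity, then estimate the probability that load exceeds capacity.
rng = np.random.default_rng(42)
n = 100_000
wind_speed = rng.lognormal(mean=np.log(40.0), sigma=0.35, size=n)  # m/s
capacity   = rng.normal(loc=70.0, scale=8.0, size=n)               # m/s equivalent
p_fail = np.mean(wind_speed > capacity)
print(f"estimated failure probability: {p_fail:.4f}")
```

The full simulation procedure described above replaces this one-line load model with wind-field modeling, missile injection, flight-equation solution, and impact analysis, but the load-versus-capacity sampling logic is the same.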

  20. Costing the satellite power system

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.

    1978-01-01

    The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, costing estimating relationships, grass roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
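The risk-analysis approach the abstract contrasts with deterministic costing can be sketched as follows; the cost components and triangular (low, mode, high) ranges are hypothetical, not the study's estimates:

```python
import numpy as np

# Probabilistic costing by Monte Carlo (illustrative numbers): each
# cost element gets a triangular distribution, and the simulation
# yields a distribution of total cost instead of a single point estimate.
rng = np.random.default_rng(7)
n = 50_000
components = {                      # $B: (low, mode, high), hypothetical
    "launch":    (2.0, 3.0, 5.0),
    "satellite": (4.0, 5.5, 9.0),
    "rectenna":  (1.5, 2.0, 3.5),
}
total = sum(rng.triangular(*params, size=n) for params in components.values())
print(f"mean total: {total.mean():.2f}, "
      f"90th percentile: {np.percentile(total, 90):.2f}")
```

Decision makers can then work with percentiles of the total-cost distribution rather than a single deterministic figure, which is the point of the risk-analysis procedure described above.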

  1. 76 FR 28102 - Notice of Issuance of Regulatory Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-13

    ..., Probabilistic Risk Assessment Branch, Division of Risk Analysis, Office of Nuclear Regulatory Research, U.S... approaches and methods (whether quantitative or qualitative, deterministic or probabilistic), data, and... uses in evaluating specific problems or postulated accidents, and data that the staff needs in its...

  2. Probability from a Socio-Cultural Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  3. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
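As a concrete illustration of PEA's probabilistic treatment of interindividual variability, a generic screening-level dose equation can be propagated by Monte Carlo; the concentration, intake, and body-weight distributions below are made up for illustration:

```python
import numpy as np

# Generic screening-level exposure calculation (hypothetical values):
# average daily dose = C * IR / BW, with interindividual variability
# in intake rate and body weight propagated by Monte Carlo.
rng = np.random.default_rng(11)
n = 100_000
conc   = 0.5                                     # mg/L, fixed concentration
intake = rng.lognormal(np.log(1.4), 0.3, n)      # L/day, intake rate
bw     = rng.normal(70.0, 12.0, n).clip(min=30.0)  # kg, body weight
dose   = conc * intake / bw                      # mg/kg-day
print(f"median dose: {np.median(dose):.4f} mg/kg-day, "
      f"95th percentile: {np.percentile(dose, 95):.4f}")
```

Reporting the upper percentiles alongside the median is what distinguishes this probabilistic treatment from a single deterministic point estimate of exposure.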

  4. Data analysis using scale-space filtering and Bayesian probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter

    1991-01-01

    This paper describes a program for analysis of output curves from Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from a DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the mineral in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organizations heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlation to infer qualitative features outperforms a domain-independent algorithm that does not.
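The final classification step, Bayesian inference over minerals given probabilistic features, reduces to a Bayes-rule update. A minimal sketch with invented priors and likelihoods (the actual system uses a Bayes tree over many features):

```python
# Bayes-rule update over candidate minerals given one extracted feature.
# All priors and likelihoods below are hypothetical illustration values.
priors = {"kaolinite": 0.4, "quartz": 0.35, "gibbsite": 0.25}
# P(observed curve feature | mineral) -- hypothetical values
likelihood = {"kaolinite": 0.9, "quartz": 0.2, "gibbsite": 0.1}

unnorm = {m: priors[m] * likelihood[m] for m in priors}
z = sum(unnorm.values())
posterior = {m: p / z for m, p in unnorm.items()}
print(posterior)  # posterior mass shifts toward kaolinite
```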

  5. Southeast regional fatal study : a causal chain analysis in North Carolina

    DOT National Transportation Integrated Search

    2002-01-01

    This project completed the North Carolina portion of the pooled fund Southeast Fatal Analysis Project. In addition : to completing this portion of the causal chain analysis for NC Crashes, the project developed a ranked : comprehensive list of candid...

  6. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  7. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562

  8. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance.

    PubMed

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4.
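The conjunction-fallacy tasks used in this study rest on a basic probability rule: a conjunction can never be more probable than either of its constituents. A short demonstration with arbitrary numbers:

```python
# Conjunction rule: P(A and B) = P(A) * P(B|A) <= P(A).
# Rating the conjunction as MORE likely than a constituent event is
# the conjunction fallacy these reasoning tasks are designed to elicit.
p_a = 0.30          # P(A), arbitrary illustrative value
p_b_given_a = 0.60  # P(B | A), arbitrary illustrative value
p_ab = p_a * p_b_given_a
print(round(p_ab, 2))  # 0.18, necessarily <= p_a
```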

  9. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude-correlated duration weighting factor, and the nonlinear shear mass participation factor used are also discussed.

  10. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.

  11. Asymptotic approximation method of force reconstruction: Application and analysis of stationary random forces

    NASA Astrophysics Data System (ADS)

    Sanchez, J.

    2018-06-01

    This paper presents the application and analysis of the asymptotic approximation method of force reconstruction for single degree-of-freedom systems. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.

  12. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    NASA Technical Reports Server (NTRS)

    Abdi, Frank

    1996-01-01

    A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in the development were: (1) utilization of the power of parallel processing and static/dynamic load-balancing optimization to make the complex simulation of the structure, material, and processing of high temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics, and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, increasing convergence rates through high- and low-level processor assignment; (4) creation of a framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed workstation types of computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.

  13. Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.

    PubMed

    Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony

    2005-04-01

    Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
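The kind of probabilistic sensitivity analysis the guidance requires can be sketched minimally: sample the uncertain inputs of a cost-effectiveness model and report the probability of cost-effectiveness at a willingness-to-pay threshold. All numbers below are illustrative, not from any NICE submission:

```python
import numpy as np

# Minimal probabilistic sensitivity analysis sketch (hypothetical
# distributions): sample incremental cost and QALY differences and
# report the probability the new technology is cost-effective at a
# given willingness-to-pay threshold.
rng = np.random.default_rng(5)
n = 10_000
d_cost = rng.normal(3000.0, 1200.0, n)   # incremental cost, GBP
d_qaly = rng.normal(0.20, 0.08, n)       # incremental QALYs
wtp = 20_000.0                           # GBP-per-QALY threshold
inb = wtp * d_qaly - d_cost              # incremental net benefit
print(f"P(cost-effective at 20k/QALY): {np.mean(inb > 0):.2f}")
```

The fraction of simulations with positive incremental net benefit, plotted across a range of thresholds, gives the cost-effectiveness acceptability curve that characterizes decision uncertainty.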

  14. Prediction of Human Activity by Discovering Temporal Sequence Patterns.

    PubMed

    Li, Kang; Fu, Yun

    2014-08-01

    Early prediction of ongoing human activity has become more valuable in a large variety of time-critical applications. To build an effective representation for prediction, human activities can be characterized by a complex temporal composition of constituent simple actions and interacting objects. Different from early detection on short-duration simple actions, we propose a novel framework for long-duration complex activity prediction by discovering three key aspects of activity: Causality, Context-cue, and Predictability. The major contributions of our work include: (1) a general framework is proposed to systematically address the problem of complex activity prediction by mining temporal sequence patterns; (2) probabilistic suffix tree (PST) is introduced to model causal relationships between constituent actions, where both large and small order Markov dependencies between action units are captured; (3) the context-cue, especially interactive objects information, is modeled through sequential pattern mining (SPM), where a series of action and object co-occurrence are encoded as a complex symbolic sequence; (4) we also present a predictive accumulative function (PAF) to depict the predictability of each kind of activity. The effectiveness of our approach is evaluated on two experimental scenarios with two data sets for each: action-only prediction and context-aware prediction. Our method achieves superior performance for predicting global activity classes and local action units.

  15. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Mustafa Sacit; none,; Flanagan, George F.

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  16. Causal inference in survival analysis using pseudo-observations.

    PubMed

    Andersen, Per K; Syriopoulou, Elisavet; Parner, Erik T

    2017-07-30

    Causal inference for non-censored response variables, such as binary or quantitative outcomes, is often based on either (1) direct standardization ('G-formula') or (2) inverse probability of treatment assignment weights ('propensity score'). To do causal inference in survival analysis, one needs to address right-censoring, and often, special techniques are required for that purpose. We will show how censoring can be dealt with 'once and for all' by means of so-called pseudo-observations when doing causal inference in survival analysis. The pseudo-observations can be used as a replacement of the outcomes without censoring when applying 'standard' causal inference methods, such as (1) or (2) earlier. We study this idea for estimating the average causal effect of a binary treatment on the survival probability, the restricted mean lifetime, and the cumulative incidence in a competing risks situation. The methods will be illustrated in a small simulation study and via a study of patients with acute myeloid leukemia who received either myeloablative or non-myeloablative conditioning before allogeneic hematopoetic cell transplantation. We will estimate the average causal effect of the conditioning regime on outcomes such as the 3-year overall survival probability and the 3-year risk of chronic graft-versus-host disease. Copyright © 2017 John Wiley & Sons, Ltd.
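Method (2) in the abstract, inverse probability of treatment weighting, can be sketched on simulated uncensored data (the censoring and pseudo-observation machinery of the paper is omitted; the data-generating values and the hand-rolled logistic fit are illustrative):

```python
import numpy as np

# Inverse-probability-of-treatment weighting on synthetic data.
rng = np.random.default_rng(1)
n = 5000
x = rng.standard_normal(n)                   # confounder
a = rng.binomial(1, 1 / (1 + np.exp(-x)))    # treatment depends on x
y = 2.0 * a + x + rng.standard_normal(n)     # outcome; true causal effect = 2

# Fit the propensity score P(A=1|X) by logistic regression
# (plain gradient ascent, no external libraries)
X = np.column_stack([np.ones(n), x])
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.2 * X.T @ (a - p) / n
ps = 1 / (1 + np.exp(-X @ w))

# Normalized (Hajek) IPW estimate of the average causal effect
mu1 = np.sum(a * y / ps) / np.sum(a / ps)
mu0 = np.sum((1 - a) * y / (1 - ps)) / np.sum((1 - a) / (1 - ps))
print(f"IPW estimate of the average causal effect: {mu1 - mu0:.2f}")
```

The normalized weights are used here for numerical stability; a naive difference in means would be biased because the confounder drives both treatment and outcome.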

  17. Uncertainties in evaluation of hazard and seismic risk

    NASA Astrophysics Data System (ADS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru; Ortanza Cioflan, Carmen; Manea, Elena-Florinela

    2015-04-01

    Two methods are commonly used for seismic hazard assessment: probabilistic (PSHA) and deterministic (DSHA) seismic hazard analysis. Selection of a ground motion for engineering design requires a clear understanding of seismic hazard and risk among stakeholders, seismologists and engineers. What is wrong with traditional PSHA or DSHA? PSHA as commonly used in engineering relies on four assumptions developed by Cornell in 1968: (1) a constant-in-time average occurrence rate of earthquakes; (2) a single point source; (3) independent variability of ground motion at a site; and (4) Poisson (or "memory-less") behavior of earthquake occurrences. It is a probabilistic method, and "when the causality dies, its place is taken by probability, a prestigious term meant to define our inability to predict the course of nature" (Niels Bohr). The DSHA method was used for the original design of Fukushima Daiichi, but the Japanese authorities moved to probabilistic assessment methods, and the probability of exceeding the design-basis acceleration was expected to be 10^-4 to 10^-6. It was exceeded, in violation of the principles of deterministic hazard analysis (ignoring historical events) (Klügel, J.U., EGU, 2014, ISSO). PSHA was developed from mathematical statistics and is not based on earthquake science (invalid physical models: point source and Poisson distribution; invalid mathematics; misinterpretation of annual probability of exceedance or return period; etc.), and has become a pure numerical "creation" (Wang, PAGEOPH, 168 (2011), 11-25). A key component of seismic hazard assessment, in both PSHA and DSHA, is the ground motion attenuation relationship, or so-called ground motion prediction equation (GMPE), which describes a relationship between a ground motion parameter (e.g., PGA, MMI), earthquake magnitude M, source-to-site distance R, and an uncertainty term. So far, no one is taking into consideration the strong nonlinear behavior of soils during strong earthquakes. 
But how many cities, villages, and metropolitan areas in seismic regions are constructed on rock? Most of them are located on soil deposits. A soil is of basic type sand or gravel (termed coarse soils), or silt or clay (termed fine soils), etc. The effect of nonlinearity is very large. For example, if we maintain the same spectral amplification factor (SAF = 5.8942) as for the relatively strong earthquake of May 3, 1990 (MW = 6.4), then at the Bacău seismic station, for the Vrancea earthquake of May 30, 1990 (MW = 6.9), the peak acceleration should be a*max = 0.154 g, whereas the actual recorded value was only amax = 0.135 g (-14.16%). Likewise, for the Vrancea earthquake of August 30, 1986 (MW = 7.1), the peak acceleration should be a*max = 0.107 g instead of the recorded value of 0.0736 g (-45.57%). There are comparable data for more than 60 seismic stations. There is a strong nonlinear dependence of the SAF on earthquake magnitude at each site. The authors propose an alternative approach called "real spectral amplification factors" instead of GMPEs for the whole extra-Carpathian area, where all cities and villages are located on soil deposits. Key words: Probabilistic Seismic Hazard; Uncertainties; Nonlinear seismology; Spectral amplification factors (SAF).

  18. Causal inference in economics and marketing.

    PubMed

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual-a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.

  19. Causal inference in economics and marketing

    PubMed Central

    Varian, Hal R.

    2016-01-01

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual—a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference. PMID:27382144

  20. New Evidence on Causal Relationship between Approximate Number System (ANS) Acuity and Arithmetic Ability in Elementary-School Students: A Longitudinal Cross-Lagged Analysis.

    PubMed

    He, Yunfeng; Zhou, Xinlin; Shi, Dexin; Song, Hairong; Zhang, Hui; Shi, Jiannong

    2016-01-01

    Approximate number system (ANS) acuity and mathematical ability have been found to be closely associated in recent studies. However, whether and how these two measures are causally related remains less well addressed. There are two hypotheses about the possible causal relationship: ANS acuity influences mathematical performance, or access to math education sharpens ANS acuity. Evidence in support of both hypotheses has been reported, but the two hypotheses have never been tested simultaneously. It therefore remains an open question whether the association reflects a one-directional or a reciprocal causal relationship. In this work, we provide new evidence on the causal relationship between ANS acuity and arithmetic ability. ANS acuity and mathematical ability of elementary-school students were measured sequentially at three time points within one year, and all possible causal directions were evaluated simultaneously using cross-lagged regression analysis. The results show that ANS acuity influences later arithmetic ability, while the reverse causal direction was not supported. Our finding adds strong evidence to the causal association between ANS acuity and mathematical ability, and also has important implications for educational interventions designed to train ANS acuity and thereby promote mathematical ability.

  1. New Evidence on Causal Relationship between Approximate Number System (ANS) Acuity and Arithmetic Ability in Elementary-School Students: A Longitudinal Cross-Lagged Analysis

    PubMed Central

    He, Yunfeng; Zhou, Xinlin; Shi, Dexin; Song, Hairong; Zhang, Hui; Shi, Jiannong

    2016-01-01

    Approximate number system (ANS) acuity and mathematical ability have been found to be closely associated in recent studies. However, whether and how these two measures are causally related remains less well addressed. There are two hypotheses about the possible causal relationship: ANS acuity influences mathematical performance, or access to math education sharpens ANS acuity. Evidence in support of both hypotheses has been reported, but the two hypotheses have never been tested simultaneously. It therefore remains an open question whether the association reflects a one-directional or a reciprocal causal relationship. In this work, we provide new evidence on the causal relationship between ANS acuity and arithmetic ability. ANS acuity and mathematical ability of elementary-school students were measured sequentially at three time points within one year, and all possible causal directions were evaluated simultaneously using cross-lagged regression analysis. The results show that ANS acuity influences later arithmetic ability, while the reverse causal direction was not supported. Our finding adds strong evidence to the causal association between ANS acuity and mathematical ability, and also has important implications for educational interventions designed to train ANS acuity and thereby promote mathematical ability. PMID:27462291
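The cross-lagged logic of this design can be sketched on synthetic two-wave data (the study used three waves; variable names and effect sizes here are invented): regress each later measure on both earlier measures, then compare the cross-lagged coefficients.

```python
import numpy as np

# Cross-lagged regression sketch on synthetic two-wave data. We simulate
# ANS acuity driving later math ability but not the reverse, and check
# that the cross-lagged coefficients recover that asymmetry.
rng = np.random.default_rng(3)
n = 1000
ans_t1  = rng.standard_normal(n)
math_t1 = 0.3 * ans_t1 + rng.standard_normal(n)
ans_t2  = 0.6 * ans_t1 + 0.0 * math_t1 + 0.4 * rng.standard_normal(n)
math_t2 = 0.5 * math_t1 + 0.4 * ans_t1 + 0.4 * rng.standard_normal(n)

def cross_lagged(dep_t2, pred_t1, auto_t1):
    """Coefficient of the OTHER variable's lag, controlling for autoregression."""
    X = np.column_stack([np.ones(n), auto_t1, pred_t1])
    beta = np.linalg.lstsq(X, dep_t2, rcond=None)[0]
    return beta[2]

print("ANS -> math:", cross_lagged(math_t2, ans_t1, math_t1))  # sizeable
print("math -> ANS:", cross_lagged(ans_t2, math_t1, ans_t1))   # near zero
```

Controlling for each variable's own lag is what separates a cross-lagged effect from mere stability of the measures over time.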

  2. A Framework for Estimating Causal Effects in Latent Class Analysis: Is There a Causal Link Between Early Sex and Subsequent Profiles of Delinquency?

    PubMed Central

    Lanza, Stephanie T.; Coffman, Donna L.

    2013-01-01

    Prevention scientists use latent class analysis (LCA) with increasing frequency to characterize complex behavior patterns and profiles of risk. Often, the most important research questions in these studies involve establishing characteristics that predict membership in the latent classes, thus describing the composition of the subgroups and suggesting possible points of intervention. More recently, prevention scientists have begun to adopt modern methods for drawing causal inference from observational data because of the bias that can be introduced by confounders. This same issue of confounding exists in any analysis of observational data, including prediction of latent class membership. This study demonstrates a straightforward approach to causal inference in LCA that builds on propensity score methods. We demonstrate this approach by examining the causal effect of early sex on subsequent delinquency latent classes using data from 1,890 adolescents in 11th and 12th grade from wave I of the National Longitudinal Study of Adolescent Health. Prior to the statistical adjustment for potential confounders, early sex was significantly associated with delinquency latent class membership for both genders (p=0.02). However, the propensity score adjusted analysis indicated no evidence for a causal effect of early sex on delinquency class membership (p=0.76) for either gender. Sample R and SAS code is included in an Appendix in the ESM so that prevention scientists may adopt this approach to causal inference in LCA in their own work. PMID:23839479

  3. A framework for estimating causal effects in latent class analysis: is there a causal link between early sex and subsequent profiles of delinquency?

    PubMed

    Butera, Nicole M; Lanza, Stephanie T; Coffman, Donna L

    2014-06-01

    Prevention scientists use latent class analysis (LCA) with increasing frequency to characterize complex behavior patterns and profiles of risk. Often, the most important research questions in these studies involve establishing characteristics that predict membership in the latent classes, thus describing the composition of the subgroups and suggesting possible points of intervention. More recently, prevention scientists have begun to adopt modern methods for drawing causal inference from observational data because of the bias that can be introduced by confounders. This same issue of confounding exists in any analysis of observational data, including prediction of latent class membership. This study demonstrates a straightforward approach to causal inference in LCA that builds on propensity score methods. We demonstrate this approach by examining the causal effect of early sex on subsequent delinquency latent classes using data from 1,890 adolescents in 11th and 12th grade from wave I of the National Longitudinal Study of Adolescent Health. Prior to the statistical adjustment for potential confounders, early sex was significantly associated with delinquency latent class membership for both genders (p = 0.02). However, the propensity score adjusted analysis indicated no evidence for a causal effect of early sex on delinquency class membership (p = 0.76) for either gender. Sample R and SAS code is included in an Appendix in the ESM so that prevention scientists may adopt this approach to causal inference in LCA in their own work.
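
    The propensity score adjustment underlying this approach can be illustrated in miniature. The sketch below is not the authors' R/SAS code: it uses inverse-probability-of-treatment weighting with a single simulated confounder to show how a spurious exposure-outcome association disappears after weighting.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
x = rng.normal(size=n)                       # confounder
t = rng.binomial(1, 1 / (1 + np.exp(-1.5 * x)))  # exposure depends on x
y = 2.0 * x + rng.normal(size=n)             # outcome depends on x only (true effect = 0)

naive = y[t == 1].mean() - y[t == 0].mean()  # biased by confounding

# fit a logistic propensity model P(T=1|x) by Newton-Raphson
X = np.column_stack([np.ones(n), x])
b = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ b))
    W = p * (1 - p)
    b += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (t - p))
ps = 1 / (1 + np.exp(-X @ b))

# inverse-probability weights, then weighted difference in means
w = t / ps + (1 - t) / (1 - ps)
iptw = (np.average(y[t == 1], weights=w[t == 1])
        - np.average(y[t == 0], weights=w[t == 0]))
print(round(naive, 2), round(iptw, 2))
```

    The naive contrast is strongly positive, while the weighted contrast is near the true null effect.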

  4. Causal network analysis of head and neck keloid tissue identifies potential master regulators.

    PubMed

    Garcia-Rodriguez, Laura; Jones, Lamont; Chen, Kang Mei; Datta, Indrani; Divine, George; Worsham, Maria J

    2016-10-01

    To generate novel insights and hypotheses in keloid development from potential master regulators. Prospective cohort. Six fresh keloid and six normal skin samples from 12 anonymous donors were used in a prospective cohort study. Genome-wide profiling was done previously on the cohort using the Infinium HumanMethylation450 BeadChip (Illumina, San Diego, CA). The 190 statistically significant CpG islands between keloid and normal tissue mapped to 152 genes (P < .05). The top 10 statistically significant genes (VAMP5, ACTR3C, GALNT3, KCNAB2, LRRC61, SCML4, SYNGR1, TNS1, PLEKHG5, PPP1R13-α, false discovery rate <.015) were uploaded into the Ingenuity Pathway Analysis software's Causal Network Analysis (QIAGEN, Redwood City, CA). To reflect expected gene expression direction in the context of methylation changes, the inverse of the methylation ratio from keloid versus normal tissue was used for the analysis. Causal Network Analysis identified disease-specific master regulator molecules based on downstream differentially expressed keloid-specific genes and expected directionality of expression (hypermethylated vs. hypomethylated). Causal Network Analysis software identified four hierarchical networks that included four master regulators (pyroxamide, tributyrin, PRKG2, and PENK) and 19 intermediate regulators. Causal Network Analysis of differentially methylated gene data of keloid versus normal skin demonstrated four causal networks with four master regulators. These hierarchical networks suggest potential driver roles for their downstream keloid gene targets in the pathogenesis of the keloid phenotype, likely triggered by perturbation/injury to normal tissue. NA Laryngoscope, 126:E319-E324, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  5. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  6. Performance Analysis of the Probabilistic Multi-Hypothesis Tracking Algorithm on the SEABAR Data Sets

    DTIC Science & Technology

    2009-07-01

    Performance Analysis of the Probabilistic Multi-Hypothesis Tracking Algorithm On the SEABAR Data Sets Dr. Christian G. Hempel Naval...Hypothesis Tracking," NUWC-NPT Technical Report 10,428, Naval Undersea Warfare Center Division, Newport, RI, 15 February 1995. [2] G. McLachlan, T...the 9th International Conference on Information Fusion, Florence, Italy, July, 2006. [8] C. Hempel, "Track Initialization for Multi-Static Active Sonar

  7. Order Under Uncertainty: Robust Differential Expression Analysis Using Probabilistic Models for Pseudotime Inference

    PubMed Central

    Campbell, Kieran R.

    2016-01-01

    Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852

  8. ANALYSIS OF CONCORDANCE OF PROBABILISTIC AGGREGATE EXPOSURE PREDICTIONS WITH OBSERVED BIOMONITORING RESULTS: AN EXAMPLE USING CTEPP DATA

    EPA Science Inventory

    Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...

  9. A Probabilistic Model of Phonological Relationships from Contrast to Allophony

    ERIC Educational Resources Information Center

    Hall, Kathleen Currie

    2009-01-01

    This dissertation proposes a model of phonological relationships, the Probabilistic Phonological Relationship Model (PPRM), that quantifies how predictably distributed two sounds in a relationship are. It builds on a core premise of traditional phonological analysis, that the ability to define phonological relationships such as contrast and…

  10. Identifying Seizure Onset Zone From the Causal Connectivity Inferred Using Directed Information

    NASA Astrophysics Data System (ADS)

    Malladi, Rakesh; Kalamangalam, Giridhar; Tandon, Nitin; Aazhang, Behnaam

    2016-10-01

    In this paper, we developed a model-based and a data-driven estimator for directed information (DI) to infer the causal connectivity graph between electrocorticographic (ECoG) signals recorded from the brain and to identify the seizure onset zone (SOZ) in epileptic patients. Directed information, an information theoretic quantity, is a general metric to infer causal connectivity between time series and is not restricted to a particular class of models, unlike the popular metrics based on Granger causality or transfer entropy. The proposed estimators are shown to be almost surely convergent. Causal connectivity between ECoG electrodes in five epileptic patients is inferred using the proposed DI estimators, after validating their performance on simulated data. We then proposed a model-based and a data-driven SOZ identification algorithm to identify the SOZ from the causal connectivity inferred using the model-based and data-driven DI estimators, respectively. The data-driven SOZ identification outperforms the model-based SOZ identification algorithm when benchmarked against visual analysis by a neurologist, the current clinical gold standard. The causal connectivity analysis presented here is the first step towards developing novel non-surgical treatments for epilepsy.

  11. Linking melodic expectation to expressive performance timing and perceived musical tension.

    PubMed

    Gingras, Bruno; Pearce, Marcus T; Goodchild, Meghan; Dean, Roger T; Wiggins, Geraint; McAdams, Stephen

    2016-04-01

    This research explored the relations between the predictability of musical structure, expressive timing in performance, and listeners' perceived musical tension. Studies analyzing the influence of expressive timing on listeners' affective responses have been constrained by the fact that, in most pieces, the notated durations limit performers' interpretive freedom. To circumvent this issue, we focused on the unmeasured prelude, a semi-improvisatory genre without notated durations. In Experiment 1, 12 professional harpsichordists recorded an unmeasured prelude on a harpsichord equipped with a MIDI console. Melodic expectation was assessed using a probabilistic model (IDyOM [Information Dynamics of Music]) whose expectations have been previously shown to match closely those of human listeners. Performance timing information was extracted from the MIDI data using a score-performance matching algorithm. Time-series analyses showed that, in a piece with unspecified note durations, the predictability of melodic structure measurably influenced tempo fluctuations in performance. In Experiment 2, another 10 harpsichordists, 20 nonharpsichordist musicians, and 20 nonmusicians listened to the recordings from Experiment 1 and rated the perceived tension continuously. Granger causality analyses were conducted to investigate predictive relations among melodic expectation, expressive timing, and perceived tension. Although melodic expectation, as modeled by IDyOM, modestly predicted perceived tension for all participant groups, neither of its components, information content or entropy, was Granger causal. In contrast, expressive timing was a strong predictor and was Granger causal. However, because melodic expectation was also predictive of expressive timing, our results outline a complete chain of influence from predictability of melodic structure via expressive performance timing to perceived musical tension. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits of the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  13. Overview of the SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division Activities and Technical Projects

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four Projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic softwares in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - Case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for what problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - Too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not True. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - probabilistic reliability literature search has been completed along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - Correct reliability computations both at the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - A draft of volume 1 comprising aerospace applications has been released. 
Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time and cost savings by the use of probabilistic methods for generic applications is in progress. Relevance to Industry & Government - Too often, we say, 'The Proof is in the Pudding'. With help from many contributors, we hope to produce such a document. The problem is that not many people are coming forward, due to the proprietary nature of the information. So, we are asking to document only minimum information including problem description, what method was used, did it result in any savings, and how much?; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood & addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - Any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance & life.

  14. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article examines the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore-water pressure on the basis of variation of soil properties. Normal distributions of unit weight, cohesion, and internal friction angle are assumed. The Monte Carlo simulation technique is employed to analyze the critical slip surface. A sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the factor of safety of the slope.
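
    A minimal version of such a Monte Carlo slope analysis might look as follows, assuming an infinite-slope model without pore-water pressure; the soil statistics and slope geometry are illustrative assumptions, not the embankment of the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
# normally distributed soil properties, as in the abstract (values assumed)
gamma = rng.normal(18.0, 1.0, n)              # unit weight, kN/m^3
c = rng.normal(10.0, 2.0, n)                  # cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, n))    # internal friction angle
beta, h = np.radians(35.0), 5.0               # slope angle, slip depth (m)

# infinite-slope factor of safety, no pore-water pressure
fs = (c + gamma * h * np.cos(beta) ** 2 * np.tan(phi)) / (
    gamma * h * np.sin(beta) * np.cos(beta))
pf = np.mean(fs < 1.0)                        # probability of failure
print(round(fs.mean(), 2), round(pf, 3))
```

    The sensitivity analysis in the paper amounts to repeating this while varying one input distribution at a time and observing the change in the failure probability.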

  15. Granger causality for state-space models

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Seth, Anil K.

    2015-04-01

    Granger causality has long been a prominent method for inferring causal interactions between stochastic variables for a broad range of complex physical systems. However, it has been recognized that a moving average (MA) component in the data presents a serious confound to Granger causal analysis, as routinely performed via autoregressive (AR) modeling. We solve this problem by demonstrating that Granger causality may be calculated simply and efficiently from the parameters of a state-space (SS) model. Since SS models are equivalent to autoregressive moving average models, Granger causality estimated in this fashion is not degraded by the presence of a MA component. This is of particular significance when the data has been filtered, downsampled, observed with noise, or is a subprocess of a higher dimensional process, since all of these operations—commonplace in application domains as diverse as climate science, econometrics, and the neurosciences—induce a MA component. We show how Granger causality, conditional and unconditional, in both time and frequency domains, may be calculated directly from SS model parameters via solution of a discrete algebraic Riccati equation. Numerical simulations demonstrate that Granger causality estimators thus derived have greater statistical power and smaller bias than AR estimators. We also discuss how the SS approach facilitates relaxation of the assumptions of linearity, stationarity, and homoscedasticity underlying current AR methods, thus opening up potentially significant new areas of research in Granger causal analysis.
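
    The contrast between restricted and full autoregressive models that defines (unconditional, time-domain) Granger causality can be sketched as below. This is the conventional AR formulation that the paper improves upon, run on simulated data with a known x → y influence; the coefficients and series length are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 2000
x = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()  # x Granger-causes y

def rss(target, *lags):
    # residual sum of squares of an OLS regression on the given lags
    X = np.column_stack([np.ones(len(target)), *lags])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    r = target - X @ beta
    return r @ r

yt, xt, y1, x1 = y[1:], x[1:], y[:-1], x[:-1]
# Granger causality = log ratio of restricted to full residual variance
gc_xy = np.log(rss(yt, y1) / rss(yt, y1, x1))   # x -> y: should be > 0
gc_yx = np.log(rss(xt, x1) / rss(xt, x1, y1))   # y -> x: should be ~ 0
print(round(gc_xy, 3), round(gc_yx, 3))
```

    The paper's point is that when such data are filtered or downsampled (inducing a moving-average component), these AR estimates degrade, whereas estimates computed from a fitted state-space model do not.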

  16. Role of Central Serotonin in Anticipation of Rewarding and Punishing Outcomes: Effects of Selective Amygdala or Orbitofrontal 5-HT Depletion

    PubMed Central

    Rygula, Rafal; Clarke, Hannah F.; Cardinal, Rudolf N.; Cockcroft, Gemma J.; Xia, Jing; Dalley, Jeff W.; Robbins, Trevor W.; Roberts, Angela C.

    2015-01-01

    Understanding the role of serotonin (or 5-hydroxytryptamine, 5-HT) in aversive processing has been hampered by the contradictory findings, across studies, of increased sensitivity to punishment in terms of subsequent response choice but decreased sensitivity to punishment-induced response suppression following gross depletion of central 5-HT. To address this apparent discrepancy, the present study determined whether both effects could be found in the same animals by performing localized 5-HT depletions in the amygdala or orbitofrontal cortex (OFC) of a New World monkey, the common marmoset. 5-HT depletion in the amygdala impaired response choice on a probabilistic visual discrimination task by increasing the effectiveness of misleading, or false, punishment and reward, and decreased response suppression in a variable interval test of punishment sensitivity that employed the same reward and punisher. 5-HT depletion in the OFC also disrupted probabilistic discrimination learning and decreased response suppression. Computational modeling of behavior on the discrimination task showed that the lesions reduced reinforcement sensitivity. A novel, unitary account of the findings in terms of the causal role of 5-HT in the anticipation of both negative and positive motivational outcomes is proposed and discussed in relation to current theories of 5-HT function and our understanding of mood and anxiety disorders. PMID:24879752

  17. Two-step estimation in ratio-of-mediator-probability weighted causal mediation analysis.

    PubMed

    Bein, Edward; Deutsch, Jonah; Hong, Guanglei; Porter, Kristin E; Qin, Xu; Yang, Cheng

    2018-04-15

    This study investigates appropriate estimation of estimator variability in the context of causal mediation analysis that employs propensity score-based weighting. Such an analysis decomposes the total effect of a treatment on the outcome into an indirect effect transmitted through a focal mediator and a direct effect bypassing the mediator. Ratio-of-mediator-probability weighting estimates these causal effects by adjusting for the confounding impact of a large number of pretreatment covariates through propensity score-based weighting. In step 1, a propensity score model is estimated. In step 2, the causal effects of interest are estimated using weights derived from the prior step's regression coefficient estimates. Statistical inferences obtained from this 2-step estimation procedure are potentially problematic if the estimated standard errors of the causal effect estimates do not reflect the sampling uncertainty in the estimation of the weights. This study extends to ratio-of-mediator-probability weighting analysis a solution to the 2-step estimation problem by stacking the score functions from both steps. We derive the asymptotic variance-covariance matrix for the indirect effect and direct effect 2-step estimators, provide simulation results, and illustrate with an application study. Our simulation results indicate that the sampling uncertainty in the estimated weights should not be ignored. The standard error estimation using the stacking procedure offers a viable alternative to bootstrap standard error estimation. We discuss broad implications of this approach for causal analysis involving propensity score-based weighting. Copyright © 2018 John Wiley & Sons, Ltd.

  18. Commentary: Using Potential Outcomes to Understand Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke; Jo, Booil; Stuart, Elizabeth A.

    2011-01-01

    In this commentary, we demonstrate how the potential outcomes framework can help understand the key identification assumptions underlying causal mediation analysis. We show that this framework can lead to the development of alternative research design and statistical analysis strategies applicable to the longitudinal data settings considered by…

  19. Fracture mechanics analysis of cracked structures using weight function and neural network method

    NASA Astrophysics Data System (ADS)

    Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.

    2018-06-01

    Stress intensity factors (SIFs) due to thermal-mechanical loads have been established by using the weight function method. Two reference stress states were used to determine the coefficients in the weight function. Results were evaluated using data from the literature and show good agreement. Thus, the SIFs can be determined quickly using the obtained weight function when cracks are subjected to arbitrary loads, and the presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology combining Monte Carlo simulation with a neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characterization of K_I can be obtained by using the developed method. The probability of failure increases with increasing loads, and the relationship between them is nonlinear.
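
    A probabilistic fracture calculation of this general kind can be sketched with the textbook relation K_I = Y·σ·√(πa) in place of the paper's thermal-mechanical weight function; the geometry factor, load and crack-size distributions, and toughness below are all illustrative assumptions.

```python
import math
import random

random.seed(10)
Y = 1.12            # geometry factor for a shallow surface crack (assumed)
k_ic = 60.0         # fracture toughness, MPa*sqrt(m) (assumed)
n = 100_000
exceed = 0
for _ in range(n):
    sigma = random.gauss(200.0, 20.0)     # applied stress, MPa
    a = max(random.gauss(0.01, 0.002), 1e-6)  # crack depth, m (kept positive)
    k1 = Y * sigma * math.sqrt(math.pi * a)   # stress intensity factor
    exceed += k1 > k_ic                   # failure event: K_I exceeds toughness
pf = exceed / n
print(pf)
```

    Repeating the run with a larger mean stress shows the nonlinear growth of the failure probability noted in the abstract.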

  20. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
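
    The matrix-based flavor of such an evaluation can be sketched as follows: given assumed probabilities that each on-line check detects each fault, a fault escapes only if every check misses it, so per-fault detection probabilities follow from the column-wise miss products. The matrix values are illustrative, not from the paper.

```python
import numpy as np

# detection-probability matrix P[i, j]: probability that check i flags fault j
P = np.array([[0.9, 0.0, 0.5],
              [0.0, 0.8, 0.6],
              [0.7, 0.4, 0.0]])

# a fault escapes only if every check misses it, assuming independent checks
p_detect = 1.0 - np.prod(1.0 - P, axis=0)
print(np.round(p_detect, 3))
```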

  1. Simulation of probabilistic wind loads and building analysis

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section property were quantified in terms of their respective sensitivities on the structural response.
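
    A toy version of simulating a wind-pressure cumulative distribution function, assuming a lognormal wind speed and a fixed pressure coefficient (both illustrative, unrelated to the actual factors in the CLS model):

```python
import math
import random

random.seed(6)
rho, cp = 1.225, 0.8     # air density (kg/m^3) and pressure coefficient (assumed)
# design wind speed modeled as lognormal with median 30 m/s (assumed)
p = sorted(0.5 * rho * cp * random.lognormvariate(math.log(30.0), 0.15) ** 2
           for _ in range(10_000))

# read values off the empirical CDF of wind pressure (in Pa)
median = p[len(p) // 2]
p95 = p[int(0.95 * len(p))]      # pressure not exceeded with 95% probability
print(round(median), round(p95))
```

    Quantiles read from such a simulated CDF are the probabilistic loads that a downstream structural analysis would consume.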

  2. Applying causal mediation analysis to personality disorder research.

    PubMed

    Walters, Glenn D

    2018-01-01

    This article is designed to address fundamental issues in the application of causal mediation analysis to research on personality disorders. Causal mediation analysis is used to identify mechanisms of effect by testing variables as putative links between the independent and dependent variables. As such, it would appear to have relevance to personality disorder research. It is argued that proper implementation of causal mediation analysis requires that investigators take several factors into account. These factors are discussed under 5 headings: variable selection, model specification, significance evaluation, effect size estimation, and sensitivity testing. First, care must be taken when selecting the independent, dependent, mediator, and control variables for a mediation analysis. Some variables make better mediators than others and all variables should be based on reasonably reliable indicators. Second, the mediation model needs to be properly specified. This requires that the data for the analysis be prospectively or historically ordered and possess proper causal direction. Third, it is imperative that the significance of the identified pathways be established, preferably with a nonparametric bootstrap resampling approach. Fourth, effect size estimates should be computed or competing pathways compared. Finally, investigators employing the mediation method are advised to perform a sensitivity analysis. Additional topics covered in this article include parallel and serial multiple mediation designs, moderation, and the relationship between mediation and moderation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
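
    The nonparametric bootstrap evaluation of an indirect effect recommended above can be sketched on synthetic data; the a and b paths, sample size, and number of resamples are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300
x = rng.normal(size=n)                        # independent variable
m = 0.5 * x + rng.normal(size=n)              # mediator (a path = 0.5)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)    # dependent variable (b path = 0.4)

def indirect(idx):
    # indirect effect a*b estimated on a resampled index set
    xs, ms, ys = x[idx], m[idx], y[idx]
    a = np.polyfit(xs, ms, 1)[0]              # a path: x -> m
    X = np.column_stack([np.ones(len(idx)), ms, xs])
    b = np.linalg.lstsq(X, ys, rcond=None)[0][1]  # b path: m -> y, given x
    return a * b

boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
ci = np.percentile(boot, [2.5, 97.5])         # percentile bootstrap CI for a*b
print(np.round(ci, 2))
```

    A confidence interval excluding zero is the usual bootstrap evidence for mediation.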

  3. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
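
    The probability-integration idea can be illustrated in its simplest closed-form case, a normal stress-strength problem, where the analytic failure probability Φ(−β) can be checked against plain Monte Carlo; all parameters are illustrative, and this is not the fast probability integration algorithm of the paper.

```python
import math
import random

random.seed(7)
mu_r, sd_r = 500.0, 40.0   # strength, MPa (assumed)
mu_s, sd_s = 350.0, 50.0   # stress, MPa (assumed)

# reliability index and analytic failure probability Phi(-beta)
beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
pf_analytic = 0.5 * math.erfc(beta / math.sqrt(2))

# brute-force Monte Carlo check: failure when strength < stress
n = 200_000
pf_mc = sum(random.gauss(mu_r, sd_r) < random.gauss(mu_s, sd_s)
            for _ in range(n)) / n
print(round(beta, 2), round(pf_analytic, 5), round(pf_mc, 5))
```

    Fast probability integration targets exactly such tail probabilities but without the cost of brute-force sampling, which is what makes it practical inside a finite-element loop.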

  4. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
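
    The eigenvalue-based instability criterion can be sketched for a single-degree-of-freedom system x'' + c x' + k x = 0: the system is unstable when any eigenvalue of its state-space matrix has a positive real part. The parameter distributions below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000
c = rng.normal(0.1, 0.1, n)    # uncertain damping (can go negative)
k = rng.normal(1.0, 0.05, n)   # uncertain stiffness

unstable = 0
for ci, ki in zip(c, k):
    A = np.array([[0.0, 1.0], [-ki, -ci]])   # state-space (companion) matrix
    if np.linalg.eigvals(A).real.max() > 0:  # eigenvalue instability criterion
        unstable += 1
p_inst = unstable / n
print(round(p_inst, 3))
```

    Here brute-force sampling stands in for the paper's fast probability integration and adaptive importance sampling, which estimate the same probability far more efficiently.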

  5. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
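
    The contrast between a non-probabilistic (interval) analysis and a sampling-based one can be shown on a toy screening-level risk model; all ranges and the slope factor are illustrative assumptions.

```python
import random

random.seed(9)
# toy model: risk = concentration * intake / body_weight * slope_factor
c_lo, c_hi = 0.01, 0.05      # DBP concentration, mg/L
ir_lo, ir_hi = 1.0, 3.0      # water intake, L/day
bw_lo, bw_hi = 50.0, 90.0    # body weight, kg
sf = 0.02                    # slope factor, (mg/kg-day)^-1

# interval analysis: propagate the min/max bounds only
risk_lo = c_lo * ir_lo / bw_hi * sf
risk_hi = c_hi * ir_hi / bw_lo * sf

# Monte Carlo: propagate uniform distributions over the same ranges
mc = sorted(random.uniform(c_lo, c_hi) * random.uniform(ir_lo, ir_hi)
            / random.uniform(bw_lo, bw_hi) * sf for _ in range(50_000))
p95 = mc[int(0.95 * len(mc))]
print(risk_lo, round(p95, 8), risk_hi)
```

    The interval result brackets every sampled outcome but gives no sense of how likely the extremes are, which is exactly the trade-off between the approaches reviewed here.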

  6. Probabilistic bias analysis in pharmacoepidemiology and comparative effectiveness research: a systematic review.

    PubMed

    Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-12-01

    We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error including confounding, misclassification, and selection bias on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction-nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence and concern of bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
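The unmeasured-confounder simulation the review describes can be sketched with the standard bias-factor formula for a binary confounder. The bias-parameter priors below are invented for illustration; specifying them explicitly is precisely the reporting step the review found often missing.

```python
import math
import random

def bias_analysis(rr_obs, n=20_000, seed=7):
    """Probabilistic bias analysis for an unmeasured binary confounder.

    Illustrative bias-parameter priors:
      rr_cu : confounder-outcome risk ratio (lognormal)
      p1,p0 : confounder prevalence among the exposed / unexposed (uniform).
    Each draw divides the observed RR by the implied bias factor, yielding
    a simulation interval for the confounder-adjusted estimate.
    """
    rng = random.Random(seed)
    adjusted = []
    for _ in range(n):
        rr_cu = rng.lognormvariate(math.log(2.0), 0.2)
        p1 = rng.uniform(0.4, 0.6)
        p0 = rng.uniform(0.1, 0.3)
        bias = (rr_cu * p1 + 1 - p1) / (rr_cu * p0 + 1 - p0)
        adjusted.append(rr_obs / bias)
    adjusted.sort()
    return adjusted[int(0.025 * n)], adjusted[n // 2], adjusted[int(0.975 * n)]

lo, med, hi = bias_analysis(rr_obs=1.5)
```

With a confounder that is more prevalent among the exposed and raises risk, the simulated bias factor exceeds 1, so the adjusted interval sits below the observed risk ratio.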

  7. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    Tsunami assessment, like seismic design, benefits from evaluating phenomena beyond the design basis: even once a design basis tsunami height is set, uncertainties in tsunami phenomena leave the possibility that actual heights will exceed it. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and performing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
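The logic-tree idea behind such hazard curves can be sketched numerically. The branch weights, annual rates, and height distributions below are invented for illustration; within a branch, event heights are taken as lognormal, and the weighted mean curve is computed (fractile curves would come from ranking branch curves by weight at each height).

```python
import math

def exceedance_curves(heights, branches):
    """Tsunami hazard curves from a small logic tree (sketch).

    Each branch is (weight, annual_rate, median_height_m, log_std).
    Returns the per-branch annual exceedance-rate curves and the
    weight-averaged mean hazard curve.
    """
    def p_exceed(h, med, sig):                 # P(H > h) for one event
        return 0.5 * math.erfc(math.log(h / med) / (sig * math.sqrt(2.0)))

    curves = [[rate * p_exceed(h, med, sig) for h in heights]
              for (_w, rate, med, sig) in branches]
    mean = [sum(b[0] * c[i] for b, c in zip(branches, curves))
            for i in range(len(heights))]
    return curves, mean

# Hypothetical branches: weight, rate per year, median height (m), log-std
branches = [(0.5, 0.010, 3.0, 0.5),
            (0.3, 0.020, 2.0, 0.6),
            (0.2, 0.005, 5.0, 0.4)]
heights = [1.0, 2.0, 4.0, 8.0]
curves, mean_curve = exceedance_curves(heights, branches)
```

Each hazard curve decreases monotonically with height, and the spread across branch curves is what the 5- to 95-percentile display in the report captures.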

  8. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2015-02-20

    being integrated within MAT, including Granger causality. Granger causality tests whether a data series helps when predicting future values of another...relations by econometric models and cross-spectral methods. Econometrica: Journal of the Econometric Society, 424-438. Granger, C. W. (1980). Testing ... testing dataset. This effort is described in Section 3.2. 3.1. Improvements in Granger Causality User Interface Various metrics of causality are

  9. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2013-11-20

    Granger causality F-test validation 3.1.2. Dynamic time warping for uneven temporal relationships Many causal relationships are imperfectly...mapping for dynamic feedback models Granger causality and DTW can identify causal relationships and consider complex temporal factors. However, many ...variant of the tf-idf algorithm (Manning, Raghavan, Schutze et al., 2008), typically used in search engines, to “score” features. The (-log tf) in

  10. Interactive Planning under Uncertainty with Causal Modeling and Analysis

    DTIC Science & Technology

    2006-01-01

    Tool (CAT), a system for creating and analyzing causal models similar to Bayes networks. In order to use CAT as a tool for planning, users go through...an iterative process in which they use CAT to create and analyze alternative plans. One of the biggest difficulties is that the number of possible...Causal Analysis Tool (CAT), which is a tool for representing and analyzing causal networks similar to Bayesian networks. In order to represent plans

  11. Probabilistic Cues to Grammatical Category in English Orthography and Their Influence during Reading

    ERIC Educational Resources Information Center

    Arciuli, Joanne; Monaghan, Padraic

    2009-01-01

    We investigated probabilistic cues to grammatical category (noun vs. verb) in English orthography. These cues are located in both the beginnings and endings of words--as identified in our large-scale corpus analysis. Experiment 1 tested participants' sensitivity to beginning and ending cues while making speeded grammatical classifications.…

  12. Nonlinear probabilistic finite element models of laminated composite shells

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.

    1993-01-01

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation with the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data is compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
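The first-order second-moment technique named above can be sketched in a few lines: evaluate the response at the mean inputs and propagate input variances through finite-difference sensitivities. The cantilever deflection formula and all parameter values are illustrative, not from the laminated-shell analysis.

```python
def fosm(g, means, stds, h=1e-6):
    """First-order second-moment estimate of the mean and variance of g(x)
    for independent random inputs, using forward finite-difference
    sensitivities about the mean point."""
    mu = g(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        step = h * abs(m) if m != 0.0 else h    # relative perturbation
        x = list(means)
        x[i] = m + step
        var += (((g(x) - mu) / step) * s) ** 2
    return mu, var

# Illustrative response: tip deflection of a cantilever, d = P L^3 / (3 E I)
def deflection(x):
    P, L, E, I = x
    return P * L ** 3 / (3.0 * E * I)

mean_d, var_d = fosm(deflection,
                     means=[1000.0, 2.0, 70e9, 1e-6],   # N, m, Pa, m^4
                     stds=[100.0, 0.0, 7e9, 1e-7])
```

For this multiplicative response, each input with a 10% coefficient of variation contributes about 1% to the relative variance, so the FOSM variance lands near 3% of the squared mean; Monte Carlo is the natural cross-check, as in the paper.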

  13. Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.

    PubMed

    Frommholz, Ingo; Roelleke, Thomas

    2016-01-01

    Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore, HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.

  14. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the loading rate, the higher the buckling load and the shorter the time to buckling. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that ply thickness, fiber volume ratio, fiber longitudinal modulus, dynamic load and loading rate are the dominant uncertainties, in that order.

  15. Structured statistical models of inductive reasoning.

    PubMed

    Kemp, Charles; Tenenbaum, Joshua B

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet both goals and describes 4 applications of the framework: a taxonomic model, a spatial model, a threshold model, and a causal model. Each model makes probabilistic inferences about the extensions of novel properties, but the priors for the 4 models are defined over different kinds of structures that capture different relationships between the categories in a domain. The framework therefore shows how statistical inference can operate over structured background knowledge, and the authors argue that this interaction between structure and statistics is critical for explaining the power and flexibility of human reasoning.

  16. Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections

    NASA Astrophysics Data System (ADS)

    Wakazuki, Y.

    2015-12-01

    A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty across multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from GCM simulation results were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create multiple modal lateral boundary conditions for future-climate regional climate model (RCM) simulations by adding them to an objective analysis dataset. This data handling can be regarded as an extension of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of GCM simulations yields approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for each mode were used to estimate the response to that mode's perturbation. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although nonlinearity was seen for local-scale rainfall. Temperature probabilities could be estimated with two-mode perturbation simulations, requiring five RCM simulations for the future climate; local-scale rainfall needed four-mode simulations, requiring nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.

  17. A framework for the probabilistic analysis of meteotsunamis

    USGS Publications Warehouse

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
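The Monte Carlo scheme described above can be sketched end to end, with heavy caveats: the parameter distributions are invented, and a toy Proudman-resonance proxy stands in for the numerical hydrodynamic model used in the paper.

```python
import math
import random

def meteotsunami_hazard(thresholds, rate_per_year=5.0, n=50_000, seed=3):
    """Monte Carlo hazard curve sketch for meteotsunamis.

    A synthetic catalog of squall lines is drawn from assumed parameter
    distributions (pressure jump dP in hPa, translation speed U in m/s).
    Wave amplitude uses a toy proxy that grows as U approaches the
    long-wave speed (Proudman resonance). Returns annualized exceedance
    rates, one per amplitude threshold (metres).
    """
    rng = random.Random(seed)
    c = math.sqrt(9.81 * 30.0)                  # long-wave speed, 30 m shelf
    amps = []
    for _ in range(n):
        dp = rng.lognormvariate(math.log(1.5), 0.5)      # hPa
        u = max(rng.gauss(18.0, 4.0), 1.0)               # m/s
        froude_gap = max(abs(1.0 - (c / u) ** 2), 0.05)  # resonance proximity
        amps.append(0.01 * dp / froude_gap)              # metres (proxy)
    return [rate_per_year * sum(a > t for a in amps) / n for t in thresholds]

hazard = meteotsunami_hazard([0.05, 0.10, 0.20, 0.40])
```

Aggregating the synthetic catalog yields the annualized rate of exceedance versus maximum amplitude, i.e. one realization of the hazard curve; resampling the catalog, as the paper does, would give mean and quantile curves.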

  18. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
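The Weibull size effect mentioned above has a compact closed form: under weakest-link scaling, failure probability at a given stress grows with stressed area, so characteristic strength falls as area^(-1/m). The modulus and scale values below are illustrative, not measured MEMS properties.

```python
import math

def weibull_pf(sigma, area, m=10.0, sigma0=1000.0, area0=1.0):
    """Weibull failure probability for a uniformly stressed area:
    the weakest-link form P_f = 1 - exp(-(A/A0) * (sigma/sigma0)^m)."""
    return 1.0 - math.exp(-(area / area0) * (sigma / sigma0) ** m)

def char_strength(area, m=10.0, sigma0=1000.0, area0=1.0):
    """Stress at 63.2% failure probability; it scales as (area0/area)^(1/m),
    the size effect a bulge test on thin-film specimens would check."""
    return sigma0 * (area0 / area) ** (1.0 / m)

# Smaller specimens are statistically stronger under weakest-link scaling
s_small, s_big = char_strength(1.0), char_strength(100.0)
```

Confirming this area scaling experimentally is what would justify a Weibull-based (CARES/Life-style) probabilistic design methodology for MEMS.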

  19. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    NASA Astrophysics Data System (ADS)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after distributed photovoltaic (DPV) integration under different typical weather conditions: sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in each typical weather condition was acquired via maximum likelihood parameter identification. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different harmonic orders, as well as the total harmonic distortion (THD), in typical weather conditions. The case study was based on the IEEE 33-bus system, and the results for the harmonic voltage content probabilistic distribution and THD in typical weather conditions were compared.
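The two-stage workflow, maximum-likelihood parameter identification followed by Monte Carlo harmonic simulation, can be sketched as below. A lognormal with its closed-form MLE stands in for whatever distribution the identification step selects, and the harmonic coefficients are illustrative, not IEEE 33-bus data.

```python
import math
import random
import statistics

def lognormal_mle(samples):
    """Closed-form maximum-likelihood fit of a lognormal to positive data,
    standing in for the parameter identification of DPV output power."""
    logs = [math.log(s) for s in samples]
    return statistics.fmean(logs), statistics.pstdev(logs)

def thd_distribution(mu, sigma, n=20_000, seed=11):
    """Monte Carlo THD sketch: harmonic injection is assumed proportional
    to the random DPV output (toy 5th/7th/11th-order coefficients)."""
    rng = random.Random(seed)
    coeff = (0.04, 0.03, 0.015)            # per-unit harmonic currents
    out = []
    for _ in range(n):
        p = rng.lognormvariate(mu, sigma)  # per-unit PV output draw
        out.append(100.0 * math.sqrt(sum((c * p) ** 2 for c in coeff)))
    out.sort()
    return out[n // 2], out[int(0.95 * n)]

# Fit to synthetic "sunny day" output data, then propagate through MC
rng = random.Random(5)
data = [rng.lognormvariate(-0.5, 0.3) for _ in range(500)]
mu_hat, sig_hat = lognormal_mle(data)
med_thd, p95_thd = thd_distribution(mu_hat, sig_hat)
```

Repeating the fit per weather class (sunny, cloudy, rainy, snowy) and comparing the resulting THD percentiles mirrors the comparison reported in the study.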

  20. A probabilistic approach to aircraft design emphasizing stability and control uncertainties

    NASA Astrophysics Data System (ADS)

    Delaurentis, Daniel Andrew

    In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) Consistent, traceable uncertainty classification and representation; (2) Concise mathematical statement of the Probabilistic Robust Design problem; (3) Variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity.
Specifically, the implementation involves the study of relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.
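The CDF-as-decision-function idea from item (3) can be sketched with two hypothetical designs: pick the one maximizing P(metric ≤ target). The Gaussian metric models and all numbers below are invented for illustration.

```python
import random

def robust_choice(designs, target, n=20_000, seed=9):
    """Choose the design maximizing P(metric <= target), i.e. the empirical
    CDF evaluated at the target -- one variant of using CDFs as decision
    functions for robust design. Metric models are hypothetical Gaussians."""
    rng = random.Random(seed)
    best, best_p = None, -1.0
    for name, (mean, std) in designs.items():
        samples = [rng.gauss(mean, std) for _ in range(n)]
        p = sum(s <= target for s in samples) / n
        if p > best_p:
            best, best_p = name, p
    return best, best_p

# Hypothetical weight-like metric: lower mean but high variance, versus
# higher mean but tightly controlled
designs = {"relaxed_stability": (100.0, 8.0),
           "conventional": (104.0, 2.0)}
choice, prob = robust_choice(designs, target=106.0)
```

At this target the tighter design wins despite its worse mean, the kind of robustness trade-off a deterministic comparison of means would miss.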

  1. Sensory Impairments and Autism: A Re-Examination of Causal Modelling

    ERIC Educational Resources Information Center

    Gerrard, Sue; Rugg, Gordon

    2009-01-01

    Sensory impairments are widely reported in autism, but remain largely unexplained by existing models. This article examines Kanner's causal reasoning and identifies unsupported assumptions implicit in later empirical work. Our analysis supports a heterogeneous causal model for autistic characteristics. We propose that the development of a…

  2. A new discriminative kernel from probabilistic models.

    PubMed

    Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert

    2002-10-01

    Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel, derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.

  3. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  4. On the security of a novel probabilistic signature based on bilinear square Diffie-Hellman problem and its extension.

    PubMed

    Zhao, Zhenguo; Shi, Wenbo

    2014-01-01

    Probabilistic signature scheme has been widely used in modern electronic commerce since it could provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel probabilistic signature (PS) scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS scheme and UDVS scheme. Through concrete attacks, we demonstrate both of their schemes are not unforgeable. The security analysis shows that their schemes are not suitable for practical applications.

  5. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
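The surrogate idea can be sketched with a stand-in for the expensive model: a closed-form Paris-law crack-growth calculation plays the role of the 3D finite element model, and a cheap polynomial fit in log-log space replaces it inside the probabilistic remaining-useful-life loop. All parameter values are illustrative.

```python
import numpy as np

def cycles_to_failure(a0, C=1e-11, m=3.0, ds=100.0, ac=0.02):
    """Stand-in 'high-fidelity' model: Paris-law cycles from initial crack
    size a0 to critical size ac, with geometry factor Y = 1 so the growth
    law da/dN = C*(ds*sqrt(pi*a))^m integrates in closed form."""
    k = C * (ds * np.sqrt(np.pi)) ** m
    e = 1.0 - m / 2.0
    return (ac ** e - a0 ** e) / (k * e)

# Train a quadratic surrogate in log-log space on 30 "expensive" model runs
a_train = np.linspace(1e-3, 1e-2, 30)
coeffs = np.polyfit(np.log(a_train), np.log(cycles_to_failure(a_train)), 2)
surrogate = lambda a: np.exp(np.polyval(coeffs, np.log(a)))

# Probabilistic RUL: propagate diagnosed crack-size uncertainty through
# the surrogate instead of the expensive model
rng = np.random.default_rng(2)
a_diag = rng.lognormal(np.log(4e-3), 0.1, size=10_000)
rul = np.sort(surrogate(a_diag))
rul_p05 = rul[int(0.05 * len(rul))]      # conservative lower bound on life
```

Ten thousand surrogate evaluations cost roughly what a handful of high-fidelity runs would, which is the computational argument the abstract makes.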

  6. Estimating short-run and long-run interaction mechanisms in interictal state.

    PubMed

    Ozkaya, Ata; Korürek, Mehmet

    2010-04-01

    We address the issue of analyzing electroencephalograms (EEG) from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study: firstly, we employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; secondly, for such intervals that are deemed non-stationary we suggest the concept of Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. We finally address questions of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing Cointegration analysis; both methods are well known in econometrics. Here we find: first, that the causal relationship between neuronal assemblies can be identified according to the duration and the direction of their possible mutual influences; second, that although the estimated bidirectional causality in short time intervals indicates that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (increasing amplitudes) by this relationship. Moreover, Cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
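The short-interval Granger-causality test used here (and in several of the records above) reduces to an F-test comparing restricted and augmented autoregressions. A minimal sketch on synthetic series, where one channel drives the other with a one-step delay:

```python
import numpy as np

def granger_f(x, y, p=2):
    """F-statistic for 'x Granger-causes y' at lag order p: compare an
    AR(p) model of y against the same model augmented with p lags of x."""
    T = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    ones = np.ones((T - p, 1))
    X_r = np.hstack([ones, lags_y])              # restricted model
    X_f = np.hstack([ones, lags_y, lags_x])      # full model

    def rss(X):
        beta = np.linalg.lstsq(X, Y, rcond=None)[0]
        return float(np.sum((Y - X @ beta) ** 2))

    df = T - p - X_f.shape[1]
    return ((rss(X_r) - rss(X_f)) / p) / (rss(X_f) / df)

# Synthetic check: x drives y with a one-step delay, not the reverse
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
f_xy = granger_f(x, y)   # large: past x improves prediction of y
f_yx = granger_f(y, x)   # near 1: past y adds nothing for x
```

Comparing f_xy against an F(p, df) critical value gives the directional test; applying it only within intervals first deemed (near-)stationary is the point of the paper's two-stage design.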

  7. Probabilistic Analysis of Radiation Doses for Shore-Based Individuals in Operation Tomodachi

    DTIC Science & Technology

    2013-05-01

    [Excerpt consists of report front matter (unit-conversion tables and agency references) from DTRA-TR-12-002, Probabilistic Analysis of Radiation Doses for Shore-Based Individuals in Operation Tomodachi.]

  8. Dynamic competitive probabilistic principal components analysis.

    PubMed

    López-Rubio, Ezequiel; Ortiz-de-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.
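The PPCA performed at each neuron has a closed-form batch solution (Tipping and Bishop): the noise variance is the mean of the discarded covariance eigenvalues and the loadings come from the leading eigenvectors. The sketch below shows that batch fit on synthetic data; the network in the paper learns an online, competitive version of it.

```python
import numpy as np

def ppca_fit(X, q):
    """Closed-form maximum-likelihood PPCA fit (Tipping & Bishop):
    isotropic noise variance = mean of the discarded eigenvalues;
    loading matrix W built from the q leading eigenvectors."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]              # sort descending
    evals, evecs = evals[order], evecs[:, order]
    sigma2 = float(evals[q:].mean())             # noise from discarded part
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return mu, W, sigma2

# Synthetic 3-D cluster: one dominant direction plus isotropic noise
rng = np.random.default_rng(1)
latent = rng.standard_normal((400, 1))
X = latent @ np.array([[3.0, 2.0, 1.0]]) + 0.1 * rng.standard_normal((400, 3))
mu, W, sigma2 = ppca_fit(X, q=1)
```

Because the noise variance is estimated rather than fixed, comparing retained eigenvalues against sigma2 is one principled way to learn the number of basis vectors per cluster, the drawback of fixed-dimensionality local PCA that the model addresses.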

  9. Structural system reliability calculation using a probabilistic fault tree analysis method

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
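The structure being evaluated can be sketched with plain Monte Carlo over a fault tree whose top event is the OR of AND gates (minimal cut sets). This is only the baseline the paper improves on: the PFTA method adds bottom-event approximation functions and adaptive importance sampling to make small failure probabilities tractable. The cut sets and probabilities below are hypothetical.

```python
import random

def system_failure_prob(p_events, cut_sets, n=100_000, seed=13):
    """Plain Monte Carlo evaluation of a fault tree: the top event fires
    when every bottom event in at least one cut set occurs."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        state = [rng.random() < p for p in p_events]
        if any(all(state[i] for i in cut) for cut in cut_sets):
            fails += 1
    return fails / n

# Two redundant load paths: the system fails if both members of either
# pair fail (hypothetical bottom-event probabilities)
p_events = [0.05, 0.05, 0.02, 0.02]
cut_sets = [(0, 1), (2, 3)]
p_sys = system_failure_prob(p_events, cut_sets)
# exact, for independent events: 1 - (1 - 0.05**2)(1 - 0.02**2) ~ 2.9e-3
```

At realistic structural failure probabilities (1e-5 and below) this brute-force estimator needs enormous sample counts, which motivates the dominant sampling sequence and adaptive importance sampling of the PFTA procedure.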

  10. Redundant variables and Granger causality

    NASA Astrophysics Data System (ADS)

    Angelini, L.; de Tommaso, M.; Marinazzo, D.; Nitti, L.; Pellicoro, M.; Stramaglia, S.

    2010-03-01

    We discuss the use of multivariate Granger causality in the presence of redundant variables: the application of the standard analysis in this case leads to underestimation of causalities. Using the un-normalized version of the causality index, we quantitatively develop the notions of redundancy and synergy within the framework of causality and propose two approaches to group redundant variables: (i) for a given target, the remaining variables are grouped so as to maximize the total causality, and (ii) the whole set of variables is partitioned to maximize the sum of the causalities between subsets. We show an application to a real neurological experiment, aiming at a deeper understanding of the physiological basis of abnormal neuronal oscillations in the migraine brain. The outcome of our approach reveals the change in the informational pattern due to repetitive transcranial magnetic stimulation.

  11. Translating context to causality in cardiovascular disparities research.

    PubMed

    Benn, Emma K T; Goldfeld, Keith S

    2016-04-01

    Moving from a descriptive focus to a comprehensive analysis grounded in causal inference can be particularly daunting for disparities researchers. However, even a simple model supported by the theoretical underpinnings of causality gives researchers a better chance to make correct inferences about possible interventions that can benefit our most vulnerable populations. This commentary provides a brief description of how race/ethnicity and context relate to questions of causality, and uses a hypothetical scenario to explore how different researchers might analyze the data to estimate causal effects of interest. Perhaps although not entirely removed of bias, these causal estimates will move us a step closer to understanding how to intervene. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Dynamic Granger-Geweke causality modeling with application to interictal spike propagation

    PubMed Central

    Lin, Fa-Hsuan; Hara, Keiko; Solo, Victor; Vangel, Mark; Belliveau, John W.; Stufflebeam, Steven M.; Hamalainen, Matti S.

    2010-01-01

    A persistent problem in developing plausible neurophysiological models of perception, cognition, and action is the difficulty of characterizing the interactions between different neural systems. Previous studies have approached this problem by estimating causal influences across brain areas activated during cognitive processing using Structural Equation Modeling and, more recently, with Granger-Geweke causality. While SEM is complicated by the need for a priori directional connectivity information, the temporal resolution of dynamic Granger-Geweke estimates is limited because the underlying autoregressive (AR) models assume stationarity over the period of analysis. We have developed a novel optimal method for obtaining data-driven directional causality estimates with high temporal resolution in both time and frequency domains. This is achieved by simultaneously optimizing the length of the analysis window and the chosen AR model order using the SURE criterion. Dynamic Granger-Geweke causality in time and frequency domains is subsequently calculated within a moving analysis window. We tested our algorithm by calculating the Granger-Geweke causality of epileptic spike propagation from the right frontal lobe to the left frontal lobe. The results quantitatively suggested the epileptic activity at the left frontal lobe was propagated from the right frontal lobe, in agreement with the clinical diagnosis. Our novel computational tool can be used to help elucidate complex directional interactions in the human brain. PMID:19378280

  13. NESSUS/NASTRAN Interface

    NASA Technical Reports Server (NTRS)

    Millwater, Harry; Riha, David

    1996-01-01

    The NESSUS probabilistic analysis computer program has been developed with a built-in finite element analysis program, NESSUS/FEM. However, the NESSUS/FEM program is specialized for engine structures and may not contain sufficient features for other applications. In addition, users often become well acquainted with a particular finite element code and want to use that code for probabilistic structural analysis. For these reasons, this work was undertaken to develop an interface between NESSUS and NASTRAN such that NASTRAN can be used for the finite element analysis and NESSUS can be used for the probabilistic analysis. In addition, NESSUS was restructured such that other finite element codes could be more easily coupled with NESSUS. NESSUS has been enhanced such that it will modify the NASTRAN input deck for a given set of random variables, run NASTRAN, and read the NASTRAN results. The coordination between the two codes is handled automatically. The work described here was implemented within NESSUS 6.2, which was delivered to NASA in September 1995. The code runs on Unix machines: Cray, HP, Sun, SGI and IBM. The new capabilities have been implemented such that a user familiar with NESSUS using NESSUS/FEM and NASTRAN can immediately use NESSUS with NASTRAN. In other words, the interface with NASTRAN has been implemented in a manner analogous to the interface with NESSUS/FEM. Only finite-element-specific input has been changed. This manual is written as an addendum to the existing NESSUS 6.2 manuals. We assume users have access to the NESSUS manuals and are familiar with the operation of NESSUS, including probabilistic finite element analysis. Update pages to the NESSUS PFEM manual are contained in Appendix E. The finite element features of the code and the probabilistic analysis capabilities are summarized.

  14. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    PubMed

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. 
The preload is predominantly affected by the applied torque and the coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended, or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.
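The kind of Monte Carlo sampling used in the study can be sketched with a drastically simplified model. The snippet below replaces the finite element analysis with the short-form bolt relation F = T / (K·d) and a crude nut-factor model K ≈ 1.33·μ; the distributions, target window, and screw diameter are hypothetical stand-ins, chosen only to show how lubrication shifts the preload distribution.

```python
import random
import statistics

random.seed(0)

# Illustrative parameters only -- not the paper's finite-element model.
D = 0.002                # screw nominal diameter, m (hypothetical)
N_SAMPLES = 50_000
TARGET = (550.0, 750.0)  # hypothetical optimal preload window, N

def draw_preload(mu_mean, mu_sd, t_mean, t_sd):
    """One draw: sample friction and torque, then convert torque to preload
    with F = T / (K * d), using a crude nut-factor model K ~ 1.33 * mu."""
    mu = max(random.gauss(mu_mean, mu_sd), 0.02)   # coefficient of friction
    torque = random.gauss(t_mean, t_sd)            # applied torque, N*m
    return torque / (1.33 * mu * D)

# The two environments differ mainly in the friction coefficient.
dry  = [draw_preload(0.35, 0.05, 0.32, 0.02) for _ in range(N_SAMPLES)]
lube = [draw_preload(0.20, 0.03, 0.32, 0.02) for _ in range(N_SAMPLES)]

for name, data in (("dry", dry), ("lubricated", lube)):
    p_target = sum(TARGET[0] <= f <= TARGET[1] for f in data) / len(data)
    print(f"{name}: mean={statistics.mean(data):.0f} N  "
          f"sd={statistics.stdev(data):.0f} N  P(target)={p_target:.1%}")
```

Even this toy version reproduces the qualitative finding: lower friction yields a higher mean preload and a much larger probability of landing inside the target window.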

  15. Base-Rate Neglect as a Function of Base Rates in Probabilistic Contingency Learning

    ERIC Educational Resources Information Center

    Kutzner, Florian; Freytag, Peter; Vogel, Tobias; Fiedler, Klaus

    2008-01-01

    When humans predict criterion events based on probabilistic predictors, they often lend excessive weight to the predictor and insufficient weight to the base rate of the criterion event. In an operant analysis, using a matching-to-sample paradigm, Goodie and Fantino (1996) showed that humans exhibit base-rate neglect when predictors are associated…
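The base rate's normative role is given by Bayes' rule: the same predictor supports very different posterior probabilities depending on how common the criterion event is. A small illustration (all numbers invented):

```python
def posterior(base_rate, hit_rate, false_alarm_rate):
    """P(event | predictor present), by Bayes' rule."""
    p_predictor = hit_rate * base_rate + false_alarm_rate * (1.0 - base_rate)
    return hit_rate * base_rate / p_predictor

# A predictor that fires on 80% of events and 20% of non-events...
hit, fa = 0.80, 0.20
# ...warrants very different confidence depending on the base rate.
# Base-rate neglect amounts to answering "0.80" in every row below.
for base in (0.50, 0.25, 0.10):
    print(f"base rate {base:.2f} -> P(event | predictor) = "
          f"{posterior(base, hit, fa):.2f}")
```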

  16. Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images

    NASA Astrophysics Data System (ADS)

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2004-11-01

    A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a Thin-Plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We will demonstrate that the described probabilistic method allows for detection improvement of faint extended celestial sources compared to the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
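The two-component mixture idea can be reduced to a per-pixel Bayes computation: compare the Poisson likelihood of the observed counts under "background only" against "background plus source". The sketch below uses invented rates and a hypothetical source prior, and omits the paper's thin-plate-spline background model and spatial correlation:

```python
import math

def source_probability(counts, bkg_rate, src_rate, prior_src=0.01):
    """Posterior probability that a pixel contains a source, for a
    two-component Poisson mixture: background-only vs background + source."""
    def poisson(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)

    like_bkg = poisson(counts, bkg_rate)             # background-only component
    like_src = poisson(counts, bkg_rate + src_rate)  # background + source
    num = prior_src * like_src
    return num / (num + (1.0 - prior_src) * like_bkg)

# Invented rates: 3 background counts expected, 6 more if a source is present.
for k in (2, 6, 12):
    print(f"counts={k:2d}  P(source) = {source_probability(k, 3.0, 6.0):.3f}")
```

Pooling counts over larger correlation lengths raises the effective rates, which is how the full method gains sensitivity to faint extended sources.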

  17. Elasto-limited plastic analysis of structures for probabilistic conditions

    NASA Astrophysics Data System (ADS)

    Movahedi Rad, M.

    2018-06-01

    By applying plastic analysis and design methods, significant material savings can be obtained. However, as a result of this benefit, excessive plastic deformations and large residual displacements might develop, which in turn might lead to unserviceability and collapse of the structure. In this study, for the deterministic problem the residual deformation of structures is limited by a constraint on the complementary strain energy of the residual forces. For the probabilistic problem, the constraint on the complementary strain energy of the residual forces is given randomly, and the critical stresses are updated during the iteration. Limit curves are presented for the plastic limit load factors. The results show that these constraints have significant effects on the load factors. The formulations of the deterministic and probabilistic problems lead to mathematical programming problems, which are solved with a nonlinear algorithm.

  18. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing type structures is described. The code performs a complete probabilistic analysis for composites, taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of the cumulative distribution function (CDF) and probability density function (PDF) of the fatigue life of a wing-type composite structure under different hygrothermal environments subjected to random pressure. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.

  19. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucker, D.F.

    2000-09-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends that both assessments be conducted to ensure that ''an adequate level of safety has been achieved and that no major contributors to risk are overlooked'' (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during the waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP SAR calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDF) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. 
Statistical information was then derived from the 10,000-iteration batch, which included the 5%, 50%, and 95% dose likelihoods and the sensitivity of each assumption to the calculated doses. As one would intuitively expect, the doses from the probabilistic assessment for most scenarios were found to be much less than the deterministic assessment. The lower dose of the probabilistic assessment can be attributed to a ''smearing'' of values from the high and low end of the PDF spectrum of the various input parameters. The analysis also found a potential weakness in the deterministic analysis used in the SAR: a detail of drum loading was not taken into consideration. Waste emplacement operations thus far have handled drums from each shipment as a single unit, i.e. drums from each shipment are kept together. Shipments typically come from a single waste stream, and therefore the curie loading of each drum can be considered nearly identical to that of its neighbor. Calculations show that if there are large numbers of drums used in the accident scenario assessment, e.g. 28 drums in the waste hoist failure scenario (CH5), then the probabilistic dose assessment calculations will diverge from the deterministically determined doses. As it is currently calculated, the deterministic dose assessment assumes one drum loaded to the maximum allowable (80 PE-Ci), and the remaining drums at 10% of the maximum. The effective average drum curie content is therefore less in the deterministic assessment than in the probabilistic assessment for a large number of drums. EEG recommends that the WIPP SAR calculations be revisited and updated to include a probabilistic safety assessment.
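The simulation procedure described above, draw each input from its PDF, combine the draws into a dose, repeat 10,000 times, then read off percentiles, can be sketched as follows. The distributions and the multiplicative source-release-dispersion chain are hypothetical stand-ins; only the 10,000 iterations, the 5%/50%/95% summary, and the 80 PE-Ci maximum drum loading come from the text.

```python
import random

random.seed(1)

N_ITER = 10_000

def sample_dose():
    """One Monte Carlo iteration: draw each input from its PDF and combine
    them in a simple source * release * dispersion chain (invented PDFs)."""
    source_ci = random.uniform(8.0, 80.0)                # drum content, PE-Ci
    release_frac = random.triangular(1e-4, 1e-2, 1e-3)   # fraction released
    dose_factor = random.lognormvariate(0.0, 0.5)        # transport/dose conversion
    return source_ci * release_frac * dose_factor        # arbitrary dose units

doses = sorted(sample_dose() for _ in range(N_ITER))
p05, p50, p95 = (doses[int(q * N_ITER)] for q in (0.05, 0.50, 0.95))

# The deterministic assessment fixes every input at a bounding point value.
deterministic = 80.0 * 1e-2 * 1.0

print(f"5%={p05:.3f}  50%={p50:.3f}  95%={p95:.3f}  "
      f"deterministic={deterministic:.3f}")
```

The ''smearing'' effect noted in the report appears directly: the median of the sampled doses sits well below the product of bounding point values.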

  20. Of arrows and flows. Causality, determination, and specificity in the Central Dogma of molecular biology.

    PubMed

    Fantini, Bernardino

    2006-01-01

    From its first proposal, the Central Dogma had a graphical form, complete with arrows of different types, and this form quickly became its standard presentation. In different scientific contexts, arrows have different meanings and in this particular case the arrows indicated the flow of information among different macromolecules. A deeper analysis illustrates that the arrows also imply a causal statement, directly connected to the causal role of genetic information. The author suggests a distinction between two different kinds of causal links, defined as 'physical causality' and 'biological determination', both implied in the production of biological specificity.

  1. Frequency distribution of causal connectivity in rat sensorimotor network: resting-state fMRI analyses.

    PubMed

    Shim, Woo H; Baek, Kwangyeol; Kim, Jeong Kon; Chae, Yongwook; Suh, Ji-Yeon; Rosen, Bruce R; Jeong, Jaeseung; Kim, Young R

    2013-01-01

    Resting-state functional MRI (fMRI) has emerged as an important method for assessing neural networks, enabling extensive connectivity analyses between multiple brain regions. Among the analysis techniques proposed, partial directed coherence (PDC) provides a promising tool to unveil causal connectivity networks in the frequency domain. Using the MRI time series obtained from the rat sensorimotor system, we applied PDC analysis to determine the frequency-dependent causality networks. In particular, we compared in vivo and postmortem conditions to establish the statistical significance of directional PDC values. Our results demonstrate that two distinctive frequency populations drive the causality networks in the rat: significant, high-frequency causal connections clustered in the range of 0.2-0.4 Hz, and the frequently documented low-frequency connections <0.15 Hz. The frequency dependence and directionality of the causal connections are characteristic of particular sensorimotor region pairs, implying a functional role of frequency bands in transporting specific resting-state signals. In particular, whereas both intra- and interhemispheric causal connections between heterologous sensorimotor regions are robust over all frequency levels, the bilaterally homologous regions are interhemispherically linked mostly via low-frequency components. We also discovered a significant, frequency-independent, unidirectional connection from motor cortex to thalamus, indicating dominant cortical inputs to the thalamus in the absence of external stimuli. Additionally, to address factors underlying the measurement error, we performed signal simulations and revealed that the interactive MRI system noise alone is a likely source of the inaccurate PDC values. This work demonstrates the technical basis for the PDC analysis of resting-state fMRI time series and the presence of frequency-dependent causality networks in the sensorimotor system.

  2. Verification of Causal Influences of Reasoning Skills and Epistemology on Physics Conceptual Learning

    ERIC Educational Resources Information Center

    Ding, Lin

    2014-01-01

    This study seeks to test the causal influences of reasoning skills and epistemologies on student conceptual learning in physics. A causal model, integrating multiple variables that were investigated separately in the prior literature, is proposed and tested through path analysis. These variables include student preinstructional reasoning skills…

  3. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods needed to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  4. Economic Analysis of a Multi-Site Prevention Program: Assessment of Program Costs and Characterizing Site-level Variability

    PubMed Central

    Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.

    2013-01-01

    Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559
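Probabilistic sensitivity analysis of program costs follows the recipe described above: assign each cost input a distribution built from the collected data, sample repeatedly, and summarize the resulting incremental cost. A minimal sketch with invented cost components and distributions (not the SAAF-T trial's data):

```python
import random
import statistics

random.seed(2)

# Hypothetical per-site cost inputs -- each parameter gets a distribution
# in place of a single point estimate, which is what lets missing or noisy
# data propagate into the final uncertainty interval.
def incremental_cost():
    staff = random.gauss(1500.0, 300.0)              # intervention staffing, $
    materials = random.triangular(200.0, 900.0, 400.0)  # supplies, $
    control = random.gauss(400.0, 150.0)             # comparison-condition cost, $
    return staff + materials - control

draws = sorted(incremental_cost() for _ in range(10_000))
mean = statistics.mean(draws)
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"incremental cost: mean=${mean:.0f}  95% interval=(${lo:.0f}, ${hi:.0f})")
```

The wider interval relative to a point estimate is the feature the authors highlight: it reflects input-level variability rather than hiding it.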

  5. Economic analysis of a multi-site prevention program: assessment of program costs and characterizing site-level variability.

    PubMed

    Corso, Phaedra S; Ingels, Justin B; Kogan, Steven M; Foster, E Michael; Chen, Yi-Fu; Brody, Gene H

    2013-10-01

    Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95 % confidence interval) incremental difference was $2,149 ($397, $3,901). With the probabilistic sensitivity analysis approach, the incremental difference was $2,583 ($778, $4,346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention.

  6. Probabilistic grammatical model for helix‐helix contact site classification

    PubMed Central

    2013-01-01

    Background Hidden Markov Models power many state-of-the-art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not directly convey information on medium- and long-range residue-residue interactions. This requires an expressive power of at least context-free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem-specific protein languages and apply it to classification of transmembrane helix-helix pair configurations. The core of the model consists of a probabilistic context-free grammar, automatically inferred by a genetic algorithm from only a generic set of expert-based rules and positive training samples. The model was applied to produce sequence-based descriptors of four classes of transmembrane helix-helix contact site configurations. The highest performance of the classifiers reached an AUC ROC of 0.70. The analysis of grammar parse trees revealed the ability to represent structural features of helix-helix contact sites. Conclusions We demonstrated that our probabilistic context-free framework for analysis of protein sequences outperforms the state of the art in the task of helix-helix contact site classification, without necessarily requiring modeling of long-range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human-readable. Thus they could provide biologically meaningful information for molecular biologists. PMID:24350601

  7. CADDIS Publications

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  8. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

    Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
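The sampling idea behind PFFF can be sketched in a few lines: hash a fixed number of chunks drawn at pseudo-random offsets, seeding the offset generator from the file size so that identical files yield identical fingerprints. This is an illustration of the principle only, not the pfff tool's actual algorithm:

```python
import hashlib
import os
import random

def sampled_fingerprint(path, n_samples=64, chunk=16, seed=0):
    """Fingerprint a file by hashing n_samples fixed-size chunks drawn at
    pseudo-random offsets.  Deriving the RNG state from the file size and a
    shared seed makes the offsets reproducible, so equal files get equal
    fingerprints regardless of when or where they are hashed."""
    size = os.path.getsize(path)
    rng = random.Random(seed * 1_000_003 + size)
    h = hashlib.sha256()
    h.update(str(size).encode())  # the size itself is part of the fingerprint
    with open(path, "rb") as f:
        for _ in range(n_samples):
            f.seek(rng.randrange(size - chunk) if size > chunk else 0)
            h.update(f.read(chunk))
    return h.hexdigest()
```

The cost is fixed at n_samples reads, so it stays flat as file size grows; the trade-off is that a tiny localized change can escape the sample, which is why the approach is probabilistic rather than exact.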

  9. DEVELOPMENT PLAN FOR THE CAUSAL ANALYSIS / DIAGNOSIS DECISION INFORMATION SYSTEM (CADDIS) 2001-2004

    EPA Science Inventory

    The Causal Analysis/Diagnosis Decision Information System (CADDIS) is a web-based system that provides technical support for states, tribes and other users of the Office of Water's Stressor Identification Guidance. The Stressor Identific...

  10. CADDIS Authors & Contributors

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  11. CADDIS Site Map

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  12. CADDIS Frequent Questions

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  13. CADDIS Related Links

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  14. CADDIS Recent Additions

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  15. CADDIS Basic Information

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  16. Flood quantile estimation at ungauged sites by Bayesian networks

    NASA Astrophysics Data System (ADS)

    Mediero, L.; Santillán, D.; Garrote, L.

    2012-04-01

    Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations about the magnitude of floods, but some site and basin characteristics are known. The most common technique used is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression equations are a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals in the estimated model. Bayesian networks are a probabilistic computational structure taken from the field of Artificial Intelligence, which have been widely and successfully applied to many scientific fields like medicine and informatics, but application to the field of hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows taking the different sources of estimation uncertainty into account, as they give a probability distribution as the result. A homogeneous region in the Tagus Basin was selected as a case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learnt from a sufficiently large data set. 
Because the observational data are limited, a stochastic generator of synthetic data was developed. Synthetic basin characteristics were randomised, keeping the statistical properties of the observed physical and climatic variables in the homogeneous region. The synthetic flood quantiles were stochastically generated taking the regression equation as a basis. The learnt Bayesian network was validated by the reliability diagram, the Brier score and the ROC diagram, which are common measures used in the validation of probabilistic forecasts. Summarising, flood quantile estimation through Bayesian networks supplies information about the prediction uncertainty, as a probability distribution function of discharges is given as the result. Therefore, the Bayesian network model has application as decision support for water resources planning and management.
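Of the validation measures mentioned, the Brier score is the simplest to state: the mean squared difference between the forecast probability and the 0/1 outcome, with lower values better. A minimal illustration with invented forecasts:

```python
def brier_score(forecast_probs, outcomes):
    """Mean squared difference between each forecast probability and the
    corresponding binary outcome (0 = event did not occur, 1 = it did)."""
    return sum((p - o) ** 2 for p, o in zip(forecast_probs, outcomes)) / len(outcomes)

# Hypothetical forecasts of exceeding a flood quantile at five sites:
probs = [0.9, 0.7, 0.2, 0.1, 0.6]
events = [1, 1, 0, 0, 1]
print(f"Brier score = {brier_score(probs, events):.3f}")
```

A forecaster who always issued 0.5 would score 0.25 on any outcome sequence, which gives a convenient no-skill baseline for comparison.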

  17. Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.

    NASA Astrophysics Data System (ADS)

    Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan

    2017-09-01

    Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated to these interventions in the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs with specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.

  18. Factor Configurations with Governance as Conditions for Low HIV/AIDS Prevalence in HIV/AIDS Recipient Countries: Fuzzy-set Analysis

    PubMed Central

    Lee, Hwa-Young; Kang, Minah

    2015-01-01

    This paper aims to investigate whether good governance of a recipient country is a necessary condition, and what combinations of factors including governance are sufficient, for low prevalence of HIV/AIDS in HIV/AIDS aid recipient countries during the period 2002-2010. For this, Fuzzy-set Qualitative Comparative Analysis (QCA) was used. Nine potential attributes for a causal configuration for low HIV/AIDS prevalence were identified through a review of previous studies. For each factor, full membership, full non-membership, and the crossover point were specified using both the authors' knowledge and statistical information about the variables. Calibration and conversion to fuzzy-set scores were conducted using Fs/QCA 2.0, and probabilistic tests for necessity and sufficiency were performed in STATA 11. The results suggested that governance is a necessary condition for low prevalence of HIV/AIDS in a recipient country. The sufficiency test yielded two pathways. A low level of governance can lead to a low level of HIV/AIDS prevalence when it is combined with other favorable factors, especially low economic inequality, high economic development, and high health expenditure. However, strengthening governance is a more practical measure for maintaining low prevalence of HIV/AIDS, because it is hard to achieve both economic development and economic equality. This study highlights that a comprehensive policy measure is key to achieving low prevalence of HIV/AIDS in recipient countries. PMID:26617451

  19. Factor Configurations with Governance as Conditions for Low HIV/AIDS Prevalence in HIV/AIDS Recipient Countries: Fuzzy-set Analysis.

    PubMed

    Lee, Hwa-Young; Yang, Bong-Min; Kang, Minah

    2015-11-01

    This paper aims to investigate whether good governance of a recipient country is a necessary condition, and what combinations of factors including governance are sufficient, for low prevalence of HIV/AIDS in HIV/AIDS aid recipient countries during the period 2002-2010. For this, Fuzzy-set Qualitative Comparative Analysis (QCA) was used. Nine potential attributes for a causal configuration for low HIV/AIDS prevalence were identified through a review of previous studies. For each factor, full membership, full non-membership, and the crossover point were specified using both the authors' knowledge and statistical information about the variables. Calibration and conversion to fuzzy-set scores were conducted using Fs/QCA 2.0, and probabilistic tests for necessity and sufficiency were performed in STATA 11. The results suggested that governance is a necessary condition for low prevalence of HIV/AIDS in a recipient country. The sufficiency test yielded two pathways. A low level of governance can lead to a low level of HIV/AIDS prevalence when it is combined with other favorable factors, especially low economic inequality, high economic development, and high health expenditure. However, strengthening governance is a more practical measure for maintaining low prevalence of HIV/AIDS, because it is hard to achieve both economic development and economic equality. This study highlights that a comprehensive policy measure is key to achieving low prevalence of HIV/AIDS in recipient countries.
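
    The calibration step described above (mapping raw values to fuzzy-set membership scores via three qualitative anchors) can be sketched as follows. This is a generic Ragin-style direct calibration, not necessarily the exact procedure implemented in Fs/QCA 2.0, and the anchor values are hypothetical:

```python
import math

def calibrate(x, full_non, crossover, full):
    """Direct fuzzy-set calibration: map a raw value to a membership
    score in [0, 1] using three anchors.  Log-odds of +/-3 at the full
    membership / full non-membership anchors correspond to scores of
    roughly 0.95 and 0.05."""
    if x >= crossover:
        log_odds = 3.0 * (x - crossover) / (full - crossover)
    else:
        log_odds = -3.0 * (crossover - x) / (crossover - full_non)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical anchors for a governance indicator
print(round(calibrate(x=50.0, full_non=10.0, crossover=40.0, full=80.0), 3))
```

    By construction the crossover point maps to 0.5, the point of maximum ambiguity about set membership.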

  20. A PROBABILISTIC ARSENIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CHROMATED COPPER ARSENATE (CCA)-TREATED PLAYSETS AND DECKS: PART 2 SENSITIVITY AND UNCERTAINTY ANALYSIS

    EPA Science Inventory

    A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...

  1. Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.

    PubMed

    Zhao, Yuchao; Frey, H Christopher

    2004-11-01

    Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
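
    The parametric bootstrap used here to quantify emission-factor uncertainty can be sketched as follows, with a made-up lognormal data set standing in for the empirical emission factors:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical emission-factor measurements (illustrative units)
data = rng.lognormal(mean=0.0, sigma=0.8, size=30)

# MLE for a lognormal distribution: mean and std of the log-data
mu_hat, sigma_hat = np.log(data).mean(), np.log(data).std()

# Parametric bootstrap: resample from the fitted distribution, refit,
# and record the mean emission factor of each replicate
boot_means = []
for _ in range(2000):
    sample = rng.lognormal(mu_hat, sigma_hat, size=data.size)
    boot_means.append(sample.mean())

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"95% uncertainty range for the mean emission factor: [{lo:.2f}, {hi:.2f}]")
```

    The censored-data variant described in the abstract replaces the simple MLE fit with one that accounts for nondetected measurements; the resampling logic is otherwise the same.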

  2. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.
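
    A probabilistic liquefaction-triggering curve of the general kind produced by such Bayesian regressions can be sketched with a lognormal limit state. The one-parameter form and the numbers below are illustrative assumptions only, not the published correlation:

```python
from math import log
from statistics import NormalDist

def liquefaction_probability(csr, crr_median, total_sigma):
    """Generic lognormal limit-state sketch: probability that seismic
    demand (cyclic stress ratio, CSR) exceeds the median soil capacity
    (cyclic resistance ratio, CRR), with total_sigma the combined
    lognormal model uncertainty."""
    return NormalDist().cdf((log(csr) - log(crr_median)) / total_sigma)

# Hypothetical numbers for illustration
print(round(liquefaction_probability(csr=0.25, crr_median=0.20, total_sigma=0.4), 3))
```

    Reducing `total_sigma`, as the study achieves by modeling the uncertainties of the demand and capacity variables explicitly, steepens this curve and sharpens the triggering boundary.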

  3. CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.

    2006-01-01

    This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories was compiled and back-analyzed, and the data were then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment of CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) in the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.

  4. Stratified exact tests for the weak causal null hypothesis in randomized trials with a binary outcome.

    PubMed

    Chiba, Yasutaka

    2017-09-01

    Fisher's exact test is commonly used to compare two groups when the outcome is binary in randomized trials. In the context of causal inference, this test explores the sharp causal null hypothesis (i.e. the causal effect of treatment is the same for all subjects), but not the weak causal null hypothesis (i.e. the causal risks are the same in the two groups). Therefore, in general, rejection of the null hypothesis by Fisher's exact test does not mean that the causal risk difference is not zero. Recently, Chiba (Journal of Biometrics and Biostatistics 2015; 6: 244) developed a new exact test for the weak causal null hypothesis when the outcome is binary in randomized trials; the new test is not based on any large sample theory and does not require any assumption. In this paper, we extend the new test; we create a version of the test applicable to a stratified analysis. The stratified exact test that we propose is general in nature and can be used in several approaches toward the estimation of treatment effects after adjusting for stratification factors. The stratified Fisher's exact test of Jung (Biometrical Journal 2014; 56: 129-140) tests the sharp causal null hypothesis. This test applies a crude estimator of the treatment effect and can be regarded as a special case of our proposed exact test. Our proposed stratified exact test can be straightforwardly extended to analysis of noninferiority trials and to construct the associated confidence interval. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
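
    Fisher's exact test itself, which the proposed stratified test generalizes, is a one-liner in SciPy; the 2x2 table below is invented for illustration:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table from a randomized trial:
# rows = treatment / control, columns = event / no event
table = [[3, 17],
         [9, 11]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.3f}, p = {p_value:.3f}")
```

    As the abstract emphasizes, a small p-value here bears on the sharp causal null hypothesis; it does not by itself establish that the causal risk difference is nonzero.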

  5. On the Security of a Novel Probabilistic Signature Based on Bilinear Square Diffie-Hellman Problem and Its Extension

    PubMed Central

    Zhao, Zhenguo; Shi, Wenbo

    2014-01-01

    Probabilistic signature schemes have been widely used in modern electronic commerce since they provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel probabilistic signature (PS) scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS scheme and UDVS scheme. Through concrete attacks, we demonstrate that neither of their schemes is unforgeable. The security analysis shows that their schemes are not suitable for practical applications. PMID:25025083

  6. A generative, probabilistic model of local protein structure.

    PubMed

    Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas

    2008-07-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.

  7. Fostering Deeper Critical Inquiry with Causal Layered Analysis

    ERIC Educational Resources Information Center

    Haigh, Martin

    2016-01-01

    Causal layered analysis (CLA) is a technique that enables deeper critical inquiry through a structured exploration of four layers of causation. CLA's layers reach down from the surface litany of media understanding, through the layer of systemic causes identified by conventional research, to underpinning worldviews, ideologies and philosophies,…

  8. Tools for Detecting Causality in Space Systems

    NASA Astrophysics Data System (ADS)

    Johnson, J.; Wing, S.

    2017-12-01

    Complex systems such as the solar and magnetospheric environment often exhibit patterns of behavior that suggest underlying organizing principles. Causality is a key organizing principle that is particularly difficult to establish in strongly coupled nonlinear systems, but essential for understanding and modeling the behavior of such systems. While traditional methods of time-series analysis can identify linear correlations, they do not adequately quantify the distinction between causal and coincidental dependence. We discuss tools for detecting causality including Granger causality, transfer entropy, conditional redundancy, and convergent cross maps. The tools are illustrated by applications to magnetospheric and solar physics, including radiation belt, Dst (a magnetospheric state variable), substorm, and solar cycle dynamics.
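
    The first of these tools, Granger causality, can be crudely sketched by asking whether lagged values of one series reduce the prediction error of another; the simulated series below are illustrative, and a real analysis would use a proper F-test and lag selection:

```python
import numpy as np

def granger_stat(x, y, lag=1):
    """Crude Granger-style statistic: does adding lagged y reduce the
    residual variance of an AR(lag) model for x?  Returns the log-ratio
    of restricted to full residual variance (> 0 suggests y helps
    predict x)."""
    n = len(x)
    restricted = np.column_stack([x[lag - 1:n - 1], np.ones(n - lag)])
    full = np.column_stack([x[lag - 1:n - 1], y[lag - 1:n - 1], np.ones(n - lag)])
    target = x[lag:]
    r1 = target - restricted @ np.linalg.lstsq(restricted, target, rcond=None)[0]
    r2 = target - full @ np.linalg.lstsq(full, target, rcond=None)[0]
    return np.log(r1.var() / r2.var())

rng = np.random.default_rng(0)
y = rng.normal(size=500)
x = np.roll(y, 1) + 0.5 * rng.normal(size=500)  # x is driven by lagged y
print(granger_stat(x, y) > granger_stat(y, x))  # True: y -> x, not x -> y
```

    Transfer entropy and convergent cross mapping address the same directional question without assuming a linear autoregressive model.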

  9. Probabilistic Analysis of Large-Scale Composite Structures Using the IPACS Code

    NASA Technical Reports Server (NTRS)

    Lemonds, Jeffrey; Kumar, Virendra

    1995-01-01

    An investigation was performed to ascertain the feasibility of using IPACS (Integrated Probabilistic Assessment of Composite Structures) for probabilistic analysis of a composite fan blade, the development of which is being pursued by various industries for the next generation of aircraft engines. A model representative of the class of fan blades used in the GE90 engine was chosen as the structural component to be analyzed with IPACS. In this study, typical uncertainties are assumed in the inputs, and structural responses for ply stresses and frequencies are evaluated in the form of cumulative distribution functions. Because of the geometric complexity of the blade, the number of plies varies from several hundred at the root to about a hundred at the tip. This represents an extremely complex composites application for the IPACS code. A sensitivity study with respect to various random variables is also performed.

  10. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with values derived dynamically from image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  11. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  12. Probabilistic analysis of the torsional effects on the tall building resistance due to an earthquake event

    NASA Astrophysics Data System (ADS)

    Králik, Juraj; Králik, Juraj

    2017-07-01

    The paper presents the results of deterministic and probabilistic analyses of the accidental torsional effect on reinforced concrete tall buildings due to an earthquake event. The core-column structural system was considered with various configurations in plan. The methodology of the seismic analysis of building structures in Eurocode 8 and JCSS 2000 is discussed. The possibilities of utilizing the LHS method to analyze extensive and robust tasks in FEM are presented. The influence of various input parameters (material, geometry, soil, masses and others) is considered. The deterministic and probabilistic analyses of the seismic resistance of the structure were carried out in the ANSYS program.

  13. Consequences of bullying victimization in childhood and adolescence: A systematic review and meta-analysis

    PubMed Central

    Moore, Sophie E; Norman, Rosana E; Suetani, Shuichi; Thomas, Hannah J; Sly, Peter D; Scott, James G

    2017-01-01

    AIM To identify health and psychosocial problems associated with bullying victimization and conduct a meta-analysis summarizing the causal evidence. METHODS A systematic review was conducted using PubMed, EMBASE, ERIC and PsycINFO electronic databases up to 28 February 2015. The study included published longitudinal and cross-sectional articles that examined health and psychosocial consequences of bullying victimization. All meta-analyses were based on quality-effects models. Evidence for causality was assessed using Bradford Hill criteria and the grading system developed by the World Cancer Research Fund. RESULTS Out of 317 articles assessed for eligibility, 165 satisfied the predetermined inclusion criteria for meta-analysis. Statistically significant associations were observed between bullying victimization and a wide range of adverse health and psychosocial problems. The evidence was strongest for causal associations between bullying victimization and mental health problems such as depression, anxiety, poor general health and suicidal ideation and behaviours. Probable causal associations existed between bullying victimization and tobacco and illicit drug use. CONCLUSION Strong evidence exists for a causal relationship between bullying victimization, mental health problems and substance use. Evidence also exists for associations between bullying victimization and other adverse health and psychosocial problems; however, there is insufficient evidence to conclude causality. The strong evidence that bullying victimization is causative of mental illness highlights the need for schools to implement effective interventions to address bullying behaviours. PMID:28401049

  14. INTEGRATION OF RELIABILITY WITH MECHANISTIC THERMALHYDRAULICS: REPORT ON APPROACH AND TEST PROBLEM RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. S. Schroeder; R. W. Youngblood

    The Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability Program is developing simulation-based methods and tools for analyzing safety margin from a modern perspective. [1] There are multiple definitions of 'margin.' One class of definitions defines margin in terms of the distance between a point estimate of a given performance parameter (such as peak clad temperature) and a point-value acceptance criterion defined for that parameter (such as 2200 F). The present perspective on margin is that it relates to the probability of failure, and not just the distance between a nominal operating point and a criterion. In this work, margin is characterized through a probabilistic analysis of the 'loads' imposed on systems, structures, and components, and their 'capacity' to resist those loads without failing. Given the probabilistic load and capacity spectra, one can assess the probability that load exceeds capacity, leading to component failure. Within the project, we refer to a plot of these probabilistic spectra as 'the logo.' Refer to Figure 1 for a notional illustration. The implications of referring to 'the logo' are (1) RISMC is focused on being able to analyze loads and spectra probabilistically, and (2) calling it 'the logo' tacitly acknowledges that it is a highly simplified picture: meaningful analysis of a given component failure mode may require development of probabilistic spectra for multiple physical parameters, and in many practical cases, 'load' and 'capacity' will not vary independently.
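
    The load-versus-capacity picture can be sketched with a direct Monte Carlo estimate. The distributions and numbers below are purely notional, and, unlike many practical cases noted above, load and capacity are sampled independently here:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Notional probabilistic spectra: peak clad temperature "load" versus
# component "capacity", both in degrees F (illustrative numbers only)
load = rng.normal(loc=1800.0, scale=150.0, size=n)
capacity = rng.normal(loc=2200.0, scale=100.0, size=n)

# Failure probability = fraction of samples where load exceeds capacity
p_fail = np.mean(load > capacity)
print(f"Estimated P(load > capacity) = {p_fail:.4f}")
```

    A point-estimate comparison (1800 F versus a 2200 F criterion) would report a comfortable 400 F margin; the probabilistic view shows the overlap of the two spectra still leaves a small but nonzero failure probability.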

  15. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
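
    The Weibull probabilistic approach mentioned above gives the probability of failure in closed form; a minimal two-parameter sketch with hypothetical strength values (real sizing would also account for stressed volume or area and the measured flaw population):

```python
import math

def weibull_failure_probability(stress, sigma_0, m):
    """Two-parameter Weibull model: probability that a ceramic part
    fails at or below the applied stress.  sigma_0 is the
    characteristic strength; m is the Weibull modulus, which
    quantifies the scatter of the flaw population."""
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

# Hypothetical values for illustration only (e.g. stress in MPa)
print(round(weibull_failure_probability(stress=120.0, sigma_0=250.0, m=10.0), 6))
```

    This is the sense in which the abstract contrasts ceramics with metals: the output is a probability of failure at a given load rather than a single maximum allowable load.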

  16. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  17. Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.

  18. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues, such as co-located modules and passive safety features; use of modern, open-source, and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  19. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To quantify the predictability of water availability more accurately, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing a joint distribution that takes into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.
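
    A copula-based joint distribution of the kind used here can be sketched with a Gaussian copula. The correlation value and the uniform marginals below are assumptions for illustration, not the fitted model from the study:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Assumed dependence between parameter omega and NDVI (illustrative)
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# Gaussian copula: correlated normals -> correlated uniforms on [0, 1]
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
u = norm.cdf(z)

# Map the uniforms through illustrative marginals
omega = 1.0 + 4.0 * u[:, 0]   # omega on (1, 5), uniform marginal
ndvi = u[:, 1]                # NDVI on (0, 1), uniform marginal

print(f"sample correlation of the joint draws: {np.corrcoef(omega, ndvi)[0, 1]:.2f}")
```

    The copula separates the dependence structure from the marginals, so the same construction works whatever marginal distributions are fitted for ω and NDVI.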

  20. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.

  1. Inferring action structure and causal relationships in continuous sequences of human action.

    PubMed

    Buchsbaum, Daphna; Griffiths, Thomas L; Plunkett, Dillon; Gopnik, Alison; Baldwin, Dare

    2015-02-01

    In the real world, causal variables do not come pre-identified or occur in isolation, but instead are embedded within a continuous temporal stream of events. A challenge faced by both human learners and machine learning algorithms is identifying subsequences that correspond to the appropriate variables for causal inference. A specific instance of this problem is action segmentation: dividing a sequence of observed behavior into meaningful actions, and determining which of those actions lead to effects in the world. Here we present a Bayesian analysis of how statistical and causal cues to segmentation should optimally be combined, as well as four experiments investigating human action segmentation and causal inference. We find that both people and our model are sensitive to statistical regularities and causal structure in continuous action, and are able to combine these sources of information in order to correctly infer both causal relationships and segmentation boundaries. Copyright © 2014. Published by Elsevier Inc.

  2. Tsirelson's bound from a generalized data processing inequality

    NASA Astrophysics Data System (ADS)

    Dahlsten, Oscar C. O.; Lercher, Daniel; Renner, Renato

    2012-06-01

    The strength of quantum correlations is bounded from above by Tsirelson's bound. We establish a connection between this bound and the fact that correlations between two systems cannot increase under local operations, a property known as the data processing inequality (DPI). More specifically, we consider arbitrary convex probabilistic theories. These can be equipped with an entropy measure that naturally generalizes the von Neumann entropy, as shown recently in Short and Wehner (2010 New J. Phys. 12 033023) and Barnum et al (2010 New J. Phys. 12 033024). We prove that if the DPI holds with respect to this generalized entropy measure then the underlying theory necessarily respects Tsirelson's bound. We, moreover, generalize this statement to any entropy measure satisfying certain minimal requirements. A consequence of our result is that not all the entropic relations used for deriving Tsirelson's bound via information causality in Pawlowski et al (2009 Nature 461 1101-4) are necessary.

  3. Increased contextual cue utilization with tDCS over the prefrontal cortex during a recognition task

    PubMed Central

    Pergolizzi, Denise; Chua, Elizabeth F.

    2016-01-01

    The precise role of the prefrontal and posterior parietal cortices in recognition performance remains controversial, with questions about whether these regions contribute to recognition via the availability of mnemonic evidence or via decision biases and retrieval orientation. Here we used an explicit memory cueing paradigm, whereby external cues probabilistically predict upcoming memoranda as old or new, in our case with 75% validity, and these cues affect recognition decision biases in the direction of the cue. The present study applied bilateral transcranial direct current stimulation (tDCS) over prefrontal or posterior parietal cortex, or sham tDCS, to test the causal role of these regions in recognition accuracy or decision biasing. Participants who received tDCS over prefrontal cortex showed increased cue utilization compared to tDCS over posterior parietal cortex and sham tDCS, suggesting that the prefrontal cortex is involved in processes that contribute to decision biases in memory. PMID:27845032

  4. Granger causality revisited

    PubMed Central

    Friston, Karl J.; Bastos, André M.; Oswal, Ashwini; van Wijk, Bernadette; Richter, Craig; Litvak, Vladimir

    2014-01-01

This technical paper offers a critical re-evaluation of (spectral) Granger causality measures in the analysis of biological time series. Using realistic (neural mass) models of coupled neuronal dynamics, we evaluate the robustness of parametric and nonparametric Granger causality. Starting from a broad class of generative (state-space) models of neuronal dynamics, we show how their Volterra kernels prescribe the second-order statistics of their response to random fluctuations; characterised in terms of cross-spectral density, cross-covariance, autoregressive coefficients and directed transfer functions. These quantities in turn specify Granger causality — providing a direct (analytic) link between the parameters of a generative model and the expected Granger causality. We use this link to show that Granger causality measures based upon autoregressive models can become unreliable when the underlying dynamics is dominated by slow (unstable) modes — as quantified by the principal Lyapunov exponent. However, nonparametric measures based on causal spectral factors are robust to dynamical instability. We then demonstrate how both parametric and nonparametric spectral causality measures can become unreliable in the presence of measurement noise. Finally, we show that this problem can be finessed by deriving spectral causality measures from Volterra kernels, estimated using dynamic causal modelling. PMID:25003817
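
    As a concrete illustration of the parametric (autoregressive) measure the paper evaluates, Granger causality from y to x can be estimated as the log ratio of residual variances of a restricted model (past of x only) and a full model (past of x and y). The sketch below uses simple lag-1 regressions on simulated data; it is a minimal illustration only, not the paper's state-space machinery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two coupled series in which y drives x (y Granger-causes x).
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.normal()

def resid_var(target, *predictors):
    """Residual variance of a lag-1 least-squares regression."""
    Y = target[1:]
    X = np.column_stack([np.ones(len(Y))] + [p[:-1] for p in predictors])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return np.var(Y - X @ beta)

# Granger causality: log of restricted over full residual variance.
gc_y_to_x = np.log(resid_var(x, x) / resid_var(x, x, y))
gc_x_to_y = np.log(resid_var(y, y) / resid_var(y, y, x))
```

    On this simulated system the y-to-x measure is clearly positive while the x-to-y measure is near zero, matching the simulated coupling direction.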

  5. Establishing causal coherence across sentences: an ERP study

    PubMed Central

    Kuperberg, Gina R.; Paczynski, Martin; Ditman, Tali

    2011-01-01

This study examined neural activity associated with establishing causal relationships across sentences during online comprehension. ERPs were measured while participants read and judged the relatedness of three-sentence scenarios in which the final sentence was highly causally related, intermediately related and causally unrelated to its context. Lexico-semantic co-occurrence was matched across the three conditions using a Latent Semantic Analysis. Critical words in causally unrelated scenarios evoked a larger N400 than words in both highly causally related and intermediately related scenarios, regardless of whether they appeared before or at the sentence-final position. At midline sites, the N400 to intermediately related sentence-final words was attenuated to the same degree as to highly causally related words, but otherwise the N400 to intermediately related words fell in between that evoked by highly causally related and causally unrelated words. No modulation of the Late Positivity/P600 component was observed across conditions. These results indicate that both simple and complex causal inferences can influence the earliest stages of semantically processing an incoming word. Further, they suggest that causal coherence, at the situation level, can influence incremental word-by-word discourse comprehension, even when semantic relationships between individual words are matched. PMID:20175676

  6. Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions

    PubMed Central

    Joffe, Michael; Mindell, Jennifer

    2006-01-01

    Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586
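
    The instrumental-variable idea mentioned above can be made concrete with a small simulation. Assuming a linear model with an unmeasured confounder u and an instrument z that affects the exposure but has no direct path to the outcome (all coefficients are illustrative), the Wald estimator recovers the causal effect where naive regression does not:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

u = rng.normal(size=n)                        # unmeasured confounder
z = rng.normal(size=n)                        # instrument: affects x, not y directly
x = 0.8 * z + u + rng.normal(size=n)          # exposure
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # outcome; true causal effect = 2.0

ols = np.cov(x, y)[0, 1] / np.var(x)          # naive estimate, biased upward by u
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]  # Wald/IV estimate
```

    Because z is correlated with y only through x, the ratio of covariances isolates the causal path and sidesteps the confounding that biases the ordinary regression slope.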

  7. Are quantum-mechanical-like models possible, or necessary, outside quantum physics?

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2014-12-01

    This article examines some experimental conditions that invite and possibly require recourse to quantum-mechanical-like mathematical models (QMLMs), models based on the key mathematical features of quantum mechanics, in scientific fields outside physics, such as biology, cognitive psychology, or economics. In particular, I consider whether the following two correlative features of quantum phenomena that were decisive for establishing the mathematical formalism of quantum mechanics play similarly important roles in QMLMs elsewhere. The first is the individuality and discreteness of quantum phenomena, and the second is the irreducibly probabilistic nature of our predictions concerning them, coupled to the particular character of the probabilities involved, as different from the character of probabilities found in classical physics. I also argue that these features could be interpreted in terms of a particular form of epistemology that suspends and even precludes a causal and, in the first place, realist description of quantum objects and processes. This epistemology limits the descriptive capacity of quantum theory to the description, classical in nature, of the observed quantum phenomena manifested in measuring instruments. Quantum mechanics itself only provides descriptions, probabilistic in nature, concerning numerical data pertaining to such phenomena, without offering a physical description of quantum objects and processes. While QMLMs share their use of the quantum-mechanical or analogous mathematical formalism, they may differ by the roles, if any, the two features in question play in them and by different ways of interpreting the phenomena they considered and this formalism itself. This article will address those differences as well.

  8. Measurement Models for Reasoned Action Theory.

    PubMed

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  9. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis, encompassing the effects of high temperature and high-cycle fatigue, yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. 
The reliability did not change significantly with distribution type, except when the distribution of the centrifugal load was changed from Gaussian to Weibull. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate the research and education components of this project, resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.
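
    The kind of reliability value reported above can be obtained by Monte Carlo sampling of a limit state. The sketch below is a generic stress-strength illustration with made-up distributions, not the calibrated NESSUS/MSD model for Inconel 718:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Illustrative stress-strength model (units MPa); failure when strength < stress.
strength = rng.normal(900.0, 40.0, n)   # degraded material strength
stress = rng.normal(650.0, 50.0, n)     # applied loading (centrifugal + thermal)

p_fail = np.mean(strength < stress)     # Monte Carlo failure probability
reliability = 1.0 - p_fail
```

    Correlated random fields, as used for the blade thickness in the study, would replace the independent draws here with samples from a joint distribution.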

  10. CADDIS Document: Using Data From Other Sources

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  11. Case Studies Nested in Fuzzy-Set QCA on Sufficiency: Formalizing Case Selection and Causal Inference

    ERIC Educational Resources Information Center

    Schneider, Carsten Q.; Rohlfing, Ingo

    2016-01-01

    Qualitative Comparative Analysis (QCA) is a method for cross-case analyses that works best when complemented with follow-up case studies focusing on the causal quality of the solution and its constitutive terms, the underlying causal mechanisms, and potentially omitted conditions. The anchorage of QCA in set theory demands criteria for follow-up…

  12. ICSNPathway: identify candidate causal SNPs and pathways from genome-wide association study by one analytical framework.

    PubMed

    Zhang, Kunlin; Chang, Suhua; Cui, Sijia; Guo, Liyuan; Zhang, Liuyan; Wang, Jing

    2011-07-01

Genome-wide association study (GWAS) is widely utilized to identify genes involved in human complex disease or some other trait. One key challenge for GWAS data interpretation is to identify causal SNPs and provide solid evidence on how they affect the trait. Currently, research focuses on identifying candidate causal variants among the most significant SNPs of a GWAS, while support for the underlying biological mechanisms, as represented by pathways, is lacking. Although pathway-based analysis (PBA) has been designed to identify disease-related pathways by analyzing the full list of SNPs from a GWAS, it does not emphasize interpreting causal SNPs. To our knowledge, so far there is no web server available that solves the challenge of GWAS data interpretation within one analytical framework. ICSNPathway is developed to identify candidate causal SNPs and their corresponding candidate causal pathways from GWAS by integrating linkage disequilibrium (LD) analysis, functional SNP annotation and PBA. ICSNPathway provides a feasible solution to bridge the gap between GWAS and disease mechanism study by generating hypotheses of SNP → gene → pathway(s). The ICSNPathway server is freely available at http://icsnpathway.psych.ac.cn/.

  13. Formulating and Answering High-Impact Causal Questions in Physiologic Childbirth Science: Concepts and Assumptions.

    PubMed

    Snowden, Jonathan M; Tilden, Ellen L; Odden, Michelle C

    2018-06-08

    In this article, we conclude our 3-part series by focusing on several concepts that have proven useful for formulating causal questions and inferring causal effects. The process of causal inference is of key importance for physiologic childbirth science, so each concept is grounded in content related to women at low risk for perinatal complications. A prerequisite to causal inference is determining that the question of interest is causal rather than descriptive or predictive. Another critical step in defining a high-impact causal question is assessing the state of existing research for evidence of causality. We introduce 2 causal frameworks that are useful for this undertaking, Hill's causal considerations and the sufficient-component cause model. We then provide 3 steps to aid perinatal researchers in inferring causal effects in a given study. First, the researcher should formulate a rigorous and clear causal question. We introduce an example of epidural analgesia and labor progression to demonstrate this process, including the central role of temporality. Next, the researcher should assess the suitability of the given data set to answer this causal question. In randomized controlled trials, data are collected with the express purpose of answering the causal question. Investigators using observational data should also ensure that their chosen causal question is answerable with the available data. Finally, investigators should design an analysis plan that targets the causal question of interest. Some data structures (eg, time-dependent confounding by labor progress when estimating the effect of epidural analgesia on postpartum hemorrhage) require specific analytical tools to control for bias and estimate causal effects. The assumptions of consistency, exchangeability, and positivity may be especially useful in carrying out these steps. 
Drawing on appropriate causal concepts and considering relevant assumptions strengthens our confidence that research has reduced the likelihood of alternative explanations (eg, bias, chance) and estimated a causal effect. © 2018 by the American College of Nurse-Midwives.
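
    In potential-outcomes notation, the three identification assumptions named above are commonly written as follows (with A the exposure, Y^a the potential outcome under exposure level a, and L the measured covariates):

```latex
\begin{aligned}
\text{Consistency:}    \quad & A = a \;\implies\; Y = Y^{a},\\
\text{Exchangeability:}\quad & Y^{a} \perp\!\!\!\perp A \mid L \quad \text{for all } a,\\
\text{Positivity:}     \quad & \Pr(A = a \mid L = l) > 0 \quad \text{whenever } \Pr(L = l) > 0.
\end{aligned}
```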

  14. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  15. A probabilistic framework for assessing vulnerability to climate variability and change: The case of the US water supply system

    Treesearch

    Romano Foti; Jorge A. Ramirez; Thomas C. Brown

    2014-01-01

We introduce a probabilistic framework for vulnerability analysis and use it to quantify current and future vulnerability of the US water supply system. We also determine the contributions of hydro-climatic and socio-economic drivers to the changes in projected vulnerability. For all scenarios and global climate models examined, the US Southwest including California and...

  16. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed risks by a large margin ($33 million). Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. 
Analysis of societal benefits and local capacities to cope with regular floods demonstrate the relevance of accounting for the full range of flood events and their relation to both potential damages and benefits in risk assessments. Management measures may thus be designed to reflect local contexts and support benefits of natural hydrologic processes, while minimizing flood damage.
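
    The risk-benefit comparison above rests on expected annual values obtained by integrating damages and benefits over the annual exceedance probabilities of the flood events considered. A minimal sketch with invented numbers (not the Candaba estimates):

```python
import numpy as np

# Illustrative scenario table: return period (years) vs damage and benefit
# in millions of dollars (all values invented for illustration).
return_periods = np.array([1.3, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])
damage = np.array([0.5, 1.0, 2.5, 4.0, 6.5, 8.0, 10.0])
benefit = np.array([3.0, 4.0, 5.5, 6.5, 7.0, 7.2, 7.5])

p = 1.0 / return_periods          # annual exceedance probability (decreasing)

def expected_annual(value, p):
    """Trapezoidal integral of value over exceedance probability."""
    return float(np.sum(0.5 * (value[:-1] + value[1:]) * (p[:-1] - p[1:])))

ead = expected_annual(damage, p)   # expected annual damage
eab = expected_annual(benefit, p)  # expected annual benefit
```

    Comparing `eab` against `ead` across land-use scenarios is the probabilistic analogue of the risk-benefit tradeoff the study evaluates.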

  17. Noninvasive Electromagnetic Source Imaging and Granger Causality Analysis: An Electrophysiological Connectome (eConnectome) Approach

    PubMed Central

    Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo

    2016-01-01

Objective Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source imaging algorithms both to find the network nodes (regions of interest) and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods Source imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies, in which the underlying network (nodes and connectivity pattern) is known, were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ~20%, in estimating underlying brain networks in the simulation studies. Additionally, two focal epilepsy patients were studied and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion Our study indicates that combined source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network nodes location and internodal connectivity). 
Significance The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473

  18. Noninvasive Electromagnetic Source Imaging and Granger Causality Analysis: An Electrophysiological Connectome (eConnectome) Approach.

    PubMed

    Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A; Zhang, Wenbo; He, Bin

    2016-12-01

Combined source-imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a noninvasive fashion. Source-imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source-imaging algorithms both to find the network nodes [regions of interest (ROI)] and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Source-imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies, in which the underlying network (nodes and connectivity pattern) is known, were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from interictal and ictal signals recorded by EEG and/or magnetoencephalography (MEG). Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ∼20%, in estimating underlying brain networks in the simulation studies. Additionally, two focal epilepsy patients were studied and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Our study indicates that combined source-imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network nodes location and internodal connectivity). The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions.

  19. School Connectedness and Chinese Adolescents' Sleep Problems: A Cross-Lagged Panel Analysis

    ERIC Educational Resources Information Center

    Bao, Zhenzhou; Chen, Chuansheng; Zhang, Wei; Jiang, Yanping; Zhu, Jianjun; Lai, Xuefen

    2018-01-01

    Background: Although previous research indicates an association between school connectedness and adolescents' sleep quality, its causal direction has not been determined. This study used a 2-wave cross-lagged panel analysis to explore the likely causal direction between these 2 constructs. Methods: Participants were 888 Chinese adolescents (43.80%…

  20. Comments: Improving Weighting Methods for Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke

    2012-01-01

    The author begins this discussion by thanking Larry Hedges, the editor of the journal, for giving him an opportunity to provide a commentary on this stimulating article. He also would like to congratulate the authors of the article for their insightful discussion on causal mediation analysis, which is one of the most important and challenging…

  1. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    PubMed

    Ishikawa, Akira

    2017-11-27

Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline this integrated approach and discuss its usefulness using my own studies as an example.

  2. Generalized causal mediation and path analysis: Extensions and practical considerations.

    PubMed

    Albert, Jeffrey M; Cho, Jang Ik; Liu, Yiying; Nelson, Suchitra

    2018-01-01

Causal mediation analysis seeks to decompose the effect of a treatment or exposure among multiple possible paths and provide causally interpretable path-specific effect estimates. Recent advances have extended causal mediation analysis to situations with a sequence of mediators or multiple contemporaneous mediators. However, available methods still have limitations, and computational and other challenges remain. The present paper provides an extended causal mediation and path analysis methodology. The new method, implemented in the new R package, gmediation (described in a companion paper), accommodates both a sequence (two stages) of mediators and multiple mediators at each stage, and allows for multiple types of outcomes following generalized linear models. The methodology can also handle unsaturated models and clustered data. Addressing other practical issues, we provide new guidelines for the choice of a decomposition, and for the choice of a reference group multiplier for the reduction of Monte Carlo error in mediation formula computations. The new method is applied to data from a cohort study to illuminate the contribution of alternative biological and behavioral paths in the effect of socioeconomic status on dental caries in adolescence.
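
    In the linear, no-interaction special case, the path-specific effects this methodology generalizes reduce to products of regression coefficients. A small simulation (coefficients are illustrative; the gmediation package handles far more general models) recovering the natural indirect and direct effects:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Structural model: T -> M -> Y plus a direct path T -> Y.
a, b, c = 0.5, 0.8, 0.3                  # illustrative path coefficients
t = rng.binomial(1, 0.5, n).astype(float)
m = a * t + rng.normal(size=n)
y = c * t + b * m + rng.normal(size=n)

a_hat = np.polyfit(t, m, 1)[0]           # T -> M path
X = np.column_stack([np.ones(n), t, m])
_, c_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]

nie = a_hat * b_hat   # natural indirect effect (through M); truth: a*b = 0.40
nde = c_hat           # natural direct effect;               truth: c   = 0.30
```

    With nonlinear links, interactions, or multiple mediators, this product-of-coefficients shortcut no longer applies, which is precisely what motivates the mediation-formula machinery described in the abstract.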

  3. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety, characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties, obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete, corrosion of reinforcement, etc., can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. 
The presented methodology is illustrated on results from two probabilistic studies with different types of concrete structures related to practical applications and made from various materials (with the parameters obtained from real material tests).
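
    The two safety formats mentioned above are linked by a standard relation: for a limit state function g(X), the failure probability and the reliability index satisfy

```latex
p_f \;=\; \Pr\!\left[\, g(\mathbf{X}) \le 0 \,\right],
\qquad
\beta \;=\; -\Phi^{-1}(p_f),
```

    where Φ is the standard normal cumulative distribution function.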

  4. Comparison of bias analysis strategies applied to a large data set.

    PubMed

    Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M

    2014-07-01

    Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
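
    The core of a probabilistic-bias analysis for exposure misclassification is a simulation that repeatedly draws sensitivity and specificity from validation-informed distributions and back-corrects the observed cell counts. A vectorized sketch with invented counts and priors (not the birth-certificate data from this study):

```python
import numpy as np

rng = np.random.default_rng(3)
n_sim = 50_000

# Invented observed 2x2 counts (exposure classification is imperfect):
#                 cases   controls
a_obs, b_obs = 120.0, 880.0      # classified exposed
c_obs, d_obs = 300.0, 3700.0     # classified unexposed

# Priors for sensitivity/specificity, as if informed by a validation substudy.
se = rng.uniform(0.75, 0.95, n_sim)
sp = rng.uniform(0.95, 0.999, n_sim)

def corrected_exposed(obs_exposed, total, se, sp):
    """Back-correct an exposed count for nondifferential misclassification."""
    return (obs_exposed - total * (1.0 - sp)) / (se - (1.0 - sp))

A = corrected_exposed(a_obs, a_obs + c_obs, se, sp)   # corrected exposed cases
B = corrected_exposed(b_obs, b_obs + d_obs, se, sp)   # corrected exposed controls
C = (a_obs + c_obs) - A
D = (b_obs + d_obs) - B

or_dist = (A * D) / (B * C)   # distribution of bias-corrected odds ratios
```

    Percentiles of `or_dist` summarize the direction, magnitude, and uncertainty of the bias; the computational burden the abstract describes comes from running such corrections at the record level across very large cohorts.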

  5. Demonstration of the Application of Composite Load Spectra (CLS) and Probabilistic Structural Analysis (PSAM) Codes to SSME Heat Exchanger Turnaround Vane

    NASA Technical Reports Server (NTRS)

    Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George

    2000-01-01

    This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency- and distance-dependent correlation model that has features to model the decay phenomena along the flow and across the flow, with the capability to introduce a phase delay. The analytical results are compared using two computer codes, SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), and with experimentally observed strain gage data. The computer code NESSUS, with an interface to a subset of the Composite Load Spectra (CLS) code, is used for the probabilistic analysis. A fatigue code was used to calculate fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, the convection velocity coefficient, the stress concentration factor, structural damping, and the thickness of the inner and outer vanes. The need for an appropriate correlation model, in addition to the magnitude of the PSD, is emphasized. The study demonstrates that correlation characteristics, even under random pressure loads, are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternating stress response and drive the fatigue damage for the new design. Since the alternating stress for the redesign is less than the endurance limit of the material, the damage due to high-cycle fatigue is negligible.

  6. Causal transfer function analysis to describe closed loop interactions between cardiovascular and cardiorespiratory variability signals.

    PubMed

    Faes, L; Porta, A; Cucino, R; Cerutti, S; Antolini, R; Nollo, G

    2004-06-01

    Although the concept of a transfer function is intrinsically related to an input-output relationship, the traditional and widely used estimation method merges both feedback and feedforward interactions between the two analyzed signals. This limitation may endanger the reliability of transfer function analysis in biological systems characterized by closed-loop interactions. In this study, a method for estimating the transfer function between closed-loop interacting signals was proposed and validated in the field of cardiovascular and cardiorespiratory variability. The two analyzed signals x and y were described by a bivariate autoregressive model, and the causal transfer function from x to y was estimated after imposing causality by setting to zero the model coefficients representing the reverse effects from y to x. The method was tested in simulations reproducing linear open- and closed-loop interactions, showing a better adherence of the causal transfer function to the theoretical curves with respect to the traditional approach in the presence of non-negligible reverse effects. It was then applied to ten healthy young subjects to characterize the transfer functions from respiration to heart period (RR interval) and to systolic arterial pressure (SAP), and from SAP to RR interval. In the first two cases, the causal and non-causal transfer function estimates were comparable, indicating that respiration, acting as an exogenous signal, sets an open-loop relationship upon SAP and RR interval. On the contrary, the causal and traditional transfer functions from SAP to RR were significantly different, suggesting a considerable influence along the opposite causal direction. Thus, the proposed causal approach seems appropriate for estimating parameters, such as the gain and phase lag from SAP to RR interval, which have considerable clinical and physiological relevance.
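    The estimation scheme the abstract describes can be illustrated on simulated data. This is a minimal order-1 sketch under invented coefficients: the y-equation of a bivariate AR model is fit by least squares, causality is imposed by never letting past y enter the model for x, and the causal transfer function from x to y is evaluated as H(f) = A_yx(f) / (1 - A_yy(f)) (the symbols A_yy, A_yx are this sketch's notation, not the paper's).

```python
# Causal transfer function sketch: fit the y-row of a bivariate AR(1)
# model and evaluate H(f) = A_yx(f) / (1 - A_yy(f)). Simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

# OLS fit of y(t) on [y(t-1), x(t-1)]; imposing causality means the
# reverse path (x regressed on past y) is simply never modeled.
Z = np.column_stack([y[:-1], x[:-1]])
coef, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
a_yy, a_yx = coef

def causal_tf(f):
    """Causal transfer function from x to y at normalized frequency f."""
    z = np.exp(-2j * np.pi * f)
    return (a_yx * z) / (1 - a_yy * z)

gain0 = abs(causal_tf(0.0))
print(f"a_yy={a_yy:.3f} a_yx={a_yx:.3f} gain at f=0: {gain0:.2f}")
```

    With the simulated coefficients the true zero-frequency gain is 0.8 / (1 - 0.5) = 1.6, which the fit recovers closely.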

  7. Disease causality extraction based on lexical semantics and document-clause frequency from biomedical literature.

    PubMed

    Lee, Dong-Gi; Shin, Hyunjung

    2017-05-18

    Recently, research on human disease networks has succeeded in becoming an aid to figuring out the relationships between various diseases. In most disease networks, however, the relationship between diseases has been represented simply as an association. This representation makes it difficult to identify prior diseases and their influence on posterior diseases. In this paper, we propose a causal disease network that captures disease causality through text mining of the biomedical literature. To identify the causality between diseases, the proposed method includes two schemes: the first is the lexicon-based causality term strength, which provides the causal strength of a variety of causality terms based on lexicon analysis; the second is the frequency-based causality strength, which determines the direction and strength of causality based on document and clause frequencies in the literature. We applied the proposed method to 6,617,833 PubMed articles and chose 195 diseases to construct a causal disease network. From all possible pairs of disease nodes in the network, 1,011 causal pairs covering 149 diseases were extracted. The resulting network was compared with that of a previous study. In terms of both coverage and quality, the proposed method outperformed the existing one: it determined 2.7 times more causalities and showed a higher correlation with associated diseases. This research is novel in that the proposed method circumvents the limitations of time and cost that make testing all possible causalities in biological experiments infeasible, and it advances text mining by defining the concept of causality term strength.

  8. Carbon emissions-income relationships with structural breaks: the case of the Middle Eastern and North African countries.

    PubMed

    El Montasser, Ghassen; Ajmi, Ahdi Noomen; Nguyen, Duc Khuong

    2018-01-01

    This article revisits the carbon dioxide (CO2) emissions-GDP causal relationships in the Middle Eastern and North African (MENA) countries by employing the Rossi (Economet Theor 21:962-990, 2005) instability-robust causality test. We show evidence of significant causality relationships for all considered countries within the instability context, whereas the standard Granger causality test fails to detect causal links in any direction, except for Egypt, Iran, and Morocco. An important policy implication resulting from this robust analysis is that income is unaffected by cuts in CO2 emissions for only two MENA countries, the UAE and Syria.

  9. Inferring hidden causal relations between pathway members using reduced Google matrix of directed biological networks

    PubMed Central

    2018-01-01

    Signaling pathways represent parts of the global biological molecular network which connects them into a seamless whole through complex direct and indirect (hidden) crosstalk, whose structure can change during development or in pathological conditions. We suggest a novel methodology, called Googlomics, for the structural analysis of directed biological networks using spectral analysis of their Google matrices, drawing parallels with quantum scattering theory as developed for nuclear and mesoscopic physics and quantum chaos. We introduce an analytical “reduced Google matrix” method for the analysis of biological network structure. The method allows inferring hidden causal relations between the members of a signaling pathway or a functionally related group of genes. We investigate how the structure of hidden causal relations can be reprogrammed as a result of changes in the transcriptional network layer during carcinogenesis. The suggested Googlomics approach rigorously characterizes complex systemic changes in the wiring of large causal biological networks in a computationally efficient way. PMID:29370181

  10. How causal analysis can reveal autonomy in models of biological systems

    NASA Astrophysics Data System (ADS)

    Marshall, William; Kim, Hyunju; Walker, Sara I.; Tononi, Giulio; Albantakis, Larissa

    2017-11-01

    Standard techniques for studying biological systems largely focus on their dynamical or, more recently, their informational properties, usually taking either a reductionist or holistic perspective. Yet, studying only individual system elements or the dynamics of the system as a whole disregards the organizational structure of the system: whether there are subsets of elements with joint causes or effects, and whether the system is strongly integrated or composed of several loosely interacting components. Integrated information theory offers a theoretical framework to (1) investigate the compositional cause-effect structure of a system and to (2) identify causal borders of highly integrated elements comprising local maxima of intrinsic cause-effect power. Here we apply this comprehensive causal analysis to a Boolean network model of the fission yeast (Schizosaccharomyces pombe) cell cycle. We demonstrate that this biological model features a non-trivial causal architecture, whose discovery may provide insights about the real cell cycle that could not be gained from holistic or reductionist approaches. We also show how some specific properties of this underlying causal architecture relate to the biological notion of autonomy. Ultimately, we suggest that analysing the causal organization of a system, including key features like intrinsic control and stable causal borders, should prove relevant for distinguishing life from non-life, and thus could also illuminate the origin of life problem. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  11. Temperature, Not Fine Particulate Matter (PM2.5), is Causally Associated with Short-Term Acute Daily Mortality Rates: Results from One Hundred United States Cities

    PubMed Central

    Cox, Tony; Popken, Douglas; Ricci, Paolo F

    2013-01-01

    Exposures to fine particulate matter (PM2.5) in air (C) have been suspected of contributing causally to increased acute (e.g., same-day or next-day) human mortality rates (R). We tested this causal hypothesis in 100 United States cities using the publicly available NMMAPS database. Although a significant, approximately linear, statistical C-R association exists in simple statistical models, closer analysis suggests that it is not causal. Surprisingly, conditioning on other variables that have been extensively considered in previous analyses (usually using splines or other smoothers to approximate their effects), such as month of the year and mean daily temperature, suggests that they create strong, nonlinear confounding that explains the statistical association between PM2.5 and mortality rates in this data set. As this finding disagrees with conventional wisdom, we apply several different techniques to examine it. Conditional independence tests for potential causation, non-parametric classification tree analysis, Bayesian Model Averaging (BMA), and Granger-Sims causality testing show no evidence that PM2.5 concentrations have any causal impact on increasing mortality rates. This apparent absence of a causal C-R relation, despite their statistical association, has potentially important implications for managing and communicating the uncertain health risks associated with, but not necessarily caused by, PM2.5 exposures. PMID:23983662
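    The confounding mechanism the authors describe can be illustrated with a toy linear example: a third variable that drives both exposure and outcome induces a marginal association that disappears after conditioning on it. All coefficients below are invented; this is a caricature of the logic, not a reanalysis of the NMMAPS data.

```python
# Toy confounding illustration: temperature T drives both PM2.5 (C)
# and mortality (R); C and R are marginally correlated, but the
# association vanishes once T is partialled out.
import numpy as np

rng = np.random.default_rng(2)
n = 20000
T = rng.standard_normal(n)              # temperature (standardized)
C = 0.7 * T + rng.standard_normal(n)    # PM2.5 tracks temperature
R = 0.5 * T + rng.standard_normal(n)    # mortality tracks temperature only

marginal = np.corrcoef(C, R)[0, 1]

# Condition on T by partialling it out of both variables
C_res = C - (np.cov(C, T)[0, 1] / T.var()) * T
R_res = R - (np.cov(R, T)[0, 1] / T.var()) * T
partial = np.corrcoef(C_res, R_res)[0, 1]

print(f"marginal corr(C,R)={marginal:.2f}, partial given T={partial:.2f}")
```

    In this construction the marginal correlation is entirely attributable to T, so the partial correlation is near zero, mirroring the paper's conclusion that the C-R association need not be causal.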

  12. Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erin V.

    2007-01-01

    A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for a 2-bearing shaft assembly in each body flap actuator established a reliability level of 99.6 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
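    The reliability bookkeeping in an analysis of this kind can be sketched with the 2-parameter Weibull model R(t) = exp(-(t/eta)^beta). Only the 99.6 percent reliability of one assembly at 12 missions is taken from the abstract; the shape parameter beta is an assumption here (wear-out failure modes typically have beta > 1), so the resulting fleet numbers are illustrative, not the study's.

```python
# Hedged sketch of 2-parameter Weibull fleet reliability. beta is
# assumed; eta is back-solved from R = 0.996 at t = 12 missions.
import math

beta = 2.0                   # assumed Weibull shape (wear-out)
R_12 = 0.996                 # assembly reliability at 12 missions (from the abstract)
eta = 12 / (-math.log(R_12)) ** (1 / beta)   # solve R(t) = exp(-(t/eta)^beta)

def reliability(t, n_units=1):
    """Series-system reliability of n_units independent assemblies."""
    return math.exp(-n_units * (t / eta) ** beta)

# Fleet of 4 shuttles x 4 actuators = 16 assemblies
for t in (12, 22):
    print(f"t={t} missions: fleet reliability {reliability(t, 16):.3f}")
```

    The series-system step (multiplying 16 identical assembly reliabilities) is what turns a very reliable single unit into a fleet with an appreciable chance of one failure over tens of missions.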

  13. Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erwin V.

    2008-01-01

    A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for the four actuators on one shuttle, each with a 2-bearing shaft assembly, established a reliability level of 96.9 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.

  14. Probabilistic Common Spatial Patterns for Multichannel EEG Analysis

    PubMed Central

    Chen, Zhe; Gao, Xiaorong; Li, Yuanqing; Brown, Emery N.; Gao, Shangkai

    2015-01-01

    Common spatial patterns (CSP) is a well-known spatial filtering algorithm for multichannel electroencephalogram (EEG) analysis. In this paper, we cast the CSP algorithm in a probabilistic modeling setting. Specifically, probabilistic CSP (P-CSP) is proposed as a generic EEG spatio-temporal modeling framework that subsumes the CSP and regularized CSP algorithms. The proposed framework enables us to resolve the overfitting issue of CSP in a principled manner. We derive statistical inference algorithms that can alleviate the issue of local optima. In particular, an efficient algorithm based on eigendecomposition is developed for maximum a posteriori (MAP) estimation in the case of isotropic noise. For more general cases, a variational algorithm is developed for group-wise sparse Bayesian learning for the P-CSP model and for automatically determining the model size. The two proposed algorithms are validated on a simulated data set. Their practical efficacy is also demonstrated by successful applications to single-trial classifications of three motor imagery EEG data sets and by the spatio-temporal pattern analysis of one EEG data set recorded in a Stroop color naming task. PMID:26005228
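    For orientation, the classical CSP computation that the probabilistic framework generalizes can be sketched as a generalized eigendecomposition of the two class covariance matrices. The toy two-channel data below are invented; this is the deterministic baseline, not the P-CSP model itself.

```python
# Classical CSP baseline: spatial filters from the generalized
# eigenproblem C1 w = lambda (C1 + C2) w, on toy 2-channel data.
import numpy as np

rng = np.random.default_rng(5)
n = 10000
# Class 1: variance concentrated on channel 0; class 2: on channel 1
X1 = rng.standard_normal((2, n)) * np.array([[3.0], [1.0]])
X2 = rng.standard_normal((2, n)) * np.array([[1.0], [3.0]])

C1 = X1 @ X1.T / n
C2 = X2 @ X2.T / n

# Solve (C1 + C2)^(-1) C1 w = lambda w
evals, evecs = np.linalg.eig(np.linalg.solve(C1 + C2, C1))
order = np.argsort(evals.real)[::-1]
w = evecs[:, order[0]].real          # filter maximizing class-1 variance share

ratio1 = w @ C1 @ w / (w @ (C1 + C2) @ w)
print(f"variance ratio for top CSP filter: {ratio1:.2f}")
```

    The top filter projects onto the direction where class 1 carries most of the combined variance (about 0.9 here by construction); the probabilistic treatment in the paper regularizes exactly this estimate when covariances are poorly estimated.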

  15. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals, in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight the approximate nature of these reasoning forms, the dualities, and the need for complementary reasoning about relevance.

  16. The application of probabilistic design theory to high temperature low cycle fatigue

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1981-01-01

    Metal fatigue under stress and thermal cycling is a principal mode of failure in gas turbine engine hot section components such as turbine blades, disks, and combustor liners. Designing for fatigue is subject to considerable uncertainty, e.g., scatter in cycles to failure, in available fatigue test data, and in operating environment data, as well as uncertainties in the models used to predict stresses. Methods of analyzing fatigue test data for probabilistic design purposes are summarized. The general strain-life model as well as homoscedastic and heteroscedastic models are considered. Modern probabilistic design theory is reviewed, and examples are presented that illustrate its application to reliability analysis of gas turbine engine components.

  17. Probabilistic micromechanics for metal matrix composites

    NASA Astrophysics Data System (ADS)

    Engelstad, S. P.; Reddy, J. N.; Hopkins, Dale A.

    A probabilistic micromechanics-based nonlinear analysis procedure is developed to predict and quantify the variability in the properties of high temperature metal matrix composites. Monte Carlo simulation is used to model the probabilistic distributions of the constituent level properties including fiber, matrix, and interphase properties, volume and void ratios, strengths, fiber misalignment, and nonlinear empirical parameters. The procedure predicts the resultant ply properties and quantifies their statistical scatter. Graphite/copper and silicon carbide/titanium aluminide (SCS-6 TI15) unidirectional plies are considered to demonstrate the predictive capabilities. The procedure is believed to have a high potential for use in material characterization and selection to precede and assist in experimental studies of new high temperature metal matrix composites.
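    The Monte Carlo propagation step can be sketched with a simple rule-of-mixtures relation standing in for the paper's micromechanics equations; the constituent property distributions below are hypothetical, chosen only to show how constituent scatter becomes ply-level scatter.

```python
# Illustrative Monte Carlo micromechanics sketch: propagate scatter in
# constituent properties through a rule-of-mixtures model to get the
# statistical scatter of a ply modulus. Distributions are hypothetical.
import random
import statistics

random.seed(42)

def ply_longitudinal_modulus():
    Ef = random.gauss(380.0, 15.0)    # fiber modulus, GPa (assumed)
    Em = random.gauss(110.0, 8.0)     # matrix modulus, GPa (assumed)
    Vf = random.gauss(0.35, 0.02)     # fiber volume fraction (assumed)
    return Vf * Ef + (1.0 - Vf) * Em  # rule of mixtures

samples = [ply_longitudinal_modulus() for _ in range(20000)]
mean = statistics.fmean(samples)
cov = statistics.stdev(samples) / mean   # coefficient of variation
print(f"E1 mean={mean:.1f} GPa, CoV={cov:.1%}")
```

    The output pair (mean, coefficient of variation) is exactly the kind of "resultant ply property plus statistical scatter" summary the procedure produces, here for one property under one toy model.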

  18. Limited-scope probabilistic safety analysis for the Los Alamos Meson Physics Facility (LAMPF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharirli, M.; Rand, J.L.; Sasser, M.K.

    1992-01-01

    The reliability of instrumentation and safety systems is a major issue in the operation of accelerator facilities. A probabilistic safety analysis was performed on the key safety and instrumentation systems at the Los Alamos Meson Physics Facility (LAMPF). In Phase I of this unique study, the Personnel Safety System (PSS) and the Current Limiters (XLs) were analyzed through the use of fault tree analysis, failure modes and effects analysis, and criticality analysis. Phase II of the program updated and reevaluated the safety systems after the Phase I recommendations were implemented. This paper provides a brief review of the studies involved in Phases I and II of the program.

  19. Limited-scope probabilistic safety analysis for the Los Alamos Meson Physics Facility (LAMPF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharirli, M.; Rand, J.L.; Sasser, M.K.

    1992-12-01

    The reliability of instrumentation and safety systems is a major issue in the operation of accelerator facilities. A probabilistic safety analysis was performed on the key safety and instrumentation systems at the Los Alamos Meson Physics Facility (LAMPF). In Phase I of this unique study, the Personnel Safety System (PSS) and the Current Limiters (XLs) were analyzed through the use of fault tree analysis, failure modes and effects analysis, and criticality analysis. Phase II of the program updated and reevaluated the safety systems after the Phase I recommendations were implemented. This paper provides a brief review of the studies involved in Phases I and II of the program.

  20. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    USGS Publications Warehouse

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    We urge ecologists to consider productivity–richness relationships through the lens of causal networks to advance our understanding beyond bivariate analysis. Further, we emphasize that models based on a causal network conceptualization can also provide more meaningful guidance for conservation management than can a bivariate perspective. Measuring only two variables does not permit the evaluation of complex ideas nor resolve debates about underlying mechanisms.

  1. Analysis of electromagnetic forces and causality in electron microscopy.

    PubMed

    Reyes-Coronado, Alejandro; Ortíz-Solano, Carlos Gael; Zabala, Nerea; Rivacoba, Alberto; Esquivel-Sirvent, Raúl

    2018-09-01

    The non-physical effects on the transverse momentum transfer from fast electrons to gold nanoparticles associated with the use of non-causal dielectric functions are studied. A direct test of causality based on the surface Kramers-Kronig relations is presented. This test is applied to the different dielectric functions used to describe gold nanostructures in electron microscopy.

  2. Using path analysis to examine causal relationships among balanced scorecard performance indicators for general hospitals: the case of a public hospital system in Taiwan.

    PubMed

    Yang, Ming-Chin; Tung, Yu-Chi

    2006-01-01

    The aim of this article is to examine whether the causal relationships among the performance indicators of the balanced scorecard (BSC) framework hold in hospitals. Data were collected from all twenty-one general hospitals in a public hospital system and their supervising agency for the three-year period 2000-2002. The results of the path analyses identified significant causal relationships among the four perspectives of the BSC model. We also verified the relationships among indicators within each perspective, some of which varied over time. We conclude that hospital administrators can use path analysis to help them identify and manage leading indicators when adopting the BSC model. However, they should also validate the causal relationships between leading and lagging indicators periodically, because the management environment changes constantly.

  3. Granger causality network reconstruction of conductance-based integrate-and-fire neuronal systems.

    PubMed

    Zhou, Douglas; Xiao, Yanyang; Zhang, Yaoyu; Xu, Zhiqin; Cai, David

    2014-01-01

    Reconstruction of anatomical connectivity from measured dynamical activities of coupled neurons is one of the fundamental issues in understanding the structure-function relationship of neuronal circuitry. Many approaches have been developed to address this issue based on either electrical or metabolic data observed in experiments. The Granger causality (GC) analysis remains one of the major approaches for exploring the dynamical causal connectivity among individual neurons or neuronal populations. However, it is yet to be clarified how such causal connectivity, i.e., the GC connectivity, can be mapped to the underlying anatomical connectivity in neuronal networks. We perform the GC analysis on conductance-based integrate-and-fire (I&F) neuronal networks to obtain their causal connectivity. Through numerical experiments, we find that the underlying synaptic connectivity amongst individual neurons or subnetworks can be successfully reconstructed by the GC connectivity constructed from voltage time series. Furthermore, this reconstruction is insensitive to dynamical regimes and can be achieved without perturbing the systems or prior knowledge of neuronal model parameters. Surprisingly, the synaptic connectivity can even be reconstructed merely from the raster of the system, i.e., the spike timing of neurons. Using spike-triggered correlation techniques, we establish a direct mapping between the causal connectivity and the synaptic connectivity for conductance-based I&F neuronal networks, and show that the GC is quadratically related to the coupling strength. The theoretical approach we develop here may provide a framework for examining the validity of the GC analysis in other settings.
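    The GC quantity the abstract builds on can be demonstrated on a toy pair of simulated time series: the GC from x to y is the log ratio of the mean squared residual when y is predicted from its own past alone versus from the past of both signals. All coefficients below are invented; this is the generic linear GC computation, not the conductance-based network model.

```python
# Toy pairwise Granger causality on simulated series where x drives y.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.6 * x[t - 1] + rng.standard_normal()

def msr(target, regressors):
    """Mean squared residual of an OLS fit (no intercept)."""
    coef, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    r = target - regressors @ coef
    return np.mean(r ** 2)

restricted = msr(y[1:], np.column_stack([y[:-1]]))            # own past only
full = msr(y[1:], np.column_stack([y[:-1], x[:-1]]))          # both pasts
gc_x_to_y = np.log(restricted / full)

# Reverse direction: x never listens to y, so GC should be near zero
restricted_r = msr(x[1:], np.column_stack([x[:-1]]))
full_r = msr(x[1:], np.column_stack([x[:-1], y[:-1]]))
gc_y_to_x = np.log(restricted_r / full_r)

print(f"GC x->y = {gc_x_to_y:.3f}, GC y->x = {gc_y_to_x:.3f}")
```

    The asymmetry of the two GC values is what lets the reconstruction recover directed connections; the paper's contribution is showing that, for I&F networks, this value maps quadratically onto the synaptic coupling strength.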

  4. Granger Causality Network Reconstruction of Conductance-Based Integrate-and-Fire Neuronal Systems

    PubMed Central

    Zhou, Douglas; Xiao, Yanyang; Zhang, Yaoyu; Xu, Zhiqin; Cai, David

    2014-01-01

    Reconstruction of anatomical connectivity from measured dynamical activities of coupled neurons is one of the fundamental issues in understanding the structure-function relationship of neuronal circuitry. Many approaches have been developed to address this issue based on either electrical or metabolic data observed in experiments. The Granger causality (GC) analysis remains one of the major approaches for exploring the dynamical causal connectivity among individual neurons or neuronal populations. However, it is yet to be clarified how such causal connectivity, i.e., the GC connectivity, can be mapped to the underlying anatomical connectivity in neuronal networks. We perform the GC analysis on conductance-based integrate-and-fire (IF) neuronal networks to obtain their causal connectivity. Through numerical experiments, we find that the underlying synaptic connectivity amongst individual neurons or subnetworks can be successfully reconstructed by the GC connectivity constructed from voltage time series. Furthermore, this reconstruction is insensitive to dynamical regimes and can be achieved without perturbing the systems or prior knowledge of neuronal model parameters. Surprisingly, the synaptic connectivity can even be reconstructed merely from the raster of the system, i.e., the spike timing of neurons. Using spike-triggered correlation techniques, we establish a direct mapping between the causal connectivity and the synaptic connectivity for conductance-based IF neuronal networks, and show that the GC is quadratically related to the coupling strength. The theoretical approach we develop here may provide a framework for examining the validity of the GC analysis in other settings. PMID:24586285

  5. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
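    The Monte Carlo computation the abstract recommends for region-wide analyses can be caricatured in a few lines: random source events arrive at a Poisson rate, each produces a runup drawn from a heavy-tailed distribution, and the hazard statistic is the probability of exceeding a threshold within an exposure window. The event rate, runup distribution, and threshold below are all invented numbers, not values from any hazard study.

```python
# Schematic Monte Carlo sketch of a probabilistic tsunami hazard
# exceedance probability for a 50-year exposure window. Toy numbers.
import math
import random

random.seed(7)

def simulate_max_runup(years=50, rate=0.1):
    """Largest runup (m) seen in one simulated exposure window."""
    # Poisson number of tsunamigenic events in the window (Knuth's method)
    L = math.exp(-rate * years)
    p, k = 1.0, 0
    while True:
        p *= random.random()
        if p <= L:
            break
        k += 1
    runups = [random.lognormvariate(0.0, 1.0) for _ in range(k)]
    return max(runups, default=0.0)

threshold = 3.0   # meters
trials = 20000
exceed = sum(simulate_max_runup() > threshold for _ in range(trials))
print(f"P(runup > {threshold} m in 50 yr) ~ {exceed / trials:.2f}")
```

    Sweeping the threshold over a grid of runup heights would trace out the hazard curve; a real PTHA replaces the lognormal draw with numerical propagation from each simulated source.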

  6. Renewable energy consumption and economic growth in nine OECD countries: bounds test approach and causality analysis.

    PubMed

    Hung-Pin, Lin

    2014-01-01

    The purpose of this paper is to investigate the short-run and long-run causality between renewable energy (RE) consumption and economic growth (EG) in nine OECD countries over the period 1982 to 2011. To examine the linkage, this paper uses the autoregressive distributed lag (ARDL) bounds testing approach to cointegration and vector error-correction models to test the causal relationship between variables. Co-integration and causal relationships are found in five countries: the United States of America (USA), Japan, Germany, Italy, and the United Kingdom (UK). The overall results indicate that (1) a short-run unidirectional causality runs from EG to RE in Italy and the UK; (2) long-run unidirectional causalities run from RE to EG for Germany, Italy, and the UK; (3) a long-run unidirectional causality runs from EG to RE in the USA and Japan; (4) both long-run and strong unidirectional causalities run from RE to EG for Germany and the UK; and (5) both long-run and strong unidirectional causalities run from EG to RE only in the USA. Further evidence reveals that policies for renewable energy conservation may have no impact on economic growth in France, Denmark, Portugal, and Spain.

  7. Renewable Energy Consumption and Economic Growth in Nine OECD Countries: Bounds Test Approach and Causality Analysis

    PubMed Central

    Hung-Pin, Lin

    2014-01-01

    The purpose of this paper is to investigate the short-run and long-run causality between renewable energy (RE) consumption and economic growth (EG) in nine OECD countries over the period 1982 to 2011. To examine the linkage, this paper uses the autoregressive distributed lag (ARDL) bounds testing approach to cointegration and vector error-correction models to test the causal relationship between variables. Co-integration and causal relationships are found in five countries: the United States of America (USA), Japan, Germany, Italy, and the United Kingdom (UK). The overall results indicate that (1) a short-run unidirectional causality runs from EG to RE in Italy and the UK; (2) long-run unidirectional causalities run from RE to EG for Germany, Italy, and the UK; (3) a long-run unidirectional causality runs from EG to RE in the USA and Japan; (4) both long-run and strong unidirectional causalities run from RE to EG for Germany and the UK; and (5) both long-run and strong unidirectional causalities run from EG to RE only in the USA. Further evidence reveals that policies for renewable energy conservation may have no impact on economic growth in France, Denmark, Portugal, and Spain. PMID:24558343

  8. A qualitative study evaluating causality attribution for serious adverse events during early phase oncology clinical trials.

    PubMed

    Mukherjee, Som D; Coombes, Megan E; Levine, Mitch; Cosby, Jarold; Kowaleski, Brenda; Arnold, Andrew

    2011-10-01

    In early phase oncology trials, novel targeted therapies are increasingly being tested in combination with traditional agents, creating greater potential for enhanced and new toxicities. When a patient experiences a serious adverse event (SAE), investigators must determine whether the event is attributable to the investigational drug or not. This study seeks to understand the clinical reasoning, tools used, and challenges faced by the researchers who assign causality to SAEs. Thirty-two semi-structured interviews were conducted with medical oncologists and trial coordinators at six Canadian academic cancer centres. Interviews were recorded and transcribed verbatim. Individual interview content analysis was followed by thematic analysis across the interview set. Our study found that causality assessment tends to be a rather complex process, often conducted without complete clinical and investigational data at hand. Researchers described using a common processing strategy whereby they gather pertinent information, eliminate alternative explanations, and consider whether or not the study drug resulted in the SAE. Many of the interviewed participants voiced concern that causality assessments are often conducted quickly and tend to be highly subjective. Many participants were unable to identify any useful tools to help in assigning causality and welcomed more objectivity in the overall process. Attributing causality to SAEs is a complex process. Clinical trial researchers apply a logical system of reasoning, but feel that the current method of assigning causality could be improved. Based on these findings, future research involving the development of a new causality assessment tool specifically for use in early phase oncology clinical trials may be useful.

  9. Causal Imprinting in Causal Structure Learning

    PubMed Central

    Taylor, Eric G.; Ahn, Woo-kyoung

    2012-01-01

    Suppose one observes a correlation between two events, B and C, and infers that B causes C. Later one discovers that event A explains away the correlation between B and C. Normatively, one should now dismiss or weaken the belief that B causes C. Nonetheless, participants in the current study who observed a positive contingency between B and C followed by evidence that B and C were independent given A, persisted in believing that B causes C. The authors term this difficulty in revising initially learned causal structures “causal imprinting.” Throughout four experiments, causal imprinting was obtained using multiple dependent measures and control conditions. A Bayesian analysis showed that causal imprinting may be normative under some conditions, but causal imprinting also occurred in the current study when it was clearly non-normative. It is suggested that causal imprinting occurs due to the influence of prior knowledge on how reasoners interpret later evidence. Consistent with this view, when participants first viewed the evidence showing that B and C are independent given A, later evidence with only B and C did not lead to the belief that B causes C. PMID:22859019
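
    The explaining-away structure in this record can be sketched with a small simulation: when A is a common cause of B and C, the B-C contingency is strong marginally but vanishes once A is held fixed. The generative model and probabilities below are illustrative stand-ins, not the study's stimuli.

    ```python
    import random

    random.seed(0)

    # Hypothetical common-cause model: A drives both B and C, so B and C are
    # correlated marginally but independent given A.
    n = 50_000
    data = []
    for _ in range(n):
        a = random.random() < 0.5
        b = random.random() < (0.8 if a else 0.2)
        c = random.random() < (0.8 if a else 0.2)
        data.append((a, b, c))

    def contingency(pairs):
        """Delta-P: P(C=1 | B=1) - P(C=1 | B=0), a standard contingency measure."""
        b1 = [c for b, c in pairs if b]
        b0 = [c for b, c in pairs if not b]
        return sum(b1) / len(b1) - sum(b0) / len(b0)

    marginal = contingency([(b, c) for a, b, c in data])
    given_a1 = contingency([(b, c) for a, b, c in data if a])

    print(f"Delta-P(B, C) marginally: {marginal:.3f}")   # clearly positive
    print(f"Delta-P(B, C) given A=1:  {given_a1:.3f}")   # near zero
    ```

    A learner who saw only the first statistic would infer that B causes C; the normative move, once A is observed, is to let the second statistic override that belief.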

  10. Non-Gaussian Methods for Causal Structure Learning.

    PubMed

    Shimizu, Shohei

    2018-05-22

    Causal structure learning is one of the most exciting new topics in the fields of machine learning and statistics. In many empirical sciences, including prevention science, the causal mechanisms underlying various phenomena need to be studied. Nevertheless, in many cases, classical methods for causal structure learning are not capable of estimating the causal structure of variables, because they explicitly or implicitly assume Gaussianity of the data and typically utilize only the covariance structure. In many applications, however, the data are non-Gaussian, which means that the data distribution may contain more information than the covariance matrix can capture. Thus, many new methods have recently been proposed that use the non-Gaussian structure of data to infer the causal structure of variables. This paper introduces prevention scientists to such causal structure learning methods, particularly those based on the linear, non-Gaussian, acyclic model known as LiNGAM. Under its assumptions, these non-Gaussian data analysis tools can fully estimate the underlying causal structure of variables even in the presence of unobserved common causes, a feature that contrasts with other approaches. A simulated example is also provided.
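
    The pairwise intuition behind LiNGAM can be sketched as follows, assuming a linear model with uniform (hence non-Gaussian) noise: regressing in the true causal direction leaves a residual independent of the regressor, while the reverse direction does not. The third-moment dependence measure below is a crude stand-in for the ICA or mutual-information machinery of the actual LiNGAM algorithms.

    ```python
    import random

    random.seed(1)

    # Linear non-Gaussian pair: x -> y with uniform disturbances.
    n = 20_000
    x = [random.uniform(-1, 1) for _ in range(n)]
    y = [2.0 * xi + random.uniform(-0.5, 0.5) for xi in x]

    def mean(v):
        return sum(v) / len(v)

    def corr(u, v):
        mu, mv = mean(u), mean(v)
        cov = mean([(a - mu) * (b - mv) for a, b in zip(u, v)])
        su = mean([(a - mu) ** 2 for a in u]) ** 0.5
        sv = mean([(b - mv) ** 2 for b in v]) ** 0.5
        return cov / (su * sv)

    def dependence_after_regression(u, v):
        """Regress v on u, then measure residual dependence on u via a
        higher-order correlation (near zero only if the residual is
        independent of u, not merely uncorrelated with it)."""
        mu, mv = mean(u), mean(v)
        b = mean([(a - mu) * (c - mv) for a, c in zip(u, v)]) / mean(
            [(a - mu) ** 2 for a in u])
        resid = [c - mv - b * (a - mu) for a, c in zip(u, v)]
        return abs(corr([a ** 3 for a in u], resid))

    forward = dependence_after_regression(x, y)   # residual ~ independent of x
    backward = dependence_after_regression(y, x)  # residual depends on y
    print(forward, backward)
    ```

    The smaller statistic identifies the causal direction; LiNGAM generalizes this asymmetry to many variables.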

  11. A novel probabilistic framework for event-based speech recognition

    NASA Astrophysics Data System (ADS)

    Juneja, Amit; Espy-Wilson, Carol

    2003-10-01

    One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features-syllabic, sonorant and continuant-and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.

  12. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    PubMed

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates, in ensembles of pyramidal cells with lateral inhibition, a fundamental building block for this: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner, the resulting network is enabled to extract statistical information from complex input streams and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  13. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity

    PubMed Central

    Pecevski, Dejan

    2016-01-01

    Abstract Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates, in ensembles of pyramidal cells with lateral inhibition, a fundamental building block for this: probabilistic associations between neurons that represent, through their firing, current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner, the resulting network is enabled to extract statistical information from complex input streams and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  14. Development of Probabilistic Rigid Pavement Design Methodologies for Military Airfields.

    DTIC Science & Technology

    1983-12-01

    4A161102AT22, Task AO, Work Unit 009, "Methodology for Considering Material Variability in Pavement Design." OCE Project Monitor was Mr. S. S. Gillespie. Contents: Volume I, State of the Art: Variability of Airfield Pavement Materials; Volume II, Mathematical Formulation of ...; Volume IV, Probabilistic Analysis of Rigid Airfield Design by Elastic Layered Theory.

  15. Automated Text Data Mining Analysis of Five Decades of Educational Leadership Research Literature: Probabilistic Topic Modeling of "EAQ" Articles From 1965 to 2014

    ERIC Educational Resources Information Center

    Wang, Yinying; Bowers, Alex J.; Fikis, David J.

    2017-01-01

    Purpose: The purpose of this study is to describe the underlying topics and the topic evolution in the 50-year history of educational leadership research literature. Method: We used automated text data mining with probabilistic latent topic models to examine the full text of the entire publication history of all 1,539 articles published in…

  16. A Logical Approach to Multilevel Security of Probabilistic Systems

    DTIC Science & Technology

    1998-01-01

    their usefulness in security analysis. Key Words: Formal modeling, Verification, Knowledge, Security, Probabilistic Systems. Supported by grant HKUST 608...94E from the Hong Kong Research Grants Council. Author for correspondence: syverson@itd.nrl.navy.mil. Supported by ONR.

  17. Measurement Models for Reasoned Action Theory

    PubMed Central

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach. PMID:23243315

  18. Probabilistic durability assessment of concrete structures in marine environments: Reliability and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Yu, Bo; Ning, Chao-lie; Li, Bing

    2017-03-01

    A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
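
    The first-order reliability machinery described in this record can be illustrated on a toy limit state. The sketch below substitutes a generic capacity-demand product for the chloride-ingress model (all variables and distributions are illustrative), runs the standard HL-RF iteration in independent standard normal space, and cross-checks the failure probability against Monte Carlo simulation, mirroring the paper's verification strategy.

    ```python
    import math
    import random

    random.seed(2)

    # Illustrative limit state: g = X1 * X2 - 1000, with independent normals
    # X1 ~ N(40, 5) (a "capacity" factor) and X2 ~ N(50, 2.5).
    mu = (40.0, 50.0)
    sd = (5.0, 2.5)

    def g_std(u):
        """Limit state in standard normal space, x_i = mu_i + sd_i * u_i."""
        x1 = mu[0] + sd[0] * u[0]
        x2 = mu[1] + sd[1] * u[1]
        return x1 * x2 - 1000.0

    def grad_g_std(u):
        x1 = mu[0] + sd[0] * u[0]
        x2 = mu[1] + sd[1] * u[1]
        return (sd[0] * x2, sd[1] * x1)

    # HL-RF iteration toward the most probable failure point.
    u = [0.0, 0.0]
    for _ in range(50):
        g = g_std(u)
        dg = grad_g_std(u)
        norm2 = dg[0] ** 2 + dg[1] ** 2
        lam = (dg[0] * u[0] + dg[1] * u[1] - g) / norm2
        u = [lam * dg[0], lam * dg[1]]

    beta = math.hypot(u[0], u[1])               # reliability index
    pf_form = 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta)

    # Monte Carlo cross-check.
    n = 200_000
    fails = sum(
        1 for _ in range(n)
        if random.gauss(mu[0], sd[0]) * random.gauss(mu[1], sd[1]) < 1000.0)
    pf_mc = fails / n

    print(f"beta = {beta:.2f}, pf_FORM = {pf_form:.2e}, pf_MC = {pf_mc:.2e}")
    ```

    FORM linearizes the limit state at the most probable point, so the two estimates agree only to first order; the gap between them is what the paper's second-order and Monte Carlo comparisons quantify.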

  19. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
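
    The PDA recipe in this record, a physics-based margin model with every driving parameter a random variable, Monte Carlo for the failure probability, and sensitivities on top, can be sketched as follows. The margin model and all distributions below are invented stand-ins, not Ares I data.

    ```python
    import random

    random.seed(3)

    # Hypothetical margin model for one failure scenario:
    # margin = capacity(material, geometry) - demand(load).
    def sample():
        strength = random.gauss(400.0, 30.0)   # material strength
        thickness = random.gauss(5.0, 0.25)    # wall thickness
        load = random.gauss(1500.0, 150.0)     # applied load
        margin = strength * thickness - load
        return (strength, thickness, load), margin

    n = 100_000
    inputs, margins = zip(*(sample() for _ in range(n)))
    pf = sum(m < 0 for m in margins) / n       # failure probability estimate

    def corr(u, v):
        m = len(u)
        mu = sum(u) / m
        mv = sum(v) / m
        cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / m
        su = (sum((a - mu) ** 2 for a in u) / m) ** 0.5
        sv = (sum((b - mv) ** 2 for b in v) / m) ** 0.5
        return cov / (su * sv)

    # Sensitivity: correlation of each random input with the margin.
    names = ("strength", "thickness", "load")
    sens = {name: corr([row[i] for row in inputs], margins)
            for i, name in enumerate(names)}
    print(f"pf ~ {pf:.3f}", sens)
    ```

    The sensitivity ranking plays the role described in the abstract: it points at the parameters whose uncertainty most drives the predicted failure probability, and hence at the most promising design improvements.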

  20. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  1. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  2. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
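
    The outermost level of parallelism identified in this paper, independent samples in a Monte Carlo study, can be sketched with stdlib primitives. The response function below is a hypothetical toy, not the stochastic finite element code itself, and threads stand in for the processes or MPI ranks a real PSA code would use; each worker gets its own seeded RNG stream so that chunks are reproducible and independent.

    ```python
    import random
    from concurrent.futures import ThreadPoolExecutor

    def chunk_sums(seed, n):
        """One worker's share of a Monte Carlo response study: its own RNG
        stream, its own partial sums (illustrative response, not NESSUS)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            e_modulus = rng.gauss(200.0, 10.0)   # hypothetical random stiffness
            load = rng.gauss(50.0, 5.0)          # hypothetical random load
            total += load / e_modulus            # toy "response": a compliance
        return total, n

    chunks = 8
    per_chunk = 25_000
    with ThreadPoolExecutor(max_workers=4) as pool:
        parts = list(pool.map(lambda s: chunk_sums(s, per_chunk), range(chunks)))

    total = sum(t for t, _ in parts)
    count = sum(n for _, n in parts)
    mean_response = total / count
    print(f"mean response from {count} samples in {chunks} chunks: "
          f"{mean_response:.4f}")
    ```

    Because samples never communicate, this level of the computation scales almost perfectly, which is why the paper can approach the theoretical speedup before the finer-grained (per-solve) parallelism even comes into play.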

  3. Probabilistic analysis of bladed turbine disks and the effect of mistuning

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Nagpal, V. K.; Chamis, Christos C.

    1990-01-01

    Probabilistic assessment of the maximum blade response on a mistuned rotor disk is performed using the computer code NESSUS. The uncertainties in natural frequency, excitation frequency, amplitude of excitation, and damping are included to obtain the cumulative distribution function (CDF) of blade responses. Advanced mean value first order analysis is used to compute the CDF. The sensitivities of different random variables are identified. The effect of the number of blades on a rotor on mistuning is evaluated. It is shown that the uncertainties associated with the forcing function parameters have a significant effect on the response distribution of the bladed rotor.

  4. Probabilistic analysis of bladed turbine disks and the effect of mistuning

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin; Nagpal, V. K.; Chamis, C. C.

    1990-01-01

    Probabilistic assessment of the maximum blade response on a mistuned rotor disk is performed using the computer code NESSUS. The uncertainties in natural frequency, excitation frequency, amplitude of excitation, and damping have been included to obtain the cumulative distribution function (CDF) of blade responses. Advanced mean value first order analysis is used to compute the CDF. The sensitivities of different random variables are identified. The effect of the number of blades on a rotor on mistuning is evaluated. It is shown that the uncertainties associated with the forcing function parameters have a significant effect on the response distribution of the bladed rotor.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha

    Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.

  6. Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2003-01-01

    The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.

  7. Probabilistic analysis of structures involving random stress-strain behavior

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Thacker, B. H.; Harren, S. V.

    1991-01-01

    The present methodology for analysis of structures with random stress-strain behavior characterizes the uniaxial stress-strain curve in terms of (1) elastic modulus, (2) engineering stress at initial yield, (3) initial plastic-hardening slope, (4) engineering stress at point of ultimate load, and (5) engineering strain at point of ultimate load. The methodology is incorporated into the Numerical Evaluation of Stochastic Structures Under Stress code for probabilistic structural analysis. The illustrative problem of a thick cylinder under internal pressure, where both the internal pressure and the stress-strain curve are random, is addressed by means of the code. The response value is the cumulative distribution function of the equivalent plastic strain at the inner radius.

  8. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Astrophysics Data System (ADS)

    Godines, Cody R.; Manteufel, Randall D.

    2002-12-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. 
For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable enhancement of the program.
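
The LHS-versus-MC comparison reported here can be illustrated on a toy response function (illustrative, not one of the SAE benchmark cases): stratifying the unit interval forces every batch of samples to cover the input distribution evenly, which shrinks the spread of repeated mean estimates relative to plain Monte Carlo at the same sample count.

```python
import random
from statistics import NormalDist, pstdev

random.seed(4)
ppf = NormalDist().inv_cdf  # inverse CDF of the standard normal

def mc_sample(n):
    return [ppf(random.random()) for _ in range(n)]

def lhs_sample(n):
    # One point per equal-probability stratum, in shuffled order.
    strata = [(k + random.random()) / n for k in range(n)]
    random.shuffle(strata)
    return [ppf(p) for p in strata]

def response(x):
    # Toy response standing in for a structural model output.
    return 3.0 * x + 0.5 * x * x

def spread_of_mean(sampler, reps=300, n=100):
    """Std of the mean estimate over repeated studies of n samples each."""
    means = []
    for _ in range(reps):
        xs = sampler(n)
        means.append(sum(response(x) for x in xs) / n)
    return pstdev(means)

mc_spread = spread_of_mean(mc_sample)
lhs_spread = spread_of_mean(lhs_sample)
print(f"std of mean estimate  MC: {mc_spread:.4f}   LHS: {lhs_spread:.4f}")
```

The same comparison can be repeated for a standard deviation or percentile estimator, which is essentially what the test cases in this report do.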

  9. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Technical Reports Server (NTRS)

    Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)

    2002-01-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA providing a large number of students exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of a NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments in the integration of a new sampling algorithm into NESSUS and in the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called out early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. 
For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable enhancement of the program.

  10. Causal modeling in international migration research: a methodological prolegomenon.

    PubMed

    Papademetriou, D G; Hopple, G W

    1982-10-01

    The authors examine the value of using models to study the migration process. In particular, they demonstrate the potential utility of a partial least squares modeling approach to the causal analysis of international migration.

  11. Causality, mediation and time: a dynamic viewpoint

    PubMed Central

    Aalen, Odd O; Røysland, Kjetil; Gran, Jon Michael; Ledergerber, Bruno

    2012-01-01

    Summary. Time dynamics are often ignored in causal modelling. Clearly, causality must operate in time and we show how this corresponds to a mechanistic, or system, understanding of causality. The established counterfactual definitions of direct and indirect effects depend on an ability to manipulate the mediator which may not hold in practice, and we argue that a mechanistic view may be better. Graphical representations based on local independence graphs and dynamic path analysis are used to facilitate communication as well as providing an overview of the dynamic relations ‘at a glance’. The relationship between causality as understood in a mechanistic and in an interventionist sense is discussed. An example using data from the Swiss HIV Cohort Study is presented. PMID:23193356

  12. Structure and Connectivity Analysis of Financial Complex System Based on G-Causality Network

    NASA Astrophysics Data System (ADS)

    Xu, Chuan-Ming; Yan, Yan; Zhu, Xiao-Wu; Li, Xiao-Teng; Chen, Xiao-Song

    2013-11-01

    The recent financial crisis highlights the inherent weaknesses of the financial market. To explore the mechanism that maintains the financial market as a system, we study the interactions of the U.S. financial market from the network perspective. Using conditional Granger causality network analysis, network density and in-degree and out-degree rankings serve as important indicators for analyzing the conditional causal relationships among financial agents and for assessing the stability of U.S. financial systems. It is found that the topological structure of the G-causality network in the U.S. financial market changed in different stages over the last decade, especially during the recent global financial crisis. Network density of the G-causality model is much higher during the 2007-2009 crisis stage, and it reaches its peak value in 2008, the most turbulent time of the crisis. Ranked by in-degrees and out-degrees, insurance companies are at the top of the 68 financial institutions during the crisis. They act as hubs which are more easily influenced by other financial institutions and which simultaneously influence others during the global financial disturbance.
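
    A pairwise version of the G-causality measure used in this record can be sketched as a variance-ratio test on simulated series in which x drives y with a one-step lag (coefficients illustrative; zero-mean series, so intercepts are omitted). The conditional, network-wide version in the paper generalizes this by conditioning on all other institutions.

    ```python
    import math
    import random

    random.seed(5)

    # Bivariate system: x is autoregressive; y follows its own past plus
    # lagged x. So x Granger-causes y, but not the other way around.
    T = 3000
    x = [random.gauss(0, 1)]
    y = [random.gauss(0, 1)]
    for _ in range(T - 1):
        x.append(0.6 * x[-1] + random.gauss(0, 1))
        y.append(0.6 * y[-1] + 0.5 * x[-2] + random.gauss(0, 1))

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def rss(target, preds):
        """OLS residual sum of squares for one or two zero-mean predictors."""
        if len(preds) == 1:
            p = preds[0]
            b = dot(p, target) / dot(p, p)
            return sum((t - b * a) ** 2 for t, a in zip(target, p))
        p, q = preds
        a11, a12, a22 = dot(p, p), dot(p, q), dot(q, q)
        c1, c2 = dot(p, target), dot(q, target)
        det = a11 * a22 - a12 * a12
        b1 = (a22 * c1 - a12 * c2) / det
        b2 = (a11 * c2 - a12 * c1) / det
        return sum((t - b1 * u - b2 * v) ** 2
                   for t, u, v in zip(target, p, q))

    def granger(cause, effect):
        """Log variance ratio: gain from adding lagged `cause` when
        predicting `effect` from its own past (one lag)."""
        tgt, lag_e, lag_c = effect[1:], effect[:-1], cause[:-1]
        return math.log(rss(tgt, [lag_e]) / rss(tgt, [lag_e, lag_c]))

    gc_xy = granger(x, y)
    gc_yx = granger(y, x)
    print(f"GC x->y: {gc_xy:.3f}   GC y->x: {gc_yx:.3f}")
    ```

    Thresholding such pairwise (or, in the paper, conditional) statistics over all institution pairs yields the directed edges whose density and degree rankings the study tracks through the crisis.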

  13. Exploring Students' Mapping Behaviors and Interactive Discourses in a Case Diagnosis Problem: Sequential Analysis of Collaborative Causal Map Drawing Processes

    ERIC Educational Resources Information Center

    Lee, Woon Jee

    2012-01-01

    The purpose of this study was to explore the nature of students' mapping and discourse behaviors while constructing causal maps to articulate their understanding of a complex, ill-structured problem. In this study, six graduate-level students were assigned to one of three pair groups, and each pair used the causal mapping software program,…

  14. Does Causality Matter More Now? Increase in the Proportion of Causal Language in English Texts.

    PubMed

    Iliev, Rumen; Axelrod, Robert

    2016-05-01

    The vast majority of the work on culture and cognition has focused on cross-cultural comparisons, largely ignoring the dynamic aspects of culture. In this article, we provide a diachronic analysis of causal cognition over time. We hypothesized that the increased role of education, science, and technology in Western societies should be accompanied by greater attention to causal connections. To test this hypothesis, we compared word frequencies in English texts from different time periods and found an increase in the use of causal language of about 40% over the past two centuries. The observed increase was not attributable to general language effects or to changing semantics of causal words. We also found that there was a consistent difference between the 19th and the 20th centuries, and that the increase happened mainly in the 20th century. © The Author(s) 2016.
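
    The core measurement in this study, the relative frequency of causal vocabulary in a text, can be sketched in a few lines. The marker list and the two snippets below are illustrative only, not the study's actual dictionary or corpora.

    ```python
    # Illustrative (hypothetical) dictionary of causal markers.
    CAUSAL_MARKERS = {"because", "cause", "caused", "causes", "therefore",
                      "hence", "thus", "effect", "affect", "lead", "leads"}

    def causal_rate_per_million(text):
        """Occurrences of causal markers per million words of text."""
        words = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
        words = [w for w in words if w]
        hits = sum(w in CAUSAL_MARKERS for w in words)
        return 1_000_000 * hits / len(words)

    old_sample = "It was so ordained and thus the matter rested with providence"
    new_sample = ("Smoking causes cancer because tar damages cells "
                  "and therefore leads to mutations")

    print(causal_rate_per_million(old_sample))
    print(causal_rate_per_million(new_sample))
    ```

    Applied to large corpora binned by publication date, and with controls for overall language change and for shifts in word meaning, this kind of rate comparison is what underlies the reported 40% increase.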

  15. Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons

    PubMed Central

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-01-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717

  16. Stability analysis for discrete-time stochastic memristive neural networks with both leakage and probabilistic delays.

    PubMed

    Liu, Hongjian; Wang, Zidong; Shen, Bo; Huang, Tingwen; Alsaadi, Fuad E

    2018-06-01

    This paper is concerned with the globally exponential stability problem for a class of discrete-time stochastic memristive neural networks (DSMNNs) with both leakage delays and probabilistic time-varying delays. For the probabilistic delays, a sequence of Bernoulli distributed random variables is utilized to determine within which intervals the time-varying delays fall at a given time instant. The sector-bounded activation function is considered in the addressed DSMNN. By taking into account the state-dependent characteristics of the network parameters and choosing an appropriate Lyapunov-Krasovskii functional, some sufficient conditions are established under which the underlying DSMNN is globally exponentially stable in the mean square. The derived conditions depend on both the leakage and the probabilistic delays, and are therefore less conservative than traditional delay-independent criteria. A simulation example is given to show the effectiveness of the proposed stability criterion. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Automated liver segmentation using a normalized probabilistic atlas

    NASA Astrophysics Data System (ADS)

    Linguraru, Marius George; Li, Zhixi; Shah, Furhawn; Chin, See; Summers, Ronald M.

    2009-02-01

    Probabilistic atlases of anatomical organs, especially the brain and the heart, have become popular in medical image analysis. We propose the construction of probabilistic atlases which retain structural variability by using a size-preserving modified affine registration. The organ positions are modeled in the physical space by normalizing the physical organ locations to an anatomical landmark. In this paper, a liver probabilistic atlas is constructed and exploited to automatically segment liver volumes from abdominal CT data. The atlas is aligned with the patient data through a succession of affine and non-linear registrations. The overlap and correlation with manual segmentations are 0.91 (0.93 DICE coefficient) and 0.99 respectively. Little work has taken place on the integration of volumetric measures of liver abnormality to clinical evaluations, which rely on linear estimates of liver height. Our application measures the liver height at the mid-hepatic line (0.94 correlation with manual measurements) and indicates that its combination with volumetric estimates could assist the development of a noninvasive tool to assess hepatomegaly.
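    The overlap figures quoted above can be made concrete. A minimal sketch of the two standard segmentation-agreement measures (binary masks and values are illustrative, not the paper's data): the DICE coefficient 2|A∩B|/(|A|+|B|) and the Jaccard overlap |A∩B|/|A∪B| between an automatic segmentation and a manual reference.

```python
# Illustrative overlap measures between two binary segmentation masks
# (flattened voxel lists; example masks are made up).
def dice(seg, ref):
    inter = sum(s and r for s, r in zip(seg, ref))
    return 2.0 * inter / (sum(seg) + sum(ref))

def jaccard(seg, ref):
    inter = sum(s and r for s, r in zip(seg, ref))
    union = sum(s or r for s, r in zip(seg, ref))
    return inter / union

seg = [1, 1, 1, 0, 0, 1]   # hypothetical automatic liver mask
ref = [1, 1, 0, 0, 1, 1]   # hypothetical manual reference mask
```

    For these toy masks, dice(seg, ref) is 0.75 and jaccard(seg, ref) is 0.6; a perfect segmentation gives 1.0 under both measures.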

  18. Non-Deterministic Dynamic Instability of Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2004-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian approach. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the loading rate, the higher the buckling load and the shorter the time to buckling. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.

  19. Genetically determined schizophrenia is not associated with impaired glucose homeostasis.

    PubMed

    Polimanti, Renato; Gelernter, Joel; Stein, Dan J

    2018-05-01

    Here, we used data from large genome-wide association studies to test for causal relationships, conducting a Mendelian randomization analysis, and for shared molecular mechanisms, calculating the genetic correlation, among schizophrenia, type 2 diabetes (T2D), and impaired glucose homeostasis. Although our Mendelian randomization analysis was well powered, no causal relationship was observed between schizophrenia and T2D or traits related to impaired glucose homeostasis. Similarly, we did not observe any global genetic overlap among these traits. These findings indicate that there are no causal relationships or shared mechanisms between schizophrenia and impaired glucose homeostasis. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.
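    The simulation-based lifetime approach described above can be sketched generically. This is not the RANDOM2-4 codes: it is a hedged toy in which Paris-law crack growth da/dN = C·(ΔK)^m, with ΔK = Δσ·√(πa), is integrated per sample while the initial flaw size and the Paris coefficient are drawn at random; all material constants are invented for illustration.

```python
# Hedged Monte Carlo sketch of probabilistic fatigue lifetime (toy Paris-law
# model with invented parameters; not NASA's RANDOM2/3/4 programs).
import math, random

random.seed(1)

def cycles_to_failure(a0, C, m=3.0, ds=200.0, a_crit=5e-3, dN=1000):
    """Cycles to grow a crack from a0 to a_crit in increments of dN cycles."""
    a, n = a0, 0
    while a < a_crit and n < 10_000_000:
        dK = ds * math.sqrt(math.pi * a)    # stress-intensity range
        a += C * dK ** m * dN               # Paris-law crack advance
        n += dN
    return n

lives = []
for _ in range(500):
    a0 = random.lognormvariate(math.log(2e-4), 0.2)   # initial flaw size (m)
    C = random.lognormvariate(math.log(1e-12), 0.3)   # Paris coefficient
    lives.append(cycles_to_failure(a0, C))

lives.sort()
b10_life = lives[len(lives) // 10]   # life at 10% failure probability
```

    The sorted sample of lives is the empirical lifetime distribution; quantiles such as the B10 life read off directly, which is the kind of "random lifetime in mechanical load cycles" output the report describes.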

  1. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach.
The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  2. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo sampling (MCS) and Latin hypercube sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize the uncertainties associated with the risk assessment of sigma PAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented as probability distributions of the hazard quotient of sigma PAH8, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of sigma PAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) using fuzzy set theory and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of sigma PAH8 to aquatic organisms is low. Each method is based on a different theory and has its own advantages and limitations; the appropriate method should therefore be selected case by case to quantify the effects of uncertainty on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate for assessing the risk of sigma PAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the risk management and control of organic pollutants in water.
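    The MCS-versus-LHS comparison above can be sketched on a toy hazard quotient. In this hedged example the distribution parameters (MU_E, SD_E, MU_T, SD_T) are invented, not the Taihu Lake data: HQ = exposure/toxicity with lognormal inputs, and P(HQ > 1) is estimated once with plain Monte Carlo and once with a Latin hypercube over the exposure variable.

```python
# Hedged sketch: exceedance probability P(HQ > 1) by Monte Carlo sampling
# versus Latin hypercube sampling (all distribution parameters invented).
import math, random
from statistics import NormalDist

random.seed(2)
nd = NormalDist()
MU_E, SD_E = math.log(0.05), 1.2    # ln of exposure concentration
MU_T, SD_T = math.log(1.0), 0.5     # ln of toxicity threshold
N = 20_000

def hq(z_e, z_t):
    """Hazard quotient from two standard-normal draws."""
    return math.exp(MU_E + SD_E * z_e) / math.exp(MU_T + SD_T * z_t)

# plain Monte Carlo
p_mcs = sum(hq(random.gauss(0, 1), random.gauss(0, 1)) > 1
            for _ in range(N)) / N

# Latin hypercube on the exposure variable: one draw per equal-probability stratum
strata = [(i + random.random()) / N for i in range(N)]
random.shuffle(strata)
p_lhs = sum(hq(nd.inv_cdf(u), random.gauss(0, 1)) > 1 for u in strata) / N
```

    With these invented parameters the true exceedance probability is about 0.011, and both estimators recover it; LHS merely stratifies one input so its estimate is less variable across reruns.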

  3. Functional clustering of time series gene expression data by Granger causality

    PubMed Central

    2012-01-01

    Background A common approach for time series gene expression data analysis includes the clustering of genes with similar expression patterns throughout time. Clustered gene expression profiles point to the joint contribution of groups of genes to a particular cellular process. However, since genes belong to intricate networks, other features, besides comparable expression patterns, should provide additional information for the identification of functionally similar genes. Results In this study we perform gene clustering through the identification of Granger causality between and within sets of time series gene expression data. Granger causality is based on the idea that the cause of an event cannot come after its consequence. Conclusions This kind of analysis can be used as a complementary approach for functional clustering, wherein genes would be clustered not solely based on their expression similarity but on their topological proximity built according to the intensity of Granger causality among them. PMID:23107425
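    The pairwise test underlying such an analysis can be sketched directly. This is a hedged toy (synthetic series, one lag, not the paper's clustering pipeline): x "Granger-causes" y if past values of x improve the prediction of y beyond y's own past, summarized by an F-statistic comparing restricted and full regressions.

```python
# Hedged sketch of a one-lag Granger causality F-test on synthetic data
# (x drives y by construction; the reverse direction should show nothing).
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

def granger_f(cause, effect):
    """F-statistic for one lag of `cause` helping to predict `effect`."""
    Y = effect[1:]
    restricted = np.column_stack([np.ones(n - 1), effect[:-1]])
    full = np.column_stack([restricted, cause[:-1]])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(restricted), rss(full)
    return (rss_r - rss_f) / (rss_f / (n - 1 - full.shape[1]))

f_xy = granger_f(x, y)   # large: x genuinely drives y
f_yx = granger_f(y, x)   # small: y carries no extra information about x
```

    For clustering, such statistics computed over all gene pairs give the "topological proximity" the abstract refers to: genes connected by strong Granger links are grouped together.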

  4. The DOZZ formula from the path integral

    NASA Astrophysics Data System (ADS)

    Kupiainen, Antti; Rhodes, Rémi; Vargas, Vincent

    2018-05-01

    We present a rigorous proof of the Dorn, Otto, Zamolodchikov, Zamolodchikov formula (the DOZZ formula) for the 3 point structure constants of Liouville Conformal Field Theory (LCFT) starting from a rigorous probabilistic construction of the functional integral defining LCFT given earlier by the authors and David. A crucial ingredient in our argument is a probabilistic derivation of the reflection relation in LCFT based on a refined tail analysis of Gaussian multiplicative chaos measures.

  5. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    DOE PAGES

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; ...

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  6. A pre-crisis vs. crisis analysis of peripheral EU stock markets by means of wavelet transform and a nonlinear causality test

    NASA Astrophysics Data System (ADS)

    Polanco-Martínez, J. M.; Fernández-Macho, J.; Neumann, M. B.; Faria, S. H.

    2018-01-01

    This paper presents an analysis of EU peripheral (so-called PIIGS) stock market indices and the S&P Europe 350 index (SPEURO), as a European benchmark market, over the pre-crisis (2004-2007) and crisis (2008-2011) periods. We computed a rolling-window wavelet correlation for the market returns and applied a non-linear Granger causality test to the wavelet decomposition coefficients of these stock market returns. Our results show that the correlation is stronger during the crisis than during the pre-crisis period. The stock market indices from Portugal, Italy and Spain were more interconnected among themselves during the crisis than with the SPEURO. The stock market of Portugal is the most sensitive and vulnerable PIIGS member, whereas the stock market of Greece moved away from the European benchmark market from the 2008 financial crisis until 2011. The non-linear causality test indicates that in the first three wavelet scales (intraweek, weekly and fortnightly) the number of uni-directional and bi-directional causalities is greater during the crisis than in the pre-crisis period, because of financial contagion. Furthermore, the causality analysis shows that the direction of the Granger cause-effect relationships for the pre-crisis and crisis periods is not invariant across the considered time-scales, and that the causalities among the studied stock markets do not seem to have a preferential direction. These results are relevant for better understanding the behaviour of vulnerable stock markets, especially for investors and policymakers.
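    The rolling-window wavelet correlation idea can be sketched on synthetic returns. In this hedged toy (Haar details stand in for the wavelet decomposition; the series and the "crisis" common shock are invented, not the PIIGS data), two markets share a common factor only in the second half of the sample, and the rolling correlation of their short-scale components rises accordingly.

```python
# Hedged sketch: rolling correlation of first-level Haar detail coefficients
# of two synthetic return series, with a common shock entering mid-sample.
import numpy as np

rng = np.random.default_rng(4)
n = 1024
common = rng.normal(size=n)                       # "crisis" common shock
load = np.where(np.arange(n) < n // 2, 0.0, 0.8)  # active in second half only
r1 = load * common + rng.normal(size=n)
r2 = load * common + rng.normal(size=n)

def haar_detail(x):
    """First-level Haar detail coefficients (shortest time-scale)."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def rolling_corr(a, b, w=64):
    return np.array([np.corrcoef(a[i:i + w], b[i:i + w])[0, 1]
                     for i in range(len(a) - w + 1)])

d1, d2 = haar_detail(r1), haar_detail(r2)
rc = rolling_corr(d1, d2)
pre = rc[:160]       # windows entirely inside the pre-shock half
crisis = rc[256:]    # windows entirely inside the common-shock half
```

    The mean rolling correlation is near zero in the pre period and clearly positive in the crisis period, mirroring the stronger crisis-period interconnection reported above.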

  7. A causal analysis framework for land-use change and the potential role of bioenergy policy

    DOE PAGES

    Efroymson, Rebecca A.; Kline, Keith L.; Angelsen, Arild; ...

    2016-10-05

    Here we propose a causal analysis framework to increase the reliability of land-use change (LUC) models and the accuracy of net greenhouse gas (GHG) emissions calculations for biofuels. The health-sciences-inspired framework is used here to determine probable causes of LUC, with an emphasis on bioenergy and deforestation. Calculations of net GHG emissions for LUC are critical in determining whether a fuel qualifies as a biofuel or advanced biofuel category under national (U.S., U.K.), state (California), and European Union regulations. Biofuel policymakers and scientists continue to discuss whether presumed indirect land-use change (ILUC) estimates, which often involve deforestation, should be included in GHG accounting for biofuel pathways. Current estimates of ILUC for bioenergy rely largely on economic simulation models that focus on causal pathways involving global commodity trade and use coarse land cover data with simple land classification systems. ILUC estimates are highly uncertain, partly because changes are not clearly defined and key causal links are not sufficiently included in the models. The proposed causal analysis framework begins with a definition of the change that has occurred and proceeds to a strength-of-evidence approach based on types of epidemiological evidence including plausibility of the relationship, completeness of the causal pathway, spatial co-occurrence, time order, analogous agents, simulation model results, and quantitative agent response relationships. Lastly, we discuss how LUC may be allocated among probable causes for policy purposes and how the application of the framework has the potential to increase the validity of LUC models and resolve ILUC and biofuel controversies.

  8. A causal analysis framework for land-use change and the potential role of bioenergy policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Efroymson, Rebecca A.; Kline, Keith L.; Angelsen, Arild

    Here we propose a causal analysis framework to increase the reliability of land-use change (LUC) models and the accuracy of net greenhouse gas (GHG) emissions calculations for biofuels. The health-sciences-inspired framework is used here to determine probable causes of LUC, with an emphasis on bioenergy and deforestation. Calculations of net GHG emissions for LUC are critical in determining whether a fuel qualifies as a biofuel or advanced biofuel category under national (U.S., U.K.), state (California), and European Union regulations. Biofuel policymakers and scientists continue to discuss whether presumed indirect land-use change (ILUC) estimates, which often involve deforestation, should be included in GHG accounting for biofuel pathways. Current estimates of ILUC for bioenergy rely largely on economic simulation models that focus on causal pathways involving global commodity trade and use coarse land cover data with simple land classification systems. ILUC estimates are highly uncertain, partly because changes are not clearly defined and key causal links are not sufficiently included in the models. The proposed causal analysis framework begins with a definition of the change that has occurred and proceeds to a strength-of-evidence approach based on types of epidemiological evidence including plausibility of the relationship, completeness of the causal pathway, spatial co-occurrence, time order, analogous agents, simulation model results, and quantitative agent response relationships. Lastly, we discuss how LUC may be allocated among probable causes for policy purposes and how the application of the framework has the potential to increase the validity of LUC models and resolve ILUC and biofuel controversies.

  9. Frontal–Occipital Connectivity During Visual Search

    PubMed Central

    Pantazatos, Spiro P.; Yanagihara, Ted K.; Zhang, Xian; Meitzler, Thomas

    2012-01-01

    Abstract Although expectation- and attention-related interactions between ventral and medial prefrontal cortex and stimulus category-selective visual regions have been identified during visual detection and discrimination, it is not known if similar neural mechanisms apply to other tasks such as visual search. The current work tested the hypothesis that high-level frontal regions, previously implicated in expectation and visual imagery of object categories, interact with visual regions associated with object recognition during visual search. Using functional magnetic resonance imaging, subjects searched for a specific object that varied in size and location within a complex natural scene. A model-free, spatial-independent component analysis isolated multiple task-related components, one of which included visual cortex, as well as a cluster within ventromedial prefrontal cortex (vmPFC), consistent with the engagement of both top-down and bottom-up processes. Analyses of psychophysiological interactions showed increased functional connectivity between vmPFC and object-sensitive lateral occipital cortex (LOC), and results from dynamic causal modeling and Bayesian Model Selection suggested bidirectional connections between vmPFC and LOC that were positively modulated by the task. Using image-guided diffusion-tensor imaging, functionally seeded, probabilistic white-matter tracts between vmPFC and LOC, which presumably underlie this effective interconnectivity, were also observed. These connectivity findings extend previous models of visual search processes to include specific frontal–occipital neuronal interactions during a natural and complex search task. PMID:22708993

  10. The co-integration analysis of relationship between urban infrastructure and urbanization - A case of Shanghai

    NASA Astrophysics Data System (ADS)

    Wang, Qianlu

    2017-10-01

    Urban infrastructure and urbanization influence each other, and quantitative analysis of the relationship between them plays a significant role in promoting social development. Based on data on infrastructure and the proportion of urban population in Shanghai from 1988 to 2013, this paper uses the econometric methods of co-integration testing, an error correction model, and Granger causality testing to empirically analyze the relationship between Shanghai's infrastructure and urbanization. The results show that: 1) Shanghai's urban infrastructure has a positive effect on the development of urbanization and on narrowing the population gap; 2) when short-term fluctuations deviate from the long-term equilibrium, the system pulls the non-equilibrium state back to equilibrium with an adjustment intensity of 0.342670, and hospital infrastructure is not only an important variable for urban development in the short term but also a leading infrastructure in the process of urbanization in Shanghai; 3) there is Granger causality between road infrastructure and urbanization; there is no Granger causality between water infrastructure and urbanization, while the hospital and school components of social infrastructure have unidirectional Granger causality with urbanization.
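    The two-step logic behind a co-integration and error-correction analysis can be sketched on synthetic data. This hedged toy (invented series, not the Shanghai data) follows the Engle-Granger idea: regress one trending series on the other to estimate the long-run relation, then verify that the lagged residual enters the short-run equation with a negative adjustment coefficient, which is the "pull back to equilibrium" described above.

```python
# Hedged Engle-Granger sketch on two synthetic series sharing one
# stochastic trend (variable names are purely illustrative).
import numpy as np

rng = np.random.default_rng(5)
n = 400
trend = np.cumsum(rng.normal(size=n))        # shared stochastic trend
infra = 2.0 * trend + rng.normal(size=n)     # "infrastructure" index
urban = 1.0 * trend + rng.normal(size=n)     # "urbanization" index

# Step 1: long-run relation urban = a + b * infra + resid
X = np.column_stack([np.ones(n), infra])
a, b = np.linalg.lstsq(X, urban, rcond=None)[0]
resid = urban - (a + b * infra)              # deviation from equilibrium

# Step 2: error-correction equation  d(urban)_t = gamma * resid_{t-1} + e_t
d_urban = np.diff(urban)
gamma = np.linalg.lstsq(resid[:-1, None], d_urban, rcond=None)[0][0]
```

    With a genuine shared trend the cointegrating slope b is recovered accurately (here the true value is 0.5) and gamma comes out negative: deviations from the long-run relation are corrected in the next period.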

  11. Causal imprinting in causal structure learning.

    PubMed

    Taylor, Eric G; Ahn, Woo-Kyoung

    2012-11-01

    Suppose one observes a correlation between two events, B and C, and infers that B causes C. Later one discovers that event A explains away the correlation between B and C. Normatively, one should now dismiss or weaken the belief that B causes C. Nonetheless, participants in the current study who observed a positive contingency between B and C followed by evidence that B and C were independent given A, persisted in believing that B causes C. The authors term this difficulty in revising initially learned causal structures "causal imprinting." Throughout four experiments, causal imprinting was obtained using multiple dependent measures and control conditions. A Bayesian analysis showed that causal imprinting may be normative under some conditions, but causal imprinting also occurred in the current study when it was clearly non-normative. It is suggested that causal imprinting occurs due to the influence of prior knowledge on how reasoners interpret later evidence. Consistent with this view, when participants first viewed the evidence showing that B and C are independent given A, later evidence with only B and C did not lead to the belief that B causes C. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Quantitative EEG analysis using error reduction ratio-causality test; validation on simulated and real EEG data.

    PubMed

    Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios

    2014-01-01

    We introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test, and compare its performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution that depends on the sampling rate of the data. Our method can properly detect both linear and non-linear effects encountered during the analysis of focal and generalised seizures. The new method detects real-time levels of synchronisation in the linear and non-linear domains and computes the directionality of information flow with corresponding time lags. This novel dynamic real-time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Revisiting the investor sentiment-stock returns relationship: A multi-scale perspective using wavelets

    NASA Astrophysics Data System (ADS)

    Lao, Jiashun; Nie, He; Jiang, Yonghong

    2018-06-01

    This paper employs the SBW index proposed by Baker and Wurgler (2006) to investigate the nonlinear asymmetric Granger causality between investor sentiment and stock returns for the US economy across different time-scales. The wavelet method is utilized to decompose the time series of investor sentiment and stock returns at different time-scales, so as to focus the analysis on the different time horizons of investors. Linear and nonlinear asymmetric Granger methods are employed to examine the Granger causal relationship at similar time-scales. We find evidence of strong bilateral linear and nonlinear asymmetric Granger causality between longer-term investor sentiment and stock returns. Furthermore, we observe a positive nonlinear causal relationship from stock returns to investor sentiment and a negative nonlinear causal relationship from investor sentiment to stock returns.

  14. Model based inference from microvascular measurements: Combining experimental measurements and model predictions using a Bayesian probabilistic approach

    PubMed Central

    Rasmussen, Peter M.; Smith, Amy F.; Sakadžić, Sava; Boas, David A.; Pries, Axel R.; Secomb, Timothy W.; Østergaard, Leif

    2017-01-01

    Objective In vivo imaging of the microcirculation and network-oriented modeling have emerged as powerful means of studying microvascular function and understanding its physiological significance. Network-oriented modeling may provide the means of summarizing vast amounts of data produced by high-throughput imaging techniques in terms of key, physiological indices. To estimate such indices with sufficient certainty, however, network-oriented analysis must be robust to the inevitable presence of uncertainty due to measurement errors as well as model errors. Methods We propose the Bayesian probabilistic data analysis framework as a means of integrating experimental measurements and network model simulations into a combined and statistically coherent analysis. The framework naturally handles noisy measurements and provides posterior distributions of model parameters as well as physiological indices associated with uncertainty. Results We applied the analysis framework to experimental data from three rat mesentery networks and one mouse brain cortex network. We inferred distributions for more than five hundred unknown pressure and hematocrit boundary conditions. Model predictions were consistent with previous analyses, and remained robust when measurements were omitted from model calibration. Conclusion Our Bayesian probabilistic approach may be suitable for optimizing data acquisition and for analyzing and reporting large datasets acquired as part of microvascular imaging studies. PMID:27987383
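    The core Bayesian mechanism above can be reduced to a one-parameter sketch. In this hedged toy (a single hypothetical boundary pressure with invented prior and measurement values, not the full network model), a prior is combined with noisy measurements into a posterior whose spread quantifies the remaining uncertainty, just as the framework reports posterior distributions for its boundary conditions.

```python
# Hedged sketch: conjugate normal-normal Bayesian update for one unknown
# boundary pressure (all numbers invented for illustration).
from statistics import NormalDist

prior_mu, prior_sd = 60.0, 10.0          # prior belief, mmHg
meas, meas_sd = [52.0, 55.0, 53.0], 4.0  # noisy measurements and noise s.d.

# precision-weighted combination of prior and data
post_prec = 1 / prior_sd**2 + len(meas) / meas_sd**2
post_mu = (prior_mu / prior_sd**2 + sum(meas) / meas_sd**2) / post_prec
post_sd = post_prec ** -0.5

posterior = NormalDist(post_mu, post_sd)
ci = (posterior.inv_cdf(0.025), posterior.inv_cdf(0.975))  # 95% credible interval
```

    The posterior mean (about 53.7 here) sits between the prior and the data, and the posterior standard deviation is smaller than either source alone; in the full framework this update is carried out jointly over hundreds of such unknowns via simulation.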

  15. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making process. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies include reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  16. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
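    The FORM-versus-Monte-Carlo comparison above can be illustrated with a toy limit state. This is a hedged sketch with made-up strength and load statistics (MU_R, SD_R, MU_S, SD_S are invented, not the CCM values): failure occurs when the load S exceeds the strength R, so the reliability index β and Φ(-β) give the first-order estimate of the same probability that Monte Carlo counts directly.

```python
# Hedged sketch: probability of failure for the linear limit state g = R - S
# via a first-order (FORM-like) approximation and plain Monte Carlo.
import random
from statistics import NormalDist

random.seed(6)
MU_R, SD_R = 10.0, 1.0   # strength allowable (invented)
MU_S, SD_S = 6.0, 1.0    # applied load (invented)

# reliability index for independent normal R and S
beta = (MU_R - MU_S) / (SD_R**2 + SD_S**2) ** 0.5
p_form = NormalDist().cdf(-beta)

N = 400_000
p_mc = sum(random.gauss(MU_R, SD_R) < random.gauss(MU_S, SD_S)
           for _ in range(N)) / N
```

    For a linear limit state with normal variables the two estimates agree (here both near 2.3e-3); FORM becomes an approximation once the limit state or the distributions are non-normal, which is why the CCM study cross-checks several methods.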

  17. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is therefore important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty in criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied in a pilot study of Gucheng County, central China, which was heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of the model outcomes. The proposed approach is therefore an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
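
    A minimal sketch of the probabilistic OWA idea, assuming hypothetical standardized criteria values for three locations, a risk-averse set of order weights, and Dirichlet-distributed criteria weights; none of these numbers come from the Gucheng County study:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical standardized criteria values (0-1) for three locations
    # across six factors (e.g. elevation, slope, rainfall, drainage density).
    criteria = np.array([
        [0.9, 0.8, 0.7, 0.6, 0.9, 0.8],   # low-lying, wet location
        [0.4, 0.5, 0.3, 0.6, 0.4, 0.5],
        [0.1, 0.2, 0.1, 0.3, 0.2, 0.1],   # upland location
    ])

    def owa(values, order_weights):
        """Ordered weighted average: order weights apply to the sorted values."""
        return np.sort(values)[::-1] @ order_weights

    # Risk-averse order weights emphasize the worst (highest-hazard) criteria.
    order_w = np.array([0.3, 0.25, 0.2, 0.15, 0.07, 0.03])

    # Monte Carlo over uncertain criteria weights; a Dirichlet draw keeps them
    # non-negative and summing to one.
    n_sim = 5000
    scores = np.empty((n_sim, len(criteria)))
    for i in range(n_sim):
        crit_w = rng.dirichlet(np.ones(6) * 10)   # uncertain criteria weights
        weighted = criteria * crit_w * 6          # rescale so weights average 1
        scores[i] = [owa(row, order_w) for row in weighted]

    mean_score = scores.mean(axis=0)
    print("Mean susceptibility scores:", mean_score.round(3))
    ```

    The spread of `scores` across simulations is what feeds the uncertainty and sensitivity analysis: locations whose score distribution straddles the susceptibility threshold are the ones most sensitive to the criteria weights.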

  18. Re-Ranking Sequencing Variants in the Post-GWAS Era for Accurate Causal Variant Identification

    PubMed Central

    Faye, Laura L.; Machiela, Mitchell J.; Kraft, Peter; Bull, Shelley B.; Sun, Lei

    2013-01-01

    Next generation sequencing has dramatically increased our ability to localize disease-causing variants by providing base-pair level information at costs increasingly feasible for the large sample sizes required to detect complex-trait associations. Yet, identification of causal variants within an established region of association remains a challenge. Counter-intuitively, certain factors that increase power to detect an associated region can decrease power to localize the causal variant. First, combining GWAS with imputation or low coverage sequencing to achieve the large sample sizes required for high power can have the unintended effect of producing differential genotyping error among SNPs. This tends to bias the relative evidence for association toward better genotyped SNPs. Second, re-use of GWAS data for fine-mapping exploits previous findings to ensure genome-wide significance in GWAS-associated regions. However, using GWAS findings to inform fine-mapping analysis can bias evidence away from the causal SNP toward the tag SNP and SNPs in high LD with the tag. Together these factors can reduce power to localize the causal SNP by more than half. Other strategies commonly employed to increase power to detect association, namely increasing sample size and using higher density genotyping arrays, can, in certain common scenarios, actually exacerbate these effects and further decrease power to localize causal variants. We develop a re-ranking procedure that accounts for these adverse effects and substantially improves the accuracy of causal SNP identification, often doubling the probability that the causal SNP is top-ranked. Application to the NCI BPC3 aggressive prostate cancer GWAS with imputation meta-analysis identified a new top SNP at 2 of 3 associated loci and several additional possible causal SNPs at these loci that may have otherwise been overlooked. This method is simple to implement using R scripts provided on the author's website. PMID:23950724

  19. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2013-07-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible but not yet occurred flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and for loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claims data on a small catchment (the downstream Argens).
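
    The combination of two independent loss processes into a single annual-loss distribution can be sketched as follows; the Poisson event frequencies and lognormal per-event loss parameters are purely illustrative, not CCR calibration results:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_years = 50_000  # simulated years

    # Hypothetical peril model: Poisson annual event counts, lognormal
    # per-event losses. All numbers are illustrative.
    def annual_losses(freq, loss_median, loss_sigma):
        counts = rng.poisson(freq, n_years)
        per_event = rng.lognormal(np.log(loss_median), loss_sigma, counts.sum())
        totals = np.zeros(n_years)
        # Accumulate each event's loss into its simulated year.
        np.add.at(totals, np.repeat(np.arange(n_years), counts), per_event)
        return totals

    overflow = annual_losses(freq=0.8, loss_median=5e6, loss_sigma=1.2)
    runoff = annual_losses(freq=1.5, loss_median=1e6, loss_sigma=1.0)
    total = overflow + runoff  # the two perils are assumed independent

    # Loss at a given return period = upper quantile of the annual losses.
    for rp in (10, 100):
        print(f"{rp}-year loss: {np.quantile(total, 1 - 1 / rp):.3e}")
    ```

    Summing the two simulated annual series year by year is what the independence assumption buys: each peril's event set can be generated separately and the combined distribution still follows directly.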

  20. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, particularly with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunamis. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii, based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
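
    The Green's function summation described above can be sketched as follows; the "pre-computed" unit-slip waveforms and the per-scenario annual rate are toy stand-ins for the outputs of a real tsunami propagation code and a real seismicity model:

    ```python
    import numpy as np

    # Toy pre-computed "Green's functions": unit-slip tsunami waveforms at one
    # coastal point for each of 4 subfaults, all on the same time sampling.
    t = np.linspace(0.0, 3600.0, 361)  # one hour at 10 s steps
    greens = np.array([
        np.exp(-((t - 900.0 - 300.0 * k) / 400.0) ** 2)  # delayed pulses, m per m of slip
        for k in range(4)
    ])

    def tsunami_waveform(slip):
        """Waveform for an arbitrary slip distribution: slip-weighted sum of
        the pre-computed subfault waveforms (linearity assumption)."""
        return slip @ greens

    scenario_slip = np.array([2.0, 4.0, 3.0, 1.0])  # metres of slip per subfault
    wave = tsunami_waveform(scenario_slip)

    # Hazard-curve sketch: peak heights over many random slip scenarios, each
    # carrying an assumed annual rate, give annual exceedance rates.
    rng = np.random.default_rng(7)
    slips = rng.lognormal(0.5, 0.6, size=(2000, 4))
    peaks = (slips @ greens).max(axis=1)
    rate_per_scenario = 1e-4  # assumed annual rate of each scenario
    for h in (2.0, 5.0):
        rate = (peaks > h).sum() * rate_per_scenario
        print(f"annual rate of exceeding {h:.0f} m: {rate:.2e}")
    ```

    The linearity assumption is what makes the matrix product work: doubling every subfault's slip exactly doubles the synthesized waveform, so thousands of scenarios reduce to one matrix multiplication against the stored Green's functions.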
