Sample records for basic event probabilities

  1. [Biometric bases: basic concepts of probability calculation].

    PubMed

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  2. Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

    2011-01-01

    Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question. The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE. SAPHIRE is an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values.
None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The risk reduction ratio (RRR) for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event. Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The risk increase ratio (RIR) for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
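
    A minimal Python sketch of the AND/OR-gate arithmetic and the risk reduction/increase ratios (RRR/RIR) described in this record, assuming independent basic events; the tree structure and probability values are hypothetical, not the MSET model:

      def and_gate(*probs):
          """P(all inputs fail), assuming independent inputs."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      def or_gate(*probs):
          """P(at least one input fails), assuming independent inputs."""
          p = 1.0
          for q in probs:
              p *= 1.0 - q
          return 1.0 - p

      def top_event(p):
          """Hypothetical tree: TOP = (A OR B) AND (C OR D)."""
          return and_gate(or_gate(p["A"], p["B"]), or_gate(p["C"], p["D"]))

      base = {"A": 0.05, "B": 0.02, "C": 0.10, "D": 0.01}
      p_top = top_event(base)
      print(f"baseline top-event probability: {p_top:.4f}")

      for be in base:
          perfect = dict(base, **{be: 0.0})  # event performed perfectly
          failed = dict(base, **{be: 1.0})   # event fails outright
          rrr = p_top / top_event(perfect)   # risk reduction ratio
          rir = top_event(failed) / p_top    # risk increase ratio
          print(f"{be}: RRR = {rrr:.2f}, RIR = {rir:.2f}")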

  3. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.

  4. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
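
    A small Python sketch contrasting the two sampling-based comparators mentioned here, Monte Carlo sampling and the first-order Wilks bound, for a toy fault tree with lognormal basic events; the medians and error factors are hypothetical, and the article's closed-form approximation is not reproduced:

      import numpy as np

      rng = np.random.default_rng(0)

      # lognormal basic events given as (median, error factor EF = p95 / median)
      basic = {"A": (1e-3, 3.0), "B": (5e-4, 5.0), "C": (2e-3, 3.0)}

      def sample(n):
          """Monte Carlo samples of TOP = A AND (B OR C), rare-event OR."""
          cols = {}
          for name, (median, ef) in basic.items():
              mu, sigma = np.log(median), np.log(ef) / 1.645  # EF defined at the 95th percentile
              cols[name] = rng.lognormal(mu, sigma, n)
          return cols["A"] * (cols["B"] + cols["C"])  # rare-event approximation

      mc = sample(100_000)
      print("Monte Carlo mean / 95th percentile:", mc.mean(), np.quantile(mc, 0.95))

      # Wilks one-sided bound: the largest of 59 samples exceeds the true 95th
      # percentile with at least 95% confidence, since 1 - 0.95**59 >= 0.95.
      wilks = sample(59).max()
      print("Wilks 95/95 upper bound:", wilks)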

  5. Introducing Disjoint and Independent Events in Probability.

    ERIC Educational Resources Information Center

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
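
    A small numerical illustration, not taken from the document, of the distinction this record addresses: mutually exclusive events satisfy P(A and B) = 0 and P(A or B) = P(A) + P(B), while independent events satisfy P(A and B) = P(A)P(B):

      from fractions import Fraction

      # One fair die: A = "roll a 1", B = "roll a 2" are mutually exclusive.
      pA, pB = Fraction(1, 6), Fraction(1, 6)
      print("exclusive:   P(A or B) =", pA + pB, "  P(A and B) =", 0)

      # Two fair dice: C = "first die is 1", D = "second die is 2" are independent.
      pC, pD = Fraction(1, 6), Fraction(1, 6)
      print("independent: P(C and D) =", pC * pD, "  P(C or D) =", pC + pD - pC * pD)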

  6. Uncertainty analysis in fault tree models with dependent basic events.

    PubMed

    Pedroni, Nicola; Zio, Enrico

    2013-06-01

    In general, two types of dependence need to be considered when estimating the probability of the top event (TE) of a fault tree (FT): "objective" dependence between the (random) occurrences of different basic events (BEs) in the FT and "state-of-knowledge" (epistemic) dependence between estimates of the epistemically uncertain probabilities of some BEs of the FT model. In this article, we study the effects on the TE probability of objective and epistemic dependences. The well-known Frèchet bounds and the distribution envelope determination (DEnv) method are used to model all kinds of (possibly unknown) objective and epistemic dependences, respectively. For exemplification, the analyses are carried out on a FT with six BEs. Results show that both types of dependence significantly affect the TE probability; however, the effects of epistemic dependence are likely to be overwhelmed by those of objective dependence (if present). © 2012 Society for Risk Analysis.
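
    A minimal Python sketch of the "objective dependence" bounding idea: with the dependence between two basic events unknown, the Fréchet bounds bracket the AND-gate and OR-gate probabilities. The event probabilities are hypothetical, and the DEnv treatment of epistemic dependence is not reproduced:

      def frechet_and(pa, pb):
          """Bounds on P(A and B) under unknown dependence."""
          return max(0.0, pa + pb - 1.0), min(pa, pb)

      def frechet_or(pa, pb):
          """Bounds on P(A or B) under unknown dependence."""
          return max(pa, pb), min(1.0, pa + pb)

      pa, pb = 0.2, 0.3
      print("AND gate bounds:", frechet_and(pa, pb), "independent value:", pa * pb)
      print("OR gate bounds: ", frechet_or(pa, pb), "independent value:", pa + pb - pa * pb)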

  7. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-13

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  8. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-14

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
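
    A minimal Python sketch, not SIGPI itself, of combining cut-set data with independent basic-event probabilities: the exact top probability by inclusion-exclusion over minimal cut sets, compared with the rare-event sum. Cut sets and probabilities are hypothetical:

      import math
      from itertools import combinations

      p = {"A": 0.01, "B": 0.02, "C": 0.05, "D": 0.03}   # basic event probabilities
      cut_sets = [{"A", "B"}, {"A", "C"}, {"D"}]          # minimal cut sets

      def prob_of_union(cut_sets, p):
          """Exact P(any cut set occurs) by inclusion-exclusion, independence assumed."""
          total = 0.0
          for k in range(1, len(cut_sets) + 1):
              for combo in combinations(cut_sets, k):
                  events = set().union(*combo)
                  total += (-1) ** (k + 1) * math.prod(p[e] for e in events)
          return total

      exact = prob_of_union(cut_sets, p)
      rare = sum(math.prod(p[e] for e in cs) for cs in cut_sets)  # rare-event approximation
      print(f"exact: {exact:.6f}  rare-event sum: {rare:.6f}")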

  9. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions

    PubMed Central

    Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348

  10. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions.

    PubMed

    Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.
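
    A minimal Python sketch of the fuzzy gate arithmetic this record alludes to, using ordinary trapezoidal fuzzy numbers (a, b, c, d) and the common component-wise approximation for AND/OR gates; the paper's intuitionistic trapezoidal numbers and expert-weighting scheme are not reproduced, and all values are hypothetical:

      def fuzzy_and(x, y):
          """Both basic events occur: product, component-wise approximation."""
          return tuple(xi * yi for xi, yi in zip(x, y))

      def fuzzy_or(x, y):
          """At least one occurs: 1 - (1 - x)(1 - y), component-wise."""
          return tuple(1 - (1 - xi) * (1 - yi) for xi, yi in zip(x, y))

      def defuzzify(t):
          """One common simple crisp value: average of the four defining points."""
          a, b, c, d = t
          return (a + b + c + d) / 4.0

      ignition = (0.02, 0.04, 0.06, 0.08)     # hypothetical basic events
      dust_cloud = (0.01, 0.02, 0.03, 0.05)
      explosion = fuzzy_and(ignition, dust_cloud)
      print("fuzzy CDE probability:", explosion, "crisp value:", defuzzify(explosion))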

  11. Common-Cause Failure Treatment in Event Assessment: Basis for a Proposed New Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana Kelly; Song-Hua Shen; Gary DeMoss

    2010-06-01

    Event assessment is an application of probabilistic risk assessment in which observed equipment failures and outages are mapped into the risk model to obtain a numerical estimate of the event’s risk significance. In this paper, we focus on retrospective assessments to estimate the risk significance of degraded conditions such as equipment failure accompanied by a deficiency in a process such as maintenance practices. In modeling such events, the basic events in the risk model that are associated with observed failures and other off-normal situations are typically configured to be failed, while those associated with observed successes and unchallenged components are assumed capable of failing, typically with their baseline probabilities. This is referred to as the failure memory approach to event assessment. The conditioning of common-cause failure probabilities for the common cause component group associated with the observed component failure is particularly important, as it is insufficient to simply leave these probabilities at their baseline values, and doing so may result in a significant underestimate of risk significance for the event. Past work in this area has focused on the mathematics of the adjustment. In this paper, we review the Basic Parameter Model for common-cause failure, which underlies most current risk modelling, discuss the limitations of this model with respect to event assessment, and introduce a proposed new framework for common-cause failure, which uses a Bayesian network to model underlying causes of failure, and which has the potential to overcome the limitations of the Basic Parameter Model with respect to event assessment.
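
    A minimal Python sketch of the Basic Parameter Model bookkeeping referred to above: for a common-cause group of m redundant components, Q[k] denotes the probability of a basic event that fails exactly k specific components, and the total failure probability of any one component sums the ways it can be involved. Values are hypothetical, and the proposed Bayesian-network framework is not reproduced:

      from math import comb

      m = 3
      Q = {1: 1.0e-3, 2: 5.0e-5, 3: 1.0e-5}   # independent, double, and triple common-cause events

      # P(a specific component fails) = sum over k of C(m-1, k-1) * Q[k]
      Q_total = sum(comb(m - 1, k - 1) * Q[k] for k in range(1, m + 1))
      print(f"total failure probability of one component: {Q_total:.2e}")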

  12. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  13. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mumpower, J.L.

    There are strong structural similarities between risks from technological hazards and big-purse state lottery games. Risks from technological hazards are often described as low-probability, high-consequence negative events. State lotteries could be equally well characterized as low-probability, high-consequence positive events. Typical communications about state lotteries provide a virtual strategic textbook for opponents of risky technologies. The same techniques can be used to sell lottery tickets or sell opposition to risky technologies. Eight basic principles are enumerated.

  15. Application of fuzzy fault tree analysis based on modified fuzzy AHP and fuzzy TOPSIS for fire and explosion in the process industry.

    PubMed

    Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand

    2018-05-09

    This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), all probabilities of the basic events (BEs) should be available when the FTA is drawn. When such data are unavailable, expert judgment can be used as an alternative to failure data. The fuzzy analytical hierarchy process, as a standard technique, is used to give a specific weight to each expert, and fuzzy set theory is used to aggregate expert opinion. In this regard, the probability of the BEs is computed and, consequently, the probability of the TE obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.

  16. Reliability computation using fault tree analysis

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.

    1971-01-01

    A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
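
    A minimal Python sketch of the conditioning idea described here: when the same basic failure X feeds more than one fault path, naive gate-by-gate multiplication treats the two occurrences as independent, but factoring on X restores the exact answer. The tree and numbers are hypothetical: TOP = (X OR A) AND (X OR B).

      def or_(p, q):
          """OR gate for independent inputs."""
          return 1 - (1 - p) * (1 - q)

      pX, pA, pB = 0.10, 0.05, 0.02

      # Naive evaluation treats the two occurrences of X as independent (wrong).
      naive = or_(pX, pA) * or_(pX, pB)

      # Condition on X: P(TOP) = P(X) * P(TOP | X failed) + (1 - P(X)) * P(TOP | X working)
      exact = pX * 1.0 + (1 - pX) * (pA * pB)
      print(f"naive: {naive:.5f}  exact via conditioning: {exact:.5f}")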

  17. Fault tree analysis for urban flooding.

    PubMed

    ten Veldhuis, J A E; Clemens, F H L R; van Gelder, P H A J M

    2009-01-01

    Traditional methods to evaluate flood risk generally focus on heavy storm events as the principal cause of flooding. Conversely, fault tree analysis is a technique that aims at modelling all potential causes of flooding. It quantifies both overall flood probability and relative contributions of individual causes of flooding. This paper presents a fault tree model for urban flooding and an application to the case of Haarlem, a city of 147,000 inhabitants. Data from a complaint register, rainfall gauges and hydrodynamic model calculations are used to quantify probabilities of basic events in the fault tree. This results in a flood probability of 0.78/week for Haarlem. It is shown that gully pot blockages contribute to 79% of flood incidents, whereas storm events contribute only 5%. This implies that for this case more efficient gully pot cleaning is a more effective strategy to reduce flood probability than enlarging drainage system capacity. Whether this is also the most cost-effective strategy can only be decided after risk assessment has been complemented with a quantification of consequences of both types of events. To do this will be the next step in this study.
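
    A minimal Python sketch of the contribution bookkeeping this record reports: weekly occurrence frequencies of individual flood causes combined additively, with each cause's share of the total. The frequencies below are hypothetical, not the Haarlem data:

      causes = {
          "gully pot blockage": 0.62,        # incidents per week (hypothetical)
          "sewer pipe blockage": 0.08,
          "storm exceeding capacity": 0.04,
          "pumping station failure": 0.04,
      }

      total = sum(causes.values())
      print(f"total flood frequency: {total:.2f} per week")
      for name, f in causes.items():
          print(f"  {name}: {100 * f / total:.0f}% of incidents")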

  18. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation.

    PubMed

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated based on analytic hierarchy process theory and fuzzy mathematics. The fault tree of third-party damage, containing 56 basic events, was built through hazard identification of third-party damage. The fuzzy evaluation of basic event probabilities was conducted by the expert judgment method using membership functions of fuzzy sets. The determination of the weight of each expert and the modification of the evaluation opinions were accomplished using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the risk assessment structure of the method was shown to conform to the actual situation, which provides a basis for safety risk prevention.

  19. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. Modeling and simulation of count data.

    PubMed

    Plan, E L

    2014-08-13

    Count data, or number of events per time interval, are discrete data arising from repeated time to event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity.
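
    A minimal Python sketch of the dispersion diagnostic mentioned here: simulate counts per interval from a Poisson model and from an overdispersed gamma-Poisson (negative binomial) mixture, then compare mean to variance. The rates are hypothetical:

      import numpy as np

      rng = np.random.default_rng(1)
      lam = 2.5                                # mean event rate per interval

      poisson_counts = rng.poisson(lam, 10_000)
      # Gamma mixing of the rate inflates the variance relative to the mean.
      overdispersed = rng.poisson(rng.gamma(shape=2.0, scale=lam / 2.0, size=10_000))

      for name, x in [("Poisson", poisson_counts), ("gamma-Poisson", overdispersed)]:
          print(f"{name}: mean = {x.mean():.2f}, variance = {x.var():.2f}")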

  1. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  2. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation

    PubMed Central

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated based on analytic hierarchy process theory and fuzzy mathematics. The fault tree of third-party damage, containing 56 basic events, was built through hazard identification of third-party damage. The fuzzy evaluation of basic event probabilities was conducted by the expert judgment method using membership functions of fuzzy sets. The determination of the weight of each expert and the modification of the evaluation opinions were accomplished using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the risk assessment structure of the method was shown to conform to the actual situation, which provides a basis for safety risk prevention. PMID:27875545

  3. Derivation of Failure Rates and Probability of Failures for the International Space Station Probabilistic Risk Assessment Study

    NASA Technical Reports Server (NTRS)

    Vitali, Roberto; Lutomski, Michael G.

    2004-01-01

    The National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions, but more importantly to compare different operational and management options, determine the lowest risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probability of failures of the basic events employed in the PRA model of the ISS. The paper will show how a Bayesian approach was used with different sources of data, including the actual ISS on-orbit failures, to enhance the confidence in the results of the PRA. As time progresses and more meaningful data are gathered from on-orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems, such as propulsion, thermal control, or power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement units (ORUs). The ORU level was chosen consistent with the level of statistically meaningful data that could be obtained from the aerospace industry and from experts in the field. For example, data were gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model. In other words, the failure of a spring within a valve was considered a failure of the valve itself.
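
    A minimal Python sketch, not the ISS PRA itself, of the Bayesian updating described here: a conjugate gamma prior on an ORU failure rate updated with observed on-orbit failures over accumulated operating time. The prior parameters and data are hypothetical:

      from scipy import stats

      # Gamma(alpha, beta) prior on the failure rate lambda [failures per hour]
      alpha0, beta0 = 0.5, 1.0e5      # e.g., from generic aerospace/industry data
      failures, hours = 2, 4.0e4      # on-orbit evidence for this ORU type

      alpha1, beta1 = alpha0 + failures, beta0 + hours   # conjugate posterior update
      post = stats.gamma(a=alpha1, scale=1.0 / beta1)
      print(f"posterior mean rate: {post.mean():.2e} per hour")
      print(f"90% credible interval: {post.ppf(0.05):.2e} to {post.ppf(0.95):.2e} per hour")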

  4. Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law

    NASA Astrophysics Data System (ADS)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro

    2013-07-01

    There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of the adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of the context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe aforementioned contextual phenomena outside of quantum physics.

  5. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.

  6. Hoeffding Type Inequalities and their Applications in Statistics and Operations Research

    NASA Astrophysics Data System (ADS)

    Daras, Tryfon

    2007-09-01

    Large Deviation theory is the branch of Probability theory that deals with rare events. Sometimes, these events can be described by the sum of random variables that deviates from its mean by more than a "normal" amount. A precise calculation of the probabilities of such events turns out to be crucial in a variety of different contexts (e.g. in Probability Theory, Statistics, Operations Research, Statistical Physics, Financial Mathematics, etc.). Recent applications of the theory deal with random walks in random environments, interacting diffusions, heat conduction, polymer chains [1]. In this paper we prove an inequality of exponential type, namely theorem 2.1, which gives a large deviation upper bound for a specific sequence of r.v.s. Inequalities of this type have many applications in Combinatorics [2]. The inequality generalizes already proven results of this type in the case of symmetric probability measures. We obtain as consequences of the inequality: (a) large deviation upper bounds for exchangeable Bernoulli sequences of random variables, generalizing results proven for independent and identically distributed Bernoulli sequences of r.v.s. and (b) a general form of Bernstein's inequality. We compare the inequality with large deviation results already proven by the author and try to see its advantages. Finally, using the inequality, we solve one of the basic problems of Operations Research (the bin packing problem) in the case of exchangeable r.v.s.
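
    A small numerical illustration, not the paper's new inequality, of the classical Hoeffding bound it generalizes: for n independent variables in [0, 1], P(S_n - E[S_n] >= t) <= exp(-2 t^2 / n). The Monte Carlo check uses Bernoulli(0.5) summands:

      import math
      import random

      random.seed(0)
      n, t = 100, 15.0
      bound = math.exp(-2 * t * t / n)          # Hoeffding upper bound on the tail

      trials, hits = 20_000, 0
      for _ in range(trials):
          s = sum(random.random() < 0.5 for _ in range(n))
          hits += (s - n * 0.5) >= t            # did the sum exceed its mean by t?
      print(f"Hoeffding bound: {bound:.3e}   empirical tail: {hits / trials:.3e}")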

  7. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
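
    A minimal Python sketch of the conditional probability computation that CProb wraps: estimating P(poor condition | stressor >= threshold) directly from paired observations. The data below are hypothetical:

      stressor = [0.2, 0.5, 0.9, 1.4, 1.8, 2.3, 2.9, 3.4, 4.1, 4.8]
      degraded = [  0,   0,   0,   1,   0,   1,   1,   1,   1,   1]   # 1 = poor condition

      def conditional_prob(threshold):
          """P(poor condition | stressor >= threshold) from the paired data."""
          pairs = [(s, d) for s, d in zip(stressor, degraded) if s >= threshold]
          return sum(d for _, d in pairs) / len(pairs) if pairs else float("nan")

      for thr in (0.0, 1.0, 2.0, 3.0):
          print(f"P(poor | stressor >= {thr}): {conditional_prob(thr):.2f}")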

  8. Reliability analysis of a wastewater treatment plant using fault tree analysis and Monte Carlo simulation.

    PubMed

    Taheriyoun, Masoud; Moradinejad, Saber

    2015-01-01

    The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of the wastewater treatment plant are the variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the problem of reliability was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of allowable effluent BOD as the top event in the diagram, and the deficiencies of the system were identified based on the developed model. Some basic events are operator mistakes, physical damage, and design problems. The analytical methods are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves the insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.

  9. Conversion of Questionnaire Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

    During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log normal scale typically used in human error analysis (see A.D. Swain and H.E. Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk of failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data is absent.
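
    A minimal Python sketch of the conversion step this record describes: adjectival performance ratings mapped to basic-event failure probabilities spaced on a logarithmic scale, in the spirit of human-reliability practice. The rating labels and numeric values below are illustrative assumptions, not the MSET conversion table:

      # Hypothetical adjectival-to-probability mapping on a log scale
      FAILURE_PROBABILITY = {
          "excellent":         1e-4,
          "well":              1e-3,
          "adequate":          1e-2,
          "needs improvement": 1e-1,
          "not performed":     1.0,
      }

      responses = {
          "inventory reconciliation": "well",
          "tamper-indicating seals": "needs improvement",
      }

      for task, rating in responses.items():
          print(f"{task}: rated '{rating}' -> basic event probability {FAILURE_PROBABILITY[rating]:.0e}")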

  10. Deterministic versus evidence-based attitude towards clinical diagnosis.

    PubMed

    Soltani, Akbar; Moayyeri, Alireza

    2007-08-01

    Generally, two basic classes have been proposed for the scientific explanation of events. Deductive reasoning emphasizes reaching conclusions about a hypothesis based on verification of universal laws pertinent to that hypothesis, while inductive or probabilistic reasoning explains an event by calculation of some probabilities for that event to be related to a given hypothesis. Although both types of reasoning are used in clinical practice, evidence-based medicine stresses the advantages of the second approach for most instances in medical decision making. While 'probabilistic or evidence-based' reasoning seems to involve more mathematical formulas at first glance, this attitude is more dynamic and less imprisoned by the rigidity of mathematics compared with the 'deterministic or mathematical' attitude. In the field of medical diagnosis, appreciation of uncertainty in clinical encounters and utilization of the likelihood ratio as a measure of accuracy seem to be the most important characteristics of evidence-based doctors. Other characteristics include use of series of tests for refining probability, changing diagnostic thresholds considering external evidence and the nature of the disease, and attention to confidence intervals to estimate uncertainty of research-derived parameters.
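
    A small worked example, not from the article, of the likelihood-ratio reasoning it highlights: a pre-test probability converted to a post-test probability through odds. The numbers are hypothetical:

      def post_test_probability(pre_test_prob, likelihood_ratio):
          """Post-test probability via odds: post_odds = pre_odds * LR."""
          pre_odds = pre_test_prob / (1 - pre_test_prob)
          post_odds = pre_odds * likelihood_ratio
          return post_odds / (1 + post_odds)

      pre = 0.20          # clinician's pre-test probability of disease
      lr_positive = 8.0   # likelihood ratio of a positive test result
      print(f"post-test probability: {post_test_probability(pre, lr_positive):.2f}")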

  11. Maximum likelihood inference implies a high, not a low, ancestral haploid chromosome number in Araceae, with a critique of the bias introduced by ‘x’

    PubMed Central

    Cusimano, Natalie; Sousa, Aretuza; Renner, Susanne S.

    2012-01-01

    Background and Aims For 84 years, botanists have relied on calculating the highest common factor for series of haploid chromosome numbers to arrive at a so-called basic number, x. This was done without consistent (reproducible) reference to species relationships and frequencies of different numbers in a clade. Likelihood models that treat polyploidy, chromosome fusion and fission as events with particular probabilities now allow reconstruction of ancestral chromosome numbers in an explicit framework. We have used a modelling approach to reconstruct chromosome number change in the large monocot family Araceae and to test earlier hypotheses about basic numbers in the family. Methods Using a maximum likelihood approach and chromosome counts for 26 % of the 3300 species of Araceae and representative numbers for each of the other 13 families of Alismatales, polyploidization events and single chromosome changes were inferred on a genus-level phylogenetic tree for 113 of the 117 genera of Araceae. Key Results The previously inferred basic numbers x = 14 and x = 7 are rejected. Instead, maximum likelihood optimization revealed an ancestral haploid chromosome number of n = 16, Bayesian inference of n = 18. Chromosome fusion (loss) is the predominant inferred event, whereas polyploidization events occurred less frequently and mainly towards the tips of the tree. Conclusions The bias towards low basic numbers (x) introduced by the algebraic approach to inferring chromosome number changes, prevalent among botanists, may have contributed to an unrealistic picture of ancestral chromosome numbers in many plant clades. The availability of robust quantitative methods for reconstructing ancestral chromosome numbers on molecular phylogenetic trees (with or without branch length information), with confidence statistics, makes the calculation of x an obsolete approach, at least when applied to large clades. PMID:22210850

  12. Probabilistic analysis on the failure of reactivity control for the PWR

    NASA Astrophysics Data System (ADS)

    Sony Tjahyani, D. T.; Deswandri; Sunaryo, G. R.

    2018-02-01

    The fundamental safety functions of a power reactor are to control reactivity, to remove heat from the reactor, and to confine radioactive material. Safety analysis is used to ensure that each parameter is fulfilled in the design and is performed by deterministic and probabilistic methods. The analysis of reactivity control is important because its failure affects the other fundamental safety functions. The purpose of this research is to determine the failure probability of reactivity control and its failure contribution for a PWR design. The analysis is carried out by determining the intermediate events which cause the failure of reactivity control. Furthermore, the basic events are determined by a deductive method using fault tree analysis. The AP1000 is used as the object of the research. The probability data on component failure or human error used in the analysis are collected from IAEA, Westinghouse, NRC and other published documents. The results show that there are six intermediate events which can cause the failure of reactivity control. These intermediate events are uncontrolled rod bank withdrawal at low power or full power, malfunction of boron dilution, misalignment of control rod withdrawal, malfunction of improper position of fuel assembly and ejection of control rod. The failure probability of reactivity control is 1.49E-03 per year. The failure causes affected by human factors are boron dilution, misalignment of control rod withdrawal and malfunction of improper position of fuel assembly. Based on the assessment, it is concluded that the failure probability of reactivity control on the PWR is still within the IAEA criteria.

  13. Mass extinctions: Persistent problems and new directions

    NASA Technical Reports Server (NTRS)

    Jablonski, D.

    1994-01-01

    Few contest that mass extinctions have punctuated the history of life, or that those events were so pervasive environmentally, taxonomically, and geographically that physical forcing factors were probably involved. However, consensus remains elusive on the nature of those factors, and on how a given perturbation - impact, volcanism, sea-level change, or ocean anoxic event - could actually generate the observed intensity and selectivity of biotic losses. At least two basic problems underlie these long-standing disagreements: difficulties in resolving the fine details of taxon ranges and abundances immediately prior to and after an extinction boundary and the scarcity of simple, unitary cause-and-effect relations in complex biological systems.

  14. Causal illusions in children when the outcome is frequent

    PubMed Central

    2017-01-01

    Causal illusions occur when people perceive a causal relation between two events that are actually unrelated. One factor that has been shown to promote these mistaken beliefs is the outcome probability. Thus, people tend to overestimate the strength of a causal relation when the potential consequence (i.e. the outcome) occurs with a high probability (outcome-density bias). Given that children and adults differ in several important features involved in causal judgment, including prior knowledge and basic cognitive skills, developmental studies can be considered an outstanding approach to detect and further explore the psychological processes and mechanisms underlying this bias. However, the outcome density bias has been mainly explored in adulthood, and no previous evidence for this bias has been reported in children. Thus, the purpose of this study was to extend outcome-density bias research to childhood. In two experiments, children between 6 and 8 years old were exposed to two similar setups, both showing a non-contingent relation between the potential cause and the outcome. These two scenarios differed only in the probability of the outcome, which could either be high or low. Children judged the relation between the two events to be stronger in the high probability of the outcome setting, revealing that, like adults, they develop causal illusions when the outcome is frequent. PMID:28898294

  15. Risk Importance Measures in the Design and Operation of Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrbanic, I.; Samanta, P.; Basic, I.

    This monograph presents and discusses risk importance measures as quantified by the probabilistic risk assessment (PRA) models of nuclear power plants (NPPs) developed according to current standards and practices. Usually, PRA tools calculate risk importance measures related to a single 'basic event' representing a particular failure mode. This is, then, reflected in many current PRA applications. The monograph focuses on the concept of 'component-level' importance measures that take into account different failure modes of the component, including common-cause failures (CCFs). In the opening sections the role of risk assessment in safety analysis of an NPP is introduced and a discussion is given of 'traditional', mainly deterministic, design principles which have been established to assign a level of importance to a particular system, structure or component. This is followed by an overview of the main risk importance measures for risk increase and risk decrease from current PRAs. Basic relations which exist among the measures are shown. Some of the current practical applications of risk importance measures from the field of NPP design, operation and regulation are discussed. The core of the monograph provides a discussion on the theoretical background and practical aspects of the main risk importance measures at the level of 'component' as modeled in a PRA, starting from the simplest case, a single basic event, and going toward more complex cases with multiple basic events and involvements in CCF groups. The intent is to express the component-level importance measures via the importance measures and probabilities of the underlying single basic events, which are the inputs readily available from a PRA model and its results. Formulas are derived and discussed for some typical cases. The formulas and their results are demonstrated through some practical examples, done by means of a simplified PRA model developed in and run by the RiskSpectrum tool, which are presented in the appendices. The monograph concludes with a discussion of limitations of the use of risk importance measures and a summary of the component-level importance cases evaluated.

  16. Modulation of cognitive control levels via manipulation of saccade trial-type probability assessed with event-related BOLD fMRI.

    PubMed

    Pierce, Jordan E; McDowell, Jennifer E

    2016-02-01

    Cognitive control supports flexible behavior adapted to meet current goals and can be modeled through investigation of saccade tasks with varying cognitive demands. Basic prosaccades (rapid glances toward a newly appearing stimulus) are supported by neural circuitry, including occipital and posterior parietal cortex, frontal and supplementary eye fields, and basal ganglia. These trials can be contrasted with complex antisaccades (glances toward the mirror image location of a stimulus), which are characterized by greater functional magnetic resonance imaging (MRI) blood oxygenation level-dependent (BOLD) signal in the aforementioned regions and recruitment of additional regions such as dorsolateral prefrontal cortex. The current study manipulated the cognitive demands of these saccade tasks by presenting three rapid event-related runs of mixed saccades with a varying probability of antisaccade vs. prosaccade trials (25, 50, or 75%). Behavioral results showed an effect of trial-type probability on reaction time, with slower responses in runs with a high antisaccade probability. Imaging results exhibited an effect of probability in bilateral pre- and postcentral gyrus, bilateral superior temporal gyrus, and medial frontal gyrus. Additionally, the interaction between saccade trial type and probability revealed a strong probability effect for prosaccade trials, showing a linear increase in activation parallel to antisaccade probability in bilateral temporal/occipital, posterior parietal, medial frontal, and lateral prefrontal cortex. In contrast, antisaccade trials showed elevated activation across all runs. Overall, this study demonstrated that improbable performance of a typically simple prosaccade task led to augmented BOLD signal to support changing cognitive control demands, resulting in activation levels similar to the more complex antisaccade task. Copyright © 2016 the American Physiological Society.

  17. A Robust Response of Precipitation to Global Warming from CMIP5 Models

    NASA Technical Reports Server (NTRS)

    Lau, K. -M.; Wu, H. -T.; Kim, K. -M.

    2012-01-01

    How precipitation responds to global warming is a major concern to society and a challenge to climate change research. Based on analyses of rainfall probability distribution functions of 14 state-of-the-art climate models, we find a robust, canonical global rainfall response to a triple CO2 warming scenario, featuring 100-250% more heavy rain, 5-10% less moderate rain, and 10-15% more very light or no-rain events. Regionally, a majority of the models project a consistent response with more heavy rain events over climatologically wet regions of the deep tropics, and more dry events over subtropical and tropical land areas. Results suggest that increased CO2 emissions induce basic structural changes in global rain systems, increasing risks of severe floods and droughts in preferred geographic locations worldwide.

  18. Directly observable optical properties of sprites in Central Europe

    NASA Astrophysics Data System (ADS)

    Bór, József

    2013-04-01

    Luminous optical emissions accompanying streamer-based natural electric breakdown processes initiating in the mesosphere are called sprites. 489 sprite events have been observed with a TV frame rate video system in Central Europe from Sopron (47.68N, 16.58E, 230 m MSL), Hungary between 2007 and 2009. On the basis of these observations, characteristic morphological properties of sprites, i.e. basic forms (e.g. column, carrot, angel, etc.) as well as common morphological features (e.g. tendrils, glows, puffs, beads, etc.), have been identified. Probable time sequences of streamer propagation directions were associated with each of the basic sprite forms. It is speculated that different sequences of streamer propagation directions can result in very similar final sprite shapes. The number and type variety of sprite elements appearing in an event as well as the total optical duration of an event was analyzed statistically. Jellyfish and dancing sprite events were considered as special subsets of sprite clusters. It was found that more than 90% of the recorded sprite elements appeared in clusters rather than alone and more than half of the clusters contained more than one basic sprite forms. The analysis showed that jellyfish sprites and clusters of column sprites featuring glows and tendrils do not tend to have optical lifetimes longer than 80 ms. Such very long optical lifetimes have not been observed in sprite clusters containing more than 25 elements of any type, either. In contrast to clusters containing sprite entities of only one form, sprite events showing more sprite forms seem to have extended optical durations more likely. The need for further investigation and for finding theoretical concepts to link these observations to electric conditions ambient for sprite formation is emphasized.

  19. Reliability analysis of the F-8 digital fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1981-01-01

    The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems giving aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program written in a modular fashion that duplicates the structure of these equations.

  20. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  1. Unraveling multiple changes in complex climate time series using Bayesian inference

    NASA Astrophysics Data System (ADS)

    Berner, Nadine; Trauth, Martin H.; Holschneider, Matthias

    2016-04-01

    Change points in time series are perceived as heterogeneities in the statistical or dynamical characteristics of observations. Unraveling such transitions yields essential information for the understanding of the observed system. The precise detection and basic characterization of underlying changes is therefore of particular importance in environmental sciences. We present a kernel-based Bayesian inference approach to investigate direct as well as indirect climate observations for multiple generic transition events. In order to develop a diagnostic approach designed to capture a variety of natural processes, the basic statistical features of central tendency and dispersion are used to locally approximate a complex time series by a generic transition model. A Bayesian inversion approach is developed to robustly infer the location and the generic pattern of such a transition. To systematically investigate time series for multiple changes occurring at different temporal scales, the Bayesian inversion is extended to a kernel-based inference approach. By introducing basic kernel measures, the kernel inference results are combined into a proxy for the posterior distribution of multiple transitions. Thus, based on a generic transition model, a probability expression is derived that is capable of indicating multiple changes within a complex time series. We discuss the method's performance by investigating direct and indirect climate observations. The approach is applied to an environmental time series (about 100 a) from the weather station in Tuscaloosa, Alabama, and confirms documented instrumentation changes. Moreover, the approach is used to investigate a set of complex terrigenous dust records from ODP sites 659, 721/722 and 967, interpreted as climate indicators for the African region over the Plio-Pleistocene (about 5 Ma). The detailed inference unravels multiple transitions underlying the indirect climate observations, coinciding with established global climate events.

  2. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    ERIC Educational Resources Information Center

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  3. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  4. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    ERIC Educational Resources Information Center

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  5. Political violence and mental health of Bedouin children in the West Bank, Palestine: a cross-sectional study.

    PubMed

    Massad, Salwa; Khammash, Umaiyeh; Shute, Rosalyn

    2017-09-01

    The Bedouin population is among the most vulnerable in Palestine, subject to forced relocation and lacking basic necessities, including water and electricity. To our knowledge, there are no studies on the mental health of Palestinian Bedouin children. A cross-sectional household survey examining exposure to traumatic events and mental health was conducted among 455 refugee children aged 5-16 years, randomly selected from 18 Bedouin communities throughout the West Bank, including East Jerusalem. Mental health status was measured using the Strengths and Difficulties Questionnaire. Based on reports by mothers, teachers and children, 44% of the participants in the study had a probable psychiatric disorder. Exposure to traumatic events, fair/poor maternal self-rated mental health, and younger age were positively associated with child mental health problems. The findings highlight the importance of maternal mental health as a contributing factor affecting children's vulnerability. Bedouin mothers and their children need immediate psychosocial intervention, as well as the protection of their basic human rights.

  6. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    ERIC Educational Resources Information Center

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined, and the resulting presentation of various probability concepts is described.
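    As a quick complement to the exercise, the relevant probabilities can be enumerated directly; the sketch below assumes the game's usual betting categories of a two-dice sum under 7, exactly 7, and over 7.

    ```python
    from itertools import product
    from fractions import Fraction

    # Enumerate the 36 equally likely outcomes of rolling two fair dice and
    # tally the usual "Unders and Overs" betting categories (assumed rules).
    sums = [a + b for a, b in product(range(1, 7), repeat=2)]
    n = len(sums)

    print("P(sum < 7)  =", Fraction(sum(s < 7 for s in sums), n))    # 15/36 = 5/12
    print("P(sum == 7) =", Fraction(sum(s == 7 for s in sums), n))   # 6/36  = 1/6
    print("P(sum > 7)  =", Fraction(sum(s > 7 for s in sums), n))    # 15/36 = 5/12
    ```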

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miall, A.D.

    The basic premise of the recent Exxon cycle chart, that there exists a globally correlatable suite of third-order eustatic cycles, remains unproven. Many of the tests of this premise are based on circular reasoning. The implied precision of the Exxon global cycle chart is not supportable, because it is greater than that of the best available chronostratigraphic techniques, such as those used to construct the global standard time scale. Correlations of new stratigraphic sections with the Exxon chart will almost always succeed, because there are so many Exxon sequence-boundary events from which to choose. This is demonstrated by the use of four synthetic sections constructed from tables of random numbers. A minimum of 77% successful correlations of random events with the Exxon chart was achieved. The existing cycle chart represents an amalgam of regional and local tectonic events and probably also includes unrecognized miscorrelations. It is of questionable value as an independent standard of geologic time.

  8. Rogue waves and entropy consumption

    NASA Astrophysics Data System (ADS)

    Hadjihoseini, Ali; Lind, Pedro G.; Mori, Nobuhito; Hoffmann, Norbert P.; Peinke, Joachim

    2017-11-01

    Based on data from the Sea of Japan and the North Sea the occurrence of rogue waves is analyzed by a scale-dependent stochastic approach, which interlinks fluctuations of waves for different spacings. With this approach we are able to determine a stochastic cascade process, which provides information on the general multipoint statistics. Furthermore, the evolution of single trajectories in scale, which characterize wave height fluctuations in the surroundings of a chosen location, can be determined. Explicit knowledge of the stochastic process makes it possible to assign entropy values to all wave events. We show that for these entropies the integral fluctuation theorem, a basic law of non-equilibrium thermodynamics, is valid. This implies that positive and negative entropy events must occur. Extreme events like rogue waves are characterized as negative entropy events. The statistics of these entropy fluctuations change with the wave state; for the Sea of Japan, for example, the entropy statistics have a more pronounced tail toward negative entropy values, indicating a higher probability of rogue waves.
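    For reference, the integral fluctuation theorem invoked here has the standard form shown below (a general statement of non-equilibrium thermodynamics, not a result specific to the wave data); it implies that trajectories with negative total entropy change must occur with nonzero probability whenever positive ones do.

    ```latex
    % Integral fluctuation theorem (standard form) and the bound it implies,
    % via Markov's inequality, on the probability of negative-entropy events.
    \left\langle e^{-\Delta S_{\mathrm{tot}}} \right\rangle = 1
    \qquad\Longrightarrow\qquad
    P\!\left(\Delta S_{\mathrm{tot}} \le -s\right) \le e^{-s}, \quad s \ge 0 .
    ```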

  9. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe temporal and microspatial density functions to correlate DNA and oxidative damage with non-targeted effects such as signaling and bystander effects. These effects are ignored by, or impossible to treat with, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic physical and biophysical quantities for the high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the first option, properties of monoenergetic beams are treated. In the second option, the transport of beams in different materials is treated. The same biophysical properties as in the first option are evaluated for the primary ion and its secondary particles. Additional properties related to the nuclear fragmentation of the beam are evaluated. The GERM code is a computationally efficient Monte-Carlo heavy-ion-beam model. It includes accurate models of LET, range, residual energy, and straggling, and the quantum multiple scattering fragmentation (QMSGRG) nuclear database.
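    One of the biophysical quantities mentioned, the Poisson distribution of particle traversals for a specified cellular area, follows from the standard relation mean hits = fluence × area; the sketch below illustrates that relation (it is not the GERM implementation, and the fluence and area values are hypothetical).

    ```python
    from math import exp, factorial

    def poisson_pmf(k, mean):
        """Probability of exactly k particle traversals when hits are Poisson."""
        return mean**k * exp(-mean) / factorial(k)

    # Standard relation: expected traversals = fluence (particles/cm^2) x area (cm^2).
    fluence_per_cm2 = 2.0e6          # hypothetical beam fluence
    cell_area_cm2 = 100.0e-8         # 100 square micrometres expressed in cm^2
    mean_hits = fluence_per_cm2 * cell_area_cm2   # = 2.0 traversals on average

    for k in range(5):
        print(f"P({k} traversals) = {poisson_pmf(k, mean_hits):.3f}")
    ```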

  10. One hundred years of return period: Strengths and limitations

    NASA Astrophysics Data System (ADS)

    Volpi, E.; Fiori, A.; Grimaldi, S.; Lombardo, F.; Koutsoyiannis, D.

    2015-10-01

    One hundred years from its original definition by Fuller, the probabilistic concept of return period is widely used in hydrology as well as in other disciplines of geosciences to give an indication of critical event rareness. This concept owes its popularity, especially in engineering practice for design and risk assessment, to its ease of use and understanding; however, the return period relies on some basic assumptions that should be satisfied for a correct application of this statistical tool. Indeed, conventional frequency analysis in hydrology is performed by assuming as necessary conditions that extreme events arise from a stationary distribution and are independent of one another. The main objective of this paper is to investigate the properties of the return period when the independence condition is omitted; hence, we explore how the different definitions of return period available in the literature affect the results of frequency analysis for processes correlated in time. We demonstrate that, for stationary processes, the independence condition is not necessary in order to apply the classical equation of return period (i.e., the inverse of exceedance probability). On the other hand, we show that the time-correlation structure of hydrological processes modifies the shape of the distribution function of which the return period represents the first moment. This implies that, in the context of time-dependent processes, the return period might not represent an exhaustive measure of the probability of failure, and that its blind application could lead to misleading results. To overcome this problem, we introduce the concept of Equivalent Return Period, which controls the probability of failure while preserving the virtue of effectively communicating event rareness.
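    To make the classical definition concrete, the sketch below estimates the return period of a level as the reciprocal of its empirical annual exceedance probability, assuming independent annual maxima; the series is synthetic, not from the paper.

    ```python
    import numpy as np

    # Classical return period for independent annual maxima: the reciprocal of
    # the annual exceedance probability of the chosen level (synthetic data).
    rng = np.random.default_rng(1)
    annual_max = rng.gumbel(loc=100.0, scale=25.0, size=10_000)   # hypothetical series

    def return_period(level, sample):
        p_exceed = np.mean(sample > level)          # empirical exceedance probability
        return np.inf if p_exceed == 0 else 1.0 / p_exceed

    for level in (150.0, 200.0, 250.0):
        print(f"level {level:.0f}: return period ≈ {return_period(level, annual_max):.0f} years")
    ```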

  11. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    DOT National Transportation Integrated Search

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  12. Evaluation of anthropogenic influence in probabilistic forecasting of coastal change

    NASA Astrophysics Data System (ADS)

    Hapke, C. J.; Wilson, K.; Adams, P. N.

    2014-12-01

    Prediction of large scale coastal behavior is especially challenging in areas of pervasive human activity. Many coastal zones on the Gulf and Atlantic coasts are moderately to highly modified through the use of soft sediment and hard stabilization techniques. These practices have the potential to alter sediment transport and availability, as well as reshape the beach profile, ultimately transforming the natural evolution of the coastal system. We present the results of a series of probabilistic models, designed to predict the observed geomorphic response to high wave events at Fire Island, New York. The island comprises a variety of land use types, including inhabited communities with modified beaches, where beach nourishment and artificial dune construction (scraping) occur, unmodified zones, and protected national seashore. This variation in land use presents an opportunity for comparison of model accuracy across highly modified and rarely modified stretches of coastline. Eight models with basic and expanded structures were developed, resulting in sixteen models, informed with observational data from Fire Island. The basic model type does not include anthropogenic modification. The expanded model includes records of nourishment and scraping, designed to quantify the improved accuracy when anthropogenic activity is represented. Modification was included as frequency of occurrence divided by the time since the most recent event, to distinguish between recent and historic events. All but one model reported improved predictive accuracy from the basic to expanded form. The addition of nourishment and scraping parameters resulted in a maximum reduction in predictive error of 36%. The seven improved models reported an average 23% reduction in error. These results indicate that it is advantageous to incorporate the human forcing into a coastal hazards probability model framework.

  13. The predictive value of chronic kidney disease for assessing cardiovascular events under consideration of pretest probability for coronary artery disease in patients who underwent stress myocardial perfusion imaging.

    PubMed

    Furuhashi, Tatsuhiko; Moroi, Masao; Joki, Nobuhiko; Hase, Hiroki; Masai, Hirofumi; Kunimasa, Taeko; Fukuda, Hiroshi; Sugi, Kaoru

    2013-02-01

    Pretest probability of coronary artery disease (CAD) facilitates the diagnosis and risk stratification of CAD. Stress myocardial perfusion imaging (MPI) and chronic kidney disease (CKD) are established major predictors of cardiovascular events. However, the role of CKD in assessing the pretest probability of CAD has been unclear. This study evaluates the predictive value of CKD for cardiovascular events, under consideration of pretest probability, in patients who underwent stress MPI. Patients with no history of CAD underwent stress MPI (n = 310; male = 166; age = 70; CKD = 111; low/intermediate/high pretest probability = 17/194/99) and were followed for 24 months. Cardiovascular events included cardiac death and nonfatal acute coronary syndrome. Cardiovascular events occurred in 15 of the 310 patients (4.8%), but not in those with low pretest probability, which included 2 CKD patients. In patients with intermediate to high pretest probability (n = 293), multivariate Cox regression analysis identified only CKD [hazard ratio (HR) = 4.88; P = 0.022] and the summed stress score of stress MPI (HR = 1.50; P < 0.001) as independent and significant predictors of cardiovascular events. Cardiovascular events were not observed in patients with low pretest probability. In patients with intermediate to high pretest probability, CKD and stress MPI are independent predictors of cardiovascular events when the pretest probability of CAD is considered in patients with no history of CAD. In assessing the pretest probability of CAD, CKD might be an important factor for assessing future cardiovascular prognosis.

  14. Forecasting Tidal Disruption Events for Binary Black Holes with an Outer Tertiary.

    PubMed

    Seto, Naoki; Kyutoku, Koutarou

    2017-04-14

    We discuss the gravitational wave (GW) emission and the orbital evolution of a hierarchical triple system composed of an inner binary black hole (BBH) and an outer tertiary. Depending on the kick velocity at the merger, the merged BBH could tidally disrupt the tertiary. Even though the fraction of BBH mergers accompanied by such disruptions is expected to be much smaller than unity, the existence of a tertiary and its basic parameters (e.g., semimajor axis, projected mass) can be examined for more than 10^{3} BBHs with the follow-on missions to the space GW detector LISA. This allows us to efficiently prescreen the targets for the follow-up searches for the tidal disruption events (TDEs). The TDE probability would be significantly higher for triple systems with aligned orbital- and spin-angular momenta, compared with random configurations.

  15. Assessing performance and validating finite element simulations using probabilistic knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolin, Ronald M.; Rodriguez, E. A.

    Two probabilistic approaches for assessing performance are presented. The first approach assesses the probability of failure by simultaneously modeling all likely events. The probability that each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
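    As an illustration of the stochastic-sampling approach, the sketch below draws Latin-hypercube samples of two uncertain inputs for a single hypothetical event and estimates its probability of failure; the limit-state function, bounds, and sample size are assumptions, not the report's models.

    ```python
    import numpy as np
    from scipy.stats import qmc   # Latin-hypercube sampler (SciPy >= 1.7)

    # Latin-hypercube samples of two uncertain inputs for one hypothetical event.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit_samples = sampler.random(n=5000)                       # points in [0, 1)^2
    load, strength = qmc.scale(unit_samples, [0.0, 8.0], [10.0, 14.0]).T

    # Failure when the limit-state function g = strength - load is negative.
    g = strength - load
    p_fail_event = float(np.mean(g < 0.0))

    # With several events assessed this way, the overall probability of failure
    # is taken as the maximum over the individual events (only one event here).
    p_fail_overall = max([p_fail_event])
    print(f"P(failure | event) ≈ {p_fail_event:.3f}, overall ≈ {p_fail_overall:.3f}")
    ```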

  16. Improving online risk assessment with equipment prognostics and health monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie B.; Liu, Xiaotong; Briere, Chris

    The current approach to evaluating the risk of nuclear power plant (NPP) operation relies on static probabilities of component failure, which are based on industry experience with the existing fleet of nominally similar light water reactors (LWRs). As the nuclear industry looks to advanced reactor designs that feature non-light water coolants (e.g., liquid metal, high temperature gas, molten salt), this operating history is not available. Many advanced reactor designs use advanced components, such as electromagnetic pumps, that have not been used in the US commercial nuclear fleet. Given the lack of rich operating experience, we cannot accurately estimate the evolving probability of failure for basic components to populate the fault trees and event trees that typically comprise probabilistic risk assessment (PRA) models. Online equipment prognostics and health management (PHM) technologies can bridge this gap to estimate the failure probabilities for components under operation. The enhanced risk monitor (ERM) incorporates equipment condition assessment into the existing PRA and risk monitor framework to provide accurate and timely estimates of operational risk.

  17. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  18. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  19. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases in which the process is subcritical, critical, or supercritical. One of the direct uses of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants is found to be as high as about 15% of all events. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases when the magnitude of the background event increases. These results, obtained from a complicated clustering model, where the characteristics of background events and triggered events are different, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  20. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  1. On Replacing "Quantum Thinking" with Counterfactual Reasoning

    NASA Astrophysics Data System (ADS)

    Narens, Louis

    The probability theory used in quantum mechanics is currently being employed by psychologists to model the impact of context on decisions. Its event space consists of closed subspaces of a Hilbert space, and its probability function sometimes violates the law of the finite additivity of probabilities. Results from the quantum mechanics literature indicate that such a "Hilbert space probability theory" cannot be extended in a useful way to standard, finitely additive, probability theory by the addition of new events with specific probabilities. This chapter presents a new kind of probability theory that shares many fundamental algebraic characteristics with Hilbert space probability theory but does extend to standard probability theory by adjoining new events with specific probabilities. The new probability theory arises from considerations about how psychological experiments are related through counterfactual reasoning.

  2. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia.

    PubMed

    Mavromoustakos, Elena; Clark, Gavin I; Rock, Adam J

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than non-flying phobics did. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined.

  3. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia

    PubMed Central

    Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than non-flying phobics did. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined. PMID:27557054

  4. A short note on probability in clinical medicine.

    PubMed

    Upshur, Ross E G

    2013-06-01

    Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues to understand uncertainty in clinical medicine. © 2013 John Wiley & Sons Ltd.

  5. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
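    The core computation behind conditional probability analysis can be illustrated in a few lines; the stressor/response data and threshold below are synthetic, and this is not the CPROB tool itself.

    ```python
    import numpy as np

    # Conditional probability analysis: P(impairment | stressor > threshold),
    # estimated from paired stressor/response observations (synthetic data).
    rng = np.random.default_rng(2)
    stressor = rng.lognormal(mean=0.0, sigma=1.0, size=1_000)
    impaired = rng.random(1_000) < np.clip(0.1 + 0.2 * np.log1p(stressor), 0.0, 1.0)

    threshold = 1.5
    exceeds = stressor > threshold
    p_uncond = impaired.mean()                 # P(impairment)
    p_cond = impaired[exceeds].mean()          # P(impairment | stressor > threshold)
    print(f"P(impaired) ≈ {p_uncond:.2f}, P(impaired | stressor > {threshold}) ≈ {p_cond:.2f}")
    ```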

  6. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  7. Global assessment of surfing conditions: seasonal, interannual and long-term variability

    NASA Astrophysics Data System (ADS)

    Espejo, A.; Losada, I.; Mendez, F.

    2012-12-01

    International surfing destinations owe a great debt to specific combinations of wind-wave conditions, thermal conditions and local bathymetry. As surf quality depends on a vast number of geophysical variables, a multivariable standardized index based on expert judgment is proposed to analyze the surf resource in a worldwide domain. The data needed are obtained by combining several datasets (reanalyses): a 60-year satellite-calibrated spectral wave hindcast (GOW, WaveWatchIII), wind fields from NCEP/NCAR, global sea surface temperature from ERSST.v3b, and global tides from TPXO7.1. A summary of the global surf resource is presented, which highlights the high degree of variability in surfable events. Consistent with the general atmospheric circulation, results show that west-facing low- to middle-latitude coasts are more suitable for surfing, especially those in the Southern Hemisphere. Month-to-month analysis reveals strong seasonal changes in the occurrence of surfable events, particularly in the North Atlantic and North Pacific. Interannual variability is investigated by comparing occurrence values with global and regional climate patterns, showing a great influence at both global and regional scales. Analysis of long-term trends shows an increase in the probability of surfable events over the west-facing coasts of the planet (e.g., +30 hours/year in California). The resulting maps provide useful information for surfers and surf-related stakeholders, coastal planning, education, and basic research.
    Figure 1. Global distribution of the probability of medium-quality (a) and high-quality (b) surf conditions.

  8. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.

  9. Topological and Orthomodular Modeling of Context in Behavioral Science

    NASA Astrophysics Data System (ADS)

    Narens, Louis

    2017-02-01

    Two non-boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs in a variety of ways from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.

  10. Sensor data monitoring and decision level fusion scheme for early fire detection

    NASA Astrophysics Data System (ADS)

    Rizogiannis, Constantinos; Thanos, Konstantinos Georgios; Astyakopoulos, Alkiviadis; Kyriazanos, Dimitris M.; Thomopoulos, Stelios C. A.

    2017-05-01

    The aim of this paper is to present the sensor monitoring and decision-level fusion scheme for early fire detection that has been developed in the context of the AF3 Advanced Forest Fire Fighting European FP7 research project, adopted specifically in the OCULUS-Fire control and command system and tested during a firefighting field test in Greece with a prescribed real fire, generating early-warning detection alerts and notifications. For this purpose, and in order to improve the reliability of the fire detection system, a two-level fusion scheme is developed exploiting a variety of observation solutions from the air (e.g., UAV infrared cameras), the ground (e.g., meteorological and atmospheric sensors), and ancillary sources (e.g., public information channels, citizens' smartphone applications and social media). In the first level, a change point detection technique is applied to detect changes in the mean value of each parameter measured by the ground sensors, such as temperature, humidity and CO2, and then the Rate-of-Rise of each changed parameter is calculated. In the second level, the fire-event Basic Probability Assignment (BPA) function is determined for each ground sensor using fuzzy-logic theory, and the corresponding mass values are then combined in a decision-level fusion process using evidential reasoning theory to estimate the final fire event probability.
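    The second-level combination of sensor mass values can be illustrated with Dempster's rule over a two-hypothesis frame; the sketch below is a generic evidential-reasoning example with made-up mass assignments, not the OCULUS-Fire implementation.

    ```python
    # Dempster's rule of combination for two sensors assigning mass to {fire},
    # {no_fire}, and the full frame theta (uncertainty); values are hypothetical.
    def combine(m1, m2):
        keys = ["fire", "no_fire", "theta"]

        def intersect(a, b):
            if a == "theta":
                return b
            if b == "theta":
                return a
            return a if a == b else None        # disjoint singletons -> empty set

        combined = {k: 0.0 for k in keys}
        conflict = 0.0
        for a in keys:
            for b in keys:
                c = intersect(a, b)
                if c is None:
                    conflict += m1[a] * m2[b]   # conflicting evidence
                else:
                    combined[c] += m1[a] * m2[b]
        # Normalise by the non-conflicting mass.
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    temperature_sensor = {"fire": 0.6, "no_fire": 0.1, "theta": 0.3}
    co2_sensor = {"fire": 0.5, "no_fire": 0.2, "theta": 0.3}
    print(combine(temperature_sensor, co2_sensor))   # fused belief in a fire event
    ```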

  11. The relationship between negative life events and suicidal behavior: moderating role of basic psychological needs.

    PubMed

    Rowe, Catherine A; Walker, Kristin L; Britton, Peter C; Hirsch, Jameson K

    2013-01-01

    Individuals who experience negative life events may be at increased risk for suicidal behavior. Intrapersonal characteristics, such as basic psychological needs, however, may buffer this association. Our aim was to assess the potential moderating role of overall basic psychological needs, and of the separate components of autonomy, competence, and relatedness, in the association between negative life events and suicidal behavior. Our sample of 439 college students (311 females, 71%) completed the following self-report surveys: Life Events Scale, Basic Psychological Needs Scale, Beck Depression Inventory - II, and the Suicide Behaviors Questionnaire-Revised. In support of our hypotheses, negative life events were associated with greater levels of suicidal ideation and attempts, and satisfaction of basic psychological needs, including autonomy, relatedness, and competence, significantly moderated this relationship, over and above the effects of the covariates of age, sex, and depressive symptoms. Suicidal behavior associated with the experience of negative life events is not inevitable. Therapeutically bolstering competence, autonomy, and relatedness may be an important suicide prevention strategy for individuals experiencing life stressors.

  12. Seeing the Forest when Entry Is Unlikely: Probability and the Mental Representation of Events

    ERIC Educational Resources Information Center

    Wakslak, Cheryl J.; Trope, Yaacov; Liberman, Nira; Alony, Rotem

    2006-01-01

    Conceptualizing probability as psychological distance, the authors draw on construal level theory (Y. Trope & N. Liberman, 2003) to propose that decreasing an event's probability leads individuals to represent the event by its central, abstract, general features (high-level construal) rather than by its peripheral, concrete, specific features…

  13. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes with their elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
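    The central step of the method, converting the count of small events since the last large event into a probability through a Weibull law, can be sketched as below; the Weibull scale and shape parameters here are hypothetical, not the fitted values of the study.

    ```python
    import math

    def large_event_probability(n_small, scale, shape):
        """Convert the count of small events since the last large event into a
        probability via a Weibull law (scale and shape are hypothetical)."""
        return 1.0 - math.exp(-((n_small / scale) ** shape))

    # The probability grows as more small events accumulate without a large one.
    for n_small in (50, 150, 300, 600):
        p = large_event_probability(n_small, scale=400.0, shape=1.5)
        print(f"{n_small:4d} small events -> P(large event) ≈ {p:.3f}")
    ```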

  14. Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.

    PubMed

    Speirs, Calandra; Huang, Vivian; Konnert, Candace

    2017-09-01

    Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.

  15. Probability hazard map for future vent opening at Etna volcano (Sicily, Italy).

    NASA Astrophysics Data System (ADS)

    Brancato, Alfonso; Tusa, Giuseppina; Coltelli, Mauro; Proietti, Cristina

    2014-05-01

    Mount Etna is a composite stratovolcano located along the Ionian coast of eastern Sicily. The frequent occurrence of flank eruptions (at intervals of years, mostly concentrated along the NE, S and W rift zones) leads to a high volcanic hazard that, combined with intense urbanization, poses a high volcanic risk. A long-term volcanic hazard assessment, mainly based on the past behaviour of the Etna volcano, is the basic tool for the evaluation of this risk. A reliable forecast of where the next eruption will occur is therefore needed. A computer-assisted analysis and probabilistic evaluations will provide the relative map, thus allowing identification of the areas prone to the highest hazard. On these grounds, the use of a code such as BET_EF (Bayesian Event Tree_Eruption Forecasting) showed that a suitable analysis can be explored (Selva et al., 2012). Following an analysis we are performing, a total of 6886 point-vents referring to the last 4.0 ka of Etna flank activity, spread over an area of 744 km2 (divided into N=2976 square cells with a side of 500 m), allowed us to estimate a pdf by applying a Gaussian kernel. The probability values represent a complete set of mutually exclusive outcomes and their sum is normalized to one over the investigated area; then, the basic assumptions of a Dirichlet distribution (the prior distribution set in the BET_EF code (Marzocchi et al., 2004, 2008)) still hold. One fundamental parameter is the equivalent number of data, which depicts our confidence in the best-guess probability. The BET_EF code also works with a likelihood function. This is modelled by a Multinomial distribution, with parameters representing the number of vents in each cell and the total number of past data (i.e. the 6886 point-vents). Given the grid of N cells, the final posterior distribution will be evaluated by multiplying the a priori Dirichlet probability distribution with the past data in each cell through the likelihood. The probability hazard map shows a tendency to concentrate along the NE and S rifts, as well as the Valle del Bove, increasing the difference in probability between these areas and the rest of the volcanic edifice. It is worth noting that a higher significance is still evident along the W rift, even if not comparable with that of the above-mentioned areas. References: Marzocchi W., Sandri L., Gasparini P., Newhall C. and Boschi E.; 2004: Quantifying probabilities of volcanic events: The example of volcanic hazard at Mount Vesuvius, J. Geophys. Res., 109, B11201, doi:10.1029/2004JB00315U. Marzocchi W., Sandri L. and Selva J.; 2008: BET_EF: a probabilistic tool for long- and short-term eruption forecasting, Bull. Volcanol., 70, 623-632, doi:10.1007/s00445-007-0157-y. Selva J., Orsi G., Di Vito M.A., Marzocchi W. and Sandri L.; 2012: Probability hazard map for future vent opening at the Campi Flegrei caldera, Italy, Bull. Volcanol., 74, 497-510, doi:10.1007/s00445-011-0528-2.
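    The kernel-density step described above (a Gaussian kernel over past vent locations, normalized to per-cell probabilities) can be sketched generically as follows; the coordinates, grid extent, and cell size are synthetic placeholders rather than the Etna dataset.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Gaussian kernel density over past vent coordinates, evaluated on a regular
    # grid and normalised so that the per-cell probabilities sum to one.
    rng = np.random.default_rng(3)
    easting = rng.normal(500.0, 4.0, size=600)      # km, synthetic vent locations
    northing = rng.normal(4177.0, 5.0, size=600)    # km, synthetic vent locations

    kde = gaussian_kde(np.vstack([easting, northing]))
    x = np.arange(485.0, 515.0, 0.5)                # 0.5 km cells, assumed extent
    y = np.arange(4160.0, 4195.0, 0.5)
    gx, gy = np.meshgrid(x, y)
    density = kde(np.vstack([gx.ravel(), gy.ravel()]))

    cell_prob = density / density.sum()             # mutually exclusive cells, sum = 1
    print(f"most likely cell probability ≈ {cell_prob.max():.4f}, total = {cell_prob.sum():.2f}")
    ```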

  16. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas "et al.", 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  17. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  18. Estimating the probability of rare events: addressing zero failure data.

    PubMed

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure when no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
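    A quick numerical reading of that closing approximation, for a few sample sizes, is sketched below (the sample sizes are arbitrary).

    ```python
    # Zero-failure data: the maximum likelihood estimate is 0, while the
    # abstract's closing approximation suggests roughly 1 / (2.5 * n).
    def zero_failure_approximation(n_trials):
        return 1.0 / (2.5 * n_trials)

    for n in (10, 100, 1000):
        print(f"n = {n:4d} trials, 0 events: MLE = 0, approximation ≈ {zero_failure_approximation(n):.5f}")
    ```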

  19. Harmonic Analysis of Sedimentary Cyclic Sequences in Kansas, Midcontinent, USA

    USGS Publications Warehouse

    Merriam, D.F.; Robinson, J.E.

    1997-01-01

    Several stratigraphic sequences in the Upper Carboniferous (Pennsylvanian) in Kansas (Midcontinent, USA) were analyzed quantitatively for periodic repetitions. The sequences were coded by lithologic type into strings of datasets. The strings were then analyzed by an adaptation of a one-dimensional Fourier transform analysis and examined for evidence of periodicity. The method was tested using different states in coding to determine the robustness of the method and data. The most persistent response is in multiples of 8-10 ft (2.5-3.0 m) and probably is dependent on the depositional thickness of the original lithologic units. Other cyclicities occurred in multiples of the basic 8-10 ft interval, with persistent ones at 22 and 30 feet (6.5-9.0 m) and large ones at 80 and 160 feet (25-50 m). These levels of thickness relate well to the basic cyclothem and megacyclothem as measured on outcrop. We propose that this approach is a suitable one for analyzing cyclic events in the stratigraphic record.
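    The general workflow, coding a section into a numeric string at a fixed depth increment and inspecting its Fourier amplitude spectrum for dominant thickness periodicities, can be sketched as below; the synthetic section and its 30-ft cycle are illustrative, not the Kansas data.

    ```python
    import numpy as np

    # Code a stratigraphic section into a numeric string at 1-ft spacing and
    # look for dominant thickness periodicities in its Fourier amplitude spectrum.
    rng = np.random.default_rng(4)
    depth_ft = np.arange(0, 512)                      # 512 ft synthetic section
    lithology = (np.sin(2 * np.pi * depth_ft / 30.0)  # ~30-ft cyclic signal
                 + 0.5 * rng.standard_normal(depth_ft.size) > 0).astype(float)

    spectrum = np.abs(np.fft.rfft(lithology - lithology.mean()))
    freqs = np.fft.rfftfreq(depth_ft.size, d=1.0)     # cycles per foot
    dominant = 1.0 / freqs[1:][np.argmax(spectrum[1:])]
    print(f"dominant periodicity ≈ {dominant:.1f} ft")
    ```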

  20. A Framework to Understand Extreme Space Weather Event Probability.

    PubMed

    Jonas, Seth; Fronczyk, Kassandra; Pratt, Lucas M

    2018-03-12

    An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments. © 2018 Society for Risk Analysis.

  1. Protocol for evaluation of the cost-effectiveness of ePrescribing systems and candidate prototype for other related health information technologies

    PubMed Central

    2014-01-01

    Background: This protocol concerns the assessment of cost-effectiveness of hospital health information technology (HIT) in four hospitals. Two of these hospitals are acquiring ePrescribing systems incorporating extensive decision support, while the other two will implement systems incorporating more basic clinical algorithms. Implementation of an ePrescribing system will have diffuse effects over myriad clinical processes, so the protocol has to deal with a large amount of information collected at various ‘levels’ across the system. Methods/Design: The method we propose is the use of Bayesian ideas as a philosophical guide. Assessment of cost-effectiveness requires a number of parameters in order to measure incremental cost utility or benefit – the effectiveness of the intervention in reducing the frequency of preventable adverse events; utilities for these adverse events; costs of HIT systems; and cost consequences of adverse events averted. There is no single end-point that adequately and unproblematically captures the effectiveness of the intervention; we therefore plan to observe changes in error rates and adverse events in four error categories (death, permanent disability, moderate disability, minimal effect). For each category we will elicit and pool subjective probability densities from experts for reductions in adverse events resulting from deployment of the intervention in a hospital with extensive decision support. The experts will have been briefed with quantitative and qualitative data from the study and external data sources prior to elicitation. Following this, there will be a process of deliberative dialogues so that experts can “re-calibrate” their subjective probability estimates. The consolidated densities assembled from the repeat elicitation exercise will then be used to populate a health economic model, along with salient utilities. The credible limits from these densities can define thresholds for sensitivity analyses. Discussion: The protocol we present here was designed for evaluation of ePrescribing systems. However, the methodology we propose could be used whenever research cannot provide a direct and unbiased measure of comparative effectiveness. PMID:25038609

  2. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  3. Probabilistic attribution of individual unprecedented extreme events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2016-12-01

    The last decade has seen a rapid increase in efforts to understand the influence of global warming on individual extreme climate events. Although trends in the distributions of climate observations have been thoroughly analyzed, rigorously quantifying the contribution of global-scale warming to individual events that are unprecedented in the observed record presents a particular challenge. This paper describes a method for leveraging observations and climate model ensembles to quantify the influence of historical global warming on the severity and probability of unprecedented events. This approach uses formal inferential techniques to quantify four metrics: (1) the contribution of the observed trend to the event magnitude, (2) the contribution of the observed trend to the event probability, (3) the probability of the observed trend in the current climate and a climate without human influence, and (4) the probability of the event magnitude in the current climate and a climate without human influence. Illustrative examples are presented, spanning a range of climate variables, timescales, and regions. These examples illustrate that global warming can influence the severity and probability of unprecedented extremes. In some cases - particularly high temperatures - this change is indicated by changes in the mean. However, changes in probability do not always arise from changes in the mean, suggesting that global warming can alter the frequency with which complex physical conditions co-occur. Because our framework is transparent and highly generalized, it can be readily applied to a range of climate events, regions, and levels of climate forcing.
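
    A hedged sketch of two of the quantities described above, estimated from two synthetic ensembles: the probability of exceeding the observed event magnitude in the current climate and in a counterfactual climate without human influence, summarized as a risk ratio and a fraction of attributable risk.

      import numpy as np

      rng = np.random.default_rng(0)
      current = rng.normal(loc=1.0, scale=1.0, size=10_000)         # ensemble with historical forcing
      counterfactual = rng.normal(loc=0.0, scale=1.0, size=10_000)  # ensemble without human influence
      event_magnitude = 2.5                                         # unprecedented observed value

      p_current = np.mean(current >= event_magnitude)
      p_counterfactual = np.mean(counterfactual >= event_magnitude)
      risk_ratio = p_current / p_counterfactual
      far = 1.0 - p_counterfactual / p_current   # fraction of attributable risk

      print(f"P(current) = {p_current:.4f}, P(counterfactual) = {p_counterfactual:.4f}")
      print(f"risk ratio = {risk_ratio:.1f}, FAR = {far:.2f}")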

  4. Factorization of Observables

    NASA Astrophysics Data System (ADS)

    Eliaš, Peter; Frič, Roman

    2017-12-01

    Categorical approach to probability leads to better understanding of basic notions and constructions in generalized (fuzzy, operational, quantum) probability, where observables—dual notions to generalized random variables (statistical maps)—play a major role. First, to avoid inconsistencies, we introduce three categories L, S, and P, the objects and morphisms of which correspond to basic notions of fuzzy probability theory and operational probability theory, and describe their relationships. To illustrate the advantages of categorical approach, we show that two categorical constructions involving observables (related to the representation of generalized random variables via products, or smearing of sharp observables, respectively) can be described as factorizing a morphism into composition of two morphisms having desired properties. We close with a remark concerning products.

  5. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus, new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. So constructing a method for estimating how the collision probability will evolve improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process where the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
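
    A minimal Monte Carlo sketch of the core computation (not the authors' implementation): the relative miss vector at the time of closest approach is sampled from a combined position covariance, and the collision probability is the fraction of trials that fall inside the combined hard-body radius. All values are hypothetical.

      import numpy as np

      rng = np.random.default_rng(42)

      mean_miss = np.array([50.0, -20.0, 30.0])          # hypothetical nominal miss vector [m]
      combined_cov = np.diag([100.0, 80.0, 60.0]) ** 2   # hypothetical combined covariance [m^2]
      hard_body_radius = 25.0                            # combined object radius [m]

      n_trials = 1_000_000
      samples = rng.multivariate_normal(mean_miss, combined_cov, size=n_trials)
      hits = np.linalg.norm(samples, axis=1) < hard_body_radius
      print(f"estimated collision probability: {hits.mean():.2e}")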

  6. Probability Elicitation Under Severe Time Pressure: A Rank-Based Method.

    PubMed

    Jaspersen, Johannes G; Montibeller, Gilberto

    2015-07-01

    Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for the Environment, Farming and Rural Affairs): the prioritization of animal health threats. © 2015 Society for Risk Analysis.
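
    For illustration only: the article approximates probabilities from the ordinal ranking with a maximum-entropy-based algorithm, which is not reproduced here. The rank-order centroid weights below are a simpler, commonly used rule for turning a ranking of mutually exclusive events into normalized probabilities, shown purely to make the rank-to-probability idea concrete.

      def rank_order_centroid(n: int) -> list:
          """ROC value for rank k (1 = most probable): (1/n) * sum_{j=k..n} 1/j."""
          return [sum(1.0 / j for j in range(k, n + 1)) / n for k in range(1, n + 1)]

      ranked_events = ["disease outbreak", "flood", "terror attack", "drought"]  # hypothetical ranking
      for event, p in zip(ranked_events, rank_order_centroid(len(ranked_events))):
          print(f"{event:16s} {p:.3f}")   # the four values sum to 1 by construction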

  7. Independent events in elementary probability theory

    NASA Astrophysics Data System (ADS)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E 1, E 2, … , E n are jointly independent then any two events A and B built in finitely many steps from two disjoint subsets of E 1, E 2, … , E n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
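
    A small sketch that checks the quoted statement exactly for three jointly independent events with arbitrary (hypothetical) probabilities, by enumerating the eight atoms of the product space: A = E1 ∪ complement(E2) is built from {E1, E2}, B = E3 is built from the disjoint subset {E3}, and P(A ∩ B) = P(A)P(B) is verified.

      from itertools import product

      p = [0.3, 0.6, 0.45]   # hypothetical P(E1), P(E2), P(E3)

      def atom_prob(bits):
          """Probability of one outcome (e1, e2, e3) under joint independence."""
          prob = 1.0
          for pi, b in zip(p, bits):
              prob *= pi if b else (1.0 - pi)
          return prob

      def event_prob(indicator):
          return sum(atom_prob(bits) for bits in product([0, 1], repeat=3) if indicator(bits))

      A = lambda b: b[0] or (not b[1])   # E1 union complement(E2)
      B = lambda b: b[2]                 # E3
      print(event_prob(lambda b: A(b) and B(b)), event_prob(A) * event_prob(B))  # equal (0.261)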

  8. A risk assessment methodology to evaluate the risk failure of managed aquifer recharge in the Mediterranean Basin

    NASA Astrophysics Data System (ADS)

    Rodríguez-Escales, Paula; Canelles, Arnau; Sanchez-Vila, Xavier; Folch, Albert; Kurtzman, Daniel; Rossetto, Rudy; Fernández-Escalante, Enrique; Lobo-Ferreira, João-Paulo; Sapiano, Manuel; San-Sebastián, Jon; Schüth, Christoph

    2018-06-01

    Managed aquifer recharge (MAR) can be affected by many risks. Those risks are related to different technical and non-technical aspects of recharge, like water availability, water quality, legislation, social issues, etc. Many other works have acknowledged risks of this nature theoretically; however, their quantification and definition have not been developed. In this study, risk definition and quantification have been performed by means of fault trees and probabilistic risk assessment (PRA). We defined a fault tree with 65 basic events applicable to the operation phase. We then applied this methodology to six different managed aquifer recharge sites located in the Mediterranean Basin (Portugal, Spain, Italy, Malta, and Israel). The probabilities of the basic events were defined by expert criteria, based on the knowledge of the different managers of the facilities. From that, we conclude that at all sites the experts perceived the non-technical aspects to be as important as, or even more important than, the technical aspects. Regarding the risk results, we observe that the total risk in three of the six sites was equal to or above 0.90. That would mean that the MAR facilities have a risk of failure equal to or higher than 90 % in the period of 2-6 years. The other three sites presented lower risks (75, 29, and 18 % for Malta, Menashe, and Serchio, respectively).
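
    A toy sketch of the probabilistic risk assessment step, assuming independent basic events: expert-assigned basic event probabilities are propagated through OR and AND gates to the top event. The events and values below are hypothetical; the actual fault tree has 65 basic events.

      from math import prod

      def or_gate(*p):    # failure if any input fails
          return 1.0 - prod(1.0 - pi for pi in p)

      def and_gate(*p):   # failure only if all inputs fail
          return prod(p)

      # Hypothetical basic event probabilities over the evaluation period
      p_water_shortage, p_clogging, p_quality_breach = 0.30, 0.20, 0.15
      p_permit_withdrawn, p_funding_lost = 0.10, 0.25

      technical = or_gate(p_clogging, p_quality_breach)
      non_technical = or_gate(p_permit_withdrawn, and_gate(p_funding_lost, p_water_shortage))
      print(f"P(top event: MAR operation failure) = {or_gate(technical, non_technical):.3f}")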

  9. Usefulness of the novel risk estimation software, Heart Risk View, for the prediction of cardiac events in patients with normal myocardial perfusion SPECT.

    PubMed

    Sakatani, Tomohiko; Shimoo, Satoshi; Takamatsu, Kazuaki; Kyodo, Atsushi; Tsuji, Yumika; Mera, Kayoko; Koide, Masahiro; Isodono, Koji; Tsubakimoto, Yoshinori; Matsuo, Akiko; Inoue, Keiji; Fujita, Hiroshi

    2016-12-01

    Myocardial perfusion single-photon emission-computed tomography (SPECT) can predict cardiac events in patients with coronary artery disease with high accuracy; however, pseudo-negative cases sometimes occur. Heart Risk View, which is based on the prospective cohort study (J-ACCESS), is software for evaluating cardiac event probability. We examined whether Heart Risk View was useful to evaluate the cardiac risk in patients with normal myocardial perfusion SPECT (MPS). We studied 3461 consecutive patients who underwent MPS to detect myocardial ischemia; those who had normal MPS were enrolled in this study (n = 698). We calculated the cardiac event probability with Heart Risk View and followed up the patients for 3.8 ± 2.4 years. The cardiac events were defined as cardiac death, non-fatal myocardial infarction, and heart failure requiring hospitalization. During the follow-up period, 21 patients (3.0 %) had cardiac events. The event probability calculated by Heart Risk View was higher in the event group (5.5 ± 2.6 vs. 2.9 ± 2.6 %, p < 0.001). According to the receiver-operating characteristics curve, the cut-off point of the event probability for predicting cardiac events was 3.4 % (sensitivity 0.76, specificity 0.72, and AUC 0.85). Kaplan-Meier curves and the log-rank test revealed a higher event rate in the high-event-probability group (p < 0.001). Although myocardial perfusion SPECT is useful for the prediction of cardiac events, risk estimation by Heart Risk View adds more prognostic information, especially in patients with normal MPS.
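
    A sketch of how such a cut-off can be chosen: the threshold on the predicted event probability that maximizes Youden's J (sensitivity + specificity - 1) along the ROC curve. The data below are synthetic values loosely shaped like the reported group means, not the study cohort.

      import numpy as np

      rng = np.random.default_rng(1)
      n_event, n_no_event = 21, 677
      probs = np.concatenate([rng.normal(5.5, 2.6, n_event),       # event group, %
                              rng.normal(2.9, 2.6, n_no_event)])   # no-event group, %
      labels = np.concatenate([np.ones(n_event), np.zeros(n_no_event)])

      best_j, best_cut = -1.0, None
      for cut in np.unique(probs):
          pred = probs >= cut
          sens = np.mean(pred[labels == 1])
          spec = np.mean(~pred[labels == 0])
          if sens + spec - 1.0 > best_j:
              best_j, best_cut = sens + spec - 1.0, cut

      print(f"optimal cut-off ≈ {best_cut:.1f}% (Youden J = {best_j:.2f})")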

  10. A Foreign Object Damage Event Detector Data Fusion System for Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Litt, Jonathan S.

    2004-01-01

    A Data Fusion System designed to provide a reliable assessment of the occurrence of Foreign Object Damage (FOD) in a turbofan engine is presented. The FOD-event feature level fusion scheme combines knowledge of shifts in engine gas path performance obtained using a Kalman filter, with bearing accelerometer signal features extracted via wavelet analysis, to positively identify a FOD event. A fuzzy inference system provides basic probability assignments (bpa) based on features extracted from the gas path analysis and bearing accelerometers to a fusion algorithm based on the Dempster-Shafer-Yager Theory of Evidence. Details are provided on the wavelet transforms used to extract the foreign object strike features from the noisy data and on the Kalman filter-based gas path analysis. The system is demonstrated using a turbofan engine combined-effects model (CEM), which provides both gas path and rotor dynamic structural response and is suitable for rapid prototyping of control and diagnostic systems. The fusion of the disparate data can provide significantly more reliable detection of a FOD event than the use of either method alone. The use of fuzzy inference techniques combined with Dempster-Shafer-Yager Theory of Evidence provides a theoretical justification for drawing conclusions based on imprecise or incomplete data.
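
    A minimal sketch of evidence combination over the frame {FOD, no_FOD} using Yager's modified Dempster rule, in which conflicting mass is assigned to the whole frame rather than renormalized away. The basic probability assignments here are hypothetical; the paper derives them from a fuzzy inference system over gas-path and accelerometer features.

      from itertools import product

      FRAME = frozenset({"FOD", "no_FOD"})

      def yager_combine(m1, m2):
          combined, conflict = {}, 0.0
          for (a, x), (b, y) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + x * y
              else:
                  conflict += x * y
          combined[FRAME] = combined.get(FRAME, 0.0) + conflict   # conflict goes to the frame
          return combined

      # Hypothetical bpas: one from gas-path shift features, one from bearing accelerometers.
      m_gas = {frozenset({"FOD"}): 0.6, frozenset({"no_FOD"}): 0.1, FRAME: 0.3}
      m_accel = {frozenset({"FOD"}): 0.7, frozenset({"no_FOD"}): 0.2, FRAME: 0.1}

      for focal, mass in yager_combine(m_gas, m_accel).items():
          print(set(focal), round(mass, 3))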

  11. Quantifying Dental Office-Originating Adverse Events: The Dental Practice Study Methods.

    PubMed

    Tokede, Oluwabunmi; Walji, Muhammad; Ramoni, Rachel; Rindal, Donald B; Worley, Donald; Hebballi, Nutan; Kumar, Krishna; van Strien, Claire; Chen, Mengxia; Navat-Pelli, Shaked; Liu, Hongchun; Etolue, Jini; Yansane, Alfa; Obadan-Udoh, Enihomo; Easterday, Casey; Enstad, Chris; Kane, Sheryl; Rush, William; Kalenderian, Elsbeth

    2017-12-05

    Preventable medical errors in hospital settings are the third leading cause of death in the United States. However, less is known about harm that occurs in patients in outpatient settings, where the majority of care is delivered. We do not know the likelihood that a patient sitting in a dentist's chair will experience harm. Additionally, we do not know if patients of certain race, age, sex, or socioeconomic status disproportionately experience iatrogenic harm. We initiated the Dental Practice Study (DPS) with the aim of determining the frequency and types of adverse events (AEs) that occur in dentistry on the basis of retrospective chart audit. This article discusses the 6-month pilot phase of the DPS during which we explored the feasibility and efficiency of our multistaged review process to detect AEs. At sites 1, 2, and 3, two reviewers abstracted 21, 11, and 23 probable AEs, respectively, from the 100 patient charts audited per site. At site 2, a third reviewer audited the same 100 charts and found only 1 additional probable AE. Of the total 56 probable AEs (from 300 charts), the expert panel confirmed 9 AE cases. This equals 3 AEs per 100 patients per year. Patients who experienced an AE tended to be male and older and to have undergone more procedures within the study year. This article presents an overview of the DPS. It describes the methods used and summarizes the results of its pilot phase. To minimize threats to dental patient safety, a starting point is to understand the basic epidemiology of AEs, both in terms of their frequency and the extent to which they affect different populations.

  12. The development and validation of the AMPREDICT model for predicting mobility outcome after dysvascular lower extremity amputation.

    PubMed

    Czerniecki, Joseph M; Turner, Aaron P; Williams, Rhonda M; Thompson, Mary Lou; Landry, Greg; Hakimi, Kevin; Speckman, Rebecca; Norvell, Daniel C

    2017-01-01

    The objective of this study was the development of AMPREDICT-Mobility, a tool to predict the probability of independence in either basic or advanced (iBASIC or iADVANCED) mobility 1 year after dysvascular major lower extremity amputation. Two prospective cohort studies during consecutive 4-year periods (2005-2009 and 2010-2014) were conducted at seven medical centers. Multiple demographic and biopsychosocial predictors were collected in the periamputation period among individuals undergoing their first major amputation because of complications of peripheral arterial disease or diabetes. The primary outcomes were iBASIC and iADVANCED mobility, as measured by the Locomotor Capabilities Index. Combined data from both studies were used for model development and internal validation. Backwards stepwise logistic regression was used to develop the final prediction models. The discrimination and calibration of each model were assessed. Internal validity of each model was assessed with bootstrap sampling. Twelve-month follow-up was reached by 157 of 200 (79%) participants. Among these, 54 (34%) did not achieve iBASIC mobility, 103 (66%) achieved at least iBASIC mobility, and 51 (32%) also achieved iADVANCED mobility. Predictive factors associated with reduced odds of achieving iBASIC mobility were increasing age, chronic obstructive pulmonary disease, dialysis, diabetes, prior history of treatment for depression or anxiety, and very poor to fair self-rated health. Those who were white, were married, and had at least a high-school degree had a higher probability of achieving iBASIC mobility. The odds of achieving iBASIC mobility increased with increasing body mass index up to 30 kg/m² and decreased with increasing body mass index thereafter. The prediction model of iADVANCED mobility included the same predictors with the exception of diabetes, chronic obstructive pulmonary disease, and education level. Both models showed strong discrimination with C statistics of 0.85 and 0.82, respectively. The mean difference in predicted probabilities for those who did and did not achieve iBASIC and iADVANCED mobility was 33% and 29%, respectively. Tests for calibration and observed vs predicted plots suggested good fit for both models; however, the precision of the estimates of the predicted probabilities was modest. Internal validation through bootstrapping demonstrated some overoptimism of the original model development, with the optimism-adjusted C statistic for iBASIC and iADVANCED mobility being 0.74 and 0.71, respectively, and the discrimination slope 19% and 16%, respectively. AMPREDICT-Mobility is a user-friendly prediction tool that can inform the patient undergoing a dysvascular amputation and the patient's provider about the probability of independence in either basic or advanced mobility at each major lower extremity amputation level. Copyright © 2016 Society for Vascular Surgery. All rights reserved.
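
    A sketch of how a fitted logistic prediction model of this kind turns patient covariates into a predicted probability; the intercept and coefficients below are hypothetical placeholders, not the published AMPREDICT-Mobility estimates.

      import math

      def predicted_probability(intercept, coefs, x):
          linear = intercept + sum(coefs[k] * x[k] for k in coefs)
          return 1.0 / (1.0 + math.exp(-linear))

      coefs = {"age_per_10yr": -0.4, "dialysis": -1.2, "copd": -0.8, "married": 0.6}   # hypothetical
      patient = {"age_per_10yr": 6.8, "dialysis": 0, "copd": 1, "married": 1}
      print(f"P(iBASIC mobility at 1 year) ≈ {predicted_probability(2.5, coefs, patient):.2f}")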

  13. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    NASA Astrophysics Data System (ADS)

    Frigm, R.; Johnson, L.

    The Probability of Collision (Pc) has become a universal metric and statement of on-orbit collision risk. Although several flavors of the computation exist and are well-documented in the literature, the basic calculation requires the same input: estimates for the position, position uncertainty, and sizes of the two objects involved. The Pc is used operationally to make decisions on whether a given conjunction poses significant collision risk to the primary object (or space asset of concern). It is also used to determine necessity and degree of mitigative action (typically in the form of an orbital maneuver) to be performed. The predicted post-maneuver Pc also informs the maneuver planning process regarding the timing, direction, and magnitude of the maneuver needed to mitigate the collision risk. Although the data sources, techniques, decision calculus, and workflows vary for different agencies and organizations, they all have a common thread. The standard conjunction assessment and collision risk concept of operations (CONOPS) predicts conjunctions, assesses the collision risk (typically, via the Pc), and plans and executes avoidance activities for conjunctions as discrete events. As the space debris environment continues to grow and improvements are made to remote sensing capabilities and sensitivities to detect, track, and predict smaller debris objects, the number of conjunctions will in turn continue to increase. The expected order-of-magnitude increase in the number of predicted conjunctions will challenge the paradigm of treating each conjunction as a discrete event. The challenge will not be limited to workload issues, such as manpower and computing performance, but also to the ability of satellite owner/operators to successfully execute their mission while also managing on-orbit collision risk. Executing a propulsive maneuver occasionally can easily be absorbed into the mission planning and operations tempo, whereas continuously planning evasive maneuvers for multiple conjunction events is time-consuming and would disrupt mission and science operations beyond what is tolerable. At the point when the number of conjunctions is so large that it is no longer possible to consider each individually, some sort of an amalgamation of events and risk must be considered. This shift is to one where each conjunction cannot be treated individually and the effects of all conjunctions within a given period of time must be considered together. This new paradigm is called finite Conjunction Assessment (CA) risk management. This paper considers the use of the Total Probability of Collision (TPc) as an analogous collision risk metric in the finite CA paradigm. TPc is expressed by the equation below and provides an aggregate probability of colliding with any one of the predicted conjunctions under consideration: TPc = 1 - ∏(1 - Pc,i). While the TPc computation is straightforward and its physical meaning is understandable, the implications of its operational usage require a change in mindset and approach to collision risk management. This paper explores the necessary changes to evolve the basic CA and collision risk management CONOPS from discrete to finite CA, including aspects of collision risk assessment and collision risk mitigation. It proposes numerical and graphical decision aids to understand both the “risk outlook” for a given primary and mitigation options for the total collision risk. Both concepts make use of the TPc as a metric for finite collision risk management.
Several operational scenarios are used to demonstrate the proposed concepts in practice.
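
    A direct implementation of the aggregate metric defined above, assuming the individual conjunction events are independent; the individual Pc values are hypothetical.

      from math import prod

      def total_pc(pcs):
          """TPc = 1 - prod_i(1 - Pc_i)."""
          return 1.0 - prod(1.0 - p for p in pcs)

      pcs = [1e-4, 3e-5, 7e-4, 2e-4]   # hypothetical Pc values within the assessment window
      print(f"TPc = {total_pc(pcs):.2e}")   # close to, but slightly below, the simple sum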

  14. A comparison of cost effectiveness using data from randomized trials or actual clinical practice: selective cox-2 inhibitors as an example.

    PubMed

    van Staa, Tjeerd-Pieter; Leufkens, Hubert G; Zhang, Bill; Smeeth, Liam

    2009-12-01

    Data on absolute risks of outcomes and patterns of drug use in cost-effectiveness analyses are often based on randomised clinical trials (RCTs). The objective of this study was to evaluate the external validity of published cost-effectiveness studies by comparing the data used in these studies (typically based on RCTs) to observational data from actual clinical practice. Selective Cox-2 inhibitors (coxibs) were used as an example. The UK General Practice Research Database (GPRD) was used to estimate the exposure characteristics and individual probabilities of upper gastrointestinal (GI) events during current exposure to nonsteroidal anti-inflammatory drugs (NSAIDs) or coxibs. A basic cost-effectiveness model was developed evaluating two alternative strategies: prescription of a conventional NSAID or coxib. Outcomes included upper GI events as recorded in GPRD and hospitalisation for upper GI events recorded in the national registry of hospitalisations (Hospital Episode Statistics) linked to GPRD. Prescription costs were based on the prescribed number of tablets as recorded in GPRD and the 2006 cost data from the British National Formulary. The study population included over 1 million patients prescribed conventional NSAIDs or coxibs. Only a minority of patients used the drugs long-term and daily (34.5% of conventional NSAIDs and 44.2% of coxibs), whereas coxib RCTs required daily use for at least 6-9 months. The mean cost of preventing one upper GI event as recorded in GPRD was US$104k (ranging from US$64k with long-term daily use to US$182k with intermittent use) and US$298k for hospitalizations. The mean costs (for GPRD events) over calendar time were US$58k during 1990-1993 and US$174k during 2002-2005. Using RCT data rather than GPRD data for event probabilities, the mean cost was US$16k with the VIGOR RCT and US$20k with the CLASS RCT. The published cost-effectiveness analyses of coxibs lacked external validity, did not represent patients in actual clinical practice, and should not have been used to inform prescribing policies. External validity should be an explicit requirement for cost-effectiveness analyses.
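
    A back-of-envelope sketch of the headline metric, the mean cost of preventing one upper GI event, computed as the incremental drug cost divided by the number of events averted; all numbers are hypothetical placeholders rather than GPRD-derived values.

      def cost_per_event_prevented(n_patients, extra_cost_per_patient,
                                   event_rate_nsaid, event_rate_coxib):
          incremental_cost = n_patients * extra_cost_per_patient
          events_averted = n_patients * (event_rate_nsaid - event_rate_coxib)
          return incremental_cost / events_averted

      # Hypothetical inputs: 10,000 patients, US$150 extra per patient, event rates 2.0% vs 1.2%.
      print(f"US${cost_per_event_prevented(10_000, 150.0, 0.020, 0.012) / 1000:.0f}k per event prevented")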

  15. Stochastic summation of empirical Green's functions

    USGS Publications Warehouse

    Wennerberg, Leif

    1990-01-01

    Two simple strategies are presented that use random delay times for repeatedly summing the record of a relatively small earthquake to simulate the effects of a larger earthquake. The simulations do not assume any fault plane geometry or rupture dynamics, but rely only on the ω−2 spectral model of an earthquake source and elementary notions of source complexity. The strategies simulate ground motions for all frequencies within the bandwidth of the record of the event used as a summand. The first strategy, which introduces the basic ideas, is a single-stage procedure that consists of simply adding many small events with random time delays. The probability distribution for delays has the property that its amplitude spectrum is determined by the ratio of ω−2 spectra, and its phase spectrum is identically zero. A simple expression is given for the computation of this zero-phase scaling distribution. The moment rate function resulting from the single-stage simulation is quite simple and hence is probably not realistic for high-frequency (>1 Hz) ground motion of events larger than ML∼ 4.5 to 5. The second strategy is a two-stage summation that simulates source complexity with a few random subevent delays determined using the zero-phase scaling distribution, and then clusters energy around these delays to get an ω−2 spectrum for the sum. Thus, the two-stage strategy allows simulations of complex events of any size for which the ω−2 spectral model applies. Interestingly, a single-stage simulation with too few ω−2 records to get a good fit to an ω−2 large-event target spectrum yields a record whose spectral asymptotes are consistent with the ω−2 model, but that includes a region in its spectrum between the corner frequencies of the larger and smaller events reasonably approximated by a power law trend. This spectral feature has also been discussed as reflecting the process of partial stress release (Brune, 1970), an asperity failure (Boatwright, 1984), or the breakdown of ω−2 scaling due to rupture significantly longer than the width of the seismogenic zone (Joyner, 1984).

  16. Engineering risk assessment for emergency disposal projects of sudden water pollution incidents.

    PubMed

    Shi, Bin; Jiang, Jiping; Liu, Rentao; Khan, Afed Ullah; Wang, Peng

    2017-06-01

    Without an engineering risk assessment for emergency disposal in response to sudden water pollution incidents, responders are prone to be challenged during emergency decision making. To address this gap, the concept and framework of emergency disposal engineering risks are reported in this paper. The proposed risk index system covers three stages consistent with the progress of an emergency disposal project. Fuzzy fault tree analysis (FFTA), a logical and diagrammatic method, was developed to evaluate the potential failure during the process of emergency disposal. The probabilities of the basic events, and of the combinations of events that could cause the failure of an emergency disposal project, were calculated based on the case of an emergency disposal project of an aniline pollution incident in the Zhuozhang River, Changzhi, China, in 2014. The critical events that can cause the occurrence of a top event (TE) were identified according to their contribution. Finally, advice on how to take measures using limited resources to prevent the failure of a TE is given according to the quantified results of risk magnitude. The proposed approach could be a potentially useful safeguard for the implementation of an emergency disposal project during the process of emergency response.
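
    A toy sketch of the fuzzy fault tree idea: basic event likelihoods expressed as triangular fuzzy numbers (low, mode, high) are propagated through AND and OR gates component-wise, a standard approximation that works because both gate functions are monotone in each input. The events and values are hypothetical, not those of the aniline incident case.

      from math import prod

      def fuzzy_and(*events):   # all sub-events must occur
          return tuple(prod(e[i] for e in events) for i in range(3))

      def fuzzy_or(*events):    # at least one sub-event occurs
          return tuple(1.0 - prod(1.0 - e[i] for e in events) for i in range(3))

      dosing_error = (0.05, 0.10, 0.20)        # hypothetical expert judgements
      equipment_failure = (0.02, 0.05, 0.10)
      late_deployment = (0.10, 0.20, 0.30)

      top_event = fuzzy_or(fuzzy_and(dosing_error, late_deployment), equipment_failure)
      print("fuzzy P(disposal project failure) ≈ (%.3f, %.3f, %.3f)" % top_event)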

  17. Operational foreshock forecasting: Fifteen years after

    NASA Astrophysics Data System (ADS)

    Ogata, Y.

    2010-12-01

    We are concerned with operational forecasting of the probability that events are foreshocks of a forthcoming earthquake that is significantly larger (mainshock). Specifically, we define foreshocks as the preshocks substantially smaller than the mainshock by a magnitude gap of 0.5 or larger. The probability gain of a foreshock forecast is extremely high compared to long-term forecasts by renewal processes or various alarm-based intermediate-term forecasts, because of a large event’s low occurrence rate in a short period and a narrow target region. Thus, it is desired to establish operational foreshock probability forecasting as seismologists have done for aftershocks. When a series of earthquakes occurs in a region, we attempt to discriminate foreshocks from a swarm or mainshock-aftershock sequence. Namely, after real-time identification of an earthquake cluster using methods such as the single-link algorithm, the probability is calculated by applying statistical features that discriminate foreshocks from other types of clusters, by considering the events' stronger proximity in time and space and tendency towards chronologically increasing magnitudes. These features were modeled for probability forecasting and the coefficients of the model were estimated in Ogata et al. (1996) for the JMA hypocenter data (M≧4, 1926-1993). Fifteen years have passed since the publication of the above work, so we are able to present the performance and validation of the forecasts (1994-2009) using the same model. Taking isolated events into consideration, the probabilities of the first events in a potential cluster being foreshocks vary in a range between 0+% and 10+% depending on their locations. This conditional forecasting performs significantly better than the unconditional (average) foreshock probability of 3.7% throughout the Japan region. Furthermore, when we have additional events in a cluster, the forecast probabilities range more widely from nearly 0% to about 40% depending on the discrimination features among the events in the cluster. This conditional forecasting further performs significantly better than the unconditional foreshock probability of 7.3%, which is the average probability of the plural events in the earthquake clusters. Indeed, the frequency ratios of the actual foreshocks are consistent with the forecasted probabilities. Reference: Ogata, Y., Utsu, T. and Katsura, K. (1996). Statistical discrimination of foreshocks from other earthquake clusters, Geophys. J. Int. 127, 17-30.

  18. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    ERIC Educational Resources Information Center

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  19. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  20. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    ERIC Educational Resources Information Center

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  1. Normal myocardial perfusion scan portends a benign prognosis independent from the pretest probability of coronary artery disease. Sub-analysis of the J-ACCESS study.

    PubMed

    Imamura, Yosihiro; Fukuyama, Takaya; Nishimura, Sigeyuki; Nishimura, Tsunehiko

    2009-08-01

    We assessed the usefulness of gated stress/rest 99mTc-tetrofosmin myocardial perfusion single photon emission computed tomography (SPECT) to predict ischemic cardiac events in Japanese patients with various estimated pretest probabilities of coronary artery disease (CAD). Of the 4031 consecutively registered patients for a J-ACCESS (Japanese Assessment of Cardiac Events and Survival Study by Quantitative Gated SPECT) study, 1904 patients without prior cardiac events were selected. Gated stress/rest myocardial perfusion SPECT was performed and segmental perfusion scores and quantitative gated SPECT results were derived. The pretest probability for having CAD was estimated using the American College of Cardiology/American Heart Association/American College of Physicians-American Society of Internal Medicine guideline data for the management of patients with chronic stable angina, which includes age, gender, and type of chest discomfort. The patients were followed up for three years. During the three-year follow-up period, 96 developed ischemic cardiac events: 17 cardiac deaths, 8 nonfatal myocardial infarctions, and 71 clinically driven revascularizations. The summed stress score (SSS) was the most powerful independent predictor of all ischemic cardiac events (hazard ratio 1.077, CI 1.045-1.110). Abnormal SSS (> 3) was associated with a significantly higher cardiac event rate in patients with an intermediate to high pretest probability of CAD. Normal SSS (< or = 3) was associated with a low event rate in patients with any pretest probability of CAD. Myocardial perfusion SPECT is useful for further risk-stratification of patients with suspected CAD. The abnormal scan result (SSS > 3) is discriminative for subsequent cardiac events only in the groups with an intermediate to high pretest probability of CAD. The salient result is that normal scan results portend a benign prognosis independent from the pretest probability of CAD.

  2. A rational decision rule with extreme events.

    PubMed

    Basili, Marcello

    2006-12-01

    Risks induced by extreme events are characterized by small or ambiguous probabilities, catastrophic losses, or windfall gains. Through a new functional that mimics the restricted Bayes-Hurwicz criterion within the Choquet expected utility approach, it is possible to represent the decision maker's behavior when facing both risky (large and reliable probability) and extreme (small or ambiguous probability) events. A new formalization of the precautionary principle (PP) is presented, along with a new functional that encompasses both extreme outcomes and the expectation of all possible results for every act.

  3. Analysis of Emergency Diesel Generators Failure Incidents in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Hunt, Ronderio LaDavis

    In early years of operation, emergency diesel generators had a minimal rate of demand failures. Emergency diesel generators (EDGs) are designed to operate as a backup when the main source of electricity has been disrupted. More recently, EDGs have been failing at nuclear power plants (NPPs) around the United States, causing either station blackouts or loss of onsite and offsite power. These failures were of a specific type called demand failures. This thesis evaluated a problem of concern to the nuclear industry: an average of 1 EDG demand failure per year in 1997 rose to an excessive event of 4 EDG demand failures in a single year in 2011. To determine when such an extreme event might next occur and what might cause it, two analyses were conducted: a statistical analysis and a root cause analysis. The statistical analysis applied an extreme event probability approach to determine the next occurrence year of an excessive event as well as the probability of that excessive event occurring. The root cause analysis examined the potential causes of the excessive event by evaluating the EDG manufacturers, aging, policy changes/maintenance practices, and failure components, and investigated the correlation between demand failure data and historical data. Final results from the statistical analysis showed expectations of an excessive event occurring in a fixed range of probability, and a wider range of probability from the extreme event probability approach. The root cause analysis of the demand failure data followed historical statistics for the EDG manufacturers, aging, and policy changes/maintenance practices, but indicated a possible cause of the excessive event in the failure components. Conclusions showed that predicting, with an acceptable confidence level, the probability and the year of the next excessive demand failure was difficult, but that this type of failure will likely not be a 100-year event. Notably, the majority of the EDG demand failures as of 2005 occurred within the main components. Based on the observed percentages, the overall analysis indicated that the excessive event was caused by the overall age (wear and tear) of the emergency diesel generators in nuclear power plants. Future work will better determine the return period of the excessive event, once it has occurred a second time, by implementing the extreme event probability approach.
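
    A simple complementary sketch of how unusual the excessive event is under a plain Poisson model (not the extreme event probability approach used in the thesis): with an average of about one EDG demand failure per year, the chance of a year with four or more failures, and the corresponding return period, follow directly.

      from math import exp, factorial

      def poisson_tail(k, lam):
          """P(N >= k) for N ~ Poisson(lam)."""
          return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

      p = poisson_tail(4, 1.0)
      print(f"P(>= 4 demand failures in a year) = {p:.4f}, return period ≈ {1 / p:.0f} years")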

  4. Inconvenient Truth or Convenient Fiction? Probable Maximum Precipitation and Nonstationarity

    NASA Astrophysics Data System (ADS)

    Nielsen-Gammon, J. W.

    2017-12-01

    According to the inconvenient truth that Probable Maximum Precipitation (PMP) represents a non-deterministic, statistically very rare event, future changes in PMP involve a complex interplay between future frequencies of storm type, storm morphology, and environmental characteristics, many of which are poorly constrained by global climate models. On the other hand, according to the convenient fiction that PMP represents an estimate of the maximum possible precipitation that can occur at a given location, as determined by storm maximization and transposition, the primary climatic driver of PMP change is simply a change in maximum moisture availability. Increases in boundary-layer and total-column moisture have been observed globally, are anticipated from basic physical principles, and are robustly projected to continue by global climate models. Thus, using the same techniques that are used within the PMP storm maximization process itself, future PMP values may be projected. The resulting PMP trend projections are qualitatively consistent with observed trends of extreme rainfall within Texas, suggesting that in this part of the world the inconvenient truth is congruent with the convenient fiction.
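
    A sketch of the storm maximization step referred to above: an observed storm depth is scaled by the ratio of maximum to observed precipitable water, so a projected increase in maximum moisture availability translates directly into a higher PMP estimate. The depths, precipitable water values, and assumed Clausius-Clapeyron scaling are illustrative only.

      def maximized_depth(observed_depth_mm, pw_storm_mm, pw_max_mm):
          # in-place moisture maximization: scale by the precipitable-water ratio
          return observed_depth_mm * (pw_max_mm / pw_storm_mm)

      current = maximized_depth(500.0, pw_storm_mm=60.0, pw_max_mm=75.0)
      # Assume roughly 7% more maximum precipitable water per degree of warming and 2 degrees:
      future = maximized_depth(500.0, pw_storm_mm=60.0, pw_max_mm=75.0 * 1.07**2)
      print(f"maximized depth now: {current:.0f} mm, projected: {future:.0f} mm")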

  5. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  6. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE PAGES

    Butler, Troy; Wildey, Timothy

    2018-01-01

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.
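
    A generic sketch of the reliability idea on a toy problem: a surrogate sample is treated as reliable when the surrogate value plus or minus its error estimate cannot cross the event threshold, and only the unreliable samples are re-evaluated with the high-fidelity model. The toy models and pointwise error bound are assumptions; the paper obtains its error estimates from adjoint computations.

      import numpy as np

      rng = np.random.default_rng(0)
      threshold = 0.8

      def high_fidelity(x):   # expensive model (toy stand-in)
          return np.sin(3 * x) * x

      def surrogate(x):       # cheap approximation with a known small error
          return np.sin(3 * x) * x + 0.05 * np.cos(20 * x)

      error_bound = 0.05      # assumed pointwise bound on |high_fidelity - surrogate|

      x = rng.uniform(0.0, 2.0, 50_000)
      s = surrogate(x)
      unreliable = np.abs(s - threshold) <= error_bound         # surrogate could be on the wrong side
      indicator = s > threshold
      indicator[unreliable] = high_fidelity(x[unreliable]) > threshold   # re-evaluate only these

      print(f"P(event) = {indicator.mean():.4f}, "
            f"high-fidelity evaluations: {unreliable.sum()} of {x.size}")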

  7. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, Troy; Wildey, Timothy

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.

  8. Impact of gender, co-morbidity and social factors on labour market affiliation after first admission for acute coronary syndrome. A cohort study of Danish patients 2001-2009.

    PubMed

    Osler, Merete; Mårtensson, Solvej; Prescott, Eva; Carlsen, Kathrine

    2014-01-01

    Over the last decades survival after acute coronary syndrome (ACS) has improved, leading to an increasing number of patients returning to work, but little is known about factors that may influence their labour market affiliation. This study examines the impact of gender, co-morbidity and socio-economic position on subsequent labour market affiliation and transition between various social services in patients admitted for the first time with ACS. From 2001 to 2009 all first-time hospitalisations for ACS were identified in the Danish National Patient Registry (n = 79,714). For this population, data on sick leave, unemployment and retirement were obtained from an administrative register covering all citizens. The 21,926 patients, aged 18-63 years, who had survived 30 days and were part of the workforce at the time of diagnosis were included in the analyses where subsequent transition between the above labour market states was examined using Kaplan-Meier estimates and Cox proportional hazards models. A total of 37% of patients were in work 30 days after first ACS diagnosis, while 55% were on sick leave and 8% were unemployed. Seventy-nine per cent returned to work once during follow-up. This probability was highest among males, those below 50 years, living with a partner, the highest educated, with higher occupations, having specific events (NSTEMI, and percutaneous coronary intervention) and with no co-morbidity. During five years follow-up, 43% retired due to disability or voluntary early pension. Female gender, low education, basic occupation, co-morbidity and having a severer event (invasive procedures) and receiving sickness benefits or being unemployed 30 days after admission were associated with increased probability of early retirement. About half of patients with first-time ACS stay in or return to work shortly after the event. Women, the socially disadvantaged, those with presumed severer events and co-morbidity have lower rates of return.

  9. 23 CFR 1340.3 - Basic design requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.3 Basic design requirements. Surveys conducted in... requirement. The sample identified for the survey shall have a probability-based design such that estimates... 23 Highways 1 2010-04-01 2010-04-01 false Basic design requirements. 1340.3 Section 1340.3...

  10. 23 CFR 1340.3 - Basic design requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.3 Basic design requirements. Surveys conducted in... requirement. The sample identified for the survey shall have a probability-based design such that estimates... 23 Highways 1 2011-04-01 2011-04-01 false Basic design requirements. 1340.3 Section 1340.3...

  11. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions makes it possible to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by choosing the propositions, choosing the constraints, and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
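
    A small sketch of a maximum entropy assignment of the kind described: given a finite set of outcomes with "energies" and a mean-energy constraint, maximizing entropy yields probabilities proportional to exp(-lambda * E_i), with lambda solved numerically from the constraint. The energies and target mean are illustrative.

      import numpy as np
      from scipy.optimize import brentq

      E = np.array([0.0, 1.0, 2.0, 3.0])   # outcome "energies" (arbitrary units)
      E_mean = 1.2                         # constraint: expected energy

      def mean_energy(lam):
          w = np.exp(-lam * E)
          p = w / w.sum()
          return p @ E

      lam = brentq(lambda l: mean_energy(l) - E_mean, -50.0, 50.0)
      p = np.exp(-lam * E)
      p /= p.sum()
      print("lambda =", round(lam, 4), "probabilities =", np.round(p, 4))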

  12. Event Discrimination Using Seismoacoustic Catalog Probabilities

    NASA Astrophysics Data System (ADS)

    Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.

    2017-12-01

    Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data is compared against ground truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.

  13. Comparing the Probability Related Misconceptions of Pupils at Different Education Levels=Usporedba pogrješnih predodžaba o konceptima vjerojatnosti kod ucenika razlicitog stupnja obrazovanja

    ERIC Educational Resources Information Center

    Gürbüz, Ramazan; Birgin, Osman; Çatlioglu, Hakan

    2012-01-01

    The aim of the paper is to compare and evaluate the probability-related misconceptions of pupils at different education levels. A cross-sectional/age study was thus conducted with 540 pupils in 5th-8th grades. An instrument, comprising six questions on the concepts of compound events, probability of an event and probability comparisons, was used.…

  14. Summary of U.S. Geological Survey reports documenting flood profiles of streams in Iowa, 1963-2012

    USGS Publications Warehouse

    Eash, David A.

    2014-01-01

    This report is part of an ongoing program that is publishing flood profiles of streams in Iowa. The program is managed by the U.S. Geological Survey in cooperation with the Iowa Department of Transportation and the Iowa Highway Research Board (Project HR-140). Information from flood profiles is used by engineers to analyze and design bridges, culverts, and roadways. This report summarizes 47 U.S. Geological Survey flood-profile reports that were published for streams in Iowa during a 50-year period from 1963 to 2012. Flood events profiled in the reports range from 1903 to 2010. Streams in Iowa that have been selected for the preparation of flood-profile reports typically have drainage areas of 100 square miles or greater, and the documented flood events have annual exceedance probabilities of less than 2 to 4 percent. This report summarizes flood-profile measurements, changes in flood-profile report content throughout the years, streams that were profiled in the reports, the occurrence of flood events profiled, and annual exceedance-probability estimates of observed flood events. To develop flood profiles for selected flood events for selected stream reaches, the U.S. Geological Survey measured high-water marks and river miles at selected locations. A total of 94 stream reaches have been profiled in U.S. Geological Survey flood-profile reports. Three rivers in Iowa have been profiled along the same stream reach for five different flood events and six rivers in Iowa have been profiled along the same stream reach for four different flood events. Floods were profiled for June flood events for 18 different years, followed by July flood events for 13 years, May flood events for 11 years, and April flood events for 9 years. Most of the flood-profile reports include estimates of annual exceedance probabilities of observed flood events at streamgages located along profiled stream reaches. Comparisons of 179 historic and updated annual exceedance-probability estimates indicate few differences that are considered substantial between the historic and updated estimates for the observed flood events. Overall, precise comparisons for 114 observed flood events indicate that updated annual exceedance probabilities have increased for most of the observed flood events compared to the historic annual exceedance probabilities. Multiple large flood events exceeding the 2-percent annual exceedance-probability discharge estimate occurred at 37 of 98 selected streamgages during 1960–2012. Five large flood events were recorded at two streamgages in Ames during 1990–2010 and four large flood events were recorded at four other streamgages during 1973–2010. Results of Kendall’s tau trend-analysis tests for 35 of 37 selected streamgages indicate that a statistically significant trend is not evident for the 1963–2012 period of record; nor is an overall clear positive or negative trend evident for the 37 streamgages.
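
    A quick illustration of the annual exceedance probability (AEP) wording used above: the chance of observing at least one flood at or above a given AEP level during a multi-year window, assuming independent years; the 30-year window is an arbitrary example.

      def prob_at_least_one(aep, years):
          return 1.0 - (1.0 - aep) ** years

      # A 2-percent AEP flood (the "50-year flood") over a 30-year design life:
      print(f"{prob_at_least_one(0.02, 30):.2f}")   # about 0.45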

  15. Active Learning? Not with My Syllabus!

    ERIC Educational Resources Information Center

    Ernst, Michael D.

    2012-01-01

    We describe an approach to teaching probability that minimizes the amount of class time spent on the topic while also providing a meaningful (dice-rolling) activity to get students engaged. The activity, which has a surprising outcome, illustrates the basic ideas of informal probability and how probability is used in statistical inference.…

  16. The World According to de Finetti: On de Finetti's Theory of Probability and Its Application to Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Berkovitz, Joseph

    Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the corner stones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified; and the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.

  17. Technology-Enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that…

  18. Attribution of extreme weather and climate-related events.

    PubMed

    Stott, Peter A; Christidis, Nikolaos; Otto, Friederike E L; Sun, Ying; Vanderlinden, Jean-Paul; van Oldenborgh, Geert Jan; Vautard, Robert; von Storch, Hans; Walton, Peter; Yiou, Pascal; Zwiers, Francis W

    2016-01-01

    Extreme weather and climate-related events occur in a particular place, by definition, infrequently. It is therefore challenging to detect systematic changes in their occurrence given the relative shortness of observational records. However, there is a clear interest from outside the climate science community in the extent to which recent damaging extreme events can be linked to human-induced climate change or natural climate variability. Event attribution studies seek to determine to what extent anthropogenic climate change has altered the probability or magnitude of particular events. They have shown clear evidence for human influence having increased the probability of many extremely warm seasonal temperatures and reduced the probability of extremely cold seasonal temperatures in many parts of the world. The evidence for human influence on the probability of extreme precipitation events, droughts, and storms is more mixed. Although the science of event attribution has developed rapidly in recent years, geographical coverage of events remains patchy and based on the interests and capabilities of individual research groups. The development of operational event attribution would allow a more timely and methodical production of attribution assessments than currently obtained on an ad hoc basis. For event attribution assessments to be most useful, remaining scientific uncertainties need to be robustly assessed and the results clearly communicated. This requires the continuing development of methodologies to assess the reliability of event attribution results and further work to understand the potential utility of event attribution for stakeholder groups and decision makers. WIREs Clim Change 2016, 7:23-41. doi: 10.1002/wcc.380 For further resources related to this article, please visit the WIREs website.

  19. Estimating the Probability of Rare Events Occurring Using a Local Model Averaging.

    PubMed

    Chen, Jin-Hua; Chen, Chun-Shu; Huang, Meng-Fan; Lin, Hung-Chih

    2016-10-01

    In statistical applications, logistic regression is a popular method for analyzing binary data accompanied by explanatory variables. But when one of the two outcomes is rare, the estimation of model parameters has been shown to be severely biased and hence estimating the probability of rare events occurring based on a logistic regression model would be inaccurate. In this article, we focus on estimating the probability of rare events occurring based on logistic regression models. Instead of selecting a best model, we propose a local model averaging procedure based on a data perturbation technique applied to different information criteria to obtain different probability estimates of rare events occurring. Then an approximately unbiased estimator of Kullback-Leibler loss is used to choose the best one among them. We design complete simulations to show the effectiveness of our approach. For illustration, a necrotizing enterocolitis (NEC) data set is analyzed. © 2016 Society for Risk Analysis.

  20. Constructing event trees for volcanic crises

    USGS Publications Warehouse

    Newhall, C.; Hoblitt, R.

    2002-01-01

    Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical - utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in the estimation of how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists to critically review their analysis of hazard, and help officials and individuals to compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
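
    To make the Bayes' theorem step concrete, here is a minimal sketch; the prior and likelihoods are hypothetical placeholders, not values from the paper.

    ```python
    # A minimal sketch of updating an eruption probability on observed unrest.
    # All numbers are hypothetical illustrations, not estimates from the paper.
    def posterior_eruption(prior, p_unrest_given_eruption, p_unrest_given_no_eruption):
        """Bayes' theorem: P(eruption | unrest)."""
        numerator = p_unrest_given_eruption * prior
        evidence = numerator + p_unrest_given_no_eruption * (1.0 - prior)
        return numerator / evidence

    # e.g. a 5% background probability, with this style of unrest assumed to be
    # eight times more likely before eruptions than otherwise
    print(posterior_eruption(0.05, 0.8, 0.1))   # roughly 0.30
    ```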

  1. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
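
    The Gutenberg-Richter scaling at the heart of this comparison can be illustrated with one line of arithmetic; the b-value of 1 below is a generic textbook choice, not a parameter taken from the paper.

    ```python
    # Generic Gutenberg-Richter illustration: with b = 1, the rate of events at or
    # above magnitude M drops tenfold per unit magnitude, so the rate of M >= 7
    # events relative to M >= 4.8 events is 10**(-b * (7 - 4.8)).
    b = 1.0
    relative_rate = 10 ** (-b * (7.0 - 4.8))
    print(relative_rate)   # about 0.0063
    ```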

  2. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  3. Changes in the probability of co-occurring extreme climate events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2017-12-01

    Extreme climate events such as floods, droughts, heatwaves, and severe storms exert acute stresses on natural and human systems. When multiple extreme events co-occur, either in space or time, the impacts can be substantially compounded. A diverse set of human interests - including supply chains, agricultural commodities markets, reinsurance, and deployment of humanitarian aid - have historically relied on the rarity of extreme events to provide a geographic hedge against the compounded impacts of co-occurring extremes. However, changes in the frequency of extreme events in recent decades imply that the probability of co-occurring extremes is also changing, and is likely to continue to change in the future in response to additional global warming. This presentation will review the evidence for historical changes in extreme climate events and the response of extreme events to continued global warming, and will provide some perspective on methods for quantifying changes in the probability of co-occurring extremes in the past and future.
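
    A toy calculation, not drawn from the presentation, shows why co-occurrence probabilities respond so sharply to changes in the marginal probabilities.

    ```python
    # If two regions each see a damaging extreme with probability p in a given year
    # and the events are independent, the co-occurrence probability is p*p, so
    # doubling p quadruples the joint risk. Values are purely illustrative.
    for p in (0.02, 0.04, 0.08):
        print(f"p = {p:.2f}  ->  P(both regions hit) = {p * p:.4f}")
    ```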

  4. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
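
    The article's three problems are not reproduced here, but one classic example of the phenomenon is the derangement problem, sketched below as a quick simulation.

    ```python
    # Probability that a random shuffle leaves no item in its original position
    # (a derangement); it tends to 1/e as n grows.
    import math
    import random

    def no_fixed_point(n):
        perm = list(range(n))
        random.shuffle(perm)
        return all(perm[i] != i for i in range(n))

    trials = 100_000
    estimate = sum(no_fixed_point(10) for _ in range(trials)) / trials
    print(estimate, 1 / math.e)   # both close to 0.3679
    ```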

  5. An application of probability to combinatorics: a proof of Vandermonde identity

    NASA Astrophysics Data System (ADS)

    Paolillo, Bonaventura; Rizzo, Piermichele; Vincenzi, Giovanni

    2017-08-01

    In this paper, we give possible suggestions for a classroom lesson about an application of probability using basic mathematical notions. We approach some combinatorial results without using induction, polynomial identities, or generating functions, and give a proof of the Vandermonde identity using elementary notions of probability.
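
    For reference, the identity in question and its standard probabilistic reading (the hypergeometric probabilities summing to one) can be written as follows; this is the textbook form, not necessarily the exact notation of the paper.

    ```latex
    % Vandermonde identity and its urn interpretation: drawing r balls from an urn
    % with m red and n blue balls, and conditioning on the number k of red balls drawn.
    \sum_{k=0}^{r} \binom{m}{k}\binom{n}{r-k} = \binom{m+n}{r},
    \qquad
    \sum_{k=0}^{r} \frac{\binom{m}{k}\binom{n}{r-k}}{\binom{m+n}{r}} = 1 .
    ```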

  6. Unsupervised Spatio-Temporal Data Mining Framework for Burned Area Mapping

    NASA Technical Reports Server (NTRS)

    Kumar, Vipin (Inventor); Boriah, Shyam (Inventor); Mithal, Varun (Inventor); Khandelwal, Ankush (Inventor)

    2016-01-01

    A method reduces processing time required to identify locations burned by fire by receiving a feature value for each pixel in an image, each pixel representing a sub-area of a location. Pixels are then grouped based on similarities of the feature values to form candidate burn events. For each candidate burn event, a probability that the candidate burn event is a true burn event is determined based on at least one further feature value for each pixel in the candidate burn event. Candidate burn events that have a probability below a threshold are removed from further consideration as burn events to produce a set of remaining candidate burn events.
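
    A minimal sketch of the thresholding step described in the abstract; the event identifiers, probabilities, and threshold below are invented for illustration.

    ```python
    # Drop candidate burn events whose estimated probability of being a true burn
    # falls below a threshold; what remains is the set of retained candidates.
    def filter_candidates(candidates, threshold=0.5):
        """candidates: list of (event_id, probability) pairs."""
        return [(event_id, p) for event_id, p in candidates if p >= threshold]

    candidates = [("event_01", 0.91), ("event_02", 0.12), ("event_03", 0.66)]
    print(filter_candidates(candidates))   # keeps event_01 and event_03
    ```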

  7. Mapping Vulnerability to Disasters in Latin America and the Caribbean, 1900-2007

    USGS Publications Warehouse

    Maynard-Ford, Miriam C.; Phillips, Emily C.; Chirico, Peter G.

    2008-01-01

    The vulnerability of a population and its infrastructure to disastrous events is a factor of both the probability of a hazardous event occurring and the community's ability to cope with the resulting impacts. Therefore, the ability to accurately identify vulnerable populations and places in order to prepare for future hazards is of critical importance for disaster mitigation programs. This project created maps of higher spatial resolution of vulnerability to disaster in Latin America and the Caribbean from 1900 to 2007 by mapping disaster data by first-level administrative boundaries with the objective of identifying geographic trends in regional occurrences of disasters and vulnerable populations. The method of mapping by administrative level is an improvement on displaying and analyzing disasters at the country level and shows the relative intensity of vulnerability within and between countries in the region. Disaster mapping at the country level produces only a basic view of which countries experience various types of natural disasters. Through disaggregation, the data show which geographic areas of these countries, including populated areas, are historically most susceptible to different hazard types.

  8. The role of the density gradient on intermittent cross-field transport events in a simple magnetized toroidal plasma

    NASA Astrophysics Data System (ADS)

    Theiler, C.; Diallo, A.; Fasoli, A.; Furno, I.; Labit, B.; Podestà, M.; Poli, F. M.; Ricci, P.

    2008-04-01

    Intermittent cross-field particle transport events (ITEs) are studied in the basic toroidal device TORPEX [TORoidal Plasma EXperiment, A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], with focus on the role of the density gradient. ITEs are due to the intermittent radial elongation of an interchange mode. The elongating positive wave crests can break apart and form blobs. This is not necessary, however, for plasma particles to be convected a considerable distance across the magnetic field lines. Conditionally sampled data reveal two different scenarios leading to ITEs. In the first case, the interchange mode grows radially from a slab-like density profile and leads to the ITE. A novel analysis technique reveals a monotonic dependence between the vertically averaged inverse radial density scale length and the probability for a subsequent ITE. In the second case, the mode is already observed before the start of the ITE. It does not elongate radially in a first stage, but at a later time. It is shown that this elongation is preceded by a steepening of the density profile as well.

  9. Hunting for Stellar Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Korhonen, Heidi; Vida, Krisztián; Leitzinger, Martin; Odert, Petra; Kovács, Orsolya Eszter

    2017-10-01

    Coronal mass ejections (CMEs) are explosive events that occur basically daily on the Sun. It is thought that these events play a crucial role in the angular momentum and mass loss of late-type stars, and also shape the environment in which planets form and live. Stellar CMEs can be detected in optical spectra in the Balmer lines, especially in Hα, as blue-shifted extra emission/absorption. To increase the detection probability, one can monitor young open clusters, in which the stars, owing to their youth, are still rapid rotators and thus magnetically active and likely to exhibit a large number of CMEs. Using ESO facilities and the Nordic Optical Telescope, we have obtained time series of multi-object spectroscopic observations of late-type stars in six open clusters with ages ranging from 15 Myr to 300 Myr. Additionally, we have studied archival data of numerous active stars. These observations will allow us to obtain information on the occurrence rate of CMEs in late-type stars with different ages and spectral types. Here we report on the preliminary outcome of our studies.

  10. Current Events in Basic Business Education

    ERIC Educational Resources Information Center

    Van Hook, Barry L.

    1974-01-01

    The author suggests the use of current events to stimulate student interest in basic business courses. Suggested topics described are monetary devaluation, interest rate adjustments, Illinois no-fault automobile insurance, labor-management disputes, Dow-Jones average, Picasso's death, energy crisis, sale of surplus wheat, local consumer assistance…

  11. Boolean logic tree of graphene-based chemical system for molecular computation and intelligent molecular search query.

    PubMed

    Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing

    2014-05-06

    The most serious, and yet unsolved, problem of constructing molecular computing devices consists in connecting all of these molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing a chemical event network based on graphene, an organic dye, a thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can then be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, these basic chemical events can be considered as programmable "words" and chemical interactions as "syntax" logic rules to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced logic programs based on molecules for applications in biosensing, nanotechnology, and drug delivery.

  12. Bayesics

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2005-11-01

    This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. 1. Logic and probability 2. Probability and inference 3. Probability and model selection 4. Prior probabilities 5. Probability and frequency 6. Probability and quantum mechanics 7. Probability and fundamentalism 8. Probability and deception 9. Prediction and truth

  13. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  14. Paleoseismic event dating and the conditional probability of large earthquakes on the southern San Andreas fault, California

    USGS Publications Warehouse

    Biasi, G.P.; Weldon, R.J.; Fumal, T.E.; Seitz, G.G.

    2002-01-01

    We introduce a quantitative approach to paleoearthquake dating and apply it to paleoseismic data from the Wrightwood and Pallett Creek sites on the southern San Andreas fault. We illustrate how stratigraphic ordering, sedimentological, and historical data can be used quantitatively in the process of estimating earthquake ages. Calibrated radiocarbon age distributions are used directly from layer dating through recurrence intervals and recurrence probability estimation. The method does not eliminate subjective judgements in event dating, but it does provide a means of systematically and objectively approaching the dating process. Date distributions for the most recent 14 events at Wrightwood are based on sample and contextual evidence in Fumal et al. (2002) and site context and slip history in Weldon et al. (2002). Pallett Creek event and dating descriptions are from published sources. For the five most recent events at Wrightwood, our results are consistent with previously published estimates, with generally comparable or narrower uncertainties. For Pallett Creek, our earthquake date estimates generally overlap with previous results but typically have broader uncertainties. Some event date estimates are very sensitive to details of data interpretation. The historical earthquake in 1857 ruptured the ground at both sites but is not constrained by radiocarbon data. Radiocarbon ages, peat accumulation rates, and historical constraints at Pallett Creek for event X yield a date estimate in the earliest 1800s and preclude a date in the late 1600s. This event is almost certainly the historical 1812 earthquake, as previously concluded by Sieh et al. (1989). This earthquake also produced ground deformation at Wrightwood. All events at Pallett Creek, except for event T, about A.D. 1360, and possibly event I, about A.D. 960, have corresponding events at Wrightwood with some overlap in age ranges. Event T falls during a period of low sedimentation at Wrightwood when conditions were not favorable for recording earthquake evidence. Previously proposed correlations of Pallett Creek X with Wrightwood W3 in the 1690s and Pallett Creek event V with W5 around 1480 (Fumal et al., 1993) appear unlikely after our dating reevaluation. Apparent internal inconsistencies among event, layer, and dating relationships around events R and V identify them as candidates for further investigation at the site. Conditional probabilities of earthquake recurrence were estimated using Poisson, lognormal, and empirical models. The presence of 12 or 13 events at Wrightwood during the same interval that 10 events are reported at Pallett Creek is reflected in mean recurrence intervals of 105 and 135 years, respectively. Average Poisson model 30-year conditional probabilities are about 20% at Pallett Creek and 25% at Wrightwood. The lognormal model conditional probabilities are somewhat higher, about 25% for Pallett Creek and 34% for Wrightwood. Lognormal variance σ_ln estimates of 0.76 and 0.70, respectively, imply only weak time predictability. Conditional probabilities of 29% and 46%, respectively, were estimated for an empirical distribution derived from the data alone. Conditional probability uncertainties are dominated by the brevity of the event series; dating uncertainty contributes only secondarily. Wrightwood and Pallett Creek event chronologies both suggest variations in recurrence interval with time, hinting that some form of recurrence rate modulation may be at work, but formal testing shows that neither series is more ordered than might be produced by a Poisson process.
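
    The Poisson figures quoted above follow from a one-line calculation; the sketch below reproduces that arithmetic using the mean recurrence intervals reported in the abstract.

    ```python
    # Probability of at least one event in the next 30 years under a Poisson model
    # with mean recurrence interval mu: 1 - exp(-30 / mu).
    import math

    def poisson_conditional(mean_recurrence_yr, window_yr=30):
        return 1.0 - math.exp(-window_yr / mean_recurrence_yr)

    print(f"Pallett Creek (mu = 135 yr): {poisson_conditional(135):.0%}")   # about 20%
    print(f"Wrightwood   (mu = 105 yr): {poisson_conditional(105):.0%}")    # about 25%
    ```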

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudsen, J.K.; Smith, C.L.

    The steps involved in incorporating parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are comprised of more than one component, which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
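
    As a hedged sketch of what a lognormal uncertainty parameter can look like in practice, the snippet below samples a basic-event failure probability from a lognormal distribution defined by a median and an error factor; the numerical values are hypothetical, not taken from the ASP models.

    ```python
    # Sample a lognormally distributed basic-event probability.
    # Error factor EF is taken here as the ratio of the 95th percentile to the
    # median, so sigma = ln(EF) / 1.645 for a lognormal distribution.
    import numpy as np

    def sample_lognormal(median, error_factor, n=10_000, seed=0):
        rng = np.random.default_rng(seed)
        sigma = np.log(error_factor) / 1.645
        return rng.lognormal(mean=np.log(median), sigma=sigma, size=n)

    samples = sample_lognormal(median=1e-3, error_factor=5)   # hypothetical values
    print(samples.mean(), np.percentile(samples, 95))
    ```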

  16. Suggestions for Teaching Mathematics Using Laboratory Approaches. 6. Probability. Experimental Edition.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Elementary Curriculum Development.

    This guide is the sixth in a series of publications to assist teachers in using a laboratory approach to mathematics. Twenty activities on probability and statistics for the elementary grades are described in terms of purpose, materials needed, and procedures to be used. Objectives of these activities include basic probability concepts; gathering,…

  17. Building a Database for a Quantitative Model

    NASA Technical Reports Server (NTRS)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not aid in linking the Basic Events to the data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian Updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility for the entire model often rests on the credibility and traceability of the data.
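
    A minimal sketch of the linkage idea: a unique key ties each Basic Event to its data source and to the manipulations applied to the raw rate. The field names and values below are invented for illustration, not the NASA database schema.

    ```python
    # Toy two-row "database" keyed by a unique metadata field, so any Basic Event
    # can be traced back to its data source and adjustment factors.
    import csv
    import io

    source_db = """key,basic_event,data_source,base_rate,stress_factor
    BE-0001,Valve fails to open,Vendor handbook,2.0e-4,1.5
    BE-0002,Pump fails to start,Test program data,5.0e-5,1.0
    """

    records = {row["key"].strip(): row for row in csv.DictReader(io.StringIO(source_db))}
    be = records["BE-0001"]
    adjusted_rate = float(be["base_rate"]) * float(be["stress_factor"])
    print(be["data_source"], adjusted_rate)   # traceable through the key
    ```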

  18. Seismic Characterization of the Newberry and Cooper Basin EGS Sites

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.

    2015-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation is real, or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
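
    As a hedged illustration of what a 95% probability ellipsoid is numerically, the sketch below converts a 3-D location covariance into ellipsoid semi-axis lengths; the covariance matrix is invented, not a MicroBayesLoc output.

    ```python
    # Semi-axes of a 95% probability ellipsoid from a location covariance matrix.
    import numpy as np
    from scipy.stats import chi2

    cov = np.diag([0.04, 0.09, 0.25])      # km^2; hypothetical E, N, depth variances
    scale = chi2.ppf(0.95, df=3)           # 95% quantile of chi-square with 3 dof
    eigenvalues, _ = np.linalg.eigh(cov)
    semi_axes = np.sqrt(eigenvalues * scale)
    print(semi_axes)                       # km; axis lengths of the 95% ellipsoid
    ```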

  19. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact that we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggests this is not the case. Even with very good models (good in a Root-Mean-Square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").
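
    The odds bookkeeping used in the abstract ("odds-for", with the stake returned) can be sketched in a few lines; the quoted odds below are invented to show how implied probabilities can sum to more than one.

    ```python
    # Probability p maps to odds-for of 1/p; a set of quoted odds maps back to
    # "implied probabilities" that need not form a probability distribution.
    def odds_for(p):
        return 1.0 / p

    def implied_probability(odds):
        return 1.0 / odds

    print(odds_for(0.5))                        # odds-for of two for one
    quoted_odds = [2.0, 2.0, 5.0]               # three mutually exclusive outcomes
    implied = [implied_probability(o) for o in quoted_odds]
    print(sum(implied))                         # 1.2 > 1: not a probability distribution
    ```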

  20. Risk of cardiovascular events in mothers of women with polycystic ovary syndrome.

    PubMed

    Cheang, Kai I; Nestler, John E; Futterweit, Walter

    2008-12-01

    To assess the prevalence of cardiovascular events in an older population of women with polycystic ovary syndrome (PCOS). We took advantage of the high heritability of PCOS and determined the probable PCOS status of mothers of women with PCOS. The prevalence of cardiovascular events was then determined in these mothers with and without PCOS. In a single endocrine clinic, 308 women with PCOS were interviewed about their mothers' medical history, and the mothers themselves were interviewed if available. The interview addressed menstrual history, fertility, clinical signs of hyperandrogenism, age at incident cardiovascular event, and age at death as reported by daughters. Presence of PCOS in the mothers was defined as a history of infertility, irregular menses, or clinical signs of hyperandrogenism. A cardiovascular event was defined as fatal or nonfatal myocardial infarction, any coronary intervention, angina necessitating emergency department visits, or a cerebrovascular event. The mothers were predominantly post-menopausal. Among 182 interviewed (n = 157) or deceased (n = 25) mothers, 59 had probable PCOS. Cardiovascular events were more common (P = .011) among mothers with PCOS (11 of 59 or 18.6%) than among non-PCOS mothers (5 of 123 or 4.1%). After adjustments were made for age and race, probable PCOS was an independent predictor of cardiovascular events (odds ratio, 5.41; 95% confidence interval, 1.78 to 16.40). Cardiovascular events occurred at an early age in mothers of women with PCOS, particularly mothers with probable PCOS themselves. PCOS-affected mothers of women with PCOS have a higher risk for cardiovascular events in comparison with non-PCOS mothers, and cardiovascular events appear to occur at an earlier than expected age in mothers with PCOS.

  1. Quantifying the probability of record-setting heat events in the historical record and at different levels of climate forcing

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2017-12-01

    Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as periods with lower and higher greenhouse gas concentrations than the present.
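
    One simple benchmark for this kind of analysis (an illustration added here, not a method attributed to the presentation) is the record probability under stationarity: in an exchangeable n-year series, the next year sets a new record with probability 1/(n+1).

    ```python
    # Under a stationary (exchangeable) climate, the chance that the next year
    # exceeds every value in an n-year record is 1/(n+1); record rates well above
    # this benchmark point toward a shifting distribution.
    def stationary_record_probability(n_years):
        return 1.0 / (n_years + 1)

    print(stationary_record_probability(100))   # about 0.0099
    ```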

  2. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public. VDAP trees evaluate probabilities of: magmatic intrusion, likelihood of eruption, magnitude of eruption, and types of associated hazardous events and their extents. In a few cases, trees have been extended to also assess and communicate vulnerability and relative risk.
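
    A minimal sketch of how probabilities combine along one branch of such an event tree; the node names and conditional probabilities below are hypothetical, not VDAP estimates.

    ```python
    # Multiply conditional probabilities down one branch to get the probability of
    # that branch's end state.
    branch = {
        "magmatic intrusion": 0.6,
        "eruption given intrusion": 0.4,
        "explosive eruption given eruption": 0.3,
        "pyroclastic flow reaches populated area given explosive eruption": 0.2,
    }

    p_end_state = 1.0
    for node, conditional_probability in branch.items():
        p_end_state *= conditional_probability
    print(f"P(branch end state) = {p_end_state:.4f}")   # 0.0144
    ```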

  3. Assessing flight safety differences between the United States regional and major airlines

    NASA Astrophysics Data System (ADS)

    Sharp, Broderick H.

    During 2008, U.S. domestic airline departures exceeded 28,000 flights per day. Thirty-nine, or less than 0.2 of 1%, of these flights resulted in operational incidents or accidents. However, even a low percentage of airline accidents and incidents continues to cause human suffering and property loss. The charge of this study was the comparison of U.S. major and regional airline safety histories. The study spans safety events from January 1982 through December 2008. In this quantitative analysis, domestic major and regional airlines were statistically tested for their flight safety differences. Four major airlines and thirty-seven regional airlines qualified for the safety study, which compared the airline groups' fatal accidents, incidents, non-fatal accidents, pilot errors, and the remaining six safety event probable cause types. The six other probable cause types are mechanical failure, weather, air traffic control, maintenance, other, and unknown causes. The National Transportation Safety Board investigated each airline safety event and assigned a probable cause to each event. A sample of 500 events was randomly selected from the population of 1,391 airline accidents and incidents. The airline groups' safety event probabilities were estimated using least squares linear regression. A significance level of 5% was chosen for deciding each research question hypothesis. The significance levels for the airline groups' fatal accidents and incidents were 1.2% and 0.05%, respectively. These two research questions did not reach the 5% significance threshold; therefore, the airline groups' fatal accident and non-destructive incident probabilities favored the hypothesis that the airline groups' safety differs. The linear regression estimates for the remaining three research questions were significance levels of 71.5% for non-fatal accidents, 21.8% for pilot errors, and 7.4% for the six probable causes. Because these values are greater than the 5% level, these three research questions favored the hypothesis that the airline groups' safety is similar. The study indicates the U.S. domestic major airlines were safer than the regional airlines. Ideas for potential airline safety progress include examining pilot fatigue, the airline groups' hiring policies, the government's airline oversight personnel, and comparisons of individual airlines' operational policies.

  4. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  5. Using transportation accident databases to investigate ignition and explosion probabilities of flammable spills.

    PubMed

    Ronza, A; Vílchez, J A; Casal, J

    2007-07-19

    Risk assessment of hazardous material spill scenarios, and quantitative risk assessment in particular, make use of event trees to account for the possible outcomes of hazardous releases. Using event trees entails the definition of probabilities of occurrence for events such as spill ignition and blast formation. This study comprises an extensive analysis of ignition and explosion probability data proposed in previous work. Subsequently, the results of the survey of two vast US federal spill databases (HMIRS, by the Department of Transportation, and MINMOD, by the US Coast Guard) are reported and commented on. Some tens of thousands of records of hydrocarbon spills were analysed. The general pattern of statistical ignition and explosion probabilities as a function of the amount and the substance spilled is discussed. Equations are proposed based on statistical data that predict the ignition probability of hydrocarbon spills as a function of the amount and the substance spilled. Explosion probabilities are put forth as well. Two sets of probability data are proposed: it is suggested that figures deduced from HMIRS be used in land transportation risk assessment, and MINMOD results with maritime scenarios assessment. Results are discussed and compared with previous technical literature.

  6. Measurement of the top-quark mass with dilepton events selected using neuroevolution at CDF.

    PubMed

    Aaltonen, T; Adelman, J; Akimoto, T; Albrow, M G; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Bednar, P; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Copic, K; Cordelli, M; Cortiana, G; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; Garfinkel, A F; Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Handler, R; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hauser, J; Hays, C; Heck, M; Heijboer, A; Heinemann, B; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Koay, S A; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kusakabe, Y; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, 
G; Lazzizzera, I; Lecompte, T; Lee, E; Lee, S W; Leone, S; Lewis, J D; Lin, C S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lu, R-S; Lucchesi, D; Lueck, J; Luci, C; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Lytken, E; Mack, P; Macqueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlok, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Neu, C; Neubauer, M S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Osterberg, K; Pagan Griso, S; Pagliarone, C; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Reisert, B; Rekovic, V; Renton, P; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Saarikko, H; Safonov, A; Sakumoto, W K; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Scheidle, T; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scott, A L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sfyrla, A; Shalhout, S Z; Shears, T; Shekhar, R; Shepard, P F; Sherman, D; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spreitzer, T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Stuart, D; Suh, J S; Sukhanov, A; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Tiwari, V; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Veszpremi, V; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Würthwein, F; Wagner, P; Wagner, R G; Wagner, R L; Wagner-Kuhr, J; Wagner, W; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Wester, W C; Whitehouse, B; Whiteson, D; Whiteson, S; Wicklund, A B; Wicklund, E; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Wynne, S M; Xie, S; Yagil, A; Yamamoto, K; 
Yamaoka, J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zaw, I; Zhang, X; Zheng, Y; Zucchelli, S

    2009-04-17

    We report a measurement of the top-quark mass $M_t$ in the dilepton decay channel $t\bar{t} \to b\ell'^{+}\nu_{\ell'}\bar{b}\ell^{-}\bar{\nu}_{\ell}$. Events are selected with a neural network which has been directly optimized for statistical precision in top-quark mass using neuroevolution, a technique modeled on biological evolution. The top-quark mass is extracted from per-event probability densities that are formed by the convolution of leading order matrix elements and detector resolution functions. The joint probability is the product of the probability densities from 344 candidate events in 2.0 fb$^{-1}$ of $p\bar{p}$ collisions collected with the CDF II detector, yielding a measurement of $M_t = 171.2 \pm 2.7(\mathrm{stat}) \pm 2.9(\mathrm{syst})$ GeV/$c^2$.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herberger, Sarah M.; Boring, Ronald L.

    Objectives: This paper discusses the differences between classical human reliability analysis (HRA) dependence and the full spectrum of probabilistic dependence. Positive influence suggests an error increases the likelihood of subsequent errors or success increases the likelihood of subsequent success. Currently the typical method for dependence in HRA implements the Technique for Human Error Rate Prediction (THERP) positive dependence equations. This assumes that the dependence between two human failure events varies at discrete levels between zero and complete dependence (as defined by THERP). Dependence in THERP does not consistently span dependence values between 0 and 1. In contrast, probabilistic dependence employs Bayes' law and addresses a continuous range of dependence. Methods: Using the laws of probability, complete dependence and maximum positive dependence do not always agree. Maximum dependence is when two events overlap to their fullest amount. Maximum negative dependence is the smallest amount that two events can overlap. When the minimum probability of two events overlapping is less than independence, negative dependence occurs. For example, negative dependence is when an operator fails to actuate Pump A, thereby increasing his or her chance of actuating Pump B. The initial error actually increases the chance of subsequent success. Results: Comparing THERP and probability theory yields different results in certain scenarios, with the latter addressing negative dependence. Given that most human failure events are rare, the minimum overlap is typically 0. When the probability of the second event is smaller than that of the first, the maximum dependence is less than 1, as defined by Bayes' law. As such, alternative dependence equations are provided along with a look-up table defining the maximum and maximum negative dependence given the probabilities of two events. Conclusions: THERP dependence has been used ubiquitously for decades, and has provided approximations of the dependencies between two events. Since its inception, computational abilities have increased exponentially, and alternative approaches that follow the laws of probability dependence need to be implemented. These new approaches need to consider negative dependence and identify when THERP output is not appropriate.
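
    The contrast described above can be sketched numerically. The THERP conditional-probability formulas below are the standard zero/low/moderate/high/complete dependence equations, and the bounds follow from Bayes' law and the Fréchet inequalities; the example probabilities are hypothetical.

    ```python
    # THERP dependence levels versus the bounds probability theory places on P(B|A).
    def therp_conditional(p_b, level):
        formulas = {
            "ZD": p_b,                    # zero dependence
            "LD": (1 + 19 * p_b) / 20,    # low dependence
            "MD": (1 + 6 * p_b) / 7,      # moderate dependence
            "HD": (1 + p_b) / 2,          # high dependence
            "CD": 1.0,                    # complete dependence
        }
        return formulas[level]

    def probabilistic_bounds(p_a, p_b):
        """Minimum and maximum P(B|A) allowed by the laws of probability."""
        upper = min(1.0, p_b / p_a)                    # maximum (positive) dependence
        lower = max(0.0, (p_a + p_b - 1.0) / p_a)      # maximum negative dependence
        return lower, upper

    p_a, p_b = 0.01, 0.001
    print(therp_conditional(p_b, "HD"))    # ~0.50, independent of P(A)
    print(probabilistic_bounds(p_a, p_b))  # (0.0, 0.1): here complete dependence is impossible
    ```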

  8. Simple Physical Model for the Probability of a Subduction- Zone Earthquake Following Slow Slip Events and Earthquakes: Application to the Hikurangi Megathrust, New Zealand

    NASA Astrophysics Data System (ADS)

    Kaneko, Yoshihiro; Wallace, Laura M.; Hamling, Ian J.; Gerstenberger, Matthew C.

    2018-05-01

    Slow slip events (SSEs) have been documented in subduction zones worldwide, yet their implications for future earthquake occurrence are not well understood. Here we develop a relatively simple, simulation-based method for estimating the probability of megathrust earthquakes following tectonic events that induce any transient stress perturbations. This method has been applied to the locked Hikurangi megathrust (New Zealand) surrounded on all sides by the 2016 Kaikoura earthquake and SSEs. Our models indicate the annual probability of a M≥7.8 earthquake over 1 year after the Kaikoura earthquake increases by 1.3-18 times relative to the pre-Kaikoura probability, and the absolute probability is in the range of 0.6-7%. We find that probabilities of a large earthquake are mainly controlled by the ratio of the total stressing rate induced by all nearby tectonic sources to the mean stress drop of earthquakes. Our method can be applied to evaluate the potential for triggering a megathrust earthquake following SSEs in other subduction zones.

  9. Monte Carlo Simulation of Markov, Semi-Markov, and Generalized Semi- Markov Processes in Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    English, Thomas

    2005-01-01

    A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with the potential of a fault tree existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. Firstly, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating time to failures. Using a generalized semi-Markov model, we studied a four-element reliability model and were able to demonstrate such sojourn time dependencies. Secondly, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
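
    A hedged sketch of the second idea, attaching random sojourn times to event-tree transitions so that end states carry a time rather than a static probability; the two-branch model and its parameters are invented for illustration.

    ```python
    # Monte Carlo over a tiny event tree whose transitions have random sojourn times.
    import random

    def simulate_path():
        rng = random.Random()
        t = rng.expovariate(1 / 100.0)            # sojourn in nominal operation (mean 100 h)
        if rng.random() < 0.3:                    # branch to a degraded state
            t += rng.weibullvariate(50.0, 1.5)    # sojourn in the degraded state before failure
            return t, "failure"
        return t, "repaired"

    runs = [simulate_path() for _ in range(10_000)]
    failure_times = [t for t, end_state in runs if end_state == "failure"]
    print(len(failure_times) / len(runs))              # about 0.3 of paths end in failure
    print(sum(failure_times) / len(failure_times))     # mean time to reach that end state
    ```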

  10. Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD

    PubMed Central

    Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne

    2014-01-01

    We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 control participants, ranging in age from 18–53 yrs) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in TD participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156

  11. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  12. Estimating the empirical probability of submarine landslide occurrence

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
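
    A sketch of the Poisson-Gamma machinery: with a Gamma(alpha, beta) prior on the rate λ (beta as a rate parameter) and N dated events in an observation window of length T, the posterior is Gamma(alpha + N, beta + T). The prior and data below are hypothetical, not values from the study.

    ```python
    # Posterior distribution of the landslide rate lambda under a Gamma prior.
    from scipy.stats import gamma

    alpha_prior, beta_prior = 1.0, 1.0      # weakly informative prior (hypothetical)
    n_events, window_kyr = 4, 10.0          # e.g. 4 landslides imaged over 10 kyr

    alpha_post = alpha_prior + n_events
    beta_post = beta_prior + window_kyr

    posterior = gamma(a=alpha_post, scale=1.0 / beta_post)
    print(posterior.mean())          # posterior mean rate, about 0.45 events per kyr
    print(posterior.interval(0.95))  # uncertainty range on lambda
    ```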

  13. Mechanisms of stochastic focusing and defocusing in biological reaction networks: insight from accurate chemical master equation (ACME) solutions.

    PubMed

    Gursoy, Gamze; Terebus, Anna; Youfang Cao; Jie Liang

    2016-08-01

    Stochasticity plays important roles in the regulation of biochemical reaction networks when the copy numbers of molecular species are small. Studies based on the Stochastic Simulation Algorithm (SSA) have shown that a basic reaction system can display stochastic focusing (SF), an increase in the sensitivity of the network caused by signal noise. Although SSA has been widely used to study stochastic networks, it is ineffective at examining rare events, and this becomes a significant issue when the tails of probability distributions are relevant, as is the case for SF. Here we use the ACME method to obtain the exact solution of the discrete Chemical Master Equation and to study a network where SF was reported. We show that the level of SF depends on the degree of fluctuation of the signal molecule. We also find that, under certain conditions, signaling noise in the same reaction network can lead to a decrease in system sensitivity, so that the network experiences stochastic defocusing. These results highlight the fundamental role of stochasticity in biological reaction networks and the need for exact computation of the probability landscape of the molecules in the system.
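
    Since the abstract contrasts ACME's exact probability landscapes with SSA sampling, a minimal Gillespie (SSA) sketch is included below for a one-species birth-death system. The reaction scheme and rate constants are invented for illustration; they are not the stochastic focusing network analyzed in the paper.

      import random

      def gillespie_birth_death(k_prod=10.0, k_deg=1.0, x0=0, t_end=10.0, seed=1):
          """Exact SSA trajectory for production (0 -> X) and degradation (X -> 0)."""
          random.seed(seed)
          t, x = 0.0, x0
          trajectory = [(t, x)]
          while t < t_end:
              a1 = k_prod           # production propensity
              a2 = k_deg * x        # degradation propensity
              a0 = a1 + a2
              if a0 == 0.0:
                  break
              t += random.expovariate(a0)       # waiting time to the next reaction
              if random.random() < a1 / a0:     # pick which reaction fires
                  x += 1
              else:
                  x -= 1
              trajectory.append((t, x))
          return trajectory

      print("final copy number:", gillespie_birth_death()[-1][1])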

  14. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    PubMed

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). To that end, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods for their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and a discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspot datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Nutrition, fertility and steady-state population dynamics in a pre-industrial community in Penrith, northern England.

    PubMed

    Scott, S; Duncan, C J

    1999-10-01

    The effect of nutrition on fertility and its contribution thereby to population dynamics are assessed in three social groups (elite, tradesmen and subsistence) in a marginal, pre-industrial population in northern England. This community was particularly susceptible to fluctuations in the price of grains, which formed their basic foodstuff. The subsistence class, who formed the largest part of the population, had low levels of fertility and small family sizes, but women from all social groups had a characteristic and marked subfecundity in the early part of their reproductive lives. The health and nutrition of the mother during pregnancy was the most important factor in determining fertility and neonatal mortality. Inadequate nutrition had many subtle effects on reproduction which interacted to produce a complex web of events. A population boom occurred during the second half of the 18th century; fertility did not change but there was a marked improvement in infant mortality and it is suggested that the steadily improving nutritional standards of the population, particularly during crucial periods in pregnancy (i.e. the last trimester), probably made the biggest contribution to the improvement in infant mortality and so was probably the major factor in triggering the boom.

  16. Testing a basic assumption of shrubland fire management: Does the hazard of burning increase with the age of fuels?

    USGS Publications Warehouse

    Moritz, Max A.; Keeley, Jon E.; Johnson, Edward A.; Schaffner, Andrew A.

    2004-01-01

    This year's catastrophic wildfires in southern California highlight the need for effective planning and management for fire-prone landscapes. Fire frequency analysis of several hundred wildfires over a broad expanse of California shrublands reveals that there is generally not, as is commonly assumed, a strong relationship between fuel age and fire probabilities. Instead, the hazard of burning in most locations increases only moderately with time since the last fire, and a marked age effect of fuels is observed only in limited areas. Results indicate a serious need for a re-evaluation of current fire management and policy, which is based largely on eliminating older stands of shrubland vegetation. In many shrubland ecosystems exposed to extreme fire weather, large and intense wildfires may need to be factored in as inevitable events.

  17. Emergence of heat extremes attributable to anthropogenic influences

    NASA Astrophysics Data System (ADS)

    King, Andrew D.; Black, Mitchell T.; Min, Seung-Ki; Fischer, Erich M.; Mitchell, Daniel M.; Harrington, Luke J.; Perkins-Kirkpatrick, Sarah E.

    2016-04-01

    Climate scientists have demonstrated that a substantial fraction of the probability of numerous recent extreme events may be attributed to human-induced climate change. However, it is likely that for temperature extremes occurring over previous decades a fraction of their probability was attributable to anthropogenic influences. We identify the first record-breaking warm summers and years for which a discernible contribution can be attributed to human influence. We find a significant human contribution to the probability of record-breaking global temperature events as early as the 1930s. Since then, all the last 16 record-breaking hot years globally had an anthropogenic contribution to their probability of occurrence. Aerosol-induced cooling delays the timing of a significant human contribution to record-breaking events in some regions. Without human-induced climate change recent hot summers and years would be very unlikely to have occurred.

  18. Multiple-solution problems in a statistics classroom: an example

    NASA Astrophysics Data System (ADS)

    Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing

    2017-11-01

    The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of a multiple-solution problem in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass function for the sum of face values. Four different ways of solving the problem are discussed. The solutions span basic concepts in different mathematical disciplines (the sample space in probability theory, the probability generating function in statistics, integer partitions in basic combinatorics and the individual risk model in actuarial science) and thus promote upper undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented using the R statistical software package.
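
    The "sample space" solution described above is easy to sketch in code. The dice below are the Sicherman dice, a standard example of non-traditional dice whose sum has the same distribution as two ordinary dice; they are used here only for illustration and may not be the dice set considered in the paper (whose solutions are implemented in R rather than Python).

      from collections import Counter
      from fractions import Fraction
      from itertools import product

      # Sicherman dice: an illustrative non-traditional dice set.
      die_a = [1, 2, 2, 3, 3, 4]
      die_b = [1, 3, 4, 5, 6, 8]

      def pmf_of_sum(*dice):
          """Enumerate the sample space and count each possible sum."""
          counts = Counter(sum(faces) for faces in product(*dice))
          total = sum(counts.values())
          return {s: Fraction(c, total) for s, c in sorted(counts.items())}

      for s, p in pmf_of_sum(die_a, die_b).items():
          print(s, p)

    Multiplying the dice's probability generating functions and reading off the coefficients gives the same probabilities, which is essentially the second solution route mentioned in the abstract.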

  19. On the Effect of Geomagnetic Storms on Relativistic Electrons in the Outer Radiation Belt: Van Allen Probes Observations

    NASA Astrophysics Data System (ADS)

    Moya, Pablo S.; Pinto, Víctor A.; Sibeck, David G.; Kanekal, Shrikanth G.; Baker, Daniel N.

    2017-11-01

    Using Van Allen Probes Energetic Particle, Composition, and Thermal Plasma-Relativistic Electron-Proton Telescope (ECT-REPT) observations, we performed a statistical study of the effect of geomagnetic storms on relativistic electron fluxes in the outer radiation belt for 78 storms between September 2012 and June 2016. We found that the probability of enhancement, depletion, and no change in flux values depends strongly on L and energy. Enhancement events are more common for ~2 MeV electrons at L ~ 5, and the number of enhancement events decreases with increasing energy at any given L shell. However, considering the percentage of occurrence of each kind of event, enhancements are more probable at higher energies, and the probability of enhancement tends to increase with increasing L shell. Depletions are more probable for 4-5 MeV electrons at the heart of the outer radiation belt, and no-change events are more frequent at L < 3.5 for E ~ 3 MeV particles. Moreover, for L > 4.5 the probability of an enhancement, depletion, or no-change response presents little variation for all energies. Because these probabilities remain relatively constant as a function of radial distance in the outer radiation belt, measurements obtained at geosynchronous orbit may be used as a proxy to monitor E ≥ 1.8 MeV electrons in the outer belt.

  20. Time‐dependent renewal‐model probabilities when date of last earthquake is unknown

    USGS Publications Warehouse

    Field, Edward H.; Jordan, Thomas H.

    2015-01-01

    We derive time-dependent, renewal-model earthquake probabilities for the case in which the date of the last event is completely unknown, and compare these with the time-independent Poisson probabilities that are customarily used as an approximation in this situation. For typical parameter values, the renewal-model probabilities exceed Poisson results by more than 10% when the forecast duration exceeds ~20% of the mean recurrence interval. We also derive probabilities for the case in which the last event is further constrained to have occurred before historical record keeping began (the historic open interval), which can only serve to increase earthquake probabilities for typically applied renewal models. We conclude that accounting for the historic open interval can improve long-term earthquake rupture forecasts for California and elsewhere.
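
    One standard way to handle a completely unknown last-event date, sketched below, is to assume the elapsed time since the last event follows the equilibrium (stationary) renewal distribution; the probability of an event in the next T years then reduces to E[min(X, T)]/μ, where X is the recurrence interval and μ its mean. The lognormal recurrence model and parameter values below are illustrative assumptions, not the paper's exact calculation.

      import numpy as np
      from scipy import stats

      mu, cov = 200.0, 0.5      # mean recurrence interval (yr) and coefficient of variation
      T = 30.0                  # forecast duration (yr)

      # Lognormal recurrence model with the requested mean and coefficient of variation.
      sigma = np.sqrt(np.log(1.0 + cov**2))
      recurrence = stats.lognorm(s=sigma, scale=mu / np.exp(sigma**2 / 2.0))

      samples = recurrence.rvs(size=200_000, random_state=0)
      p_renewal = np.minimum(samples, T).mean() / mu     # Monte Carlo estimate of E[min(X, T)] / mu
      p_poisson = 1.0 - np.exp(-T / mu)                  # time-independent Poisson comparison

      print(f"renewal model, last event unknown: {p_renewal:.4f}")
      print(f"Poisson approximation:             {p_poisson:.4f}")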

  1. Seniors: Current Events and Consumer Awareness. Final Report.

    ERIC Educational Resources Information Center

    Molek, Carol

    A project was conducted to design and implement a curriculum for older adults in the areas of current events and consumer awareness. The project aimed to improve the seniors' basic skills by providing relevant content in a basic education context. During the project a curriculum was developed and presented through workshops held in 2 senior…

  2. A web-based incident reporting system: a two years' experience in an Italian research and teaching hospital.

    PubMed

    Bodina, A; Demarchi, A; Castaldi, S

    2014-01-01

    A web-based incident reporting system (IRS) is a tool allowing healthcare workers to voluntarily and anonymously report adverse events/near misses. In 2010, this system was introduced in a research and teaching hospital in a metropolitan area in northern Italy, in order to detect errors and to learn from failures in care delivery. The aim of this paper is to assess whether and how the IRS has proved to be a valuable tool to manage clinical risk and improve healthcare quality. Adverse events are reported anonymously by staff members with the use of an online template form available on the hospital intranet. We retrospectively reviewed the recorded data for each incident/near miss reported between January 2011 and December 2012. The number of reported incidents/near misses was 521 in 2011 and 442 in 2012. In the two years there were 36,974 and 36,107 admissions, respectively. We noticed that nursing staff made more use of the IRS and that reported errors were mostly related to the prescription and administration of medications. Much international literature reports that adverse events and near misses occur in about 10% of admissions. Our data are far from that number, which suggests that adverse events are under-reported. This consideration, together with the high number of near misses in comparison with occurred errors, leads us to speculate that adverse events with serious consequences for patients are only marginally reported. The lack of strong leadership treating the IRS as an instrument for improving quality, and operators' reluctance to overcome the culture of blame, probably affect the IRS negatively.

  3. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    ERIC Educational Resources Information Center

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  4. Pigeons, Facebook and the Birthday Problem

    ERIC Educational Resources Information Center

    Russell, Matthew

    2013-01-01

    The unexpectedness of the birthday problem has long been used by teachers of statistics in discussing basic probability calculation. An activity is described that engages students in understanding probability and sampling using the popular Facebook social networking site. (Contains 2 figures and 1 table.)
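
    For reference, the calculation behind the activity is short: the probability that at least two of n people share a birthday is one minus the probability that all n birthdays are distinct (assuming 365 equally likely birthdays and ignoring leap years).

      def shared_birthday_probability(n, days=365):
          """P(at least one shared birthday among n people)."""
          p_all_distinct = 1.0
          for i in range(n):
              p_all_distinct *= (days - i) / days
          return 1.0 - p_all_distinct

      for n in (10, 23, 40, 60):
          print(n, round(shared_birthday_probability(n), 4))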

  5. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.

  6. Effects of shifts in the rate of repetitive stimulation on sustained attention

    NASA Technical Reports Server (NTRS)

    Krulewitz, J. E.; Warm, J. S.; Wohl, T. H.

    1975-01-01

    The effects of shifts in the rate of presentation of repetitive neutral events (background event rate) were studied in a visual vigilance task. Four groups of subjects experienced either a high (21 events/min) or a low (6 events/min) event rate for 20 min and then experienced either the same or the alternate event rate for an additional 40 min. The temporal occurrence of critical target signals was identical for all groups, irrespective of event rate. The density of critical signals was 12 signals/20 min. By the end of the session, shifts in event rate were associated with changes in performance which resembled contrast effects found in other experimental situations in which shift paradigms were used. Relative to constant event rate control conditions, a shift from a low to a high event rate depressed the probability of signal detections, while a shift in the opposite direction enhanced the probability of signal detections.

  7. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
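
    As a tiny, self-contained taste of two topics in the chapter's scope (maximum likelihood and Markov chains), the sketch below estimates a two-state transition matrix from an observed sequence by counting transitions and then computes its stationary distribution. The sequence is made up; it is not an example from the chapter.

      import numpy as np

      sequence = "AABABBBABAABBBBA"            # illustrative observed state sequence
      states = sorted(set(sequence))
      idx = {s: i for i, s in enumerate(states)}

      counts = np.zeros((len(states), len(states)))
      for a, b in zip(sequence, sequence[1:]):
          counts[idx[a], idx[b]] += 1

      P = counts / counts.sum(axis=1, keepdims=True)   # ML estimate of the transition matrix

      # Stationary distribution: left eigenvector of P for eigenvalue 1, normalized to sum to 1.
      eigvals, eigvecs = np.linalg.eig(P.T)
      pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
      pi = pi / pi.sum()

      print("P =\n", P.round(3))
      print("stationary distribution:", dict(zip(states, pi.round(3))))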

  8. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
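
    The rejection-sampling view of Bayesian updating that BUS reinterprets can be sketched in a few lines: draw a parameter from the prior and accept it with probability proportional to its likelihood. The Gaussian model, prior and data below are invented for illustration, and the sketch deliberately omits the reliability methods (FORM, IS, SuS) that make BUS efficient for genuinely rare events.

      import math
      import random

      random.seed(0)

      def likelihood(theta, data, sigma=1.0):
          """Gaussian likelihood of the data for mean theta and known sigma."""
          return math.prod(
              math.exp(-0.5 * ((x - theta) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
              for x in data
          )

      data = [1.2, 0.7, 1.9]
      c = likelihood(sum(data) / len(data), data)   # likelihood is maximized at the sample mean

      posterior_samples = []
      while len(posterior_samples) < 2000:
          theta = random.gauss(0.0, 2.0)            # prior: N(0, 2^2)
          if random.random() < likelihood(theta, data) / c:
              posterior_samples.append(theta)

      print("posterior mean ~", sum(posterior_samples) / len(posterior_samples))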

  9. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.

  10. Work probability distribution and tossing a biased coin

    NASA Astrophysics Data System (ADS)

    Saha, Arnab; Bhattacharjee, Jayanta K.; Chakraborty, Sagar

    2011-01-01

    We show that the rare events present in the dissipated work that enters the Jarzynski equality, when mapped appropriately to the phenomenon of large deviations found in a biased coin toss, are enough to yield a quantitative work probability distribution for the Jarzynski equality. This allows us to propose a recipe for constructing the work probability distribution independent of the details of any relevant system. The underlying framework, developed herein, is expected to be of use in modeling other physical phenomena where rare events play an important role.

  11. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments, including the ability to model the beam line, the shielding of samples and sample holders, and the estimates of basic physical and biological outputs of the designed experiments. We present an overview of the GERM code GUI, as well as providing training applications.
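
    One of the quantities listed above, the Poisson distribution of ion hits over a specified cellular area, follows from a generic fluence calculation and is sketched below. The fluence and area values are made up, and the snippet is not part of the GERM code itself.

      import math

      def hit_probabilities(fluence_per_cm2, area_um2, max_hits=5):
          """Poisson probabilities of k ion traversals of a cell area at a given fluence."""
          area_cm2 = area_um2 * 1e-8              # 1 square micron = 1e-8 cm^2
          mean_hits = fluence_per_cm2 * area_cm2
          probs = [math.exp(-mean_hits) * mean_hits**k / math.factorial(k)
                   for k in range(max_hits + 1)]
          return mean_hits, probs

      mean_hits, probs = hit_probabilities(fluence_per_cm2=1.0e7, area_um2=100.0)
      print("mean hits per cell area:", mean_hits)
      for k, p in enumerate(probs):
          print(f"P({k} hits) = {p:.4f}")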

  12. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments, including the ability to model the beam line, the shielding of samples and sample holders, and the estimates of basic physical and biological outputs of the designed experiments. We present an overview of the GERMcode GUI, as well as providing training applications.

  13. Estimating the Probability of Negative Events

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  14. Estimating Single-Event Logic Cross Sections in Advanced Technologies

    NASA Astrophysics Data System (ADS)

    Harrington, R. C.; Kauppila, J. S.; Warren, K. M.; Chen, Y. P.; Maharrey, J. A.; Haeffner, T. D.; Loveless, T. D.; Bhuva, B. L.; Bounasser, M.; Lilja, K.; Massengill, L. W.

    2017-08-01

    Reliable estimation of logic single-event upset (SEU) cross section is becoming increasingly important for predicting the overall soft error rate. As technology scales and single-event transient (SET) pulse widths shrink to widths on the order of the setup-and-hold time of flip-flops, the probability of latching an SET as an SEU must be reevaluated. In this paper, previous assumptions about the relationship of SET pulsewidth to the probability of latching an SET are reconsidered and a model for transient latching probability has been developed for advanced technologies. A method using the improved transient latching probability and SET data is used to predict logic SEU cross section. The presented model has been used to estimate combinational logic SEU cross sections in 32-nm partially depleted silicon-on-insulator (SOI) technology given experimental heavy-ion SET data. Experimental SEU data show good agreement with the model presented in this paper.

  15. Insomnia medication use and the probability of an accidental event in an older adult population

    PubMed Central

    Avidan, Alon Y; Palmer, Liisa A; Doan, Justin F; Baran, Robert W

    2010-01-01

    Objective: This study examined the risk of accidental events in older adults prescribed a sedating antidepressant (SAD), long-acting benzodiazepine, short-acting benzodiazepine, or nonbenzodiazepine, relative to a reference group (selective melatonin receptor agonist). Methods: This was a retrospective cohort analysis of older adults (≥65 years) with newly initiated pharmacological treatment of insomnia. Data were collected from the Thomson MarketScan® Medicare Supplemental and Coordination of Benefits databases (January 1, 2000, through June 30, 2006). Probit models were used to evaluate the probability of an accidental event. Results: Data were analyzed for 445,329 patients. Patients taking a long-acting benzodiazepine (odds ratio [OR] 1.21), short-acting benzodiazepine (OR 1.16), or nonbenzodiazepine (OR 1.12) had a significantly higher probability of experiencing an accidental event during the first month following treatment initiation compared with patients taking the reference medication (P < 0.05 for all). A significantly higher probability of experiencing an accidental event was also observed during the 3-month period following the initiation of treatment (OR 1.62 long-acting benzodiazepine, 1.60 short-acting benzodiazepine, 1.48 nonbenzodiazepine, and 1.56 sedating antidepressant; P < 0.05). Conclusions: Older adults taking an SAD or any of the benzodiazepine receptor agonists appear to have a greater risk of an accidental event compared with a reference group taking a melatonin receptor agonist. PMID:21701634

  16. Expressed Likelihood as Motivator: Creating Value through Engaging What’s Real

    PubMed Central

    Higgins, E. Tory; Franks, Becca; Pavarini, Dana; Sehnert, Steen; Manley, Katie

    2012-01-01

    Our research tested two predictions regarding how likelihood can have motivational effects as a function of how a probability is expressed. We predicted that describing the probability of a future event that could be either A or B using the language of high likelihood (“80% A”) rather than low likelihood (“20% B”), i.e., high rather than low expressed likelihood, would make a present activity more real and engaging, as long as the future event had properties relevant to the present activity. We also predicted that strengthening engagement from the high (vs. low) expressed likelihood of a future event would intensify the value of present positive and negative objects (in opposite directions). Both predictions were supported. There was also evidence that this intensification effect from expressed likelihood was independent of the actual probability or valence of the future event. What mattered was whether high versus low likelihood language was used to describe the future event. PMID:23940411

  17. Flood Frequency Analyses Using a Modified Stochastic Storm Transposition Method

    NASA Astrophysics Data System (ADS)

    Fang, N. Z.; Kiani, M.

    2015-12-01

    Research shows that areas with similar topography and climatic environment have comparable precipitation occurrences. Reproduction and realization of historical rainfall events provide foundations for frequency analysis and the advancement of meteorological studies. Stochastic Storm Transposition (SST) is a method for such a purpose and enables us to perform hydrologic frequency analyses by transposing observed historical storm events to the sites of interest. However, many previous SST studies reveal drawbacks of simplified Probability Density Functions (PDFs) that do not consider restrictions on transposing rainfall. The goal of this study is to stochastically examine the impacts of extreme events on all locations in a homogeneity zone. Since storms with the same probability of occurrence over homogeneous areas do not have identical hydrologic impacts, the authors utilize detailed precipitation parameters, including the probability of occurrence of a certain depth and the number of occurrences of extreme events, which are both incorporated into a joint probability function. The new approach can reduce the bias from uniformly transposing storms, which erroneously increases the probability of occurrence of storms in areas with higher rainfall depths. This procedure is iterated to simulate storm events for one thousand years as the basis for updating frequency analysis curves such as IDF and FFA. The study area is the Upper Trinity River watershed, including the Dallas-Fort Worth metroplex, with a total area of 6,500 mi². It is the first time that the SST method has been examined at such a wide scale with 20 years of radar rainfall data.

  18. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    DOE PAGES

    Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.

    2016-02-16

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
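
    The basic attribution quantities named above can be written down directly: with exceedance probabilities p1 (all forcings) and p0 (natural forcings only), RR = p1/p0 and FAR = 1 - p0/p1. The sketch below computes both from invented ensemble exceedance counts and attaches a simple bootstrap interval; it does not attempt the quantile-based bias correction that is the paper's contribution.

      import numpy as np

      rng = np.random.default_rng(0)

      n1, k1 = 400, 32      # factual ensemble: simulated years, threshold exceedances (illustrative)
      n0, k0 = 400, 8       # counterfactual (natural forcings only) ensemble

      def risk_ratio(k1, n1, k0, n0):
          p1, p0 = k1 / n1, k0 / n0
          return np.inf if p0 == 0 else p1 / p0

      rr_hat = risk_ratio(k1, n1, k0, n0)
      boot = [risk_ratio(rng.binomial(n1, k1 / n1), n1, rng.binomial(n0, k0 / n0), n0)
              for _ in range(5000)]
      lo, hi = np.percentile(boot, [2.5, 97.5])

      print(f"RR  = {rr_hat:.2f}  (95% bootstrap interval {lo:.2f} to {hi:.2f})")
      print(f"FAR = {1 - 1 / rr_hat:.2f}")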

  19. Performance evaluation of an importance sampling technique in a Jackson network

    NASA Astrophysics Data System (ADS)

    brahim Mahdipour, E.; Masoud Rahmani, Amir; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of customers missing their deadlines for different loads and deadlines. We finally show that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
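
    A generic importance-sampling sketch, unrelated to the specific queueing estimator analyzed in the article, shows the basic mechanics: sample from a proposal shifted toward the rare region and reweight by the likelihood ratio. Here the rare event is the tail of a standard normal, chosen because the true answer is known.

      import numpy as np

      rng = np.random.default_rng(1)
      a, n = 4.0, 100_000                     # rare-event threshold and sample size

      # Proposal: N(a, 1). Weight = density of N(0, 1) divided by density of N(a, 1).
      y = rng.normal(loc=a, scale=1.0, size=n)
      weights = np.exp(-0.5 * y**2 + 0.5 * (y - a) ** 2)
      p_is = np.mean((y > a) * weights)

      p_naive = np.mean(rng.normal(size=n) > a)   # crude Monte Carlo for comparison

      print(f"importance sampling: {p_is:.2e}")
      print(f"crude Monte Carlo:   {p_naive:.2e}   (true value is about 3.17e-5)")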

  20. Simulation studies on the differences between spontaneous and triggered seismicity and on foreshock probabilities

    NASA Astrophysics Data System (ADS)

    Zhuang, J.; Vere-Jones, D.; Ogata, Y.; Christophersen, A.; Savage, M. K.; Jackson, D. D.

    2008-12-01

    In this study we investigate the foreshock probabilities calculated from earthquake catalogs from Japan, Southern California and New Zealand. Unlike conventional studies on foreshocks, we use a probability-based declustering method to separate each catalog into stochastic versions of family trees, such that each event is classified as either having been triggered by a preceding event or being a spontaneous event. The probabilities are determined from parameters that provide the best fit to the real catalog using a space-time epidemic-type aftershock sequence (ETAS) model. The model assumes that background and triggered earthquakes have the same magnitude-dependent triggering capability. A foreshock here is defined as a spontaneous event that has one or more larger descendants, and a triggered foreshock is a triggered event that has one or more larger descendants. The proportion of foreshocks among spontaneous events in each catalog is found to be lower than the proportion of triggered foreshocks among triggered events. One possibility is that this is due to different triggering productivity of spontaneous versus triggered events, i.e., a triggered event triggers more children than a spontaneous event of the same magnitude. To understand what causes these differences between spontaneous and triggered events, we apply the same procedures to several synthetic catalogs simulated using different models. The first simulation uses the ETAS model with parameters and spontaneous rate fitted from the JMA catalog. The second synthetic catalog is simulated using an adjusted ETAS model that takes into account the triggering effect of events below the magnitude threshold: we simulate a catalog with a low magnitude threshold using the original ETAS model and then remove the events smaller than a higher magnitude threshold. The third model for simulation assumes that spontaneous events and triggered events have different triggering behaviors. We repeat the fitting and reconstruction procedures for all of the simulated catalogs. The reconstruction results for the first synthetic catalog do not show the difference between spontaneous and triggered events or the differences in foreshock probabilities. On the other hand, results from the synthetic catalogs simulated with the second and third models clearly reconstruct such differences. In summary, our results imply that one cause of such differences may be neglect of the triggering effect of events smaller than the cut-off magnitude, or magnitude errors. For the objective of forecasting seismicity, we can use a clustering model in which spontaneous events trigger child events in a different way from triggered events, to avoid over-predicting earthquake risk from foreshocks. To understand the physical implications of this study, we need further careful studies comparing real seismicity with the adjusted ETAS model, which takes the triggering effect of events below the cut-off magnitude into account.
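
    For readers unfamiliar with ETAS, the temporal form of its conditional intensity is easy to write down; the sketch below evaluates λ(t) = μ + Σ K·exp(α(m_i - m0))·(t - t_i + c)^(-p) over a toy catalog. The parameter values and events are invented and are not fitted to the JMA, Southern California or New Zealand catalogs used in the study.

      import math

      # Illustrative ETAS parameters: background rate mu and Omori-Utsu triggering terms.
      mu, K, alpha, c, p, m0 = 0.2, 0.02, 1.2, 0.01, 1.1, 3.0

      catalog = [(0.0, 5.1), (1.3, 3.4), (2.0, 4.2), (2.1, 3.1)]   # (time in days, magnitude)

      def etas_intensity(t, catalog):
          """Temporal ETAS conditional intensity at time t given past events."""
          rate = mu
          for t_i, m_i in catalog:
              if t_i < t:
                  rate += K * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
          return rate

      print("lambda at t = 2.5 days:", round(etas_intensity(2.5, catalog), 3))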

  1. Definition of the Neutrosophic Probability

    NASA Astrophysics Data System (ADS)

    Smarandache, Florentin

    2014-03-01

    Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation of an event (different from indeterminacy) to occur, together with an estimation that some indeterminacy may occur, and an estimation that the event does not occur. Classical probability deals with fair dice, coins, roulettes, spinners, decks of cards and random walks, while neutrosophic probability deals with unfair, imperfect such objects and processes. For example, if we toss a regular die on an irregular surface which has cracks, then it is possible to get the die stuck on one of its edges or vertices in a crack (indeterminate outcome). The sample space is in this case {1, 2, 3, 4, 5, 6, indeterminacy}. So the probability of getting, for example, 1 is less than 1/6, since there are seven outcomes. Neutrosophic probability is a generalization of classical probability because, when the chance of determinacy of a stochastic process is zero, the two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indetA), ch(A̲)) = (T, I, F), where T, I, F are subsets of [0,1]; T is the chance that A occurs, denoted ch(A); I is the indeterminate chance related to A, ch(indetA); and F is the chance that A does not occur, ch(A̲). So NP is a generalization of imprecise probability as well. If T, I, and F are crisp numbers then ⁻0 ≤ T + I + F ≤ 3⁺. We use the same notations (T, I, F) as in neutrosophic logic and sets.

  2. On-Orbit Collision Hazard Analysis in Low Earth Orbit Using the Poisson Probability Distribution (Version 1.0)

    DOT National Transportation Integrated Search

    1992-08-26

    This document provides the basic information needed to estimate a general probability of collision in Low Earth Orbit (LEO). Although the method described in this primer is a first order approximation, its results are reasonable. Furthermore, t...
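
    In the same first-order spirit, treating debris encounters as a Poisson process gives P(at least one collision) = 1 - exp(-F·A·T), where F is the debris flux for the chosen object size and altitude, A the spacecraft cross-sectional area and T the mission duration. The sketch below uses a purely illustrative flux value, not a number from any debris environment model or from the primer.

      import math

      def collision_probability(flux_per_m2_yr, area_m2, years):
          """First-order Poisson estimate of the probability of at least one collision."""
          expected_hits = flux_per_m2_yr * area_m2 * years
          return 1.0 - math.exp(-expected_hits)

      # Illustrative inputs: flux of 1e-5 objects per m^2 per year, 20 m^2 cross-section, 7-year mission.
      print(collision_probability(flux_per_m2_yr=1e-5, area_m2=20.0, years=7.0))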

  3. An Application of Probability to Combinatorics: A Proof of Vandermonde Identity

    ERIC Educational Resources Information Center

    Paolillo, Bonaventura; Rizzo, Piermichele; Vincenzi, Giovanni

    2017-01-01

    In this paper, we give possible suggestions for a classroom lesson about an application of probability using basic mathematical notions. We will approach to some combinatoric results without using "induction", "polynomial identities" nor "generating functions", and will give a proof of the "Vandermonde…
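
    One common probabilistic route to the identity, which may or may not be the exact argument of the paper, goes through the hypergeometric distribution: draw p objects without replacement from an urn containing m red and n blue objects, and note that the probabilities of drawing k red objects must sum to one.

      \[
        \sum_{k=0}^{p} \frac{\binom{m}{k}\binom{n}{p-k}}{\binom{m+n}{p}} = 1
        \quad\Longrightarrow\quad
        \sum_{k=0}^{p} \binom{m}{k}\binom{n}{p-k} = \binom{m+n}{p}.
      \]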

  4. The Notions of Chance and Probabilities in Preschoolers

    ERIC Educational Resources Information Center

    Nikiforidou, Zoi; Pange, Jenny

    2010-01-01

    Chance, randomness and probability constitute statistical notions that are interrelated and characterize the logicomathematical thinking of children. Traditional theories support that probabilistic thinking evolves after the age of 7. However, recent research has underlined that children, as young as 4, may possess and develop basic notions,…

  5. Investigation of the probability of concurrent drought events between the water source and destination regions of China's water diversion project

    NASA Astrophysics Data System (ADS)

    Liu, Xiaomang; Luo, Yuzhou; Yang, Tiantian; Liang, Kang; Zhang, Minghua; Liu, Changming

    2015-10-01

    In this study, we investigate the probability of concurrent drought in the water source and destination regions of the central route of China's South to North Water Diversion Project. We find that both regions have been drying from 1960 to 2013. The estimated return period of concurrent drought events in both regions is 11 years. However, since 1997, these regions have experienced 5 years of simultaneous drought. Projections from global climate models show that the probability of concurrent drought events is highly likely to increase from 2020 to 2050. The increasing number of concurrent drought events will challenge the success of the water diversion project, which is a strategic attempt to resolve the water crisis of the North China Plain. The data suggest great urgency in preparing adaptive measures to ensure the long-term sustainable operation of the water diversion project.

  6. On the predictability of outliers in ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Siegert, S.; Bröcker, J.; Kantz, H.

    2012-03-01

    In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier Skill Score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill which is quantitatively similar to the analytical results. Possible causes of these results as well as their consequences for ensemble interpretation are discussed.

  7. Methodology for Collision Risk Assessment of an Airspace Flow Corridor Concept

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    This dissertation presents a methodology to estimate the collision risk associated with a future air-transportation concept called the flow corridor. The flow corridor is a Next Generation Air Transportation System (NextGen) concept to reduce congestion and increase throughput in en-route airspace. The flow corridor has the potential to increase throughput by reducing the controller workload required to manage aircraft outside the corridor and by reducing separation of aircraft within the corridor. The analysis in this dissertation is a starting point for the safety analysis required by the Federal Aviation Administration (FAA) to eventually approve and implement the corridor concept. This dissertation develops a hybrid risk analysis methodology that combines Monte Carlo simulation with dynamic event tree analysis. The analysis captures the unique characteristics of the flow corridor concept, including self-separation within the corridor, lane change maneuvers, speed adjustments, and the automated separation assurance system. Monte Carlo simulation is used to model the movement of aircraft in the flow corridor and to identify precursor events that might lead to a collision. Since these precursor events are not rare, standard Monte Carlo simulation can be used to estimate their occurrence rates. Dynamic event trees are then used to model the subsequent series of events that may lead to a collision. When two aircraft are on course for a near-mid-air collision (NMAC), the on-board automated separation assurance system provides a series of safety layers to prevent the impending NMAC or collision. Dynamic event trees are used to evaluate the potential failures of these layers in order to estimate the rare-event collision probabilities. The results show that throughput can be increased by reducing separation to 2 nautical miles while maintaining the current level of safety. A sensitivity analysis shows that the most critical parameters in the model with respect to the overall collision probability are the minimum separation, the probability that both flights fail to respond to the traffic collision avoidance system, the probability that an NMAC results in a collision, the failure probability of the automatic dependent surveillance-broadcast (ADS-B) In receiver, and the conflict detection probability.

  8. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that was given a lot of emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanism and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps understand failure mechanism and improve component and system design. PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
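
    The quantification step that PRA shares with fault tree analysis reduces, for independent basic events, to combining probabilities through AND and OR gates; the sketch below shows that arithmetic on an invented two-gate example. It is illustrative only and is not a NASA tool or model.

      from functools import reduce

      def and_gate(*probs):
          """All inputs must fail: product of the probabilities."""
          return reduce(lambda acc, p: acc * p, probs, 1.0)

      def or_gate(*probs):
          """Any input failing suffices: complement of the product of complements."""
          return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

      # Invented basic-event probabilities.
      p_sensor_fails   = 1e-3
      p_backup_fails   = 5e-3
      p_operator_error = 2e-2

      # Detection fails only if the sensor AND its backup fail;
      # the top event occurs if detection fails OR the operator errs.
      p_detection_fails = and_gate(p_sensor_fails, p_backup_fails)
      p_top_event = or_gate(p_detection_fails, p_operator_error)
      print(f"P(top event) = {p_top_event:.4e}")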

  9. Physiological Arousal in the Context of a Specified Anticipatory Period and Experimentally Induced Expectancy for Shock

    ERIC Educational Resources Information Center

    Mead, John D.; Dengerink, Harold A.

    1977-01-01

    The major intent of this research was to provide a further test of the relationships between physiological arousal and event probability by experimentally generating subjective expectancies for shock. The relationship of event probability to stress was discussed with respect to length of the anticipatory periods and methods used to establish…

  10. Enhancing the Possibility of Success by Measuring the Probability of Failure in an Educational Program.

    ERIC Educational Resources Information Center

    Brookhart, Susan M.; And Others

    1997-01-01

    Process Analysis is described as a method for identifying and measuring the probability of events that could cause the failure of a program, resulting in a cause-and-effect tree structure of events. The method is illustrated through the evaluation of a pilot instructional program at an elementary school. (SLD)

  11. The Conjunction Fallacy and the Many Meanings of "And"

    ERIC Educational Resources Information Center

    Hertwig, Ralph; Benz, Bjorn; Krauss, Stefan

    2008-01-01

    According to the conjunction rule, the probability of A "and" B cannot exceed the probability of either single event. This rule reads "and" in terms of the logical operator ∧, interpreting A and B as an intersection of two events. As linguists have long argued, in natural language "and" can convey a wide range of relationships between…

  12. Evaluating Risk Perception based on Gender Differences for Mountaineering Activity

    NASA Astrophysics Data System (ADS)

    Susanto, Novie; Susatyo, Nugroho W. P.; Rizkiyah, Ega

    2018-02-01

    On average, 26 deaths per year were reported in mountaineering over the period from 2003 to 2012. The number of women dying during mountaineering is significantly smaller than that of men (3.5 male deaths for each female death). This study aims to analyze differences in risk perception by gender and to provide recommendations as an educational basis for preventing accidents in mountaineering. The study utilizes the Kruskal-Wallis test and the Delphi method. A total of 200 mountaineer respondents (100 males and 100 females) participated. The independent variable in this study was gender. The dependent variables were aspects of risk perception, including perception of the seriousness of an accident, perception of the probability of an accident event, anxiety level, and perception of efficacy and self-efficacy. The results showed that the risk perception of women is higher than that of men, with a significant difference (p-value = 0.019). The recommendations from the Delphi method are: developing a positive mental attitude, raising awareness of the risks that exist in nature, implementing Cognitive Behaviour Therapy (CBT) to raise awareness of one's own safety, attending a climbing or mountaineering school, and using instructors to give lessons about safety in outdoor activities.

  13. Limits of Risk Predictability in a Cascading Alternating Renewal Process Model.

    PubMed

    Lin, Xin; Moussawi, Alaa; Korniss, Gyorgy; Bakdash, Jonathan Z; Szymanski, Boleslaw K

    2017-07-27

    Most risk analysis models systematically underestimate the probability and impact of catastrophic events (e.g., economic crises, natural disasters, and terrorism) by not taking into account interconnectivity and interdependence of risks. To address this weakness, we propose the Cascading Alternating Renewal Process (CARP) to forecast interconnected global risks. However, assessments of the model's prediction precision are limited by lack of sufficient ground truth data. Here, we establish prediction precision as a function of input data size by using alternative long ground truth data generated by simulations of the CARP model with known parameters. We illustrate the approach on a model of fires in artificial cities assembled from basic city blocks with diverse housing. The results confirm that parameter recovery variance exhibits power law decay as a function of the length of available ground truth data. Using CARP, we also demonstrate estimation using a disparate dataset that also has dependencies: real-world prediction precision for the global risk model based on the World Economic Forum Global Risk Report. We conclude that the CARP model is an efficient method for predicting catastrophic cascading events with potential applications to emerging local and global interconnected risks.

  14. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.

  15. Warship Combat System Selection Methodology Based on Discrete Event Simulation

    DTIC Science & Technology

    2010-09-01

    Only fragments of this report are indexed: an abbreviation list from the front matter (PD, damage probability; PHit, hit probability; PKill, kill probability; RSM, response surface model; SAM, surface-to-air missile) and a truncated excerpt: "...such a large target allows an assumption that the probability of a hit (PHit) is one. This structure can be considered as a bridge; therefore, the..."

  16. How psychological framing affects economic market prices in the lab and field.

    PubMed

    Sonnemann, Ulrich; Camerer, Colin F; Fox, Craig R; Langer, Thomas

    2013-07-16

    A fundamental debate in social sciences concerns how individual judgments and choices, resulting from psychological mechanisms, are manifested in collective economic behavior. Economists emphasize the capacity of markets to aggregate information distributed among traders into rational equilibrium prices. However, psychologists have identified pervasive and systematic biases in individual judgment that they generally assume will affect collective behavior. In particular, recent studies have found that judged likelihoods of possible events vary systematically with the way the entire event space is partitioned, with probabilities of each of N partitioned events biased toward 1/N. Thus, combining events into a common partition lowers perceived probability, and unpacking events into separate partitions increases their perceived probability. We look for evidence of such bias in various prediction markets, in which prices can be interpreted as probabilities of upcoming events. In two highly controlled experimental studies, we find clear evidence of partition dependence in a 2-h laboratory experiment and a field experiment on National Basketball Association (NBA) and Federation Internationale de Football Association (FIFA World Cup) sports events spanning several weeks. We also find evidence consistent with partition dependence in nonexperimental field data from prediction markets for economic derivatives (guessing the values of important macroeconomic statistics) and horse races. Results in any one of the studies might be explained by a specialized alternative theory, but no alternative theories can explain the results of all four studies. We conclude that psychological biases in individual judgment can affect market prices, and understanding those effects requires combining a variety of methods from psychology and economics.

  17. How psychological framing affects economic market prices in the lab and field

    PubMed Central

    Sonnemann, Ulrich; Camerer, Colin F.; Fox, Craig R.; Langer, Thomas

    2013-01-01

    A fundamental debate in social sciences concerns how individual judgments and choices, resulting from psychological mechanisms, are manifested in collective economic behavior. Economists emphasize the capacity of markets to aggregate information distributed among traders into rational equilibrium prices. However, psychologists have identified pervasive and systematic biases in individual judgment that they generally assume will affect collective behavior. In particular, recent studies have found that judged likelihoods of possible events vary systematically with the way the entire event space is partitioned, with probabilities of each of N partitioned events biased toward 1/N. Thus, combining events into a common partition lowers perceived probability, and unpacking events into separate partitions increases their perceived probability. We look for evidence of such bias in various prediction markets, in which prices can be interpreted as probabilities of upcoming events. In two highly controlled experimental studies, we find clear evidence of partition dependence in a 2-h laboratory experiment and a field experiment on National Basketball Association (NBA) and Federation Internationale de Football Association (FIFA World Cup) sports events spanning several weeks. We also find evidence consistent with partition dependence in nonexperimental field data from prediction markets for economic derivatives (guessing the values of important macroeconomic statistics) and horse races. Results in any one of the studies might be explained by a specialized alternative theory, but no alternative theories can explain the results of all four studies. We conclude that psychological biases in individual judgment can affect market prices, and understanding those effects requires combining a variety of methods from psychology and economics. PMID:23818628

  18. A Probabilistic Risk Assessment of Groundwater-Related Risks at Excavation Sites

    NASA Astrophysics Data System (ADS)

    Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Tartakovsky, D. M.; Bolster, D.

    2010-12-01

    Excavation sites such as those associated with the construction of subway lines, railways and highway tunnels are hazardous places, posing risks to workers, machinery and surrounding buildings. Many of these risks can be groundwater related. In this work we develop a general framework based on a probabilistic risk assessment (PRA) to quantify such risks. This approach is compatible with standard PRA practices and it employs many well-developed risk analysis tools, such as fault trees. The novelty and computational challenges of the proposed approach stem from the reliance on stochastic differential equations, rather than reliability databases, to compute the probabilities of basic events. The general framework is applied to a specific case study in Spain. It is used to estimate and minimize risks for a potential construction site of an underground station for the new subway line in the Barcelona metropolitan area.
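    The novelty highlighted here is that basic-event probabilities come from stochastic models of the groundwater system rather than from reliability databases. As a sketch of that coupling, the toy calculation below estimates one basic-event probability by Monte Carlo on an assumed settlement model and propagates it through small OR/AND gates; all event definitions and parameter values are invented for illustration and are not from the Barcelona study.

    import numpy as np

    rng = np.random.default_rng(1)

    # --- Basic event 1: probability from a stochastic groundwater model ------
    # Assumed toy model: settlement s = k * drawdown, with lognormal drawdown
    # and uncertain soil compressibility k; the basic event is s > 25 mm.
    n = 100_000
    drawdown_m = rng.lognormal(mean=1.0, sigma=0.4, size=n)        # metres
    k_mm_per_m = rng.normal(loc=5.0, scale=1.5, size=n).clip(0.1)  # mm per m
    settlement = k_mm_per_m * drawdown_m
    p_settlement = np.mean(settlement > 25.0)

    # --- Basic events 2 and 3: assumed database-style probabilities ----------
    p_pump_failure = 0.02     # dewatering pump fails during excavation
    p_late_detection = 0.10   # monitoring misses the anomaly in time

    # --- Small fault tree: damage to an adjacent building ---------------------
    # (independence of basic events assumed)
    # OR gate: excessive settlement from drawdown or from pump failure.
    p_excess = 1.0 - (1.0 - p_settlement) * (1.0 - p_pump_failure)
    # AND gate: damage requires excessive settlement AND late detection.
    p_damage = p_excess * p_late_detection

    print(f"P(settlement > 25 mm) ~ {p_settlement:.3f}")
    print(f"P(building damage)    ~ {p_damage:.4f}")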

  19. ‘If you are good, I get better’: the role of social hierarchy in perceptual decision-making

    PubMed Central

    Pannunzi, Mario; Ayneto, Alba; Deco, Gustavo; Sebastián-Gallés, Nuria

    2014-01-01

    Until now, it has been unclear whether social hierarchy can influence sensory or perceptual cognitive processes. We evaluated the effects of social hierarchy on these processes using a basic visual perceptual decision task. We constructed a social hierarchy in which participants performed the perceptual task separately with two covertly simulated players (superior, inferior). Participants were faster (better) when performing the discrimination task with the superior player. We studied the time course of social hierarchy processing using event-related potentials and observed hierarchical effects even in early stages of sensory-perceptual processing, suggesting early top–down modulation by social hierarchy. Moreover, in a parallel analysis, we fitted a drift-diffusion model (DDM) to the results to evaluate the decision-making process in this perceptual task in the context of a social hierarchy. Consistently, the DDM pointed to nondecision time (probably perceptual encoding) as the principal period influenced by social hierarchy. PMID:23946003

  20. Low dosages: new chemotherapeutic weapons on the battlefield of immune-related disease

    PubMed Central

    Liu, Jing; Zhao, Jie; Hu, Liang; Cao, Yuchun; Huang, Bo

    2011-01-01

    Chemotherapeutic drugs eliminate tumor cells at relatively high doses and are considered weapons against tumors in clinics and hospitals. However, despite their ability to induce cellular apoptosis, chemotherapeutic drugs should probably be regarded more as a class of cell regulators than cell killers, if the dosage used and the fact that their targets are involved in basic molecular events are considered. Unfortunately, the regulatory properties of chemotherapeutic drugs are usually hidden or masked by the massive cell death induced by high doses. Recent evidence has begun to suggest that low dosages of chemotherapeutic drugs might profoundly regulate various intracellular aspects of normal cells, especially immune cells. Here, we discuss the immune regulatory roles of three kinds of chemotherapeutic drugs under low-dose conditions and propose low dosages as potential new chemotherapeutic weapons on the battlefield of immune-related disease. PMID:21423201

  1. The development of Indonesian traditional bekel game in Android platform

    NASA Astrophysics Data System (ADS)

    Rahmat, R. F.; Fahrani, O. R.; Purnamawati, S.; Pasha, M. F.

    2018-03-01

    Bekel is a traditional Indonesian game that is rarely played nowadays. It is a game of dexterity using a bekel ball and 6 to 10 seeds. The game is played by throwing the ball up in the air, spreading the seeds randomly on the floor, and then picking the seeds up until the ground is clear. This game application is an adaptation of the Bekel game focusing on the movements of the ball and the randomization of the seed positions, and it has three levels of difficulty based on the basic rules of the actual game. The focus of the study is the free-fall method for the ball and the random function for the seeds in the Android environment. The results show that the Bekel application has a sensitivity level of 71% for the ball movements and that the probability rate of the random event occurrence is 23%.

  2. Imprecise probability assessment of tipping points in the climate system

    PubMed Central

    Kriegler, Elmar; Hall, Jim W.; Held, Hermann; Dawson, Richard; Schellnhuber, Hans Joachim

    2009-01-01

    Major restructuring of the Atlantic meridional overturning circulation, the Greenland and West Antarctic ice sheets, the Amazon rainforest and ENSO, are a source of concern for climate policy. We have elicited subjective probability intervals for the occurrence of such major changes under global warming from 43 scientists. Although the expert estimates highlight large uncertainty, they allocate significant probability to some of the events listed above. We deduce conservative lower bounds for the probability of triggering at least 1 of those events of 0.16 for medium (2–4 °C), and 0.56 for high global mean temperature change (above 4 °C) relative to year 2000 levels. PMID:19289827
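    The 0.16 and 0.56 figures are aggregate lower bounds for triggering at least one of the elicited tipping events. For reference only (the paper works with imprecise probability intervals over many experts, so its aggregation is more involved), the basic relations for the union of events with individual probabilities p_i are

    \[
    \max_i p_i \;\le\; P\!\Bigl(\bigcup_{i=1}^{n} E_i\Bigr) \;\le\; \min\Bigl(1,\ \sum_{i=1}^{n} p_i\Bigr),
    \qquad
    P\!\Bigl(\bigcup_{i=1}^{n} E_i\Bigr) = 1-\prod_{i=1}^{n}\bigl(1-p_i\bigr)\ \text{under independence,}
    \]

    so even modest individual probabilities can imply a substantial probability that at least one tipping event is triggered.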

  3. The contribution of threat probability estimates to reexperiencing symptoms: a prospective analog study.

    PubMed

    Regambal, Marci J; Alden, Lynn E

    2012-09-01

    Individuals with posttraumatic stress disorder (PTSD) are hypothesized to have a "sense of current threat." Perceived threat from the environment (i.e., external threat), can lead to overestimating the probability of the traumatic event reoccurring (Ehlers & Clark, 2000). However, it is unclear if external threat judgments are a pre-existing vulnerability for PTSD or a consequence of trauma exposure. We used trauma analog methodology to prospectively measure probability estimates of a traumatic event, and investigate how these estimates were related to cognitive processes implicated in PTSD development. 151 participants estimated the probability of being in car-accident related situations, watched a movie of a car accident victim, and then completed a measure of data-driven processing during the movie. One week later, participants re-estimated the probabilities, and completed measures of reexperiencing symptoms and symptom appraisals/reactions. Path analysis revealed that higher pre-existing probability estimates predicted greater data-driven processing which was associated with negative appraisals and responses to intrusions. Furthermore, lower pre-existing probability estimates and negative responses to intrusions were both associated with a greater change in probability estimates. Reexperiencing symptoms were predicted by negative responses to intrusions and, to a lesser degree, by greater changes in probability estimates. The undergraduate student sample may not be representative of the general public. The reexperiencing symptoms are less severe than what would be found in a trauma sample. Threat estimates present both a vulnerability and a consequence of exposure to a distressing event. Furthermore, changes in these estimates are associated with cognitive processes implicated in PTSD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Extreme events and event size fluctuations in biased random walks on networks.

    PubMed

    Kishore, Vimal; Santhanam, M S; Amritkar, R E

    2012-05-01

    Random walk on discrete lattice models is important to understand various types of transport processes. The extreme events, defined as exceedences of the flux of walkers above a prescribed threshold, have been studied recently in the context of complex networks. This was motivated by the occurrence of rare events such as traffic jams, floods, and power blackouts which take place on networks. In this work, we study extreme events in a generalized random walk model in which the walk is preferentially biased by the network topology. The walkers preferentially choose to hop toward the hubs or small degree nodes. In this setting, we show that extremely large fluctuations in event sizes are possible on small degree nodes when the walkers are biased toward the hubs. In particular, we obtain the distribution of event sizes on the network. Further, the probability for the occurrence of extreme events on any node in the network depends on its "generalized strength," a measure of the ability of a node to attract walkers. The generalized strength is a function of the degree of the node and that of its nearest neighbors. We obtain analytical and simulation results for the probability of occurrence of extreme events on the nodes of a network using a generalized random walk model. The result reveals that the nodes with a larger value of generalized strength, on average, display lower probability for the occurrence of extreme events compared to the nodes with lower values of generalized strength.

  5. Trends Concerning Four Misconceptions in Students' Intuitively-Based Probabilistic Reasoning Sourced in the Heuristic of Representativeness

    ERIC Educational Resources Information Center

    Kustos, Paul Nicholas

    2010-01-01

    Student difficulty in the study of probability arises in intuitively-based misconceptions derived from heuristics. One such heuristic, the one of note for this research study, is that of representativeness, in which an individual informally assesses the probability of an event based on the degree to which the event is similar to the sample from…

  6. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
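    To make the static/dynamic distinction concrete, the sketch below walks a single mission timeline in which the probability of a later event depends on whether an earlier one occurred, and Monte Carlo sampling over many timelines yields the outcome distribution. Event names, probabilities and the dependency rule are assumptions for illustration, not IMM content.

    import random

    random.seed(42)

    # Planner: events placed on a single timeline (day, name, baseline probability).
    TIMELINE = [
        (10, "minor injury", 0.10),
        (40, "infection", 0.05),
        (120, "medication depleted", 0.02),
    ]

    def run_mission():
        """Scheduler: walk the timeline; dependencies modify later probabilities."""
        occurred = []
        for day, name, p in sorted(TIMELINE):
            if name == "infection" and "minor injury" in occurred:
                p *= 3.0  # assumed rule: an earlier injury raises infection risk
            if random.random() < p:
                occurred.append(name)
        return occurred

    # Monte Carlo over many mission instances, as in (dynamic) PRA.
    N = 100_000
    counts = {}
    for _ in range(N):
        key = tuple(run_mission())
        counts[key] = counts.get(key, 0) + 1

    for outcome, c in sorted(counts.items(), key=lambda kv: -kv[1]):
        print(f"{c / N:.4f}  {outcome or ('no events',)}")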

  7. Implicit Learning of Predictive Relationships in Three-element Visual Sequences by Young and Old Adults

    PubMed Central

    Howard, James H.; Howard, Darlene V.; Dennis, Nancy A.; Kelly, Andrew J.

    2008-01-01

    Knowledge of sequential relationships enables future events to be anticipated and processed efficiently. Research with the serial reaction time task (SRTT) has shown that sequence learning often occurs implicitly without effort or awareness. Here we report four experiments that use a triplet-learning task (TLT) to investigate sequence learning in young and older adults. In the TLT people respond only to the last target event in a series of discrete, three-event sequences or triplets. Target predictability is manipulated by varying the triplet frequency (joint probability) and/or the statistical relationships (conditional probabilities) among events within the triplets. Results revealed that both groups learned, though older adults showed less learning of both joint and conditional probabilities. Young people used the statistical information in both cues, but older adults relied primarily on information in the second cue alone. We conclude that the TLT complements and extends the SRTT and other tasks by offering flexibility in the kinds of sequential statistical regularities that may be studied as well as by controlling event timing and eliminating motor response sequencing. PMID:18763897

  8. Non-renewal statistics for electron transport in a molecular junction with electron-vibration interaction

    NASA Astrophysics Data System (ADS)

    Kosov, Daniel S.

    2017-09-01

    Quantum transport of electrons through a molecule is a series of individual electron tunneling events separated by stochastic waiting time intervals. We study the emergence of temporal correlations between successive waiting times for the electron transport in a vibrating molecular junction. Using the master equation approach, we compute the joint probability distribution for waiting times of two successive tunneling events. We show that the probability distribution is completely reset after each tunneling event if molecular vibrations are thermally equilibrated. If we treat vibrational dynamics exactly without imposing the equilibration constraint, the statistics of electron tunneling events become non-renewal. Non-renewal statistics between two waiting times τ1 and τ2 means that the density matrix of the molecule is not fully renewed after time τ1 and the probability of observing waiting time τ2 for the second electron transfer depends on the previous electron waiting time τ1. The strong electron-vibration coupling is required for the emergence of the non-renewal statistics. We show that in the Franck-Condon blockade regime, extremely rare tunneling events become positively correlated.

  9. How Unusual were Hurricane Harvey's Rains?

    NASA Astrophysics Data System (ADS)

    Emanuel, K.

    2017-12-01

    We apply an advanced technique for hurricane risk assessment to evaluate the probability of hurricane rainfall of Harvey's magnitude. The technique embeds a detailed computational hurricane model in the large-scale conditions represented by climate reanalyses and by climate models. We simulate 3700 hurricane events affecting the state of Texas, from each of three climate reanalyses spanning the period 1980-2016, and 2000 events from each of six climate models for each of two periods: the period 1981-2000 from historical simulations, and the period 2081-2100 from future simulations under Representative Concentration Pathway (RCP) 8.5. On the basis of these simulations, we estimate that hurricane rain of Harvey's magnitude in the state of Texas would have had an annual probability of 0.01 in the late twentieth century, and will have an annual probability of 0.18 by the end of this century, with remarkably small scatter among the six climate models downscaled. If the event frequency is changing linearly over time, this would yield an annual probability of 0.06 in 2017.
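    The stated 0.06 annual probability for 2017 follows from linear interpolation between the two simulated epochs. Assuming the late-twentieth-century estimate is anchored at the midpoint of the 1981-2000 historical period (about 1990) and the end-of-century estimate at the midpoint of 2081-2100 (about 2090), an interpretation not spelled out in the abstract,

    \[
    p(2017) \approx 0.01 + (0.18 - 0.01)\,\frac{2017 - 1990}{2090 - 1990}
            = 0.01 + 0.17 \times 0.27 \approx 0.056 \approx 0.06 .
    \]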

  10. You Say "Probable" and I Say "Likely": Improving Interpersonal Communication With Verbal Probability Phrases

    ERIC Educational Resources Information Center

    Karelitz, Tzur M.; Budescu, David V.

    2004-01-01

    When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…

  11. Job Displacement and Social Participation over the Lifecourse: Findings for a Cohort of Joiners

    PubMed Central

    Brand, Jennie E.; Burgard, Sarah A.

    2010-01-01

    We examine the effects of job displacement, an involuntary event associated with socioeconomic and psychological decline, on social participation. Using more than 45 years of panel data from the Wisconsin Longitudinal Study, we find that job displacement is associated with significant, long-term lower probabilities of subsequent involvement with various forms of social participation for workers displaced during their prime earnings years; displacement is not associated with lower probabilities of involvement for workers displaced in the years approaching retirement. We also find that post-displacement socioeconomic and psychological decline explain very little of the negative effect of job displacement on social participation, and that a single displacement event, rather than a series of multiple displacement events, is most strongly associated with lower probabilities of social involvement. PMID:20827387

  12. Extreme and superextreme events in a loss-modulated CO2 laser: Nonlinear resonance route and precursors

    NASA Astrophysics Data System (ADS)

    Bonatto, Cristian; Endler, Antonio

    2017-07-01

    We investigate the occurrence of extreme and rare events, i.e., giant and rare light pulses, in a periodically modulated CO2 laser model. Due to nonlinear resonant processes, we show a scenario of interaction between chaotic bands of different orders, which may lead to the formation of extreme and rare events. We identify a crisis line in the modulation parameter space, and we show that, when the modulation amplitude increases, remaining in the vicinity of the crisis, some statistical properties of the laser pulses, such as the average and dispersion of amplitudes, do not change much, whereas the amplitude of extreme events grows enormously, giving rise to extreme events with much larger deviations than usually reported, with a significant probability of occurrence, i.e., with a long-tailed non-Gaussian distribution. We identify recurrent regular patterns, i.e., precursors, that anticipate the emergence of extreme and rare events, and we associate these regular patterns with unstable periodic orbits embedded in a chaotic attractor. We show that the precursors may or may not lead to the emergence of extreme events. Thus, we compute the probability of success or failure (false alarm) in the prediction of the extreme events, once a precursor is identified in the deterministic time series. We show that this probability depends on the accuracy with which the precursor is identified in the laser intensity time series.

  13. Risk Management in Complex Construction Projects that Apply Renewable Energy Sources: A Case Study of the Realization Phase of the Energis Educational and Research Intelligent Building

    NASA Astrophysics Data System (ADS)

    Krechowicz, Maria

    2017-10-01

    A characteristic feature of today's construction industry is the increasing complexity of a growing number of projects. Almost every construction project is unique, with its own project-specific purpose, structural complexity, owner expectations, ground conditions tied to a particular location, and dynamics. Failure costs and costs resulting from unforeseen problems in complex construction projects are very high, and project complexity drivers threaten the successful completion of many projects. This paper discusses the process of effective risk management in complex construction projects that apply renewable energy sources, using the example of the realization phase of the ENERGIS teaching-laboratory building from the point of view of its general contractor, DORBUD S.A. A new approach to risk management for such projects is proposed, with the risk management process divided into six stages: gathering information; identification of the top critical project risks resulting from project complexity; construction of a fault tree for each top critical risk; logical analysis of the fault tree; quantitative risk assessment applying fuzzy logic; and development of a risk response strategy. A new methodology for qualitative and quantitative assessment of top critical risks in complex construction projects was developed, and risk assessment was carried out using Fuzzy Fault Tree analysis on the example of one top critical risk. Applying fuzzy set theory to the proposed model reduced uncertainty and avoided the difficulty, common in expert risk assessment, of having to assign crisp probability values to basic events in order to produce an exact risk score for each unwanted event.
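    A minimal sketch of the type of fuzzy fault tree arithmetic described here, assuming triangular fuzzy numbers for the basic-event probabilities and vertex-wise gate evaluation; the gate structure, event names and numbers are invented for illustration and are not from the ENERGIS case study.

    # Triangular fuzzy probabilities are written as (low, mode, high).
    # AND/OR gate formulas for independent events are monotone increasing in
    # every input probability, so applying them vertex-by-vertex to the
    # triangles yields the (low, mode, high) of the gate output.

    def fuzzy_and(*events):
        # AND gate: product of probabilities.
        out = [1.0, 1.0, 1.0]
        for low, mode, high in events:
            out[0] *= low
            out[1] *= mode
            out[2] *= high
        return tuple(out)

    def fuzzy_or(*events):
        # OR gate: 1 - prod(1 - p).
        comp = [1.0, 1.0, 1.0]
        for low, mode, high in events:
            comp[0] *= (1 - low)
            comp[1] *= (1 - mode)
            comp[2] *= (1 - high)
        return tuple(1 - c for c in comp)

    # Expert-elicited basic-event probabilities (assumed example values only).
    delayed_delivery = (0.05, 0.10, 0.20)   # delayed delivery of heat-pump units
    ground_problem   = (0.02, 0.05, 0.10)   # unexpected ground conditions
    crew_shortage    = (0.10, 0.15, 0.25)   # shortage of qualified installers

    # Assumed top event: milestone missed if deliveries are delayed OR ground
    # problems and crew shortage coincide.
    top = fuzzy_or(delayed_delivery, fuzzy_and(ground_problem, crew_shortage))
    print("Top-event fuzzy probability (low, mode, high):",
          tuple(round(p, 4) for p in top))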

  14. 149 Sources and 15 Years Later: The Navy-NRAO Green Bank Interferometer Monitoring Program

    NASA Astrophysics Data System (ADS)

    Lazio, T. J. W.; Waltman, E. B.; Ghigo, F.; Johnston, K. J.

    2000-12-01

    Flux densities for 149 sources were monitored with the Green Bank Interferometer for durations ranging from 3 to 15 yrs, covering the interval 1979--1996, with most sources observed for 6 yrs. Observations were at two radio frequencies (approximately 2.5 and 8.2 GHz) and have a typical sampling of one flux density measurement every 2 days. We have used these light curves to conduct various variability analysis of the sources. We find suggestive, though not unambiguous evidence, that these sources have a common, broadband mechanism for intrinsic variations. We also find that the extrinsic variation is more consistent with radio-wave scattering in an extended medium rather than in a thin screen. The primary motivation for this monitoring program was the identification of extreme scattering events. In an effort to identify ESEs in a systematic manner, we have taken the wavelet transform of the light curves. We find 15 events in the light curves of 12 sources that we classify as probable ESEs. However, we also find that five ESEs previously identified from these data do not survive our wavelet selection criteria. Future identification of ESEs will probably continue to rely on both visual and systematic methods. We present examples of the light curves and variability analyses. Instructions for obtaining the data are also given. The GBI is a facility of the National Science Foundation and was operated by the National Radio Astronomy Observatory under contract to the USNO and NRL during these observations. A portion of this work was performed while TJWL held a National Research Council-NRL Research Associateship. Basic research in radio astronomy at the NRL is supported by the Office of Naval Research.

  15. Just-in-Time Support

    ERIC Educational Resources Information Center

    Rollins, Suzy Pepper

    2016-01-01

    Most students have gaps in their background knowledge and basic skills-gaps that can stand in the way of learning new concepts. For example, a student may be excited about studying probability--until he realizes that today's lesson on probability will require him to use fractions. As his brain searches frantically for his dim recollection of the…

  16. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, the article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
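    The underlying idea is that for U uniform on [a, b], the expectation of f(U) equals the average value of f, so (b - a) times the sample mean of f(U) estimates the integral. A minimal sketch in Python (the original article works in Visual Basic; the integrand here is just an example).

    import random
    import math

    def mc_integral(f, a, b, n=100_000):
        """Estimate the integral of f over [a, b] as (b - a) * mean of f(U),
        where U is uniform on [a, b]."""
        total = sum(f(random.uniform(a, b)) for _ in range(n))
        return (b - a) * total / n

    random.seed(0)
    # Example: the integral of sin(x) over [0, pi] is exactly 2.
    print(mc_integral(math.sin, 0.0, math.pi))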

  17. Reasoning about conjunctive probabilistic concepts in childhood.

    PubMed

    Fisk, John E; Slattery, Rachel

    2005-09-01

    While adults are known to exhibit biases when making conjunctive probability judgments, little is known about childhood competencies in this area. Participants (aged between four and five years, eight and ten years, and a group of young adults) attempted to select the more likely of two events, a single event, and a conjunctive event containing, as one of its components, the single event. The problems were such that the objective probabilities of the component events were potentially available. Children in both age groups were generally successful when the single event was likely. However, when it was unlikely, a majority of children rejected it, choosing the conjunctive event instead, thereby committing the conjunction fallacy. A substantial minority of adults also committed the fallacy under equivalent conditions. It is concluded that under certain conditions children are capable of normative conjunctive judgments but that the mechanisms underpinning this capacity remain to be fully understood.
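    The normative benchmark behind these judgments is the conjunction rule: a conjunction can never be more probable than either of its components. As a worked illustration (not taken from the study's stimuli),

    \[
    P(A \cap B) = P(A)\,P(B \mid A) \le P(A), \qquad
    \text{e.g. } P(A)=0.2,\ P(B \mid A)=0.8 \ \Rightarrow\ P(A \cap B)=0.16 < 0.2,
    \]

    so choosing the conjunctive event over its unlikely single component is the conjunction fallacy.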

  18. Seismic Characterization of EGS Reservoirs

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.

    2014-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS dataset and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  19. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power law distributions play an increasingly important role in the study of complex systems. Motivated by the insolvability of complex systems, the idea of incomplete statistics is used and extended: three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy, and probability distribution functions of exponential form, power-law form, and the product form between a power function and an exponential function are derived from the Shannon entropy and the maximum entropy principle. This shows that the maximum entropy principle can fully replace the equal-probability hypothesis. Because power-law distributions and distributions of the product form between a power function and an exponential function, which cannot be derived from the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, it is concluded that the maximum entropy principle is a more fundamental principle, one that embodies these concepts more broadly and reveals the basic laws governing the motion of objects more deeply. At the same time, the principle reveals an intrinsic link between nature and different objects in human society and the principles they all obey.
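    For reference, the textbook maximum entropy derivations that the abstract alludes to (not the paper's incomplete-statistics generalization): maximizing the Shannon entropy S = -\int p(x)\ln p(x)\,dx subject to normalization and different moment constraints gives

    \[
    \text{fixed } \langle x \rangle:\ p(x) \propto e^{-\beta x}; \qquad
    \text{fixed } \langle \ln x \rangle:\ p(x) \propto x^{-\alpha}; \qquad
    \text{both fixed}:\ p(x) \propto x^{-\alpha} e^{-\beta x},
    \]

    where \alpha and \beta are Lagrange multipliers determined by the constraints.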

  20. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  1. Post-traumatic stress symptoms in Swedish obstetricians and midwives after severe obstetric events: a cross-sectional retrospective survey.

    PubMed

    Wahlberg, Å; Andreen Sachs, M; Johannesson, K; Hallberg, G; Jonsson, M; Skoog Svanberg, A; Högberg, U

    2017-07-01

    To examine post-traumatic stress reactions among obstetricians and midwives, experiences of support and professional consequences after severe events in the labour ward. Cross-sectional online survey from January 7 to March 10, 2014. Members of the Swedish Society of Obstetrics and Gynaecology and the Swedish Association of Midwives. Potentially traumatic events were defined as: the child died or was severely injured during delivery; maternal near-miss; maternal mortality; and other events such as violence or threat. The validated Screen Questionnaire Posttraumatic Stress Disorder (SQ-PTSD), based on DSM-IV (1994) 4th edition, was used to assess partial post-traumatic stress disorder (PTSD) and probable PTSD. Partial or probable PTSD. The response rate was 47% for obstetricians (n = 706) and 40% (n = 1459) for midwives. Eighty-four percent of the obstetricians and 71% of the midwives reported experiencing at least one severe event on the delivery ward. Fifteen percent of both professions reported symptoms indicative of partial PTSD, whereas 7% of the obstetricians and 5% of the midwives indicated symptoms fulfilling PTSD criteria. Having experienced emotions of guilt or perceived insufficient support from friends predicted a higher risk of suffering from partial or probable PTSD. Obstetricians and midwives with partial PTSD symptoms chose to change their work to outpatient care significantly more often than colleagues without these symptoms. A substantial proportion of obstetricians and midwives reported symptoms of partial or probable PTSD after severe traumatic events experienced on the labour ward. Support and resilience training could avoid suffering and consequences for professional carers. In a survey 15% of Swedish obstetricians and midwives reported PTSD symptoms after their worst obstetric event. © 2016 Royal College of Obstetricians and Gynaecologists.

  2. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966

  3. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  4. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.
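    A minimal illustration of the importance sampling principle that ABSIS builds on, applied to a scalar tail probability rather than a reaction network (the network-specific look-ahead biasing of ABSIS is not reproduced here): sampling from a proposal shifted toward the rare region and reweighting by the likelihood ratio gives an unbiased estimate with far smaller variance than crude Monte Carlo.

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    a = 4.0                      # rare event: {X > a} with X ~ N(0, 1)
    n = 100_000

    # Crude Monte Carlo: almost no samples land in the rare region.
    x = rng.standard_normal(n)
    p_crude = np.mean(x > a)

    # Importance sampling: propose Y ~ N(a, 1), shifted into the rare region,
    # and reweight by the likelihood ratio phi(y) / phi(y - a) = exp(a^2/2 - a*y).
    y = rng.normal(loc=a, scale=1.0, size=n)
    weights = np.exp(0.5 * a**2 - a * y)
    p_is = np.mean((y > a) * weights)

    p_exact = 0.5 * math.erfc(a / math.sqrt(2.0))
    print(f"exact       {p_exact:.3e}")
    print(f"crude MC    {p_crude:.3e}")
    print(f"importance  {p_is:.3e}")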

  5. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Providing the event detection model with the information displayed to the experimental subjects allows the model's performance to be compared with that of the subjects.
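    The modeling assumption described here is that the monitor effectively produces a probability that an event has occurred, generated by discriminant analysis of the displayed signals. A compact sketch of that idea for one process, assuming two Gaussian classes (event / no event) with shared variance so the posterior is logistic in the displayed value; the numbers are illustrative, not from the experiment.

    import math

    # Two-class Gaussian discriminant model for a single displayed feature
    # (e.g., a filtered process residual). Shared variance => linear discriminant.
    MU_NO_EVENT = 0.0
    MU_EVENT = 2.0
    SIGMA = 1.0
    PRIOR_EVENT = 0.1  # assumed base rate of events

    def p_event_given_observation(z):
        """Posterior probability that an event occurred, given displayed value z."""
        def loglik(mu):
            return -0.5 * ((z - mu) / SIGMA) ** 2
        log_odds = (loglik(MU_EVENT) - loglik(MU_NO_EVENT)
                    + math.log(PRIOR_EVENT / (1.0 - PRIOR_EVENT)))
        return 1.0 / (1.0 + math.exp(-log_odds))

    for z in (0.0, 1.0, 2.0, 3.0):
        print(f"displayed value {z:.1f} -> P(event) = {p_event_given_observation(z):.3f}")

    A yes/no detection decision then corresponds to thresholding this probability, and attention allocation across several monitored processes can be driven by the same quantities.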

  6. Quantum probability and Hilbert's sixth problem

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2018-04-01

    With the birth of quantum mechanics, the two disciplines that Hilbert proposed to axiomatize, probability and mechanics, became entangled and a new probabilistic model arose in addition to the classical one. Thus, to meet Hilbert's challenge, an axiomatization should account deductively for the basic features of all three disciplines. This goal was achieved within the framework of quantum probability. The present paper surveys the quantum probabilistic axiomatization. This article is part of the themed issue `Hilbert's sixth problem'.

  7. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
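    A standard worked example of the kind of conditional reasoning discussed: drawing two cards from a deck without replacement,

    \[
    P(\text{two aces}) = \frac{4}{52}\cdot\frac{3}{51} = \frac{1}{221} \approx 0.0045,
    \qquad\text{versus}\qquad
    \left(\frac{4}{52}\right)^{2} = \frac{1}{169} \approx 0.0059
    \]

    with replacement; the second factor changes because the conditional probability depends on the first outcome.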

  8. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  9. Methodology development for quantitative optimization of security enhancement in medical information systems -Case study in a PACS and a multi-institutional radiotherapy database-.

    PubMed

    Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari

    2002-01-01

    The target of our study is to establish a methodology for analyzing the level of security requirements, for finding suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expression is introduced wherever possible so that security procedures can easily be followed up and security outcomes or results easily evaluated. System analysis by fault tree analysis (FTA) showed that subdividing system elements in detail contributes to much more accurate analysis. Such subdivided composition factors depended strongly on the behavior of staff, the interactive terminal devices, the kinds of service, and the network routes. In conclusion, we established methods, employing FTA, to analyze the levels of security requirements for each medical information system, the basic events for each composition factor, and the combinations of basic events. Methods for finding suitable security measures were also established: risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements were identified. Finally, a method to optimize the security measures for each medical information system was proposed: the optimum distribution of risk factors in terms of basic events was determined, and comparison between medical information systems became possible.

  10. Spatial distribution and occurrence probability of regional new particle formation events in eastern China

    NASA Astrophysics Data System (ADS)

    Shen, Xiaojing; Sun, Junying; Kivekäs, Niku; Kristensson, Adam; Zhang, Xiaoye; Zhang, Yangmei; Zhang, Lu; Fan, Ruxia; Qi, Xuefei; Ma, Qianli; Zhou, Huaigang

    2018-01-01

    In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated with the NanoMap method, using particle number size distribution (PNSD) data and air mass back trajectories. The lengths of the datasets were 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region in eastern China, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed around the site within 100-200 km. Differences in the horizontal spatial distribution of new particle source areas at the different sites were connected to the typical meteorological conditions at the sites. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with a higher particle growth rate (GR) and new particle formation rate (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared to those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends not only on the length of the PNSD dataset but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus, long-term measurements of PNSD in the planetary boundary layer are necessary for further study of the spatial extent and probability of NPF events. The spatial extent, relative probability of occurrence, and typical evolution of PNSD during NPF events presented in this study provide valuable information to further understand the climate and air quality effects of new particle formation.

  11. The FTA Method and a Possibility of Its Application in the Area of Road Freight Transport

    NASA Astrophysics Data System (ADS)

    Poliaková, Adela

    2015-06-01

    The fault tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree; however, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper gives a basic description of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
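    Once a tree is drawn with the basic gates, the quantitative step is simple probability algebra over independent basic events. A small sketch with made-up freight-transport events (the paper's own tree and values are not reproduced in this record).

    def and_gate(*p):
        # All inputs must occur (independent events): product of probabilities.
        out = 1.0
        for v in p:
            out *= v
        return out

    def or_gate(*p):
        # At least one input occurs: complement of "none occur".
        none = 1.0
        for v in p:
            none *= (1.0 - v)
        return 1.0 - none

    # Assumed basic-event probabilities for one delivery (illustrative only).
    p_vehicle_breakdown = 0.01
    p_driver_unavailable = 0.02
    p_no_backup_driver = 0.30
    p_loading_error = 0.05

    # Top event: delivery delayed if the vehicle breaks down, OR the driver is
    # unavailable AND no backup driver can be assigned, OR a loading error occurs.
    p_delay = or_gate(p_vehicle_breakdown,
                      and_gate(p_driver_unavailable, p_no_backup_driver),
                      p_loading_error)
    print(f"P(delivery delayed) = {p_delay:.4f}")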

  12. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    NASA Astrophysics Data System (ADS)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
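    The factual/counterfactual comparison described here is usually summarized by standard attribution metrics (the names are not used in the abstract but are the conventional quantities): with p_1 the event probability in the factual world and p_0 in the counterfactual world,

    \[
    \mathrm{PR} = \frac{p_1}{p_0}, \qquad \mathrm{FAR} = 1 - \frac{p_0}{p_1},
    \]

    so PR > 1 (FAR > 0) indicates that anthropogenic forcing made the event more likely and PR < 1 less likely; the disagreement between GCMs reported here corresponds to PR estimates scattering on both sides of 1.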

  13. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Lab (JPL) reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to earth followed by a rendezvous with the space shuttle in earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions respectively.
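    The event tree calculation behind such end-state probabilities can be sketched as a chain of phase outcomes whose probabilities multiply along each branch. The toy tree below is illustrative only; the phase list and probabilities are assumptions, not the values from the JPL or JSC assessments.

    # Assumed mission phases with per-phase probability of success
    # (illustrative values only, not the JPL/JSC figures).
    PHASES = [
        ("launch", 0.98),
        ("Mars landing", 0.90),
        ("sample collection", 0.95),
        ("ascent and return", 0.85),
        ("Earth recovery", 0.97),
    ]

    # Walk the event tree: each phase either fails (terminating the branch)
    # or succeeds and passes its probability mass to the next phase.
    end_states = {}
    p_reaching_phase = 1.0
    for name, p_success in PHASES:
        end_states[f"failure during {name}"] = p_reaching_phase * (1.0 - p_success)
        p_reaching_phase *= p_success
    end_states["full success"] = p_reaching_phase

    for state, prob in sorted(end_states.items(), key=lambda kv: -kv[1]):
        print(f"{prob:.4f}  {state}")
    print("check (should be 1):", round(sum(end_states.values()), 6))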

  14. Self-assembly kinetics of microscale components: A parametric evaluation

    NASA Astrophysics Data System (ADS)

    Carballo, Jose M.

    The goal of the present work is to develop, and evaluate a parametric model of a basic microscale Self-Assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly (SA) processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges for implementing microscale SA as a commercial process. The existing lack of design tools prevents simple process optimization. Previous efforts have characterized a specific aspect of the SA process. However, the existing microscale SA models do not characterize the inter-component interactions. All existing models have simplified the outcome of SA interactions as an experimentally-derived value specific to a particular configuration, instead of evaluating it outcome as a function of component level parameters (such as speed, geometry, bonding energy and direction). The present study parameterizes the outcome of interactions, and evaluates the effect of key parameters. The present work closes the gap between existing microscale SA models to add a key piece towards a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for defining the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event where a single part arrives on an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation and incidence angle for the component and the assembly site. Secondly, an experimental SA system was designed, and implemented to create individual SA interactions while controlling process parameters independently. SA experiments measured the outcome of SA interactions, while studying the independent effects of each parameter. As a first step towards a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low kinetic energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results from this work indicate that SA could be modeled as an energy-based process due to the small path dependence effects. Assembly probability is linearly related to the orientation probability. The proportionality constant is based on the area fraction of the sites with an amplification factor. This amplification factor accounts for the ability of capillary forces to align parts with only very small areas of contact when they have a low kinetic energy. Results provide unprecedented insight about SA interactions. The present study is a key step towards completing a basic model of a general SA process. Moreover, the outcome from this work can complement existing SA process models, in order to create a complete design tool for microscale SA systems. In addition to SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of experimental SA interactions and the limited sample size of the experiments. 
Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
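
    A minimal sketch of the proportionality described above, assuming a simple relation of the form assembly probability = amplification x site area fraction x orientation probability (the function name, parameter names and example values are illustrative assumptions, not the author's model code):

    ```python
    # Minimal sketch (not the author's code): assembly probability modeled as
    # proportional to the orientation probability, with the proportionality
    # constant given by the site area fraction times an amplification factor.
    # All parameter names and values here are illustrative assumptions.

    def assembly_probability(p_orient, area_fraction, amplification):
        """Estimate the probability that a part arriving on a site assembles.

        p_orient      -- probability the part arrives in a bondable orientation
        area_fraction -- fraction of the substrate covered by binding sites
        amplification -- factor accounting for capillary self-alignment at low
                         kinetic energy (>1 when small contact areas still bond)
        """
        p = amplification * area_fraction * p_orient
        return min(1.0, p)   # probabilities are capped at 1

    # Example: 40% favorable orientation, 15% site coverage, amplification 3
    print(assembly_probability(0.40, 0.15, 3.0))   # -> 0.18
    ```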

  15. Rare events in networks with internal and external noise

    NASA Astrophysics Data System (ADS)

    Hindes, J.; Schwartz, I. B.

    2017-12-01

    We study rare events in networks with both internal and external noise, and develop a general formalism for analyzing rare events that combines pair-quenched techniques and large-deviation theory. The probability distribution, shape, and time scale of rare events are considered in detail for extinction in the Susceptible-Infected-Susceptible model as an illustration. We find that when both types of noise are present, there is a crossover region as the network size is increased, where the probability exponent for large deviations no longer increases linearly with the network size. We demonstrate that the form of the crossover depends on whether the endemic state is localized near the epidemic threshold or not.
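
    For orientation, a small sketch of the baseline internal-noise scaling that the crossover described above departs from: in the well-mixed SIS limit the large-deviation exponent grows linearly with population size N, with the standard action S(R0) = ln R0 - 1 + 1/R0. This is a textbook result used here only as an assumed reference point, not the paper's network calculation:

    ```python
    import math

    # Minimal sketch, internal (demographic) noise only, well-mixed limit:
    # the mean extinction time scales as T_ext ~ exp(N * S(R0)) with
    # S(R0) = ln R0 - 1 + 1/R0, i.e. the exponent grows linearly in N.
    # The paper's result is that external noise breaks this linear growth;
    # this snippet only evaluates the baseline exponent.

    def sis_action(r0):
        if r0 <= 1.0:
            raise ValueError("endemic state requires R0 > 1")
        return math.log(r0) - 1.0 + 1.0 / r0

    for n in (100, 200, 400):
        s = sis_action(2.0)
        print(f"N={n}: exponent N*S = {n * s:.1f}")
    ```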

  16. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions, irregular distributions, as well as functions associated with continuous or discrete variables.
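
    A hedged sketch of one common way to realize the ideas in this record (it is not the patented algorithm): score each event by its surprisal, -log p(event), under a density fitted per data source, which makes disparate sources comparable, and set the alert threshold from a user-chosen false-alert rate. All distributions and numbers below are assumptions:

    ```python
    import math

    # Sketch only (not the patented algorithm): score events from disparate
    # sources on a comparable scale by taking the surprisal, -log p(event),
    # under a density fitted per source, and alert on the top quantile so the
    # expected false-alert rate is user-configurable.

    def gaussian_surprisal(x, mean, std):
        z = (x - mean) / std
        return 0.5 * z * z + math.log(std * math.sqrt(2.0 * math.pi))

    def alert_threshold(scores, false_alert_rate):
        ranked = sorted(scores, reverse=True)
        k = max(1, int(len(ranked) * false_alert_rate))
        return ranked[k - 1]

    flow_bytes = [1200, 900, 1100, 980, 50000, 1020]          # toy network-flow data
    scores = [gaussian_surprisal(x, 1040.0, 120.0) for x in flow_bytes]
    thr = alert_threshold(scores, false_alert_rate=0.2)
    print([s >= thr for s in scores])                          # flags the 50000-byte flow
    ```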

  17. Extreme events as foundation of Lévy walks with varying velocity

    NASA Astrophysics Data System (ADS)

    Kutner, Ryszard

    2002-11-01

    In this work we study the role of extreme events [E.W. Montroll, B.J. West, in: J.L. Lebowitz, E.W. Montroll (Eds.), Fluctuation Phenomena, SSM, vol. VII, North-Holland, Amsterdam, 1979, p. 63; J.-P. Bouchaud, M. Potters, Theory of Financial Risks from Statistical Physics to Risk Management, Cambridge University Press, Cambridge, 2001; D. Sornette, Critical Phenomena in Natural Sciences. Chaos, Fractals, Selforganization and Disorder: Concepts and Tools, Springer, Berlin, 2000] in determining the scaling properties of Lévy walks with varying velocity. This model is an extension of the well-known Lévy walk model [J. Klafter, G. Zumofen, M.F. Shlesinger, in M.F. Shlesinger, G.M. Zaslavsky, U. Frisch (Eds.), Lévy Flights and Related Topics in Physics, Lecture Notes in Physics, vol. 450, Springer, Berlin, 1995, p. 196; G. Zumofen, J. Klafter, M.F. Shlesinger, in: R. Kutner, A. Pȩkalski, K. Sznajd-Weron (Eds.), Anomalous Diffusion. From Basics to Applications, Lecture Notes in Physics, vol. 519, Springer, Berlin, 1999, p. 15] introduced in the context of chaotic dynamics, where a fixed value of the walker velocity is assumed for simplicity. Such an extension seems to be necessary when open and/or complex systems are studied. The model of Lévy walks with varying velocity is spanned on two coupled velocity-temporal hierarchies: the first consisting of velocities and the second of the corresponding time intervals which the walker spends between successive turning points. Both hierarchical structures are characterized by their own self-similar dimensions. The extreme event, which can appear within a given time interval, is defined as the single random step of the walker having the largest length. By finding power laws which describe the time dependence of this displacement and its statistics, we obtained two independent diffusion exponents, which are related to the above-mentioned dimensions and which characterize the extreme event kinetics. In this work we show the principal influence of extreme events on the basic quantities (one-step distributions and moments as well as two-step correlation functions) of the continuous-time random walk formalism. In addition, we construct both the waiting-time distribution and the sojourn probability density directly in real space and time in the scaling form by a proper component analysis which takes into account all possible fluctuations of the walker steps, in contrast to the extreme event analysis. In this work we focus on the basic quantities, since the multi-step ones were already discussed earlier [Physica A 264 (1999) 107; Comp. Phys. Commun. 147 (2002) 565]. Moreover, we study not only the scaling phenomena but also, assuming a finite number of hierarchy levels, the breaking of scaling and its dependence on control parameters. This seems to be important for studying empirical systems, the more so as there are still no closed formulae describing this phenomenon except the one for truncated Lévy flights [Phys. Rev. Lett. 73 (1994) 2946]. Our formulation of the model made it possible to develop an efficient Monte Carlo algorithm [Physica A 264 (1999) 107; Comp. Phys. Commun. 147 (2002) 565] where no MC step is lost.

  18. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  19. Quantifying the influence of global warming on unprecedented extreme climate events

    PubMed Central

    Singh, Deepti; Horton, Daniel E.; Swain, Daniel L.; Touma, Danielle; Charland, Allison; Liu, Yunjie; Haugen, Matz; Tsiang, Michael; Rajaratnam, Bala

    2017-01-01

    Efforts to understand the influence of historical global warming on individual extreme climate events have increased over the past decade. However, despite substantial progress, events that are unprecedented in the local observational record remain a persistent challenge. Leveraging observations and a large climate model ensemble, we quantify uncertainty in the influence of global warming on the severity and probability of the historically hottest month, hottest day, driest year, and wettest 5-d period for different areas of the globe. We find that historical warming has increased the severity and probability of the hottest month and hottest day of the year at >80% of the available observational area. Our framework also suggests that the historical climate forcing has increased the probability of the driest year and wettest 5-d period at 57% and 41% of the observed area, respectively, although we note important caveats. For the most protracted hot and dry events, the strongest and most widespread contributions of anthropogenic climate forcing occur in the tropics, including increases in probability of at least a factor of 4 for the hottest month and at least a factor of 2 for the driest year. We also demonstrate the ability of our framework to systematically evaluate the role of dynamic and thermodynamic factors such as atmospheric circulation patterns and atmospheric water vapor, and find extremely high statistical confidence that anthropogenic forcing increased the probability of record-low Arctic sea ice extent. PMID:28439005

  20. Quantifying the influence of global warming on unprecedented extreme climate events.

    PubMed

    Diffenbaugh, Noah S; Singh, Deepti; Mankin, Justin S; Horton, Daniel E; Swain, Daniel L; Touma, Danielle; Charland, Allison; Liu, Yunjie; Haugen, Matz; Tsiang, Michael; Rajaratnam, Bala

    2017-05-09

    Efforts to understand the influence of historical global warming on individual extreme climate events have increased over the past decade. However, despite substantial progress, events that are unprecedented in the local observational record remain a persistent challenge. Leveraging observations and a large climate model ensemble, we quantify uncertainty in the influence of global warming on the severity and probability of the historically hottest month, hottest day, driest year, and wettest 5-d period for different areas of the globe. We find that historical warming has increased the severity and probability of the hottest month and hottest day of the year at >80% of the available observational area. Our framework also suggests that the historical climate forcing has increased the probability of the driest year and wettest 5-d period at 57% and 41% of the observed area, respectively, although we note important caveats. For the most protracted hot and dry events, the strongest and most widespread contributions of anthropogenic climate forcing occur in the tropics, including increases in probability of at least a factor of 4 for the hottest month and at least a factor of 2 for the driest year. We also demonstrate the ability of our framework to systematically evaluate the role of dynamic and thermodynamic factors such as atmospheric circulation patterns and atmospheric water vapor, and find extremely high statistical confidence that anthropogenic forcing increased the probability of record-low Arctic sea ice extent.

  1. Quantifying the Influence of Global Warming on Unprecedented Extreme Climate Events

    NASA Technical Reports Server (NTRS)

    Diffenbaugh, Noah S.; Singh, Deepti; Mankin, Justin S.; Horton, Daniel E.; Swain, Daniel L.; Touma, Danielle; Charland, Allison; Liu, Yunjie; Haugen, Matz; Tsiang, Michael; Rajaratnam, Bala

    2017-01-01

    Efforts to understand the influence of historical global warming on individual extreme climate events have increased over the past decade. However, despite substantial progress, events that are unprecedented in the local observational record remain a persistent challenge. Leveraging observations and a large climate model ensemble, we quantify uncertainty in the influence of global warming on the severity and probability of the historically hottest month, hottest day, driest year, and wettest 5-d period for different areas of the globe. We find that historical warming has increased the severity and probability of the hottest month and hottest day of the year at >80% of the available observational area. Our framework also suggests that the historical climate forcing has increased the probability of the driest year and wettest 5-d period at 57% and 41% of the observed area, respectively, although we note important caveats. For the most protracted hot and dry events, the strongest and most widespread contributions of anthropogenic climate forcing occur in the tropics, including increases in probability of at least a factor of 4 for the hottest month and at least a factor of 2 for the driest year. We also demonstrate the ability of our framework to systematically evaluate the role of dynamic and thermodynamic factors such as atmospheric circulation patterns and atmospheric water vapor, and find extremely high statistical confidence that anthropogenic forcing increased the probability of record-low Arctic sea ice extent.
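
    The attribution framing used in the three records above is often summarized with a probability ratio comparing exceedance frequencies in forced and counterfactual ensembles. The sketch below illustrates that calculation with invented counts; it is not the authors' framework or data:

    ```python
    # Minimal sketch of the probability-ratio idea used in event attribution:
    # compare how often an ensemble with historical forcing exceeds the observed
    # record against a counterfactual (natural-forcing) ensemble. Counts below
    # are hypothetical, not values from the paper.

    def probability_ratio(exceed_forced, n_forced, exceed_natural, n_natural):
        p1 = exceed_forced / n_forced          # probability with anthropogenic forcing
        p0 = exceed_natural / n_natural        # counterfactual probability
        if p0 == 0:
            return float("inf")                # event unprecedented without forcing
        return p1 / p0

    # e.g. hottest-month threshold exceeded in 240 of 1000 forced members
    # but only 60 of 1000 counterfactual members -> PR = 4
    print(probability_ratio(240, 1000, 60, 1000))
    ```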

  2. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.

  3. Pedigrees, Prizes, and Prisoners: The Misuse of Conditional Probability

    ERIC Educational Resources Information Center

    Carlton, Matthew A.

    2005-01-01

    We present and discuss three examples of misapplication of the notion of conditional probability. In each example, we present the problem along with a published and/or well-known incorrect--but seemingly plausible--solution. We then give a careful treatment of the correct solution, in large part to show how careful application of basic probability…

  4. Blind Students' Learning of Probability through the Use of a Tactile Model

    ERIC Educational Resources Information Center

    Vita, Aida Carvalho; Kataoka, Verônica Yumi

    2014-01-01

    The objective of this paper is to discuss how blind students learn basic concepts of probability using the tactile model proposed by Vita (2012). The activities were part of the teaching sequence "Jefferson's Random Walk", in which students built a tree diagram (using plastic trays, foam cards, and toys), and pictograms in 3D…

  5. Predicting traffic load impact of alternative recreation developments

    Treesearch

    Gary H. Elsner; Ronald A. Oliveira

    1973-01-01

    Traffic load changes as a result of expansion of recreation facilities may be predicted through computations based on estimates of (a) drawing power of the recreation attractions, overnight accommodations, and in- or out-terminals; (b) probable types of travel; (c) probable routes of travel; and (d) total number of cars in the recreation system. Once the basic model...

  6. Neural response to reward anticipation under risk is nonlinear in probabilities.

    PubMed

    Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F

    2009-02-18

    A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that the neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
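
    The nonlinearity described above is commonly parameterized with the Tversky-Kahneman (1992) one-parameter probability weighting function; the sketch below uses that standard form and a conventional gamma value as assumptions, not quantities estimated in this fMRI study:

    ```python
    # Sketch of the Tversky-Kahneman (1992) one-parameter probability weighting
    # function often used to capture the nonlinearity described above; the
    # functional form and gamma value are standard choices, not taken from this
    # particular study.

    def weight(p, gamma=0.61):
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    for p in (0.01, 0.10, 0.50, 0.90, 0.99):
        print(f"p={p:.2f}  w(p)={weight(p):.3f}")
    # small probabilities are overweighted (w(p) > p), large ones underweighted
    ```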

  7. A pan-African medium-range ensemble flood forecast system

    NASA Astrophysics Data System (ADS)

    Thiemig, Vera; Bisselink, Bernard; Pappenberger, Florian; Thielen, Jutta

    2015-04-01

    The African Flood Forecasting System (AFFS) is a probabilistic flood forecast system for medium- to large-scale African river basins, with lead times of up to 15 days. The key components are the hydrological model LISFLOOD, the African GIS database, the meteorological ensemble predictions of the ECMWF and critical hydrological thresholds. In this study the predictive capability is investigated to estimate AFFS' potential as an operational flood forecasting system for the whole of Africa. This is done in hindcast mode, by reproducing pan-African hydrological predictions for the whole year of 2003, during which important flood events were observed. Results were analysed in two ways, each with its own objective. The first part of the analysis is of paramount importance for the assessment of AFFS as a flood forecasting system, as it focuses on the detection and prediction of flood events. Here, results were verified against reports from various flood archives such as the Dartmouth Flood Observatory, the Emergency Event Database, the NASA Earth Observatory and Reliefweb. The number of hits, false alerts and missed alerts as well as the Probability of Detection, False Alarm Rate and Critical Success Index were determined for various conditions (different regions, flood durations, average amount of annual precipitation, size of affected areas and mean annual discharge). The second part of the analysis complements the first by giving a basic insight into the prediction skill for general streamflow. For this, hydrological predictions were compared against observations at 36 key locations across Africa, and the Continuous Rank Probability Skill Score (CRPSS), the limit of predictability and the reliability were calculated. Results showed that AFFS detected around 70% of the reported flood events correctly. In particular, the system showed good performance in predicting riverine flood events of long duration (> 1 week) and large affected areas (> 10 000 km2) well in advance, whereas AFFS showed limitations for small-scale and short-duration flood events. The forecasts also showed on average good reliability, and the CRPSS helped identify regions to focus on for future improvements. The case study of the flood event in March 2003 in the Sabi Basin (Zimbabwe and Mozambique) illustrated the good performance of AFFS in forecasting the timing and severity of the floods, gave an example of the clear and concise output products, and showed that the system is capable of producing flood warnings even in ungauged river basins. Hence, from a technical perspective, AFFS shows good prospects as an operational system, as it has demonstrated significant potential to contribute to the reduction of flood-related losses in Africa by providing national and international aid organizations with timely medium-range flood forecast information. However, issues related to practical implementation will still need to be investigated.
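
    The categorical verification scores named above (Probability of Detection, False Alarm Rate/Ratio, Critical Success Index) follow directly from the counts of hits, false alerts and missed alerts; the sketch below shows the standard definitions with hypothetical counts, not the AFFS results:

    ```python
    # Sketch of the categorical verification scores mentioned above, computed
    # from a contingency table of forecast/observed flood events. The counts are
    # hypothetical; FAR is computed here as the false-alarm ratio FA/(hits+FA).

    def verification_scores(hits, false_alarms, misses):
        pod = hits / (hits + misses)                    # Probability of Detection
        far = false_alarms / (hits + false_alarms)      # False Alarm Ratio
        csi = hits / (hits + false_alarms + misses)     # Critical Success Index
        return pod, far, csi

    pod, far, csi = verification_scores(hits=35, false_alarms=10, misses=15)
    print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")   # POD=0.70 FAR=0.22 CSI=0.58
    ```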

  8. Mining Rare Events Data for Assessing Customer Attrition Risk

    NASA Astrophysics Data System (ADS)

    Au, Tom; Chin, Meei-Ling Ivy; Ma, Guangqin

    Customer attrition refers to the phenomenon whereby a customer leaves a service provider. As competition intensifies, preventing customers from leaving is a major challenge to many businesses such as telecom service providers. Research has shown that retaining existing customers is more profitable than acquiring new customers, due primarily to savings on acquisition costs, the higher volume of service consumption, and customer referrals. For a large enterprise whose customer base consists of tens of millions of service subscribers, events such as switching to competitors or canceling services are large in absolute number but rare in percentage terms, far less than 5%. Based on a simple random sample, popular statistical procedures, such as logistic regression, tree-based methods and neural networks, can sharply underestimate the probability of rare events and often result in a null model (no significant predictors). To improve efficiency and accuracy of event probability estimation, a case-based data collection technique is then considered. A case-based sample is formed by taking all available events and a small, but representative, fraction of nonevents from a dataset of interest. In this article we show a consistent prior-correction method for event probability estimation and demonstrate the performance of the above data collection technique in predicting customer attrition with actual telecommunications data.
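
    A sketch of the prior-correction idea for case-based sampling, assuming the standard King-Zeng intercept correction (the article's exact estimator may differ); the logistic model, coefficients and rates below are invented for illustration:

    ```python
    import math

    # Sketch of the prior-correction idea for case-based sampling: fit a logistic
    # model on all events plus a subsample of nonevents, then shift the intercept
    # so predicted probabilities refer to the true population event rate tau.
    # (Formula follows the standard King-Zeng prior correction; the numbers are
    # illustrative, not from the article.)

    def corrected_intercept(beta0_sample, tau, ybar_sample):
        """beta0_sample: intercept fitted on the case-based sample
        tau:          true population event fraction (e.g. 0.02 attrition)
        ybar_sample:  event fraction in the case-based sample (e.g. 0.50)"""
        return beta0_sample - math.log(((1.0 - tau) / tau) *
                                       (ybar_sample / (1.0 - ybar_sample)))

    def predict(beta0, beta1, x):
        z = beta0 + beta1 * x
        return 1.0 / (1.0 + math.exp(-z))

    b0 = corrected_intercept(beta0_sample=-0.1, tau=0.02, ybar_sample=0.5)
    print(predict(b0, beta1=0.8, x=1.0))   # population-scale attrition probability
    ```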

  9. Geologic implications of the Apollo 14 Fra Mauro breccias and comparison with ejecta from the Ries Crater, Germany

    USGS Publications Warehouse

    Chao, E.C.T.

    1973-01-01

    On the basis of petrographic, laboratory, and active seismic data for the Fra Mauro breccias, and by comparison with the nature and distribution of the ejecta from the Ries crater, Germany, some tentative conclusions regarding the geologic significance of the Fra Mauro Formation on the moon can be drawn. The Fra Mauro Formation, as a whole, consists of unwelded, porous ejecta, slightly less porous than the regolith. It contains hand-specimen-size and larger clasts of strongly annealed complex breccias, partly to slightly annealed breccias, basalts, and perhaps spherule-rich breccias. These clasts are embedded in a matrix of porous aggregate dominated by mineral and breccia fragments and probably largely free of undevitrified glass. All strongly annealed hand-specimen-size breccias are clasts in the Fra Mauro Formation. To account for the porous, unwelded state of the Fra Mauro Formation, the ejecta must have been deposited at a temperature below that required for welding and annealing. Large boulders probably compacted by the Cone crater event occur near the rim of the crater. They probably consist of a similar suite of fragments, but are probably less porous than the formation. The geochronologic clocks of fragments in the Fra Mauro Formation, with textures ranging from unannealed to strongly annealed, were not reset or strongly modified by the Imbrian event. Strongly annealed breccia clasts and basalt clasts are pre-Imbrian, and probably existed as ejecta mixed with basalt flows in the Imbrium Basin prior to the Imbrian event. The Imbrian event probably occurred between 3.90 or 3.88 and 3.65 b.y. ago.

  10. Design of a High Intensity Turbulent Combustion System

    DTIC Science & Technology

    2015-05-01

    Figure 2.3: Velocity measurement on the nth repetition of a turbulent-flow experiment. The instantaneous velocity is decomposed as u(t) = U + u'(t), and an event such as [U < N m/s] has a probability P; the random variable U can be characterized by its probability density function (PDF). The probability of an event...
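
    A minimal sketch of the probability statement in the excerpt above: characterize the velocity U by a probability density function and evaluate P(U < threshold). The Gaussian assumption and the numbers are illustrative, not taken from the report:

    ```python
    import math

    # Sketch of the idea in the excerpt above: characterize the mean velocity U
    # by a probability density function and compute the probability of an event
    # such as [U < threshold]. A Gaussian PDF and the numbers are assumptions
    # for illustration, not values from the report.

    def prob_below(threshold, mean, std):
        """P(U < threshold) for a normally distributed velocity U."""
        z = (threshold - mean) / (std * math.sqrt(2.0))
        return 0.5 * (1.0 + math.erf(z))

    print(prob_below(threshold=8.0, mean=10.0, std=1.5))   # ~0.09
    ```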

  11. Estimating alarm thresholds and the number of components in mixture distributions

    NASA Astrophysics Data System (ADS)

    Burr, Tom; Hamada, Michael S.

    2012-09-01

    Mixtures of probability distributions arise in many nuclear assay and forensic applications, including nuclear weapon detection, neutron multiplicity counting, and in solution monitoring (SM) for nuclear safeguards. SM data is increasingly used to enhance nuclear safeguards in aqueous reprocessing facilities having plutonium in solution form in many tanks. This paper provides background for mixture probability distributions and then focuses on mixtures arising in SM data. SM data can be analyzed by evaluating transfer-mode residuals defined as tank-to-tank transfer differences, and wait-mode residuals defined as changes during non-transfer modes. A previous paper investigated impacts on transfer-mode and wait-mode residuals of event marking errors which arise when the estimated start and/or stop times of tank events such as transfers are somewhat different from the true start and/or stop times. Event marking errors contribute to non-Gaussian behavior and larger variation than predicted on the basis of individual tank calibration studies. This paper illustrates evidence for mixture probability distributions arising from such event marking errors and from effects such as condensation or evaporation during non-transfer modes, and pump carryover during transfer modes. A quantitative assessment of the sample size required to adequately characterize a mixture probability distribution arising in any context is included.
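
    As a hedged illustration of the mixture setting described above (not the paper's code), the sketch below sets an alarm threshold for residuals drawn from a two-component Gaussian mixture, e.g. routine measurement noise plus a heavier-tailed event-marking-error mode, so that the false-alarm probability equals a chosen alpha; all weights, means and sigmas are assumptions:

    ```python
    import math

    # Illustration (not the paper's code): residuals drawn from a two-component
    # Gaussian mixture, and an alarm threshold set so the false-alarm probability
    # under that mixture equals a chosen alpha. Weights, means and sigmas are
    # invented placeholders.

    def norm_cdf(x, mu, sigma):
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    def mixture_cdf(x, comps):
        return sum(w * norm_cdf(x, mu, s) for w, mu, s in comps)

    def alarm_threshold(comps, alpha, lo=-100.0, hi=100.0):
        """Find t with P(residual > t) = alpha by bisection on the mixture CDF."""
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if 1.0 - mixture_cdf(mid, comps) > alpha:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    components = [(0.9, 0.0, 1.0),    # (weight, mean, sigma): routine residuals
                  (0.1, 0.0, 4.0)]    # heavier-tailed event-marking errors
    print(alarm_threshold(components, alpha=0.001))   # well above the 3.09 a
                                                      # single N(0,1) would give
    ```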

  12. Risk of Cardiovascular Events in Mothers of Women with Polycystic Ovary Syndrome

    PubMed Central

    Cheang, Kai I.; Nestler, John E.; Futterweit, Walter

    2009-01-01

    OBJECTIVE The purpose of this study was to assess the prevalence of cardiovascular events in an older population of women with polycystic ovary syndrome (PCOS). We took advantage of the high heritability of PCOS and determined the probable PCOS status of mothers of women with PCOS. The prevalence of cardiovascular events in PCOS and non-PCOS mothers was determined. METHODS In a single endocrine clinic, 308 women with PCOS were interviewed about their mothers’ medical history, and the mothers themselves were interviewed if available. The interview covered menstrual history, fertility, clinical signs of hyperandrogenism, age at incident cardiovascular event, and age of death as reported by daughters. Presence of PCOS in the mothers was defined as a history of infertility, irregular menses, or clinical signs of hyperandrogenism. A cardiovascular event was defined as fatal or nonfatal myocardial infarction, any coronary intervention, angina requiring emergency room visits, or a cerebrovascular event. RESULTS The mothers were predominantly postmenopausal. Among 182 interviewed (n=157) or deceased (n=25) mothers, 59 had probable PCOS. Cardiovascular events were more common (p=0.011) among PCOS mothers (11/59 or 18.6%) than non-PCOS mothers (5/123 or 4.1%). Adjusted for age and race, probable PCOS was an independent predictor of cardiovascular events (OR 5.41, 95% CI 1.78-16.40). Cardiovascular events occurred at an early age in mothers of PCOS women, particularly mothers with PCOS themselves. CONCLUSION PCOS mothers of women with PCOS are at a higher risk of cardiovascular events compared with non-PCOS mothers, and cardiovascular events appear to occur at an earlier than expected age in PCOS mothers. PMID:19158047
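
    As a worked check of how such an association measure is formed, the sketch below computes the crude odds ratio and Woolf confidence interval from the 2x2 counts reported above (11/59 vs 5/123); note that the paper's OR 5.41 is adjusted for age and race, so this unadjusted calculation is only illustrative:

    ```python
    import math

    # Worked check of how an odds ratio and Woolf confidence interval are formed
    # from the 2x2 counts reported above; the paper's 5.41 is age- and
    # race-adjusted, so this crude calculation is only illustrative.

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a,b = events/non-events in exposed; c,d = events/non-events in unexposed."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo, hi = math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    print(odds_ratio_ci(11, 59 - 11, 5, 123 - 5))   # crude OR ~5.4
    ```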

  13. Analyzing time-ordered event data with missed observations.

    PubMed

    Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P

    2017-09-01

    A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which convey information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance of missing arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated value. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and to spurious rate differences between sites, which are introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event data.
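
    In the spirit of the derivation described above (assumptions, not the authors' exact formula): if each event is detected independently with probability d, an observed interval spans k true intervals with probability d(1-d)^(k-1), so its density is a geometric mixture of k-fold convolutions of the true interval density. With exponential true intervals this reduces to a thinned Poisson process, which gives a built-in consistency check:

    ```python
    import math

    # Sketch: observed-interval density as a geometric mixture of k-fold
    # convolutions of the true interval density, here exponential(rate lam)
    # for illustration, so the k-fold convolution is the Erlang density.

    def observed_interval_pdf(t, lam, d, kmax=50):
        total = 0.0
        erlang = lam * math.exp(-lam * t)          # Erlang(k=1) = exponential pdf
        geom = d                                   # d * (1-d)**(k-1) for k = 1
        for k in range(1, kmax + 1):
            total += geom * erlang
            erlang *= lam * t / k                  # Erlang(k) -> Erlang(k+1)
            geom *= (1.0 - d)
        return total

    # Consistency check: thinning a Poisson process with probability d gives
    # exponential observed intervals with rate d*lam.
    lam, d, t = 2.0, 0.7, 1.3
    print(observed_interval_pdf(t, lam, d))
    print(d * lam * math.exp(-d * lam * t))        # should match
    ```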

  14. Clinical decision rules for termination of resuscitation in out-of-hospital cardiac arrest.

    PubMed

    Sherbino, Jonathan; Keim, Samuel M; Davis, Daniel P

    2010-01-01

    Out-of-hospital cardiac arrest (OHCA) has a low probability of survival to hospital discharge. Four clinical decision rules (CDRs) have been validated to identify patients with no probability of survival. Three of these rules focus on exclusive prehospital basic life support care for OHCA, and two of these rules focus on prehospital advanced life support care for OHCA. Can a CDR for the termination of resuscitation identify a patient with no probability of survival in the setting of OHCA? Six validation studies were selected from a PubMed search. A structured review of each of the studies is presented. In OHCA receiving basic life support care, the BLS-TOR (basic life support termination of resuscitation) rule has a positive predictive value for death of 99.5% (95% confidence interval 98.9-99.8%), and decreases the transportation of all patients by 62.6%. This rule has been appropriately validated for widespread use. In OHCA receiving advanced life support care, no current rule has been appropriately validated for widespread use. The BLS-TOR rule is a simple rule that identifies patients who will not survive OHCA. Further research is required to identify similarly robust CDRs for patients receiving advanced life support care in the setting of OHCA. Copyright 2010 Elsevier Inc. All rights reserved.

  15. Medical management of youth baseball and softball tournaments.

    PubMed

    Kanaan, Matthew; Ray, Tracy R

    2013-01-01

    The medical management of youth baseball and softball tournaments requires both proper planning and a basic awareness of commonly seen sport-specific injuries. While youth sporting events are designed to be a fun experience for all, injuries and emergencies will occur. With proper planning and supplies, the impact of these issues can be minimized. This article outlines some basic principles for the medical personnel that may be involved in youth baseball and softball events.

  16. A framework for performing comparative LCA between repairing flooded houses and construction of dikes in non-stationary climate with changing risk of flooding.

    PubMed

    Hennequin, Thomas; Sørup, Hjalte Jomo Danielsen; Dong, Yan; Arnbjerg-Nielsen, Karsten

    2018-06-13

    Sustainable flood management is a basic societal need. In this article, life cycle assessment is used to compare two ways to maintain the state of a coastal urban area in a changing climate with increasing flood risk. On one side, the construction of a dike, a hard and proactive scenario, is modelled using a bottom-up approach. On the other, the systematic repair of houses flooded by sea surges, a post-disaster measure, is assessed using a Monte Carlo simulation allowing for aleatory uncertainties in predicting future sea level rise and occurrences of extreme events. Two metrics are identified: normalized mean impacts and the probability of the dike being most efficient. The methodology is applied to three case studies in Denmark representing three contrasting areas: Copenhagen, Frederiksværk, and Esbjerg. For all case studies the distribution of the calculated impact of repairing houses is highly right-skewed, which in some cases has implications for the comparative LCA. The results show that, in Copenhagen, the dike scenario is overwhelmingly favorable for the environment, with a 43 times higher impact for repairing houses and only a 0% probability of the repairs being favorable. For Frederiksværk and Esbjerg the corresponding numbers are 5 and 0.9 times and 85% and 32%, respectively. Hence constructing a dike at this point in time is highly recommended in Copenhagen, preferable in Frederiksværk, and probably not recommendable in Esbjerg. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Updating: Learning versus Supposing

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel

    2012-01-01

    Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…

  18. Moments of the Particle Phase-Space Density at Freeze-out and Coincidence Probabilities

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyż, W.; Zalewski, K.

    2005-10-01

    It is pointed out that the moments of phase-space particle density at freeze-out can be determined from the coincidence probabilities of the events observed in multiparticle production. A method to measure the coincidence probabilities is described and its validity examined.

  19. Embedding resilience in the design of the electricity supply for industrial clients

    PubMed Central

    Moura, Márcio das Chagas; Diniz, Helder Henrique Lima; da Cunha, Beatriz Sales; Lins, Isis Didier; Simoni, Vicente Ribeiro

    2017-01-01

    This paper proposes an optimization model, using Mixed-Integer Linear Programming (MILP), to support decisions related to making investments in the design of power grids serving industrial clients that experience interruptions to their energy supply due to disruptive events. In this approach, by considering the probabilities of the occurrence of a set of such disruptive events, the model is used to minimize the overall expected cost by determining an optimal strategy involving pre- and post-event actions. The pre-event actions, which are considered during the design phase, evaluate the resilience capacity (absorption, adaptation and restoration) and are tailored to the context of industrial clients dependent on a power grid. Four cases are analysed to explore the results of different probabilities of the occurrence of disruptions. Moreover, two scenarios, in which the probability of occurrence is lowest but the consequences are most serious, are selected to illustrate the model’s applicability. The results indicate that investments in pre-event actions, if implemented, can enhance the resilience of power grids serving industrial clients because the impacts of disruptions either are experienced only for a short time period or are completely avoided. PMID:29190777

  20. Embedding resilience in the design of the electricity supply for industrial clients.

    PubMed

    Moura, Márcio das Chagas; Diniz, Helder Henrique Lima; Droguett, Enrique López; da Cunha, Beatriz Sales; Lins, Isis Didier; Simoni, Vicente Ribeiro

    2017-01-01

    This paper proposes an optimization model, using Mixed-Integer Linear Programming (MILP), to support decisions related to making investments in the design of power grids serving industrial clients that experience interruptions to their energy supply due to disruptive events. In this approach, by considering the probabilities of the occurrence of a set of such disruptive events, the model is used to minimize the overall expected cost by determining an optimal strategy involving pre- and post-event actions. The pre-event actions, which are considered during the design phase, evaluate the resilience capacity (absorption, adaptation and restoration) and are tailored to the context of industrial clients dependent on a power grid. Four cases are analysed to explore the results of different probabilities of the occurrence of disruptions. Moreover, two scenarios, in which the probability of occurrence is lowest but the consequences are most serious, are selected to illustrate the model's applicability. The results indicate that investments in pre-event actions, if implemented, can enhance the resilience of power grids serving industrial clients because the impacts of disruptions either are experienced only for a short time period or are completely avoided.
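
    A toy stand-in for the decision problem described above (the paper solves a MILP; this brute-force enumeration only illustrates the objective of investment cost plus probability-weighted interruption cost). All options, costs, probabilities and mitigation effects are invented:

    ```python
    from itertools import product

    # Toy enumeration in the spirit of the model described above: choose
    # pre-event resilience investments to minimize capital cost plus the
    # probability-weighted cost of interruptions under disruptive scenarios.

    investments = {            # option: (capital cost, fraction of loss avoided)
        "backup_generation": (50.0, 0.6),
        "hardened_feeder":   (30.0, 0.3),
    }
    scenarios = [               # (probability of disruptive event, baseline loss)
        (0.05, 1000.0),
        (0.01, 8000.0),
    ]

    best = None
    for choice in product([0, 1], repeat=len(investments)):
        capex = sum(c * investments[name][0]
                    for c, name in zip(choice, investments))
        avoided = min(0.95, sum(c * investments[name][1]
                                for c, name in zip(choice, investments)))
        expected_loss = sum(p * loss * (1.0 - avoided) for p, loss in scenarios)
        total = capex + expected_loss
        if best is None or total < best[0]:
            best = (total, choice)

    print(best)   # lowest expected overall cost and the chosen investments
    ```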

  1. Framework for probabilistic flood risk assessment in an Alpine region

    NASA Astrophysics Data System (ADS)

    Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2014-05-01

    Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, and especially for local authorities and insurance companies, in order to estimate possible flood losses. Therefore a framework for assessing flood risk has been developed and is introduced in the presented contribution. Flood risk is thereby defined as the combination of the probability of flood events and of potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of floods. The potential flood damages are determined in the course of a vulnerability assessment, in which the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation, which may lead to a misinterpretation of the flood risk. Within the presented framework the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004), which is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The inputs for this approach are time series derived from river gauging stations. In a next step the historic and synthetic flood events have to be spatially interpolated from the point scale (i.e. river gauges) to the river network. Therefore, topological kriging (Top-kriging) proposed by Skøien et al. (2006) is applied. Top-kriging considers the nested structure of river networks and is therefore suitable for regionalising flood characteristics. Thus, the characteristics of a large number of possible flood events can be transferred to arbitrary locations (e.g. community level) along the river network within a study region. This framework has been used to generate a set of spatially correlated river flood events in the Austrian Federal Province of Vorarlberg. In addition, loss-probability curves for each community have been calculated based on official inundation maps of public authorities, the elements at risk and their vulnerability. One location along the river network within each community serves as the interface between the set of flood events and the individual loss-probability relationships of the communities. Consequently, every flood event from the historic and synthetically generated dataset can be evaluated in monetary terms. Thus, a time series comprising a large number of flood events and their corresponding monetary losses serves as the basis for a probabilistic flood risk assessment. This includes expected annual losses and estimates of extreme event losses which occur over the course of a certain time period.
    The results provide essential decision support for primary insurers, reinsurance companies and public authorities in order to set up scale-adequate risk management.
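
    A sketch of how an event set of the kind described above turns into the reported risk metrics, namely the expected annual loss and a loss-exceedance (loss-probability) relationship; the event list and year count are invented, not Vorarlberg results:

    ```python
    from collections import defaultdict

    # Sketch of turning a simulated flood-event catalogue into risk metrics:
    # expected annual loss and an empirical loss-exceedance curve.
    # The (year, loss in million EUR) pairs below are invented.

    events = [(1, 2.0), (1, 0.5), (3, 10.0), (7, 1.2), (9, 35.0), (9, 3.0)]
    n_years = 10          # number of simulated years in the synthetic catalogue

    annual_loss = defaultdict(float)
    for year, loss in events:
        annual_loss[year] += loss
    losses = [annual_loss.get(y, 0.0) for y in range(1, n_years + 1)]

    eal = sum(losses) / n_years
    print(f"expected annual loss: {eal:.2f} MEUR")

    # loss exceeded with annual probability p (empirical exceedance curve)
    for p in (0.5, 0.2, 0.1):
        k = int(p * n_years)                      # number of years allowed above
        threshold = sorted(losses, reverse=True)[k]
        print(f"loss exceeded with annual probability {p}: {threshold:.1f} MEUR")
    ```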

  2. Trial type probability modulates the cost of antisaccades

    PubMed Central

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  3. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has risen dramatically. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study of a lightweight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two elements: the Bayesian network of the hazard-promoting factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
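
    A sketch of the SLIM step mentioned above, assuming the usual log-linear calibration log10(HEP) = a*SLI + b anchored by two tasks of known error probability; the ratings, weights and anchor values are invented, not the case-study inputs:

    ```python
    import math

    # Sketch of the SLIM step: performance-shaping-factor ratings are combined
    # into a Success Likelihood Index (SLI), which is mapped to a human error
    # probability through the usual log-linear calibration log10(HEP) = a*SLI + b.
    # Ratings, weights and anchor tasks are invented.

    def sli(ratings, weights):
        return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

    def calibrate(sli1, hep1, sli2, hep2):
        a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
        b = math.log10(hep1) - a * sli1
        return a, b

    a, b = calibrate(sli1=9.0, hep1=1e-4, sli2=2.0, hep2=1e-1)   # anchor tasks
    ratings = [6, 4, 7, 5]            # e.g. training, stress, procedures, time
    weights = [0.3, 0.2, 0.3, 0.2]
    hep = 10 ** (a * sli(ratings, weights) + b)
    print(f"estimated human error probability: {hep:.4f}")
    ```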

  4. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    NASA Technical Reports Server (NTRS)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
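
    A minimal sketch of the zero-inflated Beta density at the core of the model described above: a point mass at zero with probability pi, otherwise a Beta(alpha, beta) value on (0, 1) for the scaled log10 collision probability. The hierarchical Bayesian mixed-model layers are omitted and the parameter values are arbitrary:

    ```python
    import math

    # Minimal sketch of the zero-inflated Beta log-density: a point mass at zero
    # with probability pi, otherwise a Beta(alpha, beta) draw on (0, 1).
    # Parameter values and data are arbitrary placeholders.

    def zib_logpdf(y, pi, alpha, beta):
        if y == 0.0:
            return math.log(pi)
        log_beta_fn = math.lgamma(alpha) + math.lgamma(beta) - math.lgamma(alpha + beta)
        return (math.log(1.0 - pi)
                + (alpha - 1.0) * math.log(y)
                + (beta - 1.0) * math.log(1.0 - y)
                - log_beta_fn)

    data = [0.0, 0.0, 0.12, 0.35, 0.61, 0.08]       # scaled observations in [0, 1)
    loglik = sum(zib_logpdf(y, pi=0.3, alpha=2.0, beta=5.0) for y in data)
    print(loglik)
    ```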

  5. Using Atmospheric Circulation Patterns to Detect and Attribute Changes in the Risk of Extreme Climate Events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.; Horton, D. E.; Singh, D.; Swain, D. L.; Touma, D. E.; Mankin, J. S.

    2015-12-01

    Because of the high cost of extreme events and the growing evidence that global warming is likely to alter the statistical distribution of climate variables, detection and attribution of changes in the probability of extreme climate events has become a pressing topic for the scientific community, elected officials, and the public. While most of the emphasis has thus far focused on analyzing the climate variable of interest (most often temperature or precipitation, but also flooding and drought), there is an emerging emphasis on applying detection and attribution analysis techniques to the underlying physical causes of individual extreme events. This approach is promising in part because the underlying physical causes (such as atmospheric circulation patterns) can in some cases be more accurately represented in climate models than the more proximal climate variable (such as precipitation). In addition, and more scientifically critical, is the fact that the most extreme events result from a rare combination of interacting causes, often referred to as "ingredients". Rare events will therefore always have a strong influence of "natural" variability. Analyzing the underlying physical mechanisms can therefore help to test whether there have been changes in the probability of the constituent conditions of an individual event, or whether the co-occurrence of causal conditions cannot be distinguished from random chance. This presentation will review approaches to applying detection/attribution analysis to the underlying physical causes of extreme events (including both "thermodynamic" and "dynamic" causes), and provide a number of case studies, including the role of frequency of atmospheric circulation patterns in the probability of hot, cold, wet and dry events.

  6. Heavy-tailed distribution of cyber-risks

    NASA Astrophysics Data System (ADS)

    Maillart, T.; Sornette, D.

    2010-06-01

    With the development of the Internet, new kinds of massive epidemics, distributed attacks, virtual conflicts and criminality have emerged. We present a study of some striking statistical properties of cyber-risks that quantify the distribution and time evolution of information risks on the Internet, in order to understand their mechanisms and create opportunities to mitigate, control, predict and insure them at a global scale. First, we report an exceptionally stable power-law tail distribution of personal identity losses per event, Pr(ID loss ≥ V) ~ 1/V^b, with b = 0.7 ± 0.1. This result is robust against a surprisingly strong non-stationary growth of ID losses culminating in July 2006, followed by a more stationary phase. Moreover, this distribution is identical for different types and sizes of targeted organizations. Since b < 1, the cumulative number of all losses over all events up to time t increases faster than linearly with time, according to ≃ t^(1/b), suggesting that privacy, characterized by personal identities, is necessarily becoming more and more insecure. We also show the existence of a size effect, such that the largest possible ID losses per event grow faster than linearly, as ~S^(1.3), with the organization size S. The small value b ≃ 0.7 of the power-law distribution of ID losses is explained by the interplay between Zipf’s law and the size effect. We also infer that compromised entities exhibit basically the same probability of incurring a small or large loss.
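
    A tail exponent such as the reported b ≃ 0.7 can be estimated with the Hill (maximum-likelihood) estimator over losses above a threshold; the sketch below applies it to a synthetic Pareto sample, purely as an illustration of the estimator, not a re-analysis of the cyber-risk data:

    ```python
    import math
    import random

    # Sketch of tail-exponent estimation with the Hill (maximum-likelihood)
    # estimator, b_hat = k / sum(log(x_i / x_min)) over the k losses >= x_min.
    # The synthetic Pareto sample below is only for illustration.

    def hill_estimator(losses, x_min):
        tail = [x for x in losses if x >= x_min]
        return len(tail) / sum(math.log(x / x_min) for x in tail)

    random.seed(1)
    b_true = 0.7
    sample = [(1.0 - random.random()) ** (-1.0 / b_true) for _ in range(20000)]  # Pareto(b)
    print(hill_estimator(sample, x_min=1.0))   # close to 0.7
    ```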

  7. 77 FR 40081 - Gulf of Mexico, Outer Continental Shelf (OCS), Western Planning Area (WPA) and Central Planning...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... environmental impacts resulting from the Deepwater Horizon event, BOEM conducted an extensive search for... events, including a low-probability catastrophic event associated with a proposed lease sale and a...

  8. Improbable Outcomes: Infrequent or Extraordinary?

    ERIC Educational Resources Information Center

    Teigen, Karl Halvor; Juanchich, Marie; Riege, Anine H.

    2013-01-01

    Research on verbal probabilities has shown that "unlikely" or "improbable" events are believed to correspond to numerical probability values between 10% and 30%. However, building on a pragmatic approach of verbal probabilities and a new methodology, the present paper shows that unlikely outcomes are most often associated with outcomes that have a…

  9. A model of human decision making in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1982-01-01

    Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
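
    A hedged sketch of the discriminant-analysis idea in the model: estimate class-conditional Gaussians for "event" and "no event" from labelled observations of a monitored process feature and convert a new observation into a probability that an event has occurred. The feature, data and prior are invented, not the study's processes:

    ```python
    import math

    # Sketch of a simple Gaussian discriminant used to generate event-probability
    # estimates: fit class-conditional Gaussians from labelled training data and
    # apply Bayes' rule to a new observation. Data and prior are invented.

    def fit_gaussian(xs):
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)
        return mu, var

    def normal_pdf(x, mu, var):
        return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    no_event = [0.1, -0.3, 0.2, 0.0, -0.1, 0.3]
    event    = [1.8, 2.2, 1.5, 2.6, 2.0, 1.9]
    prior_event = 0.2                          # events are relatively rare

    mu0, v0 = fit_gaussian(no_event)
    mu1, v1 = fit_gaussian(event)

    def p_event(x):
        num = prior_event * normal_pdf(x, mu1, v1)
        den = num + (1.0 - prior_event) * normal_pdf(x, mu0, v0)
        return num / den

    print(p_event(0.2), p_event(1.6))   # low vs. high probability an event occurred
    ```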

  10. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries, so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
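
    A toy illustration of the importance-sampling estimator described above, applied to a small Gaussian tail probability rather than neutron transport: samples are drawn from a shifted (tilted) density and weighted by the likelihood ratio between the true and simulated densities, which keeps the estimator unbiased:

    ```python
    import math
    import random

    # Toy importance-sampling example: estimate P(X > c) for X ~ N(0,1) by
    # sampling from the shifted density N(c,1) and weighting each sample by the
    # likelihood ratio N(0,1)/N(c,1). Not the dissertation's transport code.

    def importance_estimate(c, n, seed=0):
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            y = rng.gauss(c, 1.0)                      # sample from the biased density
            if y > c:
                lr = math.exp(-c * y + 0.5 * c * c)    # N(0,1) pdf / N(c,1) pdf
                total += lr
        return total / n

    c = 5.0
    print(importance_estimate(c, 100_000))             # ~2.9e-7
    print(0.5 * math.erfc(c / math.sqrt(2.0)))         # exact tail for comparison
    ```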

  11. Practices and Problems of Adult Basic Education in Rural Areas.

    ERIC Educational Resources Information Center

    Richardson, E. Gordon

    The percentages of adults needing adult basic education (ABE) programs in rural areas may not differ from those found in metropolitan areas, but the delivery of the system may be different. For example, the rural ABE teaching staff probably will be recruited from the ranks of the regular elementary or high school teachers to teach at night also,…

  12. Robustness of Multiple Objective Decision Analysis Preference Functions

    DTIC Science & Technology

    2002-06-01

    p, p′: the probability of some event. p_i, q_i: the probability of event i. Π: an aggregation of proportional data used in calculating a test statistic. ...statistical tests of the significance of the term, conducted in a multivariate framework rather than the ROSA univariate approach. ...the residual error is e = y − ŷ (Eq. 45). The coefficient provides a ready indicator of the contribution of the associated variable and statistical tests

  13. [Impact of water pollution risk in water transfer project based on fault tree analysis].

    PubMed

    Liu, Jian-Chang; Zhang, Wei; Wang, Li-Min; Li, Dai-Qing; Fan, Xiu-Ying; Deng, Hong-Bing

    2009-09-15

    Methods to assess the water pollution risk of medium water transfer projects are gradually being explored. The event-nature-proportion method was developed to evaluate the probability of a single event. Fault tree analysis, built on the single-event calculations, was employed to evaluate the overall water pollution risk for the channel water body. The results indicate that the risk posed by pollutants from towns and villages along the line of the water transfer project is high, with a probability of 0.373, which would increase pollution of the channel water body by 64.53 mg/L COD, 4.57 mg/L NH4(+)-N and 0.066 mg/L volatile phenol (hydroxybenzene), respectively. Estimating fault probabilities on the basis of the proportion method proves useful for assessing water pollution risk under considerable uncertainty.
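
    The fault-tree arithmetic underlying this kind of top-event estimate combines independent basic-event probabilities through AND and OR gates; the sketch below shows those standard gate formulas with placeholder probabilities, not the values from this study:

    ```python
    from functools import reduce

    # Sketch of basic fault-tree arithmetic for independent basic events:
    # an AND gate multiplies probabilities; an OR gate is the complement of
    # all inputs failing to occur. Probabilities below are placeholders.

    def and_gate(probs):
        return reduce(lambda a, b: a * b, probs, 1.0)

    def or_gate(probs):
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

    # e.g. pollution reaches the channel if any source discharges (OR), and a
    # source contributes only if a release occurs AND treatment fails (AND)
    town_source    = and_gate([0.40, 0.50])    # release occurs, treatment fails
    village_source = and_gate([0.30, 0.60])
    top_event = or_gate([town_source, village_source])
    print(top_event)
    ```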

  14. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity for predicting whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for the control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of the corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks agree with the analytical results even without those assumptions, reinforcing that the relationships may always exist and posing the mathematical problem of proving their existence in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. Research findings may improve understanding of thresholds for disease persistence in order to control vector-borne diseases.
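
    For orientation, the link between a reproduction number and an extinction probability can be sketched with the well-mixed branching-process caricature, where the extinction probability of a single introduction with Poisson(R0) secondary cases is the smallest root of q = exp(R0(q-1)); this ignores the network structure analysed in the paper and is only an assumed baseline:

    ```python
    import math

    # Branching-process caricature of the R0/extinction link: for Poisson(R0)
    # secondary cases, the extinction probability of one introduction is the
    # smallest root of q = exp(R0 * (q - 1)), found here by fixed-point iteration
    # starting below the root.

    def extinction_probability(r0, tol=1e-12):
        q = 0.0
        while True:
            q_new = math.exp(r0 * (q - 1.0))
            if abs(q_new - q) < tol:
                return q_new
            q = q_new

    for r0 in (0.8, 1.5, 2.5):
        print(r0, extinction_probability(r0))   # 1.0 when R0 <= 1, < 1 otherwise
    ```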

  15. Post-wildfire natural restoration of riparian vegetation under stable hydro-geomorphic conditions: Nahal Grar, Northern Negev Desert, Israel

    NASA Astrophysics Data System (ADS)

    Egozi, Roey

    2015-04-01

    Wildfires are common in the Mediterranean region due to its pronounced dry season and long history of anthropogenic activity. Most post-wildfire studies focus on mountainous areas and thus address the hill-slope and its physical characteristics, e.g. morphology, length, angles, and aspect; its soil characteristics, e.g. type, infiltration rate, repellency; and its vegetative cover, e.g. planted trees vs. natural forest or native vs. exotic vegetation. In contrast, there is very limited literature on the ecological and hydro-geomorphic aspects of post-wildfire riparian vegetation and riparian zones, probably because of their negligible burned area relative to the spread of the fire, which sometimes covers the whole watershed. The limited literature on the topic is surprising given that the riparian vegetation zone has been acknowledged as a unique and important habitat supporting rich biodiversity. Herein we report on a wildfire event that occurred on October 14th 2009 in a river section of Nahal Grar, Northern Negev Desert, Israel. Although the wildfire was limited in area (only 3 hectares), it extended over the channel alone, from bank to bank, and thus provides a unique case study of a complete burn-down of riparian vegetation, mainly dense stands of common reed (Phragmites australis). A detailed study of this event therefore provides an opportunity to tackle one of the basic questions, namely determining the rate of the natural restoration processes that act immediately after the wildfire event. This type of information is most valuable to professionals and stakeholders for better management of post-fire riparian zones. The results of the study suggest that under stable conditions, i.e. with no major flood events, disturbance time was short, on the order of 200 days, owing to the almost immediate recovery of the riparian vegetation. However, the re-growth of the riparian vegetation was not even but rather differential and more complex than reported in the literature. In addition, during that period no morphological changes were measured in the channel bed and banks; similarly, no changes were observed in base flow discharge, though slight changes were measured in water pH, probably due to the large quantities of ash on the river bed.

  16. An Examination of the Levels of Cognitive Demand Required by Probability Tasks in Middle Grades Mathematics Textbooks

    ERIC Educational Resources Information Center

    Jones, Dustin L.; Tarr, James E.

    2007-01-01

    We analyze probability content within middle grades (6, 7, and 8) mathematics textbooks from a historical perspective. Two series, one popular and the other alternative, from four recent eras of mathematics education (New Math, Back to Basics, Problem Solving, and Standards) were analyzed using the Mathematical Tasks Framework (Stein, Smith,…

  17. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (a complex of physical conditions, that is to say, a specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
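
    For orientation, the "interference of probabilities" referred to above is usually written as a cosine-perturbed law of total probability. The display below is a sketch of that formula under assumed notation (two complementary contexts C1, C2 within a context C); it is offered for intuition only and is not quoted from the cited paper.

```latex
% Sketch (notation assumed): contextual "interference" correction to the
% classical formula of total probability.
P(A \mid C) = P(C_1 \mid C)\,P(A \mid C_1) + P(C_2 \mid C)\,P(A \mid C_2)
            + 2\cos\theta\,\sqrt{P(C_1 \mid C)\,P(A \mid C_1)\,P(C_2 \mid C)\,P(A \mid C_2)}
```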

  18. Assessing Aircraft Supply Air to Recommend Compounds for Timely Warning of Contamination

    NASA Astrophysics Data System (ADS)

    Fox, Richard B.

    Taking aircraft out of service for even one day to correct fume-in-cabin events can cost the industry roughly $630 million per year in lost revenue. This quantitative correlation study investigated relationships between measured concentrations of contaminants in bleed air and the probability of odor detectability. Data were collected from 94 aircraft engine and auxiliary power unit (APU) bleed air tests in an archival data set spanning 1997 to 2011, and initially no relationships were found. Pearson correlation was then followed by regression analysis for individual contaminants. Significant relationships between the concentrations of compounds in bleed air and the probability of odor detectability were found (p<0.05), as well as between compound concentration and the probability of sensory irritancy detectability. Study results may be useful for establishing early warning levels. Predictive trend monitoring, a method to identify potentially pending failure modes within a mechanical system, may allow scheduled down-time for maintenance as a planned event, rather than repair after a mechanical failure, and thereby reduce operational costs associated with odor-in-cabin events. Twenty compounds (independent variables) were found statistically significant in relation to the probability of odor detectability (dependent variable 1). Seventeen compounds (independent variables) were found statistically significant in relation to the probability of sensory irritancy detectability (dependent variable 2). Additional research was recommended to further investigate relationships between concentrations of contaminants and the probability of odor detectability or sensory irritancy detectability for all turbine oil brands. Further research on the implementation of predictive trend monitoring may be warranted to demonstrate how the monitoring process might be applied in flight.

  19. A novel approach for predicting the response of the spectrometer for INTEGRAL satellite.

    PubMed

    Kshetri, R

    2013-05-01

    A basic phenomenological approach has been presented in three recent papers (Kshetri R., 2012. JINST 7, P04008; Kshetri R., 2012. JINST 7, P07006; Kshetri R., 2012. JINST 7, P12007) for understanding the operation of encapsulated-type composite detectors, including the SPI spectrometer. In the present paper, we have considered the fact that the experimental two-fold events between two detectors include the three- and higher-fold events between the same two detectors. The formalism has been further developed, and the peak-to-total ratio of a general composite detector is predicted for energy regions with no direct experimental information. At 8 MeV, the peak-to-total ratios for the SPI spectrometer and a very large detector (comprising an infinite number of single HPGe modules) are found to be 9% and 12%, respectively. The predictions for the fold distribution of the SPI spectrometer are found to be in agreement with experimental data. Our formulation does not include ad hoc fits, but expressions that are justifiable by probability flow arguments. Instead of using an empirical method or simulation, we present a novel approach for calculating the peak-to-total ratio of the SPI spectrometer for high gamma energies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Hepatitis disease detection using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents hepatitis disease diagnosis using Bayesian theory, for a better understanding of the theory. In this research, we used Bayesian theory for detecting hepatitis disease and displaying the result of the diagnosis process. Bayesian theory, rediscovered and perfected by Laplace, has as its basic idea the use of known prior probabilities and conditional probability densities to calculate, via Bayes' theorem, the corresponding posterior probability, which is then used for inference and decision making. Bayesian methods combine existing knowledge, the prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever and headache, and we compute the probability of hepatitis given the presence of malaise, fever, and headache. The results revealed that Bayesian theory successfully identified the existence of hepatitis disease.
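
    The posterior calculation described above can be illustrated with a minimal naive-Bayes sketch. The prior and the conditional probabilities below are hypothetical placeholders chosen for illustration, not clinical values or the figures used by the authors.

```python
# Minimal sketch of the Bayes update described above. The prior and the
# conditional probabilities are hypothetical placeholders, not clinical values.
prior_hepatitis = 0.05                 # P(hepatitis) before seeing symptoms

# P(symptom | hepatitis) and P(symptom | no hepatitis), assumed independent
likelihoods = {                        # symptom: (P(s | H), P(s | not H))
    "malaise":  (0.80, 0.20),
    "fever":    (0.70, 0.15),
    "headache": (0.60, 0.30),
}

p_h, p_not_h = prior_hepatitis, 1.0 - prior_hepatitis
for p_given_h, p_given_not_h in likelihoods.values():
    p_h *= p_given_h                   # numerator terms of Bayes' theorem
    p_not_h *= p_given_not_h

posterior = p_h / (p_h + p_not_h)      # normalise over the two hypotheses
print(f"P(hepatitis | malaise, fever, headache) = {posterior:.3f}")
```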

  1. First Volcanological-Probabilistic Pyroclastic Density Current and Fallout Hazard Map for Campi Flegrei and Somma Vesuvius Volcanoes.

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2005-05-01

    Integrated volcanological-probabilistic approaches have been used to simulate pyroclastic density currents and fallout and to produce hazard maps for the Campi Flegrei and Somma Vesuvius areas. On the basis of analyses of all types of pyroclastic flows, surges, secondary pyroclastic density currents and fallout events that occurred in the volcanological history of the two volcanic areas, and of the evaluation of the probability of each type of event, matrices of input parameters for numerical simulation have been constructed. The multi-dimensional input matrices include the main parameters controlling pyroclast transport, deposition and dispersion, as well as the set of possible eruptive vents used in the simulation program. The probabilistic hazard maps provide, for each point of the Campanian area, the yearly probability of being affected by a given event of given intensity and the resulting damage. Probabilities of a few events per thousand years are typical of most areas within roughly 10 km of the volcanoes, including Naples. The results provide constraints for emergency plans in the Neapolitan area.

  2. STRIDE: Species Tree Root Inference from Gene Duplication Events.

    PubMed

    Emms, David M; Kelly, Steven

    2017-12-01

    The correct interpretation of any phylogenetic tree is dependent on that tree being correctly rooted. We present STRIDE, a fast, effective, and outgroup-free method for identification of gene duplication events and species tree root inference in large-scale molecular phylogenetic analyses. STRIDE identifies sets of well-supported in-group gene duplication events from a set of unrooted gene trees, and analyses these events to infer a probability distribution over an unrooted species tree for the location of its root. We show that STRIDE correctly identifies the root of the species tree in multiple large-scale molecular phylogenetic data sets spanning a wide range of timescales and taxonomic groups. We demonstrate that the novel probability model implemented in STRIDE can accurately represent the ambiguity in species tree root assignment for data sets where information is limited. Furthermore, application of STRIDE to outgroup-free inference of the origin of the eukaryotic tree resulted in a root probability distribution that provides additional support for leading hypotheses for the origin of the eukaryotes. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  3. Exaggerated risk: prospect theory and probability weighting in risky choice.

    PubMed

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-11-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and Kahneman's (1992) prospect theory, we found that the weighting function required to model precautionary decisions differed from that required for monetary gambles. This result indicates a failure of the descriptive invariance axiom of expected utility theory. For precautionary decisions, people overweighted small, medium-sized, and moderately large probabilities-they exaggerated risks. This effect is not anticipated by prospect theory or experience-based decision research (Hertwig, Barron, Weber, & Erev, 2004). We found evidence that exaggerated risk is caused by the accessibility of events in memory: The weighting function varies as a function of the accessibility of events. This suggests that people's experiences of events leak into decisions even when risk information is explicitly provided. Our findings highlight a need to investigate how variation in decision content produces variation in preferences for risk.

  4. Assessment of a Tsunami Hazard for Mediterranean Coast of Egypt

    NASA Astrophysics Data System (ADS)

    Zaytsev, Andrey; Babeyko, Andrey; Yalciner, Ahmet; Pelinovsky, Efim

    2017-04-01

    An analysis of the tsunami hazard for Egypt based on historical data and numerical modelling of historical and prognostic events is given. There are 13 historical events over 4000 years, including one instrumental record (1956). The tsunami database includes 12 earthquake tsunamis and 1 event of volcanic origin (the Santorini eruption). The tsunami intensity of the events of 365, 881, 1303 and 1870 is estimated as I = 3, corresponding to tsunami wave heights of more than 6 m. Numerical simulation of some possible scenarios of tsunamis of seismic and landslide origin is performed with the NAMI-DANCE software, which solves the shallow-water equations. The PTHA method (Probabilistic Tsunami Hazard Assessment) for the Mediterranean Sea developed in (Sorensen M.B., Spada M., Babeyko A., Wiemer S., Grunthal G. Probabilistic tsunami hazard in the Mediterranean Sea. J. Geophysical Research, 2012, vol. 117, B01305) is used to evaluate the probability of tsunami occurrence on the Egyptian coast. The synthetic catalogue of prognostic tsunamis of seismic origin with magnitude greater than 6.5 includes 84,920 events over 100,000 years. For wave heights above 1 m, the exceedance probability versus tsunami height curve can be approximated by a two-parameter exponential Gumbel function, with the parameters determined for each coastal location in Egypt (24 points in total). Prognostic extreme events with probability less than 10^-4 (approximately 10 events) do not fit the Gumbel function and require special analysis. Acknowledgements: This work was supported by EU FP7 ASTARTE Project [603839] and, for EP, by NS6637.2016.5.
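
    To make the two-parameter Gumbel approximation concrete, the sketch below fits a Gumbel-form exceedance curve to a few synthetic (wave height, exceedance probability) points. The data points, starting values and the 6 m query are assumptions for illustration; they are not taken from the cited catalogue.

```python
# Minimal sketch: approximating an exceedance-probability vs. wave-height curve
# with a two-parameter Gumbel form. The "observed" points are synthetic
# placeholders, not values from the cited study.
import numpy as np
from scipy.optimize import curve_fit

def gumbel_exceedance(h, mu, beta):
    """P(height > h) for a Gumbel (type I extreme value) distribution."""
    return 1.0 - np.exp(-np.exp(-(h - mu) / beta))

heights = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # metres
p_exc   = np.array([0.30, 0.12, 0.05, 0.02, 0.008])    # synthetic exceedance probabilities

(mu, beta), _ = curve_fit(gumbel_exceedance, heights, p_exc, p0=(1.0, 1.0))
print(f"fitted mu = {mu:.2f} m, beta = {beta:.2f} m")
print(f"P(height > 6 m) ~ {gumbel_exceedance(6.0, mu, beta):.4f}")
```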

  5. A testable model of earthquake probability based on changes in mean event size

    NASA Astrophysics Data System (ADS)

    Imoto, Masajiro

    2003-02-01

    We studied changes in mean event size using data on microearthquakes obtained from a local network in Kanto, central Japan, from the viewpoint that the mean event size tends to increase as the critical point is approached. A parameter describing these changes was defined using a simple weighted average procedure. In order to obtain the distribution of the parameter in the background, we surveyed values of the parameter from 1982 to 1999 in a 160 × 160 × 80 km volume. The 16 events of M5.5 or larger in this volume were selected as target events. The conditional distribution of the parameter was estimated from the 16 values, each of which refers to the value immediately prior to a target event. The background distribution is symmetric, with its center corresponding to no change in b value. In contrast, the conditional distribution exhibits an asymmetric feature, tending toward a decrease in the b value. The difference between the distributions of the two groups was significant and provided us with a hazard function for estimating earthquake probabilities. Comparing the hazard function with a Poisson process, we obtained an Akaike Information Criterion (AIC) reduction of 24. This reduction agrees closely with the probability gains of 2-4 found in a retrospective study. A successful example of the proposed model can be seen in the earthquake of 3 June 2000, which is the only event during the period of prospective testing.

  6. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
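
    The original program is in BASIC; as a present-day analogue, the sketch below draws samples from the same seven distributions with NumPy. The parameter values are arbitrary examples, and the Pascal distribution is generated as a negative binomial, its usual modern name.

```python
# Sketch of a modern analogue to a seven-distribution random-variate generator.
# Parameter values are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(seed=42)

variates = {
    "uniform":     rng.uniform(low=0.0, high=1.0, size=5),
    "exponential": rng.exponential(scale=2.0, size=5),
    "normal":      rng.normal(loc=0.0, scale=1.0, size=5),
    "binomial":    rng.binomial(n=10, p=0.3, size=5),
    "poisson":     rng.poisson(lam=4.0, size=5),
    "pascal":      rng.negative_binomial(n=3, p=0.4, size=5),  # Pascal = negative binomial
    "triangular":  rng.triangular(left=0.0, mode=2.0, right=5.0, size=5),
}

for name, sample in variates.items():
    print(f"{name:12s} {np.round(sample, 3)}")
```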

  7. Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought

    NASA Astrophysics Data System (ADS)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.

    2017-10-01

    Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
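
    The factual-versus-counterfactual comparison described above is commonly summarized by a probability ratio and a fraction of attributable risk. The sketch below shows that arithmetic with hypothetical ensemble counts; the numbers are placeholders, not results from the cited study.

```python
# Minimal sketch of standard event-attribution metrics: the probability ratio
# PR = p1/p0 and the fraction of attributable risk FAR = 1 - p0/p1, where p1 is
# the event's occurrence probability in the factual climate and p0 its
# probability in the counterfactual climate. Counts below are hypothetical.
n_factual_runs, n_factual_events = 1000, 80              # event occurs in 80 runs
n_counterfactual_runs, n_counterfactual_events = 1000, 50

p1 = n_factual_events / n_factual_runs
p0 = n_counterfactual_events / n_counterfactual_runs

probability_ratio = p1 / p0
far = 1.0 - p0 / p1
print(f"PR = {probability_ratio:.2f}, FAR = {far:.2f}")
```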

  8. A quantum measure of the multiverse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilenkin, Alexander, E-mail: vilenkin@cosmos.phy.tufts.edu

    2014-05-01

    It has been recently suggested that probabilities of different events in the multiverse are given by the frequencies at which these events are encountered along the worldline of a geodesic observer (the ''watcher''). Here I discuss an extension of this probability measure to quantum theory. The proposed extension is gauge-invariant, as is the classical version of this measure. Observations of the watcher are described by a reduced density matrix, and the frequencies of events can be found using the decoherent histories formalism of Quantum Mechanics (adapted to open systems). The quantum watcher measure makes predictions in agreement with the standard Born rule of QM.

  9. Liquefaction Hazard Maps for Three Earthquake Scenarios for the Communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale, Northern Santa Clara County, California

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2008-01-01

    Maps showing the probability of surface manifestations of liquefaction in the northern Santa Clara Valley were prepared with liquefaction probability curves. The area includes the communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale. The probability curves were based on complementary cumulative frequency distributions of the liquefaction potential index (LPI) for surficial geologic units in the study area. LPI values were computed with extensive cone penetration test soundings. Maps were developed for three earthquake scenarios: an M7.8 on the San Andreas Fault comparable to the 1906 event, an M6.7 on the Hayward Fault comparable to the 1868 event, and an M6.9 on the Calaveras Fault. Ground motions were estimated with the Boore and Atkinson (2008) attenuation relation. Liquefaction is predicted for all three events in young Holocene levee deposits along the major creeks. Liquefaction probabilities are highest for the M7.8 earthquake, ranging from 0.33 to 0.37 if a 1.5-m deep water table is assumed, and 0.10 to 0.14 if a 5-m deep water table is assumed. Liquefaction probabilities of the other surficial geologic units are less than 0.05. Probabilities for the scenario earthquakes are generally consistent with observations during historical earthquakes.

  10. ERP effects and perceived exclusion in the Cyberball paradigm: Correlates of expectancy violation?

    PubMed

    Weschke, Sarah; Niedeggen, Michael

    2015-10-22

    A virtual ball-tossing game called Cyberball has allowed the identification of neural structures involved in the processing of social exclusion using neurocognitive methods. However, there is still an ongoing debate about whether the structures involved are pain- or exclusion-specific or part of a broader network. In electrophysiological Cyberball studies we have shown that the P3b component is sensitive to exclusion manipulations, possibly modulated by the probability of ball possession of the participant (event "self") or the presumed co-players (event "other"). Since it is known from oddball studies that the P3b is modulated not only by the objective probability of an event but also by subjective expectancy, we independently manipulated the probability of the events "self" and "other" and the expectancy for these events. Questionnaire data indicate that social need threat is only induced when the expectancy for involvement in the ball-tossing game is violated. Similarly, the P3b amplitude of both "self" and "other" events was a correlate of expectancy violation. We conclude that both the subjective report of exclusion and the P3b effect induced in the Cyberball paradigm are primarily based on a cognitive process sensitive to expectancy violations, and that the P3b is not related to the activation of an exclusion-specific neural alarm system. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Analyzing phenological extreme events over the past five decades in Germany

    NASA Astrophysics Data System (ADS)

    Schleip, Christoph; Menzel, Annette; Estrella, Nicole; Graeser, Philipp

    2010-05-01

    As climate change may alter the frequency and intensity of extreme temperatures, we analysed whether the warming of the last five decades has already changed the statistics of phenological extreme events. In this context, two extreme value statistical concepts are discussed and applied to existing phenological datasets of the German Weather Service (DWD) in order to derive probabilities of occurrence for extremely early or late phenological events. We analyse four phenological groups, "beginning of flowering", "leaf foliation", "fruit ripening" and "leaf colouring", as well as the DWD indicator phases of the "phenological year". Additionally, we place an emphasis on a between-species analysis: a comparison of differences in extreme onsets between three common northern conifers. Furthermore, we conducted a within-species analysis with different phases of horse chestnut throughout a year. The first statistical approach fits the data to a Gaussian model using traditional statistical techniques and then analyses the extreme quantile. The key point of this approach is the adoption of an appropriate probability density function (PDF) for the observed data and the assessment of the change of the PDF parameters in time. The full analytical description in terms of the estimated PDF for defined time steps of the observation period allows probability assessments of extreme values for, e.g., annual or decadal time steps. Related to this approach is the possibility of counting the onsets which fall into our defined extreme percentiles. The estimation of the probability of extreme events on the basis of the whole data set contrasts with analyses using the generalized extreme value (GEV) distribution. The second approach deals with the extreme PDFs themselves and fits the GEV distribution to annual minima of phenological series to provide useful estimates of return levels. For flowering and leaf unfolding phases, exceptionally early extremes are seen since the mid-1980s, and especially in the single years 1961, 1990 and 2007, whereas exceptionally late events are seen in the year 1970. Summer phases such as fruit ripening exhibit stronger shifts towards early extremes than spring phases. Leaf colouring phases reveal an increasing probability of late extremes. The GEV-estimated 100-year events for Picea, Pinus and Larix amount to extremely early events of about -27, -31.48 and -32.79 days, respectively. If we assume non-stationary minimum data, we get a more extreme 100-year event of about -35.40 days for Picea, but associated with wider confidence intervals. The GEV is simply another probability distribution, but for the purposes of extreme analysis in phenology it should be considered as equally important as (if not more important than) the Gaussian PDF approach.
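
    For readers unfamiliar with the second approach, the sketch below fits a GEV to annual minima (earliest-onset anomalies in days, generated synthetically here) using the usual trick of fitting maxima of the negated series, and then reads off a 100-year return level. The data and parameters are assumptions for illustration, not the DWD series used in the study.

```python
# Minimal sketch: fitting a GEV to annual minima (earliest onset anomalies, in
# days) and estimating a 100-year early event. The "observations" are synthetic
# placeholders, not the cited study's data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_min_anomaly = -rng.gumbel(loc=15.0, scale=6.0, size=50)   # synthetic minima (days)

# Classical trick: minima of X are maxima of -X, so fit the GEV to -X.
shape, loc, scale = genextreme.fit(-annual_min_anomaly)

# The 100-year return level of -X is its 0.99 quantile; negate to recover the minimum.
return_level_100yr = -genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"estimated 100-year early extreme: {return_level_100yr:.1f} days")
```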

  12. Learning difficulties of senior high school students based on probability understanding levels

    NASA Astrophysics Data System (ADS)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

    Identifying students' difficulties in learning the concept of probability is important for teachers in order to prepare appropriate learning processes and overcome obstacles that may arise in subsequent learning. This study revealed the level of students' understanding of the concept of probability and identified their difficulties as part of identifying epistemological obstacles to the concept of probability. The study employed a qualitative, descriptive approach involving 55 students of class XII. The writer used a diagnostic test of probability-concept learning difficulty, observation, and interviews as the techniques to collect the data needed. The data were used to determine the levels of understanding and the learning difficulties experienced by the students. From the students' test results and the learning observations, it was found that the mean cognitive level was at level 2. The findings indicated that students had appropriate quantitative information about the probability concept, but it might be incomplete or incorrectly used. The difficulties found concern constructing sample spaces, events, and mathematical models related to probability problems. In addition, students had difficulty understanding the principles of events and the prerequisite concepts.

  13. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....25 an hour to a mechanic as his basic cash wage plus 50 cents an hour as a contribution to a welfare... prevailing wage statutes. It is clear from the legislative history that in no event can the regular or basic... less than the amount determined by the Secretary of Labor as the basic hourly rate (i.e. cash rate...

  14. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....25 an hour to a mechanic as his basic cash wage plus 50 cents an hour as a contribution to a welfare... prevailing wage statutes. It is clear from the legislative history that in no event can the regular or basic... less than the amount determined by the Secretary of Labor as the basic hourly rate (i.e. cash rate...

  15. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....25 an hour to a mechanic as his basic cash wage plus 50 cents an hour as a contribution to a welfare... prevailing wage statutes. It is clear from the legislative history that in no event can the regular or basic... less than the amount determined by the Secretary of Labor as the basic hourly rate (i.e. cash rate...

  16. On the origin and timing of Zika virus introduction in Brazil.

    PubMed

    Massad, E; Burattini, M Nascimento; Khan, K; Struchiner, C J; Coutinho, F A B; Wilder-Smith, A

    2017-08-01

    The timing and origin of Zika virus (ZIKV) introduction in Brazil has been the subject of controversy. Initially, it was assumed that the virus was introduced during the FIFA World Cup in June-July 2014. Then, it was speculated that ZIKV may have been introduced by athletes from French Polynesia (FP) who competed in a canoe race in Rio de Janeiro in August 2014. We attempted to apply mathematical models to determine the most likely time window of ZIKV introduction in Brazil. Given that the timing and origin of ZIKV introduction in Brazil may be a politically sensitive issue, its determination (or the provision of a plausible hypothesis) may help to prevent undeserved blame. We used a simple mathematical model to estimate the force of infection and the corresponding individual probability of being infected with ZIKV in FP. Taking into account the air travel volume from FP to Brazil between October 2013 and March 2014, we estimated the expected number of infected travellers arriving at Brazilian airports during that period. During the period between December 2013 and February 2014, 51 individuals travelled from FP airports to 11 Brazilian cities. Based on the calculated force of ZIKV infection (the per capita rate of new infections per time unit) and the risk of infection (the probability of at least one new infection), we estimated that 18 (95% CI 12-22) individuals who arrived in seven of the evaluated cities were infected. When basic ZIKV reproduction numbers greater than one were assumed in the seven evaluated cities, ZIKV could have been introduced in any one of the cities. Based on the force of infection in FP, the basic ZIKV reproduction number in selected Brazilian cities, and the estimated travel volume, we concluded that ZIKV was most likely introduced and established in Brazil by infected travellers arriving from FP in the period between October 2013 and March 2014, which was prior to the two aforementioned sporting events.
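
    The two quantities named in the abstract, force of infection and risk of infection, are related by a simple exponential formula, sketched below for a hypothetical traveller cohort. The force of infection and length of stay are assumed placeholder values; only the traveller count (51) is taken from the abstract.

```python
# Minimal sketch: converting a constant force of infection (per capita rate of
# new infections) into an individual risk of infection over a stay, and into
# the expected number of infected travellers. Numbers other than the traveller
# count are hypothetical placeholders, not the study's estimates.
import math

force_of_infection = 0.004      # new infections per person per day (assumed)
stay_days = 30                  # assumed average stay in the affected area
n_travellers = 51               # travellers during the time window (from the abstract)

risk_per_traveller = 1.0 - math.exp(-force_of_infection * stay_days)
expected_infected = n_travellers * risk_per_traveller
print(f"risk per traveller = {risk_per_traveller:.3f}")
print(f"expected infected travellers = {expected_infected:.1f}")
```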

  17. Lightning Characteristics and Lightning Strike Peak Current Probabilities as Related to Aerospace Vehicle Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Vaughan, William W.

    1998-01-01

    A summary is presented of basic lightning characteristics/criteria for current and future NASA aerospace vehicles. The paper estimates the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return to launch site. A literature search was conducted for previous work concerning the occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents these results.

  18. A quantile-based Time at Risk: A new approach for assessing risk in financial markets

    NASA Astrophysics Data System (ADS)

    Bolgorian, Meysam; Raei, Reza

    2013-11-01

    In this paper, we provide a new measure for the evaluation of risk in financial markets. This measure is based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model analogous to Value at Risk (VaR). Whereas VaR, for a given financial asset, probability level and time horizon, gives a critical value such that the likelihood that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using a probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied our model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and report the results.
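
    Reading TaR as a quantile of the return-interval distribution, a minimal empirical version of the calculation is sketched below. The return intervals are synthetic placeholders, not TEPIX data, and the 5% level is an arbitrary example.

```python
# Minimal sketch of the Time at Risk (TaR) idea: given return intervals between
# critical events (e.g., losses beyond a threshold), find the time t such that
# P(return interval > t) equals a chosen probability level. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
return_intervals = rng.geometric(p=0.1, size=500)   # synthetic intervals (days)

probability_level = 0.05
# TaR: the time exceeded by the return interval with probability 0.05,
# i.e. the 95th percentile of the return-interval distribution.
tar = np.quantile(return_intervals, 1.0 - probability_level)
print(f"TaR at the {probability_level:.0%} level: {tar:.0f} days")
```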

  19. Evaluating the risk of industrial espionage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bott, T.F.

    1998-12-31

    A methodology for estimating the relative probabilities of different compromise paths for protected information by insider and visitor intelligence collectors has been developed based on an event-tree analysis of the intelligence collection operation. The analyst identifies target information and ultimate users who might attempt to gain that information. The analyst then uses an event tree to develop a set of compromise paths. Probability models that use parameters based on expert judgment or historical data on security violations are developed for each of the compromise paths. The resulting probability estimates indicate the relative likelihood of different compromise paths and provide an input for security resource allocation. Application of the methodology is demonstrated using a national security example. A set of compromise paths and probability models specifically addressing this example espionage problem are developed. The probability models for hard-copy information compromise paths are quantified as an illustration of the results, using parametric values representative of historical data available in secure facilities, supplemented where necessary by expert judgment.

  20. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with a time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time of untenable conditions, a range of design fires is specified based on different fire growth rates, after which the uncertainty in the onset time of untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated according to the probability distributions of evacuation time and of the onset time of untenable conditions. Fire risk to life safety can then be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
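
    The event-tree step described above can be illustrated with a static, simplified sketch: each scenario's probability is the product of its branch probabilities, and risk is the probability-weighted consequence. The branch probabilities and consequence values below are hypothetical placeholders, and the sketch deliberately omits the time-dependent Markov chain analysis used in the study.

```python
# Minimal sketch of an event-tree risk calculation: scenario probability as a
# product of branch probabilities, risk as probability-weighted consequence.
# All values are hypothetical placeholders.
p_ignition = 1e-3          # fires per year (assumed)
p_detection = 0.90         # P(detection system works | fire)
p_sprinkler = 0.95         # P(sprinkler works | fire detected)

scenarios = {
    # scenario: (probability, assumed consequence in expected fatalities)
    "detected and suppressed":  (p_ignition * p_detection * p_sprinkler,        0.000),
    "detected, not suppressed": (p_ignition * p_detection * (1 - p_sprinkler),  0.020),
    "not detected":             (p_ignition * (1 - p_detection),                0.100),
}

total_risk = sum(p * c for p, c in scenarios.values())
for name, (p, c) in scenarios.items():
    print(f"{name:26s} P = {p:.2e}, consequence = {c}")
print(f"annual life-safety risk = {total_risk:.2e} expected fatalities")
```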

  1. Contribution of criterion A2 to PTSD screening in the presence of traumatic events.

    PubMed

    Pereda, Noemí; Forero, Carlos G

    2012-10-01

    Criterion A2 according to the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV; American Psychiatric Association [APA], 1994) for posttraumatic stress disorder (PTSD) aims to assess the individual's subjective appraisal of an event, but it has been claimed that it might not be sufficiently specific for diagnostic purposes. We analyse the contribution of Criterion A2 and the DSM-IV criteria to detecting PTSD for the most distressing life events experienced by our subjects. Young adults (N = 1,033) reported their most distressing life events, together with PTSD criteria (Criteria A2, B, C, D, E, and F). PTSD prevalence, criterion specificity, and agreement with probable diagnoses were estimated. Our results indicate that 80.30% of the individuals experienced traumatic events and met one or more PTSD criteria; 13.22% of cases received a positive diagnosis of PTSD. Criterion A2 showed poor agreement with the final probable PTSD diagnosis (correlation with PTSD .13, specificity = .10); excluding it from PTSD diagnosis did not change the estimated disorder prevalence significantly. Based on these findings it appears that Criterion A2 is scarcely specific and provides little information to confirm a probable PTSD case. Copyright © 2012 International Society for Traumatic Stress Studies.

  2. The impossibility of probabilities

    NASA Astrophysics Data System (ADS)

    Zimmerman, Peter D.

    2017-11-01

    This paper discusses the problem of assigning probabilities to the likelihood of nuclear terrorism events, in particular examining the limitations of using Bayesian priors for this purpose. It suggests an alternate approach to analyzing the threat of nuclear terrorism.

  3. Sex differences in traumatic events and psychiatric morbidity associated to probable posttraumatic stress disorder among Latino prisoners.

    PubMed

    Pérez-Pedrogo, Coralee; Martínez-Taboas, Alfonso; González, Rafael A; Caraballo, José N; Albizu-García, Carmen E

    2018-04-14

    Latinos comprised 17.1% of the U.S. population and 33.1% of US prisoners, yet they are underrepresented in the psychopathology literature. Despite higher rates of trauma among incarcerated individuals than in the general population, most of the previous research in this area focused primarily on women samples, and very few studies examined sex differences in PTSD and traumatic experiences. In addition, there is a need for research assessing traumatic experiences and probable PTSD in men and women Latino inmates to inform culturally competent care and sex sensitive care for this vulnerable and underserved population. Our study examined whether men and women Latino inmates with probable Posttraumatic Stress Disorder (PTSD), based on the cut off 40 or more symptoms on the Davidson Trauma Scale (DTS), differed significantly by the number of event types experienced, the type of potentially traumatizing event, and in co-occurring psychiatric conditions. A multi-stage sample design was used to select a probabilistic sample of 1,331 inmates from 26 penal institutions in PR of which 1179 participated in the study. Bivariate associations were calculated for each type of traumatic event and probable PTSD. Mean number of types of potentially traumatizing event experienced was comparable for both sexes (F = 3.83, M = 3.74) yet sex differences were found in the nature of the event. Women with probable PTSD had higher rates of experiencing rape and sexual abuse. Men had higher rates of experiencing combat in war, a life-threatening accident, of witnessing violence, and being threatened with a weapon. Men with significant ADHD symptoms in childhood and with Generalized Anxiety Disorder (GAD) during adulthood were almost 5 and 7 times as likely to score above threshold on the DTS whereas women were >3 times as likely in the presence of ADHD symptoms in childhood or depression during adulthood. This study underscores the need to improve understanding of the clinical manifestations of trauma and co-occurring psychiatric conditions for appropriate sex sensitive interventions targeting Latinos living in prisons. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Discrimination between induced, triggered, and natural earthquakes close to hydrocarbon reservoirs: A probabilistic approach based on the modeling of depletion-induced stress changes and seismological source parameters

    NASA Astrophysics Data System (ADS)

    Dahm, Torsten; Cesca, Simone; Hainzl, Sebastian; Braun, Thomas; Krüger, Frank

    2015-04-01

    Earthquakes occurring close to hydrocarbon fields under production are often under critical view of being induced or triggered. However, clear and testable rules to discriminate the different events have rarely been developed and tested. The unresolved scientific problem may lead to lengthy public disputes with unpredictable impact on the local acceptance of the exploitation and field operations. We propose a quantitative approach to discriminate induced, triggered, and natural earthquakes, which is based on testable input parameters. Maxima of occurrence probabilities are compared for the cases under question, and a single probability of being triggered or induced is reported. The uncertainties of earthquake location and other input parameters are considered in terms of the integration over probability density functions. The probability that events have been human triggered/induced is derived from the modeling of Coulomb stress changes and a rate and state-dependent seismicity model. In our case a 3-D boundary element method has been adapted for the nuclei of strain approach to estimate the stress changes outside the reservoir, which are related to pore pressure changes in the field formation. The predicted rate of natural earthquakes is either derived from the background seismicity or, in case of rare events, from an estimate of the tectonic stress rate. Instrumentally derived seismological information on the event location, source mechanism, and the size of the rupture plane is of advantage for the method. If the rupture plane has been estimated, the discrimination between induced or only triggered events is theoretically possible if probability functions are convolved with a rupture fault filter. We apply the approach to three recent main shock events: (1) the Mw 4.3 Ekofisk 2001, North Sea, earthquake close to the Ekofisk oil field; (2) the Mw 4.4 Rotenburg 2004, Northern Germany, earthquake in the vicinity of the Söhlingen gas field; and (3) the Mw 6.1 Emilia 2012, Northern Italy, earthquake in the vicinity of a hydrocarbon reservoir. The three test cases cover the complete range of possible causes: clearly "human induced," "not even human triggered," and a third case in between both extremes.

  5. Statistical primer: propensity score matching and its alternatives.

    PubMed

    Benedetto, Umberto; Head, Stuart J; Angelini, Gianni D; Blackstone, Eugene H

    2018-06-01

    Propensity score (PS) methods offer certain advantages over more traditional regression methods to control for confounding by indication in observational studies. Although multivariable regression models adjust for confounders by modelling the relationship between covariates and outcome, the PS methods estimate the treatment effect by modelling the relationship between confounders and treatment assignment. Therefore, methods based on the PS are not limited by the number of events, and their use may be warranted when the number of confounders is large, or the number of outcomes is small. The PS is the probability for a subject to receive a treatment conditional on a set of baseline characteristics (confounders). The PS is commonly estimated using logistic regression, and it is used to match patients with similar distribution of confounders so that difference in outcomes gives unbiased estimate of treatment effect. This review summarizes basic concepts of the PS matching and provides guidance in implementing matching and other methods based on the PS, such as stratification, weighting and covariate adjustment.
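
    The workflow summarized above (estimate the PS with logistic regression, then match on it) can be sketched in a few lines. The data below are synthetic, the treatment-assignment model is assumed, scikit-learn is assumed to be available, and greedy 1:1 nearest-neighbour matching is only one of several matching strategies.

```python
# Minimal sketch: propensity scores from logistic regression, followed by
# greedy 1:1 nearest-neighbour matching of treated to control subjects.
# Synthetic data; not the method details of any particular study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                            # baseline confounders
true_logit = X @ np.array([0.8, -0.5, 0.3]) - 0.5      # assumed assignment model
treated = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching on the estimated propensity score
controls = list(np.where(treated == 0)[0])
pairs = []
for i in np.where(treated == 1)[0]:
    if not controls:
        break                                          # no unmatched controls left
    j = min(controls, key=lambda k: abs(ps[k] - ps[i]))
    pairs.append((i, j))
    controls.remove(j)

print(f"matched {len(pairs)} treated/control pairs on the propensity score")
```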

  6. A quantum theory account of order effects and conjunction fallacies in political judgments.

    PubMed

    Yearsley, James M; Trueblood, Jennifer S

    2017-09-06

    Are our everyday judgments about the world around us normative? Decades of research in the judgment and decision-making literature suggest the answer is no. If people's judgments do not follow normative rules, then what rules if any do they follow? Quantum probability theory is a promising new approach to modeling human behavior that is at odds with normative, classical rules. One key advantage of using quantum theory is that it explains multiple types of judgment errors using the same basic machinery, unifying what have previously been thought of as disparate phenomena. In this article, we test predictions from quantum theory related to the co-occurrence of two classic judgment phenomena, order effects and conjunction fallacies, using judgments about real-world events (related to the U.S. presidential primaries). We also show that our data obeys two a priori and parameter free constraints derived from quantum theory. Further, we examine two factors that moderate the effects, cognitive thinking style (as measured by the Cognitive Reflection Test) and political ideology.

  7. 'If you are good, I get better': the role of social hierarchy in perceptual decision-making.

    PubMed

    Santamaría-García, Hernando; Pannunzi, Mario; Ayneto, Alba; Deco, Gustavo; Sebastián-Gallés, Nuria

    2014-10-01

    So far, it was unclear if social hierarchy could influence sensory or perceptual cognitive processes. We evaluated the effects of social hierarchy on these processes using a basic visual perceptual decision task. We constructed a social hierarchy where participants performed the perceptual task separately with two covertly simulated players (superior, inferior). Participants were faster (better) when performing the discrimination task with the superior player. We studied the time course when social hierarchy was processed using event-related potentials and observed hierarchical effects even in early stages of sensory-perceptual processing, suggesting early top-down modulation by social hierarchy. Moreover, in a parallel analysis, we fitted a drift-diffusion model (DDM) to the results to evaluate the decision making process of this perceptual task in the context of a social hierarchy. Consistently, the DDM pointed to nondecision time (probably perceptual encoding) as the principal period influenced by social hierarchy. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  8. Classification and reduction of pilot error

    NASA Technical Reports Server (NTRS)

    Rogers, W. H.; Logan, A. L.; Boley, G. D.

    1989-01-01

    Human error is a primary or contributing factor in about two-thirds of commercial aviation accidents worldwide. With the ultimate goal of reducing pilot error accidents, this contract effort is aimed at understanding the factors underlying error events and reducing the probability of certain types of errors by modifying underlying factors such as flight deck design and procedures. A review of the literature relevant to error classification was conducted. Classification includes categorizing types of errors, the information processing mechanisms and factors underlying them, and identifying factor-mechanism-error relationships. The classification scheme developed by Jens Rasmussen was adopted because it provided a comprehensive yet basic error classification shell or structure that could easily accommodate the addition of details on domain-specific factors. For these purposes, factors specific to the aviation environment were incorporated. Hypotheses concerning the relationship of a small number of underlying factors, information processing mechanisms, and error types identified in the classification scheme were formulated. ASRS data were reviewed and a simulation experiment was performed to evaluate and quantify the hypotheses.

  9. Assessing the Risk of International Spread of Yellow Fever Virus: A Mathematical Analysis of an Urban Outbreak in Asunción, 2008

    PubMed Central

    Johansson, Michael A.; Arana-Vizcarrondo, Neysarí; Biggerstaff, Brad J.; Gallagher, Nancy; Marano, Nina; Staples, J. Erin

    2012-01-01

    Yellow fever virus (YFV), a mosquito-borne virus endemic to tropical Africa and South America, is capable of causing large urban outbreaks of human disease. With the ease of international travel, urban outbreaks could lead to the rapid spread and subsequent transmission of YFV in distant locations. We designed a stochastic metapopulation model with spatiotemporally explicit transmissibility scenarios to simulate the global spread of YFV from a single urban outbreak by infected airline travelers. In simulations of a 2008 outbreak in Asunción, Paraguay, local outbreaks occurred in 12.8% of simulations and international spread in 2.0%. Using simple probabilistic models, we found that local incidence, travel rates, and basic transmission parameters are sufficient to assess the probability of introduction and autochthonous transmission events. These models could be used to assess the risk of YFV spread during an urban outbreak and identify locations at risk for YFV introduction and subsequent autochthonous transmission. PMID:22302873

  10. Assessing the risk of international spread of yellow fever virus: a mathematical analysis of an urban outbreak in Asuncion, 2008.

    PubMed

    Johansson, Michael A; Arana-Vizcarrondo, Neysarí; Biggerstaff, Brad J; Gallagher, Nancy; Marano, Nina; Staples, J Erin

    2012-02-01

    Yellow fever virus (YFV), a mosquito-borne virus endemic to tropical Africa and South America, is capable of causing large urban outbreaks of human disease. With the ease of international travel, urban outbreaks could lead to the rapid spread and subsequent transmission of YFV in distant locations. We designed a stochastic metapopulation model with spatiotemporally explicit transmissibility scenarios to simulate the global spread of YFV from a single urban outbreak by infected airline travelers. In simulations of a 2008 outbreak in Asunción, Paraguay, local outbreaks occurred in 12.8% of simulations and international spread in 2.0%. Using simple probabilistic models, we found that local incidence, travel rates, and basic transmission parameters are sufficient to assess the probability of introduction and autochthonous transmission events. These models could be used to assess the risk of YFV spread during an urban outbreak and identify locations at risk for YFV introduction and subsequent autochthonous transmission.

  11. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Summary Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  12. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
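
    The weight-given-height question posed above reduces to a conditional normal calculation. The sketch below shows that calculation with assumed means, standard deviations and correlation; these are illustrative values, not the adolescent dataset used by the authors.

```python
# Minimal sketch of the conditional-probability calculation behind the
# "weight given height" question. Means, SDs and correlation are assumed.
from scipy.stats import norm

mu_h, sd_h = 64.0, 3.5        # height (inches), assumed
mu_w, sd_w = 130.0, 20.0      # weight (pounds), assumed
rho = 0.6                     # assumed height-weight correlation

h = mu_h                      # condition on "average height"
# Conditional distribution of weight given height for a bivariate normal:
cond_mean = mu_w + rho * sd_w / sd_h * (h - mu_h)
cond_sd = sd_w * (1 - rho**2) ** 0.5

p = norm.cdf(140, cond_mean, cond_sd) - norm.cdf(120, cond_mean, cond_sd)
print(f"P(120 < weight < 140 | height = {h}) = {p:.3f}")
```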

  13. Design Life Level: Quantifying risk in a changing climate

    NASA Astrophysics Data System (ADS)

    Rootzén, Holger; Katz, Richard W.

    2013-09-01

    In the past, the concepts of return levels and return periods have been standard and important tools for engineering design. However, these concepts are based on the assumption of a stationary climate and do not apply to a changing climate, whether local or global. In this paper, we propose a refined concept, Design Life Level, which quantifies risk in a nonstationary climate and can serve as the basis for communication. In current practice, typical hydrologic risk management focuses on a standard (e.g., in terms of a high quantile corresponding to the specified probability of failure for a single year). Nevertheless, the basic information needed for engineering design should consist of (i) the design life period (e.g., the next 50 years, say 2015-2064); and (ii) the probability (e.g., 5% chance) of a hazardous event (typically, in the form of the hydrologic variable exceeding a high level) occurring during the design life period. Capturing both of these design characteristics, the Design Life Level is defined as an upper quantile (e.g., 5%) of the distribution of the maximum value of the hydrologic variable (e.g., water level) over the design life period. We relate this concept and variants of it to existing literature and illustrate how they, and some useful complementary plots, may be computed and used. One practically important consideration concerns quantifying the statistical uncertainty in estimating a high quantile under nonstationarity.
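
    Since the Design Life Level is defined through the distribution of the maximum over the design life, a minimal numerical sketch is possible: under independence across years, P(design-life maximum ≤ x) is the product of the yearly distribution functions, and the DLL is the level whose exceedance probability over the design life equals the chosen value. The yearly Gumbel model, its trend, and the 5% level below are assumptions for illustration, not the paper's worked example.

```python
# Minimal sketch of a Design Life Level calculation: yearly maxima follow F_t
# (here a Gumbel whose location drifts upward, an assumed trend), the design-
# life maximum has CDF prod_t F_t(x), and the DLL is the level with a 5% chance
# of being exceeded during the design life. All values are illustrative.
import numpy as np
from scipy.stats import gumbel_r
from scipy.optimize import brentq

years = np.arange(2015, 2065)                 # 50-year design life, 2015-2064
loc_t = 100.0 + 0.2 * (years - 2015)          # assumed trend in location (cm)
scale = 15.0                                  # assumed scale (cm)

def exceedance_over_life(x):
    # 1 - P(every yearly maximum stays below x), assuming independent years
    return 1.0 - np.prod(gumbel_r.cdf(x, loc=loc_t, scale=scale))

dll = brentq(lambda x: exceedance_over_life(x) - 0.05, 100.0, 400.0)
print(f"Design Life Level (5% / 2015-2064): {dll:.1f} cm")
```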

  14. Interactions between spontaneous instantiations to the basic level and post-event suggestions.

    PubMed

    Pansky, Ainat; Tenenboim, Einat

    2011-11-01

    Extensive research shows that post-event suggestions can distort the memory for a target event. In this study we examined the effect of such suggestions as they interact with the products of a spontaneous memory process: instantiation of abstract information to an intermediate level of abstractness, the basic level (Pansky & Koriat, 2004 ). Participants read a narrative containing items presented at the superordinate level (e.g., FRUIT), were exposed to suggestions that referred to these items at the basic level (e.g., APPLE), and were finally asked to recall the original items. We found that the tendency to instantiate spontaneously in the control (non-misleading) condition, particularly over time, increased following exposure to suggestions that were likely to coincide with those instantiations. Exposure to such suggestions, either immediately or following a 24-hour delay, reduced subsequent correct recall of the original items only if the suggested information coincided with the information one tends to instantiate spontaneously in a given context. Suggestibility, in this case, was particularly pronounced and phenomenologically compelling in terms of remember/know judgements. The findings are taken to imply that effects of post-event suggestions can be understood in terms of the constructive processes that set the stage for their occurrence.

  15. Empirical estimation of the conditional probability of natech events within the United States.

    PubMed

    Santella, Nicholas; Steinberg, Laura J; Aguirra, Gloria Andrea

    2011-06-01

    Natural disasters are the cause of a sizeable number of hazmat releases, referred to as "natechs." An enhanced understanding of natech probability, allowing for predictions of natech occurrence, is an important step in determining how industry and government should mitigate natech risk. This study quantifies the conditional probabilities of natechs at TRI/RMP and SIC 1311 facilities given the occurrence of hurricanes, earthquakes, tornadoes, and floods. During hurricanes, a higher probability of releases was observed due to storm surge (7.3 releases per 100 TRI/RMP facilities exposed vs. 6.2 for SIC 1311) compared to category 1-2 hurricane winds (5.6 TRI, 2.6 SIC 1311). Logistic regression confirms the statistical significance of the greater propensity for releases at RMP/TRI facilities, and during some hurricanes, when controlling for hazard zone. The probability of natechs at TRI/RMP facilities during earthquakes increased from 0.1 releases per 100 facilities at MMI V to 21.4 at MMI IX. The probability of a natech at TRI/RMP facilities within 25 miles of a tornado was small (∼0.025 per 100 facilities), reflecting the limited area directly affected by tornadoes. Areas inundated during flood events had a probability of 1.1 releases per 100 facilities but demonstrated widely varying natech occurrence during individual events, indicating that factors not quantified in this study such as flood depth and speed are important for predicting flood natechs. These results can inform natech risk analysis, aid government agencies responsible for planning response and remediation after natural disasters, and should be useful in raising awareness of natech risk within industry. © 2011 Society for Risk Analysis.
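
    The headline figures are empirical conditional probabilities (releases per 100 facilities exposed). A minimal sketch of that calculation with a normal-approximation confidence interval; the counts below are hypothetical, not the study's data:

      # Empirical natech probability: releases per 100 exposed facilities,
      # with a normal-approximation confidence interval. Counts are hypothetical.
      from math import sqrt

      def releases_per_100(n_releases: int, n_exposed: int, z: float = 1.96):
          p = n_releases / n_exposed
          se = sqrt(p * (1 - p) / n_exposed)
          return 100 * p, (100 * (p - z * se), 100 * (p + z * se))

      rate, ci = releases_per_100(n_releases=73, n_exposed=1000)
      print(f"{rate:.1f} releases per 100 facilities, 95% CI {ci[0]:.1f}-{ci[1]:.1f}")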

  16. 10th International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 2006 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  17. 5th International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 2005 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  18. 6th International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 2002 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  19. 9th International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 2005 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  20. 7th International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 2003 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  1. 4th International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 2000 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  2. 3rd International Conference on Malignancies in AIDS and Other Acquired Immunodeficiencies: Basic, Epidemiologic and Clinical Research

    Cancer.gov

    Summary of speakers and events from the 1999 ICMAOI conference, focused on presenting basic, epidemiologic, and clinical aspects of research on malignancies in HIV-infected and other immunosuppressed individuals.

  3. Sensitivity towards fear of electric shock in passive threat situations.

    PubMed

    Ring, Patrick; Kaernbach, Christian

    2015-01-01

    Human judgment and decision-making (JDM) requires an assessment of different choice options. While traditional theories of choice argue that cognitive processes are the main driver to reach a decision, growing evidence highlights the importance of emotion in decision-making. Following these findings, it appears relevant to understand how individuals assess the attractiveness or riskiness of a situation in terms of emotional processes. The following study aims at a better understanding of the psychophysiological mechanisms underlying threat sensitivity by measuring skin conductance responses (SCRs) in passive threat situations. While previous studies demonstrate the role of magnitude in emotional body reactions preceding an outcome, this study focuses on probability. In order to analyze emotional body reactions preceding negative events with varying probability of occurrence, we had our participants play a two-stage card game. The first stage of the card game reveals the probability of receiving an unpleasant electric shock. The second stage applies the electric shock with the previously announced probability. For the analysis, we focus on the time interval between the first and second stage. We observe a linear relation between SCRs in anticipation of receiving an electric shock and shock probability. This finding indicates that SCRs are able to code the likelihood of negative events. We outline how this coding function of SCRs during the anticipation of negative events might add to an understanding of human JDM.

  4. Life events and Tourette syndrome.

    PubMed

    Steinberg, Tamar; Shmuel-Baruch, Sharona; Horesh, Netta; Apter, Alan

    2013-07-01

    Tourette syndrome (TS) is a neuropsychiatric developmental disorder characterized by the presence of multiple motor tics and one or more vocal tics. Although TS is primarily biological in origin, stress-diathesis interactions most probably play a role in the course of the illness. The precise influence of the environment on this basically biological disorder is difficult to ascertain, particularly when TS is complicated by comorbidities. Among the many questions that remain unresolved are the differential impact of positive and negative events and of specific subtypes of events, and the importance of major crucial events relative to minor daily ones for tic severity. The aim was to examine the relationships between life events, tic severity and comorbid disorders in TS, including OCD, ADHD, anxiety, depression and rage attacks. Life events were classified by quantity, quality (positive or negative) and type of event (family, friends, etc.). Sixty patients aged 7-17 years with Tourette syndrome or a chronic tic disorder were recruited from the Psychological Medicine Clinic at Schneider Children's Medical Center of Israel. Measures included the Yale Global Tic Severity Scale; Children's Yale-Brown Obsessive Compulsive Scale; Life Experiences Survey; Brief Adolescent Life Events Scale (BALES); Screen for Child Anxiety Related Emotional Disorders; Child Depression Inventory/Beck Depression Inventory; ADHD Rating Scale IV; and Overt Aggression Scale. Regarding tics and minor life events, there was a weak but significant correlation between severity of motor tics and the quantity of negative events. No significant correlation was found between tic severity and quantity of positive events. Analysis of the BALES categories yielded a significant direct correlation between severity of vocal tics and quantity of negative events involving friends. Regarding comorbidities and minor life events, highly significant correlations were found with depression and anxiety. Regarding tics and major life events, a significant correlation was found between the quantity of major life events and the severity of motor tics, but not vocal tics. Regarding comorbidities and major life events, significant correlations were found between the severity of compulsions, ADHD and aggression and the subjects' personal evaluation of the effect of negative major life events on their lives. Minor life events appear to be correlated with tic severity and comorbidities in children and adolescents with Tourette syndrome. The lack of an association between major life events and tic severity further emphasizes the salient impact of minor life events that occur in temporal proximity to the assessment of tic severity. Clinically, the results match our impression from patient narratives wherein they "blamed" exacerbations of tics on social interactions. The high correlations between negative life events and depression, anxiety and compulsive symptoms were also reported in previous studies. In conclusion, these findings may have clinical implications for planning supportive psychotherapy or cognitive behavioral therapy for this patient population. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Particles, Waves, and the Interpretation of Quantum Mechanics

    ERIC Educational Resources Information Center

    Christoudouleas, N. D.

    1975-01-01

    Presents an explanation, without mathematical equations, of the basic principles of quantum mechanics. Includes wave-particle duality, the probability character of the wavefunction, and the uncertainty relations. (MLH)

  6. Complex and unpredictable Cardano

    NASA Astrophysics Data System (ADS)

    Ekert, Artur

    2008-08-01

    This purely recreational paper is about one of the most colorful characters of the Italian Renaissance, Girolamo Cardano, and the discovery of two basic ingredients of quantum theory, probability and complex numbers.

  7. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    USGS Publications Warehouse

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two week long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory‐observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre‐existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible (<10%) for effective normal stresses of 10 MPa or more and only increases by 1.5 times for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  8. Related Factors of the Influence on Mental Symptoms of the Recruits in the Basic Military Training

    ERIC Educational Resources Information Center

    Hong-zheng, Li; Mei-ying, Lei; Dong-hai Zhao; Li-qiong, Zhao; Geng, Liu; Hong-kui, Zhou; Mei, Qin; Jie-feng, Li; Jian, Wen; Pin-de, Huang; Yi, Li; Chuang, Wang; Zhou-ran, Wang

    2012-01-01

    The objective of the study is to explore the psychosocial characteristics of recruits for mental health education during the basic military training. A total of 1,366 male recruits were assessed during the basic military training. The psychosocial characteristics, such as effects of LE (life events), mental symptoms, personality trait coping style…

  9. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1977-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are obtained. The approach is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. A general representation for optimum estimates and recursive equations for minimum mean squared error (MMSE) estimates are obtained. MMSE estimates are nonlinear functions of the observations. The problem of estimating the rate of a DTJP when the rate is a random variable with a probability density function of the form c x^k (1-x)^m is considered, and the MMSE estimates are shown to be linear in this case. This class of density functions explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.

  10. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1978-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are derived. The approach used is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. Thus a general representation is obtained for optimum estimates, and recursive equations are derived for minimum mean-squared error (MMSE) estimates. In general, MMSE estimates are nonlinear functions of the observations. The problem is considered of estimating the rate of a DTJP when the rate is a random variable with a beta probability density function and the jump amplitudes are binomially distributed. It is shown that the MMSE estimates are linear. The class of beta density functions is rather rich and explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.
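
    The linearity result can be illustrated with the conjugate beta-binomial case: for a Beta(a, b) prior on the rate and a binomial observation, the MMSE (posterior-mean) estimate is an affine function of the observed count. A minimal sketch under those assumptions (parameter values are illustrative):

      # MMSE (posterior mean) of a rate p with a Beta(a, b) prior and a
      # binomially distributed observation x out of n trials. The estimate
      # (a + x) / (a + b + n) is linear (affine) in the observation x.
      def mmse_rate(a: float, b: float, x: int, n: int) -> float:
          return (a + x) / (a + b + n)

      a, b, n = 2.0, 3.0, 10     # prior shape parameters and trial count (illustrative)
      for x in range(0, n + 1, 2):
          print(x, round(mmse_rate(a, b, x, n), 3))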

  11. A closer look at the probabilities of the notorious three prisoners.

    PubMed

    Falk, R

    1992-06-01

    The "problem of three prisoners", a counterintuitive teaser, is analyzed. It is representative of a class of probability puzzles where the correct solution depends on explication of underlying assumptions. Spontaneous beliefs concerning the problem and intuitive heuristics are reviewed. The psychological background of these beliefs is explored. Several attempts to find a simple criterion to predict whether and how the probability of the target event will change as a result of obtaining evidence are examined. However, despite the psychological appeal of these attempts, none proves to be valid in general. A necessary and sufficient condition for change in the probability of the target event, following observation of new data, is proposed. That criterion is an extension of the likelihood-ratio principle (which holds in the case of only two complementary alternatives) to any number of alternatives. Some didactic implications concerning the significance of the chance set-up and reliance on analogies are discussed.
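
    A quick Monte Carlo check of the standard formulation of the puzzle (the pardon is assigned uniformly at random; the warden names an executed prisoner other than A, choosing at random when both qualify) illustrates the role of the chance set-up: prisoner A's survival probability stays at 1/3 while that of the other unnamed prisoner rises to 2/3. A sketch:

      # Monte Carlo check of the three prisoners problem under the standard
      # assumptions: one of A, B, C is pardoned uniformly at random; the warden
      # names an executed prisoner other than A, choosing at random if both
      # B and C are to be executed.
      import random

      random.seed(1)
      trials, a_survives, c_survives = 200_000, 0, 0
      warden_said_b = 0

      for _ in range(trials):
          pardoned = random.choice("ABC")
          if pardoned == "A":
              named = random.choice("BC")
          elif pardoned == "B":
              named = "C"
          else:
              named = "B"
          if named == "B":
              warden_said_b += 1
              a_survives += pardoned == "A"
              c_survives += pardoned == "C"

      print("P(A pardoned | warden names B) =", round(a_survives / warden_said_b, 3))  # ~1/3
      print("P(C pardoned | warden names B) =", round(c_survives / warden_said_b, 3))  # ~2/3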

  12. Modelling the Probability of Landslides Impacting Road Networks

    NASA Astrophysics Data System (ADS)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (A_L) were randomly selected from a three-parameter inverse-gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of A_L and an exponential rollover for small values of A_L; the rollover (maximum probability) occurs at about A_L = 400 m^2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in the existing literature. The number of landslide areas (N_L) selected for each triggered event iteration was chosen to give an average density of 1 landslide per km^2, i.e. N_L = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road of 1 x 4000 cells (5 m x 20 km) joined in a 'T' formation by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (A_BL) and number of road blockages (N_BL) were recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km^2 region, the number of road blocks per iteration, N_BL, ranges from 0 to 7. The average blockage area over the 500 iterations is about 3000 m^2, which closely matches the average landslide area for the triggered landslide inventories. We further find that over the 500 iterations, the probability of a given number of road blocks occurring on any given iteration, p(N_BL), as a function of N_BL follows reasonably well a three-parameter inverse-gamma probability density distribution with an exponential rollover (i.e., the most frequent value) at N_BL = 1.3. In this paper we have begun to calculate the probability of a given number of landslides blocking roads during a triggering event, and have found that this follows an inverse-gamma distribution, similar to that found for the statistics of landslide areas resulting from triggers. As we progress to model more realistic road networks, this work will aid both long-term and disaster management for road networks by allowing probabilistic assessment of potential road network damage during landslide triggering event scenarios of different magnitudes.
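
    A minimal sketch of a single iteration of this type of simulation (not the authors' code): sample landslide areas from an inverse-gamma distribution with illustrative parameters, drop them at random positions on a 4000 x 4000 grid containing a T-shaped road, and count blockages. Landslides are approximated as squares for simplicity.

      # One Monte Carlo iteration: drop N_L landslides (areas ~ inverse-gamma,
      # parameters illustrative) onto a 20 km x 20 km grid of 5 m cells and count
      # how many intersect a T-shaped road. Landslides are treated as squares.
      import numpy as np
      from scipy.stats import invgamma

      rng = np.random.default_rng(0)
      cell, n_cells = 5.0, 4000                     # 5 m cells, 4000 x 4000 grid
      n_landslides = 400                            # about 1 landslide per km^2

      # Road cells: horizontal road along row 2000, vertical road above it at column 2000.
      def road_hit(x0, y0, x1, y1):
          horiz = y0 <= 2000 <= y1                      # crosses the horizontal road row
          vert = (x0 <= 2000 <= x1) and (y1 >= 2000)    # crosses the vertical road segment
          return horiz or vert

      areas = invgamma.rvs(a=1.4, scale=500.0, size=n_landslides, random_state=rng)  # m^2, illustrative
      half = np.sqrt(areas) / 2 / cell              # half-width in cells
      cx = rng.uniform(0, n_cells, n_landslides)
      cy = rng.uniform(0, n_cells, n_landslides)

      n_blocks = sum(road_hit(x - h, y - h, x + h, y + h)
                     for x, y, h in zip(cx, cy, half))
      print("road blockages this iteration:", n_blocks)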

  13. Probability concepts in quality risk management.

    PubMed

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based (data-based) calculation and a subjective "degree of belief" meaning of probability. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  14. SU-F-J-200: An Improved Method for Event Selection in Compton Camera Imaging for Particle Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackin, D; Beddar, S; Polf, J

    2016-06-15

    Purpose: The uncertainty in the beam range in particle therapy limits the conformality of the dose distributions. Compton scatter cameras (CC), which measure the prompt gamma rays produced by nuclear interactions in the patient tissue, can reduce this uncertainty by producing 3D images confirming the particle beam range and dose delivery. However, the high intensity and short time windows of the particle beams limit the number of gammas detected. We attempt to address this problem by developing a method for filtering gamma ray scattering events from the background by applying the known gamma ray spectrum. Methods: We used a four-stage Compton camera to record in list mode the energy deposition and scatter positions of gammas from a Co-60 source. Each CC stage contained a 4×4 array of CdZnTe crystals. To produce images, we used a back-projection algorithm and four filtering methods: basic, energy windowing, delta energy (ΔE), or delta scattering angle (Δθ). Basic filtering requires events to be physically consistent. Energy windowing requires event energy to fall within a defined range. ΔE filtering selects events with the minimum difference between the measured and a known gamma energy (1.17 and 1.33 MeV for Co-60). Δθ filtering selects events with the minimum difference between the measured scattering angle and the angle corresponding to a known gamma energy. Results: Energy window filtering reduced the FWHM from 197.8 mm for basic filtering to 78.3 mm. ΔE and Δθ filtering achieved the best results, FWHMs of 64.3 and 55.6 mm, respectively. In general, Δθ filtering selected events with scattering angles < 40°, while ΔE filtering selected events with angles > 60°. Conclusion: Filtering CC events improved the quality and resolution of the corresponding images. ΔE and Δθ filtering produced similar results, but each favored different events.
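
    A minimal sketch of ΔE-style selection on list-mode data: keep events whose summed deposited energy lies closest to a known Co-60 line. The event energies and tolerance below are illustrative, not the authors' implementation.

      # Delta-E filtering sketch: keep list-mode Compton events whose total
      # deposited energy is closest to a known Co-60 gamma line (1.17 or 1.33 MeV).
      KNOWN_LINES_MEV = (1.17, 1.33)

      def delta_e(event_energies):
          total = sum(event_energies)
          return min(abs(total - line) for line in KNOWN_LINES_MEV)

      events = [
          [0.40, 0.77],          # sums to 1.17 -> delta_e ~ 0
          [0.30, 0.55, 0.48],    # sums to 1.33 -> delta_e ~ 0
          [0.20, 0.35],          # sums to 0.55 -> rejected
      ]
      tolerance = 0.05  # MeV, assumed
      selected = [e for e in events if delta_e(e) <= tolerance]
      print(len(selected), "of", len(events), "events kept")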

  15. Modelling the occurrence and severity of enoxaparin-induced bleeding and bruising events

    PubMed Central

    Barras, Michael A; Duffull, Stephen B; Atherton, John J; Green, Bruce

    2009-01-01

    AIMS To develop a population pharmacokinetic–pharmacodynamic model to describe the occurrence and severity of bleeding or bruising as a function of enoxaparin exposure. METHODS Data were obtained from a randomized controlled trial (n = 118) that compared conventional dosing of enoxaparin (product label) with an individualized dosing regimen. Anti-Xa concentrations were sampled using a sparse design and the size, location and type of bruising and bleeding event, during enoxaparin therapy, were collected daily. A population pharmacokinetic–pharmacodynamic analysis was performed using nonlinear mixed effects techniques. The final model was used to explore how the probability of events in patients with obesity and/or renal impairment varied under differing dosing strategies. RESULTS Three hundred and forty-nine anti-Xa concentrations were available for analysis. A two-compartment first-order absorption and elimination model best fit the data, with lean body weight describing between-subject variability in clearance and central volume of distribution. A three-category proportional-odds model described the occurrence and severity of events as a function of both cumulative enoxaparin AUC (cAUC) and subject age. Simulations showed that individualized dosing decreased the probability of a bleeding or major bruising event when compared with conventional dosing, which was most noticeable in subjects with obesity and renal impairment. CONCLUSIONS The occurrence and severity of a bleeding or major bruising event to enoxaparin, administered for the treatment of a thromboembolic disease, can be described as a function of both cAUC and subject age. Individualized dosing of enoxaparin will reduce the probability of an event. PMID:19916994
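
    The three-category proportional-odds model can be sketched as cumulative logits that are linear in cAUC and age; the intercepts and slopes below are hypothetical placeholders rather than the published estimates.

      # Proportional-odds sketch: probability of {no event, bruise, bleed}
      # as a function of cumulative enoxaparin AUC (cAUC) and age.
      # Intercepts and slopes are hypothetical, not the published estimates.
      import math

      def category_probs(cauc, age, alphas=(-2.0, -4.0), b_cauc=0.004, b_age=0.02):
          """Return probabilities of the three ordered outcome categories."""
          def logistic(x):
              return 1.0 / (1.0 + math.exp(-x))
          eta = b_cauc * cauc + b_age * age
          p_ge1 = logistic(alphas[0] + eta)   # any event (bruise or bleed)
          p_ge2 = logistic(alphas[1] + eta)   # more severe event
          return (1 - p_ge1, p_ge1 - p_ge2, p_ge2)

      print(category_probs(cauc=300, age=70))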

  16. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    PubMed

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The results of randomized controlled trials (RCTs) on time-to-event outcomes are usually reported as the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using where available information on the number of events and the numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and the total number of events alongside KM curves.
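
    The core of the reconstruction, inverting the Kaplan-Meier product-limit relation interval by interval, can be sketched as below, assuming digitised survival probabilities at the start of each at-risk interval. This is a simplification; the full algorithm also redistributes censoring within intervals.

      # Simplified Kaplan-Meier inversion: given survival probabilities S_i read
      # off a digitised curve at the start of each at-risk interval and the
      # published numbers at risk n_i, estimate deaths per interval via
      # d_i ~ n_i * (1 - S_{i+1} / S_i). Censoring within intervals is ignored
      # here; the full algorithm redistributes it iteratively.
      def estimate_events(surv, n_risk):
          events = []
          for i in range(len(n_risk) - 1):
              drop = 1.0 - surv[i + 1] / surv[i]
              events.append(round(n_risk[i] * drop))
          return events

      surv = [1.00, 0.82, 0.65, 0.51]       # digitised S(t) at interval starts (illustrative)
      n_risk = [100, 80, 60, 45]            # published numbers at risk (illustrative)
      print(estimate_events(surv, n_risk))  # approximate events in each interval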

  17. Security Threat Assessment of an Internet Security System Using Attack Tree and Vague Sets

    PubMed Central

    2014-01-01

    Security threat assessment of the Internet security system has become a greater concern in recent years because of the progress and diversification of information technology. Traditionally, the failure probabilities of the bottom events of an Internet security system are treated as exact values when the failure probability of the entire system is estimated. However, when the malfunction data for the system's elementary events are incomplete, the traditional approach for calculating reliability is no longer applicable. Moreover, it does not consider the failure probability of bottom events subjected to attack, which may bias conclusions. In order to effectively solve these problems, this paper proposes a novel technique integrating attack trees and vague sets for security threat assessment. For verification of the proposed approach, a numerical example of an Internet security system security threat assessment is adopted in this paper. The result of the proposed method is compared with existing security threat assessment approaches. PMID:25405226

  18. Security threat assessment of an Internet security system using attack tree and vague sets.

    PubMed

    Chang, Kuei-Hu

    2014-01-01

    Security threat assessment of the Internet security system has become a greater concern in recent years because of the progress and diversification of information technology. Traditionally, the failure probabilities of the bottom events of an Internet security system are treated as exact values when the failure probability of the entire system is estimated. However, when the malfunction data for the system's elementary events are incomplete, the traditional approach for calculating reliability is no longer applicable. Moreover, it does not consider the failure probability of bottom events subjected to attack, which may bias conclusions. In order to effectively solve these problems, this paper proposes a novel technique integrating attack trees and vague sets for security threat assessment. For verification of the proposed approach, a numerical example of an Internet security system security threat assessment is adopted in this paper. The result of the proposed method is compared with existing security threat assessment approaches.
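
    For crisp (non-interval) failure probabilities with independent bottom events, the baseline calculation that the vague-set method generalizes is standard Boolean propagation through AND/OR gates. A minimal sketch of that baseline with a hypothetical tree:

      # Crisp attack-tree propagation with independent bottom events:
      # AND-gate: product of child probabilities; OR-gate: 1 - prod(1 - p).
      # The vague-set method replaces these point values with interval-valued
      # memberships; this sketch shows only the crisp baseline.
      from functools import reduce

      def and_gate(probs):
          return reduce(lambda a, b: a * b, probs, 1.0)

      def or_gate(probs):
          return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

      # Hypothetical tree: top = OR(AND(p1, p2), p3)
      p1, p2, p3 = 0.10, 0.20, 0.05
      top = or_gate([and_gate([p1, p2]), p3])
      print(f"top-event probability: {top:.4f}")   # 1 - (1-0.02)(1-0.05) = 0.069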

  19. Basic Information for EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    Contains basic information on the role and origins of the Selected Analytical Methods including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events

  20. Meta-analysis for aggregated survival data with competing risks: a parametric approach using cumulative incidence functions.

    PubMed

    Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido

    2016-09-01

    Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
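
    Under the constant cause-specific hazards assumption, the CIF has the closed form CIF_k(t) = (lam_k / lam) * (1 - exp(-lam * t)), with lam the all-cause hazard, so CIF ratios between arms can be computed directly. A minimal sketch with assumed hazard values (not from the meta-analysis):

      # Cumulative incidence under constant cause-specific hazards:
      # CIF_k(t) = (lam_k / lam_total) * (1 - exp(-lam_total * t)).
      import math

      def cif(lam_k, lam_total, t):
          return (lam_k / lam_total) * (1.0 - math.exp(-lam_total * t))

      # Two arms, two competing event types (per-year hazards, assumed):
      lam_trt = {"event": 0.05, "competing": 0.10}
      lam_ctl = {"event": 0.08, "competing": 0.10}
      t = 2.0  # years of follow-up

      ratio = cif(lam_trt["event"], sum(lam_trt.values()), t) / \
              cif(lam_ctl["event"], sum(lam_ctl.values()), t)
      print(f"CIF ratio for the event of interest at t={t}: {ratio:.3f}")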

  1. Infants Segment Continuous Events Using Transitional Probabilities

    ERIC Educational Resources Information Center

    Stahl, Aimee E.; Romberg, Alexa R.; Roseberry, Sarah; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathryn

    2014-01-01

    Throughout their 1st year, infants adeptly detect statistical structure in their environment. However, little is known about whether statistical learning is a primary mechanism for event segmentation. This study directly tests whether statistical learning alone is sufficient to segment continuous events. Twenty-eight 7- to 9-month-old infants…

  2. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  3. Controlling a stream of paranoia evoking events in a virtual reality environment.

    PubMed

    Isnanda, Reza Giga; Brinkman, Willem-Paul; Veling, Wim; van der Gaag, Mark; Neerincx, Mark

    2014-01-01

    Although virtual reality exposure has been reported as a method to induce paranoid thought, little is known about mechanisms to control specific virtual stressors. This paper reports on a study that examines the effect of controlling the stream of potential paranoia-evoking events in a virtual restaurant world. A 2-by-2 experiment with a non-clinical group (n = 24) was conducted with two within-subject factors: (1) the cycle time (short/long) after which the computer considers activating a paranoia-evoking event, and (2) the probability (low/high) that a paranoia-evoking event would be triggered at the completion of a cycle. The results showed a significant main effect of the probability factor, and a two-way interaction effect with the cycle time factor, on the number of paranoid comments participants made and their self-reported anxiety.

  4. Probabilistic Models For Earthquakes With Large Return Periods In Himalaya Region

    NASA Astrophysics Data System (ADS)

    Chaudhary, Chhavi; Sharma, Mukat Lal

    2017-12-01

    Determination of the frequency of large earthquakes is of paramount importance for seismic risk assessment, as large events contribute a significant fraction of the total deformation, and these long-return-period events with low probability of occurrence are not easily captured by classical distributions. Generally, with a small catalogue, these larger events follow a different distribution function from the smaller and intermediate events. It is thus of special importance to use statistical methods that analyse as closely as possible the range of extreme values, or the tail of the distributions, in addition to the main distributions. The generalised Pareto distribution family is widely used for modelling events that cross a specified threshold value. The Pareto, truncated Pareto, and tapered Pareto are special cases of the generalised Pareto family. In this work, the probability of earthquake occurrence has been estimated using the Pareto, truncated Pareto, and tapered Pareto distributions. As a case study, the Himalayas, whose orogeny generates large earthquakes and which is one of the most seismically active zones of the world, has been considered. The whole Himalayan region has been divided into five seismic source zones according to seismotectonics and the clustering of events. Estimated probabilities of earthquake occurrence have also been compared with the modified Gutenberg-Richter distribution and the characteristic recurrence distribution. The statistical analysis reveals that the tapered Pareto distribution better describes seismicity for the seismic source zones in comparison to the other distributions considered in the present study.
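
    The tapered Pareto distribution referred to here is commonly written with survival function S(x) = (theta/x)^alpha * exp((theta - x)/c) for x >= theta, i.e., a power law softened by an exponential taper at a corner scale c. A minimal comparison with the plain Pareto, using illustrative parameters:

      # Exceedance (survival) probabilities for the Pareto and tapered Pareto
      # distributions above a threshold theta. Parameters are illustrative.
      import math

      def pareto_sf(x, theta, alpha):
          return (theta / x) ** alpha

      def tapered_pareto_sf(x, theta, alpha, corner):
          # power law softened by an exponential taper at the "corner" scale
          return (theta / x) ** alpha * math.exp((theta - x) / corner)

      theta, alpha, corner = 1.0, 1.0, 50.0
      for x in (2.0, 10.0, 100.0, 500.0):
          print(x, round(pareto_sf(x, theta, alpha), 5),
                round(tapered_pareto_sf(x, theta, alpha, corner), 5))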

  5. Army ants algorithm for rare event sampling of delocalized nonadiabatic transitions by trajectory surface hopping and the estimation of sampling errors by the bootstrap method.

    PubMed

    Nangia, Shikha; Jasper, Ahren W; Miller, Thomas F; Truhlar, Donald G

    2004-02-22

    The most widely used algorithm for Monte Carlo sampling of electronic transitions in trajectory surface hopping (TSH) calculations is the so-called anteater algorithm, which is inefficient for sampling low-probability nonadiabatic events. We present a new sampling scheme (called the army ants algorithm) for carrying out TSH calculations that is applicable to systems with any strength of coupling. The army ants algorithm is a form of rare event sampling whose efficiency is controlled by an input parameter. By choosing a suitable value of the input parameter the army ants algorithm can be reduced to the anteater algorithm (which is efficient for strongly coupled cases), and by optimizing the parameter the army ants algorithm may be efficiently applied to systems with low-probability events. To demonstrate the efficiency of the army ants algorithm, we performed atom-diatom scattering calculations on a model system involving weakly coupled electronic states. Fully converged quantum mechanical calculations were performed, and the probabilities for nonadiabatic reaction and nonreactive deexcitation (quenching) were found to be on the order of 10^-8. For such low-probability events the anteater sampling scheme requires a large number of trajectories (approximately 10^10) to obtain good statistics and converged semiclassical results. In contrast, by using the new army ants algorithm converged results were obtained by running 10^5 trajectories. Furthermore, the results were found to be in excellent agreement with the quantum mechanical results. Sampling errors were estimated using the bootstrap method, which is validated for use with the army ants algorithm. (c) 2004 American Institute of Physics.

  6. Event probabilities and impact zones for hazardous materials accidents on railroads

    DOT National Transportation Integrated Search

    1983-11-01

    Procedures are presented for evaluating the probability and impacts of hazardous material accidents in rail transportation. The significance of track class for accident frequencies and of train speed for accident severity is quantified. Special atten...

  7. A Contrast-Based Computational Model of Surprise and Its Applications.

    PubMed

    Macedo, Luis; Cardoso, Amílcar

    2017-11-19

    We review our work on a contrast-based computational model of surprise and its applications. The review is contextualized within related research from psychology, philosophy, and particularly artificial intelligence. Influenced by psychological theories of surprise, the model assumes that surprise-eliciting events initiate a series of cognitive processes that begin with the appraisal of the event as unexpected, continue with the interruption of ongoing activity and the focusing of attention on the unexpected event, and culminate in the analysis and evaluation of the event and the revision of beliefs. It is assumed that the intensity of surprise elicited by an event is a nonlinear function of the difference or contrast between the subjective probability of the event and that of the most probable alternative event (which is usually the expected event); and that the agent's behavior is partly controlled by actual and anticipated surprise. We describe applications of artificial agents that incorporate the proposed surprise model in three domains: the exploration of unknown environments, creativity, and intelligent transportation systems. These applications demonstrate the importance of surprise for decision making, active learning, creative reasoning, and selective attention. Copyright © 2017 Cognitive Science Society, Inc.
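
    The contrast measure described here, surprise as a nonlinear function of the gap between the probability of the most expected event and that of the event that occurred, can be sketched as follows; log2(1 + contrast) is one plausible nonlinear choice, not necessarily the authors' exact formula.

      # Contrast-based surprise sketch: intensity grows nonlinearly with the gap
      # between the probability of the most expected event and the probability of
      # the event that actually occurred. log2(1 + contrast) is an illustrative
      # choice, not necessarily the published model's exact functional form.
      import math

      def surprise(p_event: float, probs: dict) -> float:
          p_max = max(probs.values())          # most probable (expected) event
          contrast = max(0.0, p_max - p_event)
          return math.log2(1.0 + contrast)

      beliefs = {"on_time": 0.7, "delayed": 0.25, "cancelled": 0.05}
      print(surprise(beliefs["cancelled"], beliefs))  # large contrast -> high surprise
      print(surprise(beliefs["on_time"], beliefs))    # expected event -> zero surprise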

  8. Springtime ENSO Flavors and Their Impacts on US Regional Tornado Outbreaks

    NASA Astrophysics Data System (ADS)

    Lee, S. K.; Wittenberg, A. T.; Enfield, D. B.; Weaver, S. J.; Wang, C.; Atlas, R. M.

    2015-12-01

    A new method is presented to objectively characterize and explore the differences in the space-time evolution of equatorial Pacific SSTAs observed during El Niño events. An application of this method to the 21 El Niño events during 1949-2013 captured two leading orthogonal modes, which explain more than 60% of the inter-event variance. The first mode distinguishes a strong and persistent El Niño from a weak and early-terminating El Niño. A similar analysis applied to the 22 La Niña events during 1949-2013 also revealed two leading orthogonal modes, with the first mode distinguishing a resurgent La Niña from a transitioning La Niña. This study shows that the four main phases of springtime El Niño-Southern Oscillation (ENSO) evolution (persistent versus early-terminating El Niño, and resurgent versus transitioning La Niña) are linked to distinctive spatial patterns of the probability of U.S. regional tornado outbreaks. In particular, the outbreak probability increases significantly, by up to 27%, over the Ohio Valley, Upper Midwest and Southeast when a La Niña persists into the spring and is followed by another La Niña (i.e., a resurgent La Niña). The probability also increases significantly, by up to 38%, but mainly in the South, when a two-year La Niña transitions to an El Niño (i.e., a transitioning La Niña). These changes in outbreak probability are shown to be largely consistent with remotely forced regional changes in the large-scale tropospheric circulation, low-level vertical wind shear, moisture transports and extratropical storm activity.

  9. Global warming precipitation accumulation increases above the current-climate cutoff scale

    PubMed Central

    Sahany, Sandeep; Stechmann, Samuel N.; Bernstein, Diana N.

    2017-01-01

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff. PMID:28115693
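
    The accumulation-size pdf described here, a slowly decaying power-law range terminated by an exponential cutoff, can be written p(s) proportional to s^(-tau) * exp(-s/s_L); extending the cutoff s_L under warming multiplies the probability of the largest events. A minimal sketch with illustrative exponent and cutoff values:

      # Accumulation-size pdf sketch: power-law range with a large-event cutoff,
      # p(s) ~ s**(-tau) * exp(-s / s_cut). Increasing the cutoff (as predicted
      # under warming) multiplies the probability of the largest accumulations.
      # tau and the cutoff values are illustrative.
      import math

      def accum_pdf(s, tau, s_cut):
          return s ** (-tau) * math.exp(-s / s_cut)    # unnormalized

      tau = 1.5
      s_cut_now, s_cut_warm = 100.0, 130.0             # current vs. warmer-climate cutoff (mm)
      for s in (50.0, 200.0, 400.0):
          ratio = accum_pdf(s, tau, s_cut_warm) / accum_pdf(s, tau, s_cut_now)
          print(f"s = {s:>5.0f} mm: probability ratio = {ratio:.2f}")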

  10. Global warming precipitation accumulation increases above the current-climate cutoff scale

    NASA Astrophysics Data System (ADS)

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.; Bernstein, Diana N.

    2017-02-01

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  11. Fast adaptation of the internal model of gravity for manual interceptions: evidence for event-dependent learning.

    PubMed

    Zago, Myrka; Bosco, Gianfranco; Maffei, Vincenzo; Iosa, Marco; Ivanenko, Yuri P; Lacquaniti, Francesco

    2005-02-01

    We studied how subjects learn to deal with two conflicting sensory environments as a function of the probability of each environment and the temporal distance between repeated events. Subjects were asked to intercept a visual target moving downward on a screen with randomized laws of motion. We compared five protocols that differed in the probability of constant speed (0g) targets and accelerated (1g) targets. Probability ranged from 9 to 100%, and the time interval between consecutive repetitions of the same target ranged from about 1 to 20 min. We found that subjects systematically timed their responses consistent with the assumption of gravity effects, for both 1 and 0g trials. With training, subjects rapidly adapted to 0g targets by shifting the time of motor activation. Surprisingly, the adaptation rate was independent of both the probability of 0g targets and their temporal distance. Very few 0g trials sporadically interspersed as catch trials during immersive practice with 1g trials were sufficient for learning and consolidation in long-term memory, as verified by retesting after 24 h. We argue that the memory store for adapted states of the internal gravity model is triggered by individual events and can be sustained for prolonged periods of time separating sporadic repetitions. This form of event-related learning could depend on multiple-stage memory, with exponential rise and decay in the initial stages followed by a sample-and-hold module.

  12. A comprehensive multi-scenario based approach for a reliable flood-hazard assessment: a case-study application

    NASA Astrophysics Data System (ADS)

    Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi

    2015-04-01

    Flood hazard is generally assessed by assuming the return period of the rainfall as a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended also to the straightforward application of hydrodynamic models. However, the climate (i.e. precipitation), the catchment (i.e. geology, soil and antecedent soil-moisture condition) and the anthropogenic (i.e. drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can significantly differ from the occurrence probability of the triggering event (i.e. rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rationale method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach for hazard assessment should be applied and should consider the possible events taking place in the area potentially subject to flooding (i.e. floodplains). Here, we apply a multi-scenario approach for the assessment of the flood hazard around the Idro lake (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e. initial water level in the lake) and boundary (i.e. shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared to traditional (and essentially deterministic) approaches.

  13. Global warming precipitation accumulation increases above the current-climate cutoff scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  14. Global warming precipitation accumulation increases above the current-climate cutoff scale.

    PubMed

    Neelin, J David; Sahany, Sandeep; Stechmann, Samuel N; Bernstein, Diana N

    2017-02-07

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  15. Global warming precipitation accumulation increases above the current-climate cutoff scale

    DOE PAGES

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.; ...

    2017-01-23

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  16. Estimating rare events in biochemical systems using conditional sampling.

    PubMed

    Sundar, V S

    2017-01-28

    The paper focuses on development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, important sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
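
    The key idea, writing a rare-event probability as a product of more frequent conditional probabilities estimated level by level with Markov chain Monte Carlo, can be illustrated on a toy problem, P(X > 4) for X ~ N(0, 1), rather than a biochemical network. A sketch of a basic subset-simulation loop:

      # Subset simulation sketch for a toy rare event: P(X > 4), X ~ N(0, 1).
      # The rare-event probability is written as a product of conditional
      # probabilities estimated level by level with a Metropolis sampler.
      # This illustrates the general method, not the paper's biochemical setup.
      import numpy as np

      rng = np.random.default_rng(0)
      N, p0, target = 2000, 0.1, 4.0

      def mcmc_step(x, level):
          # random-walk Metropolis targeting N(0,1) conditioned on X > level
          cand = x + rng.normal(0.0, 1.0, size=x.shape)
          accept = rng.random(size=x.shape) < np.exp(0.5 * (x**2 - cand**2))
          new = np.where(accept, cand, x)
          return np.where(new > level, new, x)   # reject moves leaving the level set

      x = rng.normal(size=N)
      prob = 1.0
      while True:
          level = np.quantile(x, 1.0 - p0)
          if level >= target:
              prob *= np.mean(x > target)
              break
          prob *= p0
          seeds = x[x > level]
          chains = [seeds]
          while sum(len(c) for c in chains) < N:     # grow chains from the seeds
              chains.append(mcmc_step(chains[-1], level))
          x = np.concatenate(chains)[:N]

      print(f"subset-simulation estimate: {prob:.2e}  (exact is about 3.17e-05)")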

  17. Delay and probability discounting of sexual and monetary outcomes in individuals with cocaine use disorders and matched controls.

    PubMed

    Johnson, Matthew W; Johnson, Patrick S; Herrmann, Evan S; Sweeney, Mary M

    2015-01-01

    Individuals with cocaine use disorders are disproportionately affected by HIV/AIDS, partly due to higher rates of unprotected sex. Recent research suggests delay discounting of condom use is a factor in sexual HIV risk. Delay discounting is a behavioral economic concept describing how delaying an event reduces that event's value or impact on behavior. Probability discounting is a related concept describing how the uncertainty of an event decreases its impact on behavior. Individuals with cocaine use disorders (n = 23) and matched non-cocaine-using controls (n = 24) were compared in decision-making tasks involving hypothetical outcomes: delay discounting of condom-protected sex (Sexual Delay Discounting Task), delay discounting of money, the effect of sexually transmitted infection (STI) risk on likelihood of condom use (Sexual Probability Discounting Task), and probability discounting of money. The Cocaine group discounted delayed condom-protected sex (i.e., were more likely to have unprotected sex vs. wait for a condom) significantly more than controls in two of four Sexual Delay Discounting Task partner conditions. The Cocaine group also discounted delayed money (i.e., preferred smaller immediate amounts over larger delayed amounts) significantly more than controls. In the Sexual Probability Discounting Task, both groups showed sensitivity to STI risk; however, the groups did not differ. The Cocaine group did not consistently discount probabilistic money more or less than controls. Steeper discounting of delayed, but not probabilistic, sexual outcomes may contribute to greater rates of sexual HIV risk among individuals with cocaine use disorders. Probability discounting of sexual outcomes may contribute to risk of unprotected sex in both groups. Correlations showed sexual and monetary results were unrelated, for both delay and probability discounting. The results highlight the importance of studying specific behavioral processes (e.g., delay and probability discounting) with respect to specific outcomes (e.g., monetary and sexual) to understand decision making in problematic behavior.

  18. On the use of variability time-scales as an early classifier of radio transients and variables

    NASA Astrophysics Data System (ADS)

    Pietka, M.; Staley, T. D.; Pretorius, M. L.; Fender, R. P.

    2017-11-01

    We have shown previously that a broad correlation between the peak radio luminosity and the variability time-scales, approximately L ∝ τ5, exists for variable synchrotron emitting sources and that different classes of astrophysical sources occupy different regions of luminosity and time-scale space. Based on those results, we investigate whether the most basic information available for a newly discovered radio variable or transient - their rise and/or decline rate - can be used to set initial constraints on the class of events from which they originate. We have analysed a sample of ≈800 synchrotron flares, selected from light curves of ≈90 sources observed at 5-8 GHz, representing a wide range of astrophysical phenomena, from flare stars to supermassive black holes. Selection of outbursts from the noisy radio light curves has been done automatically in order to ensure reproducibility of results. The distribution of rise/decline rates for the selected flares is modelled as a Gaussian probability distribution for each class of object, and further convolved with estimated areal density of that class in order to correct for the strong bias in our sample. We show in this way that comparing the measured variability time-scale of a radio transient/variable of unknown origin can provide an early, albeit approximate, classification of the object, and could form part of a suite of measurements used to provide early categorization of such events. Finally, we also discuss the effect scintillating sources will have on our ability to classify events based on their variability time-scales.
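
    The classification step described above amounts to weighting a per-class Gaussian likelihood of the measured (log) rise or decline rate by the estimated areal density of each class. A minimal sketch of that posterior calculation follows; the class means, spreads, and relative densities are invented for illustration and are not the fitted values from the paper.

      import numpy as np
      from scipy.stats import norm

      # hypothetical per-class distributions of log10(rise rate) and relative
      # areal densities on the sky -- illustrative numbers only
      classes = {
          #                   mean, std of log10(rate), relative areal density
          "flare star":        (-1.0, 0.6, 0.70),
          "X-ray binary":      (-0.3, 0.5, 0.25),
          "AGN/blazar flare":  ( 0.5, 0.7, 0.05),
      }

      def classify(log_rise_rate):
          """Posterior class probabilities for one measured variability rate."""
          weights = {name: dens * norm.pdf(log_rise_rate, mu, sd)
                     for name, (mu, sd, dens) in classes.items()}
          total = sum(weights.values())
          return {name: round(w / total, 3) for name, w in weights.items()}

      print(classify(log_rise_rate=-0.8))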

  19. SimEngine v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, Hai D.

    2017-03-02

    SimEngine provides the core functionalities and components that are key to the development of discrete event simulation tools. These include events, activities, event queues, random number generators, and basic result tracking classes. SimEngine was designed for high performance, integrates seamlessly into any Microsoft .Net development environment, and provides a flexible API for simulation developers.
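
    SimEngine itself is a .NET library and its API is not reproduced here; the following generic Python sketch only illustrates the core components the description lists (events, an event queue, a random number generator, and simple result tracking), using an M/M/1 queue as a stand-in workload.

      import heapq
      import random

      class Simulator:
          """Minimal discrete event simulation core: a clock, a priority queue
          of (time, sequence, callback) events, and a result log."""
          def __init__(self, seed=42):
              self.now = 0.0
              self._queue = []
              self._seq = 0                # tie-breaker so heapq never compares callbacks
              self.rng = random.Random(seed)
              self.results = []

          def schedule(self, delay, callback):
              heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
              self._seq += 1

          def run(self, until):
              while self._queue and self._queue[0][0] <= until:
                  self.now, _, callback = heapq.heappop(self._queue)
                  callback(self)

      def make_mm1(sim, lam=0.8, mu=1.0):
          """Example activity: a single-server queue with Poisson arrivals."""
          state = {"busy": False, "waiting": 0}

          def arrival(sim):
              sim.results.append((sim.now, state["waiting"]))   # queue seen by arrival
              if state["busy"]:
                  state["waiting"] += 1
              else:
                  state["busy"] = True
                  sim.schedule(sim.rng.expovariate(mu), departure)
              sim.schedule(sim.rng.expovariate(lam), arrival)   # next arrival

          def departure(sim):
              if state["waiting"] > 0:
                  state["waiting"] -= 1
                  sim.schedule(sim.rng.expovariate(mu), departure)
              else:
                  state["busy"] = False

          sim.schedule(sim.rng.expovariate(lam), arrival)

      sim = Simulator()
      make_mm1(sim)
      sim.run(until=10_000)
      print("mean queue length seen by arrivals:",
            sum(q for _, q in sim.results) / len(sim.results))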

  20. Percutaneous left atrial appendage closure vs warfarin for atrial fibrillation: a randomized clinical trial.

    PubMed

    Reddy, Vivek Y; Sievert, Horst; Halperin, Jonathan; Doshi, Shephal K; Buchbinder, Maurice; Neuzil, Petr; Huber, Kenneth; Whisenant, Brian; Kar, Saibal; Swarup, Vijay; Gordon, Nicole; Holmes, David

    2014-11-19

    While effective in preventing stroke in patients with atrial fibrillation (AF), warfarin is limited by a narrow therapeutic profile, a need for lifelong coagulation monitoring, and multiple drug and diet interactions. To determine whether a local strategy of mechanical left atrial appendage (LAA) closure was noninferior to warfarin. PROTECT AF was a multicenter, randomized (2:1), unblinded, Bayesian-designed study conducted at 59 hospitals of 707 patients with nonvalvular AF and at least 1 additional stroke risk factor (CHADS2 score ≥1). Enrollment occurred between February 2005 and June 2008 and included 4-year follow-up through October 2012. Noninferiority required a posterior probability greater than 97.5% and superiority a probability of 95% or greater; the noninferiority margin was a rate ratio of 2.0 comparing event rates between treatment groups. Left atrial appendage closure with the device (n = 463) or warfarin (n = 244; target international normalized ratio, 2-3). A composite efficacy end point including stroke, systemic embolism, and cardiovascular/unexplained death, analyzed by intention-to-treat. At a mean (SD) follow-up of 3.8 (1.7) years (2621 patient-years), there were 39 events among 463 patients (8.4%) in the device group for a primary event rate of 2.3 events per 100 patient-years, compared with 34 events among 244 patients (13.9%) for a primary event rate of 3.8 events per 100 patient-years with warfarin (rate ratio, 0.60; 95% credible interval, 0.41-1.05), meeting prespecified criteria for both noninferiority (posterior probability, >99.9%) and superiority (posterior probability, 96.0%). Patients in the device group demonstrated lower rates of both cardiovascular mortality (1.0 events per 100 patient-years for the device group [17/463 patients, 3.7%] vs 2.4 events per 100 patient-years with warfarin [22/244 patients, 9.0%]; hazard ratio [HR], 0.40; 95% CI, 0.21-0.75; P = .005) and all-cause mortality (3.2 events per 100 patient-years for the device group [57/466 patients, 12.3%] vs 4.8 events per 100 patient-years with warfarin [44/244 patients, 18.0%]; HR, 0.66; 95% CI, 0.45-0.98; P = .04). After 3.8 years of follow-up among patients with nonvalvular AF at elevated risk for stroke, percutaneous LAA closure met criteria for both noninferiority and superiority, compared with warfarin, for preventing the combined outcome of stroke, systemic embolism, and cardiovascular death, as well as superiority for cardiovascular and all-cause mortality. clinicaltrials.gov Identifier: NCT00129545.

  1. Syncope among U.S. Air Force basic military trainees, August 2012-July 2013.

    PubMed

    Webber, Bryant J; Cropper, Thomas L; Federinko, Susan P

    2013-11-01

    Syncope is a common event with many possible etiologies, ranging from benign to severe. Syncopal episodes of any origin, however, may result in traumatic injury due to postural collapse. Based on the prevalence of internal and external stressors during training, basic military trainees may be at increased risk for syncope. Between 1 August 2012 and 31 July 2013, there were 112 unique individuals who experienced syncopal or pre-syncopal events among basic military trainees at Joint Base San Antonio-Lackland, Texas, the only basic training site in the U.S. Air Force. The overall rate was 19.6 cases per 1,000 person-years (18.4 and 36.1 per 1,000 person-years in males and females, respectively). Based upon the findings of electronic chart review of the 112 cases, a majority of events occurred either during or immediately after exercise (n=38) or during a blood draw, immunization, or laceration repair (n=22). The most common etiologies were judged to be neurocardiogenic (n=54) and orthostatic hypotension (n=40), and two cases were attributed to cardiovascular disease. These findings support current preventive measures, including anemia screening during medical in-processing, an emphasis on hydration throughout training, and a padded floor in the trainee vaccination bay.

  2. Lightning Strike Peak Current Probabilities as Related to Space Shuttle Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Vaughan, William W.

    2000-01-01

    A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates of the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reenter/land, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.

  3. High-altitude electrical discharges associated with thunderstorms and lightning

    NASA Astrophysics Data System (ADS)

    Liu, Ningyu; McHarg, Matthew G.; Stenbaek-Nielsen, Hans C.

    2015-12-01

    The purpose of this paper is to introduce electrical discharge phenomena known as transient luminous events above thunderstorms to the lightning protection community. Transient luminous events include the upward electrical discharges from thunderstorms known as starters, jets, and gigantic jets, and electrical discharges initiated in the lower ionosphere such as sprites, halos, and elves. We give an overview of these phenomena with a focus on starters, jets, gigantic jets, and sprites, because similar to ordinary lightning, streamers and leaders are basic components of these four types of transient luminous events. We present a few recent observations to illustrate their main properties and briefly review the theories. The research in transient luminous events has not only advanced our understanding of the effects of thunderstorms and lightning in the middle and upper atmosphere, but also improved our knowledge of basic electrical discharge processes critical for sparks and lightning.

  4. Condom Use as a Function of Number of Coital Events in New Relationships.

    PubMed

    He, Fei; Hensel, Devon J; Harezlak, Jaroslaw; Fortenberry, J Dennis

    2016-02-01

    To assess condom use as a function of number of coital events in newly formed sexual relationships. Participants who reported at least one new partner during the 12-week study interval (n = 115; ages 18-29 years; 48% women; 90% African American) completed weekly sexually transmitted infections testing and 3 times daily electronic diary collection assessing individual and partner-specific affect, daily activities, sexual behavior, and condom use. We analyzed event-level condom use percentage and participant-level behavior response effects. Generalized additive mixed models were used to estimate condom use probability accounting for within-partner and within-participant correlations via random effects. The average condom use probability at the first coital event in new relationships was 55% for men and 36% for women. Analyses showed that smooth shapes of estimated condom use probabilities were similar for both sexes and were fitted using generalized additive mixed models. Relatively higher condom use percentage was followed by a sharp decline during the first 9 coital events decreasing to 16% for men and 8% for women. More rapid decline in condom use among women was highly associated with higher levels of relationship and sexual satisfaction. The likelihood of condom use declines sharply for both men and women after the early accrual experience with a partner. Relationship and sexual satisfaction also influence declines in condom use, especially among women.

  5. Condom use as a function of number of coital events in new relationships

    PubMed Central

    He, Fei; Hensel, Devon J.; Harezlak, Jaroslaw; Fortenberry, J. Dennis

    2015-01-01

    Study Objective Assess condom use as a function of number of coital events in newly formed sexual relationships. Methods Participants who reported at least one new partner during the 12-week study interval (N=115; ages 18–29 years; 48% women; 90% African American) completed weekly sexually transmitted infections testing and three-times daily electronic diary collection assessing individual and partner-specific affect, daily activities, sexual behavior and condom use. We analyzed event-level condom use percentage and subject-level behavior response effects. Generalized Additive Mixed Models (GAMMs) were used to estimate condom use probability accounting for within-partner and within-subject correlations via random effects. Results The average condom use probability at the first coital event in new relationships was 55% for men and 36% for women. Analyses showed that smooth shapes of estimated condom use probabilities were similar for both sexes and were fitted using GAMMs. Relatively higher condom use percentage was followed by a sharp decline during the first 9 coital events decreasing to 16% for men and 8% for women. More rapid decline in condom use among women was highly associated with higher levels of relationship and sexual satisfaction. Conclusions The likelihood of condom use declines sharply for both men and women after the early accrual experience with a partner. Relationship and sexual satisfaction also influence declines in condom use, especially among women. PMID:26766522

  6. NASA's Meteoroid Environments Office's Response to Three Significant Bolide Events Over North America

    NASA Technical Reports Server (NTRS)

    Blaauw, Rhiannon C.; Cooke, William J.; Kingery, Aaron M.

    2015-01-01

    Being the only U.S. Government entity charged with monitoring the meteor environment, the Meteoroid Environment Office has deployed a network of all sky and wide field meteor cameras, along with the appropriate software tools to quickly analyze data from these systems. However, the coverage of this network is still quite limited, forcing the incorporation of data from other cameras posted to the internet in analyzing many of the fireballs reported by the public and media. A procedure has been developed that determines the analysis process for a given fireball event based on the types and amount of data available. The differences between these analysis processes will be explained and outlined by looking at three bolide events, all of which were large enough to produce meteorites. The first example is an ideal event - a bright meteor that occurred over NASA's All Sky Camera Network on August 2, 2014. With clear video of the event from various angles, a high-accuracy trajectory, beginning and end heights, orbit, and approximate brightness/size of the event can be determined very quickly using custom software. The bolide had the potential to have dropped meteorites, so dark flight analysis and modeling were performed, allowing potential fall locations to be mapped as a function of meteorite mass. The second case study was a bright bolide that occurred November 3, 2014 over West Virginia. This was just north of the NASA southeastern all-sky network, and just south of the Ohio-Pennsylvania network. This case study showcases the MEO's ability to use social media and various internet sources to locate videos of the event from obscure sources (including the Washington Monument) for anything that will permit a determination of a basic trajectory and fireball light curve. The third case study will highlight the ability to use Doppler weather radar in helping locate meteorites, which enable a definitive classification of the impactor. The input data and analysis steps differ for each case study, but the goals remain the same - a trajectory, orbit, and mass estimate for the bolide within hours of the event, and, for events with a high probability of producing meteorites, a location of the strewn field within a day.

  7. Radioprotective agents in medicine.

    PubMed

    Duraković, A

    1993-12-01

    The diminished probability of strategic nuclear confrontation alleviates some of the global concerns about large numbers of radiation casualties in the event of a nuclear war. As a result of the protection of the environment, the management of smaller numbers of radiation casualties assumes a more predictable and more specific role confined to accidents in nuclear energy projects, industry, technology and science. Recent experience of the consequences of accidents in nuclear power plants, in the field of radiotherapy and in the disposal of radioactive waste and spent fuel, present the medical and scientific communities with formidable problems if such events are to lead to minimal adverse effects on the biosphere. Whereas it is not possible to predict a nuclear or radiation accident, radioprotection is hardly an issue of health science alone, but rather an issue of the strictest quality assurance in all aspects of the utilization of nuclear energy and ionizing radiation. Thus, the medical community concerned with radioprotection will have to confine its emphasis on the management of radiation-induced alterations of the human organism from acute radiation syndromes to the stochastic concepts of chronic alterations of radiosensitive organic systems. Current multidisciplinary research in the field of radioprotection involves all aspects of basic and clinical research ranging from the subatomic mechanisms of free radical formation, macromolecular and intracellular radiation-induced alterations, biochemical and physiological homeostatic mechanisms and organ level manifestations to the clinical management of radiation casualties in a controlled hospital environment. Radioprotective agents, although widely studied in the past four decades and including several thousand agents, have not reached the level of providing the field of medicine with an agent that conforms to all criteria of an optimal radioprotectant, including effectiveness, toxicity, availability, specificity and tolerance. This article discusses the current state of radioprotection in medical therapy, and emphasizes a need for continued research in the area of medical management of radiation casualties from the viewpoint of a realistic probability of nuclear incidents or accidents in the nuclear energy-dependent world at the end of the millennium.

  8. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    PubMed

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades-off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.

  9. The role of magical thinking in forecasting the future.

    PubMed

    Stavrova, Olga; Meckel, Andrea

    2017-02-01

    This article explores the role of magical thinking in the subjective probabilities of future chance events. In five experiments, we show that individuals tend to predict a more lucky future (reflected in probability judgements of lucky and unfortunate chance events) for someone who happened to purchase a product associated with a highly moral person than for someone who unknowingly purchased a product associated with a highly immoral person. In the former case, positive events were considered more likely than negative events, whereas in the latter case, the difference in the likelihood judgement of positive and negative events disappeared or even reversed. Our results indicate that this effect is unlikely to be driven by participants' immanent justice beliefs, the availability heuristic, or experimenter demand. Finally, we show that individuals rely more heavily on magical thinking when their need for control is threatened, thus suggesting that lack of control represents a factor in driving magical thinking in making predictions about the future. © 2016 The British Psychological Society.

  10. SEC proton prediction model: verification and analysis.

    PubMed

    Balch, C C

    1999-06-01

    This paper describes a model that has been used at the NOAA Space Environment Center since the early 1970s as a guide for the prediction of solar energetic particle events. The algorithms for proton event probability, peak flux, and rise time are described. The predictions are compared with observations. The current model shows some ability to distinguish between proton event associated flares and flares that are not associated with proton events. The comparisons of predicted and observed peak flux show considerable scatter, with an rms error of almost an order of magnitude. Rise time comparisons also show scatter, with an rms error of approximately 28 h. The model algorithms are analyzed using historical data and improvements are suggested. Implementation of the algorithm modifications reduces the rms error in the log10 of the flux prediction by 21%, and the rise time rms error by 31%. Improvements are also realized in the probability prediction by deriving the conditional climatology for proton event occurrence given flare characteristics.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krier, D. J.; Perry, F. V.

    Location, timing, volume, and eruptive style of post-Miocene volcanoes have defined the volcanic hazard significant to a proposed high-level radioactive waste (HLW) and spent nuclear fuel (SNF) repository at Yucca Mountain, Nevada, as a low-probability, high-consequence event. Examination of eruptive centers in the region that may be analogues to possible future volcanic activity at Yucca Mountain has aided in defining and evaluating the consequence scenarios for intrusion into and eruption above a repository. The probability of a future event intersecting a repository at Yucca Mountain has a mean value of 1.7 x 10^-8 per year. This probability comes from the Probabilistic Volcanic Hazard Assessment (PVHA) completed in 1996 and updated to reflect change in repository layout. Since that time, magnetic anomalies representing potential buried volcanic centers have been identified from magnetic surveys; however, these potential buried centers only slightly increase the probability of an event intersecting the repository. The proposed repository will be located in the central portion of Yucca Mountain at approximately 300 m depth. The process for assessing performance of a repository at Yucca Mountain has identified two scenarios for igneous activity that, although having a very low probability of occurrence, could have a significant consequence should an igneous event occur. Either a dike swarm intersecting repository drifts containing waste packages or a volcanic eruption through the repository could result in release of radioactive material to the accessible environment. Ongoing investigations are assessing the mechanisms and significance of the consequence scenarios. Lathrop Wells Cone (~80,000 yrs), a key analogue for estimating potential future volcanic activity, is the youngest surface expression of apparent waning basaltic volcanism in the region. Cone internal structure, lavas, and ash-fall tephra have been examined to estimate eruptive volume, eruption type, and subsurface disturbance accompanying conduit growth and eruption. The Lathrop Wells volcanic complex has a total volume estimate of approximately 0.1 km^3. The eruptive products indicate a sequence of initial magmatic fissure fountaining, early Strombolian activity, a brief hydrovolcanic phase, and violent Strombolian phase(s). Lava flows adjacent to the Lathrop Wells Cone probably were emplaced during the mid-eruptive sequence. Ongoing investigations continue to address the potential hazards of a volcanic event at Yucca Mountain.

  12. The Reinforcing Event (RE) Menu

    ERIC Educational Resources Information Center

    Addison, Roger M.; Homme, Lloyd E.

    1973-01-01

    A motivational system, the Contingency Management System, uses contracts in which some amount of defined task behavior is demanded for some interval of reinforcing event. The Reinforcing Event Menu, a list of high probability reinforcing behaviors, is used in the system as a prompting device for the learner and as an aid for the administrator in…

  13. Applying Kaplan-Meier to Item Response Data

    ERIC Educational Resources Information Center

    McNeish, Daniel

    2018-01-01

    Some IRT models can be equivalently modeled in alternative frameworks such as logistic regression. Logistic regression can also model time-to-event data, which concerns the probability of an event occurring over time. Using the relation between time-to-event models and logistic regression and the relation between logistic regression and IRT, this…
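
    The link the abstract draws can be made concrete with a small sketch: a by-hand Kaplan-Meier estimator on toy right-censored data, followed by the person-period expansion under which an ordinary logistic regression estimates the same discrete-time hazards. The data are invented for illustration.

      import numpy as np

      # toy right-censored data: follow-up time and event indicator (1 = event)
      times  = np.array([2, 3, 3, 5, 6, 7, 7, 8, 10, 10])
      events = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])

      def kaplan_meier(times, events):
          """S(t) = product over event times t_i <= t of (1 - d_i / n_i)."""
          surv, s = {}, 1.0
          for t in np.unique(times[events == 1]):
              at_risk = np.sum(times >= t)                      # n_i
              died = np.sum((times == t) & (events == 1))       # d_i
              s *= 1.0 - died / at_risk
              surv[int(t)] = round(s, 3)
          return surv

      print(kaplan_meier(times, events))

      # logistic-regression route: expand to person-period rows, one row per
      # discrete time unit a subject is at risk, with y = 1 only in the period
      # the event occurs; a logistic model on these rows gives the same
      # discrete-time hazards that the IRT/logistic equivalence exploits
      rows = [(i, t, int(events[i] == 1 and t == times[i]))
              for i in range(len(times)) for t in range(1, times[i] + 1)]
      print(rows[:6])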

  14. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    NASA Astrophysics Data System (ADS)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk management discussions and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.

  15. Application of importance sampling to the computation of large deviations in nonequilibrium processes.

    PubMed

    Kundu, Anupam; Sabhapandit, Sanjib; Dhar, Abhishek

    2011-03-01

    We present an algorithm for finding the probabilities of rare events in nonequilibrium processes. The algorithm consists of evolving the system with a modified dynamics for which the required event occurs more frequently. By keeping track of the relative weight of phase-space trajectories generated by the modified and the original dynamics one can obtain the required probabilities. The algorithm is tested on two model systems of steady-state particle and heat transport where we find a huge improvement from direct simulation methods.
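
    A toy illustration of the weighting idea (simulate under modified dynamics that make the event common, then correct with the likelihood ratio of the original to the modified process); this is a generic exponential-tilting example for a Gaussian tail, not the authors' algorithm for particle or heat transport.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      a, n = 5.0, 100_000                 # estimate P(X > a) for X ~ N(0, 1)

      # direct Monte Carlo essentially never sees the event
      x = rng.standard_normal(n)
      print("direct estimate:             ", np.mean(x > a))

      # modified dynamics: sample from N(a, 1) so the event is common, then
      # reweight each sample by the original/modified density ratio
      y = rng.normal(loc=a, size=n)
      weights = norm.pdf(y) / norm.pdf(y, loc=a)
      print("importance-sampling estimate:", np.mean((y > a) * weights))
      print("exact value:                 ", norm.sf(a))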

  16. Optimal Futility Interim Design: A Predictive Probability of Success Approach with Time-to-Event Endpoint.

    PubMed

    Tang, Zhongwen

    2015-01-01

    An analytical way to compute predictive probability of success (PPOS) together with credible interval at interim analysis (IA) is developed for big clinical trials with time-to-event endpoints. The method takes account of the fixed data up to IA, the amount of uncertainty in future data, and uncertainty about parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding optimal combination of analysis time and futility cutoff based on some PPOS criteria.
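
    A simplified stand-in for the quantity discussed above: the usual normal-approximation predictive power with a flat prior on the log hazard ratio, where the information fraction for a time-to-event endpoint is the ratio of observed to planned events. This is not the paper's exact analytical PPOS or its credible-interval construction, just the standard closed form.

      from scipy.stats import norm

      def predictive_power(z_interim, events_interim, events_final, alpha=0.025):
          """P(final analysis is significant | interim data), flat prior on the
          log hazard ratio; information fraction t = d1 / d2 for a
          time-to-event endpoint."""
          t = events_interim / events_final
          z_alpha = norm.ppf(1 - alpha)
          return norm.cdf((z_interim - z_alpha * t**0.5) / (1 - t) ** 0.5)

      # e.g. interim z = 1.8 after 150 of 300 planned events
      print(round(predictive_power(1.8, 150, 300), 3))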

  17. State-dependent biasing method for importance sampling in the weighted stochastic simulation algorithm.

    PubMed

    Roh, Min K; Gillespie, Dan T; Petzold, Linda R

    2010-11-07

    The weighted stochastic simulation algorithm (wSSA) was developed by Kuwahara and Mura [J. Chem. Phys. 129, 165101 (2008)] to efficiently estimate the probabilities of rare events in discrete stochastic systems. The wSSA uses importance sampling to enhance the statistical accuracy in the estimation of the probability of the rare event. The original algorithm biases the reaction selection step with a fixed importance sampling parameter. In this paper, we introduce a novel method where the biasing parameter is state-dependent. The new method features improved accuracy, efficiency, and robustness.

  18. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    NASA Astrophysics Data System (ADS)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event-based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity, and duration of actual rainstorm events were considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea, and Canada, respectively. It was found that the proposed approach could give a more realistic description of rainfall characteristics of rainstorm events and design rainfalls. As a result, the design rainfall quantities from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from the occurrence probabilities of individual rainstorm events and from taking a different angle on rainfall frequency analysis, and it could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
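
    A minimal sketch of the event-based calculation described above, using a hand-coded Gumbel copula; the abstract does not state which copula family or parameter values were fitted, so the marginal non-exceedance probabilities, the copula parameter, and the annual number of storm events are illustrative assumptions.

      import numpy as np

      def gumbel_copula(u, v, theta):
          """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      # illustrative values: non-exceedance probabilities of a storm's mean
      # intensity and its duration under the fitted marginals, and copula theta
      u, v, theta = 0.995, 0.99, 2.0

      # probability that a single storm event exceeds BOTH thresholds
      p_and = 1.0 - u - v + gumbel_copula(u, v, theta)

      # event-based return period, assuming ~70 independent storm events per year
      events_per_year = 70
      print(f"joint exceedance probability per event: {p_and:.5f}")
      print(f"return period: {1.0 / (events_per_year * p_and):.1f} years")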

  19. Credible occurrence probabilities for extreme geophysical events: earthquakes, volcanic eruptions, magnetic storms

    USGS Publications Warehouse

    Love, Jeffrey J.

    2012-01-01

    Statistical analysis is made of rare, extreme geophysical events recorded in historical data -- counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2}, (\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
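
    A direct numerical reading of the interval quoted above, plus the implied occurrence probability over a 10-yr window under a Poisson model; the event count and record length in the example are illustrative, not values from the paper.

      import numpy as np

      def occurrence_summary(k, tau_years, z=2.0, horizon=10.0):
          """Most likely rate k/tau, the (1/tau)[(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2]
          interval, and the horizon-window occurrence probability 1 - exp(-rate*horizon)."""
          rate = k / tau_years
          lo = max(np.sqrt(k) - z / 2.0, 0.0) ** 2 / tau_years
          hi = (np.sqrt(k) + z / 2.0) ** 2 / tau_years
          return rate, (lo, hi), 1.0 - np.exp(-rate * horizon)

      # e.g. 4 extreme events observed in a 150-yr record (illustrative counts)
      rate, (lo, hi), p10 = occurrence_summary(k=4, tau_years=150.0)
      print(f"rate = {rate:.3f}/yr, ~95% interval = ({lo:.4f}, {hi:.4f})/yr, "
            f"10-yr occurrence probability = {p10:.2f}")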

  20. A cosmic book. [of physics of early universe]

    NASA Technical Reports Server (NTRS)

    Peebles, P. J. E.; Silk, Joseph

    1988-01-01

    A system of assigning odds to the basic elements of cosmological theories is proposed in order to evaluate the strengths and weaknesses of the theories. A figure of merit for the theories is obtained by counting and weighing the plausibility of each of the basic elements that is not substantially supported by observation or mature fundamental theory. The magnetized string model is found to be the most probable. In order of decreasing probability, the ranking for the rest of the models is: (1) the magnetized string model with no exotic matter and the baryon adiabatic model; (2) the hot dark matter model and the model of cosmic string loops; (3) the canonical cold dark matter model, the cosmic string loops model with hot dark matter, and the baryonic isocurvature model; and (4) the cosmic string loops model with no exotic matter.

  1. Supplementary health insurance as a tool for risk-selection in mandatory basic health insurance markets.

    PubMed

    Paolucci, Francesco; Schut, Erik; Beck, Konstantin; Gress, Stefan; Van de Voorde, Carine; Zmora, Irit

    2007-04-01

    As the share of supplementary health insurance (SI) in health care finance is likely to grow, SI may become an increasingly attractive tool for risk-selection in basic health insurance (BI). In this paper, we develop a conceptual framework to assess the probability that insurers will use SI for favourable risk-selection in BI. We apply our framework to five countries in which risk-selection via SI is feasible: Belgium, Germany, Israel, the Netherlands, and Switzerland. For each country, we review the available evidence of SI being used as selection device. We find that the probability that SI is and will be used for risk-selection substantially varies across countries. Finally, we discuss several strategies for policy makers to reduce the chance that SI will be used for risk-selection in BI markets.

  2. Predictability of currency market exchange

    NASA Astrophysics Data System (ADS)

    Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki

    2002-05-01

    We analyze tick data of yen-dollar exchange with a focus on its up and down movement. We show that there exists a rather particular conditional probability structure with such high frequency data. This result provides us with evidence to question one of the basic assumptions of the traditional market theory, where such bias in high frequency price movements is regarded as not present. We also construct systematically a random walk model reflecting this probability structure.

  3. Redundant Sensors for Mobile Robot Navigation

    DTIC Science & Technology

    1985-09-01

    represent a probability that the area is empty, while positive numbers mean it's probably occupied. Zero represents the unknown. The basic idea is that...room to give it absolute positioning information. This works by using two infrared emitters and detectors on the robot. Measurements of angles are made...meters (T in Kelvin) 273 sec. Distances returned when assuming 80 degrees Fahrenheit, but where actual temperature is 60 degrees, will be seven inches

  4. ENSO Dynamics and Trends, AN Alternate View

    NASA Astrophysics Data System (ADS)

    Rojo Hernandez, J. D.; Lall, U.; Mesa, O. J.

    2017-12-01

    El Niño - Southern Oscillation (ENSO) is the most important inter-annual climate fluctuation on a planetary level, with great effects on the hydrological cycle, agriculture, ecosystems, health, and society. This work demonstrates the use of non-homogeneous hidden Markov models (NHMM) to characterize ENSO using a set of discrete states with a time-varying transition probability matrix, based on sea surface temperature anomaly (SSTA) data from the Kaplan Extended SST v2 between 120E-90W, 15N-15S from Jan-1856 to Dec-2016. ENSO spatial patterns, their temporal distribution, the transition probabilities between patterns, and their temporal evolution are the main results of the NHMM applied to ENSO. The five "hidden" states found appear to represent the different "flavors" described in the literature: the canonical El Niño, the central El Niño, a neutral state, the central La Niña, and the canonical La Niña. Using the whole record length of the SSTA, it was possible to identify trends in the dynamic system, with a decrease in the probability of occurrence of cold events and a significant increase of warm events, in particular of central El Niño events, whose probability of occurrence has increased dramatically since 1960, coupled with increases in global temperature.
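
    A much-simplified sketch of the discrete-state ingredient above: cluster SSTA fields into five states and estimate a transition probability matrix by counting. Note this is a homogeneous (constant-in-time) matrix on synthetic data, whereas the NHMM of the abstract lets the transition probabilities vary with time; the array sizes and the KMeans choice are illustrative assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)

      # stand-in for 1,932 monthly SSTA fields (Jan-1856 to Dec-2016) flattened
      # to feature vectors; synthetic noise here, Kaplan SST v2 in the study
      X = rng.standard_normal((1932, 50))

      # five discrete states, loosely analogous to the five ENSO "flavors"
      states = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

      # homogeneous transition matrix by counting consecutive-month transitions
      counts = np.zeros((5, 5))
      for a, b in zip(states[:-1], states[1:]):
          counts[a, b] += 1
      transition = counts / counts.sum(axis=1, keepdims=True)
      print(np.round(transition, 2))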

  5. Development of quantitative security optimization approach for the picture archives and carrying system between a clinic and a rehabilitation center

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko

    2002-05-01

    The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures and to optimize security distribution to every portion of the medical practice. Quantitative expression must be introduced to our study, if possible, to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. Using fault tree analysis (FTA), system analysis showed that system elements subdivided into groups by details result in a much more accurate analysis. Such subdivided composition factors greatly depend on behavior of staff, interactive terminal devices, kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security and proposed security measures for each medical information system, and the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, a number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.
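
    A minimal sketch of propagating independent basic-event probabilities through OR and AND gates, the fault tree analysis step the abstract relies on; the tree structure and the basic-event probabilities are invented for illustration.

      from functools import reduce

      def or_gate(*probs):
          """P(at least one independent input fails) = 1 - prod(1 - p_i)."""
          return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

      def and_gate(*probs):
          """P(all independent inputs fail) = prod(p_i)."""
          return reduce(lambda acc, p: acc * p, probs, 1.0)

      # illustrative basic-event probabilities for a few composition factors
      p_staff_error   = 0.05    # staff mishandles a terminal session
      p_terminal_flaw = 0.02    # interactive terminal device weakness
      p_net_primary   = 0.01    # primary network route compromised
      p_net_backup    = 0.01    # backup network route compromised

      # the network factor requires BOTH routes to fail; the top event occurs
      # if any composition factor fails
      p_network = and_gate(p_net_primary, p_net_backup)
      p_top = or_gate(p_staff_error, p_terminal_flaw, p_network)
      print(f"top-event probability: {p_top:.4f}")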

  6. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.

  7. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
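
    The two building blocks of the model can be sketched directly with scipy: a zero-truncated Poisson for the number of thunderstorm events over the larger area (conditioned on at least one occurring) and a negative binomial for storm counts over the small area. The parameter values below are illustrative, not the fitted Cape Kennedy values.

      from scipy.stats import nbinom, poisson

      # zero-truncated Poisson: events over the larger area given >= 1 occurring
      mu = 2.3                                           # illustrative mean
      def truncated_poisson_pmf(k):
          return poisson.pmf(k, mu) / (1.0 - poisson.pmf(0, mu))

      # negative binomial: thunderstorm counts over the small area
      r, p = 2, 0.45                                     # illustrative parameters
      print("P(2 events over larger area | >= 1):", round(truncated_poisson_pmf(2), 3))
      print("P(0 storms over small area):        ", round(nbinom.pmf(0, r, p), 3))
      print("P(>= 3 storms over small area):     ", round(nbinom.sf(2, r, p), 3))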

  8. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only...of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of

  9. Risk analysis of chemical, biological, or radionuclear threats: implications for food security.

    PubMed

    Mohtadi, Hamid; Murshid, Antu Panini

    2009-09-01

    If the food sector is attacked, the likely agents will be chemical, biological, or radionuclear (CBRN). We compiled a database of international terrorist/criminal activity involving such agents. Based on these data, we calculate the likelihood of a catastrophic event using extreme value methods. At the present, the probability of an event leading to 5,000 casualties (fatalities and injuries) is between 0.1 and 0.3. However, pronounced, nonstationary patterns within our data suggest that the "reoccurrence period" for such attacks is decreasing every year. Similarly, disturbing trends are evident in a broader data set, which is nonspecific as to the methods or means of attack. While at the present the likelihood of CBRN events is quite low, given an attack, the probability that it involves CBRN agents increases with the number of casualties. This is consistent with evidence of "heavy tails" in the distribution of casualties arising from CBRN events.
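
    A generic peaks-over-threshold sketch of the kind of extreme value calculation described above, on synthetic heavy-tailed data: fit a generalized Pareto distribution to casualty excesses over a threshold and combine the tail probability with an assumed incident rate. The data, threshold, and rate are illustrative and are not the authors' dataset or fitted parameters.

      import numpy as np
      from scipy.stats import genpareto

      rng = np.random.default_rng(7)

      # synthetic heavy-tailed "casualties per incident" data (illustrative only)
      casualties = (rng.pareto(1.6, size=400) + 1.0) * 10.0

      threshold = 50.0
      excesses = casualties[casualties > threshold] - threshold
      events_per_year = 8.0                                    # assumed incident rate
      rate_over_threshold = events_per_year * np.mean(casualties > threshold)

      # peaks-over-threshold: generalized Pareto model for the excess distribution
      shape, loc, scale = genpareto.fit(excesses, floc=0.0)
      p_exceed_5000 = genpareto.sf(5000.0 - threshold, shape, loc=loc, scale=scale)

      # annual probability of at least one event above 5,000 casualties
      annual_prob = 1.0 - np.exp(-rate_over_threshold * p_exceed_5000)
      print(f"annual probability of a >5,000-casualty event: {annual_prob:.2e}")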

  10. Event-chain Monte Carlo algorithms for three- and many-particle interactions

    NASA Astrophysics Data System (ADS)

    Harland, J.; Michel, M.; Kampmann, T. A.; Kierfeld, J.

    2017-02-01

    We generalize the rejection-free event-chain Monte Carlo algorithm from many-particle systems with pairwise interactions to systems with arbitrary three- or many-particle interactions. We introduce generalized lifting probabilities between particles and obtain a general set of equations for lifting probabilities, the solution of which guarantees maximal global balance. We validate the resulting three-particle event-chain Monte Carlo algorithms on three different systems by comparison with conventional local Monte Carlo simulations: i) a test system of three particles with a three-particle interaction that depends on the enclosed triangle area; ii) a hard-needle system in two dimensions, where needle interactions constitute three-particle interactions of the needle end points; iii) a semiflexible polymer chain with a bending energy, which constitutes a three-particle interaction of neighboring chain beads. The examples demonstrate that the generalization to many-particle interactions broadens the applicability of event-chain algorithms considerably.

  11. Joint modeling of longitudinal data and discrete-time survival outcome.

    PubMed

    Qiu, Feiyou; Stein, Catherine M; Elston, Robert C

    2016-08-01

    A predictive joint shared parameter model is proposed for discrete time-to-event and longitudinal data. A discrete survival model with frailty and a generalized linear mixed model for the longitudinal data are joined to predict the probability of events. This joint model focuses on predicting discrete time-to-event outcome, taking advantage of repeated measurements. We show that the probability of an event in a time window can be more precisely predicted by incorporating the longitudinal measurements. The model was investigated by comparison with a two-step model and a discrete-time survival model. Results from both a study on the occurrence of tuberculosis and simulated data show that the joint model is superior to the other models in discrimination ability, especially as the latent variables related to both survival times and the longitudinal measurements depart from 0. © The Author(s) 2013.

  12. Mysterious eclipses in the light curve of KIC8462852: a possible explanation

    NASA Astrophysics Data System (ADS)

    Neslušan, L.; Budaj, J.

    2017-04-01

    Context. Apart from thousands of "regular" exoplanet candidates, Kepler satellite has discovered a small number of stars exhibiting peculiar eclipse-like events. They are most probably caused by disintegrating bodies transiting in front of the star. However, the nature of the bodies and obscuration events, such as those observed in KIC 8462852, remain mysterious. A swarm of comets or artificial alien mega-structures have been proposed as an explanation for the latter object. Aims: We explore the possibility that such eclipses are caused by the dust clouds associated with massive parent bodies orbiting the host star. Methods: We assumed a massive object and a simple model of the dust cloud surrounding the object. Then, we used the numerical integration to simulate the evolution of the cloud, its parent body, and resulting light-curves as they orbit and transit the star. Results: We found that it is possible to reproduce the basic features in the light-curve of KIC 8462852 with only four objects enshrouded in dust clouds. The fact that they are all on similar orbits and that such models require only a handful of free parameters provides additional support for this hypothesis. Conclusions: This model provides an alternative to the comet scenario. With such physical models at hand, at present, there is no need to invoke alien mega-structures for an explanation of these light-curves.

  13. Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.

    PubMed

    Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H

    2016-01-01

    Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%, respectively. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite the better performance of disease risk score methods than of logistic regression and propensity score models in small events-per-coefficient settings, bias and coverage still deviated from nominal levels.
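
    A compact sketch of one of the estimators compared above, stabilized inverse probability weighting, on synthetic confounded data with a binary treatment and outcome; this is a generic illustration, not the authors' simulation code, and the data-generating coefficients are invented.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 5000

      # synthetic confounded data: X affects both treatment A and outcome Y
      X = rng.standard_normal((n, 3))
      p_a = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] - 0.4 * X[:, 1])))
      A = rng.binomial(1, p_a)
      p_y = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * A + 0.6 * X[:, 0] + 0.3 * X[:, 2])))
      Y = rng.binomial(1, p_y)

      # propensity model and stabilized weights sw = P(A = a) / P(A = a | X)
      ps = LogisticRegression(max_iter=1000).fit(X, A).predict_proba(X)[:, 1]
      p_treated = A.mean()
      sw = np.where(A == 1, p_treated / ps, (1 - p_treated) / (1 - ps))

      # weighted outcome means give a confounding-adjusted risk difference
      rd = (np.average(Y[A == 1], weights=sw[A == 1])
            - np.average(Y[A == 0], weights=sw[A == 0]))
      print(f"IPW-adjusted risk difference: {rd:.3f}")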

  14. How Expert Pilots Think Cognitive Processes in Expert Decision Making

    DTIC Science & Technology

    1993-02-01

    Crew Resource Management (CRM); Advanced Qualification Program (AQP); Cognitive Task Analysis (CTA). This document is available to the public through the National Technical Information Service. Selecting realistic EDM scenarios with critical events and performing a cognitive task analysis of novice vs. expert decision making for these events is a basic requirement for

  15. Living on the edge: Flood risks to societies

    NASA Astrophysics Data System (ADS)

    Fekete, B. M.; Afshari Tork, S.; Vorosmarty, C. J.

    2015-12-01

    Characterizing hydrological extreme events and assessing their societal impacts is a perpetual challenge for hydrologists. Climate models predict that the anticipated temperature rise leads to an intensification of the hydrological cycle and to a corresponding increase in the reoccurrence and severity of extreme events. The societal impacts of hydrological extremes are interlinked with anthropogenic activities; therefore, the damages to manmade infrastructure are rarely a good measure of an extreme event's magnitude. Extreme events are rare by definition; therefore, detecting change in their distributions requires long-term observational records. Currently, only in-situ monitoring time series have the temporal extent necessary for assessing the reoccurrence probabilities of extreme events, but they frequently lack the spatial coverage. Satellite remote sensing is often advocated to provide the required spatial coverage, but satellites have to compromise between spatial and temporal resolutions. Furthermore, the retrieval algorithms are often as complex as comparable hydrological models, with a similar degree of uncertainty in their parameterization and in the validity of the final data products. In addition, anticipated changes over time in the reoccurrence frequencies of extreme events invalidate the stationarity assumption, which is the basis for using past observations to predict the probabilities of future extreme events. Probably the best approach to provide more robust predictions of extreme events is the integration of the available data (in-situ and remote sensing) in comprehensive data assimilation frameworks built on top of adequate hydrological modeling platforms. Our presentation will provide an overview of the current state of hydrological models to support data assimilation and the viable pathways to integrate in-situ and remote sensing observations for flood predictions. We will demonstrate the use of socio-economic data in combination with hydrological data assimilation to assess the resiliency to extreme flood events.

  16. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    PubMed

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, (3) splitting those observations into two observations: one where the event occurs and one where the event does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data, including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. Copyright © 2016 Elsevier Inc. All rights reserved.
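
    A minimal sketch of the IPCW idea on synthetic data: estimate the censoring survival function G(t) with a Kaplan-Meier estimator in which censoring is the "event", then weight each subject whose status at the horizon is known by 1/G at the time that status became known (subjects censored before the horizon get weight zero). The data-generating distributions are illustrative, and this is not the authors' pipeline.

      import numpy as np

      rng = np.random.default_rng(5)
      n, horizon = 500, 5.0

      # synthetic follow-up data: true event times and independent censoring times
      event_time = rng.exponential(8.0, n)
      censor_time = rng.exponential(12.0, n)
      time = np.minimum(event_time, censor_time)
      observed = event_time <= censor_time                  # True = event, False = censored

      def censor_survival(t_query):
          """Kaplan-Meier estimate of G(t) = P(censoring time > t)."""
          g = 1.0
          for t in np.unique(time[~observed]):
              if t > t_query:
                  break
              g *= 1.0 - np.sum((time == t) & (~observed)) / np.sum(time >= t)
          return g

      # status at the horizon is known if the event happened before the horizon
      # or follow-up extends past it
      known = (observed & (time <= horizon)) | (time > horizon)
      t_known = np.minimum(time, horizon)
      weights = np.where(known, 1.0 / np.array([censor_survival(t) for t in t_known]), 0.0)

      label = (observed & (time <= horizon)).astype(float)
      print("IPCW-weighted 5-unit event risk:",
            round(np.average(label[known], weights=weights[known]), 3))
      print("true risk:", round(1.0 - np.exp(-horizon / 8.0), 3))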

  17. Full moment tensors with uncertainties for the 2017 North Korea declared nuclear test and for a collocated, subsequent event

    NASA Astrophysics Data System (ADS)

    Alvizuri, C. R.; Tape, C.

    2017-12-01

    A seismic moment tensor is a 3×3 symmetric matrix that characterizes the far-field seismic radiation from a source, whether it be an earthquake, volcanic event, explosion. We estimate full moment tensors and their uncertainties for the North Korea declared nuclear test and for a collocated event that occurred eight minutes later. The nuclear test and the subsequent event occurred on September 3, 2017 at around 03:30 and 03:38 UTC time. We perform a grid search over the six-dimensional space of moment tensors, generating synthetic waveforms at each moment tensor grid point and then evaluating a misfit function between the observed and synthetic waveforms. The synthetic waveforms are computed using a 1-D structure model for the region; this approximation requires careful assessment of time shifts between data and synthetics, as well as careful choice of the bandpass for filtering. For each moment tensor we characterize its uncertainty in terms of waveform misfit, a probability function, and a confidence curve for the probability that the true moment tensor lies within the neighborhood of the optimal moment tensor. For each event we estimate its moment tensor using observed waveforms from all available seismic stations within a 2000-km radius. We use as much of the waveform as possible, including surface waves for all stations, and body waves above 1 Hz for some of the closest stations. Our preliminary magnitude estimates are Mw 5.1-5.3 for the first event and Mw 4.7 for the second event. Our results show a dominantly positive isotropic moment tensor for the first event, and a dominantly negative isotropic moment tensor for the subsequent event. As expected, the details of the probability density, waveform fit, and confidence curves are influenced by the structural model, the choice of filter frequencies, and the selection of stations.

  18. A probabilistic strategy for parametric catastrophe insurance

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin

    2017-04-01

    Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss events. Due to the nature of parametric programmes, it is still necessary to clearly define when a payout is due or not, and so a decision threshold probability above which a loss event is considered to occur must be set, effectively converting the issued probabilities into deterministic binary outcomes. Model skill and value are evaluated over the range of possible threshold probabilities, with the objective of defining the optimal one. The predictive ability of the model is assessed. In terms of value assessment, a decision model is proposed, allowing users to quantify monetarily their expected expenses when different combinations of model event triggering and actual event occurrence take place, directly tackling the problem of basis risk.
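
    A compact sketch of the two steps described above: a logistic regression that issues daily loss-event probabilities from rainfall-derived index variables, and a decision threshold chosen to minimize the expected basis-risk cost of false triggers versus missed losses. The synthetic data, index construction, and cost values are illustrative assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(11)
      n_days = 3000

      # synthetic daily loss-index variables derived from rainfall (illustrative)
      X = rng.gamma(2.0, 10.0, size=(n_days, 2))
      p_loss = 1.0 / (1.0 + np.exp(-(0.08 * X[:, 0] + 0.05 * X[:, 1] - 4.5)))
      loss_event = rng.binomial(1, p_loss)                 # 1 = loss event occurred

      model = LogisticRegression(max_iter=1000).fit(X, loss_event)
      prob = model.predict_proba(X)[:, 1]                  # daily trigger probabilities

      # basis-risk costs (illustrative): paying out with no loss vs. missing a loss
      cost_false_trigger, cost_missed_loss = 1.0, 5.0

      def expected_cost(threshold):
          trigger = prob >= threshold
          false_triggers = np.sum(trigger & (loss_event == 0))
          missed_losses = np.sum(~trigger & (loss_event == 1))
          return (cost_false_trigger * false_triggers
                  + cost_missed_loss * missed_losses) / n_days

      thresholds = np.linspace(0.05, 0.95, 19)
      best = min(thresholds, key=expected_cost)
      print(f"optimal trigger threshold: {best:.2f}, expected daily cost: {expected_cost(best):.3f}")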

  19. Media exposure related to the 2008 Sichuan Earthquake predicted probable PTSD among Chinese adolescents in Kunming, China: A longitudinal study.

    PubMed

    Yeung, Nelson C Y; Lau, Joseph T F; Yu, Nancy Xiaonan; Zhang, Jianping; Xu, Zhening; Choi, Kai Chow; Zhang, Qi; Mak, Winnie W S; Lui, Wacy W S

    2018-03-01

    This study examined the prevalence and the psychosocial predictors of probable PTSD among Chinese adolescents in Kunming (approximately 444 miles from the epicenter), China, who were indirectly exposed to the Sichuan Earthquake in 2008. Using a longitudinal study design, primary and secondary school students (N = 3577) in Kunming completed questionnaires at baseline (June 2008) and 6 months afterward (December 2008) in classroom settings. Participants' exposure to earthquake-related imagery and content, perceptions and emotional reactions related to the earthquake, and posttraumatic stress symptoms were measured. Univariate and forward stepwise multivariable logistic regression models were fit to identify significant predictors of probable PTSD at the 6-month follow-up. Prevalences of probable PTSD (with a Children's Revised Impact of Event Scale score ≥30) among the participants at baseline and 6-month follow-up were 16.9% and 11.1%, respectively. In the multivariable analysis, those who were frequently exposed to distressful imagery, had experienced at least two types of negative life events, perceived that teachers were distressed due to the earthquake, believed that the earthquake resulted from damage to the ecosystem, and felt apprehensive and emotionally disturbed due to the earthquake reported a higher risk of probable PTSD at 6-month follow-up (all ps < .05). Exposure to distressful media images, emotional responses, and disaster-related perceptions at baseline were found to be predictive of probable PTSD several months after indirect exposure to the event. Parents, teachers, and the mass media should be aware of the negative impacts of disaster-related media exposure on adolescents' psychological health. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. Victimization and PTSD-like states in an Icelandic youth probability sample.

    PubMed

    Bödvarsdóttir, Iris; Elklit, Ask

    2007-10-01

    Although adolescence in many cases is a period of rebellion and experimentation with new behaviors and roles, the exposure of adolescents to life-threatening and violent events has rarely been investigated in national probability studies using a broad range of events. In an Icelandic national representative sample of 206 9th-grade students (mean age = 14.5 years), the prevalence of 20 potentially traumatic events and negative life events was reported, along with the psychological impact of these events. Seventy-four percent of the girls and 79 percent of the boys were exposed to at least one event. The most common events were the death of a family member, threat of violence, and traffic accidents. The estimated lifetime prevalence of posttraumatic stress disorder-like states (PTSD; DSM-IV, APA, 1994) was 16 percent, whereas another 12 percent reached a sub-clinical level of PTSD-like states (missing the full diagnosis by one symptom). Following exposure, girls suffered from PTSD-like states almost twice as often as boys. Gender, mothers' education, and single-parenthood were associated with specific events. The odds ratios and 95% CI for PTSD-like states given a specific event are reported. Being exposed to multiple potentially traumatic events was associated with an increase in PTSD-like states. The findings indicate substantial mental health problems in adolescents that are associated with various types of potentially traumatic exposure.

  1. The External Validity of Prediction Models for the Diagnosis of Obstructive Coronary Artery Disease in Patients With Stable Chest Pain: Insights From the PROMISE Trial.

    PubMed

    Genders, Tessa S S; Coles, Adrian; Hoffmann, Udo; Patel, Manesh R; Mark, Daniel B; Lee, Kerry L; Steyerberg, Ewout W; Hunink, M G Myriam; Douglas, Pamela S

    2018-03-01

    This study sought to externally validate prediction models for the presence of obstructive coronary artery disease (CAD). A better assessment of the probability of CAD may improve the identification of patients who benefit from noninvasive testing. Stable chest pain patients from the PROMISE (Prospective Multicenter Imaging Study for Evaluation of Chest Pain) trial with computed tomography angiography (CTA) or invasive coronary angiography (ICA) were included. The authors assumed that patients with CTA showing 0% stenosis and a coronary artery calcium (CAC) score of 0 were free of obstructive CAD (≥50% stenosis) on ICA, and they multiply imputed missing ICA results based on clinical variables and CTA results. Predicted CAD probabilities were calculated using published coefficients for 3 models: basic model (age, sex, chest pain type), clinical model (basic model + diabetes, hypertension, dyslipidemia, and smoking), and clinical + CAC score model. The authors assessed discrimination and calibration, and compared published effects with observed predictor effects. In 3,468 patients (1,805 women; mean 60 years of age; 779 [23%] with obstructive CAD on CTA), the models demonstrated moderate-good discrimination, with C-statistics of 0.69 (95% confidence interval [CI]: 0.67 to 0.72), 0.72 (95% CI: 0.69 to 0.74), and 0.86 (95% CI: 0.85 to 0.88) for the basic, clinical, and clinical + CAC score models, respectively. Calibration was satisfactory although typical chest pain and diabetes were less predictive and CAC score was more predictive than was suggested by the models. Among the 31% of patients for whom the clinical model predicted a low (≤10%) probability of CAD, actual prevalence was 7%; among the 48% for whom the clinical + CAC score model predicted a low probability the observed prevalence was 2%. In 2 sensitivity analyses excluding imputed data, similar results were obtained using CTA as the outcome, whereas in those who underwent ICA the models significantly underestimated CAD probability. Existing clinical prediction models can identify patients with a low probability of obstructive CAD. Obstructive CAD on ICA was imputed for 61% of patients; hence, further validation is necessary. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
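
    As a rough illustration of how such prediction models are applied, the Python sketch below computes a predicted CAD probability from logistic-regression coefficients for a basic model (age, sex, chest pain type). The coefficient values and the simplified binary chest-pain indicator are placeholders, not the published or validated quantities from the study.

        # Hedged sketch: predicted probability from a logistic prediction model.
        import math

        def predicted_probability(age, male, typical_chest_pain,
                                  b0=-7.0, b_age=0.06, b_male=1.3, b_typical=1.9):
            # linear predictor on the log-odds scale, then inverse-logit;
            # all coefficient values here are hypothetical placeholders
            lp = b0 + b_age * age + b_male * male + b_typical * typical_chest_pain
            return 1.0 / (1.0 + math.exp(-lp))

        # Example: a 60-year-old man with typical chest pain (hypothetical inputs)
        print(f"Predicted probability of obstructive CAD: {predicted_probability(60, 1, 1):.2f}")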

  2. Incorporating Probability Models of Complex Test Structures to Perform Technology Independent FPGA Single Event Upset Analysis

    NASA Technical Reports Server (NTRS)

    Berg, M. D.; Kim, H. S.; Friendlich, M. A.; Perez, C. E.; Seidlick, C. M.; LaBel, K. A.

    2011-01-01

    We present SEU testing and analysis of the Microsemi ProASIC3 FPGA. SEU probability models are incorporated for device evaluation. A comparison to the RTAXS FPGA is included, illustrating the effectiveness of the overall testing methodology.

  3. The Dependence Structure of Conditional Probabilities in a Contingency Table

    ERIC Educational Resources Information Center

    Joarder, Anwar H.; Al-Sabah, Walid S.

    2002-01-01

    Conditional probability and statistical independence can be better explained with contingency tables. In this note some special cases of 2 x 2 contingency tables are considered. In turn an interesting insight into statistical dependence as well as independence of events is obtained.
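
    A minimal Python sketch of the idea, using made-up cell counts: conditional probabilities and a check for statistical independence computed directly from a 2 x 2 contingency table.

        # Hedged sketch: conditional probability and independence from a 2x2 table.
        import numpy as np

        table = np.array([[30, 20],    # rows: event A occurs / does not occur
                          [10, 40]])   # columns: event B occurs / does not occur
        n = table.sum()

        p_a = table[0, :].sum() / n            # P(A)
        p_b = table[:, 0].sum() / n            # P(B)
        p_a_and_b = table[0, 0] / n            # P(A and B)
        p_a_given_b = p_a_and_b / p_b          # P(A | B)

        print(f"P(A)={p_a:.2f}  P(A|B)={p_a_given_b:.2f}")
        # A and B are independent exactly when P(A and B) = P(A) * P(B)
        print("independent" if np.isclose(p_a_and_b, p_a * p_b) else "dependent")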

  4. Bayesian probabilities for Mw 9.0+ earthquakes in the Aleutian Islands from a regionally scaled global rate

    NASA Astrophysics Data System (ADS)

    Butler, Rhett; Frazer, L. Neil; Templeton, William J.

    2016-05-01

    We use the global rate of Mw ≥ 9.0 earthquakes, and standard Bayesian procedures, to estimate the probability of such mega events in the Aleutian Islands, where they pose a significant risk to Hawaii. We find that the probability of such an earthquake along the Aleutian island arc is 6.5% to 12% over the next 50 years (50% credibility interval) and that the annualized risk to Hawai'i is about $30 M. Our method (the regionally scaled global rate method, or RSGR) is to scale the global rate of Mw 9.0+ events in proportion to the fraction of global subduction (units of area per year) that takes place in the Aleutians. The RSGR method assumes that Mw 9.0+ events are a Poisson process with a rate that is both globally and regionally stationary on the time scale of centuries, and it follows the principle of Burbidge et al. (2008), who used the product of fault length and convergence rate, i.e., the area being subducted per annum, to scale the Poisson rate for the GSS to sections of the Indonesian subduction zone. Before applying RSGR to the Aleutians, we first apply it to five other regions of the global subduction system where its rate predictions can be compared with those from paleotsunami, paleoseismic, and geoarcheology data. To obtain regional rates from paleodata, we give a closed-form solution for the probability density function of the Poisson rate when event count and observation time are both uncertain.
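
    The Python sketch below illustrates the RSGR idea under stated assumptions: a gamma posterior for the global Poisson rate of Mw 9.0+ events is scaled by the Aleutian share of global subduction, and the 50-year exceedance probability is summarized with a 50% credibility interval. The global count, observation window, regional fraction, and prior are placeholders, not the values used in the study.

        # Hedged sketch: regionally scaled global rate (RSGR) with a gamma posterior.
        import numpy as np

        global_count = 5            # hypothetical number of Mw 9.0+ events observed globally
        obs_years = 116             # hypothetical observation window (years)
        aleutian_fraction = 0.04    # hypothetical fraction of global subduction area per year

        # Bayesian update for a Poisson rate: with a Gamma(a0, b0) prior (shape, rate),
        # the posterior for the global annual rate is Gamma(a0 + count, b0 + obs_years).
        a0, b0 = 0.5, 0.0           # a vague (Jeffreys-like) prior, an assumption
        rng = np.random.default_rng(1)
        global_rate = rng.gamma(a0 + global_count, 1.0 / (b0 + obs_years), size=100_000)

        regional_rate = aleutian_fraction * global_rate      # RSGR scaling step
        p50 = 1.0 - np.exp(-regional_rate * 50.0)            # P(at least one event in 50 yr)

        lo, hi = np.percentile(p50, [25, 75])                # 50% credibility interval
        print(f"50-year probability: {np.median(p50):.3f} (50% CI {lo:.3f}-{hi:.3f})")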

  5. Hawkes-diffusion process and the conditional probability of defaults in the Eurozone

    NASA Astrophysics Data System (ADS)

    Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin

    2016-05-01

    This study examines market information embedded in the European sovereign CDS (credit default swap) market by analyzing the sovereign CDSs of 13 Eurozone countries from January 1, 2008, to February 29, 2012, which includes the recent Eurozone debt crisis period. We model the conditional probability of defaults for the CDS prices based on the Hawkes-diffusion process and obtain the theoretical prices of CDS indexes. To estimate the model parameters, we calibrate the model prices to empirical prices obtained from individual sovereign CDS term structure data. The estimated parameters clearly explain both cross-sectional and time-series data. Our empirical results show that the probability of a huge loss event sharply increased during the Eurozone debt crisis, indicating a contagion effect. Even countries with strong and stable economies, such as Germany and France, suffered from the contagion effect. We also find that the probability of small events is sensitive to the state of the economy, spiking several times due to the global financial crisis and the Greek government debt crisis.
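
    A minimal Python sketch of the self-exciting mechanism underlying the Hawkes part of such models: each event raises the conditional intensity, so the probability of further events spikes after a shock, which is the contagion-like behaviour described above. The parameters are illustrative and are not calibrated to CDS data; the simulation uses Ogata thinning.

        # Hedged sketch: simulating a Hawkes process with an exponential kernel.
        import numpy as np

        mu, alpha, beta, horizon = 0.2, 0.6, 1.5, 50.0   # baseline, jump size, decay, years (assumed)
        rng = np.random.default_rng(2)

        events, t = [], 0.0
        while t < horizon:
            # current intensity given past events; it only decays until the next event,
            # so it is a valid upper bound for Ogata thinning
            lam = mu + alpha * np.sum(np.exp(-beta * (t - np.array(events)))) if events else mu
            t += rng.exponential(1.0 / lam)              # candidate next event time
            if t >= horizon:
                break
            lam_t = mu + alpha * np.sum(np.exp(-beta * (t - np.array(events)))) if events else mu
            if rng.random() <= lam_t / lam:              # accept with ratio of intensities
                events.append(t)

        print(f"{len(events)} events; clustering shows up as short gaps after each event")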

  6. Learning predictive statistics from temporal sequences: Dynamics and strategies

    PubMed Central

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E.; Kourtzi, Zoe

    2017-01-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics—that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments. PMID:28973111
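
    A minimal Python sketch, with an invented transition matrix, of two ingredients discussed above: sequences generated from context-based (first-order Markov) statistics, and the "maximizing" versus "matching" prediction strategies applied to them.

        # Hedged sketch: Markov sequence generation and two prediction strategies.
        import numpy as np

        rng = np.random.default_rng(3)
        symbols = np.arange(4)

        # Context-based statistics: the next symbol depends on the current one.
        P = np.array([[0.7, 0.1, 0.1, 0.1],
                      [0.1, 0.7, 0.1, 0.1],
                      [0.1, 0.1, 0.7, 0.1],
                      [0.1, 0.1, 0.1, 0.7]])

        seq = [0]
        for _ in range(5000):
            seq.append(rng.choice(symbols, p=P[seq[-1]]))
        seq = np.array(seq)

        # Maximizing: always predict the most probable next symbol given the context.
        # Matching: sample the prediction from the conditional distribution instead.
        context, target = seq[:-1], seq[1:]
        max_pred = P[context].argmax(axis=1)
        match_pred = np.array([rng.choice(symbols, p=P[c]) for c in context])
        print(f"maximizing accuracy: {(max_pred == target).mean():.2f}")
        print(f"matching accuracy:   {(match_pred == target).mean():.2f}")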

  7. Regional rainfall thresholds for landslide occurrence using a centenary database

    NASA Astrophysics Data System (ADS)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

    This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes the rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using the receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and potential regression perform well in ROC metrics. However, the intermediate thresholds based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold are much more informative as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.

  8. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.

  9. High-pressure xenon time projection Titanium chamber: a methodology for detecting background radiation in neutrinoless double-beta decay experiments

    NASA Astrophysics Data System (ADS)

    Bachri, A.; Elmhamdi, A.; Hawron, M.; Grant, P.; Zazoum, B.; Martin, C.

    2017-10-01

    The xenon time projection chamber (TPC) promises a novel detection method for neutrinoless double-beta decay (0νββ) experiments. The TPC is capable of discovering the rare 0νββ ionization signal of a distinct topological signature, with a decay energy Qββ = 2.458 MeV. However, more frequent internal (within the TPC) and external events are also capable of depositing energy in the range of the Qββ-value inside the chamber, thus mimicking 0νββ or interfering with its direct observation. In the following paper, we illustrate a methodology for background radiation evaluation, assuming a basic cylindrical design for a toy titanium TPC that is capable of containing 100 kg of xenon gas at 20 atm pressure; we estimate the background budget and analyze the most prominent problematic events via theoretical calculation. Gamma rays emitted from nuclei of ²¹⁴Bi and ²⁰⁸Tl present in the outer-shell titanium housing of the TPC are an example of such events, for which we calculate probabilities of occurrence. We also study the effect of alpha-neutron (α-n)-induced neutrons and calculate their rate. Alpha particles, which are created by the decay of naturally occurring uranium and thorium present in most materials, can react with the nuclei of low-Z elements, prompting the release of neutrons and leading to thermal neutron capture. Our calculations suggest that the typical polytetrafluoroethylene (PTFE) inner coating of the chamber would constitute the primary material for neutron production; specifically, we find that the fluorine component of Teflon is much more likely to undergo an (α-n) reaction. From known contamination, we calculate an alpha production rate of 5.5 × 10⁷ alphas/year for the highest-purity titanium vessel with a Teflon lining. Lastly, using measurements of neutron flux from alpha bombardment, we estimate the expected neutron flux from the materials of the proposed toy TPC and identify all gamma rays (prompt or delayed, of energies comparable to the Qββ-value) originating from thermal neutron capture on all stable elemental isotopes present in the TPC. We show that to limit the most probable reactions to a rate of one event per year or less, the neutron flux would have to be reduced to (3-6) × 10⁻¹⁰ cm⁻²·s⁻¹. The predictions of our crude theoretical calculation are in good agreement with a full simulation of the TPC radiation background by an existing experimental collaboration using xenon for a 0νββ experiment.

  10. Carryover effects associated with winter location affect fitness, social status, and population dynamics in a long-distance migrant

    USGS Publications Warehouse

    Sedinger, James S.; Schamber, Jason L.; Ward, David H.; Nicolai, Christopher A.; Conant, Bruce

    2011-01-01

    We used observations of individually marked female black brant geese (Branta bernicla nigricans; brant) at three wintering lagoons on the Pacific coast of Baja California—Laguna San Ignacio (LSI), Laguna Ojo de Liebre (LOL), and Bahía San Quintín (BSQ)—and the Tutakoke River breeding colony in Alaska to assess hypotheses about carryover effects on breeding and distribution of individuals among wintering areas. We estimated transition probabilities from wintering locations to breeding and nonbreeding by using multistratum robust-design capture-mark-recapture models. We also examined the effect of breeding on migration to wintering areas to assess the hypothesis that individuals in family groups occupied higher-quality wintering locations. We used 4,538 unique female brant in our analysis of the relationship between winter location and breeding probability. All competitive models of breeding probability contained additive effects of wintering location and the 1997–1998 El Niño–Southern Oscillation (ENSO) event on probability of breeding. Probability of breeding in non-ENSO years was 0.98 ± 0.02, 0.68 ± 0.04, and 0.91 ± 0.11 for females wintering at BSQ, LOL, and LSI, respectively. After the 1997–1998 ENSO event, breeding probability was between 2% (BSQ) and 38% (LOL) lower than in other years. Individuals that bred had the highest probability of migrating the next fall to the wintering area producing the highest probability of breeding.

  11. Velocity Structure and Plasma Properties in Halo CMEs

    NASA Technical Reports Server (NTRS)

    Wagner, William (Technical Monitor); Raymond, John C.

    2003-01-01

    We have identified a set of 23 Halo CMEs through July 2002 and 21 Partial Halo CMEs from the LASCO Halo CME Mail Archive for which Ultraviolet Coronagraph Spectrometer (UVCS) spectra exist. For each event we have collected basic information such as the event speed, whether or not UVCS caught the bright front, lines detected, Doppler shift and associated flare class. We have also obtained excellent observations of some of the spectacular events in November 2003, and we have made theoretical calculations pertaining to CME expansion at the heights observed by UVCS. We first analyzed the halo CMEs on 21 April and 24 August 2002 and the partial halo on 23 July 2002, because the X-class flares associated with these CMEs were extensively observed by RHESSI and other instruments as part of the MAX MILLENIUM campaign. These very fast CMEs showed extremely violent disruption of the pre-CME streamers, little or no cool prominence material, and the unusual (for UVCS heights) hot emission line [Fe XVIII]. Results, including a discussion of the current sheet interpretation for the [Fe XVIII] emission, are published in Raymond et al. and presented at the Fall 2002 AGU meeting and the solar physics summer school in L'Aquila, Italy. We are currently preparing two papers on the Dec. 28, 2000 partial halo event. This event was chosen to take advantage of the SEP event measured by WIND and ACE, and because a Type II radio burst coincides with the time that broad, blue-shifted O VI emission appeared in the UVCS spectra. One paper deals with a new density and velocity diagnostic for very fast CMEs; pumping of O VI lambda 1032 by Ly beta and pumping of O VI lambda 1038 by O VI lambda 1032. The other discusses physics of the shock wave and association with the SEP event. In the coming year we plan to expand the list of Halo and Partial Halo events observed by UVCS through the end of 2003. We will look at those events as a class to search for correlation between UV spectral characteristics and other CME and flare parameters. We will also choose several events for more detailed study, probably including the November 2003 events. We expect to support extended visits to CfA by S. Mancuso and A. Ciaravella. In the past year the grant covered some salary support for members of the SAO UVCS team and a 1 month visit to CfA by Angela Ciaravella, along with trips to meetings by J. Lin and J. Raymond and page charges for two papers.

  12. Long-term Changes in Extreme Air Pollution Meteorology and the Implications for Air Quality.

    PubMed

    Hou, Pei; Wu, Shiliang

    2016-03-31

    Extreme air pollution meteorological events, such as heat waves, temperature inversions and atmospheric stagnation episodes, can significantly affect air quality. Based on observational data, we have analyzed the long-term evolution of extreme air pollution meteorology on the global scale and its potential impacts on air quality, especially the high pollution episodes. We have identified significant increasing trends for the occurrences of extreme air pollution meteorological events in the past six decades, especially over the continental regions. Statistical analysis combining air quality data and meteorological data further indicates strong sensitivities of air quality (including both average air pollutant concentrations and high pollution episodes) to extreme meteorological events. For example, we find that in the United States the probability of severe ozone pollution during heat waves could be up to seven times the average probability during summertime, while temperature inversions in wintertime could enhance the probability of severe particulate matter pollution by more than a factor of two. We have also identified significant seasonal and spatial variations in the sensitivity of air quality to extreme air pollution meteorology.

  13. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.

  14. A Continuous Method for Gene Flow

    PubMed Central

    Palczewski, Michal; Beerli, Peter

    2013-01-01

    Most modern population genetics inference methods are based on the coalescence framework. Methods that allow estimating parameters of structured populations commonly insert migration events into the genealogies. For these methods the calculation of the coalescence probability density of a genealogy requires a product over all time periods between events. Data sets that contain populations with high rates of gene flow among them require an enormous number of calculations. A new method, transition probability-structured coalescence (TPSC), replaces the discrete migration events with probability statements. Because the speed of calculation is independent of the amount of gene flow, this method allows calculating the coalescence densities efficiently. The current implementation of TPSC uses an approximation simplifying the interaction among lineages. Simulations and coverage comparisons of TPSC vs. MIGRATE show that TPSC allows estimation of high migration rates more precisely, but because of the approximation the estimation of low migration rates is biased. The implementation of TPSC into programs that calculate quantities on phylogenetic tree structures is straightforward, so the TPSC approach will facilitate more general inferences in many computer programs. PMID:23666937

  15. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is a common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return period and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.

  16. Modulation/demodulation techniques for satellite communications. Part 1: Background

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1981-01-01

    Basic characteristics of digital data transmission systems described include the physical communication links, the notion of bandwidth, FCC regulations, and performance measurements such as bit rates, bit error probabilities, throughputs, and delays. The error probability performance and spectral characteristics of various modulation/demodulation techniques commonly used or proposed for use in radio and satellite communication links are summarized. Forward error correction with block or convolutional codes is also discussed along with the important coding parameter, channel cutoff rate.

  17. A diffusion climatology for Cape Canaveral, Florida

    NASA Technical Reports Server (NTRS)

    Siler, R. K.

    1980-01-01

    The problem of toxic effluent released by a space shuttle launch on local plant and animal life is discussed. Based on several successive years of data, nine basic weather patterns were identified, and the probabilities of pattern occurrence, of onshore/alongshore cloud transport, of precipitation accompanying the latter, and of ground-level concentrations of hydrogen chloride were determined. Diurnal variations for the patterns were also investigated. Sketches showing probable movement of launch cloud exhaust and isobaric maps are presented.

  18. Sedimentation in Hot Creek in vicinity of Hot Creek Fish Hatchery, Mono County, California

    USGS Publications Warehouse

    Burkham, D.E.

    1978-01-01

    An accumulation of fine-grained sediment in Hot Creek downstream from Hot Creek Fish Hatchery, Mono County, Calif., created concern that the site may be deteriorating as a habitat for trout. The accumulation is a phenomenon that probably occurs naturally in the problem reach. Fluctuation in the weather probably is the basic cause of the deposition of fine-grained sediment that has occurred since about 1970. Man's activities and the Hot Creek Fish Hatchery may have contributed to the problem; the significance of these factors, however, probably was magnified because of drought conditions in 1975-77. (Woodard-USGS)

  19. Adults Learning Mathematics: What We Should Know about Betting and Bookkeeping?

    ERIC Educational Resources Information Center

    Maasz, Juergen; Siller, Hans-Stefan

    2010-01-01

    A lot of people risk money with bets on sport events or other events. Bookmakers that offer such bets earn a lot of money. We are making a proposal (more precisely: a concept for part of a basic mathematics course) for learning the mathematics behind the screen (internet bets are very popular). Learners should organize a "sports event"…

  20. The probability of being identified as an outlier with commonly used funnel plot control limits for the standardised mortality ratio.

    PubMed

    Seaton, Sarah E; Manktelow, Bradley N

    2012-07-16

    Emphasis is increasingly being placed on the monitoring of clinical outcomes for health care providers. Funnel plots have become an increasingly popular graphical methodology used to identify potential outliers. It is assumed that a provider only displaying expected random variation (i.e. 'in-control') will fall outside a control limit with a known probability. In reality, the discrete count nature of these data, and the differing methods, can lead to true probabilities quite different from the nominal value. This paper investigates the true probability of an 'in control' provider falling outside control limits for the Standardised Mortality Ratio (SMR). The true probabilities of an 'in control' provider falling outside control limits for the SMR were calculated and compared for three commonly used limits: Wald confidence interval; 'exact' confidence interval; probability-based prediction interval. The probability of falling above the upper limit, or below the lower limit, often varied greatly from the nominal value. This was particularly apparent when there were a small number of expected events: for expected events ≤ 50 the median probability of an 'in-control' provider falling above the upper 95% limit was 0.0301 (Wald), 0.0121 ('exact'), 0.0201 (prediction). It is important to understand the properties and probability of being identified as an outlier by each of these different methods to aid the correct identification of poorly performing health care providers. The limits obtained using probability-based prediction limits have the most intuitive interpretation and their properties can be defined a priori. Funnel plot control limits for the SMR should not be based on confidence intervals.
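
    A minimal Python sketch of the paper's central calculation under simple assumptions: for an 'in control' provider the observed deaths are Poisson with mean equal to the expected deaths, so the true probability of falling above a nominal 95% upper funnel-plot limit can be computed exactly. Only a Wald-style limit is shown; the 'exact' and prediction limits would be handled analogously.

        # Hedged sketch: true exceedance probability of an 'in control' provider.
        import numpy as np
        from scipy.stats import poisson

        z = 1.959964                                          # nominal two-sided 95% limit
        for expected in (5, 10, 25, 50, 100):
            upper_smr = 1.0 + z * np.sqrt(1.0 / expected)     # Wald-type upper control limit on the SMR
            threshold = upper_smr * expected                  # corresponding count of observed deaths
            # discrete counts: probability of exceeding the limit uses the Poisson survival function
            p_above = poisson.sf(np.floor(threshold), expected)
            print(f"E={expected:3d}  upper SMR limit={upper_smr:.2f}  true P(above)={p_above:.4f}")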

  1. Ischemic stroke and intracranial hemorrhage in patients with recurrent glioblastoma multiforme, treated with bevacizumab.

    PubMed

    Auer, Timo A; Renovanz, Mirjam; Marini, Federico; Brockmann, Marc A; Tanyildizi, Yasemin

    2017-07-01

    Bevacizumab (BVZ), a monoclonal antibody directed against vascular endothelial growth factor (VEGF), has been suspected to increase the incidence of ischemic stroke (IS) and intracranial hemorrhage (ICH) in GBM patients. Intracranial vascular events, such as IS and ICH, were retrospectively analyzed in 364 MRI scans of 82 patients with recurrent GBM (1st/2nd/3rd relapse). Out of these 82 patients, 40 were treated with BVZ (178 scans) in addition to basic treatment, whereas 42 patients matched for age and gender received basic treatment (186 scans). The distribution of typical vascular risk factors between both groups was analyzed retrospectively. In seven out of 82 patients (8%) vascular events were detected on MRI. Four vascular events were recorded in the BVZ group (3 IS and 1 ICH), and 3 vascular events were found in the control group (1 IS and 2 ICH; p > 0.05 between both groups). Likewise, vascular risk factors (arterial hypertension, diabetes mellitus, obesity, former vascular event, hyperlipidemia, tobacco consumption and/or hypercholesterolemia) did not differ significantly between the two groups. BVZ treatment does not seem to be associated with an increased risk for vascular events in patients with GBM in recurrence.

  2. A mathematical model for the occurrence of historical events

    NASA Astrophysics Data System (ADS)

    Ohnishi, Teruaki

    2017-12-01

    A mathematical model was proposed for the frequency distribution of the historical inter-event time τ. A basic ingredient was constructed by assuming that the significance of a newly occurring historical event depends on the magnitude of the preceding event, that this significance decays through oblivion over successive events, and that events occur according to an independent Poisson process. The frequency distribution of τ was derived by integrating the basic ingredient over all social fields and all stakeholders. The resulting distribution takes the form of an exponential, a power law, or an exponential with a tail, depending on the values of the constants appearing in the ingredient. The validity of this model was studied by applying it to the two cases of Modern China and the Northern Ireland Troubles, where the τ-distribution varies depending on the different countries interacting with China and on the different stages of the history of the Troubles, respectively. This indicates that history consists of many components with different types of τ-distribution, a situation similar to that of other general human activities.

  3. Teaching Elementary Probability and Statistics: Some Applications in Epidemiology.

    ERIC Educational Resources Information Center

    Sahai, Hardeo; Reesal, Michael R.

    1992-01-01

    Illustrates some applications of elementary probability and statistics to epidemiology, the branch of medical science that attempts to discover associations between events, patterns, and the cause of disease in human populations. Uses real-life examples involving cancer's link to smoking and the AIDS virus. (MDH)

  4. Coincidence probability as a measure of the average phase-space density at freeze-out

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyz, W.; Zalewski, K.

    2006-02-01

    It is pointed out that the average semi-inclusive particle phase-space density at freeze-out can be determined from the coincidence probability of the events observed in multiparticle production. The method of measurement is described and its accuracy examined.

  5. Local description of a polyenic radical cation

    NASA Astrophysics Data System (ADS)

    Karafiloglou, P.; Kapsomenos, G.

    1995-06-01

    The various local electronic events occurring in a radical cation of a linear polyene with an even number of centers are investigated by means of the calculation of the expectation values of second quantized density operators, in the framework of the general poly-electron population analysis. Two series of calculations in two limit geometries (a strongly alternant one and a polaron-like one) are performed, using as analysers both natural AOs in ab initio correlated wave functions and model orthogonal AOs in PPP + full CI ones. The probabilities of finding simultaneously the positive charge (+) and the radical center (·) follow, in accord with basic chemical intuition, an oscillating (even-odd) law, even at distant AO positions. The probability of a transmission of the (+) charge through the π-bonds (when the (·) is located at one extremity of the polyene) is greater than that of the transmission of the (·). Comparing the radical cation with the parent polyene, it is shown that oxidation creates an important trend of single-double bond inversion even in strongly alternant geometry; this effect is more pronounced in the bonds of the middle. The examination of various CDW structures shows that some of them can have small or negligible contributions; this counterintuitive and cooperative effect is rationalized by means of Moffitt's theorem. All the above effects are not the consequence of the polaron-like geometry, but are controlled by the topology of n centers linearly disposed and involving (n-1) electrons.

  6. Bilayer lipid composition modulates the activity of dermaseptins, polycationic antimicrobial peptides.

    PubMed

    Duclohier, Hervé

    2006-05-01

    The primary targets of defense peptides are plasma membranes, and the induced irreversible depolarization is sufficient to exert antimicrobial activity, although secondary modes of action might be at work. Channels or pores underlying membrane permeabilization are usually quite large, with single-channel conductances two orders of magnitude higher than those exhibited by physiological channels involved, e.g., in excitability. Accordingly, the ion specificity and selectivity are quite low. Whereas, e.g., peptaibols favor cation transport, polycationic or basic peptides tend to form anion-specific pores. With dermaseptin B2, a 33-residue-long and mostly alpha-helical peptide isolated from the skin of the South American frog Phyllomedusa bicolor, we found that the ion specificity of the pores it induces in bilayers is modulated by phospholipid-charged headgroups. This suggests mixed lipid-peptide pore lining instead of the more classical barrel-stave model. Macroscopic conductance is nearly voltage independent, and concentration dependence suggests that the pores are mainly formed by dermaseptin tetramers. The two most probable single-channel events are well resolved at 200 and 500 pS (in 150 mM NaCl), with occasional other equally spaced higher or lower levels. In contrast to previous molecular dynamics predictions, this study demonstrates that dermaseptins are able to form pores, although a related analog (B6) failed to induce any significant conductance. Finally, the model of the pore we present accounts for phospholipid headgroups intercalated between peptide helices lining the pore and for one of the most probable single-channel conductances.

  7. A framework for analyzing contagion in assortative banking networks

    PubMed Central

    Hurd, Thomas R.; Gleeson, James P.; Melnik, Sergey

    2017-01-01

    We introduce a probabilistic framework that represents stylized banking networks with the aim of predicting the size of contagion events. Most previous work on random financial networks assumes independent connections between banks, whereas our framework explicitly allows for (dis)assortative edge probabilities (i.e., a tendency for small banks to link to large banks). We analyze default cascades triggered by shocking the network and find that the cascade can be understood as an explicit iterated mapping on a set of edge probabilities that converges to a fixed point. We derive a cascade condition, analogous to the basic reproduction number R0 in epidemic modelling, that characterizes whether or not a single initially defaulted bank can trigger a cascade that extends to a finite fraction of the infinite network. This cascade condition is an easily computed measure of the systemic risk inherent in a given banking network topology. We use percolation theory for random networks to derive a formula for the frequency of global cascades. These analytical results are shown to provide limited quantitative agreement with Monte Carlo simulation studies of finite-sized networks. We show that edge-assortativity, the propensity of nodes to connect to similar nodes, can have a strong effect on the level of systemic risk as measured by the cascade condition. However, the effect of assortativity on systemic risk is subtle, and we propose a simple graph theoretic quantity, which we call the graph-assortativity coefficient, that can be used to assess systemic risk. PMID:28231324
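
    The Python sketch below illustrates the iterated-map idea in a much simplified setting, not the paper's assortative framework: a random regular interbank network in which a bank defaults once at least m of its k counterparties have defaulted, with the edge-default probability iterated to a fixed point. The degree, threshold, and seed fraction are arbitrary choices for illustration.

        # Hedged sketch: fixed-point cascade calculation on a simple random network.
        from scipy.stats import binom

        def cascade_fixed_point(k=6, m=2, seed_fraction=0.01, tol=1e-10, max_iter=10_000):
            """Iterate u -> probability that a random edge points to a defaulted bank."""
            u = seed_fraction
            for _ in range(max_iter):
                # a non-seed bank reached along an edge defaults if at least m of its
                # other k-1 counterparties have defaulted
                u_new = seed_fraction + (1 - seed_fraction) * binom.sf(m - 1, k - 1, u)
                if abs(u_new - u) < tol:
                    break
                u = u_new
            # expected defaulted fraction of banks at the fixed point
            defaulted = seed_fraction + (1 - seed_fraction) * binom.sf(m - 1, k, u)
            return u, defaulted

        u_star, frac = cascade_fixed_point()
        print(f"fixed point u*={u_star:.4f}, expected defaulted fraction={frac:.4f}")
        # lowering m to 1 (a single defaulted counterparty suffices) pushes the map
        # past the cascade condition and the defaulted fraction becomes macroscopic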

  8. A framework for analyzing contagion in assortative banking networks.

    PubMed

    Hurd, Thomas R; Gleeson, James P; Melnik, Sergey

    2017-01-01

    We introduce a probabilistic framework that represents stylized banking networks with the aim of predicting the size of contagion events. Most previous work on random financial networks assumes independent connections between banks, whereas our framework explicitly allows for (dis)assortative edge probabilities (i.e., a tendency for small banks to link to large banks). We analyze default cascades triggered by shocking the network and find that the cascade can be understood as an explicit iterated mapping on a set of edge probabilities that converges to a fixed point. We derive a cascade condition, analogous to the basic reproduction number R0 in epidemic modelling, that characterizes whether or not a single initially defaulted bank can trigger a cascade that extends to a finite fraction of the infinite network. This cascade condition is an easily computed measure of the systemic risk inherent in a given banking network topology. We use percolation theory for random networks to derive a formula for the frequency of global cascades. These analytical results are shown to provide limited quantitative agreement with Monte Carlo simulation studies of finite-sized networks. We show that edge-assortativity, the propensity of nodes to connect to similar nodes, can have a strong effect on the level of systemic risk as measured by the cascade condition. However, the effect of assortativity on systemic risk is subtle, and we propose a simple graph theoretic quantity, which we call the graph-assortativity coefficient, that can be used to assess systemic risk.

  9. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner-shell and hard-sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.

  10. The Impact of the Geometrical Structure of the DNA on Parameters of the Track-Event Theory for Radiation Induced Cell Kill.

    PubMed

    Schneider, Uwe; Vasi, Fabiano; Besserer, Jürgen

    2016-01-01

    When fractionation schemes for hypofractionation and stereotactic body radiotherapy are considered, a reliable cell survival model at high dose is needed for calculating doses of similar biological effectiveness. An alternative to the LQ-model is the track-event theory, which is based on the probabilities for one-track and two-track events. A one-track event (OTE) is always represented by at least two simultaneous double strand breaks. A two-track event (TTE) results in one double strand break. Therefore at least two two-track events on the same or different chromosomes are necessary to produce an event which leads to cell sterilization. It is obvious that the probabilities of OTEs and TTEs must somehow depend on the geometrical structure of the chromatin. In terms of the track-event theory the ratio ε of the probabilities of OTEs and TTEs includes the geometrical dependence and is obtained in this work by simple Monte Carlo simulations. For this work it was assumed that the anchors of loop-forming chromatin are most sensitive to radiation-induced cell death. Therefore two adjacent tetranucleosomes representing the loop anchors were digitized. The probability ratio ε of OTEs and TTEs was factorized into a radiation quality dependent part and a geometrical part: ε = εion ∙ εgeo. εgeo was obtained for two situations, by applying Monte Carlo simulation for DNA on the tetranucleosomes themselves and for linker DNA. Low energy electrons were represented by randomly distributed ionizations and high energy electrons by ionizations which were simulated on rays. εion was determined for electrons by using results from nanodosimetric measurements. The calculated ε was compared to the ε obtained from fits of the track-event model to 42 sets of experimental human cell survival data. When the two tetranucleosomes are in direct contact and the hits are randomly distributed, εgeo and ε are 0.12 and 0.85, respectively. When the hits are simulated on rays, εgeo and ε are 0.10 and 0.71. For the linker DNA, εgeo and ε for randomly distributed hits are 0.010 and 0.073, and for hits on rays 0.0058 and 0.041, respectively. The calculated ε fits the experimentally obtained ε = 0.64±0.32 best for hits on the tetranucleosomes when they are close to each other, both for high and low energy electrons. The parameter εgeo of the track-event model was obtained by pure geometrical considerations of the chromatin structure and is 0.095 ± 0.022. It can be used as a fixed parameter in the track-event theory.

  11. Are Earthquake Clusters/Supercycles Real or Random?

    NASA Astrophysics Data System (ADS)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2016-12-01

    Long records of earthquakes at plate boundaries such as the San Andreas or Cascadia often show that large earthquakes occur in temporal clusters, also termed supercycles, separated by less active intervals. These are intriguing because the boundary is presumably being loaded by steady plate motion. If so, earthquakes resulting from seismic cycles - in which their probability is small shortly after the past one, and then increases with time - should occur quasi-periodically rather than be more frequent in some intervals than others. We are exploring this issue with two approaches. One is to assess whether the clusters result purely by chance from a time-independent process that has no "memory." Thus a future earthquake is equally likely immediately after the past one and much later, so earthquakes can cluster in time. We analyze the agreement between such a model and inter-event times for Parkfield, Pallet Creek, and other records. A useful tool is transformation by the inverse cumulative distribution function, so the inter-event times have a uniform distribution when the memorylessness property holds. The second is via a time-variable model in which earthquake probability increases with time between earthquakes and decreases after an earthquake. The probability of an event increases with time until one happens, after which it decreases, but not to zero. Hence after a long period of quiescence, the probability of an earthquake can remain higher than the long-term average for several cycles. Thus the probability of another earthquake is path dependent, i.e. depends on the prior earthquake history over multiple cycles. Time histories resulting from simulations give clusters with properties similar to those observed. The sequences of earthquakes result from both the model parameters and chance, so two runs with the same parameters look different. The model parameters control the average time between events and the variation of the actual times around this average, so models can be strongly or weakly time-dependent.
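
    A minimal Python sketch of the transformation mentioned above: under a memoryless (time-independent Poisson) model the inter-event times are exponential, so applying the fitted exponential CDF should yield approximately uniform values. The inter-event times below are synthetic stand-ins, not the Parkfield or Pallet Creek records, and the test is only approximate because the rate is estimated from the same data.

        # Hedged sketch: inverse-CDF (probability integral) transform for memorylessness.
        import numpy as np
        from scipy.stats import kstest

        rng = np.random.default_rng(4)
        inter_event_times = rng.exponential(scale=150.0, size=12)   # hypothetical recurrence times (yr)

        rate = 1.0 / inter_event_times.mean()                       # MLE of the Poisson rate
        u = 1.0 - np.exp(-rate * inter_event_times)                 # exponential CDF applied to each gap

        # If the memoryless model holds, u should be indistinguishable from Uniform(0,1).
        stat, p_value = kstest(u, "uniform")
        print(f"KS statistic={stat:.3f}, p-value={p_value:.3f}")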

  12. ERMiT: Estimating Post-Fire Erosion in Probabilistic Terms

    NASA Astrophysics Data System (ADS)

    Pierson, F. B.; Robichaud, P. R.; Elliot, W. J.; Hall, D. E.; Moffet, C. A.

    2006-12-01

    Mitigating the impact of post-wildfire runoff and erosion on life, property, and natural resources has cost the United States government tens of millions of dollars over the past decade. The decision of where, when, and how to apply the most effective mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) is a web-based application that estimates erosion in probabilistic terms on burned and recovering forest, range, and chaparral lands. Unlike most erosion prediction models, ERMiT does not provide 'average annual erosion rates'; rather, it provides a distribution of erosion rates with the likelihood of their occurrence. ERMiT combines rain event variability with spatial and temporal variabilities of hillslope burn severity, soil properties, and ground cover to estimate Water Erosion Prediction Project (WEPP) model input parameter values. Based on 20 to 40 individual WEPP runs, ERMiT produces a distribution of rain event erosion rates with a probability of occurrence for each of five post-fire years. Over the 5 years of modeled recovery, the occurrence probability of the less erodible soil parameters is increased and the occurrence probability of the more erodible soil parameters is decreased. In addition, the occurrence probabilities and the four spatial arrangements of burn severity (arrangements of overland flow elements (OFEs)) are shifted toward lower burn severity with each year of recovery. These yearly adjustments are based on field measurements made through post-fire recovery periods. ERMiT also provides rain event erosion rate distributions for hillslopes that have been treated with seeding, straw mulch, straw wattles and contour-felled log erosion barriers. Such output can help managers make erosion mitigation treatment decisions based on the probability of high sediment yields occurring, the value of resources at risk for damage, cost, and other management considerations.
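
    As a rough illustration of the kind of output described above, the Python sketch below turns a set of event erosion rates with occurrence probabilities into an exceedance-probability curve. The rates and probabilities are invented for illustration; they are not ERMiT or WEPP output.

        # Hedged sketch: exceedance probabilities from a discrete erosion-rate distribution.
        import numpy as np

        erosion_rates = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 10.0])         # t/ha per rain event (hypothetical)
        occurrence_prob = np.array([0.40, 0.25, 0.15, 0.10, 0.07, 0.03])  # sums to 1

        order = np.argsort(erosion_rates)[::-1]              # sort from largest to smallest rate
        exceedance = np.cumsum(occurrence_prob[order])       # P(event erosion >= rate)

        for rate, p in zip(erosion_rates[order], exceedance):
            print(f"P(event erosion >= {rate:4.1f} t/ha) = {p:.2f}")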

  13. Evidence for a seismic activity mainly constituted of hybrid events at Cayambe volcano, Ecuador. Interpretation in an iced-domes volcano context

    NASA Astrophysics Data System (ADS)

    Guillier, Bertrand; Chatelain, Jean-Luc

    2006-06-01

    The high activity level of Hybrid Events (HE) detected beneath the Cayambe volcano since 1989 has been more thoroughly investigated with data from a temporary array. The unusual HE spectral content allows separating a high-frequency signal riding on a low-frequency one, with a probable single source. HEs are interpreted as high frequency VT events, produced by the interaction between magmatic heat and an underground water system fed by thaw water from the summital glacier, which trigger simultaneous low-frequency fluid resonance in the highly fractured adjacent medium. Pure VTs are interpreted as 'aborted' HEs occurring probably in the oldest and coldest part of the volcano complex. To cite this article: B. Guillier, J.-L. Chatelain, C. R. Geoscience 338 (2006).

  14. Estimation of full moment tensors, including uncertainties, for earthquakes, volcanic events, and nuclear explosions

    NASA Astrophysics Data System (ADS)

    Alvizuri, Celso; Silwal, Vipul; Krischer, Lion; Tape, Carl

    2017-04-01

    A seismic moment tensor is a 3 × 3 symmetric matrix that provides a compact representation of seismic events within Earth's crust. We develop an algorithm to estimate moment tensors and their uncertainties from observed seismic data. For a given event, the algorithm performs a grid search over the six-dimensional space of moment tensors by generating synthetic waveforms at each grid point and then evaluating a misfit function between the observed and synthetic waveforms. 'The' moment tensor M for the event is then the moment tensor with minimum misfit. To describe the uncertainty associated with M, we first convert the misfit function to a probability function. The uncertainty, or rather the confidence, is then given by the 'confidence curve' P(V), where P(V) is the probability that the true moment tensor for the event lies within the neighborhood of M that has fractional volume V. The area under the confidence curve provides a single, abbreviated 'confidence parameter' for M. We apply the method to data from events in different regions and tectonic settings: small (Mw < 2.5) events at Uturuncu volcano in Bolivia, moderate (Mw > 4) earthquakes in the southern Alaska subduction zone, and natural and man-made events at the Nevada Test Site. Moment tensor uncertainties allow us to better discriminate among moment tensor source types and to assign physical processes to the events.
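
    A minimal Python sketch of the uncertainty summary described above, under stated assumptions: misfit values on a grid of candidate moment tensors are mapped to probabilities, and the confidence curve P(V) is built by accumulating probability over the most probable fraction V of the grid (the neighborhood of the optimum). The misfit values and the misfit-to-probability scaling below are placeholders, not waveform misfits.

        # Hedged sketch: from grid misfits to a confidence curve P(V) and its area.
        import numpy as np

        rng = np.random.default_rng(5)
        misfit = rng.gamma(shape=2.0, scale=1.0, size=10_000)   # hypothetical misfit per grid point
        sigma = 0.5                                             # assumed misfit-to-likelihood scaling

        prob = np.exp(-misfit / sigma)
        prob /= prob.sum()                                      # probability mass per grid point

        # Order grid points from most to least probable; P(V) is the cumulative probability
        # captured by the most probable fraction V of the grid.
        order = np.argsort(prob)[::-1]
        P_of_V = np.cumsum(prob[order])
        V = np.arange(1, prob.size + 1) / prob.size

        # mean over the evenly spaced V grid approximates the area under the confidence curve
        confidence_parameter = P_of_V.mean()
        print(f"P(V=0.1)={P_of_V[int(0.1 * prob.size) - 1]:.3f}, "
              f"area under confidence curve={confidence_parameter:.3f}")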

  15. Precursors to potential severe core damage accidents: 1994, a status report. Volume 22: Appendix I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belles, R.J.; Cletcher, J.W.; Copinger, D.A.

    Nine operational events that affected eleven commercial light-water reactors (LWRs) during 1994 and that are considered to be precursors to potential severe core damage are described. All these events had conditional probabilities of subsequent severe core damage greater than or equal to 1.0 × 10⁻⁶. These events were identified by computer-screening the 1994 licensee event reports from commercial LWRs to identify those that could be potential precursors. Candidate precursors were then selected and evaluated in a process similar to that used in previous assessments. Selected events underwent engineering evaluation that identified, analyzed, and documented the precursors. Other events designated by the Nuclear Regulatory Commission (NRC) also underwent a similar evaluation. Finally, documented precursors were submitted for review by licensees and NRC headquarters and regional offices to ensure that the plant design and its response to the precursor were correctly characterized. This study is a continuation of earlier work, which evaluated 1969-1981 and 1984-1993 events. The report discusses the general rationale for this study, the selection and documentation of events as precursors, and the estimation of conditional probabilities of subsequent severe core damage for events. This document is bound in two volumes: Vol. 21 contains the main report and Appendices A-H; Vol. 22 contains Appendix I.

  16. Operation of the PAVE PAWS Radar System at Beale Air Force Base, California. Part 2. Public Comment & AF Response.

    DTIC Science & Technology

    1980-07-01

    trip next month to Europe, and when I come back. It’s for this reason that I was not able to have it all typed and prepared, and the Air Force was... millimeter of culture medium. A mutational event such as a change in a single base pair in the bacterial DNA, which is impossible to detect by standard... 100) bacteria, a rare single mutation event with a probability of say 1 in 100,000,000, a probability of 10⁻⁸, will thus be amplified by a factor of

  17. Dimensional Representation and Gradient Boosting for Seismic Event Classification

    NASA Astrophysics Data System (ADS)

    Semmelmayer, F. C.; Kappedal, R. D.; Magana-Zook, S. A.

    2017-12-01

    In this research, we conducted experiments on representational structures for 5009 seismic signals with the intent of finding a method to classify signals as either an explosion or an earthquake in an automated fashion. We also applied a gradient-boosted classifier. While perfect classification was not attained (our best model reached approximately 88% accuracy), some cases demonstrate that many events can be filtered out as being, with very high probability, explosions or earthquakes, diminishing subject-matter experts' (SME) workload for first-stage analysis. It is our hope that these methods can be refined, further increasing the classification probability.
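
    As a rough illustration of the gradient-boosted classification step (not the authors' feature set or code), a sketch with scikit-learn on placeholder waveform features might look like this:

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split

        # Placeholder features (e.g., spectral ratios, amplitudes) and labels:
        # 0 = earthquake, 1 = explosion.  Real work would use engineered
        # representations of the 5009 seismic signals.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(5009, 20))
        y = rng.integers(0, 2, size=5009)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)
        clf = GradientBoostingClassifier().fit(X_tr, y_tr)

        # Class probabilities let high-confidence events be filtered out
        # automatically, leaving ambiguous ones for analyst review.
        proba = clf.predict_proba(X_te)[:, 1]
        confident = (proba > 0.95) | (proba < 0.05)
        print("accuracy:", clf.score(X_te, y_te))
        print("fraction auto-filtered:", confident.mean())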

  18. Disasters as a necessary part of benefit-cost analyses.

    PubMed

    Mark, R K; Stuart-Alexander, D E

    1977-09-16

    Benefit-cost analyses for water projects generally have not included the expected costs (residual risk) of low-probability disasters such as dam failures, impoundment-induced earthquakes, and landslides. Analysis of the history of these types of events demonstrates that dam failures are not uncommon and that the probability of a reservoir-triggered earthquake increases with increasing reservoir depth. Because the expected costs from such events can be significant and risk is project-specific, estimates should be made for each project. The cost of expected damage from a "high-risk" project in an urban area could be comparable to project benefits.

  19. Diagnosability of Stochastic Chemical Kinetic Systems: A Discrete Event Systems Approach (PREPRINT)

    DTIC Science & Technology

    2010-01-01

    USA. E-mail: thorsley@u.washington.edu. This research is partially supported by the 2006 AFOSR MURI award “High Confidence Design for Distributed... occurrence of the finite sample path ω. These distributions are defined recursively to be π0(x) := π0(x), π_{ωσ}(x′) := Σ_{x∈X} π_ω(x) r(x′, σ | x) e^{−r(x′, σ | x)...} e^{−r_x τ}. (2) This is the probability that the arrival time of the first event is greater than τ. For finite sample paths with strings

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marleau, Peter; Monterial, Mateusz; Clarke, Shaun

    A Bayesian approach is proposed for pulse shape discrimination of photons and neutrons in liquid organic scintillators. Instead of drawing a decision boundary, each pulse is assigned a photon or neutron confidence probability. In addition, this allows for photon and neutron classification on an event-by-event basis. The sum of those confidence probabilities is used to estimate the number of photon and neutron instances in the data. An iterative scheme, similar to an expectation-maximization algorithm for Gaussian mixtures, is used to infer the ratio of photons to neutrons in each measurement. The probability space therefore adapts to data with varying photon-to-neutron ratios. A time-correlated measurement of Am–Be and separate measurements of 137Cs, 60Co and 232Th photon sources were used to construct libraries of neutrons and photons. These libraries were then used to produce synthetic data sets with varying ratios of photons to neutrons. The probability-weighted method that we implemented maintained a neutron acceptance rate of up to 90% for photon-to-neutron ratios up to 2000 and performed 9% better than the decision-boundary approach. Furthermore, the iterative approach appropriately changed the probability space with an increasing number of photons, which kept the neutron population estimate from unrealistically increasing.
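
    A toy version of the iterative, expectation-maximization-style estimate of the photon-to-neutron ratio (assuming, for illustration only, that each pulse is summarized by a single one-dimensional pulse-shape parameter drawn from one of two Gaussians) could be sketched as:

        import numpy as np

        rng = np.random.default_rng(2)
        # Synthetic pulse-shape parameter: photons near 0.10, neutrons near 0.25
        x = np.concatenate([rng.normal(0.10, 0.02, 9000),   # photons
                            rng.normal(0.25, 0.03, 1000)])  # neutrons

        # EM for a two-component Gaussian mixture
        w = np.array([0.5, 0.5])          # mixing fractions (photon, neutron)
        mu = np.array([0.08, 0.30])
        sd = np.array([0.05, 0.05])
        for _ in range(200):
            # E-step: per-pulse confidence probabilities for each class
            pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
            resp = w * pdf
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: update fractions, means, and widths
            w = resp.mean(axis=0)
            mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
            sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / resp.sum(axis=0))

        # Summing the per-pulse confidences estimates the photon and neutron counts
        print("estimated counts:", resp.sum(axis=0))
        print("estimated photon-to-neutron ratio:", w[0] / w[1])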

  1. Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method

    PubMed Central

    Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan

    2018-01-01

    Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of dam failure and the resulting loss of life. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. Estimates of the loss of life associated with dam failure are summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to one reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. Such risk analysis is essential for reducing the probability of dam failure and improving the level of dam risk management. PMID:29710824
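
    For readers unfamiliar with event-tree arithmetic, the sketch below multiplies branch probabilities along a single failure path, representing each uncertain probability as a triangular fuzzy number; the branch values are invented, and the component-wise min/mode/max product is a common simple approximation rather than the paper's exact formulation.

        # Triangular fuzzy number represented as (min, mode, max)
        def fuzzy_mult(a, b):
            # Approximate product of two triangular fuzzy numbers, component-wise
            return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

        # Hypothetical branches for one failure path of an earth-rock dam:
        # flood load occurs -> overtopping given flood -> breach given overtopping
        p_flood       = (0.005, 0.010, 0.020)
        p_overtopping = (0.05,  0.10,  0.20)
        p_breach      = (0.30,  0.50,  0.70)

        path = fuzzy_mult(fuzzy_mult(p_flood, p_overtopping), p_breach)
        print("fuzzy failure probability (min, mode, max):", path)
        # A crisp estimate can be obtained by defuzzifying, e.g. taking the mode
        # or the centroid (min + mode + max) / 3 of the triangular number.
        print("centroid estimate:", sum(path) / 3)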

  2. Acoustic-assisted fluidic hourglasses

    NASA Astrophysics Data System (ADS)

    Guimaraes, Tamara; Marin, Alvaro; Kaehler, Christian J.; Barnkob, Rune

    2017-11-01

    Microfluidic devices are prone to clogging when suspensions are forced through narrow passages. Such clogging events occur when particles form arches that block the channel. In this work we study the clogging probabilities in a microfluidic hourglass subjected to ultrasound. We measure the clogging probabilities over the ranges of sound amplitudes and particle-to-neck size ratios in which clogging events are more likely to occur. The ultrasound induces acoustic radiation forces on the suspended particles, leading to particle migration perpendicular to the channel flow direction. This transverse particle rearrangement can significantly reduce the clogging probability by decreasing the chance of arching in the narrowing of the passage. We show that, with properly chosen sound actuation conditions, the method is reliable, non-intrusive, and preventive, and extends the life of fluidic devices (microfluidic or larger) carrying particles in a wide range of sizes.

  3. Rare Event Simulation in Radiation Transport

    NASA Astrophysics Data System (ADS)

    Kollman, Craig

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep our estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt to generalize this algorithm to a continuous state space is made. This involves partitioning the space into a finite number of cells. There is a tradeoff between additional computation per iteration and variance reduction per iteration that arises in determining the optimal grid size. All versions of this algorithm can be thought of as a compromise between deterministic and Monte Carlo methods, capturing advantages of both techniques.
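
    The likelihood-ratio bookkeeping at the heart of importance sampling can be shown in a few lines. The sketch below estimates a small Gaussian tail probability by sampling from a shifted density and reweighting; it is a standard textbook example, not the neutron-transport algorithm developed in the dissertation.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        threshold = 5.0                     # P(X > 5) for X ~ N(0, 1) is ~2.9e-7
        n = 100_000

        # Naive Monte Carlo: almost never sees the rare event
        naive = (rng.standard_normal(n) > threshold).mean()

        # Importance sampling: draw from N(threshold, 1) and reweight by the
        # likelihood ratio between the true and sampling densities
        y = rng.normal(loc=threshold, scale=1.0, size=n)
        weights = norm.pdf(y) / norm.pdf(y, loc=threshold)
        is_est = np.mean((y > threshold) * weights)

        print("exact      :", norm.sf(threshold))
        print("naive MC   :", naive)
        print("importance :", is_est)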

  4. Development of a flood early warning system and communication with end-users: the Vipava/Vipacco case study in the KULTURisk FP7 project

    NASA Astrophysics Data System (ADS)

    Grossi, Giovanna; Caronna, Paolo; Ranzi, Roberto

    2014-05-01

    Within the framework of risk communication, the goal of an early warning system is to support the interaction between technicians and authorities (and subsequently the population) as a prevention measure. The methodology proposed in the KULTURisk FP7 project aimed to build a closer collaboration between these actors, with a view to promoting pro-active actions to mitigate the effects of flood hazards. The transnational (Slovenia/Italy) Soča/Isonzo case study focused on this concept of cooperation between stakeholders and hydrological forecasters. The DIMOSHONG_VIP hydrological model was calibrated for the Vipava/Vipacco River (650 km2), a tributary of the Soča/Isonzo River, on the basis of flood events that occurred between 1998 and 2012. The European Centre for Medium-Range Weather Forecasts (ECMWF) provided the past meteorological forecasts, both deterministic (1 forecast) and probabilistic (51 ensemble members). The resolution of the ECMWF grid is currently about 15 km (Deterministic-DET) and 30 km (Ensemble Prediction System-EPS). A verification was conducted to validate the flood-forecast outputs of the DIMOSHONG_VIP+ECMWF early warning system. Basic descriptive statistics, such as event probability, probability of a forecast occurrence, and frequency bias, were determined. Some performance measures were calculated, such as hit rate (probability of detection) and false alarm rate (probability of false detection). Relative Operating Characteristic (ROC) curves were generated for both deterministic and probabilistic forecasts. These analyses showed good performance of the early warning system, given the small size of the sample. Particular attention was paid to the design of flood-forecasting output charts, involving and consulting stakeholders (Alto Adriatico River Basin Authority), hydrology specialists in the field, and members of the public. Graph types for both forecasted precipitation and discharge were set. Three different risk thresholds were identified ("attention", "pre-alarm" or "alert", "alarm"), with an "icon-style" representation suitable for communication to civil protection stakeholders or the public. Aiming to show probabilistic representations in a "user-friendly" way, we opted for the visualization of the single deterministic forecasted hydrograph together with the 5%, 25%, 50%, 75% and 95% percentile bands of the Hydrological Ensemble Prediction System (HEPS). HEPS is generally used for 3-5 day hydrological forecasts, when the error due to incorrect initial data is comparable to the error due to the lower resolution with respect to the deterministic forecast. In short-term forecasting (12-48 hours) the HEPS members naturally show a similar tendency; in this case, considering its higher resolution, the deterministic forecast is expected to be more effective. Plotting the different forecasts in the same chart allows the use of model outputs from 4-5 days to a few hours before a potential flood event. This framework was built to help a stakeholder, such as a mayor or a civil protection authority, in flood control and management operations, and was designed to be included in a wider decision support system.
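
    The verification scores quoted above (hit rate, false-alarm rate, frequency bias) all derive from a 2 × 2 contingency table of forecast versus observed threshold exceedances; a minimal sketch with invented counts:

        # Contingency table counts (hypothetical): forecast vs. observed threshold exceedance
        hits, false_alarms, misses, correct_negatives = 18, 6, 4, 120
        total = hits + false_alarms + misses + correct_negatives

        event_probability = (hits + misses) / total
        forecast_probability = (hits + false_alarms) / total
        frequency_bias = (hits + false_alarms) / (hits + misses)

        hit_rate = hits / (hits + misses)                                      # probability of detection
        false_alarm_rate = false_alarms / (false_alarms + correct_negatives)   # probability of false detection

        print(f"event prob {event_probability:.3f}, forecast prob {forecast_probability:.3f}")
        print(f"bias {frequency_bias:.2f}, POD {hit_rate:.2f}, POFD {false_alarm_rate:.2f}")
        # Sweeping the decision threshold (e.g., over EPS exceedance probabilities)
        # and plotting POD against POFD traces out the ROC curve.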

  5. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first ranked flood event 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 microphysics, atmospheric boundary layer, and cumulus parameterization schemes combinations. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. Moreover, climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimation for the American River Watershed is 28.57 inches as published in the hydrometeorological report no. 59 and a previous PMP value was 31.48 inches as published in the hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimations correspond to 0.036 percent and 0.011 percent, respectively.

  6. How does new evidence change our estimates of probabilities? Carnap's formula revisited

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik; Quintana, Chris

    1992-01-01

    The formula originally proposed by R. Carnap in his analysis of induction is reviewed and its natural generalization is presented. A situation is considered where the probability of a certain event is determined without using standard statistical methods due to the lack of observation.
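
    A hedged sketch of how such an estimate updates with new evidence, assuming the λ-continuum form commonly attributed to Carnap (estimated probability = (n_i + λ/k) / (N + λ) for k mutually exclusive outcomes); the counts and λ below are illustrative only:

        def carnap_estimate(counts, i, lam=2.0):
            """Probability that the next observation is outcome i, given observed counts.

            counts : observed frequencies for the k possible outcomes
            lam    : weight given to the prior (lam -> 0: pure empirical frequency;
                     large lam: estimate stays near the uniform prior 1/k)
            """
            k = len(counts)
            N = sum(counts)
            return (counts[i] + lam / k) / (N + lam)

        # No observations yet: the estimate falls back to the uniform prior 1/k
        print(carnap_estimate([0, 0], i=0))        # 0.5
        # New evidence shifts the estimate smoothly toward the observed frequency
        print(carnap_estimate([7, 3], i=0))        # ~0.67
        print(carnap_estimate([70, 30], i=0))      # ~0.70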

  7. Rates and impact of trauma and current stressors among Darfuri refugees in Eastern Chad.

    PubMed

    Rasmussen, Andrew; Nguyen, Leanh; Wilkinson, John; Vundla, Sikhumbuzo; Raghavan, Sumithra; Miller, Kenneth E; Keller, Allen S

    2010-04-01

    Darfur refugees face hardships associated with chronic displacement, including lack of basic needs and safety concerns. Psychiatric research on refugees has focused on trauma, but daily stressors may contribute more to variance in distress. This article reports rates of past trauma and current stressors among Darfur refugees and gauges the contribution of each to psychological distress and functional impairment. A representative sample of 848 Darfuris in 2 refugee camps were interviewed about traumatic events, stressors faced in the camps, psychological distress, and functional impairment. Basic needs and safety concerns were more strongly correlated with measures of distress (rs = .19-.31) than were war-related traumatic events (rs = .09-.20). Hierarchical regression supported models in which effects of trauma on distress were mediated by current stressors. Although war-related traumatic events are the initial causes of refugees' hardship, findings suggest that the day-to-day challenges and concerns in camps mediate psychological distress associated with these events.

  8. Rates and Impact of Trauma and Current Stressors Among Darfuri Refugees in Eastern Chad

    PubMed Central

    Rasmussen, Andrew; Nguyen, Leanh; Wilkinson, John; Vundla, Sikhumbuzo; Raghavan, Sumithra; Miller, Kenneth E.; Keller, Allen S.

    2010-01-01

    Darfur refugees face hardships associated with chronic displacement, including lack of basic needs and safety concerns. Psychiatric research on refugees has focused on trauma, but daily stressors may contribute more to variance in distress. In this article we report rates of past trauma and current stressors among Darfur refugees and gauge the contribution of each to psychological distress and functional impairment. A representative sample of 848 Darfuris in two refugee camps were interviewed about traumatic events, stressors faced in the camps, psychological distress and functional impairment. Basic needs and safety concerns were more strongly correlated with measures of distress (r's = .19–.31) than were war-related traumatic events (r's = .09–.20). Hierarchical regression supported models in which effects of trauma on distress were mediated by current stressors. Although war-related traumatic events are the initial causes of refugees' hardship, findings suggest that the day-to-day challenges and concerns in camps mediate psychological distress associated with these events. PMID:20553516

  9. Modeling the Impact of Control on the Attractiveness of Risk in a Prospect Theory Framework

    PubMed Central

    Young, Diana L.; Goodie, Adam S.; Hall, Daniel B.

    2010-01-01

    Many decisions involve a degree of personal control over event outcomes, which is exerted through one’s knowledge or skill. In three experiments we investigated differences in decision making between prospects based on a) the outcome of random events and b) the outcome of events characterized by control. In Experiment 1, participants estimated certainty equivalents (CEs) for bets based on either random events or the correctness of their answers to U.S. state population questions across the probability spectrum. In Experiment 2, participants estimated CEs for bets based on random events, answers to U.S. state population questions, or answers to questions about 2007 NCAA football game results. Experiment 3 extended the same procedure as Experiment 1 using a within-subjects design. We modeled data from all experiments in a prospect theory framework to establish psychological mechanisms underlying decision behavior. Participants weighted the probabilities associated with bets characterized by control so as to reflect greater risk attractiveness relative to bets based on random events, as evidenced by more elevated weighting functions under conditions of control. This research elucidates possible cognitive mechanisms behind increased risk taking for decisions characterized by control, and implications for various literatures are discussed. PMID:21278906
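
    The "elevation" of the weighting function referred to here can be made concrete with a two-parameter probability weighting function of the linear-in-log-odds type; the functional form and parameter values below are used purely as an illustration and are not the paper's fitted model.

        import numpy as np

        def weight(p, gamma, delta):
            """Linear-in-log-odds weighting: w(p) = d*p^g / (d*p^g + (1-p)^g).
            gamma controls curvature, delta controls elevation."""
            return delta * p**gamma / (delta * p**gamma + (1 - p)**gamma)

        p = np.linspace(0.05, 0.95, 10)
        w_random  = weight(p, gamma=0.6, delta=0.8)   # hypothetical parameters for chance bets
        w_control = weight(p, gamma=0.6, delta=1.2)   # higher delta -> more elevated curve under control

        # The control curve sits above the chance curve at every probability level
        print(np.round(w_control - w_random, 3))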

  10. Modeling the Impact of Control on the Attractiveness of Risk in a Prospect Theory Framework.

    PubMed

    Young, Diana L; Goodie, Adam S; Hall, Daniel B

    2011-01-01

    Many decisions involve a degree of personal control over event outcomes, which is exerted through one's knowledge or skill. In three experiments we investigated differences in decision making between prospects based on a) the outcome of random events and b) the outcome of events characterized by control. In Experiment 1, participants estimated certainty equivalents (CEs) for bets based on either random events or the correctness of their answers to U.S. state population questions across the probability spectrum. In Experiment 2, participants estimated CEs for bets based on random events, answers to U.S. state population questions, or answers to questions about 2007 NCAA football game results. Experiment 3 extended the same procedure as Experiment 1 using a within-subjects design. We modeled data from all experiments in a prospect theory framework to establish psychological mechanisms underlying decision behavior. Participants weighted the probabilities associated with bets characterized by control so as to reflect greater risk attractiveness relative to bets based on random events, as evidenced by more elevated weighting functions under conditions of control. This research elucidates possible cognitive mechanisms behind increased risk taking for decisions characterized by control, and implications for various literatures are discussed.

  11. The Roll of the Dice: Differentiation Outcomes and the Role of Late Protoplanetary Impacts

    NASA Astrophysics Data System (ADS)

    Heinze, W. D.

    2018-05-01

    Because late accretion occurs through the impact of 10–100 (large) embryos, which have a low probability of being high-velocity events, and such events are necessary for magnetic dynamos, small-number statistics control differentiation outcomes.

  12. Semiparametric temporal process regression of survival-out-of-hospital.

    PubMed

    Zhan, Tianyu; Schaubel, Douglas E

    2018-05-23

    The recurrent/terminal event data structure has undergone considerable methodological development in the last 10-15 years. An example of the data structure that has arisen with increasing frequency involves the recurrent event being hospitalization and the terminal event being death. We consider the response Survival-Out-of-Hospital, defined as a temporal process (indicator function) taking the value 1 when the subject is currently alive and not hospitalized, and 0 otherwise. Survival-Out-of-Hospital is a useful alternative strategy for the analysis of hospitalization/survival in the chronic disease setting, with the response variate representing a refinement to survival time through the incorporation of an objective quality-of-life component. The semiparametric model we consider assumes multiplicative covariate effects and leaves unspecified the baseline probability of being alive-and-out-of-hospital. Using zero-mean estimating equations, the proposed regression parameter estimator can be computed without estimating the unspecified baseline probability process, although baseline probabilities can subsequently be estimated for any time point within the support of the censoring distribution. We demonstrate that the regression parameter estimator is asymptotically normal, and that the baseline probability function estimator converges to a Gaussian process. Simulation studies are performed to show that our estimating procedures have satisfactory finite sample performances. The proposed methods are applied to the Dialysis Outcomes and Practice Patterns Study (DOPPS), an international end-stage renal disease study.

  13. MATCH package for the ANL three-view geometry program. [For matching particle tracks from various views to obtain proper input for TVGP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gieraltowski, G.F.

    1976-02-01

    The ANL MATCH package consists of a set of 13 subroutines which are linked to the current 12-foot and 15-foot versions of the ANL TVGP program. Their purpose is to match the tracks from the various measured views to obtain a proper matched set of tracks to be processed by TVGP. The MATCH package can effectively handle up to 20 tracks per event measured in 2 or 3 views and, in cases of ambiguous match solutions, allow up to 10 match ambiguities. A basic assumption made is that the same number of tracks is measured in each view. MATCH can work in either two or three measured views with the assumption that, if only two views are measured, the last point measured on each track is a good representation of the true end-point of the track. This is not to say that, if this assumption is false, MATCH cannot obtain a match solution. It is true, however, that the probability of obtaining a match solution is inversely proportional both to the number of tracks per vertex and to the momentum of the tracks. Current uses of MATCH are in obtaining match solutions for two-view K⁻p (6.5 GeV/c) events measured on POLLY III and in obtaining match solutions for events with large numbers of tracks (3 to 10) produced by a ν̄p interaction in the FNAL 15-foot bubble chamber with a spectrum of momentum values ranging from 5 to 25 GeV/c. (RWR)

  14. Fifty Years of THERP and Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring

    2012-06-01

    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of the NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament of its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain’s pioneering work.

  15. Space shuttle solid rocket booster recovery system definition, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as functions of impact velocity and component strength capability.
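
    The Monte Carlo load-versus-capability calculation described can be sketched generically as follows; the distributions and numbers are placeholders, not the study's actual water-impact statistics.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 200_000

        # Hypothetical water-impact conditions and component strength capability
        impact_velocity = rng.normal(loc=23.0, scale=3.0, size=n)   # m/s at splashdown
        strength_limit  = rng.normal(loc=30.0, scale=2.5, size=n)   # equivalent velocity the component tolerates

        # A component "fails" on samples where the impact demand exceeds its capability
        failures = impact_velocity > strength_limit
        print(f"estimated failure probability: {failures.mean():.4f}")

        # Failure probability as a function of impact velocity (binned)
        bins = np.arange(15, 36, 2.5)
        idx = np.digitize(impact_velocity, bins)
        for b in range(1, len(bins)):
            sel = idx == b
            if sel.any():
                print(f"{bins[b-1]:4.1f}-{bins[b]:4.1f} m/s : P(fail) = {failures[sel].mean():.3f}")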

  16. The Role of Sleep in the Modulation of Gastroesophageal Reflux and Symptoms in NICU Neonates.

    PubMed

    Qureshi, Aslam; Malkar, Manish; Splaingard, Mark; Khuhro, Abdul; Jadcherla, Sudarshan

    2015-09-01

    Newborns sleep about 80% of the time. Gastroesophageal reflux disease is prevalent in about 10% of neonatal intensive care unit infants. Concurrent polysomnography and pH-impedance studies clarify the relationship of gastroesophageal reflux with sleep. To characterize spatiotemporal and chemical characteristics of impedance-positive gastroesophageal reflux and define symptom associations in sleep and wake states in symptomatic neonates. We hypothesized that frequency of impedance-positive gastroesophageal reflux events and their association with cardiorespiratory symptoms is greater during sleep. Eighteen neonates underwent concurrent polysomnography with a pH-impedance study. Impedance-positive gastroesophageal reflux events (weakly acidic or acidic) were categorized between sleep versus wake states: Symptom Index = number of symptoms with gastroesophageal reflux/total symptoms*100; Symptom Sensitivity Index = number of gastroesophageal reflux with symptoms/total gastroesophageal reflux*100; Symptom Association Probability = [(1 - probability of observed association between reflux and symptoms)*100]). We analyzed 317 gastroesophageal reflux events during 116 hours of polysomnography. During wake versus sleep, respectively, the median (interquartile range) frequency of impedance-positive gastroesophageal reflux was 4.9 (3.1-5.8) versus 1.4 (0.7-1.7) events/hour (P < 0.001) and the proximal migration was 2.6 (0.8-3.3) versus 0.2 (0.0-0.9) events/hour (P < 0.001). The Symptom Index for cardiorespiratory symptoms for impedance-positive events was 22.5 (0-55.3) versus 6.1 (0-13), P = 0.04, whereas the Symptom Sensitivity Index was 9.1 (0-23.1) versus 18.4 (0-50), P = 0.04, although Symptom Association Probability was similar (P = 0.68). Contrary to our hypothesis, frequency of gastroesophageal reflux in sleep is lower; however, spatiotemporal and chemical characteristics of gastroesophageal reflux and symptom-generation mechanisms are distinct. For cardiorespiratory symptoms during sleep, a lower Symptom Index entails evaluation for etiologies other than gastroesophageal reflux disease, a higher Symptom Sensitivity Index implies heightened esophageal sensitivity, and similar Symptom Association Probability indicates other mechanistic possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.
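
    The three association measures are simple ratios of event counts; a sketch with invented counts follows (the Symptom Association Probability is shown only schematically, since its usual computation involves a statistical test of chance association):

        # Hypothetical counts from one concurrent polysomnography + pH-impedance study
        total_symptoms = 20
        symptoms_with_reflux = 5            # symptoms occurring within the reflux time window
        total_reflux_events = 40
        reflux_events_with_symptoms = 6

        symptom_index = symptoms_with_reflux / total_symptoms * 100                           # SI
        symptom_sensitivity_index = reflux_events_with_symptoms / total_reflux_events * 100   # SSI

        # SAP = (1 - p) * 100, where p is the probability that the observed
        # association between reflux and symptoms arose by chance.
        p_chance = 0.04                     # placeholder p-value from the association test
        symptom_association_probability = (1 - p_chance) * 100

        print(f"SI {symptom_index:.1f}%, SSI {symptom_sensitivity_index:.1f}%, "
              f"SAP {symptom_association_probability:.1f}%")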

  17. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward infinity, while the Hill sphere method results in a severely underestimated probability. We provide a discussion of the reasons for these differences, and we finally present the results of the MOID method in the form of probability maps for the Earth and Mars on their current orbits. These maps show a relatively flat probability distribution, except for the occurrence of two ridges found at small inclinations and for coinciding projectile/target perihelion distances. Conclusions: Our results verify the standard formulae in the general case, away from the singularities. In fact, severe shortcomings are limited to the immediate vicinity of those extreme orbits. On the other hand, the new Monte Carlo methods can be used without excessive consumption of computer time, and the MOID method avoids the problems associated with the other methods. Appendices are available in electronic form at http://www.aanda.org

  18. A record of large earthquakes during the past two millennia on the southern Green Valley Fault, California

    USGS Publications Warehouse

    Lienkaemper, James J.; Baldwin, John N.; Turner, Robert; Sickler, Robert R.; Brown, Johnathan

    2013-01-01

    We document evidence for surface-rupturing earthquakes (events) at two trench sites on the southern Green Valley fault, California (SGVF). The 75-80-km long dextral SGVF creeps ~1-4 mm/yr. We identify stratigraphic horizons disrupted by upward-flowering shears and in-filled fissures unlikely to have formed from creep alone. The Mason Rd site exhibits four events from ~1013 CE to the Present. The Lopes Ranch site (LR, 12 km to the south) exhibits three events from 18 BCE to Present including the most recent event (MRE), 1610 ±52 yr CE (1σ) and a two-event interval (18 BCE-238 CE) isolated by a millennium of low deposition. Using Oxcal to model the timing of the 4-event earthquake sequence from radiocarbon data and the LR MRE yields a mean recurrence interval (RI or μ) of 199 ±82 yr (1σ) and ±35 yr (standard error of the mean), the first based on geologic data. The time since the most recent earthquake (open window since MRE) is 402 yr ±52 yr, well past μ~200 yr. The shape of the probability density function (pdf) of the average RI from Oxcal resembles a Brownian Passage Time (BPT) pdf (i.e., rather than normal) that permits rarer longer ruptures potentially involving the Berryessa and Hunting Creek sections of the northernmost GVF. The model coefficient of variation (cv, σ/μ) is 0.41, but a larger value (cv ~0.6) fits better when using BPT. A BPT pdf with μ of 250 yr and cv of 0.6 yields 30-yr rupture probabilities of 20-25% versus a Poisson probability of 11-17%.
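
    The 30-yr rupture probabilities quoted above follow from the conditional probability of failure in the next ΔT years given the elapsed open interval. The sketch below uses the Brownian Passage Time (inverse-Gaussian) distribution with the stated mean recurrence and aperiodicity; the closed-form CDF is the standard inverse-Gaussian expression, applied here only as an illustration.

        import math

        def bpt_cdf(t, mu, alpha):
            """CDF of the Brownian Passage Time model: inverse Gaussian with mean mu
            and aperiodicity (coefficient of variation) alpha, so lambda = mu / alpha**2."""
            lam = mu / alpha**2
            phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
            a = math.sqrt(lam / t)
            return phi(a * (t / mu - 1.0)) + math.exp(2.0 * lam / mu) * phi(-a * (t / mu + 1.0))

        def conditional_prob(elapsed, window, mu, alpha):
            """P(rupture within `window` years | no rupture in the last `elapsed` years)."""
            return (bpt_cdf(elapsed + window, mu, alpha) - bpt_cdf(elapsed, mu, alpha)) / \
                   (1.0 - bpt_cdf(elapsed, mu, alpha))

        # Illustrative values echoing the abstract: mu = 250 yr, alpha = 0.6, open interval ~400 yr
        print(conditional_prob(elapsed=402, window=30, mu=250, alpha=0.6))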

  19. Induced seismicity hazard and risk by enhanced geothermal systems: an expert elicitation approach

    NASA Astrophysics Data System (ADS)

    Trutnevyte, Evelina; Azevedo, Inês L.

    2018-03-01

    Induced seismicity is a concern for multiple geoenergy applications, including low-carbon enhanced geothermal systems (EGS). We present the results of an international expert elicitation (n = 14) on EGS induced seismicity hazard and risk. Using a hypothetical scenario of an EGS plant and its geological context, we show that expert best-guess estimates of annualized exceedance probabilities of an M ≥ 3 event range from 0.2%-95% during reservoir stimulation and 0.2%-100% during operation. Best-guess annualized exceedance probabilities of M ≥ 5 event span from 0.002%-2% during stimulation and 0.003%-3% during operation. Assuming that tectonic M7 events could occur, some experts do not exclude induced (triggered) events of up to M7 too. If an induced M = 3 event happens at 5 km depth beneath a town with 10 000 inhabitants, most experts estimate a 50% probability that the loss is contained within 500 000 USD without any injuries or fatalities. In the case of an induced M = 5 event, there is 50% chance that the loss is below 50 million USD with the most-likely outcome of 50 injuries and one fatality or none. As we observe a vast diversity in quantitative expert judgements and underlying mental models, we conclude with implications for induced seismicity risk governance. That is, we suggest documenting individual expert judgements in induced seismicity elicitations before proceeding to consensual judgements, to convene larger expert panels in order not to cherry-pick the experts, and to aim for multi-organization multi-model assessments of EGS induced seismicity hazard and risk.

  20. Computable general equilibrium modelling of economic impacts from volcanic event scenarios at regional and national scale, Mt. Taranaki, New Zealand

    NASA Astrophysics Data System (ADS)

    McDonald, G. W.; Cronin, S. J.; Kim, J.-H.; Smith, N. J.; Murray, C. A.; Procter, J. N.

    2017-12-01

    The economic impacts of volcanism extend well beyond the direct costs of loss of life and asset damage. This paper presents one of the first attempts to assess the economic consequences of disruption associated with volcanic impacts at a range of temporal and spatial scales using multi-regional and dynamic computable general equilibrium (CGE) modelling. Based on the last decade of volcanic research findings at Mt. Taranaki, three volcanic event scenarios (Tahurangi, Inglewood and Opua) differentiated by critical physical thresholds were generated. In turn, the corresponding disruption economic impacts were calculated for each scenario. Under the Tahurangi scenario (annual probability of 0.01-0.02), a small-scale explosive (Volcanic Explosivity Index (VEI) 2-3) and dome-forming eruption, the economic impacts were negligible, with complete economic recovery experienced within a year. The larger Inglewood sub-Plinian to Plinian eruption scenario event (VEI > 4, annualised probability of 0.003) produced significant impacts on the Taranaki region economy of $207 million (representing 4.0% of regional gross domestic product (GDP) 1 year after the event, in 2007 New Zealand dollars), from which the economy will take around 5 years to recover. The Opua scenario, the largest magnitude volcanic hazard modelled, is a major flank collapse and debris avalanche event with an annual probability of 0.00018. The associated economic impacts of this scenario were $397 million (representing 7.7% of regional GDP 1 year after the event), with the Taranaki region economy suffering permanent structural changes. Our dynamic analysis illustrates that different economic impacts play out at different stages in a volcanic crisis. We also discuss the key strengths and weaknesses of our modelling along with potential extensions.

  1. A trial-based economic evaluation of 2 nurse-led disease management programs in heart failure.

    PubMed

    Postmus, Douwe; Pari, Anees A Abdul; Jaarsma, Tiny; Luttik, Marie Louise; van Veldhuisen, Dirk J; Hillege, Hans L; Buskens, Erik

    2011-12-01

    Although previously conducted meta-analyses suggest that nurse-led disease management programs in heart failure (HF) can improve patient outcomes, uncertainty regarding the cost-effectiveness of such programs remains. To compare the relative merits of 2 variants of a nurse-led disease management program (basic or intensive support by a nurse specialized in the management of patients with HF) against care as usual (routine follow-up by a cardiologist), a trial-based economic evaluation was conducted alongside the COACH study. In terms of costs per life-year, basic support was found to dominate care as usual, whereas the incremental cost-effectiveness ratio between intensive support and basic support was found to be equal to €532,762 per life-year; in terms of costs per quality-adjusted life-year (QALY), basic support was found to dominate both care as usual and intensive support. An assessment of the uncertainty surrounding these findings showed that, at a threshold value of €20,000 per life-year/€20,000 per QALY, basic support was found to have a probability of 69/62% of being optimal against 17/30% and 14/8% for care as usual and intensive support, respectively. The results of our subgroup analysis suggest that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF would be optimal if the willingness-to-pay threshold exceeds €45,345 per life-year/€59,289 per QALY. Although the differences in costs and effects among the 3 study groups were not statistically significant, from a decision-making perspective, basic support still had a relatively large probability of generating the highest health outcomes at the lowest costs. Our results also substantiated that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF could further improve health outcomes at slightly higher costs. Copyright © 2011 Mosby, Inc. All rights reserved.
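
    The "probability of being optimal" at a given willingness-to-pay threshold is typically computed from net monetary benefit across bootstrap (or simulated) replicates; the sketch below uses synthetic replicates with invented costs and QALYs, not the COACH trial data.

        import numpy as np

        rng = np.random.default_rng(5)
        n_boot = 10_000
        wtp = 20_000                        # willingness to pay per QALY (illustrative)

        # Synthetic bootstrap replicates of (mean cost, mean QALY) per strategy
        strategies = {
            "care as usual":     (rng.normal(10_000, 800, n_boot), rng.normal(1.00, 0.04, n_boot)),
            "basic support":     (rng.normal( 9_500, 800, n_boot), rng.normal(1.02, 0.04, n_boot)),
            "intensive support": (rng.normal(11_500, 900, n_boot), rng.normal(1.02, 0.04, n_boot)),
        }

        # Net monetary benefit per replicate; the strategy with the highest NMB "wins"
        nmb = np.column_stack([wtp * q - c for c, q in strategies.values()])
        winners = nmb.argmax(axis=1)
        for k, name in enumerate(strategies):
            print(f"P({name} optimal at {wtp}/QALY) = {(winners == k).mean():.2f}")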

  2. wayGoo recommender system: personalized recommendations for events scheduling, based on static and real-time information

    NASA Astrophysics Data System (ADS)

    Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.

    2016-05-01

    wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events does not reach users' attention, either because of the volume of this information or because some of it comes from real-time data sources. The purpose of this work is to facilitate event management by prioritizing the presented events based on users' interests, using both static and real-time data. Through the wayGoo interface, users select conceptual topics that interest them. These topics constitute a browsing behavior vector which is used for learning users' interests implicitly, without being intrusive. The system then estimates user preferences and returns a list of events sorted from most to least preferred. User preferences are modeled via a Naïve Bayesian network which consists of: a) the 'decision' random variable, corresponding to users' decision on attending an event; b) the 'distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging; c) the 'seat availability' random variable, modeled by a linear regression that estimates the probability that the seat availability is encouraging; and d) the 'relevance' random variable, modeled by clustering-based collaborative filtering, which determines the relevance of each event to users' interests. Finally, experimental results show that the proposed system contributes substantially to assisting users in browsing and selecting events to attend.
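
    A toy version of the scoring step is sketched below, treating the distance, seat-availability, and relevance terms as conditionally independent given the attend/skip decision, as in a naïve Bayes formulation; all probabilities are invented placeholders rather than the wayGoo models.

        def attend_probability(p_prior, likelihoods_attend, likelihoods_skip):
            """Posterior P(attend | evidence) for one event under a naive Bayes assumption.

            likelihoods_attend / likelihoods_skip: per-feature probabilities of the observed
            distance, seat availability, and relevance given each decision.
            """
            num = p_prior
            den = 1.0 - p_prior
            for la, ls in zip(likelihoods_attend, likelihoods_skip):
                num *= la
                den *= ls
            return num / (num + den)

        events = {
            "concert":  ([0.8, 0.9, 0.7], [0.3, 0.6, 0.2]),   # near, seats available, relevant
            "lecture":  ([0.4, 0.9, 0.3], [0.5, 0.7, 0.6]),
            "festival": ([0.2, 0.5, 0.9], [0.6, 0.5, 0.3]),
        }
        ranked = sorted(events, key=lambda e: attend_probability(0.3, *events[e]), reverse=True)
        for name in ranked:
            print(name, round(attend_probability(0.3, *events[name]), 2))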

  3. 2012 National Policy Seminar Wrap-Up

    ERIC Educational Resources Information Center

    Blandford, Ayoka

    2012-01-01

    CTE works! That was the recurring theme that attendees heard at the 2012 National Policy Seminar (NPS) hosted by ACTE. For those new to the event and lobbying, a pre-conference workshop, "Learning the Ropes of Washington CTE Advocacy," laid out the basics of Hill advocacy. Veteran CTE advocates were offered a basics-plus session,…

  4. Speed in Information Processing with a Computer Driven Visual Display in a Real-time Digital Simulation. M.S. Thesis - Virginia Polytechnic Inst.

    NASA Technical Reports Server (NTRS)

    Kyle, R. G.

    1972-01-01

    Information transfer between the operator and computer-generated display systems is an area where the human factors engineer finds little useful design data relating human performance to system effectiveness. This study utilized a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information-processing task. The performance criterion was response time to the sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. A number code from one to four, each value equally probable, was assigned at random to the sixteen cells of the matrix and correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.

  5. Statistical properties of fluctuations of time series representing appearances of words in nationwide blog data and their applications: An example of modeling fluctuation scalings of nonstationary time series.

    PubMed

    Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako

    2016-11-01

    To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3 × 10⁹ Japanese blog articles over a period of six years and analyzed corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
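
    Temporal fluctuation scaling of word-appearance counts is typically summarized by fitting the slope of log(standard deviation) against log(mean) across words; a compact sketch on synthetic count series (not the blog corpus itself):

        import numpy as np

        rng = np.random.default_rng(6)
        n_words, n_days = 500, 2000

        # Synthetic daily appearance counts: Poisson baseline modulated by bursty activity
        means = 10 ** rng.uniform(-1, 3, n_words)
        counts = rng.poisson(means[:, None] * rng.gamma(5.0, 0.2, size=(n_words, n_days)))

        mu = counts.mean(axis=1)
        sigma = counts.std(axis=1)

        # Temporal fluctuation scaling sigma ~ mu^alpha; alpha from a log-log fit
        alpha, intercept = np.polyfit(np.log10(mu), np.log10(sigma), 1)
        print(f"fluctuation scaling exponent alpha ~ {alpha:.2f}")
        # alpha = 0.5 corresponds to pure Poisson noise; larger values indicate
        # the kind of correlated, nonsteady dynamics analyzed in the paper.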

  6. Stochastic modelling of intermittent fluctuations in the scrape-off layer: Correlations, distributions, level crossings, and moment estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, O. E., E-mail: odd.erik.garcia@uit.no; Kube, R.; Theodorsen, A.

    A stochastic model is presented for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas. The fluctuations in the plasma density are modeled by a superposition of uncorrelated pulses with fixed shape and duration, describing radial motion of blob-like structures. In the case of an exponential pulse shape and exponentially distributed pulse amplitudes, predictions are given for the lowest order moments, probability density function, auto-correlation function, level crossings, and average times for periods spent above and below a given threshold level. The mean squared errors on estimators of sample mean and variance for realizations of the process by finite time series are also obtained. These results are discussed in the context of single-point measurements of fluctuations in the scrape-off layer, broad density profiles, and implications for plasma–wall interactions due to transient transport events in fusion grade plasmas. The results may also have wide applications for modelling fluctuations in other magnetized plasmas such as basic laboratory experiments and ionospheric irregularities.
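
    The model described, a superposition of uncorrelated, exponentially shaped pulses with exponentially distributed amplitudes (a filtered Poisson process), is straightforward to synthesize; the parameter values below are arbitrary and serve only to illustrate the construction.

        import numpy as np

        rng = np.random.default_rng(7)
        T, dt = 1000.0, 0.01               # record length and sampling step (arbitrary units)
        tau_d = 1.0                        # fixed pulse duration (e-folding time)
        rate = 0.5                         # average pulse arrival rate

        t = np.arange(0.0, T, dt)
        n_pulses = rng.poisson(rate * T)
        arrivals = rng.uniform(0.0, T, n_pulses)
        amplitudes = rng.exponential(1.0, n_pulses)   # exponentially distributed amplitudes

        signal = np.zeros_like(t)
        for t0, a in zip(arrivals, amplitudes):
            mask = t >= t0
            signal[mask] += a * np.exp(-(t[mask] - t0) / tau_d)   # one-sided exponential pulse

        # Intermittency is governed by the ratio of pulse duration to waiting time;
        # sample moments and histograms can be compared against the model predictions.
        print("mean:", signal.mean(), " relative fluctuation:", signal.std() / signal.mean())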

  7. Counterfactual Reasoning: Developing a sense of “nearest possible world”

    PubMed Central

    Rafetseder, Eva; Cristi-Vargas, Renate; Perner, Josef

    2011-01-01

    We investigated at what point in development 3- to 6-year-old children begin to demonstrate counterfactual reasoning by controlling for fortuitously correct answers that result from basic conditional reasoning. Basic conditional reasoning occurs when one applies typical regularities (such as “If it doesn’t rain the street is dry”) to counterfactual questions (such as “If it had not rained, would the street be wet or dry?”) without regard to actual events (for example, if street cleaners had just been washing the street). In counterfactual reasoning, however, the conditional reasoning must be constrained by actual events (according to the “nearest possible world”). In situations when counterfactual reasoning and basic conditional reasoning would yield the same answers, even the youngest children gave mostly correct answers. However, tasks in which the two reasoning strategies resulted in different answers proved unusually difficult even for the older children. PMID:20331674

  8. Oblique impacts: Catastrophic vs. protracted effects

    NASA Technical Reports Server (NTRS)

    Schultz, P. H.

    1988-01-01

    Proposed impacts as the cause of biologic catastrophes at the end of the Cretaceous and Eocene face several enigmas: protracted extinctions, even prior to the stratigraphic cosmogenic signature; widespread but non-uniform dispersal of the meteoritic component; absence of a crater of sufficient size; and evidence for massive intensive fires. Various hypotheses provide reasonable mechanisms for mass mortalities: global cooling by continental impact sites; global warming by oceanic impact sites; contrasting effects of asteroidal, cometary, and even multiple impacts; and stress on an already fragile global environment. Yet not every known large impact is associated with a major biologic catastrophe. An alternative is expanded: the consequences of an oblique impact. The most probable angle of impact is 45 deg with the probability for an impact at smaller angles decreasing: A vertical impact is as rare as a tangential impact with a 5 deg impact angle or less occurring only 8 percent of the time. Consequently a low-angle impact is a rare but probable event. Laboratory experiments at the NASA-Ames Vertical Gun Range reveal important information about cratering efficiency, impact vaporization, projectile dispersal, and phenomenology, thereby providing perspective for possible consequences of such an impact on both the Earth and Moon. Oblique impacts are rare but certain events through geologic time: A 5 deg impact by a 2 km-diameter impactor on the Earth would occur only once in about 18 my with a 10 km-diameter once in about 450 my. Major life extinctions beginning prior to the stratigraphic cosmogenic signature or protracted extinctions seemingly too long after the proposed event may not be evidence against an impact as a cause but evidence for a more complex but probable sequence of events.

  9. Water level dynamics in wetlands and nesting success of Black Terns in Maine

    USGS Publications Warehouse

    Gilbert, A.T.; Servello, F.A.

    2005-01-01

    The Black Tern (Chlidonias niger) nests in freshwater wetlands that are prone to water level fluctuations, and nest losses to flooding are common. We examined temporal patterns in water levels at six sites with Black Tern colonies in Maine and determined probabilities of flood events and associated nest loss at Douglas Pond, the location of the largest breeding colony. Daily precipitation data from weather stations and water flow data from a flow gauge below Douglas Pond were obtained for 1960-1999. Information on nest losses from three floods at Douglas Pond in 1997-1999 was used to characterize small (6% nest loss), medium (56% nest loss) and large (94% nest loss) flood events, and we calculated probabilities of these three levels of flooding occurring at Douglas Pond using historic water level data. Water levels generally decreased gradually during the nesting season at colony sites, except at Douglas Pond where water levels fluctuated substantially in response to rain events. Annual probabilities of small, medium, and large flood events were 68%, 35%, and 13% for nests initiated during 23 May-12 July, with similar probabilities for early (23 May-12 June) and late (13 June-12 July) periods. An index of potential nest loss indicated that medium floods at Douglas Pond had the greatest potential effect on nest success because they occurred relatively frequently and inundated large proportions of nests. Nest losses at other colonies were estimated to be approximately 30% of those at Douglas Pond. Nest losses to flooding appear to be common for the Black Tern in Maine and related to spring precipitation patterns, but ultimate effects on breeding productivity are uncertain.

  10. Implementing system simulation of C3 systems using autonomous objects

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1987-01-01

    The basis of all conflict recognition in simulation is a common frame of reference. Synchronous discrete-event simulation relies on the fixed points in time as the basic frame of reference. Asynchronous discrete-event simulation relies on fixed-points in the model space as the basic frame of reference. Neither approach provides sufficient support for autonomous objects. The use of a spatial template as a frame of reference is proposed to address these insufficiencies. The concept of a spatial template is defined and an implementation approach offered. Discussed are the uses of this approach to analyze the integration of sensor data associated with Command, Control, and Communication systems.

  11. Forecasting in Complex Systems

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2014-12-01

    Complex nonlinear systems are typically characterized by many degrees of freedom, as well as interactions between the elements. Interesting examples can be found in the areas of earthquakes and finance. In these two systems, fat tails play an important role in the statistical dynamics. For earthquake systems, the Gutenberg-Richter magnitude-frequency relation is applicable, whereas daily returns for securities in the financial markets are known to be characterized by leptokurtic statistics in which the tails are power law. Very large fluctuations are present in both systems. In earthquake systems, one has the example of great earthquakes such as the M9.1, March 11, 2011 Tohoku event. In financial systems, one has the example of the market crash of October 19, 1987. Both were largely unexpected events that severely impacted the earth and financial systems systemically. Other examples include the M9.3 Andaman earthquake of December 26, 2004, and the Great Recession, which began with the fall of the Lehman Brothers investment bank on September 15, 2008. Forecasting the occurrence of these damaging events has great societal importance. In recent years, national funding agencies in a variety of countries have emphasized the importance of societal relevance in research, and in particular, the goal of improved forecasting technology. Previous work has shown that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model. These metastable systems are characterized by fat tail statistics near the classical spinodal. Correlations in these systems can grow and recede, but do not imply causation, a common source of misunderstanding. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In this talk, we describe the basic phenomenology of these systems and emphasize their similarities and differences. We also consider the problem of forecast validation and verification. In both of these systems, we show that small event counts (the natural time domain) are an important component of a forecast system.

  12. Rupture preparation process controlled by surface roughness on meter-scale laboratory fault

    NASA Astrophysics Data System (ADS)

    Yamashita, Futoshi; Fukuyama, Eiichi; Xu, Shiqing; Mizoguchi, Kazuo; Kawakata, Hironori; Takizawa, Shigeru

    2018-05-01

    We investigate the effect of fault surface roughness on rupture preparation characteristics using meter-scale metagabbro specimens. We repeatedly conducted the experiments with the same pair of rock specimens to make the fault surface rough. We obtained three experimental results under the same experimental conditions (6.7 MPa of normal stress and 0.01 mm/s of loading rate) but at different roughness conditions (smooth, moderately roughened, and heavily roughened). During each experiment, we observed many stick-slip events preceded by precursory slow slip. We investigated when and where slow slip initiated by using the strain gauge data processed by the Kalman filter algorithm. The observed rupture preparation processes on the smooth fault (i.e. the first experiment among the three) showed high repeatability of the spatiotemporal distributions of slow slip initiation. Local stress measurements revealed that slow slip initiated around the region where the ratio of shear to normal stress (τ/σ) was the highest as expected from finite element method (FEM) modeling. However, the exact location of slow slip initiation was where τ/σ became locally minimum, probably due to the frictional heterogeneity. In the experiment on the moderately roughened fault, some irregular events were observed, though the basic characteristics of other regular events were similar to those on the smooth fault. Local stress data revealed that the spatiotemporal characteristics of slow slip initiation and the resulting τ/σ drop for irregular events were different from those for regular ones even under similar stress conditions. On the heavily roughened fault, the location of slow slip initiation was not consistent with τ/σ anymore because of the highly heterogeneous static friction on the fault, which also decreased the repeatability of spatiotemporal distributions of slow slip initiation. These results suggest that fault surface roughness strongly controls the rupture preparation process, and generally increases its complexity with the degree of roughness.
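    As a rough illustration of the kind of filtering mentioned above, the sketch below applies a one-dimensional random-walk Kalman filter to a noisy strain-like signal before picking slow-slip onsets; the state model, noise variances, and synthetic signal are assumptions for the sketch, not the authors' implementation.

    # Sketch: 1-D random-walk Kalman filter for smoothing a noisy strain signal
    # before picking slow-slip onset times (illustrative noise levels).
    import numpy as np

    def kalman_smooth(z, process_var=1e-6, meas_var=1e-3):
        x, p = z[0], 1.0                  # state estimate and its variance
        out = np.empty_like(z, dtype=float)
        for i, zi in enumerate(z):
            p += process_var              # predict: random-walk state model
            k = p / (p + meas_var)        # Kalman gain
            x += k * (zi - x)             # update with the new strain sample
            p *= (1.0 - k)
            out[i] = x
        return out

    strain = np.cumsum(np.random.randn(2000)) * 1e-4 + np.random.randn(2000) * 0.03
    smoothed = kalman_smooth(strain)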

  13. Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

    PubMed Central

    Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.

    2015-01-01

    The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
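    A minimal numerical sketch of the central claim: Hebbian potentiation with pre-synaptic normalization drives the outgoing weights of each state's neuron toward the forward transition probabilities of the input sequence. The transition matrix, learning rate, and normalization rule below are illustrative choices, not the paper's exact model.

    # Sketch: Hebbian learning with pre-synaptic normalization converging toward
    # the forward transition probabilities of a Markov sequence (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    P = np.array([[0.1, 0.6, 0.3],        # assumed forward transition matrix P[i, j] = P(j | i)
                  [0.5, 0.2, 0.3],
                  [0.3, 0.3, 0.4]])
    n = P.shape[0]

    # Generate a state sequence from the Markov chain.
    seq = [0]
    for _ in range(100_000):
        seq.append(rng.choice(n, p=P[seq[-1]]))

    W = np.full((n, n), 1.0 / n)          # W[i, j]: synapse from neuron i to neuron j
    eta = 0.01
    for s, s_next in zip(seq[:-1], seq[1:]):
        W[s, s_next] += eta               # Hebbian potentiation of the active pre-post pair
        W[s] /= W[s].sum()                # pre-synaptic competition: normalize outgoing weights

    print(np.round(W, 2))                 # approaches the conditional forward probabilities P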

  14. Determining the Probability that a Small Event in Brazil (magnitude 3.5 to 4.5 mb) will be Followed by a Larger Event

    NASA Astrophysics Data System (ADS)

    Assumpcao, M.

    2013-05-01

    A typical earthquake story in Brazil: A swarm of small earthquakes starts to occur near a small town, reaching magnitude 3.5, causing some alarm but no damage. The frightened population, not used to feeling earthquakes, calls the seismology experts, who set up a local network to study the seismicity. To the usual and inevitable question "Are we going to have a larger earthquake?", the usual and standard answer is given: "It is not possible to predict earthquakes; larger earthquakes are possible". Fearing unnecessary panic, seismologists often add that "however, large earthquakes are not very likely". This vague answer has proven quite inadequate. "Not very likely" is interpreted by the population and authorities as "not going to happen, and there is no need to do anything". Before L'Aquila 2009, one case of magnitude 3.8 in Eastern Brazil was followed seven months later by a magnitude 4.9 event causing serious damage to poorly built houses. One child died and the affected population felt deceived by the seismologists. In order to provide better answers than just a vague "not likely", we examined the Brazilian catalog of earthquakes for all cases of moderate magnitude (3.4 mb or larger) that were followed, up to one year later, by a larger event. We found that the chance of an event with magnitude 3.4 or larger being the foreshock of a larger event is roughly 1/6. The probability of an event being a foreshock varies with magnitude, from about 20% for a 3.5 mb to about 5% for a 4.5 mb. Also, given that an event in the range 3.4 to 4.3 is a foreshock, the probability that the mainshock will be 4.7 or larger is 1/6. The probability for a larger event to occur decreases with time after the occurrence of the possible foreshock, with a time constant of ~70 days. Giving the population and civil defense a more quantitative answer (such as "the chance of a larger event is like rolling a six on a die") may help the decision to reinforce poor houses or even evacuate people from very vulnerable houses in the epicentral area.
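    Under the numbers quoted in the abstract (an overall chance of roughly 1/6 within one year and a ~70-day decay constant), the conditional probability of a larger event within t days can be illustrated as below; the exponential form and the normalization are assumptions made for the sketch.

    # Sketch: probability that a larger event follows within t days of a possible
    # foreshock, assuming an exponential decay of the hazard (time constant ~70 d)
    # normalized so the one-year total is ~1/6, as quoted in the abstract.
    import math

    P_TOTAL = 1.0 / 6.0      # chance the event is a foreshock within one year
    TAU = 70.0               # decay time constant in days

    def prob_larger_event_within(t_days: float) -> float:
        norm = 1.0 - math.exp(-365.0 / TAU)
        return P_TOTAL * (1.0 - math.exp(-t_days / TAU)) / norm

    for t in (7, 30, 90, 365):
        print(f"within {t:>3d} days: {prob_larger_event_within(t):.3f}")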

  15. Combining conversation analysis and event sequencing to study health communication.

    PubMed

    Pecanac, Kristen E

    2018-06-01

    Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis then was used to determine the transitional probability (probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
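    A minimal sketch of the transitional-probability step: count how often each type of clinician introduction is followed by surrogate alignment or resistance, and normalize the counts. The category labels and coded pairs below are hypothetical examples, not the study's coding scheme.

    # Sketch: transitional probabilities P(response | introduction type) from a
    # coded sequence of clinician introductions and surrogate responses
    # (hypothetical category labels).
    from collections import Counter, defaultdict

    coded_pairs = [                      # (clinician introduction, surrogate response)
        ("proposal", "alignment"), ("proposal", "resistance"),
        ("information_only", "alignment"), ("proposal", "alignment"),
        ("information_only", "resistance"), ("proposal", "alignment"),
    ]

    counts = defaultdict(Counter)
    for intro, response in coded_pairs:
        counts[intro][response] += 1

    for intro, resp_counts in counts.items():
        total = sum(resp_counts.values())
        for resp, c in resp_counts.items():
            print(f"P({resp} | {intro}) = {c / total:.2f}")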

  16. The Forecast Interpretation Tool—a Monte Carlo technique for blending climatic distributions with probabilistic forecasts

    USGS Publications Warehouse

    Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon

    2011-01-01

    Probabilistic forecasts are produced by a variety of outlets to help predict rainfall, and other meteorological events, for periods of 1 month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g. being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatologic statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is used with the consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed and discussed in comparison with the first, simulation-based technique.
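    The Monte Carlo blending idea can be sketched as follows: draw terciles with the forecast probabilities, draw a rainfall value within each tercile from the climatological distribution, and refit the distribution to the blended sample. The gamma parameters and forecast probabilities below are illustrative assumptions, not the tool's actual values.

    # Sketch: blend a climatological gamma distribution with tercile forecast
    # probabilities via Monte Carlo resampling (illustrative parameters).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    shape, scale = 4.0, 50.0                          # assumed climatological gamma fit (mm)
    clim = stats.gamma(a=shape, scale=scale)

    forecast = {"below": 0.20, "normal": 0.35, "above": 0.45}   # hypothetical tercile forecast
    bounds = {"below": (0.0, 1/3), "normal": (1/3, 2/3), "above": (2/3, 1.0)}

    n = 100_000
    cats = rng.choice(list(forecast), size=n, p=list(forecast.values()))
    lo = np.array([bounds[c][0] for c in cats])
    hi = np.array([bounds[c][1] for c in cats])
    u = rng.uniform(lo, hi)                           # uniform within the chosen tercile
    samples = clim.ppf(u)                             # map back through the climatological CDF

    new_shape, _, new_scale = stats.gamma.fit(samples, floc=0)  # blended parameters
    print(f"blended gamma: shape={new_shape:.2f}, scale={new_scale:.1f}")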

  17. Uncertainty and denial: a resource-rational model of the value of information.

    PubMed

    Pierson, Emma; Goodman, Noah

    2014-01-01

    Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them.

  18. On the properties of stochastic intermittency in rainfall processes.

    PubMed

    Molini, A; La Barbera, P; Lanza, L G

    2002-01-01

    In this work we propose a mixed approach to the modelling of rainfall events, based on the analysis of geometrical and statistical properties of rain intermittency in time, combined with the predictive power derived from the analysis of the distribution of no-rain periods and from the binary decomposition of the rain signal. Some recent hypotheses on the nature of rain intermittency are also reviewed. In particular, the internal intermittent structure of a high resolution pluviometric time series covering one decade and recorded at the tipping bucket station of the University of Genova is analysed by separating the internal intermittency of rainfall events from the inter-arrival process through a simple geometrical filtering procedure. In this way it is possible to associate no-rain intervals with a probability distribution both by virtue of their position within the event and of their percentage. From this analysis, an invariant probability distribution for the no-rain periods within the events is obtained at different aggregation levels, and its satisfactory agreement with a typical extreme value distribution is shown.
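    A small sketch of the geometrical filtering step: split the dry runs of a binary rain series into within-event dry spells and inter-event periods using a minimum inter-event time, then form the empirical survival function of the within-event dry spells. The synthetic series and the 6-step threshold are assumptions for illustration only.

    # Sketch: separate within-event dry spells from inter-event dry spells in a
    # binary rain series, using a minimum inter-event time (assumed 6 steps here).
    import itertools
    import numpy as np

    rng = np.random.default_rng(2)
    rain = (rng.random(5000) < 0.15).astype(int)   # synthetic binary series, 1 = rain tip

    MIN_INTER_EVENT = 6                            # dry runs longer than this end an event

    dry_runs = [len(list(g)) for k, g in itertools.groupby(rain) if k == 0]
    within_event = [d for d in dry_runs if d <= MIN_INTER_EVENT]
    inter_arrival = [d for d in dry_runs if d > MIN_INTER_EVENT]

    # Empirical survival function of within-event dry spell lengths.
    lengths, counts = np.unique(within_event, return_counts=True)
    survival = 1.0 - np.cumsum(counts) / counts.sum()
    print(dict(zip(lengths.tolist(), np.round(survival, 3).tolist())))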

  19. Uncertainty and Denial: A Resource-Rational Model of the Value of Information

    PubMed Central

    Pierson, Emma; Goodman, Noah

    2014-01-01

    Classical decision theory predicts that people should be indifferent to information that is not useful for making decisions, but this model often fails to describe human behavior. Here we investigate one such scenario, where people desire information about whether an event (the gain/loss of money) will occur even though there is no obvious decision to be made on the basis of this information. We find a curious dual trend: if information is costless, as the probability of the event increases people want the information more; if information is not costless, people's desire for the information peaks at an intermediate probability. People also want information more as the importance of the event increases, and less as the cost of the information increases. We propose a model that explains these results, based on the assumption that people have limited cognitive resources and obtain information about which events will occur so they can determine whether to expend effort planning for them. PMID:25426631

  20. The Coincident Coherence of Extreme Doppler Velocity Events with p-mode Patches in the Solar Photosphere.

    NASA Astrophysics Data System (ADS)

    McClure, Rachel Lee

    2018-06-01

    Observations of the solar photosphere show many spatially compact Doppler velocity events with short life spans and extreme values. In the IMaX spectropolarimetric inversion data of the first flight of the SUNRISE balloon in 2009, these striking flashes in the intergranule lanes and complementary outstanding values in the centers of granules have line-of-sight Doppler velocity values in excess of 4 sigma from the mean. We conclude that values outside 4 sigma result from the superposition of the granulation flows and the p-modes. To determine how granulation and p-modes contribute to these outstanding Doppler events, I separate the two components using the Fast Fourier Transform. I produce the power spectrum of the spatial wave frequencies and their corresponding frequency in time for each image, and create a k-omega filter to separate the two components. Using the filtered data, I test the hypothesis that extreme events occur because of strict superposition between the p-mode Doppler velocities and the granular velocities. I compare event counts from the observational data to those produced by random superposition of the two flow components and find that the observational event counts are consistent with the model event counts in the limit of small number statistics. Poisson count probabilities of the observed event numbers are consistent with the expected model count probability distributions.
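    A rough sketch of a k-omega filter of the kind described: Fourier-transform the Doppler cube in time and space, keep the power above an assumed horizontal phase-speed cutoff as the p-mode component, and take the remainder as granulation. The cadence, pixel scale, cutoff value, and random cube are hypothetical stand-ins for the IMaX data.

    # Sketch: k-omega filtering of a Doppler-velocity cube v(t, y, x) to separate
    # p-mode (high phase speed) from granulation (low phase speed) components.
    import numpy as np

    nt, ny, nx = 128, 64, 64
    dt, dx = 33.0, 40e3                       # s per frame, m per pixel (assumed)
    cube = np.random.randn(nt, ny, nx)        # stand-in for the observed Doppler cube

    V = np.fft.fftn(cube)
    omega = 2 * np.pi * np.fft.fftfreq(nt, d=dt)[:, None, None]
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)[None, :, None]
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)[None, None, :]
    kh = np.sqrt(kx**2 + ky**2)               # horizontal wavenumber

    CUTOFF = 7e3                              # phase-speed cutoff in m/s (assumed)
    pmode_mask = np.abs(omega) > CUTOFF * kh  # |omega|/k_h above cutoff -> p-modes

    pmodes = np.fft.ifftn(V * pmode_mask).real
    granulation = cube - pmodes
    print(pmodes.var(), granulation.var())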

  1. Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities

    NASA Astrophysics Data System (ADS)

    Abadie, Luis Maria; Galarraga, Ibon; Sainz de Murieta, Elisa

    2017-01-01

    A quantification of present and future mean annual losses due to extreme coastal events can be crucial for adequate decision making on adaptation to climate change in coastal areas around the globe. However, this approach is limited when uncertainty needs to be accounted for. In this paper, we assess coastal flood risk from sea-level rise and extreme events in 120 major cities around the world using an alternative stochastic approach that accounts for uncertainty. Probability distributions of future relative (local) sea-level rise have been used for each city, under three IPCC emission scenarios, RCP 2.6, 4.5 and 8.5. The approach allows a continuous stochastic function to be built to assess the yearly evolution of damages from 2030 to 2100. Additionally, we present two risk measures that put low-probability, high-damage events in the spotlight: the Value at Risk (VaR) and the Expected Shortfall (ES), which enable the damages to be estimated when a certain risk level is exceeded. This level of acceptable risk can be defined involving different stakeholders to guide progressive adaptation strategies. The method presented here is new in the field of economics of adaptation and offers a much broader picture of the challenges related to dealing with climate impacts. Furthermore, it can be applied not only to assess adaptation needs but also to put adaptation into a timeframe in each city.
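    The two risk measures can be illustrated with an empirical calculation on a synthetic sample of annual damages; the lognormal loss draws below are a stand-in for the paper's city-level damage distributions.

    # Sketch: empirical Value at Risk and Expected Shortfall of annual coastal
    # flood damages at a given confidence level (synthetic damage sample).
    import numpy as np

    rng = np.random.default_rng(3)
    damages = rng.lognormal(mean=3.0, sigma=1.2, size=100_000)   # stand-in damage draws

    def var_es(losses, alpha=0.99):
        var = np.quantile(losses, alpha)      # Value at Risk at level alpha
        es = losses[losses >= var].mean()     # Expected Shortfall: mean loss beyond VaR
        return var, es

    var99, es99 = var_es(damages)
    print(f"VaR(99%) = {var99:.1f}, ES(99%) = {es99:.1f}")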

  2. Precise measurement of the top-quark mass in the lepton+jets topology at CDF II.

    PubMed

    Aaltonen, T; Abulencia, A; Adelman, J; Affolder, T; Akimoto, T; Albrow, M G; Amerio, S; Amidei, D; Anastassov, A; Anikeev, K; Annovi, A; Antos, J; Aoki, M; Apollinari, G; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzi-Bacchetta, P; Azzurri, P; Bacchetta, N; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Baroiant, S; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Behari, S; Bellettini, G; Bellinger, J; Belloni, A; Benjamin, D; Beretvas, A; Beringer, J; Berry, T; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bolshov, A; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carrillo, S; Carlsmith, D; Carosi, R; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, I; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Cilijak, M; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Coca, M; Compostella, G; Convery, M E; Conway, J; Cooper, B; Copic, K; Cordelli, M; Cortiana, G; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; DaRonco, S; Datta, M; D'Auria, S; Davies, T; Dagenhart, D; de Barbaro, P; De Cecco, S; Deisher, A; De Lentdecker, G; De Lorenzo, G; Dell'Orso, M; Delli Paoli, F; Demortier, L; Deng, J; Deninno, M; De Pedis, D; Derwent, P F; Di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Dörr, C; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, I; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Field, R; Flanagan, G; Forrest, R; Forrester, S; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garcia, J E; Garberson, F; Garfinkel, A F; Gay, C; Gerberich, H; Gerdes, D; Giagu, S; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Goldstein, J; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Group, R C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Hamilton, A; Han, B-Y; Han, J Y; Handler, R; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hauser, J; Hays, C; Heck, M; Heijboer, A; Heinemann, B; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Holloway, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; Iyutin, B; James, E; Jang, D; Jayatilaka, B; Jeans, D; Jeon, E J; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Karchin, P E; Kato, Y; Kemp, Y; Kephart, R; Kerzel, U; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Klute, M; Knuteson, B; Ko, B R; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kraan, A C; Kraus, J; Kreps, M; Kroll, J; Krumnack, N; Kruse, M; 
Krutelyov, V; Kubo, T; Kuhlmann, S E; Kuhr, T; Kulkarni, N P; Kusakabe, Y; Kwang, S; Laasanen, A T; Lai, S; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; LeCompte, T; Lee, J; Lee, J; Lee, Y J; Lee, S W; Lefèvre, R; Leonardo, N; Leone, S; Levy, S; Lewis, J D; Lin, C; Lin, C S; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lu, R-S; Lucchesi, D; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Lytken, E; Mack, P; MacQueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis, A; Margaroli, F; Marginean, R; Marino, C; Marino, C P; Martin, A; Martin, M; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Matsunaga, H; Mattson, M E; Mazini, R; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzemer, S; Menzione, A; Merkel, P; Mesropian, C; Messina, A; Miao, T; Miladinovic, N; Miles, J; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyamoto, A; Moed, S; Moggi, N; Mohr, B; Moon, C S; Moore, R; Morello, M; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Neu, C; Neubauer, M S; Nielsen, J; Nodulman, L; Norniella, O; Nurse, E; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Oldeman, R; Orava, R; Osterberg, K; Pagliarone, C; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Piedra, J; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Portell, X; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Reisert, B; Rekovic, V; Renton, P; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Saarikko, H; Safonov, A; Sakumoto, W K; Salamanna, G; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savard, P; Savoy-Navarro, A; Scheidle, T; Schlabach, P; Schmidt, E E; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scott, A L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sfyrla, A; Shalhout, S Z; Shapiro, M D; Shears, T; Shepard, P F; Sherman, D; Shimojima, M; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soderberg, M; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spinella, F; Spreitzer, T; Squillacioti, P; Stanitzki, M; Staveris-Polykalas, A; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Stuart, D; Suh, J S; Sukhanov, A; Sun, H; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thomson, E; Tipton, P; Tiwari, V; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Tourneur, S; Trischuk, W; Tsuno, S; Tu, Y; Turini, N; Ukegawa, F; Uozumi, S; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vazquez, F; Velev, G; Vellidis, C; Veramendi, G; Veszpremi, V; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Vollrath, I; Volobouev, I; Volpi, G; Würthwein, F; Wagner, P; Wagner, R G; Wagner, R L; Wagner, J; Wagner, W; Wallny, 
R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Wynne, S M; Yagil, A; Yamamoto, K; Yamaoka, J; Yamashita, T; Yang, C; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zaw, I; Zhang, X; Zhou, J; Zucchelli, S

    2007-11-02

    We present a measurement of the mass of the top quark from proton-antiproton collisions recorded at the CDF experiment in Run II of the Fermilab Tevatron. We analyze events from the single lepton plus jets final state (tt̄ → W⁺bW⁻b̄ → ℓνb qq̄′b̄). The top-quark mass is extracted using a direct calculation of the probability density that each event corresponds to the tt̄ final state. The probability is a function of both the mass of the top quark and the energy scale of the calorimeter jets, which is constrained in situ by the hadronic W boson mass. Using 167 events observed in 955 pb⁻¹ of integrated luminosity, we achieve the single most precise measurement of the top-quark mass, 170.8 ± 2.2 (stat.) ± 1.4 (syst.) GeV/c².

  3. The calculation of aircraft collision probabilities

    DOT National Transportation Integrated Search

    1971-10-01

    The basic limitation of air traffic compression, from the safety point of view, is the increased risk of collision due to reduced separations. In order to evolve new procedures, and eventually a fully automatic system, it is desirable to have a mea...

  4. 75 FR 9592 - FPL Energy Maine Hydro, LLC; Notice of Intent To Prepare an Environmental Document and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-03

    ... a probable maximum flood, and modification of the existing earthen embankments for improved slope stability and safety. The proposed remedial measures would not alter the basic footprint of the existing dam...

  5. Probability Learning: Changes in Behavior across Time and Development

    ERIC Educational Resources Information Center

    Plate, Rista C.; Fulvio, Jacqueline M.; Shutts, Kristin; Green, C. Shawn; Pollak, Seth D.

    2018-01-01

    Individuals track probabilities, such as associations between events in their environments, but less is known about the degree to which experience--within a learning session and over development--influences people's use of incoming probabilistic information to guide behavior in real time. In two experiments, children (4-11 years) and adults…

  6. Exaggerated Risk: Prospect Theory and Probability Weighting in Risky Choice

    ERIC Educational Resources Information Center

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-01-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and…

  7. Making Heads or Tails of Probability: An Experiment with Random Generators

    ERIC Educational Resources Information Center

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  8. Comparative Analytical Study of Evoked and Event Related Potentials as Correlates of Cognitive Processes

    DTIC Science & Technology

    1990-07-16

    ...definition of the stimulus regime and that much of this response occurs at lower brain levels, probably precognitive, suggesting the need to distinguish the components in humans that depend on cognition from those that do...

  9. Hazard rating forest stands for gypsy moth

    Treesearch

    Hicks, Ray R., Jr.

    1991-01-01

    A gypsy moth hazard exists when forest conditions prevail that are conducive to extensive damage from gypsy moth. Combining forest hazard rating with information on insect population trends provides the basis for predicting the probability (risk) of an event occurring. The likelihood of defoliation is termed susceptibility and the probability of damage (mortality,...

  10. Search Path Evaluation Incorporating Object Placement Structure

    DTIC Science & Technology

    2007-12-20

    the probability of the set complement of this event, Pr(E_d), given by Equation (83). Equation (83) provides the probability that if there is an... Networks," to appear in IEEE Transactions on Aerospace and Electronic Systems. 3. B. G. Koopman, Search and Screening: General Principles and Historical

  11. Effect of Donor and Recipient Factors on Corneal Graft Rejection

    PubMed Central

    Stulting, R. Doyle; Sugar, Alan; Beck, Roy; Belin, Michael; Dontchev, Mariya; Feder, Robert S.; Gal, Robin L.; Holland, Edward J.; Kollman, Craig; Mannis, Mark J.; Price, Francis; Stark, Walter; Verdier, David D.

    2014-01-01

    Purpose To assess the relationship between donor and recipient factors and corneal allograft rejection in eyes that underwent penetrating keratoplasty (PK) in the Cornea Donor Study. Methods 1090 subjects undergoing corneal transplantation for a moderate risk condition (principally Fuchs’ dystrophy or pseudophakic corneal edema) were followed for up to 5 years. Associations of baseline recipient and donor factors with the occurrence of a probable or definite rejection event were assessed in univariate and multivariate proportional hazards models. Results Eyes with pseudophakic or aphakic corneal edema (N=369) were more likely to experience a rejection event than eyes with Fuchs’ dystrophy (N=676) (34% ± 6% versus 22% ± 4%; hazard ratio = 1.56; 95% confidence interval 1.21 to 2.03). Among eyes with Fuchs’ dystrophy, a higher probability of a rejection event was observed in phakic post-transplant eyes compared with eyes that underwent cataract extraction with or without intraocular lens implantation during PK (29% vs. 19%; hazard ratio = 0.54; 95% confidence interval 0.36 to 0.82). Female recipients had a higher probability of a rejection event than males (29% vs. 21%; hazard ratio = 1.42; 95% confidence interval 1.08 to 1.87), after controlling for the effect of preoperative diagnosis and lens status. Donor age and donor-recipient ABO compatibility were not associated with rejection. Conclusions There was a substantially higher graft rejection rate in eyes with pseudophakic or aphakic corneal edema compared with eyes with Fuchs’ dystrophy. Female recipients were more likely to have a rejection event than males. Graft rejection was not associated with donor age. PMID:22488114

  12. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand if this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the so-called ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the time of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while RIP in the antapex direction is slightly larger than average. We present preliminary maps of RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  13. A probability index for surface zonda wind occurrence at Mendoza city through vertical sounding principal components analysis

    NASA Astrophysics Data System (ADS)

    Otero, Federico; Norte, Federico; Araneo, Diego

    2018-01-01

    The aim of this work is to obtain an index for predicting the probability of occurrence of a surface zonda event from sounding data at Mendoza city, Argentina. To accomplish this goal, surface zonda wind events were first identified with an objective classification method (OCM) using only surface station values. Once the dates and onset times of the events were obtained, the closest prior sounding for each event was used in a principal component analysis (PCA) to identify the leading patterns of the vertical structure of the atmosphere prior to a zonda wind event. These components were used to construct the index model. For the PCA, an entry matrix of temperature (T) and dew point temperature (Td) anomalies for the standard levels between 850 and 300 hPa was built. The analysis yielded six significant components explaining 94% of the variance, and the leading patterns of weather conditions favorable to the development of the phenomenon were obtained. A zonda/non-zonda indicator c can be estimated by multiple logistic regression on the PCA component loadings, yielding a zonda probability index ĉ that is calculable from T and Td profiles and depends on the climatological features of the region. The index showed 74.7% efficiency. Adding surface values of T and Td from the Mendoza Aero station increased the index efficiency to 87.8%. The results revealed four significantly correlated PCs, with a major improvement in differentiating zonda cases and a reduction of the uncertainty interval.
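    A minimal sketch of the index construction: PCA on a matrix of T and Td anomaly profiles followed by logistic regression on the component scores. The synthetic data, array shapes, and the choice of six retained components follow the abstract only loosely and are assumptions for illustration.

    # Sketch: PCA on sounding T/Td anomaly profiles followed by logistic regression
    # to obtain a zonda probability index (synthetic data; shapes assumed).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    X = rng.standard_normal((300, 14))     # anomalies of T and Td at 7 standard levels
    y = rng.integers(0, 2, size=300)       # 1 = surface zonda observed, 0 = not

    pca = PCA(n_components=6)              # six leading components, as in the abstract
    scores = pca.fit_transform(X)

    clf = LogisticRegression().fit(scores, y)
    zonda_prob = clf.predict_proba(pca.transform(X[:5]))[:, 1]   # index for new soundings
    print(np.round(zonda_prob, 2))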

  14. A method for modeling bias in a person's estimates of likelihoods of events

    NASA Technical Reports Server (NTRS)

    Nygren, Thomas E.; Morera, Osvaldo

    1988-01-01

    It is of practical importance in decision situations involving risk to train individuals to transform uncertainties into subjective probability estimates that are both accurate and unbiased. We have found that in decision situations involving risk, people often introduce subjective bias in their estimation of the likelihoods of events depending on whether the possible outcomes are perceived as being good or bad. Until now, however, the successful measurement of individual differences in the magnitude of such biases has not been attempted. In this paper we illustrate a modification of a procedure originally outlined by Davidson, Suppes, and Siegel (3) to allow for a quantitatively-based methodology for simultaneously estimating an individual's subjective utility and subjective probability functions. The procedure is now an interactive computer-based algorithm, DSS, that allows for the measurement of biases in probability estimation by obtaining independent measures of two subjective probability functions (S+ and S-) for winning (i.e., good outcomes) and for losing (i.e., bad outcomes) respectively for each individual, and for different experimental conditions within individuals. The algorithm and some recent empirical data are described.

  15. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0; it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.

  16. Cytochemical evaluation of the Guard procedure a regressive staining method for demonstrating chromosomal basic proteins. I. Effects of fixation, blocking reactions, selective extractions, and polyacid "differentiation".

    PubMed

    Cowden, R R; Rasch, E M; Curtis, S K

    1976-08-12

    Appropriately fixed preparations stained by a modification of the Guard (1959) reaction for "sex chromatin" display selective staining of interphase chromatin and mitotic or meiotic chromosomes. This is a regressive staining method which seems to depend on the selective displacement of an acidic dye from less basic structures, and retention of the dye at more basic sites. The results obtained with the reaction can be controlled by the length of time that the preparations are "differentiated" in solutions containing phosphomolybdic and phosphotungstic acids (polyacids). After three- or four-hour exposures to polyacid solutions, all chromatin is stained. However, with longer differentiation, "condensed" chromatin can be stained preferentially. Of a number of fixatives investigated, only 10% formalin, ethanol-acetic acid (3:1), and Bouin's solution proved useful. Others resulted in diminished specificity or a total loss of selectivity. The most intense staining was obtained after formalin fixation. Less intense dyebinding was observed after fixation in 3:1 - probably due to extraction of some histone fractions-and the least amount of dye was bound in Bouin's-fixed chromatin - probably due to blockage of arginine residues by picric acid. The reaction was not affected by enzymatic removal of nucleic acids or the extraction of lipids. It was diminished by treatment with trypsin or weak acetylation, and it was completely prevented by strong acetylation, deamination, or extraction of basic proteins with HCl. The results presented suggest that the modified Guard (1959) procedure selectively demonstrates basic nucleoproteins. Further, by the use of regressive differentiation in polyacid solutions, the retention of dye in more condensed chromatin can be favored.

  17. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
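    The core idea of resolving nondeterminism to maximize reach probability can be illustrated on a tiny hand-written model with value iteration; this is a conceptual sketch, not the Symbolic PathFinder implementation, and the states and choices are hypothetical.

    # Sketch: resolve nondeterminism to maximize the probability of reaching a
    # target state, via value iteration on a tiny hand-written model.
    # model[state][choice] = list of (probability, next_state)
    model = {
        "s0": {"a": [(0.5, "s1"), (0.5, "s2")], "b": [(1.0, "s2")]},
        "s1": {"a": [(0.7, "target"), (0.3, "fail")]},
        "s2": {"a": [(0.2, "target"), (0.8, "fail")]},
        "target": {}, "fail": {},
    }

    def max_reach_probability(model, target, iters=100):
        p = {s: 1.0 if s == target else 0.0 for s in model}
        for _ in range(iters):
            for s, choices in model.items():
                if s == target or not choices:
                    continue
                # the scheduler picks the choice that maximizes the reach probability
                p[s] = max(sum(pr * p[t] for pr, t in outs) for outs in choices.values())
        return p

    print(max_reach_probability(model, "target"))   # the scheduler picks "a" in s0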

  18. Postfragmentation density function for bacterial aggregates in laminar flow

    PubMed Central

    Byrne, Erin; Dzul, Steve; Solomon, Michael; Younger, John

    2014-01-01

    The postfragmentation probability density of daughter flocs is one of the least well-understood aspects of modeling flocculation. We use three-dimensional positional data of Klebsiella pneumoniae bacterial flocs in suspension and the knowledge of hydrodynamic properties of a laminar flow field to construct a probability density function of floc volumes after a fragmentation event. We provide computational results which predict that the primary fragmentation mechanism for large flocs is erosion. The postfragmentation probability density function has a strong dependence on the size of the original floc and indicates that most fragmentation events result in clumps of one to three bacteria eroding from the original floc. We also provide numerical evidence that exhaustive fragmentation yields a limiting density inconsistent with the log-normal density predicted in the literature, most likely due to the heterogeneous nature of K. pneumoniae flocs. To support our conclusions, artificial flocs were generated and display similar postfragmentation density and exhaustive fragmentation. PMID:21599205

  19. Risk and Risk Assessment in Environmental Education.

    ERIC Educational Resources Information Center

    Chiras, Daniel D.

    1982-01-01

    Risk assessment (the identification of hazards, the determination of the probability of a hazardous event occurring, and an estimation of the severity of such an event's occurrence) is suggested as a technique to be used to analyze current issues in environmental education in an objective manner. (PEB)

  20. USING THE HERMITE POLYNOMIALS IN RADIOLOGICAL MONITORING NETWORKS.

    PubMed

    Benito, G; Sáez, J C; Blázquez, J B; Quiñones, J

    2018-03-15

    The most interesting events in a Radiological Monitoring Network correspond to higher values of H*(10). The higher doses cause skewness in the probability density function (PDF) of the records, which is no longer Gaussian. Within this work, the probability of a dose more than 2 standard deviations above the mean is proposed as a surveillance measure for higher doses. This probability is estimated by using Hermite polynomials to reconstruct the PDF. The result is that the probability is ~6 ± 1%, much greater than the 2.5% corresponding to a Gaussian PDF, which may be of interest in the design of alarm levels for higher doses.
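    A small sketch of the Hermite-polynomial (Gram-Charlier) reconstruction: estimate the standardized skewness and excess kurtosis of the record, correct the Gaussian PDF with He3 and He4 terms, and integrate the tail beyond 2 standard deviations. The synthetic dose record is a stand-in for real H*(10) data.

    # Sketch: Gram-Charlier (Hermite polynomial) reconstruction of a skewed dose
    # PDF and the probability of exceeding the mean by 2 standard deviations.
    import numpy as np
    from numpy.polynomial.hermite_e import hermeval
    from scipy import integrate, stats

    rng = np.random.default_rng(5)
    doses = rng.gamma(shape=9.0, scale=0.012, size=20_000)    # skewed synthetic record

    z = (doses - doses.mean()) / doses.std()
    skew, exkurt = stats.skew(z), stats.kurtosis(z)           # standardized moments

    def gram_charlier_pdf(x):
        # phi(x) * [1 + (skew/6) He3(x) + (exkurt/24) He4(x)], probabilists' Hermite He_n
        correction = hermeval(x, [1, 0, 0, skew / 6, exkurt / 24])
        return stats.norm.pdf(x) * correction

    p_tail, _ = integrate.quad(gram_charlier_pdf, 2.0, 8.0)   # P(dose > mean + 2 sigma)
    print(f"reconstructed tail probability: {p_tail:.3f} (Gaussian: {1 - stats.norm.cdf(2):.3f})")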
