Sample records for numerical probability statements

  1. Stochastic Formal Correctness of Numerical Algorithms

    NASA Technical Reports Server (NTRS)

    Daumas, Marc; Lester, David; Martin-Dorel, Erik; Truffert, Annick

    2009-01-01

We provide a framework to bound the probability that accumulated errors in numerical algorithms never exceed a given threshold. Such algorithms are used, for example, in aircraft and nuclear power plants. This report contains simple formulas based on Levy's and Markov's inequalities, and it presents a formal theory of random variables with a special focus on producing concrete results. We selected four very common applications that fit our framework and cover the common practices of systems that evolve over a long time. For the first two applications we compute the number of bits that remain continuously significant, with a probability of failure around one in a billion, where worst-case analysis concludes that no significant bit remains. We use PVS, since such formal tools force the explicit statement of all hypotheses and prevent incorrect uses of theorems.
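The Markov-inequality style of bound described in this abstract can be sketched numerically; the error model below (a sum of uniform rounding errors, with illustrative parameters) is a hypothetical stand-in for the paper's formal PVS development:

```python
import random

# For a nonnegative random variable S, Markov's inequality gives
# P(S >= t) <= E[S] / t. Here S is a toy model of accumulated absolute
# rounding error: a sum of n uniform terms on [0, u], so E[S] = n * u / 2.

def markov_bound(n, u, t):
    """Upper bound on P(accumulated error >= t) via Markov's inequality."""
    expected = n * u / 2.0
    return min(1.0, expected / t)

def empirical_tail(n, u, t, trials=20000, seed=0):
    """Monte Carlo estimate of the same tail probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.uniform(0.0, u) for _ in range(n))
        if s >= t:
            hits += 1
    return hits / trials

bound = markov_bound(n=100, u=2.0**-24, t=1e-5)
freq = empirical_tail(n=100, u=2.0**-24, t=1e-5)
assert freq <= bound  # the bound must dominate the observed frequency
```

The formal development in the paper goes further (e.g. Levy's inequality and machine-checked hypotheses); this sketch only shows the shape of a Markov-type tail bound.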

  2. Large-deviation probabilities for correlated Gaussian processes and intermittent dynamical systems

    NASA Astrophysics Data System (ADS)

    Massah, Mozhdeh; Nicol, Matthew; Kantz, Holger

    2018-05-01

In its classical version, the theory of large deviations makes quantitative statements about the probability of outliers when estimating time averages, if time series data are independent and identically distributed. We study large-deviation probabilities (LDPs) for time averages in short- and long-range correlated Gaussian processes and show that long-range correlations lead to subexponential decay of LDPs. A particular deterministic intermittent map can, depending on a control parameter, also generate long-range correlated time series. We illustrate numerically, in agreement with the mathematical literature, that this type of intermittency leads to a power law decay of LDPs. The power law decay holds irrespective of whether the correlation time is finite or infinite, and hence irrespective of whether the central limit theorem applies or not.
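The classical i.i.d. large-deviation statement referenced above can be illustrated with the exact Gaussian tail; the code below (an illustrative sketch, not the paper's analysis) shows the decay rate of the time-average tail probability converging to the rate-function value eps**2 / 2, the exponential decay that long-range correlations break:

```python
import math

# For the time average of n i.i.d. N(0,1) variables, the mean is N(0, 1/n),
# so the exact tail probability is
#   P(mean > eps) = 0.5 * erfc(eps * sqrt(n/2)),
# and the large-deviation rate -(1/n) * log P converges to eps**2 / 2.

def tail_prob(n, eps):
    return 0.5 * math.erfc(eps * math.sqrt(n / 2.0))

def empirical_rate(n, eps):
    return -math.log(tail_prob(n, eps)) / n

eps = 0.5
for n in (100, 1000, 5000):
    print(n, empirical_rate(n, eps))  # approaches eps**2 / 2 = 0.125 from above
```

For long-range correlated or intermittent data, as studied in the paper, this per-sample rate would instead tend to zero, giving subexponential (here, power law) decay.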

  3. A 30-year history of earthquake crisis communication in California and lessons for the future

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2015-12-01

The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the 30 years since, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis itself. With experience, the time to create and release a statement became shorter than the 18 hours taken for the first public advisory (for the 1988 Lake Elsman earthquake), but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (the National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk reviews the advisories, the variations in wording, and the public response, and compares these with social science research on successful crisis communication to create recommendations for future advisories.

  4. Valx: A system for extracting and structuring numeric lab test comparison statements from text

    PubMed Central

    Hao, Tianyong; Liu, Hongfang; Weng, Chunhua

    2017-01-01

Objectives: To develop an automated method for extracting and structuring numeric lab test comparison statements from text and to evaluate the method using clinical trial eligibility criteria text. Methods: Leveraging semantic knowledge from the Unified Medical Language System (UMLS) and domain knowledge acquired from the Internet, Valx takes 7 steps to extract and normalize numeric lab test expressions: 1) text preprocessing, 2) numeric, unit, and comparison operator extraction, 3) variable identification using hybrid knowledge, 4) variable-numeric association, 5) context-based association filtering, 6) measurement unit normalization, and 7) heuristic rule-based comparison statement verification. Our reference standard was the consensus-based annotation among three raters of all comparison statements for two variables, i.e., HbA1c and glucose, identified from all Type 1 and Type 2 diabetes trials in ClinicalTrials.gov. Results: The precision, recall, and F-measure for structuring HbA1c comparison statements were 99.6%, 98.1%, and 98.8% for Type 1 diabetes trials and 98.8%, 96.9%, and 97.8% for Type 2 diabetes trials, respectively. The precision, recall, and F-measure for structuring glucose comparison statements were 97.3%, 94.8%, and 96.1% for Type 1 diabetes trials and 92.3%, 92.3%, and 92.3% for Type 2 diabetes trials, respectively. Conclusions: Valx is effective at extracting and structuring free-text lab test comparison statements in clinical trial summaries. Future studies are warranted to test its generalizability beyond eligibility criteria text. The open-source Valx enables its further evaluation and continued improvement by the collaborative scientific community. PMID:26940748

  5. Valx: A System for Extracting and Structuring Numeric Lab Test Comparison Statements from Text.

    PubMed

    Hao, Tianyong; Liu, Hongfang; Weng, Chunhua

    2016-05-17

To develop an automated method for extracting and structuring numeric lab test comparison statements from text and to evaluate the method using clinical trial eligibility criteria text. Leveraging semantic knowledge from the Unified Medical Language System (UMLS) and domain knowledge acquired from the Internet, Valx takes seven steps to extract and normalize numeric lab test expressions: 1) text preprocessing, 2) numeric, unit, and comparison operator extraction, 3) variable identification using hybrid knowledge, 4) variable-numeric association, 5) context-based association filtering, 6) measurement unit normalization, and 7) heuristic rule-based comparison statement verification. Our reference standard was the consensus-based annotation among three raters of all comparison statements for two variables, i.e., HbA1c and glucose, identified from all Type 1 and Type 2 diabetes trials in ClinicalTrials.gov. The precision, recall, and F-measure for structuring HbA1c comparison statements were 99.6%, 98.1%, and 98.8% for Type 1 diabetes trials and 98.8%, 96.9%, and 97.8% for Type 2 diabetes trials, respectively. The precision, recall, and F-measure for structuring glucose comparison statements were 97.3%, 94.8%, and 96.1% for Type 1 diabetes trials and 92.3%, 92.3%, and 92.3% for Type 2 diabetes trials, respectively. Valx is effective at extracting and structuring free-text lab test comparison statements in clinical trial summaries. Future studies are warranted to test its generalizability beyond eligibility criteria text. The open-source Valx enables its further evaluation and continued improvement by the collaborative scientific community.
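Steps 2 through 6 of such a pipeline can be sketched with a single regular expression. This is a simplified illustration, not the actual Valx implementation; the variable list, operator set, and units below are assumptions for the example:

```python
import re

# Pull out the variable name, comparison operator, numeric value, and unit
# from a free-text eligibility criterion.

PATTERN = re.compile(
    r"(?P<var>HbA1c|glucose)\s*"          # step 3: variable identification
    r"(?P<op><=|>=|<|>|=)\s*"             # step 2: comparison operator
    r"(?P<num>\d+(?:\.\d+)?)\s*"          # step 2: numeric value
    r"(?P<unit>%|mmol/L|mg/dL)?",         # step 6: unit (to be normalized)
    re.IGNORECASE,
)

def extract(text):
    """Return structured (variable, operator, value, unit) tuples."""
    results = []
    for m in PATTERN.finditer(text):
        unit = (m.group("unit") or "").strip()
        results.append((m.group("var"), m.group("op"),
                        float(m.group("num")), unit))
    return results

extract("Inclusion: HbA1c <= 7.5% and fasting glucose < 126 mg/dL")
# -> [('HbA1c', '<=', 7.5, '%'), ('glucose', '<', 126.0, 'mg/dL')]
```

The real system additionally handles context filtering, hybrid-knowledge variable identification, and heuristic verification, which a single regex cannot capture.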

  6. Analyzing diffuse scattering with supercomputers. Corrigendum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michels-Clark, Tara M.; Lynch, Vickie E.; Hoffmann, Christina M.

    2016-03-01

The study by Michels-Clark et al. (2013 [Michels-Clark, T. M., Lynch, V. E., Hoffmann, C. M., Hauser, J., Weber, T., Harrison, R. & Bürgi, H. B. (2013). J. Appl. Cryst. 46, 1616-1625.]) contains misleading errors which are corrected here. The numerical results reported in that paper and the conclusions given there are not affected and remain unchanged. The transition probabilities in Table 1 (rows 4, 5, 7, 8) and Fig. 2 (rows 1 and 2) of the original paper were different from those used in the numerical calculations. Corrected transition probabilities as used in the computations are given in Table 1 and Fig. 1 of this article. The Δ parameter in the stacking model expresses the preference for the fifth layer in a five-layer stack to be eclipsed with respect to the first layer. This statement corrects the original text on p. 1622, lines 4–7. In the original Fig. 2 the helicity of the layer stacks b L and b R in rows 3 and 4 had been given as opposite to those in rows 1, 2 and 5. Fig. 1 of this article shows rows 3 and 4 corrected to correspond to rows 1, 2 and 5.

  7. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    PubMed

    Morstyn, Ron

    2011-08-01

The aim of this study was to examine logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability" based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism" as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements which are currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability and represent dangerous "spin" because they necessarily exclude the individual, intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  8. EMC: Mission Statement

    Science.gov Websites

EMC: Mission Statement. The Mesoscale Modeling Branch works on advanced numerical techniques applied to mesoscale modeling problems, parameterization of mesoscale processes, and the use of new observing systems, and publishes its research results in various media.

  9. Bayesian statistics and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Koch, K. R.

    2018-03-01

The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable number of derivatives to be computed, and errors of linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
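The Monte Carlo error propagation described here can be sketched in a few lines; the polar-to-Cartesian transform and the distribution parameters below are illustrative choices, not taken from the text:

```python
import math
import random

# Draw samples of a random vector, push them through a nonlinear transform,
# and estimate the expectation and covariance matrix of the result,
# with no linearization and hence no derivatives.

def mc_propagate(transform, mean, std, n=50000, seed=1):
    rng = random.Random(seed)
    ys = []
    for _ in range(n):
        x = [rng.gauss(m, s) for m, s in zip(mean, std)]
        ys.append(transform(x))
    k = len(ys[0])
    ey = [sum(y[i] for y in ys) / n for i in range(k)]
    cov = [[sum((y[i] - ey[i]) * (y[j] - ey[j]) for y in ys) / (n - 1)
            for j in range(k)] for i in range(k)]
    return ey, cov

# Toy nonlinear transform: polar coordinates to Cartesian.
def polar_to_xy(v):
    r, theta = v
    return [r * math.cos(theta), r * math.sin(theta)]

ey, cov = mc_propagate(polar_to_xy, mean=[10.0, 0.0], std=[0.1, 0.01])
```

For this near-linear case the Monte Carlo covariance agrees closely with the linearized propagation; the advantage of the Monte Carlo estimate is that it remains valid when the transform is strongly nonlinear.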

  10. ERP Correlates of Verbal and Numerical Probabilities in Risky Choices: A Two-Stage Probability Processing View

    PubMed Central

    Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin

    2016-01-01

Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results were consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on the possible mechanisms underlying probability expression and may provide neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612

  11. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
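The prior-to-posterior computation that Bayes' rule provides can be illustrated numerically; the probabilities below are hypothetical, not examples from the paper:

```python
# A worked numeric instance of the prior-to-posterior update:
# P(H | E) = P(E | H) P(H) / [ P(E | H) P(H) + P(E | ~H) P(~H) ].

def bayes_posterior(prior, likelihood, false_positive_rate):
    """P(H | E) from P(H), P(E | H), and P(E | not H)."""
    p_e = likelihood * prior + false_positive_rate * (1.0 - prior)
    return likelihood * prior / p_e

# prior P(H) = 0.01, P(E|H) = 0.9, P(E|~H) = 0.05
post = bayes_posterior(0.01, 0.9, 0.05)
# post = 0.009 / 0.0585, about 0.154: the evidence raises a 1% prior to ~15%
```

The paper's truth-table derivation reaches the same formula by treating the compound statements propositionally; this sketch only evaluates the resulting arithmetic.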

  12. 40 CFR 156.10 - Labeling requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) Prominence and legibility. (i) All words, statements, graphic representations, designs or other information..., statements, designs, or graphic matter on the labeling) and expressed in such terms as to render it likely to... phrase as “when used as directed”; and (x) Non-numerical and/or comparative statements on the safety of...

  13. 40 CFR 156.10 - Labeling requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) Prominence and legibility. (i) All words, statements, graphic representations, designs or other information..., statements, designs, or graphic matter on the labeling) and expressed in such terms as to render it likely to... phrase as “when used as directed”; and (x) Non-numerical and/or comparative statements on the safety of...

  14. 40 CFR 156.10 - Labeling requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Prominence and legibility. (i) All words, statements, graphic representations, designs or other information..., statements, designs, or graphic matter on the labeling) and expressed in such terms as to render it likely to... phrase as “when used as directed”; and (x) Non-numerical and/or comparative statements on the safety of...

  15. 40 CFR 156.10 - Labeling requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Prominence and legibility. (i) All words, statements, graphic representations, designs or other information..., statements, designs, or graphic matter on the labeling) and expressed in such terms as to render it likely to... phrase as “when used as directed”; and (x) Non-numerical and/or comparative statements on the safety of...

  16. 40 CFR 156.10 - Labeling requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Prominence and legibility. (i) All words, statements, graphic representations, designs or other information..., statements, designs, or graphic matter on the labeling) and expressed in such terms as to render it likely to... phrase as “when used as directed”; and (x) Non-numerical and/or comparative statements on the safety of...

  17. Independent events in elementary probability theory

    NASA Astrophysics Data System (ADS)

    Csenki, Attila

    2011-07-01

In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is also accessible to users of probability theory and is believed to be novel.
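The statement under examination can be checked by direct enumeration for a small case. The sketch below assumes four fair, jointly independent events (an illustrative choice, not from the paper) and verifies the product rule for one A and B built from disjoint subsets:

```python
from itertools import product

# Sample space: 16 equally likely outcomes of four independent fair coins,
# with Ei = "coin i shows 1". Build A from {E1, E2} and B from {E3, E4}
# using union, intersection, and complementation, then check
# P(A and B) = P(A) * P(B) exactly.

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return len(event) / 16.0

outcomes = set(product([0, 1], repeat=4))
E = [{w for w in outcomes if w[i] == 1} for i in range(4)]

A = E[0] | E[1]                # union, built from E1, E2
B = E[2] & (outcomes - E[3])   # intersection with a complement, from E3, E4

assert prob(A & B) == prob(A) * prob(B)
```

Enumeration confirms the rule for this instance (P(A) = 3/4, P(B) = 1/4, P(A ∩ B) = 3/16); the paper's contribution is an elementary proof that it holds in general.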

  18. Reply to "Comment on 'Fractional quantum mechanics' and 'Fractional Schrödinger equation' ".

    PubMed

    Laskin, Nick

    2016-06-01

    The fractional uncertainty relation is a mathematical formulation of Heisenberg's uncertainty principle in the framework of fractional quantum mechanics. Two mistaken statements presented in the Comment have been revealed. The origin of each mistaken statement has been clarified and corrected statements have been made. A map between standard quantum mechanics and fractional quantum mechanics has been presented to emphasize the features of fractional quantum mechanics and to avoid misinterpretations of the fractional uncertainty relation. It has been shown that the fractional probability current equation is correct in the area of its applicability. Further studies have to be done to find meaningful quantum physics problems with involvement of the fractional probability current density vector and the extra term emerging in the framework of fractional quantum mechanics.

  19. On recent advances and future research directions for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Soliman, M. O.; Manhardt, P. D.

    1986-01-01

    This paper highlights some recent accomplishments regarding CFD numerical algorithm constructions for generation of discrete approximate solutions to classes of Reynolds-averaged Navier-Stokes equations. Following an overview of turbulent closure modeling, and development of appropriate conservation law systems, a Taylor weak-statement semi-discrete approximate solution algorithm is developed. Various forms for completion to the final linear algebra statement are cited, as are a range of candidate numerical linear algebra solution procedures. This development sequence emphasizes the key building blocks of a CFD RNS algorithm, including solution trial and test spaces, integration procedure and added numerical stability mechanisms. A range of numerical results are discussed focusing on key topics guiding future research directions.

  20. Statement Verification: A Stochastic Model of Judgment and Response.

    ERIC Educational Resources Information Center

    Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia

    1994-01-01

    A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)

  1. Lattice Theory, Measures and Probability

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2007-11-01

    In this tutorial, I will discuss the concepts behind generalizing ordering to measuring and apply these ideas to the derivation of probability theory. The fundamental concept is that anything that can be ordered can be measured. Since we are in the business of making statements about the world around us, we focus on ordering logical statements according to implication. This results in a Boolean lattice, which is related to the fact that the corresponding logical operations form a Boolean algebra. The concept of logical implication can be generalized to degrees of implication by generalizing the zeta function of the lattice. The rules of probability theory arise naturally as a set of constraint equations. Through this construction we are able to neatly connect the concepts of order, structure, algebra, and calculus. The meaning of probability is inherited from the meaning of the ordering relation, implication, rather than being imposed in an ad hoc manner at the start.

  2. 14 CFR 313.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... REGULATIONS IMPLEMENTATION OF THE ENERGY POLICY AND CONSERVATION ACT § 313.3 Definitions. As used in this part: (a) Energy efficiency means the ratio of the useful output of services in air transportation to the energy consumption of such services. (b) Energy statement is a statement of the probable impact of a...

  3. 14 CFR 313.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... REGULATIONS IMPLEMENTATION OF THE ENERGY POLICY AND CONSERVATION ACT § 313.3 Definitions. As used in this part: (a) Energy efficiency means the ratio of the useful output of services in air transportation to the energy consumption of such services. (b) Energy statement is a statement of the probable impact of a...

  4. [The mission].

    PubMed

    Ruiz Moreno, J; Blanch Mon, A

    2000-01-01

After having made a historical review of the concept of the mission statement, evaluated its importance (see Part I), described the bases for creating a mission statement from a strategic perspective, and analyzed the advantages of this concept, probably more important as a business policy (see Parts I and II), the authors proceed to analyze the mission statement in health organizations. Because a mission statement is lacking in the majority of health organizations, their strategies are not well served; as a consequence, neither are their competitive advantages nor the development of their essential competencies. After presenting a series of mission statements corresponding to Anglo-Saxon health organizations, the authors highlight two mission statements corresponding to our social context. The article finishes by suggesting an adequate sequence for developing a mission statement in those health organizations having a strategic sense.

  5. How Well Was the Sun Observed during the Maunder Minimum?

    NASA Astrophysics Data System (ADS)

    Hoyt, Douglas V.; Schatten, Kenneth H.

    1996-04-01

In this paper we examine how well the Sun and sunspots were observed during the Maunder Minimum from 1645 to 1715. Recent research has given us the dates of observations by Hevelius, Picard, La Hire, Flamsteed, and about 70 other observers. These specific observations allow a ‘lower estimate’ of the fraction of the time the Sun was observed to be deduced. It is found that 52.7% of the days have recorded observations. There are an additional 12 observers who provide general statements that no sunspots were observed during specific years or intervals despite diligent efforts. Taking these statements to mean, unrealistically, that every day during these intervals was observed gives an ‘upper estimate’ of 98% of the days. If the general statements are relaxed by assuming that 100 ± 50 days per year were actually observed by these diligent observers, then our ‘best estimate’ is that 68% ± 7% of the days during the Maunder Minimum were observed. In short, this supports the view that the Maunder Minimum existed and was not an artifact of few observations. Some sunspots are probably still missed in modern compilations, but the existence of a prolonged sunspot minimum would not be threatened by their discovery in future research. Additional support for intense scrutiny of the Sun comes from a report of a white-light flare in 1705 and from the numerous reports of new sunspots entering the disk of the Sun.

  6. Statements of Special Educational Needs and Tribunal Appeals in England and Wales 2003-2013--In Numbers

    ERIC Educational Resources Information Center

    Marsh, Alan J.

    2014-01-01

    The study presents a statistical analysis of statements of special educational needs and Special Educational Needs and Disability (SEND) tribunal appeal rates in England and Wales. It is set against the backcloth of the 2014 Children and Families Act which replaces statements with Education, Health and Care (EHC) plans. The numerical overview…

  7. Quantifying risk: verbal probability expressions in Spanish and English.

    PubMed

    Cohn, Lawrence D; Vázquez, Miguel E Cortés; Alvarez, Adolfo

    2009-01-01

To investigate how Spanish- and English-speaking adults interpret verbal probability expressions presented in Spanish and English (e.g., posiblemente and possibly, respectively). Professional translators and university students from México and the United States read a series of likelihood statements in Spanish or English and then estimated the certainty implied by each statement. Several terms that are regarded as cognates in English and Spanish elicited significantly different likelihood ratings. Several language equivalencies were also identified. These findings provide the first reported evaluation of Spanish likelihood terms for use in risk communications directed towards monolingual and bilingual Spanish speakers.

  8. 17 CFR 210.8-05 - Pro forma financial information.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... consummated during the most recent fiscal year or subsequent interim period, pro forma statements of income... (2) If consummation of the transaction has occurred or is probable after the date of the most recent... combination as of the date of the most recent balance sheet. For a purchase, pro forma statements of income...

  9. A Taylor weak-statement algorithm for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Kim, J. W.

    1987-01-01

    Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.

  10. Smokers' knowledge and understanding of advertised tar numbers: health policy implications.

    PubMed

    Cohen, J B

    1996-01-01

This article examines health policy implications of providing smokers with numerical tar yield information in cigarette advertising. Results of a national probability telephone survey regarding smokers' knowledge and understanding of numerical tar yields and deliveries are reported. Few smokers knew the tar level of their own cigarettes (the exception being smokers of 1- to 5-mg tar cigarettes), and a majority could not correctly judge the relative tar levels of cigarettes. Smokers were unsure whether switching to lower-tar cigarettes would reduce their personal health risks. Many smokers relied on absolute numbers in making trade-offs between number of cigarettes smoked and their tar levels, thus confusing machine-rated tar yields with actual amounts ingested. The wisdom of the present method of providing tar and nicotine numbers in ads and recommendations for modifying the test protocol are now under discussion. This research indicates that these tar numbers and their implications are poorly understood. The paper recommends revisions in tar ratings to make them more useful and a required statement on cigarette packages to more explicitly relate tar levels to major health risks.

  11. Smokers' knowledge and understanding of advertised tar numbers: health policy implications.

    PubMed Central

    Cohen, J B

    1996-01-01

OBJECTIVES. This article examines health policy implications of providing smokers with numerical tar yield information in cigarette advertising. METHODS. Results of a national probability telephone survey regarding smokers' knowledge and understanding of numerical tar yields and deliveries are reported. RESULTS. Few smokers knew the tar level of their own cigarettes (the exception being smokers of 1- to 5-mg tar cigarettes), and a majority could not correctly judge the relative tar levels of cigarettes. Smokers were unsure whether switching to lower-tar cigarettes would reduce their personal health risks. Many smokers relied on absolute numbers in making trade-offs between number of cigarettes smoked and their tar levels, thus confusing machine-rated tar yields with actual amounts ingested. CONCLUSIONS. The wisdom of the present method of providing tar and nicotine numbers in ads and recommendations for modifying the test protocol are now under discussion. This research indicates that these tar numbers and their implications are poorly understood. The paper recommends revisions in tar ratings to make them more useful and a required statement on cigarette packages to more explicitly relate tar levels to major health risks. PMID:8561236

  12. On the nonlinearity of spatial scales in extreme weather attribution statements

    NASA Astrophysics Data System (ADS)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos

    2018-04-01

In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event, some of these factors often being overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
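The "change in probability of occurrence" in such attribution statements is commonly summarized as a probability ratio between ensembles with and without anthropogenic forcing; the sketch below uses made-up ensemble values, not output from the four climate models:

```python
# Probability ratio PR = p1 / p0, where p1 and p0 are the probabilities of
# exceeding an event threshold in the factual (with anthropogenic forcing)
# and counterfactual (natural-only) ensembles.

def exceedance_prob(ensemble, threshold):
    return sum(1 for v in ensemble if v > threshold) / len(ensemble)

def probability_ratio(factual, counterfactual, threshold):
    p1 = exceedance_prob(factual, threshold)
    p0 = exceedance_prob(counterfactual, threshold)
    return p1 / p0 if p0 > 0 else float("inf")

# Toy ensembles of seasonal-maximum temperature anomalies (degrees C).
factual        = [1.2, 2.1, 0.8, 2.6, 1.9, 2.4, 1.1, 2.8, 0.9, 2.2]
counterfactual = [0.4, 1.3, 0.2, 1.6, 0.9, 1.5, 0.3, 1.8, 0.1, 1.2]

pr = probability_ratio(factual, counterfactual, threshold=1.5)
# here p1 = 0.6 and p0 = 0.2, so PR = 3: the event is 3x as likely with forcing
```

The paper's point is that this ratio depends on how the event's spatial and temporal scales are defined, which is exactly the threshold-and-extent choice made explicit here.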

  13. On the nonlinearity of spatial scales in extreme weather attribution statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah

In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been changed by anthropogenic greenhouse-gas emissions. Answers to this question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event—factors that are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This procedure is simple enough to provide timely approximate estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  14. On the nonlinearity of spatial scales in extreme weather attribution statements

    DOE PAGES

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; ...

    2017-06-17

In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been changed by anthropogenic greenhouse-gas emissions. Answers to this question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event—factors that are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This procedure is simple enough to provide timely approximate estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  15. 30 CFR 77.216-2 - Water, sediment, or slurry impoundments and impounding structures; minimum plan requirements...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... instrumentation. (9) Graphs showing area-capacity curves. (10) A statement of the runoff attributable to the probable maximum precipitation of 6-hour duration and the calculations used in determining such runoff. (11) A statement of the runoff attributable to the storm for which the structure is designed and the...

  16. Discussion on how to implement a verbal scale in a forensic laboratory: Benefits, pitfalls and suggestions to avoid misunderstandings.

    PubMed

    Marquis, Raymond; Biedermann, Alex; Cadola, Liv; Champod, Christophe; Gueissaz, Line; Massonnet, Geneviève; Mazzella, Williams David; Taroni, Franco; Hicks, Tacha

    2016-09-01

In a recently published guideline for evaluative reporting in forensic science, the European Network of Forensic Science Institutes (ENFSI) recommended the use of the likelihood ratio for measuring the value of forensic results. As a device to communicate the probative value of the results, the ENFSI guideline mentions the possibility of defining and using a verbal scale, which should be unified within a forensic institution. This paper summarizes discussions held between scientists of our institution to develop and implement such a verbal scale. It intends to contribute to general discussions likely to be faced by any forensic institution that engages in continuously monitoring and improving its evaluation and reporting format. We first present published arguments in favour of the use of such verbal qualifiers. We emphasise that verbal qualifiers do not replace the use of numbers to evaluate forensic findings, but are useful for communicating probative value, since the weight of evidence expressed as a likelihood ratio is still apprehended with difficulty both by forensic scientists, especially in the absence of hard data, and by recipients of information. We further present arguments that support the development of the verbal scale that we propose. Recognising the limits of such a verbal scale, we then discuss its disadvantages: it may lead to the spurious view that the value of the observations made in a given case is relative to other cases. Verbal qualifiers are also prone to misunderstandings and cannot be coherently combined with other evidence. We therefore recommend not using the verbal qualifier alone in a written statement.
While scientists should only report on the probability of the findings - and not on the probability of the propositions, which is the duty of the Court - we suggest showing examples to let the recipient of information understand how the scientific evidence affects the probabilities of the propositions. To avoid misunderstandings, we also advise mentioning in the statement what the results do not mean. Finally, we are of the opinion that if experts were able to coherently articulate numbers, and if recipients of information could properly handle such numbers, then verbal qualifiers could be abandoned completely. At that point, numerical expressions of probative value would be appropriately understood, like other numerical measures that most of us understand without the need for further explanation, such as expressions of length or temperature. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
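
    The mapping from a numerical likelihood ratio to a verbal qualifier that such a scale formalises can be sketched as a simple lookup. The bands and labels below are illustrative assumptions for the sketch, not the scale adopted by the authors' institution or by ENFSI:

    ```python
    # Hypothetical likelihood-ratio bands and verbal qualifiers; a real
    # scale would be fixed and unified by the forensic institution.
    SCALE = [
        (2, "weak support"),
        (10, "moderate support"),
        (100, "moderately strong support"),
        (1000, "strong support"),
    ]

    def verbal_qualifier(lr):
        """Map a likelihood ratio (> 1 supports the prosecution-level
        proposition) to a verbal category. Per the paper's advice, the
        words accompany the number; they do not replace it."""
        for upper, label in SCALE:
            if lr < upper:
                return label
        return "very strong support"

    print(verbal_qualifier(350))
    ```

    A written statement would then report both the numeric value and its qualifier, together with a note on what the result does not mean.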

  17. A numerically efficient finite element hydroelastic analysis. Volume 2: Implementation in NASTRAN, part 1

    NASA Technical Reports Server (NTRS)

    Coppolino, R. N.

    1974-01-01

    Details are presented of the implementation of the new formulation into NASTRAN including descriptions of the DMAP statements required for conversion of the program and details pertaining to problem definition and bulk data considerations. Details of the current 1/8-scale space shuttle external tank mathematical model, numerical results and analysis/test comparisons are also presented. The appendices include a description and listing of a FORTRAN program used to develop harmonic transformation bulk data (multipoint constraint statements) and sample bulk data information for a number of hydroelastic problems.

  18. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  19. 17 CFR 244.101 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 244.102). (a)(1) Non-GAAP financial measure. A non-GAAP financial measure is a numerical measure of a... income, balance sheet or statement of cash flows (or equivalent statements) of the issuer; or (ii... from the most directly comparable measure so calculated and presented. (2) A non-GAAP financial measure...

  20. 17 CFR 244.101 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 244.102). (a)(1) Non-GAAP financial measure. A non-GAAP financial measure is a numerical measure of a... income, balance sheet or statement of cash flows (or equivalent statements) of the issuer; or (ii... from the most directly comparable measure so calculated and presented. (2) A non-GAAP financial measure...

  1. 17 CFR 244.101 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 244.102). (a)(1) Non-GAAP financial measure. A non-GAAP financial measure is a numerical measure of a... income, balance sheet or statement of cash flows (or equivalent statements) of the issuer; or (ii... from the most directly comparable measure so calculated and presented. (2) A non-GAAP financial measure...

  2. A numerical algorithm with preference statements to evaluate the performance of scientists.

    PubMed

    Ricker, Martin

Academic evaluation committees have been increasingly receptive to using the number of published indexed articles, as well as citations, to evaluate the performance of scientists. It is, however, impossible to develop a stand-alone, objective numerical algorithm for the evaluation of academic activities, because any evaluation necessarily includes subjective preference statements. In a market, market prices represent preference statements, but scientists work largely in a non-market context. I propose a numerical algorithm that serves to determine the distribution of reward money in Mexico's evaluation system, using relative prices of scientific goods and services as input. The relative prices would be determined by an evaluation committee. In this way, large evaluation systems (like Mexico's Sistema Nacional de Investigadores) could work semi-automatically, but not arbitrarily or superficially, to determine quantitatively the academic performance of scientists every few years. Data for 73 scientists from the Biology Institute of Mexico's National University are analyzed, and it is shown that reward assignment and academic priorities depend heavily on those preferences. A maximum number of products or activities to be evaluated is recommended, to encourage quality over quantity.
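
    The core mechanism described here, scoring outputs at committee-set relative prices and splitting a fixed reward budget in proportion to the scores, can be sketched briefly. The product categories, prices, and counts below are hypothetical stand-ins, not values from the paper:

    ```python
    # Hypothetical relative prices an evaluation committee might assign
    # (the committee's preference statements enter only through these).
    prices = {"indexed_article": 10.0, "citation": 1.0, "thesis_supervised": 5.0}

    def score(output):
        """Value of a scientist's output at the committee's relative prices."""
        return sum(prices[item] * count for item, count in output.items())

    def distribute_rewards(scientists, budget):
        """Split a fixed reward budget in proportion to each scientist's score."""
        scores = {name: score(out) for name, out in scientists.items()}
        total = sum(scores.values())
        return {name: budget * s / total for name, s in scores.items()}

    scientists = {
        "A": {"indexed_article": 4, "citation": 30, "thesis_supervised": 1},
        "B": {"indexed_article": 2, "citation": 80, "thesis_supervised": 0},
    }
    rewards = distribute_rewards(scientists, 1000.0)
    print(rewards)
    ```

    Changing the relative prices reshuffles the reward shares, which is the paper's point: the outcome depends heavily on the committee's preferences, not on the counts alone.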

  3. The stability issues in problems of mathematical modeling

    NASA Astrophysics Data System (ADS)

    Mokin, A. Yu.; Savenkova, N. P.; Udovichenko, N. S.

    2018-03-01

    In the paper it is briefly considered various aspects of stability concepts, which are used in physics, mathematics and numerical methods of solution. The interrelation between these concepts is described, the questions of preliminary stability research before the numerical solution of the problem and the correctness of the mathematical statement of the physical problem are discussed. Examples of concrete mathematical statements of individual physical problems are given: a nonlocal problem for the heat equation, the Korteweg-de Fries equation with boundary conditions at infinity, the sine-Gordon equation, the problem of propagation of femtosecond light pulses in an area with a cubic nonlinearity.

  4. The Detection of Signals in Impulsive Noise.

    DTIC Science & Technology

    1983-06-01

DISTRIBUTION STATEMENT (of this Report): Approved for Public Release; Distribution Unlimited. ... has a symmetric distribution, sgn(x_i) will be -1 with probability 1/2 and +1 with probability 1/2. Considering the sum of observations as a binomial...
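
    The binomial view of the sign detector in this fragment can be illustrated directly: under symmetric noise each sign is +1 or -1 with probability 1/2, so the false-alarm probability of a threshold test on the sum of signs follows from a Binomial(n, 1/2) tail. The sample size and threshold below are illustrative assumptions, not values from the report:

    ```python
    import math

    def binom_pmf(k, n):
        """P(k successes in n fair trials); each sign is +1 w.p. 1/2."""
        return math.comb(n, k) / 2**n

    def false_alarm_prob(n, threshold):
        """Probability that the sign statistic sum(sgn(x_i)) exceeds
        `threshold` under noise alone. With K positive signs the sum
        equals 2K - n, where K ~ Binomial(n, 1/2)."""
        k_min = math.floor((threshold + n) / 2) + 1
        return sum(binom_pmf(k, n) for k in range(k_min, n + 1))

    # Illustrative: 100 samples, declare a signal if the sum of signs > 20.
    p = false_alarm_prob(100, 20)
    print(p)
    ```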

  5. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
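
    The first of the three tests, comparing the number of actual earthquakes to the number predicted, can be sketched as a Poisson number test. The forecast rate and observed count below are illustrative assumptions, not values from the paper:

    ```python
    import math

    def poisson_pmf(k, lam):
        """Probability of exactly k events under a Poisson rate lam."""
        return math.exp(-lam) * lam**k / math.factorial(k)

    def number_test(observed, predicted_rate):
        """Two-sided consistency check: probability of a count at least
        as extreme as `observed` if the forecast rate is correct."""
        p_le = sum(poisson_pmf(k, predicted_rate) for k in range(observed + 1))
        p_ge = 1.0 - sum(poisson_pmf(k, predicted_rate) for k in range(observed))
        return min(1.0, 2.0 * min(p_le, p_ge))

    # Illustrative numbers: the forecast predicts 10 events, 17 occurred.
    p = number_test(17, 10.0)
    print(f"p-value: {p:.3f}")  # a small p-value suggests forecast and data disagree
    ```

    The likelihood-score and likelihood-ratio tests extend this idea from event counts to the full predicted distribution over time, space, and magnitude.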

  6. Hypothesis testing and earthquake prediction.

    PubMed Central

    Jackson, D D

    1996-01-01

Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

  7. Complete Numerical Solution of the Diffusion Equation of Random Genetic Drift

    PubMed Central

    Zhao, Lei; Yue, Xingye; Waxman, David

    2013-01-01

    A numerical method is presented to solve the diffusion equation for the random genetic drift that occurs at a single unlinked locus with two alleles. The method was designed to conserve probability, and the resulting numerical solution represents a probability distribution whose total probability is unity. We describe solutions of the diffusion equation whose total probability is unity as complete. Thus the numerical method introduced in this work produces complete solutions, and such solutions have the property that whenever fixation and loss can occur, they are automatically included within the solution. This feature demonstrates that the diffusion approximation can describe not only internal allele frequencies, but also the boundary frequencies zero and one. The numerical approach presented here constitutes a single inclusive framework from which to perform calculations for random genetic drift. It has a straightforward implementation, allowing it to be applied to a wide variety of problems, including those with time-dependent parameters, such as changing population sizes. As tests and illustrations of the numerical method, it is used to determine: (i) the probability density and time-dependent probability of fixation for a neutral locus in a population of constant size; (ii) the probability of fixation in the presence of selection; and (iii) the probability of fixation in the presence of selection and demographic change, the latter in the form of a changing population size. PMID:23749318
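
    As a rough illustration of a probability-conserving ("complete") computation, one can use the discrete Wright-Fisher Markov chain rather than the authors' diffusion solver: the chain conserves total probability exactly and automatically includes the fixation and loss boundaries. The population size and run length below are arbitrary choices for the sketch:

    ```python
    import math

    def binom_pmf(k, n, p):
        """Binomial probability of k copies out of n, given frequency p."""
        return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

    def wright_fisher_step(dist, n):
        """One generation of neutral drift in a population of n haploids.
        dist[i] is the probability that i copies of the allele are present;
        i = 0 (loss) and i = n (fixation) are absorbing boundaries."""
        new = [0.0] * (n + 1)
        for i, pi in enumerate(dist):
            if pi == 0.0:
                continue
            p = i / n
            for k in range(n + 1):
                new[k] += pi * binom_pmf(k, n, p)
        return new

    n = 20
    dist = [0.0] * (n + 1)
    dist[n // 2] = 1.0           # start at allele frequency 1/2
    for _ in range(50):
        dist = wright_fisher_step(dist, n)

    total = sum(dist)            # a complete solution sums to unity
    p_loss, p_fix = dist[0], dist[n]
    print(total, p_loss, p_fix)
    ```

    After many generations most probability mass sits at the boundaries, mirroring the paper's point that fixation and loss are automatically included in a complete solution.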

  8. Pavement markings and safety : tech transfer summary.

    DOT National Transportation Integrated Search

    2010-11-01

    Objective: This study explores the statistical relationship between crash occurrence probability and longitudinal pavement marking retroreflectivity. : Problem Statement: Previous research on pavement markings, from a safety perspective, tackled vari...

  9. Questioning the Relevance of Model-Based Probability Statements on Extreme Weather and Future Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2007-12-01

We question the relevance of climate-model based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect; nevertheless the space and time scales on which they provide decision-support relevant information are expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics-based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions.
Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False", making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science-based decision support as our models improve requires a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?

  10. Progress on a Taylor weak statement finite element algorithm for high-speed aerodynamic flows

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Freels, J. D.

    1989-01-01

    A new finite element numerical Computational Fluid Dynamics (CFD) algorithm has matured to the point of efficiently solving two-dimensional high speed real-gas compressible flow problems in generalized coordinates on modern vector computer systems. The algorithm employs a Taylor Weak Statement classical Galerkin formulation, a variably implicit Newton iteration, and a tensor matrix product factorization of the linear algebra Jacobian under a generalized coordinate transformation. Allowing for a general two-dimensional conservation law system, the algorithm has been exercised on the Euler and laminar forms of the Navier-Stokes equations. Real-gas fluid properties are admitted, and numerical results verify solution accuracy, efficiency, and stability over a range of test problem parameters.

  11. Children and Gun Violence. Hearings on S. 1087, a Bill To Amend Title 18, United States Code, To Prohibit the Possession of a Handgun or Ammunition by, or the Private Transfer of a Handgun or Ammunition to, a Juvenile, before the Subcommittee on Juvenile Justice of the Committee on the Judiciary. United States Senate, 103rd Congress, First Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on the Judiciary.

    This transcript contains the following: (1) statements of several Committee Members; (2) text of the proposed legislation; (3) a list of witnesses; and (4) statements, testimony, and supporting documents submitted by the witnesses. Numerous laypersons and professionals have their testimonies recorded, giving statements in support of and in…

  12. Maximum Entropy Calculations on a Discrete Probability Space

    DTIC Science & Technology

    1986-01-01

constraints acting besides normalization. Statement 3: "The aim of this paper is to show that the die experiment just spoken of has solutions by classical ... analysis." Statement 4: "We shall solve this problem in a purely classical way, without the need for recourse to any exotic estimator, such as ME." Note... The Maximum Entropy Principle: In a remarkable series of papers beginning in 1957, E. T. Jaynes (1957) began a revolution in inductive

  13. On the reality of the conjunction fallacy.

    PubMed

    Sides, Ashley; Osherson, Daniel; Bonini, Nicolao; Viale, Riccardo

    2002-03-01

    Attributing higher "probability" to a sentence of form p-and-q, relative to p, is a reasoning fallacy only if (1) the word probability carries its modern, technical meaning and (2) the sentence p is interpreted as a conjunct of the conjunction p-and-q. Legitimate doubts arise about both conditions in classic demonstrations of the conjunction fallacy. We used betting paradigms and unambiguously conjunctive statements to reduce these sources of ambiguity about conjunctive reasoning. Despite the precautions, conjunction fallacies were as frequent under betting instructions as under standard probability instructions.

  14. Evaluation of strategies to communicate harmful and potentially harmful constituent (HPHC) information through cigarette package inserts: a discrete choice experiment.

    PubMed

    Salloum, Ramzi G; Louviere, Jordan J; Getz, Kayla R; Islam, Farahnaz; Anshari, Dien; Cho, Yoojin; O'Connor, Richard J; Hammond, David; Thrasher, James F

    2017-07-13

    The US Food and Drug Administration (FDA) has regulatory authority to use inserts to communicate with consumers about harmful and potentially harmful constituents (HPHCs) in tobacco products; however, little is known about the most effective manner for presenting HPHC information. In a discrete choice experiment, participants evaluated eight choice sets, each of which showed two cigarette packages from four different brands and tar levels (high vs low), accompanied by an insert that included between-subject manipulations (ie, listing of HPHCs vs grouping by disease outcome and numeric values ascribed to HPHCs vs no numbers) and within-subject manipulations (ie, 1 of 4 warning topics; statement linking an HPHC with disease vs statement with no HPHC link). For each choice set, participants were asked: (1) which package is more harmful and (2) which motivates them to not smoke; each with a 'no difference' option. Alternative-specific logit models regressed choice on attribute levels. 1212 participants were recruited from an online consumer panel (725 18-29-year-old smokers and susceptible non-smokers and 487 30-64-year-old smokers). Participants were more likely to endorse high-tar products as more harmful than low-tar products, with a greater effect when numeric HPHC information was present. Compared with a simple warning statement, the statement linking HPHCs with disease encouraged quit motivation. Numeric HPHC information on inserts appears to produce misunderstandings that some cigarettes are less harmful than others. Furthermore, brief narratives that link HPHCs to smoking-related disease may promote cessation versus communications that do not explicitly link HPHCs to disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Interactivity fosters Bayesian reasoning without instruction.

    PubMed

    Vallée-Tourangeau, Gaëlle; Abadie, Marlène; Vallée-Tourangeau, Frédéric

    2015-06-01

Successful statistical reasoning emerges from a dynamic system including: a cognitive agent, material artifacts with their action possibilities, and the thoughts and actions that are realized while reasoning takes place. Five experiments provide evidence that enabling the physical manipulation of the problem information (through the use of playing cards) substantially improves statistical reasoning, without training or instruction, not only with natural frequency statements (Experiment 1) but also with single-event probability statements (Experiment 2). Improved statistical reasoning was not simply a matter of making all sets and subsets explicit in the pack of cards (Experiment 3); it was not merely due to the discrete and countable layout resulting from the manipulation of the cards, and it was not mediated by participants' level of engagement with the task (Experiment 5). The positive effect of an increased manipulability of the problem information on participants' reasoning performance was generalizable both over problems whose numeric properties did not map perfectly onto the cards and over different types of cards (Experiment 4). A systematic analysis of participants' behaviors revealed that manipulating cards improved performance when reasoners spent more time actively changing the presentation layout "in the world" as opposed to when they spent more time passively pointing at cards, seemingly attempting to solve the problem "in their head." Although they often go unnoticed, the action possibilities of the material artifacts available and the actions that are realized on those artifacts are constitutive of successful statistical reasoning, even in adults who have ostensibly reached cognitive maturity. (c) 2015 APA, all rights reserved.

  16. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.
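
    The theory's system 1 heuristic of combining component beliefs by a primitive average, and the resulting violation of the complete joint probability distribution, can be sketched with numeric stand-ins (the theory itself posits non-numerical magnitudes, so the numbers here are illustrative):

    ```python
    def averaging_estimate(p_a, p_b):
        """System-1 style combination: a primitive average of the two
        component degrees of belief, as the dual-process theory posits."""
        return (p_a + p_b) / 2.0

    p_a, p_b = 0.8, 0.3          # illustrative single-event estimates
    conj = averaging_estimate(p_a, p_b)

    # A coherent joint distribution requires P(A and B) <= min(P(A), P(B));
    # the averaging heuristic predictably violates that bound.
    print(conj, min(p_a, p_b), conj > min(p_a, p_b))
    ```

    This is the kind of systematic violation the experiments report for conjunctions; system 2 would need to map these magnitudes into numbers and apply the arithmetic of probability to avoid it.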

  17. Simple Messages Help Set the Record Straight about Scientific Agreement on Human-Caused Climate Change: The Results of Two Experiments

    PubMed Central

    Myers, Teresa A.; Maibach, Edward; Peters, Ellen; Leiserowitz, Anthony

    2015-01-01

    Human-caused climate change is happening; nearly all climate scientists are convinced of this basic fact according to surveys of experts and reviews of the peer-reviewed literature. Yet, among the American public, there is widespread misunderstanding of this scientific consensus. In this paper, we report results from two experiments, conducted with national samples of American adults, that tested messages designed to convey the high level of agreement in the climate science community about human-caused climate change. The first experiment tested hypotheses about providing numeric versus non-numeric assertions concerning the level of scientific agreement. We found that numeric statements resulted in higher estimates of the scientific agreement. The second experiment tested the effect of eliciting respondents’ estimates of scientific agreement prior to presenting them with a statement about the level of scientific agreement. Participants who estimated the level of agreement prior to being shown the corrective statement gave higher estimates of the scientific consensus than respondents who were not asked to estimate in advance, indicating that incorporating an “estimation and reveal” technique into public communication about scientific consensus may be effective. The interaction of messages with political ideology was also tested, and demonstrated that messages were approximately equally effective among liberals and conservatives. Implications for theory and practice are discussed. PMID:25812121

  18. Simple messages help set the record straight about scientific agreement on human-caused climate change: the results of two experiments.

    PubMed

    Myers, Teresa A; Maibach, Edward; Peters, Ellen; Leiserowitz, Anthony

    2015-01-01

    Human-caused climate change is happening; nearly all climate scientists are convinced of this basic fact according to surveys of experts and reviews of the peer-reviewed literature. Yet, among the American public, there is widespread misunderstanding of this scientific consensus. In this paper, we report results from two experiments, conducted with national samples of American adults, that tested messages designed to convey the high level of agreement in the climate science community about human-caused climate change. The first experiment tested hypotheses about providing numeric versus non-numeric assertions concerning the level of scientific agreement. We found that numeric statements resulted in higher estimates of the scientific agreement. The second experiment tested the effect of eliciting respondents' estimates of scientific agreement prior to presenting them with a statement about the level of scientific agreement. Participants who estimated the level of agreement prior to being shown the corrective statement gave higher estimates of the scientific consensus than respondents who were not asked to estimate in advance, indicating that incorporating an "estimation and reveal" technique into public communication about scientific consensus may be effective. The interaction of messages with political ideology was also tested, and demonstrated that messages were approximately equally effective among liberals and conservatives. Implications for theory and practice are discussed.

  19. What is the uncertainty principle of non-relativistic quantum mechanics?

    NASA Astrophysics Data System (ADS)

    Riggs, Peter J.

    2018-05-01

    After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
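    For reference, the Robertson relation mentioned above bounds the product of the standard deviations of two observables by the expectation of their commutator:

```latex
\sigma_A \,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr|,
\qquad\text{e.g.}\quad \sigma_x\,\sigma_p \;\ge\; \frac{\hbar}{2}.
```

    The paper's point is that no single such relation exhausts the principle: the relations follow from the probability distributions that a quantum state assigns to incompatible variables.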

  20. Numerical solution of the stochastic parabolic equation with the dependent operator coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashyralyev, Allaberen (Department of Mathematics, ITTU, Ashgabat); Okur, Ulker

    2015-09-18

    In the present paper, a single-step implicit difference scheme for the numerical solution of the stochastic parabolic equation with a dependent operator coefficient is presented. A theorem on convergence estimates for the solution of this difference scheme is established. In applications, this abstract result permits us to obtain convergence estimates for the solutions of difference schemes for the numerical solution of initial boundary value problems for parabolic equations. The theoretical statements for the solution of this difference scheme are supported by the results of numerical experiments.

  1. Communicating uncertainty: managing the inherent probabilistic character of hazard estimates

    NASA Astrophysics Data System (ADS)

    Albarello, Dario

    2013-04-01

    Science is much more about fixing the limits of our knowledge of possible occurrences than about identifying any "truth". This is particularly true when scientific statements concern the prediction of natural phenomena that largely exceed the laboratory scale, as in the case of seismogenesis. In these cases, many scenarios of future occurrences are possible (plausible), and the contribution of scientific knowledge (based on the available understanding of the underlying processes or on phenomenological studies) mainly consists in attributing to each scenario a different level of likelihood (probability). In other terms, scientific predictions in the field of geosciences (hazard assessment) are inherently probabilistic. Despite this, however, many scientists (seismologists, etc.), in communicating their position in public debates, tend to stress the "truth" of their statements against the fanciful character of pseudo-scientific assertions: the stronger the opposition of science and pseudo-science, the more hidden the probabilistic character of scientific statements becomes. The problem arises when this kind of "probabilistic" knowledge becomes the basis of political action (e.g., imposing expensive forms of risk-reducing activity): in these cases, the lack of any definitive "truth" requires a direct assumption of responsibility by the relevant decider (be it the single citizen or the legitimate expression of a larger community) to choose among several possibilities (characterized, however, by different levels of likelihood). In many cases this can be uncomfortable, and the temptation to delegate the responsibility for these decisions to the scientific counterpart is strong. This "transfer" from the genuine political field to an improper scientific context is also facilitated by the lack of a widespread culture of "probability" outside the scientific community (and, in many cases, inside it as well).
This is partially the effect of the generalized adoption (by media and science communicators) of a view of probability (the "frequentist" view) that is useful in scientific practice but very far from the common use of uncertain reasoning (which is closer to the "epistemic" view). Considering probability a sort of physical measure inherent in the process under examination (like an acceleration value), instead of a degree of belief (rationally inferred) about a statement concerning future occurrences, tends to hide the importance of a shared responsibility, involving scientists and citizens to the same extent, for the relevant choices.

  2. A globally well-posed finite element algorithm for aerodynamics applications

    NASA Technical Reports Server (NTRS)

    Iannelli, G. S.; Baker, A. J.

    1991-01-01

    A finite element CFD algorithm is developed for Euler and Navier-Stokes aerodynamic applications. For the linear basis, the resultant approximation is at least second-order-accurate in time and space for synergistic use of three procedures: (1) a Taylor weak statement, which provides for derivation of companion conservation law systems with embedded dispersion-error control mechanisms; (2) a stiffly stable second-order-accurate implicit Rosenbrock-Runge-Kutta temporal algorithm; and (3) a matrix tensor product factorization that permits efficient numerical linear algebra handling of the terminal large-matrix statement. Thorough analyses are presented regarding well-posed boundary conditions for inviscid and viscous flow specifications. Numerical solutions are generated and compared for critical evaluation of quasi-one- and two-dimensional Euler and Navier-Stokes benchmark test problems.

  3. Poisson's ratio of fiber-reinforced composites

    NASA Astrophysics Data System (ADS)

    Christiansson, Henrik; Helsing, Johan

    1996-05-01

    Poisson's ratio flow diagrams, that is, the Poisson's ratio versus the fiber fraction, are obtained numerically for hexagonal arrays of elastic circular fibers in an elastic matrix. High numerical accuracy is achieved through the use of an interface integral equation method. Questions concerning fixed point theorems and the validity of existing asymptotic relations are investigated and partially resolved. Our findings for the transverse effective Poisson's ratio, together with earlier results for random systems by other authors, make it possible to formulate a general statement for Poisson's ratio flow diagrams: For composites with circular fibers and where the phase Poisson's ratios are equal to 1/3, the system with the lowest stiffness ratio has the highest Poisson's ratio. For other choices of the elastic moduli for the phases, no simple statement can be made.

  4. TSCA Inventory Policy and Guidance

    EPA Pesticide Factsheets

    A list of numerous policy statements and guidance documents on how to identify certain chemical substances for the purpose of assigning unique and unambiguous descriptions for each substance listed on the Inventory.

  5. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    DOE PAGES

    Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.

    2016-02-16

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
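    The risk ratio and fraction of attributable risk referred to above are simple functions of the event probabilities in the factual and counterfactual climates; a minimal sketch (the variable names are ours, not the paper's):

```python
def risk_ratio(p_factual, p_counterfactual):
    """RR: probability of the event in the actual (anthropogenically
    influenced) climate divided by its probability in the
    counterfactual climate."""
    return p_factual / p_counterfactual

def fraction_attributable_risk(p_factual, p_counterfactual):
    """FAR = 1 - 1/RR."""
    return 1.0 - p_counterfactual / p_factual

# A counterfactual probability of zero makes the estimated RR infinite,
# which is why the paper constructs a one-sided confidence interval on
# the lower bound of RR in that case.
print(risk_ratio(0.2, 0.05))                  # 4.0
print(fraction_attributable_risk(0.2, 0.05))  # 0.75
```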

  6. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
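    As a concrete illustration of one of the methodologies listed above, the logistic-regression POD model expresses detection probability as a logistic function of (log) flaw size; the coefficients below are hypothetical, not taken from this work:

```python
import math

def pod(a, b0, b1):
    """Logistic-regression POD model on log flaw size:
    POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

def flaw_size_at_pod(target, b0, b1):
    """Invert the model: the flaw size at which POD reaches `target`
    (e.g. target = 0.90 gives the familiar a90 point estimate)."""
    return math.exp((math.log(target / (1.0 - target)) - b0) / b1)

b0, b1 = -6.0, 2.0                      # hypothetical fitted coefficients
a90 = flaw_size_at_pod(0.90, b0, b1)
assert abs(pod(a90, b0, b1) - 0.90) < 1e-9
```

Validating such a fit against the data requirements noted above (sample size, flaw-size coverage, goodness of fit) is exactly where the methodologies differ.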

  7. 12 CFR 221.101 - Determination and effect of purpose of loan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... lender “acting in good faith.” The requirement of “good faith” is of vital importance here. Its... would probably be subject to this part. It could not accept in good faith a statement to the contrary...

  8. Executive Summary to EDC-2: The Endocrine Society's Second Scientific Statement on Endocrine-Disrupting Chemicals

    PubMed Central

    Chappell, V. A.; Fenton, S. E.; Flaws, J. A.; Nadal, A.; Prins, G. S.; Toppari, J.; Zoeller, R. T.

    2015-01-01

    This Executive Summary to the Endocrine Society's second Scientific Statement on environmental endocrine-disrupting chemicals (EDCs) provides a synthesis of the key points of the complete statement. The full Scientific Statement represents a comprehensive review of the literature on seven topics for which there is strong mechanistic, experimental, animal, and epidemiological evidence for endocrine disruption, namely: obesity and diabetes, female reproduction, male reproduction, hormone-sensitive cancers in females, prostate cancer, thyroid, and neurodevelopment and neuroendocrine systems. EDCs such as bisphenol A, phthalates, pesticides, persistent organic pollutants such as polychlorinated biphenyls, polybrominated diphenyl ethers, and dioxins were emphasized because these chemicals had the greatest depth and breadth of available information. The Statement also included thorough coverage of studies of developmental exposures to EDCs, especially in the fetus and infant, because these are critical life stages during which perturbations of hormones can increase the probability of a disease or dysfunction later in life. A conclusion of the Statement is that publications over the past 5 years have led to a much fuller understanding of the endocrine principles by which EDCs act, including nonmonotonic dose-responses, low-dose effects, and developmental vulnerability. These findings will prove useful to researchers, physicians, and other healthcare providers in translating the science of endocrine disruption to improved public health. PMID:26414233

  9. A tool for simulating collision probabilities of animals with marine renewable energy devices.

    PubMed

    Schmitt, Pál; Culloch, Ross; Lieber, Lilian; Molander, Sverker; Hammar, Linus; Kregting, Louise

    2017-01-01

    The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal, as well as of the device, must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. Necessary pre- and post-processing of the data created by the model is explained, and numerical details as well as potential issues and limitations in the application of the resulting probability distributions are highlighted.

  10. Numerical Solutions for Laminar Boundary Layer Behind Blast Waves.

    DTIC Science & Technology

    1980-05-01

    DISTRIBUTION STATEMENT: Approved for public release; distribution unlimited.

  11. Banning Books.

    ERIC Educational Resources Information Center

    Trede, Mildred

    1991-01-01

    The "Game of Decisions" is presented to encourage students to consider the consequences of banning books and/or ideas. The game involves story writing, creating probability graphs, writing a letter protesting censorship from a chosen historical period, and examining a controversial science issue. Three thesis statements for generating group…

  12. Numeracy moderates the influence of task-irrelevant affect on probability weighting.

    PubMed

    Traczyk, Jakub; Fulawka, Kamil

    2016-06-01

    Statistical numeracy, defined as the ability to understand and process statistical and probability information, plays a significant role in superior decision making. However, recent research has demonstrated that statistical numeracy goes beyond simple comprehension of numbers and mathematical operations. In contrast to previous studies, which focused on emotions integral to risky prospects, we hypothesized that highly numerate individuals would exhibit more linear probability weighting because they would be less biased by incidental and decision-irrelevant affect. Participants were instructed to make a series of insurance decisions preceded by negative (i.e., fear-inducing) or neutral stimuli. We found that incidental negative affect increased the curvature of the probability weighting function (PWF). Interestingly, this effect was significant only for less numerate individuals, while probability weighting in more numerate people was not altered by decision-irrelevant affect. We propose two candidate mechanisms for the observed effect. Copyright © 2016 Elsevier B.V. All rights reserved.
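    The curvature effect described above is commonly quantified with a one-parameter probability weighting function; a sketch using the Tversky-Kahneman form (the parameter values are illustrative, not the study's estimates):

```python
def weight(p, gamma):
    """Tversky-Kahneman probability weighting function.
    gamma = 1 gives linear (unbiased) weighting; smaller gamma gives
    more curvature: overweighting of small probabilities and
    underweighting of large ones."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

# Linear weighting (as found for highly numerate participants):
assert abs(weight(0.5, 1.0) - 0.5) < 1e-12
# Curved weighting (incidental negative affect, less numerate):
assert weight(0.01, 0.6) > 0.01   # small probabilities overweighted
assert weight(0.99, 0.6) < 0.99   # large probabilities underweighted
```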

  13. Believable Statements of Uncertainty and Believable Science

    PubMed Central

    Lindstrom, Richard M.

    2017-01-01

    Nearly fifty years ago, two landmark papers appeared that should have cured the problem of ambiguous uncertainty statements in published data. Eisenhart’s paper in Science called for statistically meaningful numbers, and Currie’s Analytical Chemistry paper revealed the wide range in common definitions of detection limit. Confusion and worse can result when uncertainties are misinterpreted or ignored. The recent stories of cold fusion, variable radioactive decay, and piezonuclear reactions provide cautionary examples in which prior probability has been neglected. We show examples from our laboratory and others to illustrate the fact that uncertainty depends on both statistical and scientific judgment. PMID:28584391

  14. Experimental joint weak measurement on a photon pair as a probe of Hardy's paradox.

    PubMed

    Lundeen, J S; Steinberg, A M

    2009-01-16

    It has been proposed that the ability to perform joint weak measurements on postselected systems would allow us to study quantum paradoxes. These measurements can investigate the history of those particles that contribute to the paradoxical outcome. Here we experimentally perform weak measurements of joint (i.e., nonlocal) observables. In an implementation of Hardy's paradox, we weakly measure the locations of two photons, the subject of the conflicting statements behind the paradox. Remarkably, the resulting weak probabilities verify all of these statements but, at the same time, resolve the paradox.

  15. [Comments on the use of the "life-table method" in orthopedics].

    PubMed

    Hassenpflug, J; Hahne, H J; Hedderich, J

    1992-01-01

    In the description of long-term results, e.g. of joint replacements, survivorship analysis is used increasingly in orthopaedic surgery. Survivorship analysis describes the frequency of failure more usefully than global percentage statements. The relative probability of failure for fixed intervals is drawn from the number of controlled patients and the frequency of failure. The complementary probabilities of success are linked in their temporal sequence, thus representing the probability of survival at a fixed endpoint. A necessary condition for the use of this procedure is the exact definition of the moment and manner of failure. How to establish survivorship tables is described.
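    The linking of complementary interval probabilities described above is the standard actuarial (life-table) computation; a minimal sketch with hypothetical follow-up counts:

```python
def survivorship(at_risk, failures):
    """Life-table (actuarial) estimate: for each interval, the
    probability of surviving that interval is 1 - d_i / n_i; the
    running product gives the cumulative probability of survival
    at the end of each interval."""
    curve, s = [], 1.0
    for n, d in zip(at_risk, failures):
        s *= 1.0 - d / n
        curve.append(s)
    return curve

# 100 joints at risk in year 1 (10 failures), 80 still followed in
# year 2 (4 failures) -- hypothetical numbers:
print(survivorship([100, 80], [10, 4]))  # cumulative survival per year
```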

  16. Numerical Simulations of Vortical Mode Stirring: Effects of Large Scale Shear and Strain

    DTIC Science & Technology

    2015-09-30

    M.-Pascale Lelong, NorthWest Research Associates. Parameterizations of vortical-mode stirring are developed that can be implemented in larger-scale ocean models; these parameterizations will incorporate the effects of local ambient conditions, including latitude. Results were presented in a talk at the Nonlinear Effects in Internal Waves Conference. DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

  17. Spatial correlations and probability density function of the phase difference in a developed speckle-field: numerical and natural experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mysina, N Yu; Maksimova, L A; Ryabukho, V P

    Investigated are statistical properties of the phase difference of oscillations in speckle-fields at two points in the far-field diffraction region, with different shapes of the scatterer aperture. Statistical and spatial nonuniformity of the probability density function of the field phase difference is established. Numerical experiments show that, for speckle-fields with an oscillating alternating-sign transverse correlation function, a significant nonuniformity of the probability density function of the phase difference in the correlation region of the field complex amplitude, with the most probable values 0 and π, is observed. A natural statistical interference experiment using Young diagrams has confirmed the results of numerical experiments. (laser applications and other topics in quantum electronics)

  18. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as lower than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. 
These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in an underestimate of the likelihood of an event occurring ‘today’ leading to potentially inappropriate action choices. We thus present some initial guidelines for communicating such eruption forecasts.

  19. Interval-type and affine arithmetic-type techniques for handling uncertainty in expert systems

    NASA Astrophysics Data System (ADS)

    Ceberio, Martine; Kreinovich, Vladik; Chopra, Sanjeev; Longpre, Luc; Nguyen, Hung T.; Ludascher, Bertram; Baral, Chitta

    2007-02-01

    Expert knowledge consists of statements Sj (facts and rules). The facts and rules are often only true with some probability. For example, if we are interested in oil, we should look at seismic data. If in 90% of the cases the seismic data were indeed helpful in locating oil, then we can say that if we are interested in oil, then with probability 90% it is helpful to look at the seismic data. In more formal terms, we can say that the implication "if oil then seismic" holds with probability 90%. Another example: a bank A trusts a client B, so if we trust the bank A, we should trust B too; if statistically this trust was justified in 99% of the cases, we can conclude that the corresponding implication holds with probability 99%. If a query Q is deducible from facts and rules, what is the resulting probability p(Q) of Q? We can describe the truth of Q as a propositional formula F in terms of the Sj, i.e., as a combination of statements Sj linked by operators like &, ∨, and ¬; computing p(Q) exactly is NP-hard, so heuristics are needed. Traditionally, expert systems use a technique similar to straightforward interval computations: we parse F and replace each computation step with the corresponding probability operation. Problem: at each step, we ignore the dependence between the intermediate results Fj; hence the intervals are too wide. Example: the estimate for P(A ∨ ¬A) is not 1. Solution: similar to affine arithmetic, besides P(Fj), we also compute P(Fj & Fi) (or P(Fj1 & ... & Fjd)), and on each step use all combinations of l such probabilities to get new estimates. Results: e.g., P(A ∨ ¬A) is estimated as 1.
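    A minimal sketch of the naive interval step criticized above, using dependence-free (Fréchet-style) bounds; the function names are ours:

```python
def p_not(x):
    """Bounds on P(not X) given bounds (lo, hi) on P(X)."""
    return (1.0 - x[1], 1.0 - x[0])

def p_or(x, y):
    """Bounds on P(X or Y) with no dependence information
    (Frechet bounds)."""
    return (max(x[0], y[0]), min(1.0, x[1] + y[1]))

a = (0.9, 0.9)            # P(A) known exactly
print(p_or(a, p_not(a)))  # (0.9, 1.0) -- not the exact value 1
```

Tracking joint probabilities such as P(A & ¬A) = 0, as the affine-arithmetic-style method does, recovers the exact value 1.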

  20. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

    We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0 - 30% regionally. © Birkhäuser 2008.
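    Under the Poissonian model used above, the runup probability for a cell over an exposure window follows directly from the cell's mean runup rate; a minimal sketch (the rate below is illustrative, not from the study):

```python
import math

def runup_probability(rate_per_year, window_years=30.0):
    """Probability of at least one runup exceeding the threshold in
    the window, for a Poisson process with the given mean annual rate:
    P = 1 - exp(-rate * T)."""
    return 1.0 - math.exp(-rate_per_year * window_years)

# An illustrative cell with one exceedance per 100 years on average:
print(round(runup_probability(0.01), 3))   # 0.259
```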

  1. On Modeling of If-Then Rules for Probabilistic Inference

    DTIC Science & Technology

    1993-02-01

    conditionals b → a. This space contains A strictly. Contrary to a statement in Gilio and Spezzaferri (1992), these conditionals are equivalent to … Gilio, A. and Spezzaferri, F. (1992). Knowledge integration for conditional probability assessments. Proceedings 8th Conf. Uncertainty …

  2. Comments on statistical issues in numerical modeling for underground nuclear test monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W.L.; Anderson, K.K.

    1993-11-01

    The Symposium concluded with prepared summaries by four experts in the involved disciplines. These experts made no mention of statistics and/or the statistical content of issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon these extemporaneous remarks.

  3. THE BERNOULLI EQUATION AND COMPRESSIBLE FLOW THEORIES

    EPA Science Inventory

    The incompressible Bernoulli equation is an analytical relationship between pressure, kinetic energy, and potential energy. As perhaps the simplest and most useful statement for describing laminar flow, it buttresses numerous incompressible flow models that have been developed ...

  4. American Society for Metabolic and Bariatric Surgery position statement on long-term survival benefit after metabolic and bariatric surgery.

    PubMed

    Kim, Julie; Eisenberg, Dan; Azagury, Dan; Rogers, Ann; Campos, Guilherme M

    2016-01-01

    The following position statement has been issued by the American Society for Metabolic and Bariatric Surgery in response to numerous inquiries made to the Society by patients, physicians, society members, hospitals, health insurance payors, the media, and others regarding the benefit of metabolic and bariatric surgery on long-term survival. An overview of the current available published peer-reviewed scientific evidence is presented. Copyright © 2016 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.

  5. Polish Society of Endocrinology Position statement on endocrine disrupting chemicals (EDCs).

    PubMed

    Rutkowska, Aleksandra; Rachoń, Dominik; Milewicz, Andrzej; Ruchała, Marek; Bolanowski, Marek; Jędrzejuk, Diana; Bednarczuk, Tomasz; Górska, Maria; Hubalewska-Dydejczyk, Alicja; Kos-Kudła, Beata; Lewiński, Andrzej; Zgliczyński, Wojciech

    2015-01-01

    With reference to the position statements of the Endocrine Society, the Paediatric Endocrine Society, and the European Society of Paediatric Endocrinology, the Polish Society of Endocrinology points out the adverse health effects caused by endocrine disrupting chemicals (EDCs) commonly used in daily life as components of plastics, food containers, pharmaceuticals, and cosmetics. The statement is based on the alarming data about the increase of the prevalence of many endocrine disorders such as: cryptorchidism, precocious puberty in girls and boys, and hormone-dependent cancers (endometrium, breast, prostate). In our opinion, it would be of benefit to conduct epidemiological studies that will enable the estimation of the risk factors of exposure to EDCs and the probability of endocrine disorders. Increasing consumerism and the industrial boom have led to severe pollution of the environment, with a corresponding negative impact on human health; thus, there is a great need for the biomonitoring of EDCs in Poland.

  6. Coincidence probabilities for spacecraft gravitational wave experiments - Massive coalescing binaries

    NASA Technical Reports Server (NTRS)

    Tinto, Massimo; Armstrong, J. W.

    1991-01-01

    Massive coalescing binary systems are candidate sources of gravitational radiation in the millihertz frequency band accessible to spacecraft Doppler tracking experiments. This paper discusses signal processing and detection probability for waves from coalescing binaries in the regime where the signal frequency increases linearly with time, i.e., 'chirp' signals. Using known noise statistics, thresholds with given false alarm probabilities are established for one- and two-spacecraft experiments. Given the threshold, the detection probability is calculated as a function of gravitational wave amplitude for both one- and two-spacecraft experiments, assuming random polarization states and under various assumptions about wave directions. This allows quantitative statements about the detection efficiency of these experiments and the utility of coincidence experiments. In particular, coincidence probabilities for two-spacecraft experiments are insensitive to the angle between the directions to the two spacecraft, indicating that near-optimal experiments can be done without constraints on spacecraft trajectories.

  7. Ability Level Estimation of Students on Probability Unit via Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Özyurt, Hacer; Özyurt, Özcan

    2015-01-01

    Problem Statement: Learning-teaching activities bring along the need to determine whether they achieve their goals. Thus, multiple choice tests addressing the same set of questions to all are frequently used. However, this traditional assessment and evaluation form contrasts with modern education, where individual learning characteristics are…

  8. Processing Quantified Noun Phrases with Numbers versus Verbal Quantifiers

    ERIC Educational Resources Information Center

    Moxey, Linda M.

    2018-01-01

    Statements containing quantity information are commonplace. Although there is literature explaining the way in which quantities themselves are conveyed in numbers or words (e.g., "many", "probably"), there is less on the effects of different types of quantity description on the processing of surrounding text. Given that…

  9. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  10. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  11. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  12. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  13. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  14. 76 FR 65541 - Environmental Assessment and Finding of No Significant Impact Related to Exemption From Certain...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ... Regulatory Commission. ACTION: Environmental assessment and finding of no significant impact. FOR FURTHER... the action does not require either an environmental assessment or an environmental impact statement... adverse environmental impacts. The proposed action will not significantly increase the probability or...

  15. A Bayesian approach to reliability and confidence

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1989-01-01

    The historical evolution of NASA's interest in quantitative measures of reliability assessment is outlined. The introduction of some quantitative methodologies into the Vehicle Reliability Branch of the Safety, Reliability and Quality Assurance (SR and QA) Division at Johnson Space Center (JSC) was noted along with the development of the Extended Orbiter Duration--Weakest Link study which will utilize quantitative tools for a Bayesian statistical analysis. Extending the earlier work of NASA sponsor, Richard Heydorn, researchers were able to produce a consistent Bayesian estimate for the reliability of a component and hence by a simple extension for a system of components in some cases where the rate of failure is not constant but varies over time. Mechanical systems in general have this property since the reliability usually decreases markedly as the parts degrade over time. While they have been able to reduce the Bayesian estimator to a simple closed form for a large class of such systems, the form for the most general case needs to be attacked by the computer. Once a table is generated for this form, researchers will have a numerical form for the general solution. With this, the corresponding probability statements about the reliability of a system can be made in the most general setting. Note that the utilization of uniform Bayesian priors represents a worst case scenario in the sense that as researchers incorporate more expert opinion into the model, they will be able to improve the strength of the probability calculations.
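
    A toy version of such a probability statement under a uniform prior, assuming simple pass/fail trial data (our illustration, not the JSC model): with s successes in n trials and a Beta(1,1) prior on component reliability R, the posterior is Beta(s+1, n-s+1).

```python
import random

def posterior_prob_reliability_exceeds(successes, trials, r0, draws=200_000, seed=1):
    """Monte Carlo estimate of P(R > r0) under the Beta(successes+1, trials-successes+1) posterior."""
    random.seed(seed)
    a, b = successes + 1, trials - successes + 1
    hits = sum(1 for _ in range(draws) if random.betavariate(a, b) > r0)
    return hits / draws

# Posterior mean under the uniform prior (Laplace's rule of succession): (s + 1) / (n + 2).
mean = (19 + 1) / (20 + 2)                          # 19 successes in 20 trials
p = posterior_prob_reliability_exceeds(19, 20, 0.8)  # P(R > 0.8) given that data
```

    As the abstract notes, the uniform prior is a worst case: incorporating expert opinion as a more informative prior would strengthen the resulting probability statements.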

  16. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    ERIC Educational Resources Information Center

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  17. CCM Continuity Constraint Method: A finite-element computational fluid dynamics algorithm for incompressible Navier-Stokes fluid flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, P. T.

    1993-09-01

    As the field of computational fluid dynamics (CFD) continues to mature, algorithms are required to exploit the most recent advances in approximation theory, numerical mathematics, computing architectures, and hardware. Meeting this requirement is particularly challenging in incompressible fluid mechanics, where primitive-variable CFD formulations that are robust, while also accurate and efficient in three dimensions, remain an elusive goal. This dissertation asserts that one key to accomplishing this goal is recognition of the dual role assumed by the pressure, i.e., a mechanism for instantaneously enforcing conservation of mass and a force in the mechanical balance law for conservation of momentum. Proving this assertion has motivated the development of a new, primitive-variable, incompressible, CFD algorithm called the Continuity Constraint Method (CCM). The theoretical basis for the CCM consists of a finite-element spatial semi-discretization of a Galerkin weak statement, equal-order interpolation for all state-variables, a θ-implicit time-integration scheme, and a quasi-Newton iterative procedure extended by a Taylor Weak Statement (TWS) formulation for dispersion error control. Original contributions to algorithmic theory include: (a) formulation of the unsteady evolution of the divergence error, (b) investigation of the role of non-smoothness in the discretized continuity-constraint function, (c) development of a uniformly H¹ Galerkin weak statement for the Reynolds-averaged Navier-Stokes pressure Poisson equation, (d) derivation of physically and numerically well-posed boundary conditions, and (e) investigation of sparse data structures and iterative methods for solving the matrix algebra statements generated by the algorithm.

  18. 43 CFR 2812.0-6 - Statement of policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the O. and C. lands presents peculiar problems of management which require for their solution the... significant part by the cost of transporting the logs to the mill. Where there is an existing road which is... capacity to accommodate the probable normal requirements both of the applicant and of the Government and...

  19. 77 FR 50534 - Biweekly Notice; Applications and Amendments to Facility Operating Licenses and Combined Licenses...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ...) involve a significant increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident previously... statement of the alleged facts or expert opinion which support the contention and on which the requestor...

  20. Teaching Qualitative Energy-Eigenfunction Shape with Physlets

    ERIC Educational Resources Information Center

    Belloni, Mario; Christian, Wolfgang; Cox, Anne J.

    2007-01-01

    More than 35 years ago, French and Taylor outlined an approach to teach students and teachers alike how to understand "qualitative plots of bound-state wave functions." They described five fundamental statements based on the quantum-mechanical concepts of probability and energy (total and potential), which could be used to deduce the shape of…

  1. 76 FR 63342 - Environmental Impact Statement, Tappan Zee Hudson River Crossing Project (Rockland and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-12

    ... are highest during the morning eastbound commute and the evening westbound commute, but the bridge is..., agencies, and the public during the scoping process. 4. Probable Effects The EIS will consider in detail the potential environmental effects of the alternatives under consideration based on the current...

  2. You Say IFRS, I Say FASB…Let's Call the Whole Thing Off

    ERIC Educational Resources Information Center

    Tickell, Geoffrey; Rahman, Monsurur; Alexandre, Romain

    2013-01-01

    This paper discusses the noticeable nervousness of many US-based financial statement issuers in adopting IFRS. For contextual purposes, the paper provides an overview of the FASB/IFRS convergence so far and its probable future. A detailed review of convergence in accounting standards is explained through the respective standards for "Pensions…

  3. 17 CFR 210.8-05 - Pro forma financial information.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., and should include the following: (1) If the transaction was consummated during the most recent fiscal... transaction has occurred or is probable after the date of the most recent balance sheet required by § 210.8-02... recent balance sheet. For a purchase, pro forma statements of income reflecting the combined operations...

  4. 17 CFR 210.8-05 - Pro forma financial information.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., and should include the following: (1) If the transaction was consummated during the most recent fiscal... transaction has occurred or is probable after the date of the most recent balance sheet required by § 210.8-02... recent balance sheet. For a purchase, pro forma statements of income reflecting the combined operations...

  5. 17 CFR 210.8-05 - Pro forma financial information.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., and should include the following: (1) If the transaction was consummated during the most recent fiscal... transaction has occurred or is probable after the date of the most recent balance sheet required by § 210.8-02... recent balance sheet. For a purchase, pro forma statements of income reflecting the combined operations...

  6. Thermal and Mechanical Non-Equilibrium Effects on Turbulent Flows: Fundamental Studies of Energy Exchanges Through Direct Numerical Simulations, Molecular Simulations and Experiments

    DTIC Science & Technology

    2016-02-26

    AFRL-AFOSR-VA-TR-2016-0104. Thermal and mechanical non-equilibrium effects on turbulent flows: fundamental studies of energy exchanges through direct numerical simulations, molecular simulations and experiments. DISTRIBUTION STATEMENT A: distribution unlimited; public release. Abstract: Utilizing internal energy exchange for intelligent…

  7. The Finite-Size Scaling Relation for the Order-Parameter Probability Distribution of the Six-Dimensional Ising Model

    NASA Astrophysics Data System (ADS)

    Merdan, Ziya; Karakuş, Özlem

    2016-11-01

    The six-dimensional Ising model with nearest-neighbor pair interactions has been simulated and verified numerically on the Creutz Cellular Automaton using five-bit demons near the infinite-lattice critical temperature with linear dimensions L=4, 6, 8, 10. The order-parameter probability distribution for the six-dimensional Ising model has been calculated at the critical temperature. The constants of the analytical function have been estimated by fitting it to the probability function obtained numerically at the finite-size critical point.

  8. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
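
    The theory's central prediction can be shown with a toy calculation (the numbers are ours, not the paper's data): an estimate that "splits the difference" between the two conjuncts violates the probability calculus whenever the average exceeds the smaller conjunct, since P(A and B) can be at most min(P(A), P(B)).

```python
def splits_difference_estimate(p_a, p_b):
    """A naive conjunction estimate that averages the two conjunct probabilities."""
    return (p_a + p_b) / 2.0

def violates_conjunction_rule(p_a, p_b, estimate):
    """The calculus requires P(A and B) <= min(P(A), P(B))."""
    return estimate > min(p_a, p_b)

est = splits_difference_estimate(0.9, 0.3)      # averages to 0.6
bad = violates_conjunction_rule(0.9, 0.3, est)  # 0.6 > 0.3, so the estimate is incoherent
```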

  9. Descriptive and numeric estimation of risk for psychotic disorders among affected individuals and relatives: Implications for clinical practice

    PubMed Central

    Austin, Jehannine C.; Hippman, Catriona; Honer, William G.

    2013-01-01

    Studies show that individuals with psychotic illnesses and their families want information about psychosis risks for other relatives. However, deriving accurate numeric probabilities for psychosis risk is challenging, and people have difficulty interpreting probabilistic information, thus some have suggested that clinicians should use risk descriptors, such as ‘moderate’ or ‘quite high’, rather than numbers. Little is known about how individuals with psychosis and their family members use quantitative and qualitative descriptors of risk in the specific context of chance for an individual to develop psychosis. We explored numeric and descriptive estimations of psychosis risk among individuals with psychotic disorders and unaffected first-degree relatives. In an online survey, respondents numerically and descriptively estimated risk for an individual to develop psychosis in scenarios where they had: A) no affected family members; and B) an affected sibling. 219 affected individuals and 211 first-degree relatives participated. Affected individuals estimated significantly higher risks than relatives. Participants attributed all descriptors between “very low” and “very high” to probabilities of 1%, 10%, 25% and 50%+. For a given numeric probability, different risk descriptors were attributed in different scenarios. Clinically, brief interventions around risk (using either probabilities or descriptors alone) are vulnerable to miscommunication and potentially profoundly negative consequences; interventions around risk are best suited to in-depth discussion. PMID:22421074

  10. Descriptive and numeric estimation of risk for psychotic disorders among affected individuals and relatives: implications for clinical practice.

    PubMed

    Austin, Jehannine C; Hippman, Catriona; Honer, William G

    2012-03-30

    Studies show that individuals with psychotic illnesses and their families want information about psychosis risks for other relatives. However, deriving accurate numeric probabilities for psychosis risk is challenging, and people have difficulty interpreting probabilistic information; thus, some have suggested that clinicians should use risk descriptors, such as "moderate" or "quite high", rather than numbers. Little is known about how individuals with psychosis and their family members use quantitative and qualitative descriptors of risk in the specific context of chance for an individual to develop psychosis. We explored numeric and descriptive estimations of psychosis risk among individuals with psychotic disorders and unaffected first-degree relatives. In an online survey, respondents numerically and descriptively estimated risk for an individual to develop psychosis in scenarios where they had: A) no affected family members; and B) an affected sibling. Participants comprised 219 affected individuals and 211 first-degree relatives. Affected individuals estimated significantly higher risks than relatives. Participants attributed all descriptors between "very low" and "very high" to probabilities of 1%, 10%, 25% and 50%+. For a given numeric probability, different risk descriptors were attributed in different scenarios. Clinically, brief interventions around risk (using either probabilities or descriptors alone) are vulnerable to miscommunication and potentially negative consequences; interventions around risk are best suited to in-depth discussion. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Science Learning at Home: Involving Families

    ERIC Educational Resources Information Center

    Crawford, Elizabeth Outlaw; Heaton, Emily T.; Heslop, Karen; Kixmiller, Kassandra

    2009-01-01

    Families' involvement in their children's science learning at home has numerous benefits, especially when they support children's self-initiated investigations. In a position statement on parental involvement in science education, the National Science Teachers Association (NSTA 2009) stresses the role of parents in the daily reinforcement of…

  12. Proportional Change Index: An Alternative for Comparing Child Change Data.

    ERIC Educational Resources Information Center

    Wolery, Mark

    1983-01-01

    The Proportional Change Index (PCI), a numerical statement of the relationship between children's rate of development during intervention and their rate of development at the time intervention began, is proposed as a way of expressing child progress from developmental data. (Author/CL)

  13. Achieving better cooling of turbine blades using numerical simulation methods

    NASA Astrophysics Data System (ADS)

    Inozemtsev, A. A.; Tikhonov, A. S.; Sendyurev, C. I.; Samokhvalov, N. Yu.

    2013-02-01

    A new design of the first-stage nozzle vane for the turbine of a prospective gas-turbine engine is considered. The blade's thermal state is numerically simulated in a conjugate statement using the ANSYS CFX 13.0 software package. Critical locations in the blade design are determined from the distribution of heat fluxes, and measures aimed at achieving more efficient cooling are analyzed. A substantially lower (by 50-100°C) maximum metal temperature has been achieved as a result of this work.

  14. Laboratory Investigations and Numerical Modeling of Loss Mechanisms in Sound Propagation in Sandy Sediments

    DTIC Science & Technology

    2009-09-30

    …poroelastic medium,” submitted for publication in J. Acoust. Soc. Am. (2009). B.T. Hefner, D.R. Jackson, and J. Calantoni, “The effects of…”; B.T. Hefner and D.R. Jackson, “Dispersion and attenuation due to scattering from heterogeneities in the frame bulk modulus of a poroelastic medium,” submitted for publication in J. Acoust. Soc. Am. (2009). DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited. Laboratory Investigations and Numerical Modeling of Loss…

  15. Impact of Typhoons on the Western Pacific Ocean (ITOP) DRI: Numerical Modeling of Ocean Mixed Layer Turbulence and Entrainment at High Winds

    DTIC Science & Technology

    2013-09-30

    DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited. Impact of Typhoons on the Western Pacific Ocean (ITOP) DRI: Numerical Modeling of Ocean Mixed Layer Turbulence and Entrainment at High Winds. The measurement and modeling activities include a focus on the impact of surface waves, air-sea fluxes and the temperature, salinity and velocity…

  16. Views and Dreams: A Delphi Investigation into Library 2.0 Applications

    ERIC Educational Resources Information Center

    Bronstein, Jenny; Aharony, Noa

    2009-01-01

    The study's purpose was to investigate the views and opinions of librarians about the implementation of Web 2.0 technologies into library operations and services. The Delphi technique was chosen as the method of inquiry in this study, in which a group of panelists graded the desirability and probability of a list of statements. Thirty-nine…

  17. 75 FR 28652 - Certain Environmental Goods: Probable Economic Effect of Duty-Free Treatment for U.S. Imports...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ...; and Prepare several case studies on the competitive position of selected U.S. environmental goods... environmental goods of significant export and/or commercial interest to the United States. Each case study will... appear at the public hearing. September 14, 2010: Deadline for filing pre-hearing briefs and statements...

  18. 75 FR 26287 - Notice; Applications and Amendments to Facility Operating Licenses Involving Proposed No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-11

    ... amendment would not (1) involve a significant increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident... contention and a concise statement of the alleged facts or expert opinion which support the contention and on...

  19. Using Qualitative and Phenomenological Principles to Assess Stakeholders' Perceptions of Probability

    ERIC Educational Resources Information Center

    Newman, Isadore; Hitchcock, John H.; Nastasi, Bonnie K.

    2017-01-01

    Any attempt to influence behavior by sharing a research finding that makes a probabilistic statement (e.g., a p value) should necessarily entail consideration of how consumers of the information might interpret this information. Such consideration can be informed, at least in part, by applying phenomenological principles of inquiry. This does not…

  20. Ecological risk assessment to support fuels treatment project decisions

    Treesearch

    Jay O' Laughlin

    2010-01-01

    Risk is a combined statement of the probability that something of value will be damaged and some measure of the damage’s adverse effect. Wildfires burning in the uncharacteristic fuel conditions now typical throughout the Western United States can damage ecosystems and adversely affect environmental conditions. Wildfire behavior can be modified by prefire fuel...

  1. Publishing SNP genotypes of human embryonic stem cell lines: policy statement of the International Stem Cell Forum Ethics Working Party.

    PubMed

    Knoppers, Bartha M; Isasi, Rosario; Benvenisty, Nissim; Kim, Ock-Joo; Lomax, Geoffrey; Morris, Clive; Murray, Thomas H; Lee, Eng Hin; Perry, Margery; Richardson, Genevra; Sipp, Douglas; Tanner, Klaus; Wahlström, Jan; de Wert, Guido; Zeng, Fanyi

    2011-09-01

    Novel methods and associated tools permitting individual identification in publicly accessible SNP databases have become a debatable issue. There is growing concern that current technical and ethical safeguards to protect the identities of donors could be insufficient. In the context of human embryonic stem cell research, there are no studies focusing on the probability that an hESC line donor could be identified by analyzing published SNP profiles and associated genotypic and phenotypic information. We present the International Stem Cell Forum (ISCF) Ethics Working Party's Policy Statement on "Publishing SNP Genotypes of Human Embryonic Stem Cell Lines (hESC)". The Statement prospectively addresses issues surrounding the publication of genotypic data and associated annotations of hESC lines in open access databases. It proposes a balanced approach between the goals of open science and data sharing with the respect for fundamental bioethical principles (autonomy, privacy, beneficence, justice and research merit and integrity).

  2. Ethnocentrism is an unacceptable rationale for health care policy: a critique of transplant tourism position statements.

    PubMed

    Evans, R W

    2008-06-01

    Medical tourism has emerged as a global health care phenomenon, valued at $60 billion worldwide in 2006. Transplant tourism, unlike other more benign forms of medical tourism, has become a flashpoint within the transplant community, underscoring the uneasy relationships among science, religion, politics, ethics and international health care policies concerning the commercialization of transplantation. Numerous professional associations have drafted or issued position statements condemning transplant tourism. Often the criticism is misdirected. The real issue concerns both the source and circumstances surrounding the procurement of donor organs, including commercialization. Unfortunately, many of the position statements circulated to date represent an ethnocentric and decidedly western view of transplantation. As such, the merits of culturally insensitive policy statements issued by otherwise well-intended transplant professionals, and the organizations they represent, must be evaluated within the broader context of foreign relations and diplomacy, as well as cultural and ethical relativity. Having done so, many persons may find themselves reluctant to endorse statements that have produced a misleading social desirability bias, which, to a great extent, has impeded more thoughtful and inclusive deliberations on the issues. Therefore, instead of taking an official position on policy matters concerning the commercial aspects of transplantation, international professional associations should offer culturally respectful guidance.

  3. P values are only an index to evidence: 20th- vs. 21st-century statistical science.

    PubMed

    Burnham, K P; Anderson, D R

    2014-03-01

    Early statistical methods focused on pre-data probability statements (i.e., data as random variables) such as P values; these are not really inferences nor are P values evidential. Statistical science clung to these principles throughout much of the 20th century as a wide variety of methods were developed for special cases. Looking back, it is clear that the underlying paradigm (i.e., testing and P values) was weak. As Kuhn (1970) suggests, new paradigms have taken the place of earlier ones: this is a goal of good science. New methods have been developed and older methods extended and these allow proper measures of strength of evidence and multimodel inference. It is time to move forward with sound theory and practice for the difficult practical problems that lie ahead. Given data, the useful foundation shifts to post-data probability statements such as model probabilities (Akaike weights) or related quantities such as odds ratios and likelihood intervals. These new methods allow formal inference from multiple models in the a priori set. These quantities are properly evidential. The past century was aimed at finding the "best" model and making inferences from it. The goal in the 21st century is to base inference on all the models weighted by their model probabilities (model averaging). Estimates of precision can include model selection uncertainty leading to variances conditional on the model set. The 21st century will be about the quantification of information, proper measures of evidence, and multi-model inference. Nelder (1999:261) concludes, "The most important task before us in developing statistical science is to demolish the P-value culture, which has taken root to a frightening extent in many areas of both pure and applied science and technology".
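
    The model probabilities (Akaike weights) mentioned above follow a standard formula: each model gets relative likelihood exp(-Δi/2), where Δi = AICi - AICmin, normalized to sum to one. A minimal sketch with invented AIC values:

```python
import math

def akaike_weights(aics):
    """Model probabilities (Akaike weights) from a list of AIC values."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]  # relative likelihoods exp(-delta/2)
    total = sum(rel)
    return [r / total for r in rel]

w = akaike_weights([100.0, 102.0, 110.0])  # lowest-AIC model gets the largest weight
```

    Post-data inference can then average over all models in the a priori set using these weights, rather than conditioning on a single "best" model.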

  4. Uncertain deduction and conditional reasoning.

    PubMed

    Evans, Jonathan St B T; Thompson, Valerie A; Over, David E

    2015-01-01

    There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of "uncertain deduction" should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks.
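
    The first of the two metrics, probabilistic validity, has a simple coherence form (a standard formulation in this literature; the function names are ours): an inference is p-valid if the conclusion's uncertainty (one minus its assigned probability) does not exceed the summed uncertainties of the premises.

```python
def is_p_valid(premise_probs, conclusion_prob):
    """Probabilistic validity: conclusion uncertainty <= sum of premise uncertainties."""
    premise_uncertainty = sum(1.0 - p for p in premise_probs)
    return (1.0 - conclusion_prob) <= premise_uncertainty

ok = is_p_valid([0.9, 0.8], 0.75)   # conclusion uncertainty 0.25 <= 0.3: coherent
bad = is_p_valid([0.9, 0.8], 0.5)   # conclusion uncertainty 0.5 > 0.3: incoherent
```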

  5. Comprehension of the description of side effects in drug information leaflets: a survey of doctors, pharmacists and lawyers.

    PubMed

    Ziegler, Andreas; Hadlak, Anka; Mehlbeer, Steffi; König, Inke R

    2013-10-01

    The German Federal Institute for Drugs and Medical Devices (Bundesinstitut für Arzneimittel und Medizinprodukte, BfArM) states that it uses standardized terms to describe the probabilities of side effects in drug information leaflets. It is unclear, however, whether these terms are actually understood correctly by doctors, pharmacists, and lawyers. A total of 1000 doctors, pharmacists, and lawyers were questioned by mail, and 60.4% of the questionnaires were filled out and returned. In the absence of any particular, potentially suggestive context, the respondents were asked to give a numerical interpretation of each of 20 verbal expressions of probability. Side effects were the subject of a hypothetical physician-patient case scenario. The respondents were also asked to give percentages that they felt corresponded to the terms "common," "uncommon," and "rare." The values obtained were compared with the intended values of the BfArM. The results obtained from the three professional groups resembled each other but stood in marked contrast to the BfArM definitions. With respect to side effects, the pharmacists matched the BfArM definitions most closely (5.8% "common," 1.9% "uncommon" and "rare"), followed by the physicians (3.5%, 0.3%, 0.9%) and the lawyers (0.7%, 0%, 0.7%). When the context of the side effects was not mentioned, the degree of agreement was much lower. Statements about the frequency of side effects are found in all drug information leaflets. Only a small minority of the respondents correctly stated the meaning of terms that are used to describe the frequency of occurrence of side effects, even though they routinely have to convey probabilities of side effects in the course of their professional duties. It can be concluded that the BfArM definitions of these terms do not, in general, correspond to their meanings in ordinary language.

  6. Abstract Numeric Relations and the Visual Structure of Algebra

    ERIC Educational Resources Information Center

    Landy, David; Brookes, David; Smout, Ryan

    2014-01-01

    Formal algebras are among the most powerful and general mechanisms for expressing quantitative relational statements; yet, even university engineering students, who are relatively proficient with algebraic manipulation, struggle with and often fail to correctly deploy basic aspects of algebraic notation (Clement, 1982). In the cognitive tradition,…

  7. 76 FR 23335 - Wilderness Stewardship Plan/Environmental Impact Statement, Sequoia and Kings Canyon National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-26

    ... planning and environmental impact analysis process required to inform consideration of alternative... 5, 1996. Based on an analysis of the numerous scoping comments received, and with consideration of a... proper food storage; party size; camping and campsites; human waste management; stock use; meadow...

  8. Environmental Statement for Lavon Dam and Reservoir Modification and East Fork Channel Improvement - Pertaining to East Fork Channel and Levee Improvement Increment I. Supplement.

    DTIC Science & Technology

    1977-01-01

    are capable of adapting to turbid conditions will probably be the dominant fish in the oxbows. The stream bottom dwelling population will not be much...the structure of the benthic community. Snails (gastropods) and bivalve mollusks (pelecypods) are most abundant in the shallow areas. Stable gravel

  9. Advances in threat assessment and their application to forest and rangeland management—Volume 2

    Treesearch

    H. Michael Rauscher; Yasmeen Sands; Danny C. Lee; Jerome S. Beatty

    2010-01-01

    Risk is a combined statement of the probability that something of value will be damaged and some measure of the damage’s adverse effect. Wildfires burning in the uncharacteristic fuel conditions now typical throughout the Western United States can damage ecosystems and adversely affect environmental conditions. Wildfire behavior can be modified by prefire fuel...
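
    The "combined statement" definition above is commonly formalized as expected adverse effect. The sketch below is an illustration of that reading, not the volume's own method; the function name and all numbers are made up.

```python
def risk(p_damage, damage_measure):
    """One common formalization of risk as a combined statement:
    probability that something of value is damaged, times a measure
    of the damage's adverse effect. (The abstract does not commit
    to a specific formula; this product form is an assumption.)"""
    return p_damage * damage_measure

# Example: a 5% chance of losing habitat valued at 200 (arbitrary
# units) gives a risk of 0.05 * 200 = 10 units.
expected_loss = risk(0.05, 200)
```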

  10. A Sensitivity Analysis of Circular Error Probable Approximation Techniques

    DTIC Science & Technology

    1992-03-01

    SENSITIVITY ANALYSIS OF CIRCULAR ERROR PROBABLE APPROXIMATION TECHNIQUES THESIS Presented to the Faculty of the School of Engineering of the Air Force...programming skills. Major Paul Auclair patiently advised me in this endeavor, and Major Andy Howell added numerous insightful contributions. I thank my...techniques. The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. Some

  11. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD statement.

    PubMed

    Collins, G S; Reitsma, J B; Altman, D G; Moons, K G M

    2015-01-20

    Prediction models are developed to aid health-care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health-care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org).

  12. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD Statement.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-02-01

    Prediction models are developed to aid healthcare providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision-making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) initiative developed a set of recommendations for the reporting of studies developing, validating or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, healthcare professionals and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). © 2015 Stichting European Society for Clinical Investigation Journal Foundation.

  13. Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD): the TRIPOD Statement.

    PubMed

    Collins, G S; Reitsma, J B; Altman, D G; Moons, K G M

    2015-02-01

    Prediction models are developed to aid healthcare providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision-making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a web-based survey and revised during a 3-day meeting in June 2011 with methodologists, healthcare professionals and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. A complete checklist is available at http://www.tripod-statement.org. © 2015 American College of Physicians.

  14. Transparent Reporting of a multivariable prediction model for Individual Prognosis or Diagnosis (TRIPOD): the TRIPOD statement.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-01-06

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org).

  15. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): The TRIPOD statement

    PubMed Central

    Collins, G S; Reitsma, J B; Altman, D G; Moons, K G M

    2015-01-01

    Prediction models are developed to aid health-care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health-care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). PMID:25562432

  16. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD statement.

    PubMed

    Collins, G S; Reitsma, J B; Altman, D G; Moons, K G M

    2015-02-01

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). © 2015 Royal College of Obstetricians and Gynaecologists.

  17. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD statement. The TRIPOD Group.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-01-13

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). © 2015 The Authors.

  18. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD Statement.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-01-06

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org).

  19. Transparent reporting of a multivariable prediction model for Individual Prognosis or Diagnosis (TRIPOD): the TRIPOD statement.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-02-01

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Numerical Simulation of Cast Distortion in Gas Turbine Engine Components

    NASA Astrophysics Data System (ADS)

    Inozemtsev, A. A.; Dubrovskaya, A. S.; Dongauser, K. A.; Trufanov, N. A.

    2015-06-01

    In this paper the process of manufacturing multiple airfoil vanes through investment casting is considered. A mathematical model of the full contact problem is built to determine the stress-strain state in a cast during solidification. Studies are carried out in a viscoelastoplastic problem statement. Numerical simulation of the explored process is implemented with the ProCAST software package. The results of the simulation are compared with the real production process. By means of computer analysis, the technical process parameters are optimized in order to eliminate the defect of cast wall thickness variation.

  1. Improbable Outcomes: Infrequent or Extraordinary?

    ERIC Educational Resources Information Center

    Teigen, Karl Halvor; Juanchich, Marie; Riege, Anine H.

    2013-01-01

    Research on verbal probabilities has shown that "unlikely" or "improbable" events are believed to correspond to numerical probability values between 10% and 30%. However, building on a pragmatic approach of verbal probabilities and a new methodology, the present paper shows that unlikely outcomes are most often associated with outcomes that have a…

  2. On Equivalence between Critical Probabilities of Dynamic Gossip Protocol and Static Site Percolation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Tetsuya; Hayakawa, Tomohisa

    The relationship between the critical probability of gossip protocol on the square lattice and the critical probability of site percolation on the square lattice is discussed. Specifically, these two critical probabilities are analytically shown to be equal to each other. Furthermore, we present a way of evaluating the critical probability of site percolation by approximating the saturation of gossip protocol. Finally, we provide numerical results which support the theoretical analysis.
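
    The static side of the comparison above can be illustrated with a standard Monte Carlo estimate of the site-percolation spanning probability on a square lattice. This sketch is not the authors' analytical method; the function names and lattice sizes are illustrative, and the threshold value p_c ≈ 0.5927 is the known numerical estimate for square-lattice site percolation.

```python
import random

def percolates(grid):
    """Depth-first search from occupied top-row sites; True if a path of
    occupied nearest-neighbour sites reaches the bottom row."""
    n = len(grid)
    seen = set()
    stack = [(0, j) for j in range(n) if grid[0][j]]
    while stack:
        i, j = stack.pop()
        if (i, j) in seen:
            continue
        seen.add((i, j))
        if i == n - 1:
            return True  # spanning cluster found
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and grid[a][b] and (a, b) not in seen:
                stack.append((a, b))
    return False

def spanning_probability(p, n=32, trials=200, rng=None):
    """Fraction of random n-by-n lattices (each site open independently
    with probability p) containing a top-to-bottom spanning cluster."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

# Well below the threshold p_c ~ 0.5927 spanning is rare; well above
# it, spanning is almost certain:
low = spanning_probability(0.40, n=24, trials=100)
high = spanning_probability(0.75, n=24, trials=100)
```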

  3. Uncertain deduction and conditional reasoning

    PubMed Central

    Evans, Jonathan St. B. T.; Thompson, Valerie A.; Over, David E.

    2015-01-01

    There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of “uncertain deduction” should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks. PMID:25904888

  4. Deriving Laws from Ordering Relations

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

    The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics.
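
    The sum rule, product rule, and Bayes' theorem named above can be checked numerically on a small finite joint distribution. The sketch below uses made-up probabilities over two binary propositions; it illustrates the rules themselves, not Cox's derivation of them.

```python
# Joint distribution over two binary propositions A and B.
# The four numbers are arbitrary (they sum to 1).
joint = {(True, True): 0.2, (True, False): 0.3,
         (False, True): 0.1, (False, False): 0.4}

def p_a(a):
    """Marginal via the sum rule: P(A) = sum over B of P(A, B)."""
    return sum(v for (x, _), v in joint.items() if x == a)

def p_b(b):
    """Marginal via the sum rule: P(B) = sum over A of P(A, B)."""
    return sum(v for (_, y), v in joint.items() if y == b)

def p_b_given_a(b, a):
    """Product rule rearranged: P(B | A) = P(A, B) / P(A)."""
    return joint[(a, b)] / p_a(a)

def bayes(a, b):
    """Bayes' theorem: P(A | B) = P(B | A) P(A) / P(B)."""
    return p_b_given_a(b, a) * p_a(a) / p_b(b)

# Consistency check: Bayes must agree with conditioning on B directly.
assert abs(bayes(True, True) - joint[(True, True)] / p_b(True)) < 1e-12
```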

  5. Seed zones for maintaining adapted plant populations

    Treesearch

    J. Bradley St. Clair; G. Randy Johnson; Vicky J. Erickson; Richard C. Johnson; Nancy L. Shaw

    2007-01-01

    Seed zones delineate areas within which plant materials can be transferred with little risk that they will be poorly adapted to their new location. They ensure successful restoration and revegetation, and help maintain the integrity of natural genetic structure. The value of seed zones is recognized in numerous policy statements from federal and state agencies. Results...

  6. North Korea: Back on the Terrorism List

    DTIC Science & Technology

    2010-05-24

    fabrication.” Pyongyang Korean Central Broadcasting Station, “DPRK NDC Spokesman’s Statement on ROK’s Sunken Ship Investigation Results,” May 20...underground military facilities, including tunnels and bunkers. Takashi Arimoto, Washington correspondent for the Japanese newspaper, Sankei Shimbun, has...underground tunnel with numerous assembly points that Hezbollah 71 Daniel Michaels and

  7. Lens Ray Diagrams with a Spreadsheet

    ERIC Educational Resources Information Center

    González, Manuel I.

    2018-01-01

    Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful…

  8. 10 CFR Appendix A to Subpart B of... - Policy Statement for Electric Motors Covered Under the Energy Policy and Conservation Act

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... examples, most motors currently in production, or to be designed in the future, could probably be... motor by type, model number, and date of design or production; (2) the name of the original equipment... standards. 1 The term “manufacture” means “to manufacture, produce, assemble or import.” EPCA § 321(10...

  9. A Study on Detecting of Differential Item Functioning of PISA 2006 Science Literacy Items in Turkish and American Samples

    ERIC Educational Resources Information Center

    Çikirikçi Demirtasli, Nükhet; Ulutas, Seher

    2015-01-01

    Problem Statement: Item bias occurs when individuals from different groups (different gender, cultural background, etc.) have different probabilities of responding correctly to a test item despite having the same skill levels. It is important that tests or items do not have bias in order to ensure the accuracy of decisions taken according to test…

  10. The probability heuristics model of syllogistic reasoning.

    PubMed

    Chater, N; Oaksford, M

    1999-03-01

    A probability heuristic model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability-based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
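
    The min-heuristic described above reduces to a one-line rule once the informativeness ordering of statement types is fixed. The sketch below assumes the ordering All > Most > Few > Some > None > Some-not usually attributed to PHM; the ordering and the "Some-not" label are taken here as assumptions, and the function name is illustrative.

```python
# Assumed informativeness ordering of quantified statement types,
# from most to least informative (an assumption, not quoted from
# the abstract).
INFORMATIVENESS = ["All", "Most", "Few", "Some", "None", "Some-not"]

def min_heuristic(premise1, premise2):
    """Min-heuristic: the conclusion takes the type of the LEAST
    informative premise, i.e. the one later in the ordering."""
    return max(premise1, premise2, key=INFORMATIVENESS.index)

# min_heuristic("All", "Some") -> "Some": the conclusion inherits
# the weaker (less informative) premise type.
```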

  11. The California Earthquake Advisory Plan: A history

    USGS Publications Warehouse

    Roeloffs, Evelyn A.; Goltz, James D.

    2017-01-01

    Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October, 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution.

  12. Key health themes and reporting of numerical cigarette-waterpipe equivalence in online news articles reporting on waterpipe tobacco smoking: a content analysis.

    PubMed

    Jawad, Mohammed; Bakir, Ali M; Ali, Mohammed; Jawad, Sena; Akl, Elie A

    2015-01-01

    There is anecdotal evidence that health messages interpreted from waterpipe tobacco smoking (WTS) research are inconsistent, such as comparing the health effects of one WTS session with that of 100 cigarettes. This study aimed to identify key health themes about WTS discussed by online news media, and how numerical cigarette-waterpipe equivalence (CWE) was being interpreted. We identified 1065 online news articles published between March 2011 and September 2012 using the 'Google Alerts' service. We screened for health themes, assessed statements mentioning CWE and reported differences between countries. We used logistic regression to identify factors associated with articles incorrectly reporting a CWE equal to or greater than 100 cigarettes, in the absence of any comparative parameter ('CWE ≥100 cigarettes'). Commonly mentioned health themes were the presence of tobacco (67%) and being as bad as cigarettes (49%), and we report on differences between countries. While 10.8% of all news articles contained at least one positive health theme, 22.9% contained a statement about a CWE. Most of these (18.6% total) were incorrectly a CWE ≥100 cigarettes, a quarter of which were made by healthcare professionals/organisations. Compared with the Middle East, articles from the USA and the UK were the most significant predictors to contain a CWE ≥100 cigarettes statement. Those wishing to write or publish information related to WTS may wish to avoid comparing WTS to cigarettes using numerical values as this is a major source of confusion. Future research is needed to address the impact of the media on the attitudes, initiation and cessation rates of waterpipe smokers. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  13. Communicating confidence in the detection and attribution of trends relevant to climate change

    NASA Astrophysics Data System (ADS)

    Ebi, K. L.

    2015-12-01

Readily understandable and consistent language for describing confidence in detection and attribution statements can be developed based on the approach used by the International Agency for Research on Cancer (IARC). IARC was founded in 1965 to provide government authorities with expert, independent, scientific opinion on the causes of human cancer. IARC developed four standard terms for evaluations of the strength of evidence for carcinogenicity arising from human and experimental animal data, and for the strength of mechanistic evidence. Evidence is categorized as sufficient, limited, inadequate, or suggesting lack of carcinogenicity. The IARC process then combines theory, evidence, and degree of agreement into a summary evaluation that includes concise statements of the principal line(s) of argument that emerged, the conclusions of the working group on the strength of the evidence for each group of studies, citations to indicate which studies were pivotal to these conclusions, and the reasons for any differential weighting of data. The summary IARC categories are: Group 1 for agents carcinogenic to humans; Group 2, comprising Group 2A (probably carcinogenic to humans) and Group 2B (possibly carcinogenic to humans), on the basis of epidemiological and experimental evidence of carcinogenicity and mechanistic and other relevant data; Group 3 for agents not classifiable as to their carcinogenicity to humans; and Group 4 for agents probably not carcinogenic to humans. There are obvious parallels between describing confidence in key findings on detection and attribution of a trend to anthropogenic climate change and the confidence statements used by the IARC. Developing, and consistently applying, similar categories, along with accompanying explanations of the principal lines of evidence, would be a helpful step in clearly communicating the degree and sources of certainty in the findings of detection and attribution.

  14. Bayesian Estimation of Small Effects in Exercise and Sports Science.

    PubMed

    Mengersen, Kerrie L; Drovandi, Christopher C; Robert, Christian P; Pyne, David B; Gore, Christopher J

    2016-01-01

The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects and, in a case-study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL) and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to be able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using the 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
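Probabilistic statements of this kind (e.g. a 0.96 probability that LHTL outperforms IHE by a worthwhile margin) are computed directly from posterior draws. A minimal sketch, assuming hypothetical normal posterior approximations for the two treatment effects; the numbers below are illustrative placeholders, not the study's estimates:

```python
import random

random.seed(42)
N = 100_000  # number of posterior draws

# Hypothetical posterior approximations for the change in hemoglobin mass (%):
# LHTL ~ N(4.0, 1.0), IHE ~ N(1.0, 1.0); threshold = smallest worthwhile effect.
threshold = 1.0
count = 0
for _ in range(N):
    lhtl = random.gauss(4.0, 1.0)
    ihe = random.gauss(1.0, 1.0)
    if lhtl - ihe > threshold:
        count += 1

p_substantial = count / N  # P(LHTL beats IHE by more than the threshold)
print(f"P(substantially greater increase under LHTL) ≈ {p_substantial:.3f}")
```

Under these assumed posteriors the difference is N(3.0, √2), so the probability is about 0.92; in a full analysis the draws would come from an MCMC sampler rather than closed-form normals.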

  15. [A modification of the Gompertz plot resulting from the age index by Ries and an approximation of the survivorship curve (author's transl)].

    PubMed

    Lohmann, W

    1978-01-01

The shape of the survivorship curve can easily be interpreted on the condition that the probability of death is proportional to an exponentially rising function of ageing. Following the summation used by Ries to determine the age index, we investigated to what extent the survivorship curve may be approximated by a sum of exponentials. It turns out that, for plausible parameter values, the difference between the pure exponential function and a sum of exponentials lies within the random variation. Because the probability of death varies between diseases, the new statement is the better one.

  16. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    PubMed

    Fennell, John; Baddeley, Roland

    2012-10-01

Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects: overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
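The core mechanism can be sketched in a few lines: if a stated probability p is treated as evidence (say, p·n successes in n notional trials) and combined with a uniform Beta(1, 1) prior via Bayes' rule, the posterior mean already overweights small probabilities and underweights large ones. The trial count n is an assumed evidence-strength parameter, not a quantity from the paper:

```python
def bayesian_weight(p: float, n: float = 20.0, a: float = 1.0, b: float = 1.0) -> float:
    """Posterior mean of a Beta(a, b) prior updated with p*n successes in n trials."""
    k = p * n  # pseudo-count of successes implied by the stated probability
    return (k + a) / (n + a + b)

print(bayesian_weight(0.05))  # > 0.05: low probabilities are overweighted
print(bayesian_weight(0.95))  # < 0.95: high probabilities are underweighted
print(bayesian_weight(0.50))  # = 0.50: the midpoint is left unchanged
```

Smaller n (weaker trust in the stated probability) pulls the weighting function harder toward the prior mean, which is how the paper's robust/efficient trade-off enters.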

  17. Health Professionals Prefer to Communicate Risk-Related Numerical Information Using "1-in-X" Ratios.

    PubMed

    Sirota, Miroslav; Juanchich, Marie; Petrova, Dafina; Garcia-Retamero, Rocio; Walasek, Lukasz; Bhatia, Sudeep

    2018-04-01

Previous research has shown that format effects, such as the "1-in-X" effect-whereby "1-in-X" ratios lead to a higher perceived probability than "N-in-N*X" ratios-alter perceptions of medical probabilities. We do not know, however, how prevalent this effect is in practice; i.e., how often health professionals use the "1-in-X" ratio. We assembled 4 different sources of evidence, involving experimental work and corpus studies, to examine the use of "1-in-X" and other numerical formats quantifying probability. Our results revealed that the use of the "1-in-X" ratio is prevalent and that health professionals prefer this format compared with other numerical formats (i.e., the "N-in-N*X", %, and decimal formats). In Study 1, UK family physicians preferred to communicate prenatal risk using a "1-in-X" ratio (80.4%, n = 131) across different risk levels and regardless of patients' numeracy levels. In Study 2, a sample from the UK adult population (n = 203) reported that most GPs (60.6%) preferred to use "1-in-X" ratios compared with other formats. In Study 3, "1-in-X" ratios were the most commonly used format in a set of randomly sampled drug leaflets describing the risk of side effects (100%, n = 94). In Study 4, the "1-in-X" format was the most commonly used numerical expression of medical probabilities or frequencies on the UK's NHS website (45.7%, n = 2,469 sentences). The prevalent use of "1-in-X" ratios magnifies the chances of inflated subjective probability perceptions. Further research should establish the clinical significance of the "1-in-X" effect.
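The two competing formats are easy to generate programmatically; a small helper (an illustration of the formats, not tooling from the study) converts a probability into the "1-in-X" and "N-in-N*X" framings whose perceptual effects the paper compares:

```python
def one_in_x(p: float) -> str:
    """Express a probability as a '1-in-X' ratio, e.g. 0.004 -> '1 in 250'."""
    return f"1 in {round(1 / p)}"

def n_in_nx(p: float, n: int) -> str:
    """Express the same probability as an 'N-in-N*X' ratio, e.g. '4 in 1000'."""
    return f"{n} in {round(n / p)}"

risk = 0.004  # hypothetical side-effect risk
print(one_in_x(risk))      # the framing the '1-in-X' effect says feels riskier
print(n_in_nx(risk, 4))    # numerically identical, perceived as less likely
```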

  18. What Do We Know about School Mental Health Promotion Programmes for Children and Youth?

    ERIC Educational Resources Information Center

    O'Mara, Linda; Lind, Candace

    2013-01-01

    There are numerous studies of school mental health promotion and primary prevention and many reviews of these studies; however, no clear consensus statement has emerged regarding school mental health promotion other than that child mental health is an important area that should be addressed in schools. This integrative review seeks to address this…

  19. Strategies for Civilian-Military Communication

    DTIC Science & Technology

    2013-04-01

    minimum standards in life-saving areas of humanitarian response. OCHA has a website with numerous publications dedicated to humanitarian civ-mil...United States Army War College Class of 2013 DISTRIBUTION STATEMENT: A Approved for Public Release Distribution is Unlimited This... Schools , 3624 Market Street, Philadelphia, PA 19104, (215) 662-5606. The Commission on Higher Education is an institutional accrediting agency

  20. Numerical Solution of the Problem of the Expansion of the Universe in the Schwarzschild Metric

    NASA Astrophysics Data System (ADS)

    Vasenin, I. M.; Goiko, V. L.

    2018-02-01

The statement and solution of the problem of the expansion of the Universe in nonstationary spherically-symmetric coordinates in the Schwarzschild metric are considered, with pressure neglected. The observational data of astronomers on the recession rates of distant stars are explained on the basis of the solutions obtained.

  1. Improving Emergency Management by Modeling Ant Colonies

    DTIC Science & Technology

    2015-03-01

    LEFT BLANK vii TABLE OF CONTENTS I.  THE INCIDENT COMMAND SYSTEM AND AUTONOMOUS ACTORS ......1  A.  PROBLEM STATEMENT...managerial level tasking.12 The Oklahoma City bombing has generally been viewed as a success for the ICS model; however, there were numerous occurrences...developed. The youngest generation of ant 25 Bert Holldobler and Edward O. Wilson, The Ants

  2. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  3. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  4. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  5. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  6. The Social Problem-Solving Questionnaire: Evaluation of Psychometric Properties among Turkish Primary School Students

    ERIC Educational Resources Information Center

    Dereli Iman, Esra

    2013-01-01

    Problem Statement: Children, like adults, face numerous problems and conflicts in their everyday lives, including issues with peers, siblings, older children, parents, teachers, and other adults. The methods children use to solve such problems are more important than actually facing the problems. The lack of effective social problem-solving skills…

  7. Fluctuation relation for heat exchange in Markovian open quantum systems

    NASA Astrophysics Data System (ADS)

    Ramezani, M.; Golshani, M.; Rezakhani, A. T.

    2018-04-01

A fluctuation relation for the heat exchange of an open quantum system under a thermalizing Markovian dynamics is derived. We show that the probability that the system absorbs an amount of heat from its bath, in a given time interval, divided by the probability of the reverse process (releasing the same amount of heat to the bath) is given by an exponential factor which depends on the amount of heat and the difference between the temperatures of the system and the bath. Interestingly, this relation is akin to the standard form of the fluctuation relation (for forward-backward dynamics). We also argue that the probability of violating the second law of thermodynamics in the form of the Clausius statement (i.e., net heat transfer from a cold system to its hot bath) drops exponentially with both the amount of heat and the temperature difference between the system and its bath.
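The stated ratio can be checked numerically in the simplest setting: a two-level system and a two-level bath, each in a Gibbs state, exchanging one energy quantum through a swap. This toy model is our own illustration of the relation, not the paper's general derivation for thermalizing Markovian dynamics:

```python
import math

def excited_pop(beta: float, eps: float) -> float:
    """Excited-state population of a two-level Gibbs state with level gap eps."""
    return math.exp(-beta * eps) / (1.0 + math.exp(-beta * eps))

beta_sys, beta_bath, eps = 2.0, 0.5, 1.3  # inverse temperatures and level gap

p_s = excited_pop(beta_sys, eps)
p_b = excited_pop(beta_bath, eps)
p_absorb = (1 - p_s) * p_b   # system in ground state, bath excited: system gains eps
p_release = p_s * (1 - p_b)  # reverse process: system loses eps to the bath

ratio = p_absorb / p_release
predicted = math.exp((beta_sys - beta_bath) * eps)  # the exponential factor
print(ratio, predicted)  # the two agree
```

The populations' normalization factors cancel in the ratio, leaving exactly exp((β_sys − β_bath)·ε), i.e. an exponential in the amount of heat and the temperature difference.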

  8. Fluctuation relation for heat exchange in Markovian open quantum systems.

    PubMed

    Ramezani, M; Golshani, M; Rezakhani, A T

    2018-04-01

A fluctuation relation for the heat exchange of an open quantum system under a thermalizing Markovian dynamics is derived. We show that the probability that the system absorbs an amount of heat from its bath, in a given time interval, divided by the probability of the reverse process (releasing the same amount of heat to the bath) is given by an exponential factor which depends on the amount of heat and the difference between the temperatures of the system and the bath. Interestingly, this relation is akin to the standard form of the fluctuation relation (for forward-backward dynamics). We also argue that the probability of violating the second law of thermodynamics in the form of the Clausius statement (i.e., net heat transfer from a cold system to its hot bath) drops exponentially with both the amount of heat and the temperature difference between the system and its bath.

  9. Interviewing Children Versus Tossing Coins: Accurately Assessing the Diagnosticity of Children’s Disclosures of Abuse

    PubMed Central

    LYON, THOMAS D.; AHERN, ELIZABETH C.; SCURICH, NICHOLAS

    2014-01-01

    We describe a Bayesian approach to evaluating children’s abuse disclosures and review research demonstrating that children’s disclosure of genital touch can be highly probative of sexual abuse, with the probative value depending on disclosure spontaneity and children’s age. We discuss how some commentators understate the probative value of children’s disclosures by: confusing the probability of abuse given disclosure with the probability of disclosure given abuse, assuming that children formally questioned about sexual abuse have a low prior probability of sexual abuse, misstating the probative value of abuse disclosure, and confusing the distinction between disclosure and nondisclosure with the distinction between true and false disclosures. We review interviewing methods that increase the probative value of disclosures, including interview instructions, narrative practice, noncontingent reinforcement, and questions about perpetrator/caregiver statements and children’s reactions to the alleged abuse. PMID:22339423
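The first confusion the authors flag, mixing up P(abuse | disclosure) with P(disclosure | abuse), is a straightforward Bayes' rule error. A sketch with hypothetical rates (the numbers are placeholders for illustration, not estimates from the review) makes the distinction concrete:

```python
def posterior(prior: float, p_d_given_a: float, p_d_given_not_a: float) -> float:
    """P(abuse | disclosure) via Bayes' rule."""
    joint_true = prior * p_d_given_a             # abused and disclosed
    joint_false = (1 - prior) * p_d_given_not_a  # not abused but disclosed
    return joint_true / (joint_true + joint_false)

# Hypothetical values: prior probability of abuse among children questioned,
# disclosure rate given abuse, and false-disclosure rate given no abuse.
p = posterior(prior=0.5, p_d_given_a=0.43, p_d_given_not_a=0.01)
print(f"P(abuse | disclosure) = {p:.3f}")  # far from P(disclosure | abuse) = 0.43
```

A modest disclosure rate given abuse is compatible with a very high probability of abuse given disclosure, provided false disclosures are rare; the two conditionals answer different questions.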

  10. An extended car-following model considering random safety distance with different probabilities

    NASA Astrophysics Data System (ADS)

    Wang, Jufeng; Sun, Fengxin; Cheng, Rongjun; Ge, Hongxia; Wei, Qi

    2018-02-01

Because of differences in vehicle type and driving skill, driving strategies are not exactly the same. The driving speeds of different vehicles may differ for the same headway. Since the optimal velocity function is determined by the safety distance, besides the maximum velocity and headway, an extended car-following model accounting for a random safety distance with different probabilities is proposed in this paper. The linear stability condition for this extended traffic model is obtained by using linear stability theory. Numerical simulations are carried out to explore the complex phenomena resulting from multiple safety distances in the optimal velocity function. The cases of multiple types of safety distances selected with different probabilities are presented. Numerical results show that traffic flow with multiple safety distances with different probabilities will be more unstable than that with a single type of safety distance, and will result in more stop-and-go phenomena.
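The mechanism of the extended model, an optimal-velocity function whose safety distance is drawn at random per vehicle, can be sketched on a ring road. The tanh optimal-velocity form is the standard choice in this literature, but the parameter values and the two-point safety-distance distribution below are generic illustrations, not the paper's calibration:

```python
import math
import random

random.seed(7)
V_MAX, A_SENS, DT, STEPS = 2.0, 1.0, 0.1, 500

def optimal_velocity(headway: float, h_safe: float) -> float:
    """Generic optimal-velocity function with safety distance h_safe."""
    return 0.5 * V_MAX * (math.tanh(headway - h_safe) + math.tanh(h_safe))

# Each driver gets one of two safety distances with different probabilities.
n_cars, ring = 20, 80.0
h_safe = [2.0 if random.random() < 0.7 else 3.0 for _ in range(n_cars)]
x = [i * ring / n_cars for i in range(n_cars)]  # uniform initial spacing
v = [0.5] * n_cars

for _ in range(STEPS):
    headways = [(x[(i + 1) % n_cars] - x[i]) % ring for i in range(n_cars)]
    v = [vi + A_SENS * (optimal_velocity(h, hs) - vi) * DT
         for vi, h, hs in zip(v, headways, h_safe)]
    x = [(xi + vi * DT) % ring for xi, vi in zip(x, v)]

print(min(v), max(v))  # heterogeneous safety distances perturb the uniform flow
```

Vehicles with the larger safety distance settle on lower speeds for the same headway, which is the seed of the extra instability the paper reports.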

  11. Fall 2014 SEI Research Review Probabilistic Analysis of Time Sensitive Systems

    DTIC Science & Technology

    2014-10-28

    Osmosis SMC Tool Osmosis is a tool for Statistical Model Checking (SMC) with Semantic Importance Sampling. • Input model is written in subset of C...ASSERT() statements in model indicate conditions that must hold. • Input probability distributions defined by the user. • Osmosis returns the...on: – Target relative error, or – Set number of simulations Osmosis Main Algorithm 1 http://dreal.cs.cmu.edu/ (?⃑?): Indicator

  12. Land Tenure Adjustment Project. Preliminary Draft Environmental Impact Statement/Report

    DTIC Science & Technology

    1987-04-16

southern California render the probability of major agricultural production in the impact area unlikely. Extractive industry in the impact area is neither...American Indians, Vol. 8: California, ed. Robert F. Heizer. Smithsonian Institution, Washington, D.C. Bean, Lowell John, and Sylvia Brakke Vane. 1979...American Indians, Vol. 8: California, ed. Robert F. Heizer, pp. 567-569. Smithsonian Institution, Washington, D.C. LTA REVISION 2 B-2 California Air

  13. Numerical modeling of the destruction of steel plates with a gradient substrate

    NASA Astrophysics Data System (ADS)

    Orlov, M. Yu.; Glazyrin, V. P.; Orlov, Yu. N.

    2017-10-01

The paper presents the results of numerical simulation of the shock loading of steel barriers with a gradient substrate. In an elastic-plastic axisymmetric statement, impact along the normal is simulated over a range of initial velocities up to 300 m/s. A range of initial velocities was identified in which the presence of the substrate "saved" the barrier from spallation. New problems were formulated to deepen scientific knowledge of the behavior of unidirectional gradient barriers under impact. The results of the calculations are presented as graphs, tables, and computed configurations of the impactor-barrier system.

  14. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

Operability is a driving requirement for next generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design Principles and adds new measures for Safety and Efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures which are rated with the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.

  15. [What kind of information do German health information pamphlets provide on mammography screening?].

    PubMed

    Kurzenhäuser, Stephanie

    2003-02-01

    To make an informed decision on participation in mammography screening, women have to be educated about all the risks and benefits of the procedure in a manner that is detailed and understandable. But an analysis of 27 German health pamphlets on mammography screening shows that many relevant pieces of information about the benefits, the risks, and especially the meaning of screening results are only insufficiently communicated. Many statements were presented narratively rather than as precise statistics. Depending on content, 17 to 62% of the quantifiable statements were actually given as numerical data. To provide comprehensive information and to avoid misunderstandings, it is necessary to supplement the currently available health pamphlets and make the information on mammography screening more precise.

  16. Preferred reporting items for studies mapping onto preference-based outcome measures: The MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-01-01

'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprising six health economists and one Delphi methodologist. Following a two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time.

  17. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite the publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprising six health economists and one Delphi methodologist. Following a two-round modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  18. Probability distributions of hydraulic conductivity for the hydrogeologic units of the Death Valley regional ground-water flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Sweetkind, Donald S.; Elliott, Peggy E.

    2002-01-01

The use of geologic information such as lithology and rock properties is important to constrain conceptual and numerical hydrogeologic models. This geologic information is difficult to apply explicitly to numerical modeling and analyses because it tends to be qualitative rather than quantitative. This study uses a compilation of hydraulic-conductivity measurements to derive estimates of the probability distributions for several hydrogeologic units within the Death Valley regional ground-water flow system, a geologically and hydrologically complex region underlain by basin-fill sediments and volcanic, intrusive, sedimentary, and metamorphic rocks. Probability distributions of hydraulic conductivity for general rock types have been studied previously; however, this study provides a more detailed definition of hydrogeologic units based on lithostratigraphy, lithology, alteration, and fracturing, and compares the probability distributions to the aquifer-test data. Results suggest that these probability distributions can be used for studies involving, for example, numerical flow modeling, recharge, evapotranspiration, and rainfall runoff, both for the hydrogeologic units in the region and for similar rock types elsewhere. Within the study area, fracturing appears to have the greatest influence on the hydraulic conductivity of carbonate bedrock hydrogeologic units. Similar to earlier studies, we find that alteration and welding in the Tertiary volcanic rocks greatly influence hydraulic conductivity. As alteration increases, hydraulic conductivity tends to decrease. Increasing degrees of welding appear to increase hydraulic conductivity because welding increases the brittleness of the volcanic rocks, thus increasing the amount of fracturing.
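Hydraulic conductivity within a unit is conventionally modeled as lognormally distributed, so each unit's probability distribution can be summarized by the mean and standard deviation of log10 K. A minimal sketch with made-up measurements (m/day) standing in for one compiled unit; the values are hypothetical, not from the USGS compilation:

```python
import math
import statistics

# Hypothetical hydraulic-conductivity measurements for one unit (m/day).
k_measurements = [0.003, 0.02, 0.09, 0.4, 1.5, 6.0, 25.0]

log_k = [math.log10(k) for k in k_measurements]
mu = statistics.mean(log_k)
sigma = statistics.stdev(log_k)
geo_mean = 10 ** mu  # geometric mean, the natural central value for lognormal K

print(f"log10 K: mean = {mu:.2f}, std = {sigma:.2f}")
print(f"geometric mean K = {geo_mean:.3f} m/day")
```

The (mu, sigma) pair defines the unit's lognormal distribution, which can then be sampled directly in a stochastic flow model.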

  19. Computation of rare transitions in the barotropic quasi-geostrophic equations

    NASA Astrophysics Data System (ADS)

    Laurie, Jason; Bouchet, Freddy

    2015-01-01

We investigate the theoretical and numerical computation of rare transitions in simple geophysical turbulent models. We consider the barotropic quasi-geostrophic and two-dimensional Navier-Stokes equations in regimes where bistability between two coexisting large-scale attractors exists. By means of large deviations and instanton theory, with the use of an Onsager-Machlup path integral formalism for the transition probability, we show how one can directly compute the most probable transition path between two coexisting attractors, analytically in an equilibrium (Langevin) framework and numerically otherwise. We adapt a class of numerical optimization algorithms known as minimum action methods to simple geophysical turbulent models. We show that by numerically minimizing an appropriate action functional in a large deviation limit, one can predict the most likely transition path for a rare transition between two states. By considering examples where theoretical predictions can be made, we show that the minimum action method successfully predicts the most likely transition path. Finally, we discuss the application and extension of such numerical optimization schemes to the computation of rare transitions observed in direct numerical simulations and experiments, and to other, more complex, turbulent systems.
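The numerical core of a minimum action method can be illustrated on a far simpler system than the quasi-geostrophic equations: for dx = b(x)dt + √(2ε)dW with the double-well drift b(x) = x − x³, the discretized Freidlin-Wentzell action of a path between the attractors x = −1 and x = +1 is minimized by gradient descent with fixed endpoints. The discretization, finite-difference gradient, and step sizes below are illustrative choices, not the paper's algorithm:

```python
def drift(x):
    return x - x ** 3  # double-well drift with attractors at -1 and +1

def action(path, dt):
    """Discretized Freidlin-Wentzell action S = (1/4) * sum (dx/dt - b)^2 dt."""
    s = 0.0
    for i in range(len(path) - 1):
        xm = 0.5 * (path[i] + path[i + 1])  # midpoint rule for the drift
        v = (path[i + 1] - path[i]) / dt
        s += 0.25 * (v - drift(xm)) ** 2 * dt
    return s

n, dt, step, h = 21, 0.5, 0.02, 1e-6
path = [-1.0 + 2.0 * i / (n - 1) for i in range(n)]  # straight-line initial guess
s0 = action(path, dt)

for _ in range(300):  # gradient descent on interior points, endpoints held fixed
    grad = []
    for i in range(1, n - 1):
        orig = path[i]
        path[i] = orig + h
        s_plus = action(path, dt)
        path[i] = orig - h
        s_minus = action(path, dt)
        path[i] = orig
        grad.append((s_plus - s_minus) / (2 * h))
    for i in range(1, n - 1):
        path[i] -= step * grad[i - 1]

print(f"action: {s0:.4f} -> {action(path, dt):.4f}")  # minimization lowers the action
```

Production minimum action methods use analytic gradients and adaptive optimizers, but the structure is the same: fix the endpoints at the two attractors and descend on the action of the discretized path.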

  20. Toward Question-Asking Machines: The Logic of Questions and the Inquiry Calculus

    NASA Technical Reports Server (NTRS)

Knuth, Kevin H.

    2005-01-01

    For over a century, the study of logic has focused on the algebra of logical statements. This work, first performed by George Boole, has led to the development of modern computers, and was shown by Richard T. Cox to be the foundation of Bayesian inference. Meanwhile the logic of questions has been much neglected. For our computing machines to be truly intelligent, they need to be able to ask relevant questions. In this paper I will show how the Boolean lattice of logical statements gives rise to the free distributive lattice of questions thus defining their algebra. Furthermore, there exists a quantity analogous to probability, called relevance, which quantifies the degree to which one question answers another. I will show that relevance is not only a natural generalization of information theory, but also forms its foundation.

  1. Preferred reporting items for studies mapping onto preference-based outcome measures: The MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-08-01

'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. The primary audiences for the MAPS statement are researchers reporting mapping studies, the funders of the research, and peer reviewers and editors involved in assessing mapping studies for publication. A de novo list of 29 candidate reporting items and accompanying explanations was created by a working group comprising six health economists and one Delphi methodologist. Following a two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, a final set of 23 items deemed essential for transparent reporting, and accompanying explanations, was developed. The items are contained in a user-friendly 23-item checklist. They are presented numerically and categorised within six sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by eight health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time. This statement was published jointly in Applied Health Economics and Health Policy, Health and Quality of Life Outcomes, International Journal of Technology Assessment in Health Care, Journal of Medical Economics, Medical Decision Making, PharmacoEconomics, and Quality of Life Research.

  2. The language of uncertainty in genetic risk communication: framing and verbal versus numerical information.

    PubMed

    Welkenhuysen, M; Evers-Kiebooms, G; d'Ydewalle, G

    2001-05-01

    Within a group of 300 medical students, two characteristics of risk communication in the context of a decision regarding prenatal diagnosis for cystic fibrosis are manipulated: verbal versus numerical probabilities and the negative versus positive framing of the problem (having a child with versus without cystic fibrosis). Independently of the manipulations, most students were in favor of prenatal diagnosis. The effect of framing was only significant in the conditions with verbal information: negative framing produced a stronger choice in favor of prenatal diagnosis than positive framing. The framing effect in the verbal conditions and its absence in the numerical conditions are explained by the dominance of the problem-occurrence orientation in health matters as well as a recoding process which is more likely to occur in the numerical (the probability "1-P" switches to its counterpart "P") than in the verbal conditions. The implications for the practice of genetic counseling are discussed.

  3. Properties of Traffic Risk Coefficient

    NASA Astrophysics Data System (ADS)

    Tang, Tie-Qiao; Huang, Hai-Jun; Shang, Hua-Yan; Xue, Yu

    2009-10-01

    We use the model with the consideration of the traffic interruption probability (Physica A 387(2008)6845) to study the relationship between the traffic risk coefficient and the traffic interruption probability. The analytical and numerical results show that the traffic interruption probability will reduce the traffic risk coefficient and that the reduction is related to the density, which shows that this model can improve traffic security.

  4. Helping Doctors and Patients Make Sense of Health Statistics.

    PubMed

    Gigerenzer, Gerd; Gaissmaier, Wolfgang; Kurz-Milcke, Elke; Schwartz, Lisa M; Woloshin, Steven

    2007-11-01

    Many doctors, patients, journalists, and politicians alike do not understand what health statistics mean or draw wrong conclusions without noticing. Collective statistical illiteracy refers to the widespread inability to understand the meaning of numbers. For instance, many citizens are unaware that higher survival rates with cancer screening do not imply longer life, or that the statement that mammography screening reduces the risk of dying from breast cancer by 25% in fact means that 1 less woman out of 1,000 will die of the disease. We provide evidence that statistical illiteracy (a) is common to patients, journalists, and physicians; (b) is created by nontransparent framing of information that is sometimes an unintentional result of lack of understanding but can also be a result of intentional efforts to manipulate or persuade people; and (c) can have serious consequences for health. The causes of statistical illiteracy should not be attributed to cognitive biases alone, but to the emotional nature of the doctor-patient relationship and conflicts of interest in the healthcare system. The classic doctor-patient relation is based on (the physician's) paternalism and (the patient's) trust in authority, which make statistical literacy seem unnecessary; so does the traditional combination of determinism (physicians who seek causes, not chances) and the illusion of certainty (patients who seek certainty when there is none). We show that information pamphlets, Web sites, leaflets distributed to doctors by the pharmaceutical industry, and even medical journals often report evidence in nontransparent forms that suggest big benefits of featured interventions and small harms. Without understanding the numbers involved, the public is susceptible to political and commercial manipulation of their anxieties and hopes, which undermines the goals of informed consent and shared decision making. What can be done? 
We discuss the importance of teaching statistical thinking and transparent representations in primary and secondary education as well as in medical school. Yet this requires familiarizing children early on with the concept of probability and teaching statistical literacy as the art of solving real-world problems rather than applying formulas to toy problems about coins and dice. A major precondition for statistical literacy is transparent risk communication. We recommend using frequency statements instead of single-event probabilities, absolute risks instead of relative risks, mortality rates instead of survival rates, and natural frequencies instead of conditional probabilities. Psychological research on transparent visual and numerical forms of risk communication, as well as training of physicians in their use, is called for. Statistical literacy is a necessary precondition for an educated citizenship in a technological democracy. Understanding risks and asking critical questions can also shape the emotional climate in a society so that hopes and anxieties are no longer as easily manipulated from outside and citizens can develop a better-informed and more relaxed attitude toward their health. © 2008 Association for Psychological Science.
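
    The natural-frequency representation recommended above can be illustrated with a short sketch. The screening numbers used here (1% prevalence, 90% sensitivity, 9% false-positive rate) are illustrative assumptions, not figures from the article; the point is that counting people in a concrete population makes the conditional probability "disease given a positive test" transparent.

    ```python
    # Hypothetical screening numbers (illustrative assumptions):
    # prevalence 1%, sensitivity 90%, false-positive rate 9%.
    population = 10_000

    sick = round(population * 0.01)          # 100 people have the disease
    healthy = population - sick              # 9,900 do not

    true_positives = round(sick * 0.90)      # 90 sick people test positive
    false_positives = round(healthy * 0.09)  # 891 healthy people test positive

    # Probability of disease given a positive test, read straight off the counts:
    ppv = true_positives / (true_positives + false_positives)
    print(f"{true_positives} of {true_positives + false_positives} positives are sick "
          f"(PPV = {ppv:.1%})")  # 90 of 981, i.e. about 9.2%
    ```

    Stated as natural frequencies ("90 of the 981 people who test positive actually have the disease"), the result is far harder to misread than the equivalent conditional-probability statement.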

  5. SURVIVABILITY THROUGH OPTIMIZING RESILIENT MECHANISMS (STORM)

    DTIC Science & Technology

    2017-04-01

    Approved for Public Release; Distribution Unlimited. PA# 88ABW-2017-0894, Date Cleared: 07 Mar 2017. Game …quantitatively about cyber-attacks. Game theory is the branch of applied mathematics that formalizes strategic interaction among intelligent rational agents …mechanism based on game theory. This work has applied game theory to numerous cyber security problems: cloud security, cyber threat information sharing…

  6. The Effect of Role Ambiguity and Role Conflict on Performance of Vice Principals: The Mediating Role of Burnout

    ERIC Educational Resources Information Center

    Celik, Kazim

    2013-01-01

    Problem Statement: Role ambiguity and role conflict are considered issues that affect performance and lead to burnout. While numerous studies have analyzed role ambiguity or role conflict in relation to burnout or performance, few studies have studied all of these issues together. Since vice principals are expected to carry out a variety of…

  7. Retirement, Work, and Lifelong Learning; Hearing Before the Special Committee on Aging, Ninety-Fifth Congress, Second Session. Part 4.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Special Committee on Aging.

    The transcripts of testimony given before the Senate Committee on Aging by representatives of numerous national organizations for older adults, such as the National Council on Aging and the National Gray Panthers, are provided. Issues reviewed in these statements address the following areas of concern: minority group needs, retirement planning,…

  8. 77 FR 75646 - Kenai National Wildlife Refuge, Soldotna, AK; Draft Environmental Impact Statement for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ..., and Cook Inlet Region, Inc. (CIRI), owns the subsurface estate of coal, oil, and gas in the project..., snowshoe hares, and numerous species of Neotropical birds, such as olive-sided flycatchers, myrtle warblers... received the subsurface oil, gas, and coal estate to nearly 200,000 acres within the Refuge as part of its...

  9. 78 FR 32270 - Kenai National Wildlife Refuge, Soldotna, AK; Environmental Impact Statement for the Shadura...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    .... (CIRI), owns the subsurface estate of coal, oil, and gas in the project area. The project would be in... and brown bears, lynx, snowshoe hares, and numerous species of Neotropical birds, such as olive-sided... within the Refuge, portions of the subsurface estate, consisting of the oil, gas, and coal are owned by...

  10. Developing the Next Generation NATO Reference Mobility Model

    DTIC Science & Technology

    2016-06-27

    Distribution Statement A. Approved for public release; distribution is unlimited. (#27992) Vehicle Dynamics Model …and numerical resolution, for use in vehicle design, acquisition and operational mobility planning. 27 June 2016. An open architecture was established …the current empirical methods for simulating vehicle and suspension designs. …Industry-wide shortfall with tire dynamics and soft soil behavior.

  11. Building Consistency between Title, Problem Statement, Purpose, & Research Questions to Improve the Quality of Research Plans and Reports

    ERIC Educational Resources Information Center

    Newman, Isadore; Covrig, Duane M.

    2013-01-01

    Consistency in the title, problem, purpose, and research question improves the logic and transparency of research. When these components of research are aligned, research design and planning are more coherent and research reports are more readable. This article reviews the process for checking for and improving consistency. Numerous examples of…

  12. Simultaneous Inversion of UXO Parameters and Background Response

    DTIC Science & Technology

    2012-03-01

    Unclassified/Unlimited. …demonstrated an ability to accurately recover dipole parameters using the simultaneous inversion method. Numerical modeling code for solving Maxwell's …magnetics.

  13. A Limited Survey of Dark Chocolate Bars Obtained in the United States for Undeclared Milk and Peanut Allergens.

    PubMed

    Bedford, Binaifer; Yu, Ye; Wang, Xue; Garber, Eric A E; Jackson, Lauren S

    2017-04-01

    Undeclared allergens in chocolate products have been responsible for numerous allergen-related recalls in the United States. A survey was conducted to determine the prevalence of undeclared milk and peanut in 88 and 78 dark chocolate bars, respectively. Concentrations of milk (as nonfat dry milk) or peanut in three samples of each chocolate product were determined with two milk- or peanut-specific enzyme-linked immunosorbent assay kits. In 75% of the chocolate bar products with a milk advisory statement, milk concentrations were above the limit of quantitation (2.5 μg/g [ppm]), with the majority having concentrations >1,000 ppm. An additional 67% of chocolate bars with a "traces of milk" statement contained 3 to 6,700 ppm of milk. Fifteen percent of chocolates labeled dairy free or lactose free and 25% labeled vegan were positive for milk, all with concentrations >1,000 ppm. Even for chocolates with no reference to milk on the label, 33% of these products contained 60 to 3,400 ppm of milk. The survey of chocolate products for peanuts revealed that 8% of products with an advisory statement contained peanut, with the highest concentration of 550 ppm. All nine chocolates bearing the peanut-free or allergen-free statement were negative for peanut, but 17% of chocolates with no label statement for peanut were positive for peanut at concentrations of 9 to 170 ppm. Evaluation of multiple lots of four chocolate products revealed that milk was consistently present or absent for the products investigated, but mixed results were obtained when multiple lots were tested for peanut. This study indicates that a large proportion of dark chocolate bars contain undeclared milk. The type of advisory statement or the absence of a milk advisory statement on products did not predict the amount or absence of milk protein. In contrast, a lower proportion of chocolates containing undeclared peanut was found. 
Consumers with food allergies should be cautious when purchasing dark chocolate products, particularly those that have an advisory label statement.

  14. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of distances of neighboring cells and the thermal wave propagation rate, for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. In the absence of an analytical solution, and with the differences in ignition times of adjacent reaction cells following non-Markovian statistics, the solution for the thermal wave propagation rate of a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulations. However, such simulations, which are based on Monte Carlo methods, require several iterations of calculations for different realizations of the distribution of adjacent cells. For several one-dimensional systems, differing in the value of the shaping parameter of the gamma distribution, we show that the average reaction front propagation rates obtained by a discrete probability between two limits show excellent agreement with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations: the thermal wave propagation rates can now be calculated based only on the macroscopic entity of discrete probability.
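
    As a minimal sketch of the setup described above, the snippet below generates one realization of a one-dimensional chain whose neighbor distances are gamma distributed, the starting point for the Monte Carlo simulations the abstract refers to. The shape parameter and mean spacing are illustrative assumptions, not values from the paper.

    ```python
    import random

    random.seed(42)

    k, mean_gap = 2.0, 1.0           # shape parameter and mean spacing (assumed)
    theta = mean_gap / k             # scale chosen so that E[gap] = k * theta

    n_cells = 1000
    gaps = [random.gammavariate(k, theta) for _ in range(n_cells - 1)]

    # Cell positions along the chain are cumulative sums of the gaps.
    positions = [0.0]
    for g in gaps:
        positions.append(positions[-1] + g)

    mean_observed = positions[-1] / (n_cells - 1)
    print(f"mean neighbor distance = {mean_observed:.3f} (target {mean_gap})")
    ```

    A full simulation would iterate over many such realizations and propagate ignition times cell by cell; this sketch only shows how one random arrangement of cells is drawn.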

  15. Determining the Statistical Power of the Kolmogorov-Smirnov and Anderson-Darling Goodness-of-Fit Tests via Monte Carlo Simulation

    DTIC Science & Technology

    2016-12-01

    Statistical power is the probability of correctly rejecting the null hypothesis when the… DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited. …real-world data to test the accuracy of the simulation. Statistical comparison of these metrics can be necessary when making such a determination.

  16. Statement of Robert F. Hale, Assistant Director, National Security Division, Congressional Budget Office

    DTIC Science & Technology

    1987-03-17

    rates also have some potential disadvantages. Key among them are higher near-term program costs. These costs would probably require offsetting budget… HIGHER PRODUCTION RATES Should DoD produce weapons at higher rates? Certain disadvantages must be weighed against the merits of higher rates… Disadvantages of Higher Production Rates The most important disadvantage of higher production rates is the delay or cancellation of new weapons systems that

  17. Disability Evaluation System and Temporary Limited Duty Assignment Process: A Qualitative Review.

    DTIC Science & Technology

    1998-03-01

    Statement addressing the requirement for monitoring, frequency of treatments/therapy, and the associated operational assignment limitation; Informed… ACC does not exist in the EAIS, ARIS, or the EMF databases. The system is able to track changes in duty station, but not ACCs. If a member is on…specific geographic assignment. 4. Requires extensive or prolonged medical therapy. 5. Who through continued military service would probably result in

  18. NASA, John F. Kennedy Space Center environmental impact statement

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The probable total impact of the John F. Kennedy Space Center (KSC) operations on the environment is discussed in terms of launch operations emissions and environmental quality. A schedule of planned launches through 1973 is included with a description of the systems for eliminating harmful emissions during launch operations. The effects of KSC on wild life and environmental quality are discussed along with the irreversible and irretrievable commitments of natural resources.

  19. Phase Equilibria and Transition in Mixtures of a Homopolymer and a Block Copolymer. I. Small-Angle X-Ray Scattering Study.

    DTIC Science & Technology

    1983-03-08

    Unclassified. DISTRIBUTION STATEMENT (of this Report): Distribution Unlimited, Approved for…a block copolymer can sometimes be transformed into a homogeneous, disordered structure. The temperature of the transition depends on the degree of…probably that the morphology is gradually transformed from spherical to cylindrical and eventually to lamellar packing. There is, however, no evidence of

  20. Transparent Reporting of a multivariable prediction model for Individual Prognosis or Diagnosis (TRIPOD): the TRIPOD statement.

    PubMed

    Collins, G S; Reitsma, J B; Altman, D G; Moons, K G M

    2015-02-01

    Prediction models are developed to aid healthcare providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision-making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis or Diagnosis (TRIPOD) initiative developed a set of recommendations for the reporting of studies developing, validating or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a web-based survey and revised during a 3-day meeting in June 2011 with methodologists, healthcare professionals and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study, regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). © 2015 Joint copyright. The Authors and Annals of Internal Medicine. Diabetic Medicine published by John Wiley Ltd. on behalf of Diabetes UK.

  1. Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis or Diagnosis (TRIPOD)

    PubMed Central

    Reitsma, Johannes B.; Altman, Douglas G.; Moons, Karel G.M.

    2015-01-01

    Background— Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. Methods— The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. Results— The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. Conclusions— To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). PMID:25561516

  2. Theory of atomic spectral emission intensity

    NASA Astrophysics Data System (ADS)

    Yngström, Sten

    1994-07-01

    The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.

  3. Detection of Orbital Debris Collision Risks for the Automated Transfer Vehicle

    NASA Technical Reports Server (NTRS)

    Peret, L.; Legendre, P.; Delavault, S.; Martin, T.

    2007-01-01

    In this paper, we present a general collision risk assessment method, which has been applied through numerical simulations to the Automated Transfer Vehicle (ATV) case. During ATV ascent towards the International Space Station, close approaches between the ATV and objects of the USSTRATCOM catalog will be monitored through collision risk assessment. Usually, collision risk assessment relies on an exclusion volume or a probability threshold method. Probability methods are more effective than exclusion volumes but require accurate covariance data. In this work, we propose to use a criterion defined by an adaptive exclusion area. This criterion does not require any probability calculation but is more effective than exclusion volume methods, as demonstrated by our numerical experiments. The results of these studies, when confirmed and finalized, will be used for the ATV operations.

  4. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Herzog, James P. (Inventor); Bickford, Randall L. (Inventor)

    2005-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.
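
    The patented procedure is not available in code form; the sketch below is a generic illustration of the same idea: fit a normal density to "training" residuals, then run a sequential probability ratio test (SPRT) on incoming residuals to decide between normal operation (H0) and a hypothesized mean shift (H1). The normal-density fit, the shift size, and the error-rate thresholds are all assumptions for illustration, not the patent's specifics.

    ```python
    import math
    import random

    random.seed(1)

    # Fit a normal PDF to residuals observed during normal operation.
    training = [random.gauss(0.0, 1.0) for _ in range(500)]
    mu = sum(training) / len(training)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in training) / (len(training) - 1))

    def log_pdf(x, mean, sd):
        return -0.5 * math.log(2 * math.pi * sd * sd) - (x - mean) ** 2 / (2 * sd * sd)

    shift = 2.0 * sigma                      # H1: residual mean shifted upward (assumed)
    A, B = math.log(99), math.log(1 / 99)    # SPRT thresholds for ~1% error rates

    def sprt(stream):
        """Accumulate the log-likelihood ratio until a threshold is crossed."""
        llr = 0.0
        for x in stream:
            llr += log_pdf(x, mu + shift, sigma) - log_pdf(x, mu, sigma)
            if llr >= A:
                return "fault"
            if llr <= B:
                return "normal"
        return "undecided"

    healthy = [random.gauss(0.0, 1.0) for _ in range(200)]
    faulty = [random.gauss(2.0, 1.0) for _ in range(200)]
    print(sprt(healthy), sprt(faulty))
    ```

    The design point is that the test is sequential: it stops as soon as the accumulated evidence crosses either threshold, rather than waiting for a fixed sample size.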

  5. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2006-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  6. Surveillance System and Method having an Adaptive Sequential Probability Fault Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2008-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  7. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only...of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of

  8. Probability Elicitation Under Severe Time Pressure: A Rank-Based Method.

    PubMed

    Jaspersen, Johannes G; Montibeller, Gilberto

    2015-07-01

    Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats. © 2015 Society for Risk Analysis.
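
    The article's maximum-entropy approximation algorithm is not reproduced here. As a stand-in, the sketch below uses rank-order centroid (ROC) weights, a standard rule in multicriteria decision analysis for turning a pure ranking of n items into normalized numerical values; it illustrates the ranking-to-numbers step but is a different rule from the authors' maximum-entropy algorithm.

    ```python
    def rank_order_centroid(n):
        """ROC weight for rank i (1 = most probable): w_i = (1/n) * sum_{j=i}^{n} 1/j."""
        return [sum(1.0 / j for j in range(i, n + 1)) / n for i in range(1, n + 1)]

    # An expert ranks four events from most to least probable; ROC converts
    # that ordinal judgment into numbers summing to 1.
    weights = rank_order_centroid(4)
    print([round(w, 4) for w in weights])  # [0.5208, 0.2708, 0.1458, 0.0625]
    ```

    Like the maximum-entropy rule described above, this requires only a ranking from the expert, which is what makes the elicitation fast.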

  9. Influence of the random walk finite step on the first-passage probability

    NASA Astrophysics Data System (ADS)

    Klimenkova, Olga; Menshutin, Anton; Shchur, Lev

    2018-01-01

    A well-known connection between the first-passage probability of a random walk and the distribution of electrical potential described by the Laplace equation is studied. We simulate a random walk in the plane numerically as a discrete-time process with fixed step length. We measure the first-passage probability to touch an absorbing sphere of radius R in 2D. We find a regular deviation of the first-passage probability from the exact function, which we attribute to the finiteness of the random walk step.
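
    The connection to the Laplace equation can be checked with a small Monte Carlo sketch. The geometry below (a fixed-step walker released between two concentric absorbing circles) and all parameter values are illustrative assumptions, not the paper's setup. The continuum solution of the Laplace equation predicts the inner circle is hit first with probability ln(R_out/r0) / ln(R_out/R_in); the finite step length produces a small systematic deviation of the kind the paper studies.

    ```python
    import math
    import random

    random.seed(7)

    R_IN, R_OUT, R0, STEP = 1.0, 5.0, 2.0, 0.1   # radii, start distance, step (assumed)

    def hits_inner_first():
        """Walk with fixed step length until one of the two circles is touched."""
        x, y = R0, 0.0
        while True:
            phi = random.uniform(0.0, 2.0 * math.pi)
            x += STEP * math.cos(phi)
            y += STEP * math.sin(phi)
            r = math.hypot(x, y)
            if r <= R_IN:
                return True
            if r >= R_OUT:
                return False

    trials = 500
    p_sim = sum(hits_inner_first() for _ in range(trials)) / trials
    p_exact = math.log(R_OUT / R0) / math.log(R_OUT / R_IN)
    print(f"simulated {p_sim:.3f} vs continuum {p_exact:.3f}")
    ```

    Shrinking STEP moves the simulated value toward the continuum prediction, which is the regular finite-step deviation the abstract describes.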

  10. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprising six health economists and one Delphi methodologist. A two-round, modified Delphi survey, with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorized within six sections: (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS Explanation and Elaboration paper. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of the reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  11. Preferred Reporting Items for Studies Mapping onto Preference-Based Outcome Measures: The MAPS Statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-08-01

    "Mapping" onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite the publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist that aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of 6 health economists and 1 Delphi methodologist. A 2-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies, and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorized within 6 sections, namely: (i) title and abstract; (ii) introduction; (iii) methods; (iv) results; (v) discussion; and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency, and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by 7 health economics and quality-of-life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years.

  12. "Girls Are as Good as Boys at Math" Implies That Boys Are Probably Better: A Study of Expressions of Gender Equality.

    PubMed

    Chestnut, Eleanor K; Markman, Ellen M

    2018-06-28

Although "Girls are as good as boys at math" explicitly expresses equality, we predict it could nevertheless suggest that boys have more raw talent. In statements with this subject-complement structure, the item in the complement position serves as the reference point and is thus considered more typical and prominent. This explains why "Tents are like houses," for instance, sounds better than "Houses are like tents": people generally think of houses as more typical. For domains about ability, the reference point should be the item that is typically more skilled. We further propose that the reference point should be naturally more skilled. In two experiments, we presented adults with summaries of actual scientific evidence for gender equality in math (Experiment 1) or verbal ability (Experiment 2), but we manipulated whether the reference point in the statements of equality in the summaries (e.g., "Boys' verbal ability is as good as girls'") was girls or boys. As predicted, adults attributed more natural ability to each gender when it was in the complement rather than subject position. Yet, in Experiment 3, we found that when explicitly asked, participants judged that such sentences were not biased in favor of either gender, indicating that subject-complement statements must be transmitting this bias in a subtle way. Thus, statements such as "Girls are as good as boys at math" can actually backfire and perpetuate gender stereotypes about natural ability. © 2018 Cognitive Science Society, Inc.

  13. [Is "evidence-based medicine" followed by "confidence-based medicine"?].

    PubMed

    Porzsolt, Franz; Fangerau, Heiner

    2010-08-01

In an appeal concerning accusations of defamation, the Court of Appeal of England and Wales determined that evidence-based statements are to be judged as opinions and not statements of fact. Since the authors consider it probable that this legal judgment will influence physicians' decisions about the provision of health care services, they have compiled the implications of the judgment and discuss its consequences. The authors' own analyses and considerations lead to the conclusion that confidence-based medicine follows evidence-based medicine. This extension is necessary because evidence-based medicine has not been able to generate the required trust. There will therefore be demands to underpin the existing concept with additional data. These data will be necessary because it is no longer sufficient to convince scientists with data obtained under ideal conditions; critical members of society must also be convinced, with additional data obtained under everyday conditions.

  14. Simple, accurate formula for the average bit error probability of multiple-input multiple-output free-space optical links over negative exponential turbulence channels.

    PubMed

    Peppas, Kostas P; Lazarakis, Fotis; Alexandridis, Antonis; Dangakis, Kostas

    2012-08-01

    In this Letter we investigate the error performance of multiple-input multiple-output free-space optical communication systems employing intensity modulation/direct detection and operating over strong atmospheric turbulence channels. Atmospheric-induced strong turbulence fading is modeled using the negative exponential distribution. For the considered system, an approximate yet accurate analytical expression for the average bit error probability is derived and an efficient method for its numerical evaluation is proposed. Numerically evaluated and computer simulation results are further provided to demonstrate the validity of the proposed mathematical analysis.
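    The average bit error probability in this record can be illustrated with a small Monte Carlo sketch. Only the negative exponential fading model is taken from the abstract; the conditional error model Q(sqrt(snr)·I), the unit-mean normalization, and the function name are simplifying assumptions of ours, not the Letter's exact IM/DD system model.

```python
import math
import random

def q_func(x):
    """Gaussian Q-function: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def avg_bep_negexp(snr, n_samples=50_000, seed=1):
    """Monte Carlo average bit error probability over a negative
    exponential (unit-mean) turbulence channel. The conditional BEP
    Q(sqrt(snr) * I) is an assumed toy model for IM/DD on-off keying,
    not the exact system model of the record."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        irradiance = rng.expovariate(1.0)  # negative exponential fading
        total += q_func(math.sqrt(snr) * irradiance)
    return total / n_samples
```

    Averaging over fading that piles probability mass near zero irradiance shows the characteristic strong-turbulence penalty: the average BEP falls with SNR far more slowly than the unfaded Q-function would.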

  15. Persistence Probability Analyzed on the Taiwan STOCK Market

    NASA Astrophysics Data System (ADS)

    Chen, I.-Chun; Chen, Hung-Jung; Tseng, Hsen-Che

    We report a numerical study of the Taiwan stock market, in which we used three data sources: the daily Taiwan stock exchange index (TAIEX) from January 1983 to May 2006, the daily OTC index from January 1995 to May 2006, and the one-min intraday data from February 2000 to December 2003. Our study is based on numerical estimates of persistence exponent θp, Hurst exponent H2, and fluctuation exponent h2. We also discuss the results concerning persistence probability P(t), qth-order price-price correlation function Gq(t), and qth-order normalized fluctuation function fq(t) among these indices.
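    The persistence probability P(t) studied in this record has a simple toy analogue: for a symmetric random walk, P(t), the probability that the walk's sign has not flipped by step t, decays as t^(-1/2), i.e. θp = 1/2. The sketch below is that toy analogue (names and the sign-flip convention are ours), not the TAIEX analysis itself.

```python
import random

def persistence_probability(t_max, n_walks=20_000, seed=3):
    """Monte Carlo persistence probability P(t) of a simple random walk:
    the fraction of walks whose initial sign has not flipped by step t.
    For this walk the persistence exponent theta_p is 1/2, so
    P(t) ~ t^(-1/2); a toy analogue of the index persistence studied
    in the record."""
    rng = random.Random(seed)
    alive_at = [0] * (t_max + 1)
    for _ in range(n_walks):
        pos, sign = 0, 0
        for t in range(1, t_max + 1):
            pos += rng.choice((-1, 1))
            if sign == 0:
                sign = 1 if pos > 0 else -1
            if pos * sign < 0:  # sign flipped: persistence broken
                break
            alive_at[t] += 1
    return [count / n_walks for count in alive_at]
```

    A log-log fit of the returned P(t) against t recovers θp ≈ 0.5, which is the kind of numerical estimate the study performs on market indices.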

  16. Preliminary design of an auxiliary power unit for the space shuttle. Volume 3: Details of system analysis, engineering, and design for selected system

    NASA Technical Reports Server (NTRS)

    Hamilton, M. L.; Burriss, W. L.

    1972-01-01

Numerous candidate APU concepts, each meeting the space shuttle APU problem statement, are considered. Evaluation of these concepts indicates that the optimum concept is a hydrogen-oxygen APU incorporating a recuperator to utilize the exhaust energy and using the cycle hydrogen flow as a means of cooling the component heat loads.

  17. [Diagnostic rationalism. Views of general practitioners on fibromyalgia].

    PubMed

    Daehli, B

    1993-09-20

Clinical practice is characterized by having to make numerous important decisions, including the diagnosis. In this study, general practitioners were asked to agree or to disagree with statements about fibromyalgia. The main purpose was to test the usefulness of two well-known models for decision-making when studying diagnosis in cases of uncertainty and scepticism. The results show that the models are inadequate to explain the decisions.

  18. U.S. Air Force Annual Financial Statement 2010

    DTIC Science & Technology

    2010-01-01

    certain contract financing payments that are not reported elsewhere on Air Force’s Balance Sheet. The Air Force conducts business with commercial...the reporting entity has a contractual commitment for payment is $712.8 million. The Air Force is a party in numerous individual contracts that...promulgated by the Federal Accounting Standards Advisory Board; the Office of Management and Budget (OMB) Circular No. A-136, Financial Reporting

  19. Applications of Quantum Probability Theory to Dynamic Decision Making

    DTIC Science & Technology

    2015-08-13

Final Report, 08/13/2015. Distribution A: distribution approved for public release. AF Office of Scientific Research (AFOSR)/RTC, Arlington, Virginia 22203.

  20. Winter Fish Populations in Probable Locations of Air Bubblers in the St. Marys River-Lake Superior Area

    DTIC Science & Technology

    1980-09-01

Lawrence Seaway Navigation Season Extension, Draft Main Report and Environmental Statement. Detroit, Michigan. Potential effects on fish were discussed...to keep channels ice free for winter vessel passage. The studies were done to determine baseline ecological conditions and the effects of the...Subjects were: "Ecological effects of air bubblers in the winter, a partially annotated bibliography" and "Annotated bibliography on winter fish and

  1. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  2. Numerical Modeling of Electroacoustic Logging Including Joule Heating

    NASA Astrophysics Data System (ADS)

    Plyushchenkov, Boris D.; Nikitin, Anatoly A.; Turchaninov, Victor I.

It is well known that an electromagnetic field excites an acoustic wave in a porous elastic medium saturated with a fluid electrolyte, due to the electrokinetic conversion effect. Pride's equations describing this process are written in the isothermal approximation. An update of these equations is proposed here that makes it possible to take the influence of Joule heating on acoustic wave propagation into account. This update includes terms describing the initiation of additional acoustic waves excited by thermoelastic stresses, and the heat conduction equation with a right side defined by Joule heating. Results are presented for numerical modeling of several problems of propagation of acoustic waves excited by an electric field source, with and without the Joule heating effect in their statements. From these results, it follows that the influence of Joule heating should be taken into account in the numerical simulation of electroacoustic logging and in the interpretation of its log data.

  3. A Numerical and Theoretical Study of Seismic Wave Diffraction in Complex Geologic Structure

    DTIC Science & Technology

    1989-04-14

element methods for analyzing linear and nonlinear seismic effects in the surficial geologies relevant to several Air Force missions. The second...exact solution evaluated here indicates that edge-diffracted seismic wave fields calculated by discrete numerical methods probably exhibit significant...study is to demonstrate and validate some discrete numerical methods essential for analyzing linear and nonlinear seismic effects in the surficial

  4. Learning in Reverse: Eight-Month-Old Infants Track Backward Transitional Probabilities

    ERIC Educational Resources Information Center

    Pelucchi, Bruna; Hay, Jessica F.; Saffran, Jenny R.

    2009-01-01

    Numerous recent studies suggest that human learners, including both infants and adults, readily track sequential statistics computed between adjacent elements. One such statistic, transitional probability, is typically calculated as the likelihood that one element predicts another. However, little is known about whether listeners are sensitive to…
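    The forward and backward transitional probabilities contrasted in this record are easy to state concretely. The sketch below uses a made-up element sequence and a function name of our own; it is not the study's stimulus set.

```python
from collections import Counter

def transitional_probabilities(sequence):
    """Forward and backward transitional probabilities over adjacent
    pairs of a sequence:
      forward  TP(x -> y) = count(x, y) / count(x in first position)
      backward TP(x -> y) = count(x, y) / count(y in second position)"""
    pairs = Counter(zip(sequence, sequence[1:]))
    firsts = Counter(x for x, _ in pairs.elements())
    seconds = Counter(y for _, y in pairs.elements())
    forward = {p: c / firsts[p[0]] for p, c in pairs.items()}
    backward = {p: c / seconds[p[1]] for p, c in pairs.items()}
    return forward, backward
```

    In the sequence a-x-b-x-a-x, "a" is always followed by "x" (forward TP of 1.0), but only two of the three "x" tokens follow "a" (backward TP of 2/3); this asymmetry between the two statistics is exactly what the study probes.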

  5. Traceable accounts of subjective probability judgments in the IPCC and beyond

    NASA Astrophysics Data System (ADS)

    Baer, P. G.

    2012-12-01

One of the major sources of controversy surrounding the reports of the IPCC has been the characterization of uncertainty. Although arguably the IPCC has paid more attention to the process of uncertainty analysis and communication than any comparable assessment body, its efforts to achieve consistency have produced mixed results. In particular, the extensive use of subjective probability assessment has attracted widespread criticism. Statements such as "Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years" are ubiquitous (one online database lists nearly 3000 such claims), and indeed are the primary way in which its key "findings" are reported. Much attention is drawn to the precise quantitative definition of such statements (e.g., "very likely" means >90% probability, vs. "extremely likely" which means >95% probability). But there is no process by which the decision regarding the choice of such uncertainty level for a given finding is formally made or reported, and thus they are easily disputed by anyone, expert or otherwise, who disagrees with the assessment. In the "Uncertainty Guidance Paper" for the Third Assessment Report, Richard Moss and Steve Schneider defined the concept of a "traceable account," which gave exhaustive detail regarding how one ought to provide documentation of such an uncertainty assessment. But the guidance, while appearing straightforward and reasonable, in fact was an unworkable recipe, which would have taken near-infinite time if used for more than a few key results, and would have required a different structuring of the text than the conventional scientific assessment. And even then it would have left a gap when it came to the actual provenance of any such specific judgments, because there simply is no formal step at which individuals turn their knowledge of the evidence on some finding into a probability judgment.
The Uncertainty Guidance Papers for the TAR and subsequent assessments have left open the possibility of using such an expert elicitation within the IPCC drafting process, but to my knowledge it has never been done. Were it in fact attempted, it would reveal the inconvenient truth that there is no uniquely correct method for aggregating probability statements; indeed the standard practice within climate-related expert elicitations has been to report all individual estimates without aggregation. But if a report requires a single "consensus estimate," once you have even a single divergent opinion, the question of how to aggregate becomes unavoidable. In this paper, I review in greater detail the match or lack of it between the vision of a "traceable account" and IPCC practice, and the public discussion of selected examples of probabilistic judgments in AR4. I propose elements of a structure based on a flexible software architecture that could facilitate the development and documentation of what I call "collective subjective probability." Using a simple prototype and a pair of sample "findings" from AR4, I demonstrate an example of how such a structure could be used by a small expert community to implement a practical model of a "traceable account." I conclude with a discussion of the prospects of using such modular elicitations in support of, or as an alternative to, conventional IPCC assessment processes.

  6. Exact transition probabilities in a 6-state Landau–Zener system with path interference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitsyn, Nikolai A.

    2015-04-23

    In this paper, we identify a nontrivial multistate Landau–Zener (LZ) model for which transition probabilities between any pair of diabatic states can be determined analytically and exactly. In the semiclassical picture, this model features the possibility of interference of different trajectories that connect the same initial and final states. Hence, transition probabilities are generally not described by the incoherent successive application of the LZ formula. Finally, we discuss reasons for integrability of this system and provide numerical tests of the suggested expression for the transition probability matrix.
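    The record's exact 6-state transition probabilities are not reproduced here, but the two-state Landau-Zener formula that such multistate expressions generalize is a one-liner; the function name and the ħ = 1 convention are ours.

```python
import math

def lz_transition_probability(coupling, sweep_rate, hbar=1.0):
    """Classic two-state Landau-Zener probability of remaining in the
    initial diabatic state: P = exp(-2*pi*|g|^2 / (hbar*v)), with
    coupling g and sweep rate v (rate of change of the diabatic energy
    gap). The 6-state model in the record generalizes this; only the
    textbook formula is shown here."""
    return math.exp(-2.0 * math.pi * coupling**2 / (hbar * sweep_rate))
```

    With zero coupling the system stays diabatic with probability 1, and stronger coupling or slower sweeps drive the probability toward 0 (the adiabatic limit); interference between paths, the record's main point, has no analogue in this two-state formula.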

  7. General Math 10-12 [Instructional Objectives Exchange].

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for the Study of Evaluation.

    This collection of 123 objectives and related evaluation items is for general mathematics in grades 10 through 12. The content has been organized into nine major categories: sets; numbers, numerals, and numeration systems; operations and their properties; measurements; per cents; geometry; probability and statistics; logic; and applications and…

  8. Numerical studies on the microclimate around a sleeping person and the related thermal neutrality issues.

    PubMed

    Pan, D; Chan, M; Deng, S; Xia, L; Xu, X

    2011-11-01

    This article reports on two numerical studies on the microclimate around, and the thermal neutrality of, a sleeping person in a space installed with a displacement ventilation system. The development of a sleeping computational thermal manikin (SCTM) placed in a space air-conditioned by a displacement ventilation system is first described. This is followed by reporting the results of the first numerical study on the microclimate around the SCTM, including air temperature and velocity distributions and the heat transfer characteristics. Then the outcomes of the other numerical study on the thermal neutrality of a sleeping person are presented, including the thermal neutrality for a naked sleeping person and the effects of the total insulation value of a bedding system on the thermal neutrality of a sleeping person. STATEMENT OF RELEVANCE: The thermal environment would greatly affect the sleep quality of human beings. Through developing a SCTM, the microclimate around a sleeping person has been numerically studied. The thermal neutral environment may then be predicted and contributions to improved sleep quality may be made.

  9. National Athletic Trainers' Association Position Statement: Anabolic-Androgenic Steroids

    PubMed Central

    Kersey, Robert D.; Elliot, Diane L.; Goldberg, Linn; Kanayama, Gen; Leone, James E.; Pavlovich, Mike; Pope, Harrison G.

    2012-01-01

    This NATA position statement was developed by the NATA Research & Education Foundation. Objective This manuscript summarizes the best available scholarly evidence related to anabolic-androgenic steroids (AAS) as a reference for health care professionals, including athletic trainers, educators, and interested others. Background Health care professionals associated with sports or exercise should understand and be prepared to educate others about AAS. These synthetic, testosterone-based derivatives are widely abused by athletes and nonathletes to gain athletic performance advantages, develop their physiques, and improve their body image. Although AAS can be ergogenic, their abuse may lead to numerous negative health effects. Recommendations Abusers of AAS often rely on questionable information sources. Sports medicine professionals can therefore serve an important role by providing accurate, reliable information. The recommendations provide health care professionals with a current and accurate synopsis of the AAS-related research. PMID:23068595

  10. New Hong Kong statute protects factual statements in medical apologies from use in litigation.

    PubMed

    Leung, Gilberto Kk; Porter, Gerard

    2018-01-01

    Providing an apology which contains a factual explanation following a medical adverse incident may facilitate an amicable settlement and improve patient experience. Numerous apology laws exist with the aim of encouraging an apology but the lack of explicit and specific protection for factual admissions included in "full" apologies can give rise to legal disputes and deter their use. The new Hong Kong Apology Ordinance expressly prohibits the admission of a statement of fact in an apology as evidence of fault in a wide range of applicable proceedings and thus provides the clearest and most comprehensive apology protection to date. This should significantly encourage open medical disclosure and the provision of an apology when things go wrong. This paper examines the significance and implication of the Apology Ordinance in the medico-legal context.

  11. Validating the applicability of the GUM procedure

    NASA Astrophysics Data System (ADS)

    Cox, Maurice G.; Harris, Peter M.

    2014-08-01

This paper is directed at practitioners seeking a degree of assurance in the quality of the results of an uncertainty evaluation when using the procedure in the Guide to the Expression of Uncertainty in Measurement (GUM) (JCGM 100:2008). Such assurance is required in adhering to general standards such as International Standard ISO/IEC 17025 or other sector-specific standards. We investigate the extent to which such assurance can be given. For many practical cases, a measurement result incorporating an evaluated uncertainty that is correct to one significant decimal digit would be acceptable. Any quantification of the numerical precision of an uncertainty statement is naturally relative to the adequacy of the measurement model and the knowledge used of the quantities in that model. For general univariate and multivariate measurement models, we emphasize the use of a Monte Carlo method, as recommended in GUM Supplements 1 and 2. One use of this method is as a benchmark in terms of which measurement results provided by the GUM can be assessed in any particular instance. We mainly consider measurement models that are linear in the input quantities, or have been linearized and the linearization process is deemed to be adequate. When the probability distributions for those quantities are independent, we indicate the use of other approaches such as convolution methods based on the fast Fourier transform and, particularly, Chebyshev polynomials as benchmarks.
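    A minimal sketch of the GUM Supplement 1-style Monte Carlo propagation used as a benchmark above, assuming the measurement model is supplied as a plain function; the names and the interface are illustrative, not taken from the Supplement.

```python
import math
import random

def gum_monte_carlo(model, draws, n=100_000, seed=7):
    """Minimal GUM Supplement 1-style Monte Carlo propagation: 'model'
    maps a tuple of input values to the measurand, and 'draws' is a
    list of one-argument callables that each sample one input quantity
    from its assigned distribution, given a random.Random instance.
    Returns the Monte Carlo estimate of the measurand and its standard
    uncertainty."""
    rng = random.Random(seed)
    ys = [model(tuple(d(rng) for d in draws)) for _ in range(n)]
    mean = sum(ys) / n
    std = math.sqrt(sum((y - mean) ** 2 for y in ys) / (n - 1))
    return mean, std
```

    For a linear model such as Y = X1 + 2·X2 with independent Gaussian inputs (u1 = 0.1, u2 = 0.2), the Monte Carlo standard uncertainty should reproduce the analytic value sqrt(u1² + (2·u2)²) ≈ 0.412, which is the kind of benchmark agreement the paper discusses.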

  12. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  13. Coarse-scale restoration planning and design in Interior Columbia River Basin ecosystems: An example for restoring declining whitebark pine forests

    Treesearch

    Robert E. Keane; James P. Menakis; Wendel J. Hann

    1996-01-01

    During the last 2 years, many people from numerous government agencies and private institutions compiled a scientific assessment of the natural and human resources of the Interior Columbia River Basin (Jensen and Bourgeron 1993). This assessment is meant to guide the development of a coarse-scale Environmental Impact Statement for all 82 million hectares comprising the...

  14. Engineering of Droplet Manipulation in Tertiary Junction Microfluidic Channels

    DTIC Science & Technology

    2017-06-30

Distribution Statement A: distribution unlimited; public release. We have carried out an experimental and...method (LBM). Both the experimental and numerical results showed good agreement and suggested that at higher Re equal to 3, the flow was dominated by...Period of Performance: 06/01/2015 – 11/01/2016.

  15. A Numerical Model of Laser Induced Fluorescence in a Hydrogen Plasma

    DTIC Science & Technology

    1990-09-11

Distribution/Availability Statement: approved for public release IAW 190-1; distribution unlimited. Ernest A. Haygood... Figure 6.3: The impact of the ground state population of atomic hydrogen.

  16. Evaluation of the ACEC Benchmark Suite for Real-Time Applications

    DTIC Science & Technology

    1990-07-23

1.0 benchmark suite was analyzed with respect to its measuring of Ada real-time features such as tasking, memory management, input/output, scheduling...and delay statement, Chapter 13 features, pragmas, interrupt handling, subprogram overhead, numeric computations etc. For most of the features that...meant for programming real-time systems. The ACEC benchmarks have been analyzed extensively with respect to their measuring of Ada real-time features

  17. Innovative Methods for Estimating Densities and Detection Probabilities of Secretive Reptiles Including Invasive Constrictors and Rare Upland Snakes

    DTIC Science & Technology

    2018-01-30

Department of Defense Legacy Resource Management Program, Agreement # W9132T-14-2-0010 (Project # 14-754). Innovative Methods for Estimating...Upland Snakes. Authors: John D. Willson, Ph.D.; Shannon Pittman, Ph.D. Distribution: publicly available. This project demonstrates the broad applicability of a novel simulation

  18. Final environmental impact statement for Ames Research Center

    NASA Technical Reports Server (NTRS)

    1971-01-01

The NASA-Ames Research Center is described, together with the nature of its activities, from which it can be seen that the center is basically not a major pollution source. Geographical and climatic characteristics of the site are described, inasmuch as they influence both the choice of disposal methods and the environmental effects of the pollutants. The known or probable pollution sources at the center are described. Where the intensities of these sources might exceed the recommended guidelines, the corrective actions that have been taken are described.

  19. A Survey of Network Reliability.

    DTIC Science & Technology

    1983-07-01

Statement A: approved for public release; distribution unlimited. July 1983, ORC 83-5. This research was supported by the Air Force Office of Scientific Research, United States Air Force, Bolling Air Force Base...One node in K is designated the root, and the reliability problem is to calculate the probability that the root can communicate with the remaining nodes of K ⊆ V.

  20. Relation of ground water to stream flow at Battle Creek, Mich.

    USGS Publications Warehouse

    Eddy, G.E.; Ferris, J.G.

    1950-01-01

This is a summary of statements made by G.E. Eddy, State Geologist of Michigan, and J.G. Ferris, district engineer, Ground Water Branch, U.S. Geological Survey, Lansing, Mich., in a conference during the fall of 1949 with John Spoden, Chief of the Maintenance and Flood Control Division of the district office of the Corps of Engineers, Milwaukee, Wis. The conference related to the probable effect on ground-water conditions at Battle Creek of flood-control measures proposed by the Corps of Engineers.

  1. Environmental Impact Statement/Environmental Impact Report for the Disposal and Reuse of Mare Island Naval Shipyard Vallejo, California. Volume 1.

    DTIC Science & Technology

    1998-04-01

Valley (Kroeber & Heizer 1970). In 1972, the Bureau of Indian Affairs listed only 11 individuals claiming Patwin ancestry in the entire territory...facility from the dredge disposal area to the upland open space scenic resource area would render this facility visible from viewpoints with high...take. The COE probably would not issue a permit unless the USFWS rendered a "non-jeopardy" Biological Opinion, which would incorporate mitigations for

  2. The Efficacy of the Government’s Use of Past Performance Information: An Exploratory Study

    DTIC Science & Technology

    2014-04-30

afraid that unless everyone is really working these things to really make an impactful statement that they probably aren’t worth a whole lot if you have...end users. This variance lends credence to H12, H13, and H14, which posit relationships between features of communication and past performance... impact to the contractor’s ability to secure future government business. In addition to fear of a supplier dispute to ratings, this phenomenon

  3. Preferred reporting items for studies mapping onto preference-based outcome measures: the MAPS statement.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2016-02-01

    'Mapping' onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MApping onto Preference-based measures reporting Standards (MAPS) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of six health economists and one Delphi methodologist. A two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorised within six sections, namely (1) title and abstract; (2) introduction; (3) methods; (4) results; (5) discussion; and (6) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  4. PREFERRED REPORTING ITEMS FOR STUDIES MAPPING ONTO PREFERENCE-BASED OUTCOME MEASURES: THE MAPS STATEMENT.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-01-01

"Mapping" onto generic preference-based outcome measures is increasingly being used as a means of generating health utilities for use within health economic evaluations. Despite publication of technical guides for the conduct of mapping research, guidance for the reporting of mapping studies is currently lacking. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a new checklist, which aims to promote complete and transparent reporting of mapping studies. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items was created by a working group comprised of six health economists and one Delphi methodologist. A two-round, modified Delphi survey with representatives from academia, consultancy, health technology assessment agencies, and the biomedical journal editorial community was used to identify a list of essential reporting items from this larger list. From the initial de novo list of twenty-nine candidate items, a set of twenty-three essential reporting items was developed. The items are presented numerically and categorized within six sections, namely: (i) title and abstract, (ii) introduction, (iii) methods, (iv) results, (v) discussion, and (vi) other. The MAPS statement is best applied in conjunction with the accompanying MAPS explanation and elaboration document. It is anticipated that the MAPS statement will improve the clarity, transparency, and completeness of reporting of mapping studies. To facilitate dissemination and uptake, the MAPS statement is being co-published by seven health economics and quality of life journals, and broader endorsement is encouraged. The MAPS working group plans to assess the need for an update of the reporting checklist in five years' time.

  5. A variational principle for compressible fluid mechanics. Discussion of the one-dimensional theory

    NASA Technical Reports Server (NTRS)

    Prozan, R. J.

    1982-01-01

    The second law of thermodynamics is used as a variational statement to derive a numerical procedure to satisfy the governing equations of motion. The procedure, based on numerical experimentation, appears to be stable provided the CFL condition is satisfied. This stability is manifested no matter how severe the gradients (compression or expansion) are in the flow field. For reasons of simplicity only one dimensional inviscid compressible unsteady flow is discussed here; however, the concepts and techniques are not restricted to one dimension nor are they restricted to inviscid non-reacting flow. The solution here is explicit in time. Further study is required to determine the impact of the variational principle on implicit algorithms.

  6. Deductive and inductive reasoning in obsessive-compulsive disorder.

    PubMed

    Pélissier, Marie-Claude; O'Connor, Kieron P

    2002-03-01

    This study tested the hypothesis that people with obsessive-compulsive disorder (OCD) show an inductive reasoning style distinct from people with generalized anxiety disorder (GAD) and from participants in a non-anxious (NA) control group. The experimental procedure consisted of administering a range of six deductive and inductive tasks and a probabilistic task in order to compare reasoning processes between groups. Recruitment was in the Montreal area within a French-speaking population. The participants were 12 people with OCD, 12 NA controls and 10 people with GAD. Participants completed a series of written and oral reasoning tasks including the Wason Selection Task, a Bayesian probability task and other inductive tasks, designed by the authors. There were no differences between groups in deductive reasoning. On an inductive "bridging task", the participants with OCD always took longer than the NA control and GAD groups to infer a link between two statements and to elaborate on this possible link. The OCD group alone showed a significant decrease in their degree of conviction about an arbitrary statement after inductively generating reasons to support this statement. Differences in probabilistic reasoning replicated those of previous authors. The results pinpoint the importance of examining inference processes in people with OCD in order to further refine the clinical applications of behavioural-cognitive therapy for this disorder.

  7. Probability density of tunneled carrier states near heterojunctions calculated numerically by the scattering method.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, William R.; Myers, Samuel M.; Modine, Normand A.

    2017-09-01

    The energy-dependent probability density of tunneled carrier states for arbitrarily specified longitudinal potential-energy profiles in planar bipolar devices is numerically computed using the scattering method. Results agree accurately with a previous treatment based on solution of the localized eigenvalue problem, where computation times are much greater. These developments enable quantitative treatment of tunneling-assisted recombination in irradiated heterojunction bipolar transistors, where band offsets may enhance the tunneling effect by orders of magnitude. The calculations also reveal the density of non-tunneled carrier states in spatially varying potentials, and thereby test the common approximation of uniform-bulk values for such densities.

  8. A very efficient approach to compute the first-passage probability density function in a time-changed Brownian model: Applications in finance

    NASA Astrophysics Data System (ADS)

    Ballestra, Luca Vincenzo; Pacelli, Graziella; Radi, Davide

    2016-12-01

    We propose a numerical method to compute the first-passage probability density function in a time-changed Brownian model. In particular, we derive an integral representation of such a density function in which the integrand functions must be obtained solving a system of Volterra equations of the first kind. In addition, we develop an ad-hoc numerical procedure to regularize and solve this system of integral equations. The proposed method is tested on three application problems of interest in mathematical finance, namely the calculation of the survival probability of an indebted firm, the pricing of a single-knock-out put option and the pricing of a double-knock-out put option. The results obtained reveal that the novel approach is extremely accurate and fast, and performs significantly better than the finite difference method.
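The paper's Volterra-equation machinery is specific to its time-changed model, but the target quantity is easy to cross-check by brute force. The sketch below is an illustrative Monte Carlo baseline, not the authors' method: it estimates the first-passage distribution of a standard Brownian motion to a constant barrier and compares it with the known closed form P(tau_b <= t) = 2(1 - Phi(b/sqrt(t))); all parameter values are assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def first_passage_times(barrier=1.0, horizon=5.0, dt=1e-3, n_paths=10000):
    """Monte Carlo first-passage times of a standard Brownian motion to a
    constant upper barrier; paths that never hit by `horizon` get np.inf."""
    n_steps = int(horizon / dt)
    times = np.full(n_paths, np.inf)
    x = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)
    for k in range(1, n_steps + 1):
        x[alive] += sqrt(dt) * rng.standard_normal(alive.sum())
        hit = alive & (x >= barrier)
        times[hit] = k * dt
        alive &= ~hit
        if not alive.any():
            break
    return times

def cdf_exact(t, b=1.0):
    """Reflection-principle closed form: P(tau_b <= t) = 2(1 - Phi(b/sqrt(t)))."""
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(b / sqrt(t) / sqrt(2.0))))

tau = first_passage_times()
est = float(np.mean(tau <= 2.0))
print(est, cdf_exact(2.0))   # the two values should agree to about 1e-2
```

Discrete monitoring slightly under-counts crossings, so the Monte Carlo estimate sits a little below the exact CDF; shrinking `dt` reduces that bias.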

  9. Predicting the probability of slip in gait: methodology and distribution study.

    PubMed

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
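The single-integral idea in the abstract can be sketched directly. Assuming independent available and required friction distributions, P(slip) = P(available < required) = the integral of f_R(x)·F_A(x) dx, evaluated here with the trapezoidal rule; the normal distributions and their parameters are illustrative assumptions, not the study's data, and the normal case is used only because it has a closed form to check against.

```python
import numpy as np
from math import erf, sqrt

def trapezoid(y, x):
    """Composite trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def p_slip(pdf_required, cdf_available, lo=0.0, hi=2.0, n=20001):
    """Single-integral probability of slip for independent distributions:
    P(available < required) = integral of f_R(x) * F_A(x) dx."""
    x = np.linspace(lo, hi, n)
    return trapezoid(pdf_required(x) * cdf_available(x), x)

def normal_pdf(mu, sd):
    return lambda x: np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def normal_cdf(mu, sd):
    e = np.vectorize(erf)
    return lambda x: 0.5 * (1.0 + e((x - mu) / (sd * sqrt(2.0))))

# Illustrative values: required friction ~ N(0.25, 0.05), available ~ N(0.45, 0.08)
p = p_slip(normal_pdf(0.25, 0.05), normal_cdf(0.45, 0.08))
# For two normals the answer also has a closed form, used here as an error check:
exact = 0.5 * (1.0 + erf((0.25 - 0.45) / sqrt(0.05**2 + 0.08**2) / sqrt(2.0)))
print(p, exact)
```

Because `p_slip` only needs a pdf and a cdf as callables, any combination of distributions can be plugged in, which is the flexibility the abstract emphasizes.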

  10. Verbal versus Numerical Probabilities: Does Format Presentation of Probabilistic Information regarding Breast Cancer Screening Affect Women's Comprehension?

    ERIC Educational Resources Information Center

    Vahabi, Mandana

    2010-01-01

    Objective: To test whether the format in which women receive probabilistic information about breast cancer and mammography affects their comprehension. Methods: A convenience sample of 180 women received pre-assembled randomized packages containing a breast health information brochure, with probabilities presented in either verbal or numeric…

  11. Increases in Phonotactic Probability Facilitate Spoken Nonword Repetition

    ERIC Educational Resources Information Center

    Vitevitch, M.S.; Luce, P.A.

    2005-01-01

    Lipinski and Gupta (2005) report the results of 12 experiments and numerous analyses that attempted to examine further the effects of phonotactic probability originally reported in Vitevitch and Luce (1998, and further explored in Vitevitch & Luce 1999). They suggested that Vitevitch and Luce's results were due to differences in the duration of…

  12. Multipartite entanglement characterization of a quantum phase transition

    NASA Astrophysics Data System (ADS)

    Costantini, G.; Facchi, P.; Florio, G.; Pascazio, S.

    2007-07-01

    A probability density characterization of multipartite entanglement is tested on the one-dimensional quantum Ising model in a transverse field. The average and second moment of the probability distribution are numerically shown to be good indicators of the quantum phase transition. We comment on multipartite entanglement generation at a quantum phase transition.

  13. Use of artificial landscapes to isolate controls on burn probability

    Treesearch

    Marc-Andre Parisien; Carol Miller; Alan A. Ager; Mark A. Finney

    2010-01-01

    Techniques for modeling burn probability (BP) combine the stochastic components of fire regimes (ignitions and weather) with sophisticated fire growth algorithms to produce high-resolution spatial estimates of the relative likelihood of burning. Despite the numerous investigations of fire patterns from either observed or simulated sources, the specific influence of...

  14. The Neural Correlates of Health Risk Perception in Individuals with Low and High Numeracy

    ERIC Educational Resources Information Center

    Vogel, Stephan E.; Keller, Carmen; Koschutnig, Karl; Reishofer, Gernot; Ebner, Franz; Dohle, Simone; Siegrist, Michael; Grabner, Roland H.

    2016-01-01

    The ability to use numerical information in different contexts is a major goal of mathematics education. In health risk communication, outcomes of a medical condition are frequently expressed in probabilities. Difficulties to accurately represent probability information can result in unfavourable medical decisions. To support individuals with…

  15. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): the TRIPOD statement.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-01-07

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web based survey and revised during a three day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. 
To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). To encourage dissemination of the TRIPOD Statement, this article is freely accessible on the Annals of Internal Medicine Web site (www.annals.org) and will also be published in BJOG, British Journal of Cancer, British Journal of Surgery, BMC Medicine, The BMJ, Circulation, Diabetic Medicine, European Journal of Clinical Investigation, European Urology, and Journal of Clinical Epidemiology. The authors jointly hold the copyright of this article. An accompanying explanation and elaboration article is freely available only on www.annals.org; Annals of Internal Medicine holds copyright for that article. © BMJ Publishing Group Ltd 2014.

  16. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis or Diagnosis (TRIPOD): The TRIPOD Statement.

    PubMed

    Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M

    2015-06-01

    Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evidence shows that the quality of reporting of prediction model studies is poor. Only with full and clear reporting of information on all aspects of a prediction model can risk of bias and potential usefulness of prediction models be adequately assessed. The Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) Initiative developed a set of recommendations for the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes. This article describes how the TRIPOD Statement was developed. An extensive list of items based on a review of the literature was created, which was reduced after a Web-based survey and revised during a 3-day meeting in June 2011 with methodologists, health care professionals, and journal editors. The list was refined during several meetings of the steering group and in e-mail discussions with the wider group of TRIPOD contributors. The resulting TRIPOD Statement is a checklist of 22 items, deemed essential for transparent reporting of a prediction model study. The TRIPOD Statement aims to improve the transparency of the reporting of a prediction model study regardless of the study methods used. The TRIPOD Statement is best used in conjunction with the TRIPOD explanation and elaboration document. To aid the editorial process and readers of prediction model studies, it is recommended that authors include a completed checklist in their submission (also available at www.tripod-statement.org). 
Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Secondary school biology teaching, 1983--2004: Objectives as stated in periodical literature

    NASA Astrophysics Data System (ADS)

    Russell, James W., Sr.

    Purpose of the study. The major purpose of this study was to identify and to classify objectives for teaching biology in secondary school in the United States during the period 1983-2004. These objectives were identified by objective statements in articles from selected professional periodicals. Procedure. The 1983-2004 period was divided into four subperiods on the basis of major historical events. Selected professional periodicals were searched for statements of objectives of secondary school biology teaching. These statements were catalogued into Knowledge, Process, Product, Attitude and Interest, or Cultural Awareness categories. The resulting data were classified within and across the four subperiods according to frequency of occurrence, category, authorship, and year. Findings. The major findings of this investigation included the following: (1) Authors in Higher Education produced the most articles and the most statements in each subperiod. Miscellaneous authors produced the fewest articles and statements. (2) Statements in the Attitude and Interest category were the most frequent in the four subperiods. (3) The "most important" objectives for secondary school biology teaching were Presents major facts, principles, or fundamentals (from the Knowledge category), Expresses scientific attitudes and appreciation, Identifies the nature of science and scientists, and Identifies scientific interest and career development (from the Attitude and Interest category), and Develops scientific method of thinking (from the Process category). Conclusions. Based on the findings of this investigation, the following conclusions were made: (1) The objectives for teaching secondary school biology were influenced by historical events, especially the publication of A Nation at Risk: The Imperative for Educational Reform in 1983, America 2000 in 1988, Goals 2000 in 1994, and No Child Left Behind in 2000. 
The rapid growth and expansion of technology and the World Wide Web during the time span of the study also influenced the number of objectives. (2) Authors in Higher Education wrote more articles about the objectives for the teaching of secondary school biology than those in Secondary Education or other categories. This was probably a reflection of the "publish or perish" environment in many colleges and universities.

  19. Study on Effects of the Stochastic Delay Probability for 1d CA Model of Traffic Flow

    NASA Astrophysics Data System (ADS)

    Xue, Yu; Chen, Yan-Hong; Kong, Ling-Jiang

    Considering the effects of different factors on the stochastic delay probability, the delay probability has been classified into three cases. The first case, corresponding to the brake state, has a large delay probability when the anticipated velocity is larger than the gap to the preceding car. The second case, corresponding to the follow-the-leader rule, has an intermediate delay probability when the anticipated velocity equals the gap. The third case is acceleration, which has the minimum delay probability. The fundamental diagram obtained by numerical simulation shows properties different from those of the NaSch model: there exist two distinct regions, corresponding to the coexistence state and the jamming state, respectively.
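The three-case rule above can be sketched as a NaSch-style cellular automaton on a ring road. This is a minimal illustration, not the paper's exact model: the three delay probabilities and the density are made-up values, chosen only so that braking is penalized most and free acceleration least.

```python
import numpy as np

rng = np.random.default_rng(1)

def step(pos, vel, L, vmax, p_brake=0.6, p_follow=0.3, p_acc=0.05):
    """One parallel update of a NaSch-style ring road in which the stochastic
    delay probability depends on how the anticipated velocity compares with
    the gap ahead: braking (largest p), following the leader (intermediate),
    or free acceleration (smallest p). The three p values are illustrative."""
    gaps = (np.roll(pos, -1) - pos - 1) % L        # empty cells to the car ahead
    new_vel = np.empty_like(vel)
    for i in range(len(pos)):
        v_want = min(vel[i] + 1, vmax)             # anticipated velocity
        if v_want > gaps[i]:
            p = p_brake                            # brake state
        elif v_want == gaps[i]:
            p = p_follow                           # follow the leader
        else:
            p = p_acc                              # free acceleration
        v = min(v_want, gaps[i])                   # never drive into the leader
        if rng.random() < p:
            v = max(v - 1, 0)                      # stochastic delay
        new_vel[i] = v
    return (pos + new_vel) % L, new_vel

L, N, vmax = 200, 50, 5                            # density 0.25
pos = np.sort(rng.choice(L, size=N, replace=False))
vel = np.zeros(N, dtype=int)
for _ in range(500):
    pos, vel = step(pos, vel, L, vmax)
print(vel.mean() * N / L)                          # one point on the fundamental diagram
```

Sweeping the density and recording the flux traces out the fundamental diagram discussed in the abstract.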

  20. Prioritizing the School Environment in School Violence Prevention Efforts

    PubMed Central

    Burke, Jessica Griffin; Gielen, Andrea Carlson

    2015-01-01

    Background Numerous studies have demonstrated an association between characteristics of the school environment and the likelihood of school violence. However, little is known about the relative importance of various characteristics of the school environment or their differential impact on multiple violence outcomes. Methods Primarily African-American students (n=27) from Baltimore City high schools participated in concept mapping sessions, which produced interpretable maps of the school environment's contribution to school violence. Participants generated statements about their school environment's influence on school violence and with the assistance of quantitative methods grouped these statements according to their similarity. Participants provided information about the importance of each of these statements for the initiation, cessation, and severity of the violence that occurs at school. Results More than half of the 132 statements generated by students were rated as school environment characteristics highly important for the initiation, cessation, and/or severity of school violence. Participants identified students' own actions, expectations for disruptive behavior, and the environment outside the school as characteristics most important for the initiation and increased severity of violence that occurs in school. Participants had a more difficult time identifying school environment characteristics important for the cessation of school violence. Conclusion This study provides support from students for the role of the school environment in school violence prevention, particularly in preventing the initiation and reducing the severity of school violence. Schools can utilize the information presented in this paper to begin discussions with students and staff about prioritizing school environment changes to reduce school violence. PMID:21592128

  1. Essential Public Health Competencies for Medical Students: Establishing a Consensus in Family Medicine.

    PubMed

    Morley, Christopher P; Rosas, Scott R; Mishori, Ranit; Jordan, William; Jarris, Yumi Shitama; Competencies Work Group, Family Medicine/Public Health; Prunuske, Jacob

    2017-01-01

    Phenomenon: The integration of public health (PH) competency training into medical education, and further integration of PH and primary care, has been urged by the U.S. Institute of Medicine. However, PH competencies are numerous, and no consensus exists over which competencies are most important for adoption by current trainees. Our objective was to conduct a group concept mapping exercise with stakeholders identifying the most important and feasible PH skills to incorporate in medical and residency curricula. We utilized a group concept mapping technique via the Concept System Global Max ( http://www.conceptsystems.com ), where family medicine educators and PH professionals completed the phrase, "A key Public Health competency for physicians-in-training to learn is …" with 1-10 statements. The statement list was edited for duplication and other issues; stakeholders then sorted the statements and rated them for importance and feasibility of integration. Multidimensional scaling and cluster analysis were used to create a two-dimensional point map of domains of PH training, allowing visual comparison of groupings of related ideas and relative importance of these ideas. There were 116 nonduplicative statements (225 total) suggested by 120 participants. Three metacategories of competencies emerged: Clinic, Community & Culture; Health System Understanding; and Population Health Science & Data. Insights: We identified and organized a set of topics that serve as a foundation for the integration of family medicine and PH education. Incorporating these topics into medical education is viewed as important and feasible by family medicine educators and PH professionals.

  2. Blocking performance approximation in flexi-grid networks

    NASA Astrophysics Data System (ADS)

    Ge, Fei; Tan, Liansheng

    2016-12-01

    The blocking probability of path requests is an important issue in flexible-bandwidth optical communications. In this paper, we propose a method for approximating the blocking probability of path requests in flexi-grid networks. It models the allocation of bundled neighboring carriers with a group of birth-death processes and provides a theoretical analysis of the blocking probability under variable-bandwidth traffic. The numerical results show the effect of traffic parameters on the blocking probability of path requests. In simulations, we use the first-fit algorithm at network nodes to allocate neighboring carriers to path requests, and verify the approximation results.
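The simulation side of the abstract, first-fit allocation of contiguous neighboring carriers on a link, is simple to sketch. The slot count and demands below are illustrative; a request that finds no contiguous free run is blocked, and the fraction of such requests is what the birth-death analysis approximates.

```python
def first_fit(spectrum, demand):
    """Allocate `demand` contiguous frequency slots on one link using first
    fit; `spectrum` is a list of booleans (True = occupied). Returns the
    start slot index, or None if the request is blocked."""
    run = 0
    for i, busy in enumerate(spectrum):
        run = 0 if busy else run + 1
        if run == demand:
            start = i - demand + 1
            for j in range(start, i + 1):
                spectrum[j] = True        # occupy the neighboring carriers
            return start
    return None                           # blocked: counts toward blocking probability

link = [False] * 10                       # a 10-slot link, initially free
print(first_fit(link, 3))                 # -> 0     (slots 0-2)
print(first_fit(link, 4))                 # -> 3     (slots 3-6)
print(first_fit(link, 5))                 # -> None  (only 3 contiguous slots remain)
```

Note that the last request is blocked even though 3 slots are still free: spectrum fragmentation, not raw capacity, drives blocking in flexi-grid networks.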

  3. Height probabilities in the Abelian sandpile model on the generalized finite Bethe lattice

    NASA Astrophysics Data System (ADS)

    Chen, Haiyan; Zhang, Fuji

    2013-08-01

    In this paper, we study the sandpile model on the generalized finite Bethe lattice with a particular boundary condition. Using a combinatorial method, we give the exact expressions for all single-site probabilities and some two-site joint probabilities. As a by-product, we prove that the height probabilities of bulk vertices are all the same for the Bethe lattice with certain given boundary condition, which was found from numerical evidence by Grassberger and Manna ["Some more sandpiles," J. Phys. (France) 51, 1077-1098 (1990)], 10.1051/jphys:0199000510110107700 but without a proof.
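The paper's result is combinatorial and specific to the generalized Bethe lattice, but height probabilities of the Abelian sandpile are easy to estimate numerically. The sketch below uses a small square lattice with open (dissipative) boundary instead of a Bethe lattice, purely as an illustration of how single-site height probabilities are sampled; lattice size, burn-in, and sample counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def relax(h):
    """Topple every site with height >= 4 until the configuration is stable
    (open boundary: grains toppled off the edge leave the system)."""
    while True:
        unstable = np.argwhere(h >= 4)
        if unstable.size == 0:
            return h
        for i, j in unstable:
            h[i, j] -= 4
            if i > 0: h[i - 1, j] += 1
            if i < h.shape[0] - 1: h[i + 1, j] += 1
            if j > 0: h[i, j - 1] += 1
            if j < h.shape[1] - 1: h[i, j + 1] += 1

n, burn_in, samples = 15, 2000, 3000
h = np.zeros((n, n), dtype=int)
counts = np.zeros(4)
for t in range(burn_in + samples):
    i, j = rng.integers(n, size=2)
    h[i, j] += 1                     # drop a grain on a random site
    relax(h)
    if t >= burn_in:                 # sample heights in the recurrent regime
        counts += np.bincount(h.ravel(), minlength=4)[:4]
probs = counts / counts.sum()
print(probs)                         # empirical P(height = 0), ..., P(height = 3)
```

On the square lattice the empirical ordering P(3) > P(2) > P(1) > P(0) matches the known bulk values; on the Bethe lattice the paper proves the corresponding probabilities exactly.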

  4. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. 
These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non-rich vectors, does not involve variational theory and does not involve differential equations, but is a better approximation of the minimal entropy path distance than the distance ||b-a||_2. We compute minimal entropy distance matrices for examples of DNA myostatin genes and amino-acid sequences across several species. Output tree dendrograms for our minimal entropy metric are compared with dendrograms based on BLAST and BLAST identity scores.
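The optimal-path machinery (Euler-Lagrange equations, two-point boundary value problem) is not reproduced here, but the path-integral definition itself is easy to evaluate on a fixed admissible path. The sketch below, an illustration under stated assumptions, computes the entropy path integral along the straight line between two made-up base-frequency profiles; because the straight path is admissible but generally not optimal, the result is an upper bound on the minimal-entropy distance, and it can be compared with the Euclidean distance ||b-a||_2 mentioned in the abstract.

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    p = np.clip(p, eps, 1.0)
    return float(-(p * np.log(p)).sum())

def straight_path_cost(a, b, n=1001):
    """Path integral of H(p) with respect to arc length along the straight
    line p(t) = (1-t)a + tb. An upper bound on the minimal-entropy distance,
    since the straight path is admissible but not necessarily optimal."""
    t = np.linspace(0.0, 1.0, n)
    pts = (1 - t)[:, None] * a + t[:, None] * b
    H = np.array([entropy(p) for p in pts])
    ds = np.linalg.norm(b - a) / (n - 1)     # constant-speed parameterization
    return float(np.sum((H[1:] + H[:-1]) / 2.0) * ds)

a = np.array([0.7, 0.1, 0.1, 0.1])           # illustrative A,C,G,T frequency profiles
b = np.array([0.1, 0.1, 0.1, 0.7])
cost = straight_path_cost(a, b)
print(cost, float(np.linalg.norm(b - a)))
```

Here the entropy-weighted path cost exceeds ||b-a||_2 because H(p) stays above 0.9 along the whole segment; the paper's optimal paths can only lower this cost.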

  5. Communicating Disclosure Risk in Informed Consent Statements

    PubMed Central

    Singer, Eleanor; Couper, Mick P.

    2011-01-01

    For several years, we have experimented with various ways of communicating disclosure risk and harm to respondents in order to determine how these affect their willingness to participate in surveys. These experiments, which used vignettes administered to an online panel as well as a mail survey sent to a national probability sample, have demonstrated that (a) the probability of disclosure alone has no apparent effect on people's willingness to participate in the survey described, (b) the sensitivity of the survey topic has such an effect, and (c) making explicit the possible harms that might result from disclosure also reduces willingness to participate, in both the vignette and the mail experiments. As a last study in this series, we experimented with different ways of describing disclosure risk in informed consent statements that might more plausibly be used in real surveys, again using vignettes administered to an online panel. As suggested by our earlier work, we found that the precise wording of the confidentiality assurance had little effect on respondents' stated willingness to participate in the hypothetical survey described. However, the experimental manipulations did have some effect on perceptions of the risks and benefits of participation, suggesting that they are processed by respondents. And, as we have found in our previous studies, the topic of the survey has a consistent and statistically significant effect on stated willingness to participate. We explore some implications of these findings for researchers seeking to provide adequate information to potential survey respondents without alarming them unnecessarily. PMID:20831416

  6. Equivalence principle for quantum systems: dephasing and phase shift of free-falling particles

    NASA Astrophysics Data System (ADS)

    Anastopoulos, C.; Hu, B. L.

    2018-02-01

    We ask the question of how the (weak) equivalence principle established in classical gravitational physics should be reformulated and interpreted for massive quantum objects that may also have internal degrees of freedom (dof). This inquiry is necessary because even elementary concepts like a classical trajectory are not well defined in quantum physics—trajectories originating from quantum histories become viable entities only under stringent decoherence conditions. From this investigation we posit two logically and operationally distinct statements of the equivalence principle for quantum systems. Version A: the probability distribution of position for a free-falling particle is the same as the probability distribution of a free particle, modulo a mass-independent shift of its mean. Version B: any two particles with the same velocity wave-function behave identically in free fall, irrespective of their masses. Both statements apply to all quantum states, including those without a classical correspondence, and also for composite particles with quantum internal dof. We also investigate the consequences of the interaction between internal and external dof induced by free fall. For a class of initial states, we find dephasing occurs for the translational dof, namely, the suppression of the off-diagonal terms of the density matrix, in the position basis. We also find a gravitational phase shift in the reduced density matrix of the internal dof that does not depend on the particle’s mass. For classical states, the phase shift has a natural classical interpretation in terms of gravitational red-shift and special relativistic time-dilation.

  7. Screening Smoke Performance of Commercially Available Powders. 2. Visible Screening by Titanium Dioxide

    DTIC Science & Technology

    1994-06-01

    Approved for public release; distribution is ... the weight-, volume-, and financial-limited figures of merit for numerous grades of commercially available titania are listed in Table 1 ... is useful in volume-limited applications such as grenades, rockets, artillery rounds, mortars and smoke pots. The third figure of merit gives the

  8. Environmental impact statement for National Aeronautics and Space Administration Lewis Research Center, Cleveland, Ohio

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The probable environmental impact and adverse effects of the Lewis Research Center are assessed. The Cleveland and Plum Brook facilities are briefly described. The absence of harmful environmental impact from the Cleveland site is apparent, and monitoring at the Plum Brook reactor facility shows the effectiveness of effluent controls. The probable adverse effects are considered for air, water, and noise pollution, and for radioactive and hazardous waste storage and disposal; it is concluded that all emissions are maintained below Federal and local standards. There are no appropriate alternatives to the operation of the Center, and no improvement in environmental quality would result from relocation. The relationship between local short-term uses of the environment and long-term productivity is briefly discussed. No adverse comment has been received from public agencies, private organizations, or individuals.

  9. The Effect of Outcome Desirability on Comparisons of Numerical and Linguistic Probabilities

    DTIC Science & Technology

    1986-01-01

    ..."chance" with reference to the event that Shakespeare was thinking of Ann Hathaway when he wrote his twelfth sonnet. Beyth-Marom (1982) suggested other reasons for the use of non-numerical...

  10. Words or numbers? Communicating risk of adverse effects in written consumer health information: a systematic review and meta-analysis.

    PubMed

    Büchter, Roland Brian; Fechtelpeter, Dennis; Knelangen, Marco; Ehrlich, Martina; Waltering, Andreas

    2014-08-26

    Various types of framing can influence risk perceptions, which may have an impact on treatment decisions and adherence. One type of framing is the use of verbal terms to communicate the probabilities of treatment effects. We systematically reviewed the comparative effects of words versus numbers in communicating the probability of adverse effects to consumers in written health information. Nine electronic databases were searched up to November 2012. Teams of two reviewers independently assessed studies. Inclusion criteria were: randomised controlled trials; verbal versus numerical presentation; and a context of written consumer health information. Ten trials were included. Participants perceived probabilities presented in verbal terms as higher than those presented in numeric terms: commonly used verbal descriptors systematically led to an overestimation of the absolute risk of adverse effects (range of means: 3%-54%). Numbers also led to an overestimation of probabilities, but the overestimation was smaller (2%-20%). The difference in means ranged from 3.8% to 45.9%, with all but one comparison showing significant results. Use of numbers increased satisfaction with the information (MD: 0.48 [CI: 0.32 to 0.63], p < 0.00001, I² = 0%) and likelihood of medication use (MD for very common side effects: 1.45 [CI: 0.78 to 2.11], p = 0.0001, I² = 68%; MD for common side effects: 0.90 [CI: 0.61 to 1.19], p < 0.00001, I² = 1%; MD for rare side effects: 0.39 [0.02 to 0.76], p = 0.04, I² not applicable). Outcomes were measured on a 6-point Likert scale, suggesting small to moderate effects. Verbal descriptors such as "common", "uncommon" and "rare", if used as previously suggested by the European Commission, lead to an overestimation of the probability of adverse effects compared to numerical information. Numbers result in more accurate estimates and increase satisfaction and likelihood of medication use. Our review suggests that providers of consumer health information should quantify treatment effects numerically. Future research should focus on the impact of personal and contextual factors, use representative samples or be conducted in real-life settings, measure behavioral outcomes, and address whether benefit information can be described verbally.

  11. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    This study deals with multiobjective fuzzy stochastic linear programming problems whose uncertain probability distributions are defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, together with two solution strategies: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
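The α-cut mentioned in the abstract replaces a fuzzy quantity with a crisp interval at each membership level α. A minimal sketch for a triangular fuzzy number (the triangular shape and the numeric values are illustrative assumptions, not taken from the paper):

```python
def alpha_cut_triangular(a, b, c, alpha):
    """Alpha-cut [L, U] of a triangular fuzzy number (a, b, c),
    where b is the peak and [a, c] is the support."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    # Left and right sides of the triangle, cut at height alpha:
    return (a + alpha * (b - a), c - alpha * (c - b))

# Hypothetical fuzzy probability "about 0.3", encoded as (0.2, 0.3, 0.4):
low, high = alpha_cut_triangular(0.2, 0.3, 0.4, alpha=0.5)
print(low, high)  # interval bounds a crisp interval program could then use
```

At α = 1 the cut collapses to the peak b; at α = 0 it recovers the full support [a, c].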

  12. Medical concepts related to individual risk are better explained with "plausibility" rather than "probability".

    PubMed

    Grossi, Enzo

    2005-09-27

    The concept of risk has pervaded the medical literature in recent decades and has become a familiar topic, and the concept of probability, linked to a binary-logic approach, is commonly applied in epidemiology and clinical medicine. The application of probability theory to groups of individuals is quite straightforward but can pose communication challenges at the individual level. Few articles, however, have tried to focus the concept of "risk" at the level of the individual subject rather than the population. The author reviews the conceptual framework that led to the use of probability theory in medicine at a time when the principal causes of death were acute diseases, often of infective origin. In the present scenario, in which chronic degenerative diseases dominate and there are smooth transitions between health and disease, the use of fuzzy logic rather than binary logic would be more appropriate. Fuzzy logic, in which more than two truth-value assignments are allowed, overcomes the traps of probability theory when dealing with uncertain outcomes, thereby making the meaning of a prognostic statement easier for the patient to understand. At the individual level, recourse to the term "plausibility", related to fuzzy logic, would help the physician communicate with the patient more effectively than the term "probability", related to binary logic. This would represent an evident advantage for the transfer of medical evidence to individual subjects.

  13. Numerical Representations and Intuitions of Probabilities at 12 Months

    ERIC Educational Resources Information Center

    Téglás, Erno; Ibanez-Lillo, Alexandra; Costa, Albert; Bonatti, Luca L.

    2015-01-01

    Recent research shows that preverbal infants can reason about single-case probabilities without relying on observed frequencies, adapting their predictions to relevant dynamic parameters of the situation (Téglás, Vul, Girotto, Gonzalez, Tenenbaum & Bonatti, 2011; Téglás, Girotto, Gonzalez & Bonatti, 2007). Here…

  14. Multicriteria decision analysis applied to Glen Canyon Dam

    USGS Publications Warehouse

    Flug, M.; Seitz, H.L.H.; Scott, J.F.

    2000-01-01

    Conflicts in water resources exist because river-reservoir systems are managed to optimize traditional benefits (e.g., hydropower and flood control), which are historically quantified in economic terms, whereas natural and environmental resources, including in-stream and riparian resources, are more difficult or impossible to quantify in economic terms. Multicriteria decision analysis provides a quantitative approach to evaluate resources subject to river basin management alternatives. This objective quantification method includes inputs from special interest groups, the general public, and concerned individuals, as well as professionals for each resource considered in a trade-off analysis. Multicriteria decision analysis is applied to resources and flow alternatives presented in the environmental impact statement for Glen Canyon Dam on the Colorado River. A numeric rating and priority-weighting scheme is used to evaluate 29 specific natural resource attributes, grouped into seven main resource objectives, for nine flow alternatives enumerated in the environmental impact statement.

  15. Geotechnical support and topical studies for nuclear waste geologic repositories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    The present report lists the technical reviews and comments made during fiscal year 1988 and summarizes the technical progress of the topical studies. In the area of technical assistance, there were numerous activities, detailed in the next section. These included 24 geotechnical support activities: reviews of 6 Study Plans (SPs) and participation in 6 SP review workshops; review of one whole-document Site Characterization Plan (SCP) and participation in the Assembled Document SCP review workshops by 6 LBL reviewers; the hosting of a DOE program review; the rewriting of the project statement of work; 2 trips to technical and planning meetings; preparation of proposed work statements for two new topics for DOE; and 5 instances of technical assistance to DOE. These activities are described in a table in the following section, entitled "Geoscience Technical Support for Nuclear Waste Geologic Repositories."

  16. Arctic National Wildlife Refuge, Alaska. Hearing before the Committee on Energy and Natural Resources, United States Senate, One Hundredth Congress, First Session, Part 2, July 22, 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    This hearing consisted primarily of the testimonies of two witnesses: Roger Herrera, Manager, Exploration and Lands, Standard Oil Production Co.; and Tim Mahoney, Alaska Coalition, Washington, DC, representing the Sierra Club. The statements of these two, plus questions from the Committee, addressed six issues in the oil and gas production versus environment debate: (1) availability of water; (2) availability of gravel; (3) disposal of waste and toxic materials; (4) the concentrated caribou calving areas; (5) the environmental record at Prudhoe Bay; and (6) air-quality issues. Sen. Frank H. Murkowski of Alaska, noting the conflicting statements of the two witnesses, observed that many of the environmental questions raised were also raised for Prudhoe Bay; further, the problems are probably not as difficult as Mr. Mahoney foresees, but the solution is not as easy as Mr. Herrera, representing the oil interests, foresees.

  17. Statistical computation of tolerance limits

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1993-01-01

    Based on a new theory, two computer codes were developed to calculate exact statistical tolerance limits for normal distributions with unknown means and variances, for both the one-sided and two-sided cases of the tolerance factor k. The quantity k is defined equivalently in terms of the noncentral t-distribution by a probability equation. Two of the four mathematical methods employ the theory developed for the numerical simulation. Several algorithms for numerically integrating and iteratively root-solving the working equations were written to augment the simulation. The codes generate tables of k associated with varying values of the proportion and sample size for each given probability, to show the accuracy obtained for small sample sizes.
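For the one-sided case, the noncentral-t definition the abstract cites has a closed form that a few lines of SciPy can evaluate (a sketch using `scipy.stats`; this is the standard construction, not the NASA codes themselves):

```python
import math
from scipy.stats import nct, norm

def one_sided_tolerance_factor(n, coverage, confidence):
    """Exact one-sided normal tolerance factor k: the limit x_bar + k*s
    covers at least `coverage` of the population with probability
    `confidence`, via the noncentral t-distribution."""
    delta = norm.ppf(coverage) * math.sqrt(n)          # noncentrality
    return nct.ppf(confidence, df=n - 1, nc=delta) / math.sqrt(n)

k = one_sided_tolerance_factor(n=10, coverage=0.95, confidence=0.95)
print(round(k, 3))  # standard tables give about 2.911 for 95/95, n = 10
```

The two-sided factor has no such closed form, which is where the numerical integration and root-solving mentioned above come in.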

  18. Statistics of voids in hierarchical universes

    NASA Technical Reports Server (NTRS)

    Fry, J. N.

    1986-01-01

    As an alternative to N-point galaxy correlation function statistics, one can consider the distribution of holes: the probability that a volume of given size and shape is empty of galaxies. The probability of voids resulting from a variety of hierarchical patterns of clustering is considered, and these are compared with the results of numerical simulations and with observations. A scaling relation required by the hierarchical pattern of higher-order correlation functions is seen to be obeyed in the simulations, and the numerical results show a clear difference between neutrino models and cold-particle models: voids are more likely in neutrino universes. Observational data cannot yet distinguish between the models, but are close to being able to.
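The basic effect (clustering raises the void probability above the Poisson value) can be seen in a toy comparison. The negative-binomial form below is one simple clustered model chosen for illustration, not one of the hierarchical patterns analyzed in the paper, and the numbers are hypothetical:

```python
import math

def void_prob_poisson(nbar_V):
    # P0 = exp(-<N>) for an unclustered (Poisson) point process
    return math.exp(-nbar_V)

def void_prob_neg_binomial(nbar_V, xibar):
    # P0 = (1 + <N> * xibar)^(-1/xibar): a simple clustered model with
    # volume-averaged correlation strength xibar
    return (1.0 + nbar_V * xibar) ** (-1.0 / xibar)

NV = 5.0  # expected galaxy count in the test volume (hypothetical)
print(void_prob_poisson(NV))             # about 0.0067
print(void_prob_neg_binomial(NV, 1.0))   # about 0.167: voids far likelier
```

Stronger clustering packs galaxies into fewer regions, so a randomly placed volume is more often empty, which is the qualitative signature distinguishing the models in the abstract.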

  19. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
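A prediction-error update of the kind the abstract describes can be sketched as a delta rule on the rows of an estimated transition matrix (the specific update form, states, and rates here are illustrative assumptions, not the authors' exact algorithm):

```python
import random

def learn_transitions(transitions, n_states, lr=0.01):
    """Estimate P[s][s'] from observed (s, s') pairs using the state
    prediction error: P[s] += lr * (onehot(s') - P[s]). Each row stays
    normalized because the corrections to a row sum to zero."""
    P = [[1.0 / n_states] * n_states for _ in range(n_states)]
    for s, s_next in transitions:
        for j in range(n_states):
            target = 1.0 if j == s_next else 0.0
            P[s][j] += lr * (target - P[s][j])   # prediction-error update
    return P

# Generate a long walk from a known 2-state chain, then recover it:
random.seed(0)
true_P = [[0.8, 0.2], [0.3, 0.7]]
s, data = 0, []
for _ in range(20000):
    s_next = 0 if random.random() < true_P[s][0] else 1
    data.append((s, s_next))
    s = s_next
P_hat = learn_transitions(data, n_states=2)
print([round(p, 2) for p in P_hat[0]])  # close to [0.8, 0.2]
```

With a small, fixed learning rate the estimate tracks the true probabilities but keeps fluctuating around them, which is the trade-off between convergence and adaptability the paper analyzes.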

  20. A Continuous Method for Gene Flow

    PubMed Central

    Palczewski, Michal; Beerli, Peter

    2013-01-01

    Most modern population genetics inference methods are based on the coalescence framework. Methods that allow estimating parameters of structured populations commonly insert migration events into the genealogies. For these methods the calculation of the coalescence probability density of a genealogy requires a product over all time periods between events. Data sets that contain populations with high rates of gene flow among them require an enormous number of calculations. A new method, transition probability-structured coalescence (TPSC), replaces the discrete migration events with probability statements. Because the speed of calculation is independent of the amount of gene flow, this method allows calculating the coalescence densities efficiently. The current implementation of TPSC uses an approximation simplifying the interaction among lineages. Simulations and coverage comparisons of TPSC vs. MIGRATE show that TPSC allows estimation of high migration rates more precisely, but because of the approximation the estimation of low migration rates is biased. The implementation of TPSC into programs that calculate quantities on phylogenetic tree structures is straightforward, so the TPSC approach will facilitate more general inferences in many computer programs. PMID:23666937

  1. The Tightness of the Kesten-Stigum Reconstruction Bound of Symmetric Model with Multiple Mutations

    NASA Astrophysics Data System (ADS)

    Liu, Wenjian; Jammalamadaka, Sreenivasa Rao; Ning, Ning

    2018-02-01

    It is well known that reconstruction problems, as an interdisciplinary subject, have been studied in numerous contexts including statistical physics, information theory, and computational biology, to name a few. We consider a 2q-state symmetric model with two categories of q states each and three transition probabilities: the probability to remain in the same state, the probability to change states but remain in the same category, and the probability to change categories. We construct a nonlinear second-order dynamical system based on this model and show that the Kesten-Stigum reconstruction bound is not tight when q ≥ 4.

  2. The Coast Artillery Journal. Volume 59, Number 1, July 1923

    DTIC Science & Technology

    1923-07-01

    probably influenced by Nelson's statement: "By the time the enemy has beat our fleet soundly, they will do us no harm this year." (Mahan's Life of... The Determination of Azimuth by Means of the Binaural Sense, by Captain Richard B. Webb, C.A.C. ...known as the binaural sense. It is well known that a man standing in the open, where he is not bothered by echoes, is able, upon hearing a prolonged

  3. Strategic, Tactical and Doctrinal Military Concepts. The Deterrence Concept: A Synthesis Based on a Survey of the Literature.

    DTIC Science & Technology

    1980-04-01

    influence appears strongest, that the work is most abstract, and that the results have seemed to cause the most problems when applied to the real...deterrence to the world at large, for Churchill's clear exposition of the basic concept, and as probably the first major statement on deterrence...works cited are in general abstract and theoretical, or minor and irrelevant. Also, since most of the bibliography cited is from the 1950's (a single

  4. Weak Links: Stabilizers of Complex Systems from Proteins to Social Networks

    NASA Astrophysics Data System (ADS)

    Csermely, Peter

    Why do women stabilize our societies? Why can we enjoy and understand Shakespeare? Why are fruitflies uniform? Why do omnivorous eating habits aid our survival? Why is Mona Lisa's smile beautiful? Is there any answer to these questions? This book shows that the statement "weak links stabilize complex systems" holds the answer to all of the surprising questions above. The author (recipient of several distinguished science communication prizes) uses weak (low-affinity, low-probability) interactions as a thread to introduce a vast variety of networks, from proteins to ecosystems.

  5. Comment on "Null weak values and the past of a quantum particle"

    NASA Astrophysics Data System (ADS)

    Sokolovski, D.

    2018-04-01

    In a recent paper [Phys. Rev. A 95, 032110 (2017), 10.1103/PhysRevA.95.032110], Duprey and Matzkin investigated the meaning of vanishing weak values and their role in the retrodiction of the past of a preselected and postselected quantum system in the presence of interference. Here we argue that any proposition regarding the weak values should be understood as a statement about the probability amplitudes. With this in mind, we revisit some of the conclusions reached in Duprey and Matzkin's work.

  6. Draft Environmental Statement/Environmental Impact Report. North Bay Aqueduct (Phase II Facilities) Solano County, California.

    DTIC Science & Technology

    1981-06-01

    Blow-off pipes in urban areas will be fenced to screen them from children... required to "clear" Cache Slough for the... intake would be approximately... tip of Grand Island southeast of the intake locations... numbers of school-aged children. This disposal site, which would probably be best reached by... and shell beads to Pomo, Central Wintun, Wappo and Southern Maidu groups (Davis, 1974:34-35)... of the most distinctive aspects of

  7. Optimal Strategy in the "Price Is Right" Showcase Showdown: A Module for Students of Calculus and Probability

    ERIC Educational Resources Information Center

    Swenson, Daniel

    2015-01-01

    We walk through a module intended for undergraduates in mathematics, with the focus of finding the best strategies for competing in the Showcase Showdown on the game show "The Price Is Right." Students should have completed one semester of calculus, as well as some probability. We also give numerous suggestions for further questions that…
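One calculation at the heart of that module is exact: the wheel shows the twenty values $0.05, $0.10, ..., $1.00, so after a first spin x a second spin "busts" (total over $1.00) with an easily counted probability. A minimal sketch (module structure and variable names are my own, not the article's):

```python
from fractions import Fraction

WHEEL = [Fraction(k, 20) for k in range(1, 21)]  # 5 cents up to $1.00

def bust_probability(first):
    """Probability that a second spin pushes the running total over 1.00,
    counted exactly over the 20 equally likely wheel values."""
    return Fraction(sum(1 for v in WHEEL if first + v > 1), len(WHEEL))

print(bust_probability(Fraction(1, 2)))  # 1/2: the ten values 0.55..1.00 bust
```

In general bust_probability(k/20) = k/20, so the bust risk equals the first spin itself; comparing that risk with the chance that standing pat loses is what drives the optimal stopping thresholds the module derives.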

  8. Environmental Impact Statement for the New San Clemente Project, Monterey County, California - Regulatory Permit Application Number 16516S09.

    DTIC Science & Technology

    1987-09-01

    through 17. Each chapter is organized in three sections: 1) a description of the environmental setting; 2) an assessment of the environmental impacts...of operation of each alternative; and 3) an assessment of the environmental impacts of construction of each alternative. The environmental impacts of...involvement is discussed in Chapter 20. Contributions to the report are listed in Chapter 21. The District and their consultants have conducted numerous

  9. Mathematical modeling of radiative-conductive heat transfer in semitransparent medium with phase change

    NASA Astrophysics Data System (ADS)

    Savvinova, Nadezhda A.; Sleptsov, Semen D.; Rubtsov, Nikolai A.

    2017-11-01

    A mathematical phase-change model is a formulation of the Stefan problem. Various formulations of the Stefan problem for modeling radiative-conductive heat transfer during melting or solidification of a semitransparent material are presented. Analysis of the numerical results shows that radiative heat transfer has a significant effect on the temperature distributions during melting (solidification) of the semitransparent material. Conditions for the application of the various statements of the Stefan problem are analyzed.

  10. Flambeau Mining Corporation, Ladysmith, Rusk County, Wisconsin. Proposed Open Pit Copper Mine and Waste Containment Area, Draft Environmental Impact Statement.

    DTIC Science & Technology

    1976-08-01

    American elm; Lonicera tatarica - tartarian honeysuckle; Ulmus rubra - slippery elm; Sambucus canadensis - common elder; Ulmus thomasii - cork elm ...community borders the marshes and swamps. The predominant species are the trembling aspen (Populus tremuloides), red maple (Acer rubrum), and the elms... succession. The most numerous trees (in descending order) are: white birch, red maple, aspen, sugar maple, black ash, basswood, elm (Ulmus sp.), and hemlock

  11. Impact of Typhoons on the Western Pacific Ocean (ITOP) DRI:Numerical Modeling of Ocean Mixed Layer Turbulence and Entrainment at High Winds

    DTIC Science & Technology

    2013-09-23

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Impact of Typhoons on the Western Pacific Ocean (ITOP) DRI... measurement and modeling activities include a focus on the impact of surface waves, air-sea fluxes, and the temperature, salinity, and velocity structure... moment closure (SMC) to represent the impact of Langmuir turbulence. WORK COMPLETED: Encouraged by good quantitative comparisons between LES

  12. Closed-form solution of decomposable stochastic models

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1990-01-01

    Markov and semi-Markov processes are increasingly being used in the modeling of complex reconfigurable systems (fault tolerant computers). The estimation of the reliability (or some measure of performance) of the system reduces to solving the process for its state probabilities. Such a model may exhibit numerous states and complicated transition distributions, contributing to an expensive and numerically delicate solution procedure. Thus, when a system exhibits a decomposition property, either structurally (autonomous subsystems), or behaviorally (component failure versus reconfiguration), it is desirable to exploit this decomposition in the reliability calculation. In interesting cases there can be failure states which arise from non-failure states of the subsystems. Equations are presented which allow the computation of failure probabilities of the total (combined) model without requiring a complete solution of the combined model. This material is presented within the context of closed-form functional representation of probabilities as utilized in the Symbolic Hierarchical Automated Reliability and Performance Evaluator (SHARPE) tool. The techniques adopted enable one to compute such probability functions for a much wider class of systems at a reduced computational cost. Several examples show how the method is used, especially in enhancing the versatility of the SHARPE tool.
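The simplest case of such a decomposition is two independent up/down subsystems: each 2-state chain is solved on its own, and a series-system failure probability follows without ever building the 4-state product model. This is only a toy independence-based shortcut with hypothetical rates, not SHARPE's general closed-form machinery:

```python
def steady_availability(fail_rate, repair_rate):
    """Steady-state availability of a 2-state (up/down) Markov subsystem:
    pi_up = mu / (lambda + mu)."""
    return repair_rate / (fail_rate + repair_rate)

# Solve each subsystem separately, then combine for a series structure
# (system failed if either subsystem is down):
a1 = steady_availability(fail_rate=1e-3, repair_rate=1e-1)
a2 = steady_availability(fail_rate=2e-3, repair_rate=5e-2)
print(1 - a1 * a2)  # combined failure probability, no 4-state model needed
```

The interesting cases in the abstract are exactly those where this naive product breaks down, because failure states of the whole can arise from non-failure states of the parts; the paper's equations handle that coupling while still avoiding a full combined solution.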

  13. Infinite capacity multi-server queue with second optional service channel

    NASA Astrophysics Data System (ADS)

    Ke, Jau-Chuan; Wu, Chia-Huang; Pearn, Wen Lea

    2013-02-01

    This paper deals with an infinite-capacity multi-server queueing system with a second optional service (SOS) channel. The inter-arrival times of arriving customers and the service times of both the first essential service (FES) and the SOS channel are exponentially distributed. A customer may leave the system after the FES channel with probability 1-θ, or upon completion of the FES may immediately require a SOS with probability θ (0 ≤ θ ≤ 1). The formulae for computing the rate matrix and the stationary probabilities are derived by means of a matrix-analytical approach. A cost model is developed to determine the optimal values of the number of servers and the two service rates simultaneously, at the minimal total expected cost per unit time. A quasi-Newton method is employed to deal with the optimization problem. Under optimal operating conditions, numerical results are provided in which several system performance measures are calculated based on assumed numerical values of the system parameters.
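The service mechanism itself is easy to check by simulation: a customer's total service time is an exponential FES draw, plus an exponential SOS draw with probability θ, so its mean is 1/μ₁ + θ/μ₂. A Monte Carlo sketch with hypothetical rates (this verifies only the service-time model, not the matrix-analytic queue solution):

```python
import random

def sample_service(mu1, mu2, theta, rng):
    """One customer's total service time: exponential FES at rate mu1,
    plus an exponential SOS at rate mu2 with probability theta."""
    t = rng.expovariate(mu1)
    if rng.random() < theta:          # customer opts for the SOS
        t += rng.expovariate(mu2)
    return t

rng = random.Random(42)
mu1, mu2, theta = 2.0, 4.0, 0.25
n = 200_000
mean = sum(sample_service(mu1, mu2, theta, rng) for _ in range(n)) / n
print(round(mean, 3))  # analytic mean is 1/mu1 + theta/mu2 = 0.5625
```

Because the total service time is a mixture rather than a single exponential, the full system is not a plain M/M/c queue, which is why the paper needs the matrix-analytic treatment.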

  14. Approach to numerical safety guidelines based on a core melt criterion. [PWR; BWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azarm, M.A.; Hall, R.E.

    1982-01-01

    A plausible approach is proposed for translating a single top-level criterion into a set of numerical guidelines. A criterion on core melt probability is used to set numerical guidelines for the various core melt sequences and for system and component unavailabilities. These guidelines can be used as a means for making decisions about the necessity of replacing a component or improving part of a safety system. The approach is applied to estimate a set of numerical guidelines for the core melt sequences analyzed in the Reactor Safety Study for the Peach Bottom Nuclear Power Plant.

  15. Prioritizing the school environment in school violence prevention efforts.

    PubMed

    Johnson, Sarah Lindstrom; Burke, Jessica G; Gielen, Andrea C

    2011-06-01

    Numerous studies have demonstrated an association between characteristics of the school environment and the likelihood of school violence. However, little is known about the relative importance of various characteristics of the school environment or their differential impact on multiple violence outcomes. Primarily African-American students (n = 27) from Baltimore City high schools participated in concept mapping sessions, which produced interpretable maps of the school environment's contribution to school violence. Participants generated statements about their school environment's influence on school violence and, with the assistance of quantitative methods, grouped these statements according to their similarity. Participants provided information about the importance of each of these statements for the initiation, cessation, and severity of the violence that occurs at school. More than half of the 132 statements generated by students were rated as school environment characteristics highly important for the initiation, cessation, and/or severity of school violence. Participants identified students' own actions, expectations for disruptive behavior, and the environment outside the school as the characteristics most important for the initiation and increased severity of violence that occurs in school. Participants had a more difficult time identifying school environment characteristics important for the cessation of school violence. This study provides support from students for the role of the school environment in school violence prevention, particularly in preventing the initiation and reducing the severity of school violence. Schools can utilize the information presented in this article to begin discussions with students and staff about prioritizing school environment changes to reduce school violence. © 2011, American School Health Association.

  16. Does Iconicity in Pictographs Matter? The Influence of Iconicity and Numeracy on Information Processing, Decision Making, and Liking in an Eye-Tracking Study.

    PubMed

    Kreuzmair, Christina; Siegrist, Michael; Keller, Carmen

    2017-03-01

    Researchers recommend the use of pictographs in medical risk communication to improve people's risk comprehension and decision making. However, it is not yet clear whether the iconicity used in pictographs to convey risk information influences individuals' information processing and comprehension. In an eye-tracking experiment with participants from the general population (N = 188), we examined whether specific types of pictograph icons influence the processing strategy viewers use to extract numerical information. In addition, we examined the effect of iconicity and numeracy on probability estimation, recall, and icon liking. This experiment used a 2 (iconicity: blocks vs. restroom icons) × 2 (scenario: medical vs. nonmedical) between-subject design. Numeracy had a significant effect on information processing strategy, but we found no effect of iconicity or scenario. Results indicated that both icon types enabled high and low numerates to use their default way of processing and extracting the gist of the message from the pictorial risk communication format: high numerates counted icons, whereas low numerates used large-area processing. There was no effect of iconicity in the probability estimation. However, people who saw restroom icons had a higher probability of correctly recalling the exact risk level. Iconicity had no effect on icon liking. Although the effects are small, our findings suggest that person-like restroom icons in pictographs seem to have some advantages for risk communication. Specifically, in nonpersonalized prevention brochures, person-like restroom icons may maintain reader motivation for processing the risk information. © 2016 Society for Risk Analysis.

  17. Urticaria: "You're Probably Just Allergic to Something".

    PubMed

    Smallwood, Jordan

    2016-11-01

    Urticaria is a common symptom seen in pediatric patients, and it has multiple allergic and nonallergic causes. Unfortunately, it is far too common that when children present acutely with urticaria, they are told that it is an "allergy." This statement often leads to increased anxiety while the patient waits to be evaluated by an allergist/immunologist. This article discusses how frequently allergic reactions are involved in urticaria and provides examples of potential nonallergic causes. Additionally, the article discusses approaches to treatment that may be appropriate to initiate in the pediatrician's office or acute setting. This article is intended to provide a broader understanding of urticaria and its management in the outpatient or emergency setting so that we are able to tell our patients more than "you're probably just allergic to something." [Pediatr Ann. 2016;45(11):e399-e402.]. Copyright 2016, SLACK Incorporated.

  18. Numerical optimization using flow equations.

    PubMed

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.

  19. Numerical optimization using flow equations

    NASA Astrophysics Data System (ADS)

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.

  20. The Use of Input-Output Control System Analysis for Sustainable Development of Multivariable Environmental Systems

    NASA Astrophysics Data System (ADS)

    Koliopoulos, T. C.; Koliopoulou, G.

    2007-10-01

We present an input-output solution for simulating the associated behavior and optimized physical needs of an environmental system. The simulations and numerical analysis determined the accurate boundary loads and areas that were required to interact for the proper physical operation of a complicated environmental system. A case study was conducted to simulate the optimum balance of an environmental system based on an artificially intelligent, multi-interacting input-output numerical scheme. The numerical results were focused on probable further environmental management techniques, with the objective of minimizing any risks and associated environmental impact to protect the quality of public health and the environment. Our conclusions allowed us to minimize the associated risks, focusing on probable emergency cases to protect the surrounding anthropogenic or natural environment. Therefore, the lining magnitude could be determined for any useful associated technical works to support the environmental system under examination, taking into account its particular boundary necessities and constraints.

  1. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

The basic reproduction number of deterministic models is an essential quantity to predict whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge of disease control, elimination, and mitigation of infectious diseases. Relationships between basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models, and extinction thresholds of corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks are in agreement with analytical results without any assumptions, reinforcing that the relationships may always exist and proposing a mathematical problem for proving existence of the relationships in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. Research findings may improve understanding of thresholds for disease persistence in order to control vector-borne diseases.
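    The link between a reproduction number and an extinction probability can be illustrated with a toy stand-in: in the early phase of an outbreak, case counts behave roughly like a linear birth-death chain, for which the extinction probability starting from one case is min(1, death rate / birth rate). A hedged Monte Carlo sketch (the rates, the takeoff cap, and the single-type chain are assumptions; the paper's vector-host network structure is not modeled):

```python
import random

def extinction_prob_mc(birth, death, n0=1, cap=500, trials=2000, seed=1):
    """Monte Carlo estimate of extinction probability for a linear
    birth-death chain, simulated via its embedded jump chain.
    Branching-process theory: from one case, P(extinction) = min(1, death/birth)."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = n0
        while 0 < n < cap:                    # cap: large outbreaks count as takeoff
            p_death = death / (birth + death)  # next event is a removal...
            n += -1 if rng.random() < p_death else 1   # ...or a new case
        extinct += (n == 0)
    return extinct / trials

est = extinction_prob_mc(birth=2.0, death=1.0)   # theory predicts 0.5
```

    This is the kind of stochastic-threshold quantity the abstract contrasts with the deterministic basic reproduction number.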

  2. Summarising and validating test accuracy results across multiple studies for use in clinical practice.

    PubMed

    Riley, Richard D; Ahmed, Ikhlaaq; Debray, Thomas P A; Willis, Brian H; Noordzij, J Pieter; Higgins, Julian P T; Deeks, Jonathan J

    2015-06-15

    Following a meta-analysis of test accuracy studies, the translation of summary results into clinical practice is potentially problematic. The sensitivity, specificity and positive (PPV) and negative (NPV) predictive values of a test may differ substantially from the average meta-analysis findings, because of heterogeneity. Clinicians thus need more guidance: given the meta-analysis, is a test likely to be useful in new populations, and if so, how should test results inform the probability of existing disease (for a diagnostic test) or future adverse outcome (for a prognostic test)? We propose ways to address this. Firstly, following a meta-analysis, we suggest deriving prediction intervals and probability statements about the potential accuracy of a test in a new population. Secondly, we suggest strategies on how clinicians should derive post-test probabilities (PPV and NPV) in a new population based on existing meta-analysis results and propose a cross-validation approach for examining and comparing their calibration performance. Application is made to two clinical examples. In the first example, the joint probability that both sensitivity and specificity will be >80% in a new population is just 0.19, because of a low sensitivity. However, the summary PPV of 0.97 is high and calibrates well in new populations, with a probability of 0.78 that the true PPV will be at least 0.95. In the second example, post-test probabilities calibrate better when tailored to the prevalence in the new population, with cross-validation revealing a probability of 0.97 that the observed NPV will be within 10% of the predicted NPV. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
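    The post-test step the authors describe rests on Bayes' theorem: given summary sensitivity and specificity, PPV and NPV can be tailored to a new population's prevalence. A minimal sketch with hypothetical numbers (not the paper's clinical examples):

```python
def post_test_probabilities(sens, spec, prevalence):
    """Post-test probabilities tailored to a new population's prevalence
    via Bayes' theorem (inputs here are illustrative, not from the paper)."""
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    return ppv, npv

# Hypothetical summary accuracy applied at 10% prevalence
ppv, npv = post_test_probabilities(sens=0.90, spec=0.80, prevalence=0.10)
```

    Recomputing PPV and NPV at each new population's prevalence, rather than reusing the meta-analytic averages, is exactly the tailoring whose calibration the paper proposes to cross-validate.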

  3. How Parents Read Counting Books and Non-numerical Books to Their Preverbal Infants: An Observational Study.

    PubMed

    Goldstein, Alison; Cole, Thomas; Cordes, Sara

    2016-01-01

Studies have stressed the importance of counting with children to promote formal numeracy abilities; however, little work has investigated when parents begin to engage in this behavior with their young children. In the current study, we investigated whether parents elaborated on numerical information when reading a counting book to their preverbal infants and whether developmental differences in numerical input exist even in the 1st year of life. Parents were asked to read to their 5- to 10-month-old infants, as they would at home, two books: a counting book and another book that did not have numerical content. Parents' spontaneous statements rarely focused on number, and those that did consisted primarily of counting, with little emphasis on labeling the cardinality of the set. However, developmental differences were observed even in this age range, such that parents were more likely to make numerical utterances when reading to older infants. Together, results are the first to characterize naturalistic reading behaviors between parents and their preverbal infants in the context of counting books, suggesting that although counting books promote numerical language in parents, infants still receive very little in the way of numerical input before the end of the 1st year of life. While little is known regarding the impact of number talk on the cognitive development of young infants, the current results may guide future work in this area by providing the first assessment of the characteristics of parental numerical input to preverbal infants.

  4. Future aeromedical assessment in general aviation: a contribution to the actual discussion

    PubMed Central

    Siedenburg, J

    2008-01-01

The past years saw a transition of competencies from the Joint Aviation Authorities (JAA) to the European Aviation Safety Agency (EASA), which was founded in 2003, based on EU Regulation 1592/02. EASA started its work in the field of Airworthiness and will soon extend its competencies inter alia to Flight Operations and Flight Crew Licensing, the latter including the requirements for aeromedical assessment. The appropriate new EU Regulation will most probably be published in April. It includes the Essential Requirements for Licensing and aeromedical certification. A proposal for a new Commission Regulation promulgates the Implementing Rules for Personnel Licensing, detailing – inter alia – the Medical Requirements (Annex II to the Regulation). The specific rules and numeric standards are published as Acceptable Means of Compliance (AMC) and Guidance Material (GM). The provisions are based on JAR-FCL 3 and have been transposed to the format chosen by EASA by a small working group of aeromedical experts (FCL.001). Comments received by the European Aviation Safety Agency (EASA) prompted the agency's statement that the JAR-FCL 3 requirements for private pilots were excessive and too demanding and that a better regulation in General Aviation had to be developed. Another working group (MDM.032), including one aeromedical specialist, was tasked to draft a set of lighter requirements for non-complex aircraft used in non-commercial operations. In this context a much lighter form of aeromedical assessment – involving self-declaration by the pilot and general practitioners as assessors – has been proposed. PMID:19048096

  5. Synchronization Analysis of Master-Slave Probabilistic Boolean Networks.

    PubMed

    Lu, Jianquan; Zhong, Jie; Li, Lulu; Ho, Daniel W C; Cao, Jinde

    2015-08-28

In this paper, we analyze the synchronization problem of master-slave probabilistic Boolean networks (PBNs). The master Boolean network (BN) is a deterministic BN, while the slave BN is determined by a series of possible logical functions with certain probability at each discrete time point. In this paper, we firstly define the synchronization of master-slave PBNs with probability one, and then we investigate synchronization with probability one. By resorting to a new approach called the semi-tensor product (STP), the master-slave PBNs are expressed in equivalent algebraic forms. Based on the algebraic form, some necessary and sufficient criteria are derived to guarantee synchronization with probability one. Further, we study the synchronization of master-slave PBNs in probability. Synchronization in probability implies that for any initial states, the master BN can be synchronized by the slave BN with certain probability, while synchronization with probability one implies that the master BN can be synchronized by the slave BN with probability one. Based on the equivalent algebraic form, some efficient conditions are derived to guarantee synchronization in probability. Finally, several numerical examples are presented to show the effectiveness of the main results.
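    The notion of synchronization in probability can be made concrete with a tiny simulation: a deterministic master network drives a slave that applies the matching logical rule only with some probability. The two-node network, the rules, and the probabilities below are all invented for illustration; the paper's semi-tensor product analysis is not reproduced here.

```python
import random

def empirical_sync_probability(p=0.8, trials=2000, T=30, seed=7):
    """Toy master-slave probabilistic Boolean network: a 2-node deterministic
    master updates by f; at each step the slave applies f to the master's
    state with probability p, and a negated rule otherwise. Returns the
    fraction of runs whose states coincide at time T."""
    rng = random.Random(seed)
    f = lambda a, b: (b, a != b)                   # master's Boolean update rule
    synced = 0
    for _ in range(trials):
        x = (rng.random() < 0.5, rng.random() < 0.5)   # random master state
        y = (rng.random() < 0.5, rng.random() < 0.5)   # random slave state
        for _ in range(T):
            x_prev = x
            x = f(*x_prev)                         # deterministic master step
            if rng.random() < p:
                y = f(*x_prev)                     # slave picks the matching rule
            else:
                y = tuple(not v for v in f(*x_prev))   # or a perturbed rule
        synced += (x == y)
    return synced / trials

rate = empirical_sync_probability()                # expect roughly p = 0.8
```

    In this toy the synchronization probability is simply p, illustrating synchronization "in probability" (certain probability < 1) as opposed to synchronization with probability one.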

  6. Synchronization Analysis of Master-Slave Probabilistic Boolean Networks

    PubMed Central

    Lu, Jianquan; Zhong, Jie; Li, Lulu; Ho, Daniel W. C.; Cao, Jinde

    2015-01-01

In this paper, we analyze the synchronization problem of master-slave probabilistic Boolean networks (PBNs). The master Boolean network (BN) is a deterministic BN, while the slave BN is determined by a series of possible logical functions with certain probability at each discrete time point. In this paper, we firstly define the synchronization of master-slave PBNs with probability one, and then we investigate synchronization with probability one. By resorting to a new approach called the semi-tensor product (STP), the master-slave PBNs are expressed in equivalent algebraic forms. Based on the algebraic form, some necessary and sufficient criteria are derived to guarantee synchronization with probability one. Further, we study the synchronization of master-slave PBNs in probability. Synchronization in probability implies that for any initial states, the master BN can be synchronized by the slave BN with certain probability, while synchronization with probability one implies that the master BN can be synchronized by the slave BN with probability one. Based on the equivalent algebraic form, some efficient conditions are derived to guarantee synchronization in probability. Finally, several numerical examples are presented to show the effectiveness of the main results. PMID:26315380

  7. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise

    NASA Astrophysics Data System (ADS)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ -stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α . We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ -stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
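    Symmetric α-stable (Lévy) noise of the kind studied here can be generated without special libraries via the Chambers-Mallows-Stuck transform. A hedged sketch (sample sizes and stability indices are arbitrary; this generates the driving noise only, not the fractional diffusion itself):

```python
import numpy as np

def symmetric_stable(alpha, size, rng):
    """Draw symmetric alpha-stable variates via the Chambers-Mallows-Stuck
    transform: alpha = 2 recovers a Gaussian (variance 2), alpha = 1 a Cauchy,
    and 0 < alpha < 2 gives power-law tails P(|X| > x) ~ x^(-alpha)."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    W = rng.exponential(1.0, size)                 # unit exponential
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos((1 - alpha) * V) / W) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
gauss = symmetric_stable(2.0, 100_000, rng)   # Gaussian limit, variance ~ 2
heavy = symmetric_stable(1.5, 100_000, rng)   # heavy-tailed Levy noise
```

    The heavy-tailed sample produces far more large excursions than the Gaussian one, which is the feature behind the anomalous scaling of the PDF discussed in the abstract.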

  8. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise.

    PubMed

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

    Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.

  9. Comments on the AAMC policy statement recommending strategies for increasing the production of generalist physicians.

    PubMed

    Greer, D S; Bhak, K N; Zenker, B M

    1994-04-01

    The United States has a physician specialty imbalance, primarily a shortage of generalists (defined as family physicians, general internists, and general pediatricians) relative to other specialists. In recent years, the rising costs of health care, the expansion of managed care, and problems of access to care have accentuated the critical role that generalists must play in a cost-effective, accessible health care system. Despite numerous public and private initiatives designed to address the supply of generalist physicians, the ratio of generalists to specialists has been decreasing. Although the factors contributing to the shrinking proportion of generalists are many and are often outside the control of educators, there is evidence that medical schools can play a major role in influencing specialty choice. Recognizing the need to address the specialty imbalance in this country, the Association of American Medical Colleges (AAMC) appointed the Generalist Physician Task Force to develop a statement suggesting actions that the AAMC and its constituents could take to foster a greater representation of generalist physicians in the United States. The task force produced an Executive Summary, published as an AAMC policy statement in early 1993, that contained recommended strategies for medical schools, graduate medical education, and the practice environment. The authors of the present article critique these recommendations, provide a background and rationale for each of them, and give suggestions about how some of the recommendations might be implemented. While they are in general agreement with the AAMC policy statement, they feel the recommended strategies fall short of the need. They maintain that the AAMC statement represents an admirable but cautious approach to a daunting problem, and that the time is past when cautious approaches will suffice. 
The authors conclude with the hope that bolder initiatives will emerge from the new AAMC Office of Generalist Physician Programs.

  10. Uncertainty Analysis and Parameter Estimation For Nearshore Hydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Ardani, S.; Kaihatu, J. M.

    2012-12-01

Numerical models represent deterministic approaches used for the relevant physical processes in the nearshore. Complexity of the physics of the model and uncertainty involved in the model inputs compel us to apply a stochastic approach to analyze the robustness of the model. The Bayesian inverse problem is one powerful way to estimate the important input model parameters (determined by a priori sensitivity analysis) and can be used for uncertainty analysis of the outputs. Bayesian techniques can be used to find the range of most probable parameters based on the probability of the observed data and the residual errors. In this study, the effect of input data involving lateral (Neumann) boundary conditions, bathymetry and off-shore wave conditions on nearshore numerical models is considered. Monte Carlo simulation is applied to a deterministic numerical model (the Delft3D modeling suite for coupled waves and flow) for the resulting uncertainty analysis of the outputs (wave height, flow velocity, mean sea level, etc.). Uncertainty analysis of outputs is performed by random sampling from the input probability distribution functions and running the model as required until convergence to consistent results is achieved. The case study used in this analysis is the Duck94 experiment, which was conducted at the U.S. Army Field Research Facility at Duck, North Carolina, USA in the fall of 1994. The joint probability of model parameters relevant for the Duck94 experiments will be found using the Bayesian approach. We will further show that, by using Bayesian techniques to estimate the optimized model parameters as inputs and applying them for uncertainty analysis, we can obtain more consistent results than using the prior information for input data, meaning that the variation of the uncertain parameters will decrease and the probability of the observed data will improve as well.
Keywords: Monte Carlo Simulation, Delft3D, uncertainty analysis, Bayesian techniques, MCMC
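    The Monte Carlo loop the abstract describes, sample the uncertain inputs from their distributions, run the model, summarize the output spread, can be sketched generically. The surrogate "model" and the priors below are invented; in the study each evaluation would be a full Delft3D run rather than a one-line function.

```python
import numpy as np

def propagate(model, priors, n=10_000, seed=42):
    """Monte Carlo uncertainty propagation: draw each uncertain input from
    its prior, evaluate the model on every sample, and report the 5th, 50th
    and 95th percentiles of the output."""
    rng = np.random.default_rng(seed)
    samples = {name: draw(rng, n) for name, draw in priors.items()}
    out = model(**samples)
    return np.percentile(out, [5, 50, 95])

# Hypothetical priors for two inputs of a stand-in nearshore model
priors = {
    "h_off": lambda rng, n: rng.normal(2.0, 0.2, n),     # offshore wave height (m)
    "slope": lambda rng, n: rng.uniform(0.01, 0.03, n),  # beach slope
}
# Stand-in response surface: nearshore height grows mildly with slope
p5, p50, p95 = propagate(lambda h_off, slope: h_off * (1 + 5 * slope), priors)
```

    In the Bayesian variant described in the abstract, the priors above would be replaced by posterior samples (e.g. from MCMC), typically narrowing the resulting output interval.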

  11. Principle of maximum entanglement entropy and local physics of strongly correlated materials.

    PubMed

    Lanatà, Nicola; Strand, Hugo U R; Yao, Yongxin; Kotliar, Gabriel

    2014-07-18

We argue that, because of quantum entanglement, the local physics of strongly correlated materials at zero temperature is described in a very good approximation by a simple generalized Gibbs distribution, which depends on a relatively small number of local quantum thermodynamical potentials. We demonstrate that our statement is exact in certain limits and present numerical calculations of the iron compounds FeSe and FeTe and of elemental cerium by employing the Gutzwiller approximation that strongly support our theory in general.

  12. LETTER TO THE EDITOR: Thermally activated processes in magnetic systems consisting of rigid dipoles: equivalence of the Ito and Stratonovich stochastic calculus

    NASA Astrophysics Data System (ADS)

    Berkov, D. V.; Gorn, N. L.

    2002-04-01

We demonstrate that the Ito and the Stratonovich stochastic calculus lead to identical results when applied to the stochastic dynamics study of magnetic systems consisting of dipoles of constant magnitude, despite the multiplicative noise appearing in the corresponding Langevin equations. The immediate consequence of this statement is that any numerical method used for the solution of these equations will lead to the physically correct results.

  13. Instrumentation Analysis: An Automated Method for Producing Numeric Abstractions of Heap-Manipulating Programs

    DTIC Science & Technology

    2010-11-29

Arbib and Suad Alagic. Proof rules for gotos. Acta Informatica, pages 139–148, 1979. 6.3 T. Ball, R. Majumdar, T. Millstein, and S. Rajamani...Press, January 1999. ISBN 0262032708. 3, 3.1, 3.3 323 B Bibliography M. Clint and C.A.R. Hoare. Program proving: Jumps and functions. Acta Informatica...Goto statements: Semantics and deduction systems. Acta Informatica, pages 385–424, 1981. 6.3 324 B Bibliography Alain Deutsch. Interprocedural may

  14. Electron number probability distributions for correlated wave functions.

    PubMed

    Francisco, E; Martín Pendás, A; Blanco, M A

    2007-03-07

    Efficient formulas for computing the probability of finding exactly an integer number of electrons in an arbitrarily chosen volume are only known for single-determinant wave functions [E. Cances et al., Theor. Chem. Acc. 111, 373 (2004)]. In this article, an algebraic method is presented that extends these formulas to the case of multideterminant wave functions and any number of disjoint volumes. The derived expressions are applied to compute the probabilities within the atomic domains derived from the space partitioning based on the quantum theory of atoms in molecules. Results for a series of test molecules are presented, paying particular attention to the effects of electron correlation and of some numerical approximations on the computed probabilities.
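    For the single-determinant case the cited formulas reduce, after diagonalizing the orbital overlap matrix over the chosen volume, to a Poisson-binomial convolution of its eigenvalues: each eigenvalue acts like an independent probability that one orbital's electron lies inside the domain. A sketch of that final step (the eigenvalues below are made up for illustration, not computed from a wave function):

```python
def electron_count_distribution(lams):
    """P(exactly n electrons inside a domain) for a single-determinant wave
    function, given the eigenvalues lams of the orbital overlap matrix over
    the domain: a Poisson-binomial convolution over orbitals."""
    probs = [1.0]                        # P(0 electrons) before any orbital
    for lam in lams:
        new = [0.0] * (len(probs) + 1)
        for n, p in enumerate(probs):
            new[n] += p * (1 - lam)      # this orbital's electron lies outside
            new[n + 1] += p * lam        # this orbital's electron lies inside
        probs = new
    return probs

# Hypothetical eigenvalues for a 3-electron determinant and one domain
dist = electron_count_distribution([0.9, 0.7, 0.1])
```

    The distribution sums to one and its mean equals the sum of the eigenvalues (the average electron population of the domain); the multideterminant extension in the article generalizes precisely this construction.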

  15. Computational flow development for unsteady viscous flows: Foundation of the numerical method

    NASA Technical Reports Server (NTRS)

    Bratanow, T.; Spehert, T.

    1978-01-01

    A procedure is presented for effective consideration of viscous effects in computational development of high Reynolds number flows. The procedure is based on the interpretation of the Navier-Stokes equations as vorticity transport equations. The physics of the flow was represented in a form suitable for numerical analysis. Lighthill's concept for flow development for computational purposes was adapted. The vorticity transport equations were cast in a form convenient for computation. A statement for these equations was written using the method of weighted residuals and applying the Galerkin criterion. An integral representation of the induced velocity was applied on the basis of the Biot-Savart law. Distribution of new vorticity, produced at wing surfaces over small computational time intervals, was assumed to be confined to a thin region around the wing surfaces.

  16. Feasibility study for a numerical aerodynamic simulation facility. Volume 3: FMP language specification/user manual

    NASA Technical Reports Server (NTRS)

    Kenner, B. G.; Lincoln, N. R.

    1979-01-01

The manual is intended to show the revisions and additions to the current STAR FORTRAN. The changes are made to incorporate an FMP (Flow Model Processor) for use in the Numerical Aerodynamic Simulation Facility (NASF) for the purpose of simulating fluid flow over three-dimensional bodies in wind tunnel environments and in free space. The FORTRAN programming language for the STAR-100 computer contains both CDC and unique STAR extensions to the standard FORTRAN. Several of the STAR FORTRAN extensions to standard FORTRAN allow the FORTRAN user to exploit the vector processing capabilities of the STAR computer. In STAR FORTRAN, vectors can be expressed with an explicit notation, functions are provided that return vector results, and special call statements enable access to any machine instruction.

  17. Generation and Radiation of Acoustic Waves from a 2-D Shear Layer using the CE/SE Method

    NASA Technical Reports Server (NTRS)

    Loh, Ching Y.; Wang, Xiao Y.; Chang, Sin-Chung; Jorgenson, Philip C. E.

    2000-01-01

In the present work, the generation and radiation of acoustic waves from a 2-D shear layer problem is considered. An acoustic source inside of a 2-D jet excites an instability wave in the shear layer, resulting in Mach radiation of sound. The numerical solution is obtained by solving the Euler equations using the space time conservation element and solution element (CE/SE) method. Linearization is achieved through choosing a small acoustic source amplitude. The Euler equations are nondimensionalized as instructed in the problem statement. All other conditions are the same except that Crocco's relation has a slightly different form. In the following, after a brief sketch of the CE/SE method, the numerical results for this problem are presented.

  18. How to model a negligible probability under the WTO sanitary and phytosanitary agreement?

    PubMed

    Powell, Mark R

    2013-06-01

Since the 1997 EC--Hormones decision, World Trade Organization (WTO) Dispute Settlement Panels have wrestled with the question of what constitutes a negligible risk under the Sanitary and Phytosanitary Agreement. More recently, the 2010 WTO Australia--Apples Panel focused considerable attention on the appropriate quantitative model for a negligible probability in a risk assessment. The 2006 Australian Import Risk Analysis for Apples from New Zealand translated narrative probability statements into quantitative ranges. The uncertainty about a "negligible" probability was characterized as a uniform distribution with a minimum value of zero and a maximum value of 10⁻⁶. The Australia--Apples Panel found that the use of this distribution would tend to overestimate the likelihood of "negligible" events and indicated that a triangular distribution with a most probable value of zero and a maximum value of 10⁻⁶ would correct the bias. The Panel observed that the midpoint of the uniform distribution is 5 × 10⁻⁷ but did not consider that the triangular distribution has an expected value of 3.3 × 10⁻⁷. Therefore, if this triangular distribution is the appropriate correction, the magnitude of the bias found by the Panel appears modest. The Panel's detailed critique of the Australian risk assessment, and the conclusions of the WTO Appellate Body about the materiality of flaws found by the Panel, may have important implications for the standard of review for risk assessments under the WTO SPS Agreement. © 2012 Society for Risk Analysis.
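    The expected values at issue can be checked in a few lines. This is a numeric sanity check of the figures quoted above (a uniform versus a zero-mode triangular distribution on [0, 10⁻⁶]), not an endorsement of either model of "negligible":

```python
import numpy as np

# Analytic means of the two candidate distributions on [0, 1e-6]
uniform_mean = (0 + 1e-6) / 2            # 5.0e-7, the Panel's midpoint
triangular_mean = (0 + 0 + 1e-6) / 3     # ~3.3e-7, mode at zero

# Monte Carlo cross-check of the triangular mean
rng = np.random.default_rng(0)
mc = rng.triangular(left=0.0, mode=0.0, right=1e-6, size=200_000).mean()
```

    The gap between 5 × 10⁻⁷ and 3.3 × 10⁻⁷ is the "modest bias" the article refers to.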

  19. A methodology for the transfer of probabilities between accident severity categories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitlow, J. D.; Neuhauser, K. S.

A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters. Numerical models or experienced judgement are often needed to obtain the relationships. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn will allow the accident probability to be appropriately transferred to a different category scheme.
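    The transfer step can be sketched numerically: place an assumed probability density over the shared parameter (an exponential fall-off with impact speed here, purely illustrative), then integrate it over each scheme's category bins. The density, bin edges, and parameter are all hypothetical, not from the methodology itself.

```python
import numpy as np

def transfer_probabilities(pdf, old_bins, new_bins, grid=100_000):
    """Distribute occurrence probability over two severity-category schemes
    that share a common parameter, using an assumed (unnormalized) density
    over that parameter; returns per-bin probabilities for both schemes."""
    lo = min(old_bins[0], new_bins[0])
    hi = max(old_bins[-1], new_bins[-1])
    s = np.linspace(lo, hi, grid)            # grid over the shared parameter
    mass = pdf(s)
    mass /= mass.sum()                       # discrete normalization
    def bin_probs(edges):
        return [float(mass[(s >= a) & (s < b)].sum())
                for a, b in zip(edges[:-1], edges[1:])]
    return bin_probs(old_bins), bin_probs(new_bins)

# Hypothetical: occurrence probability decays exponentially with impact speed
old_p, new_p = transfer_probabilities(lambda s: np.exp(-s / 30.0),
                                      old_bins=[0, 30, 60, 120],
                                      new_bins=[0, 45, 120])
```

    The same density thus "explains" both category schemes, which is what makes the transferred probabilities mutually consistent with the underlying accident experience.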

  20. Modeling pore corrosion in normally open gold- plated copper connectors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battaile, Corbett Chandler; Moffat, Harry K.; Sun, Amy Cha-Tien

    2008-09-01

The goal of this study is to model the electrical response of gold plated copper electrical contacts exposed to a mixed flowing gas stream consisting of air containing 10 ppb H₂S at 30 °C and a relative humidity of 70%. This environment accelerates the attack normally observed in a light industrial environment (essentially a simplified version of the Battelle Class 2 environment). Corrosion rates were quantified by measuring the corrosion site density, size distribution, and the macroscopic electrical resistance of the aged surface as a function of exposure time. A pore corrosion numerical model was used to predict both the growth of copper sulfide corrosion product which blooms through defects in the gold layer and the resulting electrical contact resistance of the aged surface. Assumptions about the distribution of defects in the noble metal plating and the mechanism for how corrosion blooms affect electrical contact resistance were needed to complete the numerical model. Comparisons are made to the experimentally observed number density of corrosion sites, the size distribution of corrosion product blooms, and the cumulative probability distribution of the electrical contact resistance. Experimentally, the bloom site density increases as a function of time, whereas the bloom size distribution remains relatively independent of time. These two effects are included in the numerical model by adding a corrosion initiation probability proportional to the surface area along with a probability for bloom-growth extinction proportional to the corrosion product bloom volume. The cumulative probability distribution of electrical resistance becomes skewed as exposure time increases. While the electrical contact resistance increases as a function of time for a fraction of the bloom population, the median value remains relatively unchanged. In order to model this behavior, the resistance calculated for large blooms has been weighted more heavily.

  1. A PLUG-AND-PLAY ARCHITECTURE FOR PROBABILISTIC PROGRAMMING

    DTIC Science & Technology

    2017-04-01

programs that use discrete numerical distributions, but even then, the space of possible outcomes may be uncountable (as a solution can be infinite...also identify conditions guaranteeing that all possible outcomes are finite (and then the probability space is discrete). 2.2.2 The PlogiQL...and not determined at runtime. Nevertheless, the PRAiSE team plans to extend their solution to support numerical (continuous or discrete

  2. Complete synchronization of the global coupled dynamical network induced by Poisson noises.

    PubMed

    Guo, Qing; Wan, Fangyi

    2017-01-01

Complete synchronization of the globally coupled dynamical network induced by different Poisson noises is investigated. Based on the stability theory of stochastic differential equations driven by a Poisson process, we can prove that Poisson noises can induce synchronization, and sufficient conditions are established to achieve complete synchronization with probability 1. Furthermore, numerical examples are provided to show the agreement between theoretical and numerical analysis.

  3. Detection of cat-eye effect echo based on unit APD

    NASA Astrophysics Data System (ADS)

    Wu, Dong-Sheng; Zhang, Peng; Hu, Wen-Gang; Ying, Jia-Ju; Liu, Jie

    2016-10-01

    The cat-eye effect echo of an optical system can be detected with a CCD, but the detection range is limited to several kilometers. To achieve long-range or even ultra-long-range detection, an APD should be selected as the detector because of its high sensitivity. A detection system for the cat-eye effect echo based on a unit APD is designed in this paper. The implementation scheme and key technologies of the detection system are presented. The detection performance of the system, including detection range, detection probability, and false alarm probability, is modeled. Based on the model, the performance of the detection system is analyzed using typical parameters. The results of numerical calculation show that, within a 20 km detection range, the echo signal-to-noise ratio is greater than six, the detection probability is greater than 99.9%, and the false alarm probability is less than 0.1%. To verify the detection effect, we built an experimental platform according to the design scheme and carried out field experiments. The experimental results agree well with the numerical calculations, which proves that the detection system based on the unit APD is feasible for remote detection of the cat-eye effect echo.

  4. Evaluation of blade-strike models for estimating the biological performance of large Kaplan hydro turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Z.; Carlson, T. J.; Ploskey, G. R.

    2005-11-01

    Bio-indexing of hydro turbines has been identified as an important means to optimize passage conditions for fish by identifying operations for existing and new design turbines that minimize the probability of injury. Cost-effective implementation of bio-indexing requires the use of tools such as numerical and physical turbine models to generate hypotheses for turbine operations that can be tested at prototype scales using live fish. Blade strike has been proposed as an index variable for the biological performance of turbines. This report reviews an evaluation of numerical blade-strike models as a means to predict the probability of blade strike and injury of juvenile salmon smolts passing through large Kaplan turbines on the mainstem Columbia River.

  5. Calculation of transmission probability by solving an eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Bubin, Sergiy; Varga, Kálmán

    2010-11-01

    The electron transmission probability in nanodevices is calculated by solving an eigenvalue problem. The eigenvalues are the transmission probabilities and the number of nonzero eigenvalues is equal to the number of open quantum transmission eigenchannels. The number of open eigenchannels is typically a few dozen at most, thus the computational cost amounts to the calculation of a few outer eigenvalues of a complex Hermitian matrix (the transmission matrix). The method is implemented on a real space grid basis providing an alternative to localized atomic orbital based quantum transport calculations. Numerical examples are presented to illustrate the efficiency of the method.
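
    A small numerical sketch of the idea, assuming a Hermitian transmission matrix whose eigenvalues are the channel transmission probabilities. The matrix below is synthetic, built to have a known number of open channels, and is a stand-in for the output of a real transport calculation.

```python
import numpy as np

# Build a synthetic Hermitian "transmission matrix": eigenvalues in
# [0, 1] play the role of channel transmission probabilities, with only
# a handful of nonzero (open) channels, as described in the abstract.
rng = np.random.default_rng(0)
n, n_open = 50, 6                                # matrix size, open channels
probs = np.zeros(n)
probs[:n_open] = rng.uniform(0.1, 1.0, n_open)   # nonzero transmissions
q, _ = np.linalg.qr(rng.normal(size=(n, n)))     # random orthogonal basis
T = q @ np.diag(probs) @ q.T                     # Hermitian matrix

# Recover the transmission probabilities as eigenvalues; the number of
# open eigenchannels is the count of nonzero eigenvalues.
eigvals = np.linalg.eigvalsh(T)
open_channels = int(np.sum(eigvals > 1e-10))
total_transmission = eigvals[eigvals > 1e-10].sum()
print(open_channels, round(total_transmission, 6))
```

    Because only a few eigenvalues are nonzero, an iterative solver targeting the extreme eigenvalues would suffice in practice, which is the computational advantage the abstract points to.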

  6. Analysis of SET pulses propagation probabilities in sequential circuits

    NASA Astrophysics Data System (ADS)

    Cai, Shuo; Yu, Fei; Yang, Yiqun

    2018-05-01

    As the feature size of CMOS transistors scales down, single event transients (SETs) have become an important consideration in designing logic circuits. Much research has been done on analyzing the impact of SETs, but it is difficult to account for the numerous contributing factors. We present a new approach for analyzing SET pulse propagation probabilities (SPPs). It considers all masking effects and uses SET pulse propagation probability matrices (SPPMs) to represent the SPPs in the current cycle. Based on matrix union operations, the SPPs in consecutive cycles can be calculated. Experimental results show that our approach is practicable and efficient.

  7. Influence of distributed delays on the dynamics of a generalized immune system cancerous cells interactions model

    NASA Astrophysics Data System (ADS)

    Piotrowska, M. J.; Bodnar, M.

    2018-01-01

    We present a generalisation of mathematical models describing the interactions between the immune system and tumour cells which takes into account distributed time delays. For the analytical study we do not assume any particular form of the stimulus function describing the immune system's reaction to the presence of tumour cells; we only postulate its general properties. We analyse basic mathematical properties of the considered model, such as existence and uniqueness of solutions. Next, we discuss the existence of stationary solutions and analytically investigate their stability depending on the form of the considered probability densities, namely Erlang, triangular, and uniform densities, separated or not from zero. Particular instability results are obtained for a general class of probability densities. Our results are compared with those for the model with discrete delays known from the literature. In addition, for each considered type of probability density, the model is fitted to experimental data for mouse B-cell lymphoma, showing mean square errors at comparable levels. For the estimated parameter sets we discuss the possibility of stabilising the tumour dormant steady state; instability of this steady state results in uncontrolled tumour growth. In order to perform numerical simulations, following the idea of the linear chain trick, we derive numerical procedures that allow us to solve systems with the considered probability densities using standard algorithms for ordinary differential equations or for differential equations with discrete delays.
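
    The linear chain trick mentioned at the end can be sketched for an Erlang kernel: an n-stage chain of ODEs whose last component reproduces the distributed delay of the input, so a standard ODE integrator applies. The driving signal and parameters below are illustrative, not taken from the paper.

```python
import cmath
import math

def chain_step(y, x, a, dt):
    """One explicit-Euler step of the linear chain
    y1' = a*(x - y1),  yj' = a*(y[j-1] - yj)  for j = 2..n."""
    out = y[:]
    out[0] += dt * a * (x - y[0])
    for j in range(1, len(y)):
        out[j] += dt * a * (y[j - 1] - y[j])
    return out

# Feed x(t) = sin(t) through an n-stage chain: y_n approximates the
# distributed delay of x with an Erlang(n, a) kernel.  For this linear
# kernel the exact long-time response is Im(H * e^{it}) with
# H = (a / (a + i))**n, which serves as a check.
n, a, dt = 4, 2.0, 1e-3
y, t = [0.0] * n, 0.0
for _ in range(40_000):            # run long enough to damp transients
    y = chain_step(y, math.sin(t), a, dt)
    t += dt
H = (a / complex(a, 1.0)) ** n
exact = abs(H) * math.sin(t + cmath.phase(H))
print(round(y[-1], 3), round(exact, 3))
```

    The payoff is exactly the one noted in the abstract: a distributed-delay system becomes a finite ODE system that standard solvers handle.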

  8. Nicotinamide and skin cancer chemoprevention: The jury is still out.

    PubMed

    Gilmore, Stephen J

    2018-02-01

    Following the publication of the results of a Phase III trial, the administration of oral nicotinamide has been widely advocated as effective in non-melanoma skin cancer chemoprevention in high-risk individuals. However, I performed a Bayesian analysis of the reported findings and show there is insufficient evidence to demonstrate its efficacy, highlighting the significant probability that the positive conclusions drawn will not be reproducible. Given the potential widespread use of oral nicotinamide, future position statements regarding its efficacy are likely to require higher standards of evidence. © 2017 The Australasian College of Dermatologists.

  9. In-service documentation tools and statements on palliative sedation in Germany--do they meet the EAPC framework recommendations? A qualitative document analysis.

    PubMed

    Stiel, Stephanie; Heckel, Maria; Christensen, Britta; Ostgathe, Christoph; Klein, Carsten

    2016-01-01

    Numerous (inter-)national guidelines and frameworks have been developed to provide recommendations for the application of palliative sedation (PS). However, they are still not widely known, and large variations in PS clinical practice can be found. This study aims to collect and describe the contents of documents used in clinical practice and to compare the extent to which they match the European Association for Palliative Care (EAPC) framework recommendations. In a national survey on PS in Germany in 2012, participants were asked to upload their in-service templates, assessment tools, specific protocols, and in-service statements for the application and documentation of PS. These documents were analyzed using systematic structured content analysis. Three hundred seven content units from 52 provided documents were coded. The analyzed templates are very heterogeneous and also contain items not mentioned in the EAPC framework. Among 11 scales for the evaluation of sedation level, the Ramsay Sedation Scale (n = 5) and the Richmond Agitation-Sedation Scale (n = 2) were found most often. For symptom assessment, three different scales were each provided once. In all six PS statements, the common core elements were possible indications for PS, instructions on dose titration, patient monitoring, and care. There was broad agreement on physical and psychological indications. Most documents coincide on midazolam as the preferred drug and basic monitoring at regular intervals. Aspects such as pre-emptive discussion of the potential role of sedation, the informational needs of relatives, and care for the medical professionals are mentioned rarely. The analyzed templates neglect some points of the EAPC recommendations; however, they expand the ten-point scheme of the framework in some details. The findings may facilitate the development of a standardized consensus draft for documentation and monitoring as an operational statement.

  10. Exact numerical calculation of fixation probability and time on graphs.

    PubMed

    Hindersin, Laura; Möller, Marius; Traulsen, Arne; Bauer, Benedikt

    2016-12-01

    The Moran process on graphs is a popular model to study the dynamics of evolution in a spatially structured population. Exact analytical solutions for the fixation probability and time of a new mutant have been found for only a few classes of graphs so far. Simulations are computationally expensive and many realizations are necessary, as the variance of the fixation times is high. We present an algorithm that numerically computes these quantities for arbitrary small graphs by an approach based on the transition matrix. The advantage over simulations is that the calculation has to be executed only once. Building the transition matrix is automated by our algorithm. This enables a fast and interactive study of different graph structures and their effect on fixation probability and time. We provide a fast implementation in C with this note (Hindersin et al., 2016). Our code is very flexible, as it can handle two different update mechanisms (Birth-death or death-Birth), as well as arbitrary directed or undirected graphs. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
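
    A minimal sketch of the transition-matrix idea, restricted to the Birth-death Moran process on the complete graph, where the closed-form fixation probability is known and can serve as a check. This is illustrative only, not the published C implementation.

```python
import numpy as np

def fixation_probability(N, r):
    """Exact fixation probability of a single mutant with fitness r in
    a Birth-death Moran process on the complete graph K_N.

    The state is the mutant count i; we solve the linear system
    phi_i = T+_i * phi_{i+1} + T-_i * phi_{i-1} + (1 - T+_i - T-_i) * phi_i
    for the transient states i = 1..N-1, with phi_0 = 0, phi_N = 1."""
    A = np.zeros((N - 1, N - 1))
    b = np.zeros(N - 1)
    for i in range(1, N):
        f = r * i + (N - i)                   # total population fitness
        Tp = (r * i / f) * (N - i) / (N - 1)  # i -> i+1
        Tm = ((N - i) / f) * i / (N - 1)      # i -> i-1
        k = i - 1
        A[k, k] = Tp + Tm
        if k > 0:
            A[k, k - 1] = -Tm
        if k < N - 2:
            A[k, k + 1] = -Tp
        else:
            b[k] = Tp                         # absorption at i = N
    phi = np.linalg.solve(A, b)
    return phi[0]                             # start from one mutant

N, r = 10, 1.5
exact = (1 - 1 / r) / (1 - r ** -N)           # known closed form for K_N
print(round(fixation_probability(N, r), 6), round(exact, 6))
```

    For arbitrary graphs the state space is the set of mutant configurations rather than a simple count, which is what makes building the transition matrix worth automating.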

  11. Critical spreading dynamics of parity conserving annihilating random walks with power-law branching

    NASA Astrophysics Data System (ADS)

    Laise, T.; dos Anjos, F. C.; Argolo, C.; Lyra, M. L.

    2018-09-01

    We investigate the critical spreading of the parity conserving annihilating random walks model with Lévy-like branching. The random walks are considered to perform normal diffusion with probability p on the sites of a one-dimensional lattice, annihilating in pairs by contact. With probability 1 - p, each particle can also produce two offspring which are placed at a distance r from the original site following a power-law Lévy-like distribution P(r) ∝ 1/r^α. We perform numerical simulations starting from a single particle. A finite-time scaling analysis is employed to locate the critical diffusion probability pc below which a finite density of particles develops in the long-time limit. Further, we estimate the spreading dynamical exponents related to the increase of the average number of particles at the critical point and its respective fluctuations. The critical exponents deviate from those of the counterpart model with short-range branching for small values of α. The numerical data suggest that continuously varying spreading exponents set in while the branching process still results in diffusive-like spreading.
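
    The Lévy-like branching step can be sketched with inverse-transform sampling of a continuous power-law displacement. This is a simplification of the lattice distribution used in the paper, with illustrative parameters.

```python
import random

def levy_offset(alpha, rng):
    """Sample a branching displacement r >= 1 with density
    P(r) ∝ r**(-alpha) (continuous Pareto simplification, alpha > 1),
    via inverse-transform sampling of the CDF F(r) = 1 - r**(1 - alpha)."""
    u = rng.random()
    return (1.0 - u) ** (-1.0 / (alpha - 1.0))

rng = random.Random(42)
samples = [levy_offset(3.5, rng) for _ in range(10_000)]
# For alpha = 3.5 the mean is (alpha-1)/(alpha-2) = 5/3.
print(min(samples), sum(samples) / len(samples))
```

    For α ≤ 3 the variance of the displacement diverges, which is the regime where the spreading exponents start to deviate from the short-range-branching values.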

  12. An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations

    PubMed Central

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-01-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360

  13. Using effort information with change-in-ratio data for population estimation

    USGS Publications Warehouse

    Udevitz, Mark S.; Pollock, Kenneth H.

    1995-01-01

    Most change-in-ratio (CIR) methods for estimating fish and wildlife population sizes have been based only on assumptions about how encounter probabilities vary among population subclasses. When information on sampling effort is available, it is also possible to derive CIR estimators based on assumptions about how encounter probabilities vary over time. This paper presents a generalization of previous CIR models that allows explicit consideration of a range of assumptions about the variation of encounter probabilities among subclasses and over time. Explicit estimators are derived under this model for specific sets of assumptions about the encounter probabilities. Numerical methods are presented for obtaining estimators under the full range of possible assumptions. Likelihood ratio tests for these assumptions are described. Emphasis is on obtaining estimators based on assumptions about variation of encounter probabilities over time.
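
    For context, the simplest member of the family this paper generalizes is the classic two-class, two-survey CIR (Paulik-Robson) estimator under equal encounter probabilities among subclasses. A sketch with synthetic numbers (this is textbook material, not the paper's generalized estimator):

```python
def cir_estimate(p1, p2, Rx, R):
    """Classic two-class change-in-ratio estimator of total population
    size before removal, assuming equal encounter probabilities among
    subclasses and p1 != p2.

    p1, p2 : observed proportion of the x-subclass before / after removal
    Rx     : number of x-class animals removed
    R      : total removals
    """
    return (Rx - R * p2) / (p1 - p2)

# Synthetic check: 2500 animals, 1000 of class x (p1 = 0.4); removing
# 300 x and 100 y leaves 700 of 2100 (p2 = 1/3); the estimator should
# recover the pre-removal total of 2500.
print(cir_estimate(0.4, 700 / 2100, 300, 400))
```

    The estimator breaks down as p1 approaches p2, which is why the removals must change the subclass ratio appreciably for CIR methods to work well.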

  14. WSES Jerusalem guidelines for diagnosis and treatment of acute appendicitis.

    PubMed

    Di Saverio, Salomone; Birindelli, Arianna; Kelly, Micheal D; Catena, Fausto; Weber, Dieter G; Sartelli, Massimo; Sugrue, Michael; De Moya, Mark; Gomes, Carlos Augusto; Bhangu, Aneel; Agresta, Ferdinando; Moore, Ernest E; Soreide, Kjetil; Griffiths, Ewen; De Castro, Steve; Kashuk, Jeffry; Kluger, Yoram; Leppaniemi, Ari; Ansaloni, Luca; Andersson, Manne; Coccolini, Federico; Coimbra, Raul; Gurusamy, Kurinchi S; Campanile, Fabio Cesare; Biffl, Walter; Chiara, Osvaldo; Moore, Fred; Peitzman, Andrew B; Fraga, Gustavo P; Costa, David; Maier, Ronald V; Rizoli, Sandro; Balogh, Zsolt J; Bendinelli, Cino; Cirocchi, Roberto; Tonini, Valeria; Piccinini, Alice; Tugnoli, Gregorio; Jovine, Elio; Persiani, Roberto; Biondi, Antonio; Scalea, Thomas; Stahel, Philip; Ivatury, Rao; Velmahos, George; Andersson, Roland

    2016-01-01

    Acute appendicitis (AA) is among the most common causes of acute abdominal pain. Diagnosis of AA is challenging; a variable combination of clinical signs and symptoms has been used together with laboratory findings in several scoring systems proposed for suggesting the probability of AA and the possible subsequent management pathway. The role of imaging in the diagnosis of AA is still debated, with variable use of US, CT and MRI in different settings worldwide. To date, comprehensive clinical guidelines for the diagnosis and management of AA have never been issued. In July 2015, during the 3rd World Congress of the WSES, held in Jerusalem (Israel), a panel of experts including an Organizational Committee, Scientific Committee and Scientific Secretariat participated in a Consensus Conference where eight panelists presented a number of statements developed for each of the eight main questions about the diagnosis and management of AA. The statements were then voted on, eventually modified, and finally approved by the participants in the Consensus Conference and later by the board of co-authors. The current paper reports the definitive Guidelines Statements on each of the following topics: 1) Diagnostic efficiency of clinical scoring systems, 2) Role of imaging, 3) Non-operative treatment for uncomplicated appendicitis, 4) Timing of appendectomy and in-hospital delay, 5) Surgical treatment, 6) Scoring systems for intra-operative grading of appendicitis and their clinical usefulness, 7) Non-surgical treatment for complicated appendicitis: abscess or phlegmon, 8) Pre-operative and post-operative antibiotics.

  15. Current recommendations on the estimation of transition probabilities in Markov cohort models for use in health care decision-making: a targeted literature review.

    PubMed

    Olariu, Elena; Cadwell, Kevin K; Hancock, Elizabeth; Trueman, David; Chevrou-Severac, Helene

    2017-01-01

    Although Markov cohort models represent one of the most common forms of decision-analytic models used in health care decision-making, correct implementation of such models requires reliable estimation of transition probabilities. This study sought to identify consensus statements or guidelines that detail how such transition probability matrices should be estimated. A literature review was performed to identify relevant publications in the following databases: Medline, Embase, the Cochrane Library, and PubMed. Electronic searches were supplemented by manual searches of health technology assessment (HTA) websites in Australia, Belgium, Canada, France, Germany, Ireland, Norway, Portugal, Sweden, and the UK. One reviewer assessed studies for eligibility. Of the 1,931 citations identified in the electronic searches, no studies met the inclusion criteria for full-text review, and no guidelines on transition probabilities in Markov models were identified. Manual searching of the websites of HTA agencies identified ten guidelines on economic evaluations (Australia, Belgium, Canada, France, Germany, Ireland, Norway, Portugal, Sweden, and UK). All identified guidelines provided general guidance on how to develop economic models, but none provided guidance on the calculation of transition probabilities. One relevant publication was identified following review of the reference lists of HTA agency guidelines: the International Society for Pharmacoeconomics and Outcomes Research taskforce guidance. This provided limited guidance on the use of rates and probabilities. There is limited formal guidance available on the estimation of transition probabilities for use in decision-analytic models. Given the increasing importance of cost-effectiveness analysis in the decision-making processes of HTA bodies and other medical decision-makers, there is a need for additional guidance to inform a more consistent approach to decision-analytic modeling. Further research should be done to develop more detailed guidelines on the estimation of transition probabilities.
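
    The rates-versus-probabilities distinction that the ISPOR-style guidance touches on can be illustrated with the standard constant-rate conversion; a common pitfall it guards against is rescaling a probability by simple division:

```python
import math

def rate_to_prob(rate, t=1.0):
    """Probability of at least one event over a cycle of length t,
    assuming a constant event rate:  p = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate * t)

def prob_to_rate(p, t=1.0):
    """Inverse transformation: rate = -ln(1 - p) / t."""
    return -math.log(1.0 - p) / t

# Example: convert an annual probability to a monthly one.  Dividing
# the probability by 12 is only approximately right when p is small.
p_year = 0.30
r = prob_to_rate(p_year)              # implied constant annual rate
p_month = rate_to_prob(r, t=1 / 12)
print(round(p_month, 6), round(p_year / 12, 6))
```

    Note the constant-rate assumption: when rates vary within a cycle, or when multiple competing transitions share a cycle, this simple conversion no longer applies directly.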

  16. An energy-dependent numerical model for the condensation probability, γ j

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerby, Leslie Marie

    The “condensation” probability, γ j, is an important variable in the preequilibrium stage of nuclear spallation reactions. It represents the probability that p j excited nucleons (excitons) will “condense” to form complex particle type j in the excited residual nucleus. In addition, it has a significant impact on the emission width, or probability of emitting fragment type j from the residual nucleus. Previous formulations for γ j were energy-independent and valid for fragments up to 4He only. This paper explores the formulation of a new model for γ j, one which is energy-dependent and valid for up to 28Mg, andmore » which provides improved fits compared to experimental fragment spectra.« less

  17. An energy-dependent numerical model for the condensation probability, γ j

    DOE PAGES

    Kerby, Leslie Marie

    2016-12-09

    The “condensation” probability, γ j, is an important variable in the preequilibrium stage of nuclear spallation reactions. It represents the probability that p j excited nucleons (excitons) will “condense” to form complex particle type j in the excited residual nucleus. In addition, it has a significant impact on the emission width, or probability of emitting fragment type j from the residual nucleus. Previous formulations for γ j were energy-independent and valid for fragments up to 4He only. This paper explores the formulation of a new model for γ j, one which is energy-dependent and valid for up to 28Mg, andmore » which provides improved fits compared to experimental fragment spectra.« less

  18. A comprehensive subaxial cervical spine injury severity assessment model using numeric scores and its predictive value for surgical intervention.

    PubMed

    Tsou, Paul M; Daffner, Scott D; Holly, Langston T; Shamie, A Nick; Wang, Jeffrey C

    2012-02-10

    Multiple factors contribute to the determination for surgical intervention in the setting of cervical spinal injury, yet to date no unified classification system exists that predicts this need. The goals of this study were twofold: to create a comprehensive subaxial cervical spine injury severity numeric scoring model, and to determine the predictive value of this model for the probability of surgical intervention. In a retrospective cohort study of 333 patients, neural impairment, patho-morphology, and available spinal canal sagittal diameter post-injury were selected as injury severity determinants. A common numeric scoring trend was created; smaller values indicated less favorable clinical conditions. Neural impairment was graded from 2 to 10, patho-morphology scores ranged from 2 to 15, and post-injury available canal sagittal diameter (SD) was measured in millimeters at the narrowest point of injury. Logistic regression analysis was performed using the numeric scores to predict the probability of surgical intervention. Complete neurologic deficit was found in 39 patients, partial deficits in 108, root injuries in 19, and 167 were neurologically intact. The pre-injury mean canal SD was 14.6 mm; the post-injury measurement mean was 12.3 mm. The mean patho-morphology score for all patients was 10.9 and the mean neurologic function score was 7.6. There was a statistically significant difference in mean scores for neural impairment, canal SD, and patho-morphology for surgical compared to nonsurgical patients. At the lowest clinical score for each determinant, the probability for surgery was 0.949 for neural impairment, 0.989 for post-injury available canal SD, and 0.971 for patho-morphology. The unit odds ratio for each determinant was 1.73, 1.61, and 1.45, for neural impairment, patho-morphology, and canal SD scores, respectively.
The subaxial cervical spine injury severity determinants of neural impairment, patho-morphology, and post-injury available canal SD have well defined probability for surgical intervention when scored separately. Our data showed that each determinant alone could act as a primary predictor for surgical intervention.
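
    Purely as an illustration of how a unit odds ratio translates into predicted probabilities across a score range, a logistic sketch follows. Only the 1.73 odds ratio and the 0.949 probability at the lowest score come from the abstract; the intercept is hypothetical, chosen to reproduce that anchor point, and is not the authors' fitted model.

```python
import math

def surgery_probability(score, beta0, odds_ratio):
    """Logistic model in which each one-point *decrease* in the severity
    score multiplies the odds of surgery by `odds_ratio` (lower scores
    are less favorable), so the log-odds slope per unit increase is
    -ln(odds_ratio)."""
    logit = beta0 - math.log(odds_ratio) * score
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical intercept anchored so that the lowest neural-impairment
# score (2) reproduces the reported probability of 0.949.
beta0 = math.log(0.949 / (1 - 0.949)) + math.log(1.73) * 2
for s in (2, 6, 10):
    print(s, round(surgery_probability(s, beta0, 1.73), 3))
```

    The sketch simply makes concrete how a single determinant with a fixed unit odds ratio yields a monotone probability curve over its score range.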

  19. How Parents Read Counting Books and Non-numerical Books to Their Preverbal Infants: An Observational Study

    PubMed Central

    Goldstein, Alison; Cole, Thomas; Cordes, Sara

    2016-01-01

    Studies have stressed the importance of counting with children to promote formal numeracy abilities; however, little work has investigated when parents begin to engage in this behavior with their young children. In the current study, we investigated whether parents elaborated on numerical information when reading a counting book to their preverbal infants and whether developmental differences in numerical input exist even in the 1st year of life. Parents of 5- to 10-month-old infants were asked to read, as they would at home, two books to their infants: a counting book and another book that did not have numerical content. Parents' spontaneous statements rarely focused on number, and those that did consisted primarily of counting, with little emphasis on labeling the cardinality of the set. However, developmental differences were observed even in this age range, such that parents were more likely to make numerical utterances when reading to older infants. Together, the results are the first to characterize naturalistic reading behaviors between parents and their preverbal infants in the context of counting books, suggesting that although counting books promote numerical language in parents, infants still receive very little in the way of numerical input before the end of the 1st year of life. While little is known regarding the impact of number talk on the cognitive development of young infants, the current results may guide future work in this area by providing the first assessment of the characteristics of parental numerical input to preverbal infants. PMID:27493639

  20. Calculation of Radar Probability of Detection in K-Distributed Sea Clutter and Noise

    DTIC Science & Technology

    2011-04-01

    Laguerre polynomials are generated from a recurrence relation, and the nodes and weights are calculated from the eigenvalues and eigenvectors of a...B.P. Flannery, Numerical Recipes in Fortran, Second Edition, Cambridge University Press (1992). 12. W. Gautschi, Orthogonal Polynomials (in Matlab...the integration, with the nodes and weights calculated using matrix methods, so that a general purpose numerical integration routine is not required
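
    The quadrature machinery the report describes (Gauss-Laguerre nodes and weights obtained from the recurrence relation via an eigenvalue problem) is available off the shelf. A minimal check using NumPy's laggauss, which here stands in for, rather than reproduces, the report's implementation:

```python
import numpy as np

# Gauss-Laguerre nodes and weights evaluate integrals of the form
#   integral_0^inf exp(-x) f(x) dx  ≈  sum_i w_i * f(x_i),
# which is the building block for averaging a detection probability
# over a K-distributed clutter texture.
nodes, weights = np.polynomial.laguerre.laggauss(20)

# Sanity check against a known integral:
#   integral_0^inf exp(-x) x^3 dx = Gamma(4) = 6,
# exact here because 20-point Gauss-Laguerre integrates polynomials
# up to degree 39 exactly.
approx = np.sum(weights * nodes ** 3)
print(round(approx, 10))
```

    In the radar application, f would be the conditional probability of detection given the clutter texture value at each node.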

  1. Higher order Stark effect and transition probabilities on hyperfine structure components of hydrogen like atoms

    NASA Astrophysics Data System (ADS)

    Pal'Chikov, V. G.

    2000-08-01

    A quantum-electrodynamical (QED) perturbation theory is developed for hydrogen and hydrogen-like atomic systems with interaction between bound electrons and radiative field being treated as the perturbation. The dependence of the perturbed energy of levels on hyperfine structure (hfs) effects and on the higher-order Stark effect is investigated. Numerical results have been obtained for the transition probability between the hfs components of hydrogen-like bismuth.

  2. Assessing Performance Tradeoffs in Undersea Distributed Sensor Networks

    DTIC Science & Technology

    2006-09-01

    time. We refer to this process as track-before-detect (see [5] for a description), since the final determination of a target presence is not made until...expressions for probability of successful search and probability of false search for modeling the track-before-detect process. We then describe a numerical...random manner (randomly sampled from a uniform distribution). II. SENSOR NETWORK PERFORMANCE MODELS We model the process of track-before-detect by

  3. Noise thresholds for optical quantum computers.

    PubMed

    Dawson, Christopher M; Haselgrove, Henry L; Nielsen, Michael A

    2006-01-20

    In this Letter we numerically investigate the fault-tolerant threshold for optical cluster-state quantum computing. We allow both photon loss noise and depolarizing noise (as a general proxy for all local noise), and obtain a threshold region of allowed pairs of values for the two types of noise. Roughly speaking, our results show that scalable optical quantum computing is possible for photon loss probabilities below 3 × 10⁻³ and for depolarization probabilities below 10⁻⁴.

  4. Evaluating a linearized Euler equations model for strong turbulence effects on sound propagation.

    PubMed

    Ehrhardt, Loïc; Cheinet, Sylvain; Juvé, Daniel; Blanc-Benon, Philippe

    2013-04-01

    Sound propagation outdoors is strongly affected by atmospheric turbulence. Under strongly perturbed conditions or long propagation paths, the sound fluctuations reach their asymptotic behavior, e.g., the intensity variance progressively saturates. The present study evaluates the ability of a numerical propagation model, based on finite-difference time-domain solving of the linearized Euler equations, to quantitatively reproduce the wave statistics under strong and saturated intensity fluctuations. It is the continuation of a previous study where weak intensity fluctuations were considered. The numerical propagation model is presented and tested with two-dimensional harmonic sound propagation over long paths and strong atmospheric perturbations. The results are compared to quantitative theoretical or numerical predictions available on the wave statistics, including the log-amplitude variance and the probability density functions of the complex acoustic pressure. The match is excellent for the evaluated source frequencies and all sound fluctuation strengths. Hence, this model captures many aspects of strong atmospheric turbulence effects on sound propagation. Finally, the model results for the intensity probability density function are compared with a standard fit by a generalized gamma function.

  5. Numeracy and Communication with Patients: They Are Counting on Us

    PubMed Central

    Paasche-Orlow, Michael K.; Remillard, Janine T.; Bennett, Ian M.; Ben-Joseph, Elana Pearl; Batista, Rosanna M.; Hyde, James; Rudd, Rima E.

    2008-01-01

    Patient-centered interactive communication between physicians and patients is recommended to improve the quality of medical care. Numerical concepts are important components of such exchanges and include arithmetic and use of percentages, as well as higher level tasks like estimation, probability, problem-solving, and risk assessment - the basis of preventive medicine. Difficulty with numerical concepts may impede communication. The current evidence on prevalence, measurement, and outcomes related to numeracy is presented, along with a summary of best practices for communication of numerical information. This information is integrated into a hierarchical model of mathematical concepts and skills, which can guide clinicians toward numerical communication that is easier to use with patients. PMID:18830764

  6. Auto-correlation of journal impact factor for consensus research reporting statements: a cohort study.

    PubMed

    Shanahan, Daniel R

    2016-01-01

    Background. The Journal Citation Reports journal impact factors (JIFs) are widely used to rank and evaluate journals, standing as a proxy for the relative importance of a journal within its field. However, numerous criticisms have been made of the use of a JIF to evaluate importance. This problem is exacerbated when the use of JIFs is extended to evaluate not only the journals, but the papers therein. The purpose of this study was therefore to investigate the relationship between the number of citations and journal JIF for identical articles published simultaneously in multiple journals. Methods. Eligible articles were consensus research reporting statements listed on the EQUATOR Network website that were published simultaneously in three or more journals. The correlation between the citation count for each article and the median journal JIF over the published period, and between the citation count and number of article accesses, was calculated for each reporting statement. Results. Nine research reporting statements were included in this analysis, representing 85 articles published across 58 journals in biomedicine. The number of citations was strongly correlated to the JIF for six of the nine reporting guidelines, with moderate correlation shown for the remaining three guidelines (median r = 0.66, 95% CI [0.45-0.90]). There was also a strong positive correlation between the number of citations and the number of article accesses (median r = 0.71, 95% CI [0.5-0.8]), although the number of data points for this analysis was limited. When adjusted for the individual reporting guidelines, each logarithm unit of JIF predicted a median increase of 0.8 logarithm units of citation counts (95% CI [-0.4 to 5.2]), and each logarithm unit of article accesses predicted a median increase of 0.1 logarithm units of citation counts (95% CI [-0.9 to 1.4]). This model explained 26% of the variance in citations (median adjusted r² = 0.26, range 0.18-1.0). Conclusion. 
The impact factor of the journal in which a reporting statement was published was shown to influence the number of citations that the statement will gather over time. Similarly, the number of article accesses also influenced the number of citations, although to a lesser extent than the impact factor. This demonstrates that citation counts are not purely a reflection of scientific merit and that the impact factor is, in fact, auto-correlated.

  7. Transmission of electrons inside the cryogenic pumps of ITER injector.

    PubMed

    Veltri, P; Sartori, E

    2016-02-01

    Large cryogenic pumps are installed in the vessels of large neutral beam injectors (NBIs) used to heat the plasma in nuclear fusion experiments. The operation of such pumps can be compromised by the presence of stray secondary electrons that are generated along the beam path. In this paper, we present a numerical model to analyze the propagation of the electrons inside the pump. The aim of the study is to quantify the power load on the active pump elements, via evaluation of the transmission probabilities across the domain of the pump. These are computed from large datasets of numerically generated particle trajectories. The transmission probability of the electrons across the domain is calculated for the ITER NBI and for its prototype, the Megavolt ITER Injector and Concept Advancement (MITICA), and the results are discussed.

  8. Phase transition in the countdown problem

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Luque, Bartolo

    2012-07-01

    We present a combinatorial decision problem, inspired by the celebrated quiz show called Countdown, that involves the computation of a given target number T from a set of k randomly chosen integers, using a set of arithmetic operations. We find that the probability of winning the game exhibits a threshold phenomenon that can be understood in terms of an algorithmic phase transition as a function of the set size k. Numerical simulations show that this probability sharply transitions from zero to one at some critical value of the control parameter, hence separating the algorithm's parameter space into different phases. We also find that the system is maximally efficient close to the critical point. We derive analytical expressions that match the numerical results for finite size and permit us to extrapolate the behavior in the thermodynamic limit.
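    The decision problem can be sketched with a brute-force solver plus a Monte Carlo estimate of the winning probability as a function of k. The digit range (1-9), target range (1-100), function names and trial counts below are illustrative assumptions, not the paper's parameters:

```python
import random
from fractions import Fraction

def solvable(nums, target):
    """Exhaustively test whether `target` is reachable from `nums`
    (or any subset of them) using +, -, *, /, with exact rationals."""
    def rec(vals):
        if target in vals:
            return True
        for i in range(len(vals)):
            for j in range(i + 1, len(vals)):
                a, b = vals[i], vals[j]
                rest = [vals[k] for k in range(len(vals)) if k not in (i, j)]
                cands = [a + b, a * b, a - b, b - a]
                if b != 0:
                    cands.append(a / b)
                if a != 0:
                    cands.append(b / a)
                if any(rec(rest + [c]) for c in cands):
                    return True
        return False
    return rec([Fraction(n) for n in nums])

def win_prob(k, trials=200, seed=1):
    """Fraction of random instances (digits 1-9, target 1-100) solvable."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        nums = [rng.randint(1, 9) for _ in range(k)]
        wins += solvable(nums, Fraction(rng.randint(1, 100)))
    return wins / trials
```

    Plotting win_prob(k) against k reproduces, qualitatively, the sharp rise in winning probability that the abstract describes.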

  9. Optimal Information Processing in Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrative cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
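    Once the joint distribution of two nodes is in hand, the information it carries is the mutual information; a minimal sketch for a discrete joint distribution given as a matrix p[x][y] (the helper name is illustrative):

```python
import math

def mutual_information(joint):
    """Mutual information (bits) of a discrete joint distribution,
    given as a nested list p[x][y] that sums to 1."""
    px = [sum(row) for row in joint]                  # marginal of x
    py = [sum(col) for col in zip(*joint)]            # marginal of y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi
```

    Independent variables give zero bits; a perfectly correlated pair of binary copy-number states gives one bit.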

  10. Stochastic dynamics and logistic population growth

    NASA Astrophysics Data System (ADS)

    Méndez, Vicenç; Assaf, Michael; Campos, Daniel; Horsthemke, Werner

    2015-06-01

    The Verhulst model is probably the best known macroscopic rate equation in population ecology. It depends on two parameters, the intrinsic growth rate and the carrying capacity. These parameters can be estimated for different populations and are related to the reproductive fitness and the competition for limited resources, respectively. We investigate analytically and numerically the simplest possible microscopic scenarios that give rise to the logistic equation in the deterministic mean-field limit. We provide a definition of the two parameters of the Verhulst equation in terms of microscopic parameters. In addition, we derive the conditions for extinction or persistence of the population by employing either the momentum-space spectral theory or the real-space Wentzel-Kramers-Brillouin approximation to determine the probability distribution function and the mean time to extinction of the population. Our analytical results agree well with numerical simulations.
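    In the deterministic mean-field limit the model reduces to the Verhulst equation dN/dt = rN(1 - N/K); a minimal sketch (forward-Euler integration with illustrative parameter values, compared against the closed-form solution):

```python
import math

def logistic_euler(r, K, N0, t_end, dt=1e-3):
    """Forward-Euler integration of dN/dt = r*N*(1 - N/K)."""
    N = N0
    for _ in range(int(t_end / dt)):
        N += dt * r * N * (1 - N / K)
    return N

def logistic_exact(r, K, N0, t):
    """Closed-form solution of the Verhulst equation."""
    return K / (1 + (K - N0) / N0 * math.exp(-r * t))
```

    For long times both approach the carrying capacity K, the deterministic counterpart of the metastable state whose decay the abstract's extinction analysis quantifies.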

  11. Probabilistic distribution and stochastic P-bifurcation of a hybrid energy harvester under colored noise

    NASA Astrophysics Data System (ADS)

    Mokem Fokou, I. S.; Nono Dueyou Buckjohn, C.; Siewe Siewe, M.; Tchawoua, C.

    2018-03-01

    In this manuscript, a hybrid energy harvesting system combining piezoelectric and electromagnetic transduction and subjected to colored noise is investigated. By using the stochastic averaging method, the stationary probability density functions of the amplitudes are obtained and reveal interesting dynamics related to the long-term behavior of the device. From the stationary probability densities, we discuss the stochastic bifurcation through qualitative changes, showing that the noise intensity, the correlation time and other system parameters can be treated as bifurcation parameters. Numerical simulations are made for comparison with the analytical findings. The mean first-passage time (MFPT) is computed numerically to investigate the stability of the system. By computing the mean residence time (TMR), we explore the stochastic resonance phenomenon and show how it is related to the correlation time of the colored noise and to high output power.

  12. Determining linear vibration frequencies of a ferromagnetic shell

    NASA Astrophysics Data System (ADS)

    Bagdoev, A. G.; Vardanyan, A. V.; Vardanyan, S. V.; Kukudzhanov, V. N.

    2007-10-01

    The problems of determining the roots of dispersion equations for free bending vibrations of thin magnetoelastic plates and shells are of both theoretical and practical interest, in particular, in studying vibrations of metallic structures used in controlled thermonuclear reactors. These problems were solved on the basis of the Kirchhoff hypothesis in [1-5]. In [6], an exact spatial approach to determining the vibration frequencies of thin plates was suggested, and it was shown that it completely agrees with the solution obtained according to the Kirchhoff hypothesis. In [7-9], this exact approach was used to solve the problem of vibrations of thin magnetoelastic plates, and it was shown by cumbersome calculations that the solutions obtained according to the exact theory and the Kirchhoff hypothesis differ substantially except in a single case. In [10], the equations of the dynamic theory of elasticity in the axisymmetric problem are given. In [11], the equations for the vibration frequencies of thin ferromagnetic plates with arbitrary conductivity were obtained in the exact statement. In [12], the Kirchhoff hypothesis was used to obtain dispersion relations for a magnetoelastic thin shell. In [5, 13-16], the relations for the Maxwell tensor and the ponderomotive force for magnetics were presented. In [17], the dispersion relations for thin ferromagnetic plates in the transverse field in the spatial statement were studied analytically and numerically. In the present paper, on the basis of the exact approach, we study free bending vibrations of a thin ferromagnetic cylindrical shell. We obtain the exact dispersion equation in the form of a sixth-order determinant, which can be solved numerically in the case of a magnetoelastic thin shell. The numerical results are presented in tables and compared with the results obtained by the Kirchhoff hypothesis. We show a large number of differences in the results, even for the lowest frequency.

  13. The electromagnetic pendulum in quickly changing magnetic field of constant intensity

    NASA Astrophysics Data System (ADS)

    Rodyukov, F. F.; Shepeljavyi, A. I.

    2018-05-01

    The Lagrange-Maxwell equations for a pendulum in the form of a conductive frame, suspended in a uniform sinusoidal electromagnetic field of constant intensity, are obtained. Simplified mathematical models are derived by the traditional method of separating fast and slow motions with subsequent averaging over the fast time. It is shown that this traditional approach may lead to inappropriate mathematical models, and ways to avoid this in the present case are suggested. The main statements are illustrated by numerical experiments.

  14. Lens ray diagrams with a spreadsheet

    NASA Astrophysics Data System (ADS)

    González, Manuel I.

    2018-05-01

    Physicists customarily create spreadsheets to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful mixture of standard Excel functions allows one to display a realistic automated ray diagram. The suggested spreadsheet is intended as an auxiliary didactic tool for instructors who wish to teach their students to create their own ray diagrams.

  15. Information Management System for the California State Water Resources Control Board (SWRCB)

    NASA Technical Reports Server (NTRS)

    Heald, T. C.; Redmann, G. H.

    1973-01-01

    A study was made to establish the requirements for an integrated state-wide information management system for water quality control and water rights for the State of California. The data sources and end requirements were analyzed for the data collected and used by the numerous agencies, both State and Federal, as well as the nine Regional Boards under the jurisdiction of the State Board. The report details the data interfaces and outlines the system design. A program plan and statement of work for implementation of the project are included.

  16. Numerical and analytical bounds on threshold error rates for hypergraph-product codes

    NASA Astrophysics Data System (ADS)

    Kovalev, Alexey A.; Prabhakar, Sanjay; Dumer, Ilya; Pryadko, Leonid P.

    2018-06-01

    We study analytically and numerically decoding properties of finite-rate hypergraph-product quantum low density parity-check codes obtained from random (3,4)-regular Gallager codes, with a simple model of independent X and Z errors. Several nontrivial lower and upper bounds for the decodable region are constructed analytically by analyzing the properties of the homological difference, equal to minus the logarithm of the maximum-likelihood decoding probability for a given syndrome. Numerical results include an upper bound for the decodable region from specific heat calculations in associated Ising models and a minimum-weight decoding threshold of approximately 7%.

  17. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
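    The plain Monte Carlo baseline that the abstract compares against can be sketched for a single precise probability model (the linear limit state and standard-normal inputs are illustrative assumptions; subset simulation and the random-set bounds themselves are beyond this sketch):

```python
import math
import random

def mc_failure_prob(n=200_000, seed=0):
    """Plain Monte Carlo estimate of P[g(X) < 0] for the linear limit
    state g(x1, x2) = 3 - (x1 + x2)/sqrt(2) with x_i ~ N(0, 1).
    The exact answer is Phi(-3) ~ 1.35e-3."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        x1, x2 = rng.gauss(0, 1), rng.gauss(0, 1)
        if 3 - (x1 + x2) / math.sqrt(2) < 0:
            fails += 1
    return fails / n
```

    The small failure probability illustrates why variance-reduction schemes such as subset simulation pay off: plain sampling needs very many draws per digit of accuracy.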

  18. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
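    The finite-time failure probability, the chance that a risk process crosses a critical level within [0, T], can also be estimated by simulation. This toy insurance-style sketch (Poisson claim arrivals, exponential claim sizes; all parameter values illustrative) is a numerical stand-in, not the article's explicit expressions:

```python
import random

def finite_time_ruin_prob(u, c=1.2, lam=1.0, T=50.0, n=10_000, seed=42):
    """Monte Carlo estimate of P(u + c*t - S(t) < 0 for some t <= T),
    where S(t) is a compound Poisson process (rate lam, exponential
    claims of mean 1), u is the initial reserve, c the premium rate."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)       # next claim arrival
            if t > T:
                break
            claims += rng.expovariate(1.0)  # claim size
            if u + c * t - claims < 0:      # critical level crossed
                ruined += 1
                break
    return ruined / n
```

    Raising the initial reserve u lowers the finite-time failure probability, as expected.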

  19. Steady state, relaxation and first-passage properties of a run-and-tumble particle in one-dimension

    NASA Astrophysics Data System (ADS)

    Malakar, Kanaya; Jemseena, V.; Kundu, Anupam; Vijay Kumar, K.; Sabhapandit, Sanjib; Majumdar, Satya N.; Redner, S.; Dhar, Abhishek

    2018-04-01

    We investigate the motion of a run-and-tumble particle (RTP) in one dimension. We find the exact probability distribution of the particle with and without diffusion on the infinite line, as well as in a finite interval. In the infinite domain, this probability distribution approaches a Gaussian form in the long-time limit, as in the case of a regular Brownian particle. At intermediate times, this distribution exhibits unexpected multi-modal forms. In a finite domain, the probability distribution reaches a steady-state form with peaks at the boundaries, in contrast to a Brownian particle. We also study the relaxation to the steady-state analytically. Finally we compute the survival probability of the RTP in a semi-infinite domain with an absorbing boundary condition at the origin. In the finite interval, we compute the exit probability and the associated exit times. We provide numerical verification of our analytical results.
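    The accumulation at the boundaries of a finite interval can be checked with a direct simulation of a diffusion-free run-and-tumble particle (a discrete-time sketch; the hard-wall rule, holding the particle at the wall until a tumble reverses its velocity, and all parameter values are illustrative assumptions):

```python
import random

def rtp_occupancy(v=1.0, gamma=1.0, L=1.0, dt=1e-3, T=200.0,
                  bins=20, seed=7):
    """Histogram of time spent by a run-and-tumble particle in [0, L]
    with hard walls; the direction flips at rate gamma."""
    rng = random.Random(seed)
    x, s = 0.5 * L, 1
    counts = [0] * bins
    for _ in range(int(T / dt)):
        if rng.random() < gamma * dt:             # tumble: reverse direction
            s = -s
        x = min(max(x + s * v * dt, 0.0), L)      # move, clamp at walls
        counts[min(int(x / L * bins), bins - 1)] += 1
    return counts
```

    The edge bins dominate the histogram, reflecting the steady-state peaks at the boundaries that distinguish the RTP from a Brownian particle.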

  20. A numerical 4D Collision Risk Model

    NASA Astrophysics Data System (ADS)

    Schmitt, Pal; Culloch, Ross; Lieber, Lilian; Kregting, Louise

    2017-04-01

    With the growing number of marine renewable energy (MRE) devices being installed across the world, some concern has been raised about the possibility of harming mobile marine fauna by collision. Although physical contact between an MRE device and an organism has not been reported to date, these novel sub-sea structures pose a challenge for accurately estimating collision risks as part of environmental impact assessments. Even if the animal motion is simplified to linear translation, ignoring likely evasive behaviour, the mathematical problem of establishing an impact probability is not trivial. We present a numerical algorithm to obtain such probability distributions using transient, four-dimensional simulations of a novel marine renewable device concept, Deep Green, Minesto's power plant, hereafter referred to as the 'kite', which flies in a figure-of-eight configuration. Simulations were carried out for several configurations, altering kite depth, kite speed and kite trajectory while keeping the speed of the moving object constant. Since the kite assembly is defined as two parts in the model, a tether (attached to the seabed) and the kite, the collision risk of each part is reported independently. By comparing the number of collisions with the number of collision-free simulations, a probability of impact is computed for each simulated position in the cross-section of the area. Results suggest that close to the bottom, where the tether amplitude is small, the path is always blocked and the impact probability is 100%, as expected. However, higher up in the water column, the collision probability is twice as high along the mid line, where the tether passes twice per period, as at the extremes of its trajectory. The collision probability distribution is much more complex in the upper part of the water column, where the kite and tether can simultaneously collide with the object. 
Results demonstrate the viability of such models, which can also incorporate empirical field data for assessing the probability of collision risk of animals with an MRE device under varying operating conditions.
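    A much simplified 2D analogue of such a model can be sketched as follows: a point "kite" sweeps a figure-of-eight path while an object crosses the plane at a fixed height, and the impact probability at that height is the fraction of random kite phases producing a close approach. The geometry, speeds and collision radius here are invented for illustration and are unrelated to the Deep Green simulations:

```python
import math
import random

def collision_prob(y_obj, n_phases=1000, radius=0.05, seed=4):
    """Fraction of random kite phases for which an object crossing at
    height y_obj passes within `radius` of the kite. The kite flies a
    figure-of-eight (x, y) = (sin 2*pi*s, 0.5*sin 4*pi*s) of period 1;
    the object travels from x = -1 to x = +1 at unit speed."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_phases):
        phase = rng.random()
        for step in range(400):
            t = step * 0.005                 # 0 <= t <= 2
            s = t + phase
            xk = math.sin(2 * math.pi * s)
            yk = 0.5 * math.sin(4 * math.pi * s)
            if math.hypot(xk - (t - 1.0), yk - y_obj) < radius:
                hits += 1
                break
    return hits / n_phases
```

    Heights outside the swept envelope give zero probability, while heights crossed by the path give a finite one, the same blocked/unblocked structure the full 4D model resolves.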

  1. A spatially explicit model for an Allee effect: why wolves recolonize so slowly in Greater Yellowstone.

    PubMed

    Hurford, Amy; Hebblewhite, Mark; Lewis, Mark A

    2006-11-01

    A reduced probability of finding mates at low densities is a frequently hypothesized mechanism for a component Allee effect. At low densities dispersers are less likely to find mates and establish new breeding units. However, many mathematical models for an Allee effect do not make a distinction between breeding group establishment and subsequent population growth. Our objective is to derive a spatially explicit mathematical model, where dispersers have a reduced probability of finding mates at low densities, and parameterize the model for wolf recolonization in the Greater Yellowstone Ecosystem (GYE). In this model, only the probability of establishing new breeding units is influenced by the reduced probability of finding mates at low densities. We analytically and numerically solve the model to determine the effect of a decreased probability in finding mates at low densities on population spread rate and density. Our results suggest that a reduced probability of finding mates at low densities may slow recolonization rate.

  2. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
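    For a single-degree-of-freedom analogue, the Routh-Hurwitz criterion for m s² + c s + k reduces to c > 0 and k > 0, so the probability of instability under parameter uncertainty can be estimated by sampling. The Gaussian parameter distributions below are illustrative, and plain Monte Carlo is used in place of the paper's fast probability integration and adaptive importance sampling:

```python
import random

def p_instability(n=100_000, seed=0):
    """Probability that m*x'' + c*x' + k*x = 0 is unstable when the
    damping c and stiffness k are uncertain. By Routh-Hurwitz, this
    second-order system is stable iff c > 0 and k > 0 (m > 0 fixed)."""
    rng = random.Random(seed)
    unstable = 0
    for _ in range(n):
        c = rng.gauss(1.0, 0.5)   # damping: mean 1.0, sd 0.5
        k = rng.gauss(4.0, 2.0)   # stiffness: mean 4.0, sd 2.0
        if c <= 0 or k <= 0:
            unstable += 1
    return unstable / n
```

    For independent Gaussians the exact answer is 1 - Φ(2)² ≈ 0.0454, which the estimate reproduces.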

  3. Cuba: Multidimensional numerical integration library

    NASA Astrophysics Data System (ADS)

    Hahn, Thomas

    2016-08-01

    The Cuba library offers four independent routines for multidimensional numerical integration: Vegas, Suave, Divonne, and Cuhre. The four algorithms work by very different methods; all can integrate vector integrands and have very similar Fortran, C/C++, and Mathematica interfaces. Their invocation is very similar, making it easy to cross-check results by substituting one method for another. As a further safeguard, the output is supplemented by a chi-square probability which quantifies the reliability of the error estimate.
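    The cross-checking idea, substituting one integration method for another and comparing results, can be illustrated without Cuba itself, using an illustrative integrand on the unit cube and two unrelated methods (plain Monte Carlo versus a midpoint rule):

```python
import math
import random

def f(x, y, z):
    return math.sin(x) * y * z   # exact integral over [0,1]^3: (1 - cos 1)/4

def mc_integrate(n=200_000, seed=3):
    """Plain Monte Carlo estimate of the integral of f over [0,1]^3."""
    rng = random.Random(seed)
    return sum(f(rng.random(), rng.random(), rng.random())
               for _ in range(n)) / n

def midpoint_integrate(m=40):
    """Deterministic midpoint rule on an m^3 grid."""
    h = 1.0 / m
    total = 0.0
    for i in range(m):
        for j in range(m):
            for k in range(m):
                total += f((i + 0.5) * h, (j + 0.5) * h, (k + 0.5) * h)
    return total * h ** 3
```

    Agreement between two very different methods is the same kind of reliability check Cuba enables across Vegas, Suave, Divonne, and Cuhre.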

  4. Secondary school science teaching, 1970--1992: Objectives as stated in periodical literature

    NASA Astrophysics Data System (ADS)

    Hemby, Brian Franklin

    Purpose of the study. The major purpose of this study was to identify and classify objectives for teaching science in secondary schools in the United States during the period 1970--1992. These objectives were identified by objective statements in articles from selected professional periodicals. Procedure. The 1970--1992 period was divided into two subperiods on the basis of major historical events. Selected professional periodicals were searched for statements of objectives of secondary school science teaching. These statements were catalogued into Knowledge, Process, Attitude and Interest, or Cultural Awareness categories. The resulting data were classified within and across the two subperiods according to frequency of occurrence, category, authorship, and year. Findings. The major findings of this investigation included the following: (1) Authors in Higher Education produced the most articles, both research-oriented and nonresearch-oriented, and the most statements in each subperiod. Miscellaneous authors produced the least articles and statements. (2) Statements in the Process category were most frequent in the two subperiods. (3) The "most important" objectives for secondary school science teaching were Philosophical, sociological, and political aspects (from the Cultural Awareness category), Processes, skills, and techniques (from the Process category), and Major facts, principles, or fundamentals (from the Knowledge category). (4) Attitude and Interest objectives were consistently ranked as least important throughout the study. (5) The ranking of "most important" objectives in research-oriented articles generally agreed with the ranking in articles as a whole. Conclusions. 
Based on the findings of this investigation, the following conclusions were made: (1) The objectives for teaching secondary school science were influenced by historical events, especially the Vietnam War, the Cold War, the AIDS pandemic, and the publication of A Nation at Risk: The Imperative for Educational Reform. (2) Authors in Higher Education wrote more articles about the objectives for the teaching of secondary school science than those in the other categories. This was probably a reflection of the "publish or perish" environment in many colleges and universities. (3) The most important objectives for secondary school science teaching were Philosophical, sociological, and political aspects, Processes, skills, and techniques, and Major facts, principles, or fundamentals. The preponderance of these objectives is most likely a result of cultural and social unrest during this period. (4) The number of research-oriented articles, as a percentage of all articles, doubled from the first subperiod to the second subperiod. There appears to be a trend during the second subperiod toward more data-based articles.

  5. Aging ballistic Lévy walks

    NASA Astrophysics Data System (ADS)

    Magdziarz, Marcin; Zorawik, Tomasz

    2017-02-01

    Aging can be observed for numerous physical systems. In such systems, statistical properties [like the probability distribution, mean square displacement (MSD), first-passage time] depend on the time span ta between the initialization and the beginning of observations. In this paper we study aging properties of ballistic Lévy walks and two closely related jump models: wait-first and jump-first. We calculate explicitly their probability distributions and MSDs. It turns out that despite similarities these models react very differently to the delay ta. Aging weakly affects the shape of the probability density function and MSD of standard Lévy walks. For the jump models the shape of the probability density function is changed drastically. Moreover, for the wait-first jump model we observe different behaviors of the MSD when ta≪t and when ta≫t.

  6. Short-term capture of the Earth-Moon system

    NASA Astrophysics Data System (ADS)

    Qi, Yi; de Ruiter, Anton

    2018-06-01

    In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose an STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.

  7. Transition probability, dynamic regimes, and the critical point of financial crisis

    NASA Astrophysics Data System (ADS)

    Tang, Yinan; Chen, Ping

    2015-07-01

    An empirical and theoretical analysis of financial crises is conducted based on statistical mechanics in non-equilibrium physics. The transition probability provides a new tool for diagnosing a changing market. Both calm and turbulent markets can be described by a birth-death process for price movements driven by identical agents. The transition probability in a time window can be estimated from stock market indexes. Positive and negative feedback trading behaviors can be revealed by the upper and lower curves of the transition probability. Three dynamic regimes, linear, quasi-linear, and nonlinear, are discovered across two time periods. There is a clear link between liberalization policy and market nonlinearity. A numerical estimate of the market turning point is close to the historical onset of the US 2008 financial crisis.
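    The idea of reading feedback trading off transition probabilities can be sketched on synthetic up/down move sequences. The persistence model below is a toy assumption for illustration, not the paper's birth-death estimator:

```python
import random

def transition_probs(moves):
    """Estimate P(up | previous up) and P(up | previous down)
    from a sequence of +1/-1 index moves."""
    up_up = n_up = up_dn = n_dn = 0
    for prev, cur in zip(moves, moves[1:]):
        if prev > 0:
            n_up += 1
            up_up += cur > 0
        else:
            n_dn += 1
            up_dn += cur > 0
    return up_up / n_up, up_dn / n_dn

def feedback_series(n, persist, seed):
    """Toy index moves: repeat the last move with probability `persist`.
    persist = 0.5 mimics a calm market; persist > 0.5, positive feedback."""
    rng = random.Random(seed)
    s = [1]
    for _ in range(n - 1):
        s.append(s[-1] if rng.random() < persist else -s[-1])
    return s
```

    In the herding series, P(up | up) rises well above P(up | down), the asymmetry between the upper and lower transition-probability curves that signals feedback trading.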

  8. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    NASA Astrophysics Data System (ADS)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among the orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters and the size of the identified group; it differs for groups of 2, 3, 4, … members. The probability of coincidental clustering is assessed by numerical simulation, and therefore it also depends on the method used to generate the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample, we have assessed the probability of random pairing among several orbital populations of different sizes, and found how these probabilities vary with the size of the orbital sample. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can be significantly different for orbital samples obtained by different observation techniques. For the user's convenience, we have also obtained several formulae which, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small probability of coincidental similarity between two orbits.
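    The dependence of the coincidental-pairing probability on sample size is a birthday-problem effect that a toy simulation makes concrete. Here "orbits" are random points in the unit square and similarity is Euclidean distance, a stand-in for a proper orbital similarity D-criterion:

```python
import math
import random

def coincidental_pair_prob(n_orbits, threshold=0.05, trials=500, seed=9):
    """Probability that a random sample of `n_orbits` points contains
    at least one pair closer than `threshold`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pts = [(rng.random(), rng.random()) for _ in range(n_orbits)]
        hits += any(math.dist(a, b) < threshold
                    for i, a in enumerate(pts) for b in pts[i + 1:])
    return hits / trials
```

    The probability grows quickly with sample size, which is why the similarity threshold must be tightened as the searched orbital sample grows.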

  9. On the estimation of phase synchronization, spurious synchronization and filtering

    NASA Astrophysics Data System (ADS)

    Rios Herrera, Wady A.; Escalona, Joaquín; Rivera López, Daniel; Müller, Markus F.

    2016-12-01

    Phase synchronization, viz., the adjustment of the instantaneous frequencies of two interacting self-sustained nonlinear oscillators, is frequently used for the detection of a possible interrelationship between empirical data recordings. In this context, the proper estimation of the instantaneous phase from a time series is a crucial aspect. The probability that numerical estimates carry a physically relevant meaning depends sensitively on the shape of the signal's power spectral density. For this purpose, the power spectrum should be narrow-banded, possessing only one prominent peak [M. Chavez et al., J. Neurosci. Methods 154, 149 (2006)]. If this condition is not fulfilled, band-pass filtering seems to be the adequate technique for pre-processing data for a posterior synchronization analysis. However, it was reported that band-pass filtering might induce spurious synchronization [L. Xu et al., Phys. Rev. E 73, 065201(R) (2006); J. Sun et al., Phys. Rev. E 77, 046213 (2008); and J. Wang and Z. Liu, EPL 102, 10003 (2013)], a statement that, without further specification, casts uncertainty over all measures that aim to quantify phase synchronization of broadband field data. We show, using signals derived from different test frameworks, that appropriate filtering does not induce spurious synchronization. Instead, filtering in the time domain tends to wash out existing phase interrelations between signals. Furthermore, we show that measures derived for the estimation of phase synchronization, like the mean phase coherence, are also useful for the detection of interrelations between time series that are not necessarily derived from coupled self-sustained nonlinear oscillators.
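    The mean phase coherence mentioned at the end is simply R = |⟨exp(iΔφ)⟩| over the instantaneous phase differences; a minimal sketch with synthetic phases (phase-locked with jitter versus independent), rather than Hilbert-transformed field data:

```python
import cmath
import math
import random

def mean_phase_coherence(phases1, phases2):
    """R = |<exp(i*(phi1 - phi2))>|: 1 for perfect phase locking,
    close to 0 for independent phases."""
    z = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases1, phases2))
    return abs(z / len(phases1))

rng = random.Random(2)
n = 5000
base = [2 * math.pi * 10 * 0.002 * k for k in range(n)]      # 10 Hz phase ramp
locked = [p + 0.7 + 0.1 * rng.gauss(0, 1) for p in base]     # constant lag + jitter
independent = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
```

    A constant phase lag with small jitter gives R near 1; unrelated phases give R near 0, which is what makes R usable as an interrelation detector.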

  10. CRIB; the mineral resources data bank of the U.S. Geological Survey

    USGS Publications Warehouse

    Calkins, James Alfred; Kays, Olaf; Keefer, Eleanor K.

    1973-01-01

    The recently established Computerized Resources Information Bank (CRIB) of the U.S. Geological Survey is expected to play an increasingly important role in the study of United States' mineral resources. CRIB provides a rapid means for organizing and summarizing information on mineral resources and for displaying the results. CRIB consists of a set of variable-length records containing the basic information needed to characterize one or more mineral commodities, a mineral deposit, or several related deposits. The information consists of text, numeric data, and codes. Some topics covered are: name, location, commodity information, geology, production, reserves, potential resources, and references. The data are processed by the GIPSY program, which performs all the processing tasks needed to build, operate, and maintain the CRIB file. The sophisticated retrieval program allows the user to make highly selective searches of the files for words, parts of words, phrases, numeric data, word ranges, numeric ranges, and others, and to interrelate variables by logic statements to any degree of refinement desired. Three print options are available, or the retrieved data can be passed to another program for further processing.

  11. An Entropy-Based Approach to Nonlinear Stability

    NASA Technical Reports Server (NTRS)

    Merriam, Marshal L.

    1989-01-01

    Many numerical methods used in computational fluid dynamics (CFD) incorporate an artificial dissipation term to suppress spurious oscillations and control nonlinear instabilities. The same effect can be accomplished by using upwind techniques, sometimes augmented with limiters to form Total Variation Diminishing (TVD) schemes. An analysis based on numerical satisfaction of the second law of thermodynamics allows many such methods to be compared and improved upon. A nonlinear stability proof is given for discrete scalar equations arising from a conservation law. Solutions to such equations are bounded in the L sub 2 norm if the second law of thermodynamics is satisfied in a global sense over a periodic domain. It is conjectured that an analogous statement is true for discrete equations arising from systems of conservation laws. Analysis and numerical experiments suggest that a more restrictive condition, a positive entropy production rate in each cell, is sufficient to exclude unphysical phenomena such as oscillations and expansion shocks. Construction of schemes which satisfy this condition is demonstrated for linear and nonlinear wave equations and for the one-dimensional Euler equations.

  12. Investigation of possibility of surface rupture derived from PFDHA and calculation of surface displacement based on dislocation

    NASA Astrophysics Data System (ADS)

    Inoue, N.; Kitada, N.; Irikura, K.

    2013-12-01

    A probability of surface rupture is important for configuring the seismic source, such as area sources or fault models, in a seismic hazard evaluation. In Japan, Takemura (1998) estimated this probability from historical earthquake data, and Kagawa et al. (2004) evaluated it from numerical simulations of surface displacement. The estimated probability follows a sigmoid curve and increases between Mj (the local magnitude defined and calculated by the Japan Meteorological Agency) = 6.5 and Mj = 7.0. The probability of surface rupture is also used in probabilistic fault displacement hazard analysis (PFDHA). The probability is determined from a compiled earthquake catalog in which events are classified into two categories: with or without surface rupture. Logistic regression is then performed on the classified earthquake data. Youngs et al. (2003), Ross and Moss (2011), and Petersen et al. (2011) present logistic curves for the probability of surface rupture for normal, reverse, and strike-slip faults, respectively. Takao et al. (2013) show a logistic curve derived from Japanese earthquake data only; this curve increases more sharply over a narrower magnitude range than the other curves. In this study, we estimated the probability of surface rupture by applying logistic regression to calculated surface displacements. A source fault was defined according to the procedure of Kagawa et al. (2004), which determines a seismic moment from the magnitude and estimates the asperity area and the amount of slip. Strike-slip and reverse faults were considered as source faults. The calculations followed Wang et al. (2003). Surface displacements for the defined source faults were computed while varying the depth of the fault. A threshold of 5 cm of surface displacement was used to judge whether a rupture reaches the surface.
We then carried out logistic regression on the calculated displacements, classified by the above threshold. The estimated probability curve showed a trend similar to the result of Takao et al. (2013), with the probability for reverse faults larger than that for strike-slip faults. PFDHA results, on the other hand, show a different trend: at higher magnitudes the probability for reverse faults is lower than that for strike-slip and normal faults. Ross and Moss (2011) suggested that sediment and/or rock above the fault compress, so that the displacement does not fully reach the surface. The numerical theory applied in this study cannot handle complex initial conditions such as topography.
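    As a minimal sketch of the logistic-regression step described above, the following fits P(surface rupture | magnitude) to binary rupture labels. The magnitudes, labels, and the assumed crossover near Mj 6.75 are synthetic illustrations, not the study's data.

```python
import numpy as np

# Hypothetical illustration of the PFDHA-style logistic regression:
# fit P(surface rupture | magnitude) to synthetic binary labels.
rng = np.random.default_rng(0)
mag = rng.uniform(5.5, 8.0, 500)
# assume a "true" sigmoid centered near Mj 6.75 to generate labels
p_true = 1.0 / (1.0 + np.exp(-4.0 * (mag - 6.75)))
ruptured = (rng.random(500) < p_true).astype(float)

# logistic model P = sigmoid(b0 + b1*x) with centered magnitude x,
# fitted by plain gradient descent on the negative log-likelihood
x = mag - 6.75
b0, b1 = 0.0, 0.0
for _ in range(20000):
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    b0 -= 0.5 * np.mean(p - ruptured)
    b1 -= 0.5 * np.mean((p - ruptured) * x)

m50 = 6.75 - b0 / b1  # magnitude at which P(rupture) = 0.5
```

    The fitted curve recovers a sigmoid whose 50% crossover sits near the magnitude used to generate the labels, mirroring the sharp increase between Mj 6.5 and 7.0 noted above.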

  13. Comments of statistical issue in numerical modeling for underground nuclear test monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W.L.; Anderson, K.K.

    1993-03-01

    The Symposium concluded with prepared summaries by four experts in the disciplines involved. These experts made no mention of statistics or the statistical content of the issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon those extemporaneous remarks. Statistical ideas may be helpful in resolving some numerical modeling issues. Specifically, we comment first on the role of statistical design and analysis in the quantification process, to answer the question "what do we know about the numerical modeling of underground nuclear tests?", and second on the peculiar nature of uncertainty analysis for situations involving numerical modeling. The simulations described in the workshop, though associated with topic areas, were basically sets of examples. Each simulation was tuned towards agreeing with either empirical evidence or an expert's opinion of what empirical evidence would be. While the discussions were reasonable, whether the embellishments were correct or a forced fitting of reality is unclear, which illustrates that "simulation is easy." We also suggest that these examples of simulation are typical and that the questions concerning legitimacy and the role of knowing the reality are fair, in general, with respect to simulation. The answers will help us understand why "prediction is difficult."

  14. Final Environmental Impact Statement for Langley

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The Langley Research Center is described, together with the nature of its activities, from which it can be seen that the Center is basically not a major pollution source. Geographical, geological, and climatic characteristics of the site are also described, inasmuch as they influence both the choice of disposal methods and the environmental effects of the pollutants. The known or probable pollution sources at the Center are described. Where the intensities of these sources might exceed the recommended guidelines, the corrective actions that have been taken or are being taken are described. The entire inventory of pollution sources and control methods is summarized in an appendix.

  15. [GINA 2014: Yin and Yang].

    PubMed

    Kardos, P

    2014-12-01

    After 8 years, the Global Initiative for Asthma (GINA) presented a fully revised report. In May 2014 the new GINA was published online [www.ginasthma.org]. At a live GINA session at the European Respiratory Society (ERS) conference 2014 in Munich, members of the board of directors and of the science committee presented the new contents. For example, the statement from page one that GINA is "not a guideline, but a practical approach to managing asthma in clinical practice" was explicitly emphasized at the ERS. This may reflect a shift towards a more pragmatic approach (but probably also a fear of liability). © Georg Thieme Verlag KG Stuttgart · New York.

  16. Adult acute megakaryoblastic leukemia: rare association with cytopenias of undetermined significance and p210 and p190 BCR–ABL transcripts

    PubMed Central

    Trifa, Adrian; Selicean, Cristina; Moisoiu, Vlad; Frinc, Ioana; Zdrenghea, Mihnea; Tomuleasa, Ciprian

    2017-01-01

    Acute megakaryoblastic leukemia (M7-AML) is a rare form of acute myeloid leukemia (AML) associated with poor prognosis. The case presented in the current report illustrates the difficult diagnosis and clinical management of M7-AML in the context of a previous hematologic disorder of undetermined significance and associated genetic abnormalities. Following complete hematologic remission after induction chemotherapy plus tyrosine kinase inhibitor therapy, the clinical management of this case will probably proceed to an allogeneic bone marrow transplantation, the only therapy proven to improve overall survival. PMID:29089774

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Senate consideration of S. 1730 resulted in a recommendation for passage of a bill which would open up onshore oil and gas leasing and eliminate the known geological structure (KGS) requirement for competitive bidding. The Energy and Natural Resources Committee report reviews the 1920 Mineral Leasing Act, and notes that 95% of outstanding leases were awarded on a noncompetitive basis using the KGS designation. The report examines problems under the current simultaneous oil and gas (SIMO) lottery system and the probable impacts of amending the 1920 law, and analyzes each section of the proposed bill. The report concludes with correspondence, statements, and an explanation of changes which will be required in current law.

  18. Numerical model for thermodynamical behaviors of unsaturated soil

    NASA Astrophysics Data System (ADS)

    Miyamoto, Yuji; Yamada, Mitsuhide; Sako, Kazunari; Araki, Kohei; Kitamura, Ryosuke

    Kitamura et al. have proposed numerical models to establish an unsaturated soil mechanics aided by probability theory and statistics, and to apply that mechanics to a geo-simulator, for which a numerical model of the thermodynamical behavior of unsaturated soil is essential. In this paper, thermodynamics is introduced to investigate heat transfer through unsaturated soil and the evaporation of pore water in soil, based on the first and second laws of thermodynamics, i.e., the conservation of energy and increasing entropy. In addition, a lysimeter is used to obtain data on the evaporation of pore water during fine days and the seepage of rain water during rainy days. A numerical simulation is carried out using the proposed model, and the results are compared with those obtained from the lysimeter test.

  19. The Psychology of Hazard Risk Perception

    NASA Astrophysics Data System (ADS)

    Thompson, K. F.

    2012-12-01

    A critical step in preparing for natural hazards is understanding the risk: what is the hazard, its likelihood and range of impacts, and what are the vulnerabilities of the community? Any hazard forecast naturally includes a degree of uncertainty, and often these uncertainties are expressed in terms of probabilities. There is often a strong understanding of probability among the physical scientists and emergency managers who create hazard forecasts and issue watches, warnings, and evacuation orders, and often such experts expect similar levels of risk fluency among the general public—indeed, the Working Group on California Earthquake Probabilities (WGCEP) states in the introduction to its earthquake rupture forecast maps that "In daily living, people are used to making decisions based on probabilities—from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [1] However, cognitive psychologists have shown in numerous studies [see, e.g., 2-5] that the WGCEP's expectation of probability literacy is inaccurate. People neglect, distort, misjudge, or misuse probability information, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [6]. Even the most ubiquitous of probabilistic information—weather forecasts—are systematically misinterpreted [7]. So while disaster risk analysis and assessment is undoubtedly a critical step in public preparedness and hazard mitigation plans, it is equally important that scientists and practitioners understand the common psychological barriers to accurate probability perception before they attempt to communicate hazard risks to the public. 
This paper discusses several common, systematic distortions in probability perception and use, including: the influence of personal experience on use of statistical information; temporal discounting and construal level theory; the effect of instrumentality on risk perception; and the impact of "false alarms" or "near misses." We conclude with practical recommendations for ways that risk communications may best be presented to avoid (or, in some cases, to capitalize on) these typical psychological hurdles to the understanding of risk. 1 http://www.scec.org/ucerf/ 2 Kahneman, D. & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, XLVII: 263-291. 3 Hau, R., Pleskac, T. J., Kiefer, J., & Hertwig, R. (2008). The Description/Experience Gap in Risky Choice: The Role of Sample Size and Experienced Probabilities. Journal of Behavioral Decision Making, 21: 493-518. 4 Lichtenstein, S., Slovic, P., Fischhoff, B., Layman, M., & Combs, B. (1978). Judged frequency of lethal events. JEP: Human Learning and Memory, 4, 551-578. 5 Hertwig, R., Barron, G., Weber, E. U., & Erev, I. (2006). The role of information sampling in risky choice. In K. Fiedler, & P. Juslin (Eds.), Information sampling and adaptive cognition. (pp. 75-91). New York: Cambridge U Press. 6 Budescu, DV, Weinberg, S & Wallsten, TS (1987). Decisions based on numerically and verbally expressed uncertainties. JEP: Human Perception and Performance, 14(2), 281-294. 7 Gigerenzer, G., Hertwig, R., Van Den Broek, E., Fasolo, B., & Katsikopoulos, K. V. (2005). "A 30% chance of rain tomorrow": How does the public understand probabilistic weather forecasts? Risk Analysis, 25(3), 623-629.

  20. Preformulation considerations for controlled release dosage forms. Part III. Candidate form selection using numerical weighting and scoring.

    PubMed

    Chrzanowski, Frank

    2008-01-01

    Two numerical methods, Decision Analysis (DA) and Potential Problem Analysis (PPA) are presented as alternative selection methods to the logical method presented in Part I. In DA properties are weighted and outcomes are scored. The weighted scores for each candidate are totaled and final selection is based on the totals. Higher scores indicate better candidates. In PPA potential problems are assigned a seriousness factor and test outcomes are used to define the probability of occurrence. The seriousness-probability products are totaled and forms with minimal scores are preferred. DA and PPA have never been compared to the logical-elimination method. Additional data were available for two forms of McN-5707 to provide complete preformulation data for five candidate forms. Weight and seriousness factors (independent variables) were obtained from a survey of experienced formulators. Scores and probabilities (dependent variables) were provided independently by Preformulation. The rankings of the five candidate forms, best to worst, were similar for all three methods. These results validate the applicability of DA and PPA for candidate form selection. DA and PPA are particularly applicable in cases where there are many candidate forms and where each form has some degree of unfavorable properties.
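    The two scoring schemes can be sketched in a few lines; the property names, weights, scores, seriousness factors, and candidate forms below are invented for illustration, not the McN-5707 data.

```python
# Decision Analysis (DA): weighted scores, higher totals preferred
weights = {"stability": 5, "solubility": 4, "hygroscopicity": 3}
scores = {
    "Form A": {"stability": 8, "solubility": 6, "hygroscopicity": 9},
    "Form B": {"stability": 9, "solubility": 4, "hygroscopicity": 7},
}
da_totals = {c: sum(weights[p] * s[p] for p in weights)
             for c, s in scores.items()}
da_best = max(da_totals, key=da_totals.get)    # highest weighted score wins

# Potential Problem Analysis (PPA): seriousness x probability of occurrence,
# summed per candidate; lower totals preferred
seriousness = {"polymorph conversion": 9, "poor powder flow": 4}
prob = {
    "Form A": {"polymorph conversion": 0.2, "poor powder flow": 0.5},
    "Form B": {"polymorph conversion": 0.6, "poor powder flow": 0.1},
}
ppa_totals = {c: sum(seriousness[k] * p[k] for k in seriousness)
              for c, p in prob.items()}
ppa_best = min(ppa_totals, key=ppa_totals.get)  # lowest risk total wins
```

    With these invented numbers both schemes rank Form A first, mirroring the paper's finding that DA, PPA, and logical elimination produced similar rankings.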

  1. Communicating weather forecast uncertainty: Do individual differences matter?

    PubMed

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Probability density function evolution of power systems subject to stochastic variation of renewable energy

    NASA Astrophysics Data System (ADS)

    Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.

    2018-05-01

    As renewable energies are increasingly integrated into power systems, there is increasing interest in stochastic analysis of power systems. Better techniques should be developed to account for the uncertainty caused by the penetration of renewables and to analyse its impact on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Under such random excitation, the Joint Probability Density Function (JPDF) of the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. This equation is solved numerically, with a special measure taken so that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered for a single machine infinite bus power system. The numerical analysis gives the same result as Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
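    The Monte Carlo side of such a comparison can be sketched with an Euler-Maruyama integration of a noise-driven single-machine infinite-bus swing equation; all parameter values below are illustrative assumptions, not the paper's system data.

```python
import numpy as np

# Euler-Maruyama sketch of a single-machine infinite-bus swing equation
# driven by Gaussian white noise:
#   M*dw = (Pm - Pmax*sin(d) - D*w)*dt + sigma*dW,   dd = w*dt
# All parameter values here are illustrative.
rng = np.random.default_rng(1)
M, D, Pm, Pmax, sigma = 1.0, 0.5, 0.4, 1.0, 0.1
dt, nsteps = 1e-3, 200_000
d, w = float(np.arcsin(Pm / Pmax)), 0.0   # start at the stable equilibrium
samples = []
for i in range(nsteps):
    dw = (Pm - Pmax * np.sin(d) - D * w) / M * dt \
         + (sigma / M) * np.sqrt(dt) * rng.standard_normal()
    d, w = d + w * dt, w + dw
    if i % 100 == 0:
        samples.append(d)

# a histogram of `samples` approximates the stationary marginal PDF of d;
# its mean stays near the deterministic equilibrium arcsin(Pm/Pmax) ~ 0.41
mean_d = float(np.mean(samples[len(samples) // 4 :]))
```

    A histogram of the sampled angles can then be compared against the stationary PDF obtained from the generalized FPK equation.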

  3. Enforcing positivity in intrusive PC-UQ methods for reactive ODE systems

    DOE PAGES

    Najm, Habib N.; Valorani, Mauro

    2014-04-12

    We explore the relation between the development of a non-negligible probability of negative states and the instability of numerical integration of the intrusive Galerkin ordinary differential equation system describing uncertain chemical ignition. To prevent this instability without resorting to either multi-element local polynomial chaos (PC) methods or increasing the order of the PC representation in time, we propose a procedure aimed at modifying the amplitude of the PC modes to bring the probability of negative state values below a user-defined threshold. This modification can be effectively described as a filtering procedure of the spectral PC coefficients, which is applied on the fly during the numerical integration when the current value of the probability of negative states exceeds the prescribed threshold. We demonstrate the filtering procedure using a simple model of an ignition process in a batch reactor. This is carried out by comparing different observables and error measures as obtained by non-intrusive Monte Carlo and Gauss-quadrature integration and the filtered intrusive procedure. Lastly, the filtering procedure has been shown to effectively stabilize divergent intrusive solutions, and also to improve the accuracy of stable intrusive solutions which are close to the stability limits.

  4. On the probability of exceeding allowable leak rates through degraded steam generator tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cizelj, L.; Sorsek, I.; Riesch-Oppermann, H.

    1997-02-01

    This paper discusses possible ways of predicting the behavior of the total leak rate through damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at the tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: estimate the probability that the sum of all individual leak rates through degraded tubes exceeds the predefined acceptable value. The probabilistic approach aims at a reliable and computationally tractable estimate of the failure probability. A closed form solution is given for the special case of exponentially distributed individual leak rates. Some possibilities for the use of computationally efficient First and Second Order Reliability Methods (FORM and SORM) are also discussed. The first numerical example compares the results of the approximate methods with the closed form results; SORM in particular shows acceptable agreement. The second numerical example considers a realistic case of the NPP in Krsko, Slovenia.
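    For the special case of exponentially distributed individual leak rates, the closed form is elementary: the sum of n i.i.d. Exp(lam) leak rates is Erlang(n, lam), so the exceedance probability has an explicit series. The sketch below checks this against Monte Carlo; the numbers are illustrative, not plant data.

```python
import math
import random

# With n tubes whose individual leak rates are i.i.d. exponential with
# rate lam, the total leak is Erlang(n, lam), so
#   P(total > q) = exp(-lam*q) * sum_{k=0}^{n-1} (lam*q)^k / k!
def erlang_sf(q, n, lam):
    return math.exp(-lam * q) * sum((lam * q) ** k / math.factorial(k)
                                    for k in range(n))

n, lam, q = 10, 2.0, 8.0   # illustrative values
p_exact = erlang_sf(q, n, lam)

# Monte Carlo cross-check
random.seed(0)
trials = 200_000
hits = sum(sum(random.expovariate(lam) for _ in range(n)) > q
           for _ in range(trials))
p_mc = hits / trials
```

    The same exceedance probability is what FORM/SORM approximate in the non-exponential cases, which is why the closed form serves as the benchmark in the first numerical example.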

  5. Global stability of a multiple infected compartments model for waterborne diseases

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Cao, Jinde

    2014-10-01

    In this paper, mathematical analysis is carried out for a multiple infected compartments model for waterborne diseases, such as cholera, giardia, and rotavirus. The model accounts for both person-to-person and water-to-person transmission routes. Global stability of the equilibria is studied. In terms of the basic reproduction number R0, we prove that, if R0⩽1, then the disease-free equilibrium is globally asymptotically stable and the infection always disappears; whereas if R0>1, there exists a unique endemic equilibrium which is globally asymptotically stable for the corresponding fast-slow system. Numerical simulations verify our theoretical results and show that the decay rate of waterborne pathogens has a significant impact on the epidemic growth rate. We also observe numerically that the unique endemic equilibrium is globally asymptotically stable for the whole system; this observation indicates that the present method needs to be complemented by other techniques.

  6. The Mpemba effect: When can hot water freeze faster than cold?

    NASA Astrophysics Data System (ADS)

    Jeng, Monwhea

    2006-06-01

    We review the Mpemba effect, where initially hot water freezes faster than initially cold water. Although the effect might appear impossible, it has been observed in numerous experiments and was discussed by Aristotle, Francis Bacon, Roger Bacon, and Descartes. It has a rich and fascinating history, including the story of the secondary school student, Erasto Mpemba, who reintroduced the effect to the twentieth century scientific community. The phenomenon is simple to describe and illustrates numerous important issues about the scientific method: the role of skepticism in scientific inquiry, the influence of theory on experiment and observation, the need for precision in the statement of a scientific hypothesis, and the nature of falsifiability. Proposed theoretical mechanisms for the Mpemba effect and the results of contemporary experiments on the phenomenon are surveyed. The observation that hot water pipes are more likely to burst than cold water pipes is also discussed.

  7. Transmission of electrons inside the cryogenic pumps of ITER injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veltri, P., E-mail: pierluigi.veltri@igi.cnr.it; Sartori, E.

    2016-02-15

    Large cryogenic pumps are installed in the vessel of large neutral beam injectors (NBIs) used to heat the plasma in nuclear fusion experiments. The operation of such pumps can be compromised by the presence of stray secondary electrons that are generated along the beam path. In this paper, we present a numerical model to analyze the propagation of the electrons inside the pump. The aim of the study is to quantify the power load on the active pump elements, via evaluation of the transmission probabilities across the domain of the pump. These are obtained from large datasets of particle trajectories computed numerically. The transmission probability of the electrons across the domain is calculated for the ITER NBI and for its prototype, the Megavolt ITer Injector and Concept Advancement (MITICA), and the results are discussed.

  8. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    NASA Astrophysics Data System (ADS)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^{-ψ(k)}, where k ≡ S/ln[L/L0], the large deviation function ψ(k) is found explicitly, and L0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  9. A Stochastic Tick-Borne Disease Model: Exploring the Probability of Pathogen Persistence.

    PubMed

    Maliyoni, Milliward; Chirove, Faraimunashe; Gaff, Holly D; Govinder, Keshlan S

    2017-09-01

    We formulate and analyse a stochastic epidemic model for the transmission dynamics of a tick-borne disease in a single population using a continuous-time Markov chain approach. The stochastic model is based on an existing deterministic metapopulation tick-borne disease model. We compare the disease dynamics of the deterministic and stochastic models in order to determine the effect of randomness in tick-borne disease dynamics. The probability of disease extinction and that of a major outbreak are computed and approximated using the multitype Galton-Watson branching process and numerical simulations, respectively. Analytical and numerical results show some significant differences in model predictions between the stochastic and deterministic models. In particular, we find that a disease outbreak is more likely if the disease is introduced by infected deer as opposed to infected ticks. These insights demonstrate the importance of host movement in the expansion of tick-borne diseases into new geographic areas.
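    The branching-process calculation of the extinction probability can be sketched for the single-type case: it is the smallest root q in [0, 1] of q = G(q), where G is the offspring probability generating function. Poisson offspring is used below purely for illustration; it is not the paper's multitype tick-borne model.

```python
import math

# Extinction probability of a single-type Galton-Watson process with
# Poisson(m) offspring: iterate q <- G(q) = exp(m*(q - 1)) from q = 0,
# which converges to the smallest root of q = G(q) in [0, 1].
def extinction_prob(m, iters=500):
    q = 0.0
    for _ in range(iters):
        q = math.exp(m * (q - 1.0))
    return q

q_super = extinction_prob(1.8)   # supercritical: q < 1, outbreak possible
q_sub = extinction_prob(0.8)     # subcritical: q = 1, extinction certain
```

    In the multitype setting of the paper, the same fixed-point computation runs over a vector of pgfs (one per host/vector type), which is how the initial condition, infected deer versus infected ticks, changes the outbreak probability 1 - q.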

  10. Peelle's pertinent puzzle using the Monte Carlo technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawano, Toshihiko; Talou, Patrick; Burr, Thomas

    2009-01-01

    We try to understand the long-standing problem of Peelle's Pertinent Puzzle (PPP) using the Monte Carlo technique. We allow the probability density functions to take any form in order to assess the impact of the distribution, and obtain the least-squares solution directly from numerical simulations. We found that the standard least squares method gives the correct answer if a weighting function is properly provided. Results from numerical simulations show that the correct answer of PPP is 1.1 ± 0.25 if the common error is multiplicative. The thought-provoking answer of 0.88 is also correct if the common error is additive and proportional to the measured values. The least squares method correctly gives us the most probable case, where the additive component has a negative value. Finally, the standard method fails for PPP due to a distorted (non-Gaussian) joint distribution.
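    The 0.88 answer comes from generalized least squares with the full covariance of the classic PPP setup: two measurements of one quantity, 1.5 and 1.0, each with a 10% independent error plus a fully correlated 20% normalization error taken proportional to the measured values. A minimal sketch, using these standard illustrative numbers:

```python
import numpy as np

# Classic PPP covariance: independent 10% errors on the diagonal plus a
# fully correlated 20% relative error proportional to the measured values.
m = np.array([1.5, 1.0])
stat = 0.10 * m
V = np.diag(stat**2) + 0.20**2 * np.outer(m, m)

# generalized least squares estimate of the common mean
ones = np.ones(2)
Vinv = np.linalg.inv(V)
mu = float(ones @ Vinv @ m) / float(ones @ Vinv @ ones)  # ~0.88, below both
```

    The estimate falls below both measurements, which is the "puzzle"; the paper's point is that this is the correct answer only under the additive-common-error assumption built into that covariance.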

  11. Efficient Simulation Budget Allocation for Selecting an Optimal Subset

    NASA Technical Reports Server (NTRS)

    Chen, Chun-Hung; He, Donghai; Fu, Michael; Lee, Loo Hay

    2008-01-01

    We consider a class of the subset selection problem in ranking and selection. The objective is to identify the top m out of k designs based on simulated output. Traditional procedures are conservative and inefficient. Using the optimal computing budget allocation framework, we formulate the problem as that of maximizing the probability of correctly selecting all of the top-m designs subject to a constraint on the total number of samples available. For an approximation of this correct selection probability, we derive an asymptotically optimal allocation and propose an easy-to-implement heuristic sequential allocation procedure. Numerical experiments indicate that the resulting allocations are superior to other methods in the literature that we tested, and the relative efficiency increases for larger problems. In addition, preliminary numerical results indicate that the proposed new procedure has the potential to enhance computational efficiency for simulation optimization.

  12. Analytical study of nano-scale logical operations

    NASA Astrophysics Data System (ADS)

    Patra, Moumita; Maiti, Santanu K.

    2018-07-01

    A complete analytical prescription is given for realizing three basic (OR, AND, NOT) and two universal (NAND, NOR) logic gates at the nano-scale using simple tailor-made geometries. Two different geometries, ring-like and chain-like, are considered; in each case the bridging conductor is coupled to a local atomic site through a dangling bond whose site energy can be controlled by means of an external gate electrode. The main idea is that when the energy of the injected electron matches the site energy of the local atomic site, the transmission probability drops exactly to zero, whereas the junction exhibits finite transmission at other energies. Utilizing this prescription we perform the logical operations, and we strongly believe that the proposed results can be verified in the laboratory. Finally, we numerically compute the two-terminal transmission probability for general models, and the numerical results match our analytical findings exactly.

  13. Delay-induced stochastic bifurcations in a bistable system under white noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Zhongkui, E-mail: sunzk@nwpu.edu.cn; Fu, Jin; Xu, Wei

    2015-08-15

    In this paper, the effects of noise and time delay on stochastic bifurcations are investigated theoretically and numerically in a time-delayed Duffing-Van der Pol oscillator subjected to white noise. Due to the time delay, the random response is not Markovian. Thereby, approximate methods have been adopted to obtain the Fokker-Planck-Kolmogorov equation and the stationary probability density function for the amplitude of the response. Based on the knowledge that stochastic bifurcation is characterized by the qualitative properties of the steady-state probability distribution, it is found that time delay and feedback intensity, as well as noise intensity, will induce the appearance of stochastic P-bifurcation. Besides, results demonstrated that the effects of the strength of the delayed displacement feedback on stochastic bifurcation are accompanied by a sensitive dependence on time delay. Furthermore, the results from numerical simulations confirm the effectiveness of the theoretical analyses.

  14. On the delay analysis of a TDMA channel with finite buffer capacity

    NASA Technical Reports Server (NTRS)

    Yan, T.-Y.

    1982-01-01

    The throughput performance of a TDMA channel with finite buffer capacity for transmitting data messages is considered. Each station has limited message buffer capacity and has Poisson message arrivals. Message arrivals will be blocked if the buffers are congested. Using the embedded Markov chain model, the solution procedure for the limiting system-size probabilities is presented in a recursive fashion. Numerical examples are given to demonstrate the tradeoffs between the blocking probabilities and the buffer sizing strategy.

  15. Nonidentifiability of population size from capture-recapture data with heterogeneous detection probabilities

    USGS Publications Warehouse

    Link, W.A.

    2003-01-01

    Heterogeneity in detection probabilities has long been recognized as problematic in mark-recapture studies, and numerous models developed to accommodate its effects. Individual heterogeneity is especially problematic, in that reasonable alternative models may predict essentially identical observations from populations of substantially different sizes. Thus even with very large samples, the analyst will not be able to distinguish among reasonable models of heterogeneity, even though these yield quite distinct inferences about population size. The problem is illustrated with models for closed and open populations.

  16. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive and therefore lack versatility, since they require multiple tsunami numerical simulations. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response surface; we expanded their study to the probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Tsunami risk can then be evaluated with a Monte Carlo simulation, assuming that earthquake occurrence follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. A regression analysis based on the results of 25 tsunami numerical calculations yielded the response surface y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake occurrence, inundation depth, and vulnerability, and on that basis conducted Monte Carlo simulations spanning 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation without conducting multiple tsunami numerical simulations.
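As a rough illustration of this workflow, the sketch below samples fault parameters, maps them through the reported response surface, and averages a lognormal fragility curve. The response-surface coefficients come from the abstract; the fault-parameter ranges and the fragility parameters (median, beta) are assumptions made up for illustration, not values from the study.

```python
import math
import random

A, B, C = 0.2615, 3.1763, -1.1802      # response surface y = A*x1 + B*x2 + C

def inundation_depth(rng):
    x1 = rng.uniform(5.0, 25.0)         # fault depth x1 (assumed range)
    x2 = rng.uniform(0.5, 4.0)          # fault slip x2 (assumed range)
    return max(A * x1 + B * x2 + C, 0.0)

def damage_probability(depth, median=5.0, beta=0.6):
    """Lognormal fragility curve for the wood building (assumed parameters)."""
    if depth <= 0.0:
        return 0.0
    z = (math.log(depth) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_damage(n=100_000, seed=0):
    """Monte Carlo average of the fragility curve over sampled events."""
    rng = random.Random(seed)
    return sum(damage_probability(inundation_depth(rng)) for _ in range(n)) / n
```

Conditioning on event occurrence this way matches the abstract's "given that an earthquake occurs"; an unconditional annual risk would additionally thin the events with the assumed Poisson occurrence rate.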

  17. A Theory of Relational Ageism: A Discourse Analysis of the 2015 White House Conference on Aging.

    PubMed

    Gendron, Tracey L; Inker, Jennifer; Welleford, Elizabeth Ayn

    2018-03-19

The widespread use of ageist language is generally accepted as commonplace and routine in most cultures and settings. In order to disrupt ageism, we must examine the use of ageist language and sentiments among those on the front line of providing advocacy, services, and policy for older adults: the professional culture of the aging services network. The recorded video segments from the sixth White House Conference on Aging (WHCOA) provided a unique opportunity to examine discourse used by professionals and appointed representatives in the field of aging within a professional sociocultural context. A qualitative discourse analysis of video recordings was used to analyze the 15 video fragments that comprised the recorded sessions of the 2015 WHCOA. Twenty-six instances were identified that captured statements expressing personal age, aging, or an age-related characteristic negatively in regard to self or other (microageism), and/or statements expressing global negative opinions or beliefs about aging and older adults based on group membership (macroageism). A theoretical pathway was established that represents the dynamic process by which ageist statements were expressed and reinforced (relational ageism). Numerous instances of ageism were readily identified as part of a live-streamed and publicly accessible professional conference attended and presented by representatives of the aging services network. To make meaningful gains in the movement to disrupt ageism and promote optimal aging for all individuals, we must raise awareness of the relational nature, expression, and perpetuation of ageism.

  18. Technical notes and correspondence: Stochastic robustness of linear time-invariant control systems

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Ray, Laura R.

    1991-01-01

A simple numerical procedure for estimating the stochastic robustness of a linear time-invariant system is described. Monte Carlo evaluation of the system's eigenvalues allows the probability of instability and the related stochastic root locus to be estimated. This analysis approach treats not only Gaussian parameter uncertainties but also non-Gaussian cases, including uncertain-but-bounded variation. Confidence intervals for the scalar probability of instability address computational issues inherent in Monte Carlo simulation. Trivial extensions of the procedure admit consideration of alternate discriminants; thus, the probabilities that stipulated degrees of instability will be exceeded or that closed-loop roots will leave desirable regions can also be estimated. Results are particularly amenable to graphical presentation.
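A minimal version of this Monte Carlo procedure, for a hypothetical second-order system with one uncertain-but-bounded gain, might look as follows; the system matrix and the parameter bounds are invented for illustration, and the confidence interval uses the standard normal approximation for a binomial proportion.

```python
import math
import random

def probability_of_instability(n=20_000, seed=1):
    """Monte Carlo sketch: draw an uncertain-but-bounded gain k, check the
    closed-loop eigenvalue real parts, and report the fraction of unstable
    draws together with a 95% confidence half-width."""
    rng = random.Random(seed)
    unstable = 0
    for _ in range(n):
        k = rng.uniform(-0.5, 2.5)         # uncertain-but-bounded parameter
        # Closed loop A = [[0, 1], [-k, -0.4]]: characteristic polynomial
        # s^2 + 0.4 s + k = 0, so stability follows from the discriminant.
        disc = 0.4**2 - 4.0 * k
        if disc >= 0.0:
            max_re = (-0.4 + math.sqrt(disc)) / 2.0   # largest real root
        else:
            max_re = -0.2                  # complex pair with Re(s) = -0.2
        if max_re > 0.0:
            unstable += 1
    p = unstable / n
    half = 1.96 * math.sqrt(p * (1.0 - p) / n)        # 95% CI half-width
    return p, half
```

With these bounds the system is unstable exactly when k < 0, so the true probability is 1/6; the estimate converges to it at the usual 1/sqrt(n) Monte Carlo rate.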

  19. Nonparametric probability density estimation by optimization theoretic techniques

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1976-01-01

    Two nonparametric probability density estimators are considered. The first is the kernel estimator. The problem of choosing the kernel scaling factor based solely on a random sample is addressed. An interactive mode is discussed and an algorithm proposed to choose the scaling factor automatically. The second nonparametric probability estimate uses penalty function techniques with the maximum likelihood criterion. A discrete maximum penalized likelihood estimator is proposed and is shown to be consistent in the mean square error. A numerical implementation technique for the discrete solution is discussed and examples displayed. An extensive simulation study compares the integrated mean square error of the discrete and kernel estimators. The robustness of the discrete estimator is demonstrated graphically.
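The kernel estimator itself is compact. The sketch below uses a Gaussian kernel and, in place of the paper's interactive and automatic scaling-factor selection, the common Silverman-style rule of thumb as a stand-in.

```python
import math

def gaussian_kde(sample, h):
    """Kernel density estimate with a Gaussian kernel and bandwidth h."""
    n = len(sample)
    norm = n * h * math.sqrt(2.0 * math.pi)
    def f_hat(x):
        return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample) / norm
    return f_hat

def rule_of_thumb_bandwidth(sample):
    """Silverman-style scaling factor 1.06 * sd * n^(-1/5) (a simple stand-in
    for the paper's automatic choice of the scaling factor)."""
    n = len(sample)
    mean = sum(sample) / n
    sd = (sum((x - mean) ** 2 for x in sample) / (n - 1)) ** 0.5
    return 1.06 * sd * n ** (-1 / 5)
```

Because each kernel integrates to one, the estimate integrates to one as well, which a quick numerical quadrature can confirm.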

  20. Frequency analysis of uncertain structures using imprecise probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modares, Mehdi; Bergerson, Joshua

    2015-01-01

Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure. For each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods along with discussions on their computational efficiency.

  1. Probability of stress-corrosion fracture under random loading.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

A method is developed for predicting the probability of stress-corrosion fracture of structures under random loadings. The formulation is based on the cumulative damage hypothesis and the experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and the variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy. It is shown that, under stationary random loadings, the standard deviation of the cumulative damage increases in proportion to the square root of time, while the coefficient of variation (dispersion) decreases in inverse proportion to the square root of time. Numerical examples are worked out to illustrate the general results.
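The square-root-of-time scaling can be checked with a toy simulation: accumulate i.i.d. nonnegative damage increments (exponential here, purely an assumed stand-in for the stationary-loading case) and compare the statistics at two horizons.

```python
import math
import random

def damage_stats(n_steps, n_paths=4000, seed=2):
    """Simulate total damage as a sum of n_steps i.i.d. exponential increments
    over n_paths sample paths; return mean, standard deviation, and
    coefficient of variation of the total."""
    rng = random.Random(seed)
    totals = [sum(rng.expovariate(1.0) for _ in range(n_steps))
              for _ in range(n_paths)]
    mean = sum(totals) / n_paths
    sd = math.sqrt(sum((t - mean) ** 2 for t in totals) / (n_paths - 1))
    return mean, sd, sd / mean
```

Quadrupling the number of steps should roughly double the standard deviation and halve the coefficient of variation, mirroring the sqrt(t) and 1/sqrt(t) laws stated in the abstract.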

  2. Nonlinear resonance scattering of femtosecond X-ray pulses on atoms in plasmas

    NASA Astrophysics Data System (ADS)

    Rosmej, F. B.; Astapenko, V. A.; Lisitsa, V. S.; Moroz, N. N.

    2017-11-01

    It is shown that for sufficiently short pulses the resonance scattering probability becomes a nonlinear function of the pulse duration. For fs X-ray pulses scattered on atoms in plasmas maxima and minima develop in the nonlinear regime whereas in the limit of long pulses the probability becomes linear and turns over into the standard description of the electromagnetic pulse scattering. Numerical calculations are carried out in terms of a generalized scattering probability for the total time of pulse duration including fine structure splitting and ion Doppler broadening in hot plasmas. For projected X-ray monocycles, the generalized nonlinear approach differs by 1-2 orders of magnitude from the standard theory.

  3. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
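The covariance-combination step can be sketched as follows, with a Monte Carlo estimate standing in for the paper's analytical solution; the relative-position mean, the per-aircraft variances, and the separation threshold are all made-up numbers for illustration.

```python
import math
import random

def conflict_probability(sep=5.0, n=50_000, seed=3):
    """Add the two aircraft's per-axis prediction-error variances to get the
    covariance of the relative position, then estimate
    P(miss distance < sep) by sampling the relative position."""
    rng = random.Random(seed)
    mean_rel = (8.0, 2.0)                           # predicted relative position
    var_rel = (3.0**2 + 2.0**2, 1.5**2 + 1.0**2)    # per-axis variance sums
    hits = 0
    for _ in range(n):
        dx = rng.gauss(mean_rel[0], math.sqrt(var_rel[0]))
        dy = rng.gauss(mean_rel[1], math.sqrt(var_rel[1]))
        if math.hypot(dx, dy) < sep:
            hits += 1
    return hits / n
```

Adding the two covariances is valid when the two aircraft's prediction errors are independent, which is the implicit assumption here.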

  4. Statistical mechanics of Fermi-Pasta-Ulam chains with the canonical ensemble

    NASA Astrophysics Data System (ADS)

    Demirel, Melik C.; Sayar, Mehmet; Atılgan, Ali R.

    1997-03-01

Low-energy vibrations of a Fermi-Pasta-Ulam-β (FPU-β) chain with 16 repeat units are analyzed with the aid of numerical experiments and the statistical mechanics equations of the canonical ensemble. Constant temperature numerical integrations are performed by employing the cubic coupling scheme of Kusnezov et al. [Ann. Phys. 204, 155 (1990)]. Very good agreement is obtained between numerical results and theoretical predictions for the probability distributions of the generalized coordinates and momenta, both of the chain and of the thermal bath. It is also shown that the average energy of the chain scales linearly with the bath temperature.

  5. Study on the tumor-induced angiogenesis using mathematical models.

    PubMed

    Suzuki, Takashi; Minerva, Dhisa; Nishiyama, Koichi; Koshikawa, Naohiko; Chaplain, Mark Andrew Joseph

    2018-01-01

We studied angiogenesis using mathematical models describing the dynamics of tip cells. We reviewed the basic ideas of angiogenesis models and the numerical simulation techniques used to produce realistic computer graphics images of sprouting angiogenesis. We examined the classical model of Anderson-Chaplain using fundamental concepts of mass transport and chemical reaction, with ECM degradation included. We then constructed two types of numerical schemes, model-faithful and model-driven ones, in which new techniques of numerical simulation are introduced, such as transient probability, particle velocity, and Boolean variables. © 2017 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.

  6. The Case of James Leininger: An American Case of the Reincarnation Type.

    PubMed

    Tucker, Jim B

    2016-01-01

Numerous cases of young children who report memories of previous lives have been studied over the last 50 years. Though such cases are more easily found in cultures that have a general belief in reincarnation, they occur in the West as well. This article describes the case of James Leininger, an American child who at age two began having intense nightmares of a plane crash. He then described being an American pilot who was killed when his plane was shot down by the Japanese. He gave details that included the name of an American aircraft carrier, the first and last name of a friend who was on the ship with him, and a location and other specifics about the fatal crash. His parents eventually discovered a close correspondence between James's statements and the death of a World War II pilot named James Huston. Documentation of James's statements that was made before Huston was identified includes a television interview with his parents that never aired but which the author has been able to review. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Computing exact bundle compliance control charts via probability generating functions.

    PubMed

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
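The generating-function idea reduces the convolution to repeated polynomial multiplication. The sketch below computes the exact Poisson-binomial distribution of the number of compliant bundle elements; it shows the generic PGF mechanics only, not the authors' specific implementation.

```python
def bundle_compliance_pmf(ps):
    """Exact distribution of the number of compliant elements when element i
    is compliant with probability ps[i], independently: multiply the
    per-element probability generating functions (1 - p) + p*z."""
    pmf = [1.0]                        # coefficients of the running PGF
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, c in enumerate(pmf):    # multiply by (1 - p) + p*z
            new[k] += c * (1 - p)
            new[k + 1] += c * p
        pmf = new
    return pmf                         # pmf[k] = P(exactly k compliant)
```

Because `pmf[k]` is exact, with no series truncation, the tail probabilities used for control limits carry no approximation error.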

  8. Emergence of heat extremes attributable to anthropogenic influences

    NASA Astrophysics Data System (ADS)

    King, Andrew D.; Black, Mitchell T.; Min, Seung-Ki; Fischer, Erich M.; Mitchell, Daniel M.; Harrington, Luke J.; Perkins-Kirkpatrick, Sarah E.

    2016-04-01

Climate scientists have demonstrated that a substantial fraction of the probability of numerous recent extreme events may be attributed to human-induced climate change. It is likely that, even for temperature extremes occurring in previous decades, a fraction of their probability was attributable to anthropogenic influences. We identify the first record-breaking warm summers and years for which a discernible contribution can be attributed to human influence. We find a significant human contribution to the probability of record-breaking global temperature events as early as the 1930s. Since then, all of the last 16 record-breaking hot years globally had an anthropogenic contribution to their probability of occurrence. Aerosol-induced cooling delays the timing of a significant human contribution to record-breaking events in some regions. Without human-induced climate change, recent hot summers and years would have been very unlikely to occur.

  9. Gravity and count probabilities in an expanding universe

    NASA Technical Reports Server (NTRS)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.

  10. Diagnostic causal reasoning with verbal information.

    PubMed

    Meder, Björn; Mayrhofer, Ralf

    2017-08-01

    In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework. Copyright © 2017 Elsevier Inc. All rights reserved.
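A minimal model of such sequential diagnostic updating, with an optional exponential decay standing in for the temporal weighting of evidence (the authors' Bayesian model is more elaborate), could look like:

```python
import math

def sequential_diagnosis(prior, likelihoods, evidence, decay=1.0):
    """Sequential Bayesian updating over candidate causes. likelihoods[c][e]
    is P(effect e | cause c); decay < 1 down-weights older evidence, giving a
    simple recency effect. Returns the normalized posterior over causes."""
    log_post = [math.log(p) for p in prior]
    T = len(evidence)
    for t, e in enumerate(evidence):
        weight = decay ** (T - 1 - t)          # 1 for the newest observation
        for c in range(len(prior)):
            log_post[c] += weight * math.log(likelihoods[c][e])
    z = max(log_post)                          # stabilize the exponentiation
    unnorm = [math.exp(l - z) for l in log_post]
    total = sum(unnorm)
    return [u / total for u in unnorm]
```

With decay = 1 this is plain Bayes' rule; with decay < 1 the most recent effect dominates, so the same evidence in a different order can flip the diagnosis.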

  11. Visual representation of statistical information improves diagnostic inferences in doctors and their patients.

    PubMed

    Garcia-Retamero, Rocio; Hoffrage, Ulrich

    2013-04-01

Doctors and patients have difficulty inferring the predictive value of a medical test from information about the prevalence of a disease and the sensitivity and false-positive rate of the test. Previous research has established that communicating such information in a format the human mind is adapted to (namely, natural frequencies), as compared to probabilities, boosts the accuracy of diagnostic inferences. In a study, we investigated to what extent these inferences can be improved, beyond the effect of natural frequencies, by providing visual aids. Participants were 81 doctors and 81 patients who made diagnostic inferences about three medical tests on the basis of information about prevalence of a disease, and the sensitivity and false-positive rate of the tests. Half of the participants received the information in natural frequencies, while the other half received the information in probabilities. Half of the participants only received numerical information, while the other half additionally received a visual aid representing the numerical information. In addition, participants completed a numeracy scale. Our study showed three important findings: (1) doctors and patients made more accurate inferences when information was communicated in natural frequencies as compared to probabilities; (2) visual aids boosted accuracy even when the information was provided in natural frequencies; and (3) doctors were more accurate in their diagnostic inferences than patients, though differences in accuracy disappeared when differences in numerical skills were controlled for. Our findings have important implications for medical practice as they suggest suitable ways to communicate quantitative medical data. Copyright © 2013 Elsevier Ltd. All rights reserved.
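The two presentation formats compute the same quantity, which the sketch below makes explicit for a hypothetical test: Bayes' rule on probabilities versus counting through a natural-frequency breakdown of an imaginary population.

```python
def ppv(prevalence, sensitivity, false_positive_rate, population=1000):
    """Positive predictive value computed two equivalent ways: (1) Bayes'
    rule on probabilities; (2) the natural-frequency breakdown, counting
    people in a hypothetical population."""
    # Probability format: P(disease | positive test)
    p_pos = (prevalence * sensitivity
             + (1 - prevalence) * false_positive_rate)
    bayes = prevalence * sensitivity / p_pos
    # Natural-frequency format: counts out of `population` people
    sick = prevalence * population
    true_pos = sick * sensitivity
    false_pos = (population - sick) * false_positive_rate
    freq = true_pos / (true_pos + false_pos)
    return bayes, freq
```

For a 1% prevalence, 90% sensitivity, and 9% false-positive rate, both routes give a PPV of about 9%: of 1000 people, 9 of the 10 sick test positive alongside about 89 healthy false positives, and 9/(9+89) is far easier to see than the probability algebra.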

  12. A numerical study of multiple adiabatic shear bands evolution in a 304LSS thick-walled cylinder

    NASA Astrophysics Data System (ADS)

    Liu, Mingtao; Hu, Haibo; Fan, Cheng; Tang, Tiegang

    2017-01-01

The self-organization of multiple shear bands in a 304L stainless steel (304LSS) thick-walled cylinder (TWC) was numerically studied. The microstructure of the material leads to a non-uniform distribution of the local yield stress, which plays a key role in the formation of spontaneous shear localization. We introduced a probability factor satisfying a Gaussian distribution into the macroscopic constitutive relationship to describe the non-uniformity of the local yield stress. Using this probability factor, the initiation and propagation of multiple shear bands in the TWC were numerically replicated in our 2D FEM simulation. Experimental results in the literature indicate that the machined surface at the internal boundary of a 304L stainless steel cylinder carries a work-hardened layer (about 20-30 μm thick) whose microstructure differs significantly from that of the base material. The work-hardened layer leads to the phenomenon that most shear bands propagate along a given direction, clockwise or counterclockwise. In our simulation, periodic single-direction spiral perturbations were applied to describe the grain orientation in the work-hardened layer, and the single-direction spiral pattern of shear bands was successfully replicated.

  13. An attentional theory of emotional influences on risky decisions.

    PubMed

    Levine, Daniel S; Ramirez, Patrick A

    2013-01-01

It is well known that choices between gambles can depend not only on the probabilities of gains or losses but also on the emotional richness of the items to be gained or lost. Rottenstreich and Hsee (2001) demonstrated that overweighting of low probabilities is magnified if the possible events are emotionally rich, such as a kiss versus an amount of money. Ramirez (2010) showed that persistence in the face of comparable numerically presented losses is greater when the scenario involves taking care of a pet (emotionally richer) versus a business (emotionally poorer). Much of this phenomenon is captured in a neural network model of the Rottenstreich-Hsee data (Levine, 2012). The model is based on interactions among the orbitofrontal cortex, amygdala, cingulate, striatum, thalamus, and premotor cortex that implement categorization of multiattribute vectors representing choice options, in a manner consistent with the gists of fuzzy trace theory. Before categorization, the vectors are weighted by selective attention to attributes that are either emotionally salient or task relevant, with increasing emotional arousal shifting the attentional weights away from numerical attributes such as precise probabilities. This interpretation is supported by the data of Hsee and Rottenstreich (2004) showing that how much participants would pay to save endangered animals is not influenced by the number to be saved if they see pictures but is influenced by the number if they are given verbal descriptions. The theory suggests a few open questions. How are the selective attentional signals represented in the interactions between prefrontal cortex and subcortical areas? Would the salience of numerical attributes still be reduced with high arousal in highly numerate participants? Would the differences between the pet and business scenarios be altered if the positive or negative feedback participants received were shown via pictures rather than numbers? Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Link importance incorporated failure probability measuring solution for multicast light-trees in elastic optical networks

    NASA Astrophysics Data System (ADS)

    Li, Xin; Zhang, Lu; Tang, Ying; Huang, Shanguo

    2018-03-01

The light-tree-based optical multicasting (LT-OM) scheme provides a spectrum- and energy-efficient method to accommodate emerging multicast services. Some studies focus on survivability technologies for LTs against a fixed number of link failures, such as single-link failure. However, few studies involve failure-probability constraints when building LTs. It is worth noting that each link of an LT plays a role of different importance under failure scenarios; when calculating the failure probability of an LT, the importance of every link should be considered. We design a link importance incorporated failure probability measuring solution (LIFPMS) for multicast LTs under the independent failure model and the shared risk link group failure model. Based on the LIFPMS, we formulate the minimum failure probability (MFP) problem for the LT-OM scheme. Heuristic approaches are developed to address the MFP problem in elastic optical networks. Numerical results show that the LIFPMS provides an accurate metric for calculating the failure probability of multicast LTs and enhances the reliability of the LT-OM scheme while accommodating multicast services.
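Under the independent failure model, the importance-agnostic baseline is simple: a tree fails if any of its links fails, so P(fail) = 1 - prod(1 - p_i), and with shared risk link groups the correlated links count once. The sketch below shows only this generic calculation; it does not reproduce the paper's importance-weighted LIFPMS.

```python
def tree_failure_probability(link_probs, srlg=None):
    """Failure probability of a light-tree whose links fail independently
    with probabilities link_probs. With shared risk link groups (srlg), the
    links of a group are assumed to fail together, so each group contributes
    one failure event (taken here as the group's largest link probability, a
    modeling simplification)."""
    if srlg is None:
        srlg = [[i] for i in range(len(link_probs))]  # every link its own group
    survive = 1.0
    for group in srlg:
        survive *= 1.0 - max(link_probs[i] for i in group)
    return 1.0 - survive
```

Note how grouping two links into one SRLG lowers the computed failure probability relative to treating them as independent, which is exactly why the failure model matters when ranking candidate trees.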

  15. Selfish routing equilibrium in stochastic traffic network: A probability-dominant description.

    PubMed

    Zhang, Wenyi; He, Zhengbing; Guan, Wei; Ma, Rui

    2017-01-01

This paper suggests a probability-dominant user equilibrium (PdUE) model to describe the selfish routing equilibrium in a stochastic traffic network. At PdUE, travel demands are only assigned to the most dominant routes in the same origin-destination pair. A probability-dominant rerouting dynamic model is proposed to explain the behavioral mechanism of PdUE. To facilitate applications, the logit formula of PdUE is developed, for which a well-designed route set is not indispensable and whose equivalent variational inequality formulation is simple. Two routing strategies, i.e., the probability-dominant strategy (PDS) and the dominant probability strategy (DPS), are discussed through a hypothetical experiment. It is found that, whether for the sake of insurance or in striving for perfection, PDS is a better choice than DPS. For more general cases, the conducted numerical tests lead to the same conclusion. These imply that PdUE (rather than the conventional stochastic user equilibrium) is a desirable selfish routing equilibrium for a stochastic network, given that the probability distributions of travel time are available to travelers.

  16. Selfish routing equilibrium in stochastic traffic network: A probability-dominant description

    PubMed Central

    Zhang, Wenyi; Guan, Wei; Ma, Rui

    2017-01-01

This paper suggests a probability-dominant user equilibrium (PdUE) model to describe the selfish routing equilibrium in a stochastic traffic network. At PdUE, travel demands are only assigned to the most dominant routes in the same origin-destination pair. A probability-dominant rerouting dynamic model is proposed to explain the behavioral mechanism of PdUE. To facilitate applications, the logit formula of PdUE is developed, for which a well-designed route set is not indispensable and whose equivalent variational inequality formulation is simple. Two routing strategies, i.e., the probability-dominant strategy (PDS) and the dominant probability strategy (DPS), are discussed through a hypothetical experiment. It is found that, whether for the sake of insurance or in striving for perfection, PDS is a better choice than DPS. For more general cases, the conducted numerical tests lead to the same conclusion. These imply that PdUE (rather than the conventional stochastic user equilibrium) is a desirable selfish routing equilibrium for a stochastic network, given that the probability distributions of travel time are available to travelers. PMID:28829834

  17. Anticipating abrupt shifts in temporal evolution of probability of eruption

    NASA Astrophysics Data System (ADS)

    Rohmer, J.; Loschetter, A.

    2016-04-01

    Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component for volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability value to a state of high probability value. By using the data of MESIMEX exercise at the Vesuvius volcano, we investigated the potential for time-varying indicators related to the correlation structure or to the variability of the probability time series for detecting in advance this critical transition. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both present an abrupt increase, which marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to > 70% could be identified up to 1-3 h in advance. This additional lead time could be useful to place different key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.
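The variability indicator is straightforward to compute. The sketch below applies a rolling-window standard deviation to a synthetic probability series whose fluctuations grow before a shift; the window length and the series itself are illustrative, not MESIMEX data.

```python
import math

def rolling_std(series, window):
    """Sample standard deviation over a rolling window: the early-warning
    indicator whose abrupt increase flags the approaching transition."""
    out = []
    for i in range(window, len(series) + 1):
        w = series[i - window:i]
        m = sum(w) / window
        out.append(math.sqrt(sum((x - m) ** 2 for x in w) / (window - 1)))
    return out
```

On a series whose fluctuation amplitude grows ahead of the probability jump, the indicator rises well before the jump itself, which is the source of the 1-3 h lead time reported in the abstract.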

  18. An add-in implementation of the RESAMPLING syntax under Microsoft EXCEL.

    PubMed

    Meineke, I

    2000-10-01

    The RESAMPLING syntax defines a set of powerful commands, which allow the programming of probabilistic statistical models with few, easily memorized statements. This paper presents an implementation of the RESAMPLING syntax using Microsoft EXCEL with Microsoft WINDOWS(R) as a platform. Two examples are given to demonstrate typical applications of RESAMPLING in biomedicine. Details of the implementation with special emphasis on the programming environment are discussed at length. The add-in is available electronically to interested readers upon request. The use of the add-in facilitates numerical statistical analyses of data from within EXCEL in a comfortable way.

  19. Consumers' preferences for the communication of risk information in drug advertising.

    PubMed

    Davis, Joel J

    2007-01-01

Research was conducted to identify consumers' preferences regarding the form, content, and placement of drug side-effect information in direct-to-consumer (DTC) advertising. Specific questions explored preferences for the presence or absence of numeric information, the use of placebo and discontinuation groups as a context for understanding drug risk, the sequence in which side effects are presented, and the placement of side-effect statements on DTC Web sites. Consumers prefer detailed, readily accessible risk information; these preferences are a major departure from current advertiser practices and from what current and proposed Food and Drug Administration (FDA) regulations require.

  20. Some new surprises in chaos.

    PubMed

    Bunimovich, Leonid A; Vela-Arevalo, Luz V

    2015-09-01

"Chaos is found in greatest abundance wherever order is being sought. It always defeats order, because it is better organized." (Terry Pratchett) A brief review is presented of some recent findings in the theory of chaotic dynamics. We also prove a statement that could naturally be considered a dual to the Poincaré theorem on recurrences. Numerical results demonstrate that some parts of the phase space of chaotic systems are more likely to be visited earlier than other parts. A new class of chaotic focusing billiards is discussed that clearly violates the main condition considered to be necessary for chaos in focusing billiards.

  1. Extended Importance Sampling for Reliability Analysis under Evidence Theory

    NASA Astrophysics Data System (ADS)

    Yuan, X. K.; Chen, B.; Zhang, B. Q.

    2018-05-01

    In early engineering practice, a lack of data and information makes uncertainty difficult to deal with. Evidence theory has been proposed as an alternative to traditional probability theory for handling uncertainty with limited information. In this contribution, a simulation-based approach called 'extended importance sampling' is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an 'equivalent' reliability problem under probability theory is obtained; samples of these variables are then generated by importance sampling. From these samples, the plausibility and belief (upper and lower bounds of the probability) can be estimated more efficiently than by direct Monte Carlo simulation. Numerical and engineering examples illustrate the efficiency and feasibility of the proposed approach.
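    The core move described in the abstract, introducing a nominal instrumental density and reweighting samples, builds on classical importance sampling for rare-event reliability analysis. As a minimal sketch (not the authors' algorithm; the limit state, densities, and parameters below are invented for illustration), a small failure probability under a standard normal can be estimated by sampling from an instrumental normal centred on the failure region:

```python
import math
import random

def norm_pdf(x, mu, sigma):
    """Density of N(mu, sigma) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_sampling_pof(limit_state, mu, sigma, mu_is, sigma_is, n=100_000, seed=1):
    """Estimate P(limit_state(x) < 0) for x ~ N(mu, sigma) by sampling from an
    instrumental density N(mu_is, sigma_is) and reweighting by the density ratio."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_is, sigma_is)
        if limit_state(x) < 0:
            total += norm_pdf(x, mu, sigma) / norm_pdf(x, mu_is, sigma_is)
    return total / n

# Failure when x > 3, a rare event under N(0, 1): g(x) = 3 - x.
# The instrumental density is centred at the failure boundary.
pf = importance_sampling_pof(lambda x: 3.0 - x, mu=0.0, sigma=1.0, mu_is=3.0, sigma_is=1.0)
```

    The true value here is P(X > 3) for a standard normal, about 1.35e-3; centring the instrumental density on the failure region makes nearly half the samples informative, whereas direct Monte Carlo would waste almost all of them.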

  2. Intrinsic whole number bias in humans.

    PubMed

    Alonso-Díaz, Santiago; Piantadosi, Steven T; Hayden, Benjamin Y; Cantlon, Jessica F

    2018-06-25

    Humans have great difficulty comparing quotients including fractions, proportions, and probabilities and often erroneously isolate the whole numbers of the numerators and denominators to compare them. Some have argued that the whole number bias is a compensatory strategy to deal with difficult comparisons. We examined adult humans' preferences for gambles that differed only in numerosity, and not in factors that influence their expected value (probabilities and stakes). Subjects consistently preferred gambles with more winning balls to ones with fewer, even though the probabilities were mathematically identical, replicating prior results. In a second experiment, we found that subjects accurately represented the relative probabilities of the choice options during rapid nonverbal probability judgments but nonetheless showed biases based on whole numbers. We mathematically formalized and quantitatively evaluated cognitive rules based on existing hypotheses that attempt to explain subjects' whole number biases during quotient comparisons. The results show that the whole number bias is intrinsic to the way humans solve quotient comparisons rather than a compensatory strategy. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. The MAPS Reporting Statement for Studies Mapping onto Generic Preference-Based Outcome Measures: Explanation and Elaboration.

    PubMed

    Petrou, Stavros; Rivero-Arias, Oliver; Dakin, Helen; Longworth, Louise; Oppe, Mark; Froud, Robert; Gray, Alastair

    2015-10-01

    The process of "mapping" is increasingly being used to predict health utilities, for application within health economic evaluations, using data on other indicators or measures of health. Guidance for the reporting of mapping studies is currently lacking. The overall objective of this research was to develop a checklist of essential items, which authors should consider when reporting mapping studies. The MAPS (MApping onto Preference-based measures reporting Standards) statement is a checklist, which aims to promote complete and transparent reporting by researchers. This paper provides a detailed explanation and elaboration of the items contained within the MAPS statement. In the absence of previously published reporting checklists or reporting guidance documents, a de novo list of reporting items and accompanying explanations was created. A two-round, modified Delphi survey, with representatives from academia, consultancy, health technology assessment agencies and the biomedical journal editorial community, was used to identify a list of essential reporting items from this larger list. From the initial de novo list of 29 candidate items, a set of 23 essential reporting items was developed. The items are presented numerically and categorised within six sections, namely, (i) title and abstract, (ii) introduction, (iii) methods, (iv) results, (v) discussion and (vi) other. For each item, we summarise the recommendation, illustrate it using an exemplar of good reporting practice identified from the published literature, and provide a detailed explanation to accompany the recommendation. It is anticipated that the MAPS statement will promote clarity, transparency and completeness of reporting of mapping studies. It is targeted at researchers developing mapping algorithms, peer reviewers and editors involved in the manuscript review process for mapping studies, and the funders of the research. 
The MAPS working group plans to assess the need for an update of the reporting checklist in 5 years' time.

  4. Probability and Statistics in Sensor Performance Modeling

    DTIC Science & Technology

    2010-12-01

    language software program is called Environmental Awareness for Sensor and Emitter Employment (EASEE). Some important numerical issues in the implementation... Statistical analysis for measuring sensor performance... Abbreviations (fragment): complementary cumulative distribution function; cdf, cumulative distribution function; DST, decision-support tool; EASEE, Environmental Awareness of...

  5. Effects of numerical dissipation and unphysical excursions on scalar-mixing estimates in large-eddy simulations

    NASA Astrophysics Data System (ADS)

    Sharan, Nek; Matheou, Georgios; Dimotakis, Paul

    2017-11-01

    Artificial numerical dissipation decreases dispersive oscillations and can play a key role in mitigating unphysical scalar excursions in large eddy simulations (LES). Its influence on scalar mixing can be assessed through the resolved-scale scalar, Z , its probability density function (PDF), variance, spectra, and the budget of the horizontally averaged equation for Z2. LES of incompressible temporally evolving shear flow enabled us to study the influence of numerical dissipation on unphysical scalar excursions and mixing estimates. Flows with different mixing behavior, with both marching and non-marching scalar PDFs, are studied. Scalar fields for each flow are compared for different grid resolutions and numerical scalar-convection term schemes. As expected, increasing numerical dissipation enhances scalar mixing in the development stage of shear flow characterized by organized large-scale pairings with a non-marching PDF, but has little influence in the self-similar stage of flows with marching PDFs. Flow parameters and regimes sensitive to numerical dissipation help identify approaches to mitigate unphysical excursions while minimizing dissipation.

  6. Crash probability estimation via quantifying driver hazard perception.

    PubMed

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To capture driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, that considers both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially for inattentive or older drivers. The proposed crash probability estimation offers a practical way to evaluate the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
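    The paper's "driver risk response time" measure builds on time-to-collision (TTC), a standard surrogate safety metric. A minimal illustration of the TTC computation itself (the function and the numbers are illustrative, not taken from the study):

```python
import math

def time_to_collision(gap_m, v_follow_mps, v_lead_mps):
    """Time-to-collision in seconds for a following vehicle closing on a lead
    vehicle; returns math.inf when the gap is not closing."""
    closing_speed = v_follow_mps - v_lead_mps
    return gap_m / closing_speed if closing_speed > 0 else math.inf

# A 30 m gap, follower at 20 m/s, lead at 15 m/s: closing at 5 m/s -> TTC = 6 s.
ttc = time_to_collision(30.0, 20.0, 15.0)
```

    Thresholding TTC (often a few seconds) is a common way to flag near-crash events in field-operational-test databases of this kind.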

  7. Realistic Covariance Prediction for the Earth Science Constellation

    NASA Technical Reports Server (NTRS)

    Duncan, Matthew; Long, Anne

    2006-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
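    One of the two computation approaches mentioned, Monte Carlo estimation of collision probability from state uncertainty, can be sketched in a toy 2D form (diagonal covariance and invented numbers; this is not the ESC operational code):

```python
import math
import random

def mc_collision_probability(rel_mean, rel_sigma, hard_body_radius, n=200_000, seed=7):
    """Monte Carlo estimate of collision probability at closest approach, given
    the mean relative position of two objects and (diagonal) standard deviations
    from their combined covariance. Collision: sampled miss distance < radius."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dx = rng.gauss(rel_mean[0], rel_sigma[0])
        dy = rng.gauss(rel_mean[1], rel_sigma[1])
        if math.hypot(dx, dy) < hard_body_radius:
            hits += 1
    return hits / n

# Nominal miss distance 100 m, 50 m position sigmas, 20 m combined hard-body radius.
p = mc_collision_probability((100.0, 0.0), (50.0, 50.0), 20.0)
```

    The estimate is only as good as the input covariance, which is exactly the point of the abstract: unrealistically small sigmas here would drive the computed probability toward zero regardless of the true risk.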

  8. Constructing event trees for volcanic crises

    USGS Publications Warehouse

    Newhall, C.; Hoblitt, R.

    2002-01-01

    Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical, utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in estimating how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists critically review their analysis of hazard, and help officials and individuals compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
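    The Bayes' theorem step can be illustrated with a toy numeric example (the prior and the likelihoods below are hypothetical, not values from the paper):

```python
def bayes_update(prior, p_obs_given_event, p_obs_given_no_event):
    """Posterior probability of an event (e.g. eruption within some window)
    after observing fresh unrest, via Bayes' theorem."""
    numerator = p_obs_given_event * prior
    return numerator / (numerator + p_obs_given_no_event * (1.0 - prior))

# Hypothetical numbers: 10% prior probability of eruption; seismic swarms
# precede 80% of eruptions but also occur in 20% of non-eruptive periods.
posterior = bayes_update(0.10, 0.80, 0.20)  # 0.08 / 0.26, roughly 0.31
```

    The same update applied at each branch of the event tree is what lets fresh monitoring data raise or lower the downstream outcome probabilities coherently.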

  9. Smisc - A collection of miscellaneous functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landon Sego, PNNL

    2015-08-31

    A collection of functions for statistical computing and data manipulation. These include routines for rapidly aggregating heterogeneous matrices, manipulating file names, loading R objects, sourcing multiple R files, formatting datetimes, multi-core parallel computing, stream editing, specialized plotting, etc. The package contents include:
    Smisc-package: a collection of miscellaneous functions
    allMissing: identifies missing rows or columns in a data frame or matrix
    as.numericSilent: silent wrapper for coercing a vector to numeric
    comboList: produces all possible combinations of a set of linear model predictors
    cumMax: computes the maximum of a vector up to the current index
    cumsumNA: computes the cumulative sum of a vector without propagating NAs
    d2binom: probability functions for the sum of two independent binomials
    dataIn: a flexible way to import data into R
    dbb: the beta-binomial distribution
    df2list: row-wise conversion of a data frame to a list
    dfplapply: parallelized single-row processing of a data frame
    dframeEquiv: examines the equivalence of two data frames or matrices
    dkbinom: probability functions for the sum of k independent binomials
    factor2character: converts all factor variables in a data frame to character variables
    findDepMat: identifies linearly dependent rows or columns in a matrix
    formatDT: converts date or datetime strings into alternate formats
    getExtension, getPath, grabLast: filename manipulations (remove or extract the extension or path)
    ifelse1: non-vectorized version of ifelse
    integ: simple numerical integration routine
    interactionPlot: two-way interaction plot with error bars
    linearMap: linear mapping of a numerical vector or scalar
    list2df: converts a list to a data frame
    loadObject: loads and returns the object(s) in an ".Rdata" file
    more: displays the contents of a file in the R terminal
    movAvg2: calculates the moving average using a 2-sided window
    openDevice: opens a graphics device based on the filename extension
    p2binom: probability functions for the sum of two independent binomials
    padZero: pads a vector of numbers with zeros
    parseJob: parses a collection of elements into (almost) equal-sized groups
    pbb: the beta-binomial distribution
    pcbinom: a continuous version of the binomial cdf
    pkbinom: probability functions for the sum of k independent binomials
    plapply: simple parallelization of lapply
    plotFun: plots one or more functions on a single plot
    PowerData: an example of power data
    pvar: prints the name and value of one or more objects
    qbb: the beta-binomial distribution
    rbb: and numerous others (space limits reporting).
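    Several of the listed routines (d2binom, dkbinom, p2binom, pkbinom) compute distributions of sums of independent binomials. The underlying convolution is easy to sketch; the Python below mirrors the idea of d2binom, not the R package's actual code:

```python
from math import comb

def binom_pmf(k, n, p):
    """PMF of Bin(n, p) at k."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def d2binom(s, n1, p1, n2, p2):
    """PMF of X1 + X2 at s for independent X1 ~ Bin(n1, p1), X2 ~ Bin(n2, p2),
    computed by direct convolution (the idea behind Smisc's d2binom)."""
    return sum(binom_pmf(k, n1, p1) * binom_pmf(s - k, n2, p2)
               for k in range(max(0, s - n2), min(n1, s) + 1))

# Sanity check: when p1 == p2 == p, the sum is itself Bin(n1 + n2, p),
# so d2binom(3, 4, 0.5, 6, 0.5) equals the Bin(10, 0.5) PMF at 3.
val = d2binom(3, 4, 0.5, 6, 0.5)
```

    When p1 differs from p2 the sum is no longer binomial, which is why dedicated convolution routines such as these are needed at all.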

  10. A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.

    PubMed

    Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen

    2014-01-01

    Risk classification and survival probability prediction are two major goals in survival data analysis since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite-sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data illustrate the new methodology in real data analysis.

  11. The one-dimensional minesweeper game: What are your chances of winning?

    NASA Astrophysics Data System (ADS)

    Rodríguez-Achach, M.; Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Huerta-Quintanilla, R.; Canto-Lugo, E.

    2016-04-01

    Minesweeper is a famous computer game usually played on a two-dimensional lattice, in which cells can be empty or mined and players are required to locate the mines without dying. Even though minesweeper seems to be a very simple system, it has complex and interesting properties such as NP-completeness. In this paper, for the one-dimensional case with a lattice of n cells and m mines, we calculate the winning probability. This probability is also estimated by numerical simulations. We also find, by means of these simulations, that there exists a critical density of mines that minimizes the probability of winning the game. Analytical results and simulations are compared, showing very good agreement.

  12. Analytic barrage attack model. Final report, January 1986-January 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St Ledger, J.W.; Naegeli, R.E.; Dowden, N.A.

    An analytic model is developed for a nuclear barrage attack, assuming weapons with no aiming error and a cookie-cutter damage function. The model is then extended with approximations for the effects of aiming error and distance damage sigma. The final result is a fast-running model which calculates probability of damage for a barrage attack. The probability of damage is accurate to within seven percent or better, for weapon reliabilities of 50 to 100 percent, distance damage sigmas of 0.5 or less, and zero to very large circular error probabilities. FORTRAN 77 coding is included in the report for the analytic model and for a numerical model used to check the analytic results.
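    The modelled situation, a cookie-cutter damage function with Gaussian aiming error and imperfect weapon reliability, lends itself to a short Monte Carlo check of the kind the report's numerical model performs (this is an illustrative sketch, not the report's FORTRAN 77 code; all parameters are invented):

```python
import math
import random

def barrage_damage_probability(aim_points, sigma, lethal_radius, reliability=1.0,
                               target=(0.0, 0.0), n=50_000, seed=3):
    """Monte Carlo probability that at least one weapon of a barrage damages the
    target, with circular Gaussian aiming error (std dev sigma per axis) and a
    cookie-cutter damage function: damage iff impact falls within lethal_radius."""
    rng = random.Random(seed)
    damaged = 0
    for _ in range(n):
        for ax, ay in aim_points:
            if rng.random() > reliability:
                continue  # this weapon failed to function
            ix, iy = rng.gauss(ax, sigma), rng.gauss(ay, sigma)
            if math.hypot(ix - target[0], iy - target[1]) <= lethal_radius:
                damaged += 1
                break  # one hit suffices under a cookie-cutter damage function
    return damaged / n

# One reliable weapon aimed at the target with sigma == lethal_radius: the exact
# circular-normal answer is 1 - exp(-r^2 / (2 sigma^2)) = 1 - exp(-0.5) ~ 0.393.
p1 = barrage_damage_probability([(0.0, 0.0)], sigma=1.0, lethal_radius=1.0)
```

    The single-weapon case has the closed form shown in the comment, which gives a direct check on the simulation before applying it to multi-weapon barrage patterns.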

  13. Nonrandom network connectivity comes in pairs.

    PubMed

    Hoffmann, Felix Z; Triesch, Jochen

    2017-01-01

    Overrepresentation of bidirectional connections in local cortical networks has been repeatedly reported and is a focus of the ongoing discussion of nonrandom connectivity. Here we show in a brief mathematical analysis that in a network in which connection probabilities are symmetric in pairs, P_ij = P_ji, the occurrences of bidirectional connections and nonrandom structures are inherently linked; an overabundance of reciprocally connected pairs emerges necessarily when some pairs of neurons are more likely to be connected than others. Our numerical results imply that such overrepresentation can also be sustained when connection probabilities are only approximately symmetric.
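    The central claim, that symmetric pair probabilities P_ij = P_ji plus heterogeneity force an overabundance of reciprocal pairs, follows from E[p^2] >= (E[p])^2. A small simulation with hypothetical probability values illustrates it:

```python
import random

def reciprocal_overrepresentation(p_values, seed=11, n_trials=200_000):
    """For pairs whose two directed connections share a symmetric probability p
    (drawn independently per direction), compare the frequency of bidirectional
    pairs against a homogeneous random network with the same mean probability."""
    rng = random.Random(seed)
    p_mean = sum(p_values) / len(p_values)
    bidirectional = 0
    for _ in range(n_trials):
        p = rng.choice(p_values)  # heterogeneous but symmetric pair probability
        if rng.random() < p and rng.random() < p:
            bidirectional += 1
    observed = bidirectional / n_trials
    expected_homogeneous = p_mean ** 2  # Erdos-Renyi-style prediction
    return observed / expected_homogeneous

# Half the pairs connect with p = 0.3, half with p = 0.1 (mean 0.2):
# E[p^2] / (E[p])^2 = (0.5 * 0.09 + 0.5 * 0.01) / 0.04 = 1.25.
ratio = reciprocal_overrepresentation([0.3, 0.1])
```

    A ratio above 1 appears even though each directed connection is drawn independently, which is exactly why reciprocity counts alone cannot distinguish genuine pairwise dependence from heterogeneous connection probabilities.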

  14. A Numerical Round Robin for the Reliability Prediction of Structural Ceramics

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Janosik, Lesley A.

    1993-01-01

    A round robin has been conducted on integrated fast-fracture design programs for brittle materials. An informal working group (WELFEP-WEakest Link failure probability prediction by Finite Element Postprocessors) was formed to discuss and evaluate the implementation of the programs examined in the study. Results from the study have provided insight into the differences between the various programs examined. The study concludes that when brittle materials are used in design, the analyst must understand how to apply the concepts presented herein to failure probability analysis.

  15. Diversity Order Analysis of Dual-Hop Relaying with Partial Relay Selection

    NASA Astrophysics Data System (ADS)

    Bao, Vo Nguyen Quoc; Kong, Hyung Yun

    In this paper, we study the performance of dual-hop relaying in which the best relay, selected by partial relay selection, helps the source-destination link overcome channel impairments. Specifically, closed-form expressions for the outage probability, symbol error probability and achievable diversity gain are derived using the statistical characteristics of the signal-to-noise ratio. Numerical investigation shows that the system achieves a diversity order of two regardless of the number of relays, and confirms the correctness of the analytical results. Furthermore, the performance loss due to partial relay selection is investigated.

  16. Multipartite nonlocality and random measurements

    NASA Astrophysics Data System (ADS)

    de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław

    2017-07-01

    We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings, and even for relatively small values local realism is violated for almost all observables. We have observed this effect to be typical in the sense that it emerged for all investigated states, including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.

  17. A Dasymetric-Based Monte Carlo Simulation Approach to the Probabilistic Analysis of Spatial Variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, April M; Piburn, Jesse O; McManamay, Ryan A

    2017-01-01

    Monte Carlo simulation is a popular numerical experimentation technique used in a range of scientific fields to obtain the statistics of unknown random output variables. Despite its widespread applicability, it can be difficult to infer required input probability distributions when they are related to population counts unknown at desired spatial resolutions. To overcome this challenge, we propose a framework that uses a dasymetric model to infer the probability distributions needed for a specific class of Monte Carlo simulations which depend on population counts.

  18. Neural implementation of operations used in quantum cognition.

    PubMed

    Busemeyer, Jerome R; Fakhari, Pegah; Kvam, Peter

    2017-11-01

    Quantum probability theory has been successfully applied outside of physics to account for numerous findings from psychology regarding human judgement and decision making behavior. However, the researchers who have made these applications do not rely on the hypothesis that the brain is some type of quantum computer. This raises the question of how could the brain implement quantum algorithms other than quantum physical operations. This article outlines one way that a neural based system could perform the computations required by applications of quantum probability to human behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. APOLLO II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, R.; Mondot, J.; Stankovski, Z.

    1988-11-01

    APOLLO II is a new, multigroup transport code under development at the Commissariat a l'Energie Atomique. The code has a modular structure and uses sophisticated software for data structuralization, dynamic memory management, data storage, and user macrolanguage. This paper gives an overview of the main methods used in the code for (a) multidimensional collision probability calculations, (b) leakage calculations, and (c) homogenization procedures. Numerical examples are given to demonstrate the potential of the modular structure of the code and the novel multilevel flat-flux representation used in the calculation of the collision probabilities.

  20. The ACT transport: Panacea for the 80's or designer's illusion (panel discussion)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A panel discussion was held which attempted to make an objective and pragmatic assessment of the standing of active control technology. The discussion focused on the standing of active control technology relative to civil air transport applications, the value as opposed to the cost of the projected benefits, the need for research, development, and demonstration, the role of government and industry in developing the technology, the major obstacles to its implementation, and the probable timing of the full utilization of active control technology in commercial transportation. An edited transcription of the prepared statements of the panel members and the subsequent open discussion between the panel and the audience is presented.

  1. Predictive acute toxicity tests with pesticides.

    PubMed

    Brown, V K

    1983-01-01

    By definition, pesticides are biocidal products, which implies a probability that pesticides may be acutely toxic to species other than the designated target species. The ways in which pesticides are manufactured, formulated, packaged, distributed and used necessitate a potential for the exposure of non-target species, although the technology exists to minimize adventitious exposure. Deliberate exposure of non-target species due to the misuse of pesticides is known to occur. The array of predictive acute toxicity tests carried out on pesticides, involving the use of laboratory animals, can be justified as providing data on which hazard assessment can be based. This paper addresses the justification and rationale of this statement.

  2. Ethics and epistemology of accurate prediction in clinical research.

    PubMed

    Hey, Spencer Phillips

    2015-07-01

    All major research ethics policies assert that the ethical review of clinical trial protocols should include a systematic assessment of risks and benefits. But despite this policy, protocols do not typically contain explicit probability statements about the likely risks or benefits involved in the proposed research. In this essay, I articulate a range of ethical and epistemic advantages that explicit forecasting would offer to the health research enterprise. I then consider how some particular confidence levels may come into conflict with the principles of ethical research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  3. Fluctuation theorem for the effusion of an ideal gas.

    PubMed

    Cleuren, B; Van den Broeck, C; Kawai, R

    2006-08-01

    The probability distribution of the entropy production for the effusion of an ideal gas between two compartments is calculated explicitly. The fluctuation theorem is verified. The analytic results are in good agreement with numerical data from hard disk molecular dynamics simulations.

  4. A quadrature based method of moments for nonlinear Fokker-Planck equations

    NASA Astrophysics Data System (ADS)

    Otten, Dustin L.; Vedula, Prakash

    2011-09-01

    Fokker-Planck equations which are nonlinear with respect to their probability densities and occur in many nonequilibrium systems relevant to mean field interaction models, plasmas, fermions and bosons can be challenging to solve numerically. To address some underlying challenges, we propose the application of the direct quadrature based method of moments (DQMOM) for efficient and accurate determination of transient (and stationary) solutions of nonlinear Fokker-Planck equations (NLFPEs). In DQMOM, probability density (or other distribution) functions are represented using a finite collection of Dirac delta functions, characterized by quadrature weights and locations (or abscissas) that are determined based on constraints due to evolution of generalized moments. Three particular examples of nonlinear Fokker-Planck equations considered in this paper include descriptions of: (i) the Shimizu-Yamada model, (ii) the Desai-Zwanzig model (both of which have been developed as models of muscular contraction) and (iii) fermions and bosons. Results based on DQMOM, for the transient and stationary solutions of the nonlinear Fokker-Planck equations, have been found to be in good agreement with other available analytical and numerical approaches. It is also shown that approximate reconstruction of the underlying probability density function from moments obtained from DQMOM can be satisfactorily achieved using a maximum entropy method.

  5. Outage analysis of relay-assisted underwater wireless optical communication systems

    NASA Astrophysics Data System (ADS)

    Tabeshnezhad, Azadeh; Pourmina, Mohammad Ali

    2017-12-01

    In this paper, we theoretically evaluate the outage probabilities of underwater wireless optical communication (UWOC) systems. Our derivations are general, as the channel model under consideration takes into account all of the channel degrading effects, namely absorption, scattering, and turbulence-induced fading. We numerically show that UWOC systems, due to the severe channel impairments, cannot typically support link ranges longer than 100 m. Therefore, in order to increase the transmission reliability and hence extend the viable communication range of UWOC systems, we apply decode-and-forward (DF) relay-assisted communications either in the form of multi-hop transmission, where multiple intermediate relays are serially employed between the source and destination, or parallel relaying, in which multiple DF relays are distributed along the source-to-destination path to cooperate in the end-to-end transmission. Our numerical results reveal that multi-hop transmission, owing to the distance-dependency of all of the channel degrading effects, can tremendously improve the end-to-end outage probability and increase the accessible link ranges to hundreds of meters. For example, a dual-hop transmission in a 45 m coastal water link can provide up to 41 dB performance improvement at an outage probability of 10^-9.

  6. Bayesian block-diagonal variable selection and model averaging

    PubMed Central

    Papaspiliopoulos, O.; Rossell, D.

    2018-01-01

    Summary We propose a scalable algorithmic framework for exact Bayesian variable selection and model averaging in linear models under the assumption that the Gram matrix is block-diagonal, and as a heuristic for exploring the model space for general designs. In block-diagonal designs our approach returns the most probable model of any given size without resorting to numerical integration. The algorithm also provides a novel and efficient solution to the frequentist best subset selection problem for block-diagonal designs. Posterior probabilities for any number of models are obtained by evaluating a single one-dimensional integral, and other quantities of interest such as variable inclusion probabilities and model-averaged regression estimates are obtained by an adaptive, deterministic one-dimensional numerical integration. The overall computational cost scales linearly with the number of blocks, which can be processed in parallel, and exponentially with the block size, rendering it most adequate in situations where predictors are organized in many moderately-sized blocks. For general designs, we approximate the Gram matrix by a block-diagonal matrix using spectral clustering and propose an iterative algorithm that capitalizes on the block-diagonal algorithms to explore efficiently the model space. All methods proposed in this paper are implemented in the R library mombf. PMID:29861501

  7. Driven fragmentation of granular gases.

    PubMed

    Cruz Hidalgo, Raúl; Pagonabarraga, Ignacio

    2008-06-01

    The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonously as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the long velocity tail is essentially exponential, independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ~ exp(-c^n), with n ~ 1.2, regardless of the fragmentation mechanism.

  8. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    NASA Astrophysics Data System (ADS)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time-consuming and costly. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications of using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.

  9. A Numerical Method for Obtaining Monoenergetic Neutron Flux Distributions and Transmissions in Multiple-Region Slabs

    NASA Technical Reports Server (NTRS)

    Schneider, Harold

    1959-01-01

    This method is investigated for semi-infinite multiple-slab configurations of arbitrary width, composition, and source distribution. Isotropic scattering in the laboratory system is assumed. Isotropic scattering implies that the fraction of neutrons scattered in the i(sup th) volume element or subregion that will make their next collision in the j(sup th) volume element or subregion is the same for all collisions. These so-called "transfer probabilities" between subregions are calculated and used to obtain successive-collision densities from which the flux and transmission probabilities directly follow. For a thick slab with little or no absorption, a successive-collisions technique proves impractical because an unreasonably large number of collisions must be followed in order to obtain the flux. Here the appropriate integral equation is converted into a set of linear simultaneous algebraic equations that are solved for the average total flux in each subregion. When ordinary diffusion theory applies with satisfactory precision in a portion of the multiple-slab configuration, the problem is solved by ordinary diffusion theory, but the flux is plotted only in the region of validity. The angular distribution of neutrons entering the remaining portion is determined from the known diffusion flux, and the remaining region is solved by higher-order theory. Several procedures for applying the numerical method are presented and discussed. To illustrate the calculational procedure, a symmetrical slab in a vacuum is solved by the numerical, Monte Carlo, and P(sub 3) spherical harmonics methods. In addition, an unsymmetrical double-slab problem is solved by the numerical and Monte Carlo methods. The numerical approach proved faster and more accurate in these examples. Adaptation of the method to anisotropic scattering in slabs is indicated, although no example is included in this paper.

  10. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

    The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce well all numerical data in our analysis, where several common features, or invariant aspects, are clearly observed. In particular, for stationary ensembles the multi-fractal relation seems to obey an invariant curve; furthermore, for non-stationary (moving-time) ensembles in the aftershock regime, the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are in fact unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.

  11. An Assessment of the Likelihood, Frequency, and Content of Verbal Communication Between Radiologists and Women Receiving Screening and Diagnostic Mammography

    PubMed Central

    Carney, Patricia A.; Kettler, Mark; Cook, Andrea J.; Geller, Berta M.; Karliner, Leah; Miglioretti, Diana L.; Bowles, Erin Aiello; Buist, Diana S.; Gallagher, Thomas H.; Elmore, Joann G.

    2009-01-01

    Rationale & Objective Research on communication between radiologists and women undergoing screening and diagnostic mammography is limited. We describe community radiologists’ communication practices with patients regarding screening and diagnostic mammogram results and factors associated with frequency of communication. Materials & Methods We received surveys from 257 radiologists (70% of those eligible) about the extent to which they talk to women as part of their healthcare visit for either screening or diagnostic mammograms, whether this occurs if the exam assessment is positive or negative, and how they use estimates of patient risk to convey information about an abnormal exam where the specific finding of cancer is not yet known. We also assessed characteristics of the radiologists to identify associations with more or less frequent communication at the time of the mammogram. Results Two hundred forty-three radiologists provided complete data (95%). Very few (<6%) reported routinely communicating with women when screening mammograms were either normal or abnormal. Less than half (47%) routinely communicated with women when their diagnostic mammograms were normal, while 77% often or always communicated with women when their diagnostic exams were abnormal. For positive diagnostic exams, female radiologists were more likely than males to be frequent communicators (87.1% vs. 72.8%; p = 0.02), and those who spend 40-79% of their time in breast imaging (94.6%) were more likely to be frequent communicators than those who spend less time (67.2%-78.9%; p = 0.02). Most radiologists convey risk information using general rather than numeric statements (57.7% vs. 28.5%). Conclusions Radiologists are most likely to convey information about diagnostic mammographic findings when results are abnormal. Most radiologists convey risk information using general rather than numeric statements. PMID:19442539

  12. N-mix for fish: estimating riverine salmonid habitat selection via N-mixture models

    USGS Publications Warehouse

    Som, Nicholas A.; Perry, Russell W.; Jones, Edward C.; De Juilio, Kyle; Petros, Paul; Pinnix, William D.; Rupert, Derek L.

    2018-01-01

    Models that formulate mathematical linkages between fish use and habitat characteristics are applied for many purposes. For riverine fish, these linkages are often cast as resource selection functions with variables including depth and velocity of water and distance to nearest cover. Ecologists are now recognizing the role that detection plays in observing organisms, and failure to account for imperfect detection can lead to spurious inference. Herein, we present a flexible N-mixture model to associate habitat characteristics with the abundance of riverine salmonids that simultaneously estimates detection probability. Our formulation has the added benefits of accounting for demographic variation and can generate probabilistic statements regarding intensity of habitat use. In addition to the conceptual benefits, model application to data from the Trinity River, California, yields interesting results. Detection was estimated to vary among surveyors, but there was little spatial or temporal variation. Additionally, a weaker effect of water depth on resource selection is estimated than that reported by previous studies not accounting for detection probability. N-mixture models show great promise for applications to riverine resource selection.
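    The marginalization at the heart of an N-mixture model can be sketched in a few lines: the latent site abundance N is summed out of the product of binomial detection likelihoods across repeat visits. This is a minimal illustration with hypothetical counts and parameter values, not the authors' implementation (which also models covariate effects on abundance and detection).

```python
import math

def nmix_site_likelihood(counts, lam, p, n_max=200):
    """Marginal likelihood of repeated counts at one site under a basic
    N-mixture model: latent abundance N ~ Poisson(lam), each visit's
    count ~ Binomial(N, p); N is summed out up to a truncation n_max."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        # Poisson pmf in log space to avoid huge factorials
        log_pois = -lam + n * math.log(lam) - math.lgamma(n + 1)
        binom = 1.0
        for y in counts:
            binom *= math.comb(n, y) * p ** y * (1.0 - p) ** (n - y)
        total += math.exp(log_pois) * binom
    return total

# Hypothetical site: three visits with counts 4, 2, 3; lam = 6, p = 0.5
lik = nmix_site_likelihood([4, 2, 3], lam=6.0, p=0.5)
```

Maximizing this marginal likelihood jointly over lam and p (summed over sites) is what lets the model estimate detection probability and abundance simultaneously.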

  13. Hierarchical models and the analysis of bird survey information

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.

    2003-01-01

    Management of birds often requires analysis of collections of estimates. We describe a hierarchical modeling approach to the analysis of these data, in which parameters associated with the individual species estimates are treated as random variables, and probability statements are made about the species parameters conditioned on the data. A Markov-Chain Monte Carlo (MCMC) procedure is used to fit the hierarchical model. This approach is computer intensive, and is based upon simulation. MCMC allows for estimation both of parameters and of derived statistics. To illustrate the application of this method, we use the case in which we are interested in attributes of a collection of estimates of population change. Using data for 28 species of grassland-breeding birds from the North American Breeding Bird Survey, we estimate the number of species with increasing populations, provide precision-adjusted rankings of species trends, and describe a measure of population stability as the probability that the trend for a species is within a certain interval. Hierarchical models can be applied to a variety of bird survey applications, and we are investigating their use in estimation of population change from survey data.
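    Once MCMC draws of the species-level parameters are in hand, such derived statistics are simple Monte Carlo averages. The sketch below uses simulated stand-in draws for three hypothetical species (not Breeding Bird Survey output) to show how the number of increasing species and a stability probability would be computed.

```python
import random

random.seed(1)

# Stand-in "posterior draws": 1000 samples of annual trend (%/yr) for three
# hypothetical species; real draws would come from the fitted hierarchical model.
draws = {
    "species_A": [random.gauss(1.5, 0.8) for _ in range(1000)],
    "species_B": [random.gauss(-0.4, 0.5) for _ in range(1000)],
    "species_C": [random.gauss(0.1, 1.2) for _ in range(1000)],
}

# Derived statistics are Monte Carlo averages over the draws.
p_increasing = {s: sum(t > 0 for t in ts) / len(ts) for s, ts in draws.items()}
expected_n_increasing = sum(p_increasing.values())
# "Stability" as the probability that the trend lies within (-0.5, 0.5) %/yr.
stability = {s: sum(-0.5 < t < 0.5 for t in ts) / len(ts) for s, ts in draws.items()}
```

Because every derived quantity is computed draw-by-draw, its uncertainty is propagated automatically, which is the practical appeal of the MCMC approach described above.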

  14. Effect of a Reminder Statement on Echocardiography Reports on Referrals for Implantable Cardioverter-Defibrillators for Primary Prevention.

    PubMed

    Chokshi, Moulin; McNamara, Robert L; Rajeswaran, Yasotha; Lampert, Rachel

    2017-02-01

    Numerous trials show the benefit of implantable cardioverter-defibrillators (ICDs) for primary prevention in patients with low ejection fraction (EF), a class I indication. However, underutilization is well documented. We retrospectively reviewed charts to see whether placing a reminder statement into echocardiogram reports for appropriate patients increased adherence to guidelines. From January through June 2013, a brief reminder of the ICD guidelines was automatically inserted into echocardiogram reports with EF ≤ 35% (reminder period). Charts were reviewed to determine if these patients (1) were referred to Electrophysiology (EP) within 6 months of the index echo and (2) received an ICD within 6 months of EP referral. Chart review of all patients who had an echocardiogram performed between March and August 2012 with an EF ≤ 35% provided a control period. More patients were referred to EP in the reminder period compared with control period, 68% (54 of 80) versus 51% (53 of 104), p = 0.03. There was also a higher rate of discussions in the reminder period between patients and physicians about ICD therapy (71% vs 54%, p = 0.02). Among patients appropriate for ICD, 52% of patients during the reminder period received an ICD versus 38% of patients during the control period (p = 0.11). A simple reminder statement on echocardiography reports led to a significant improvement in appropriate EP referrals and a trend toward increased ICD implantation in appropriate patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Global climate change and children's health.

    PubMed

    Shea, Katherine M

    2007-11-01

    There is broad scientific consensus that Earth's climate is warming rapidly and at an accelerating rate. Human activities, primarily the burning of fossil fuels, are very likely (>90% probability) to be the main cause of this warming. Climate-sensitive changes in ecosystems are already being observed, and fundamental, potentially irreversible, ecological changes may occur in the coming decades. Conservative environmental estimates of the impact of climate changes that are already in process indicate that they will result in numerous health effects to children. The nature and extent of these changes will be greatly affected by actions taken or not taken now at the global level. Physicians have written on the projected effects of climate change on public health, but little has been written specifically on anticipated effects of climate change on children's health. Children represent a particularly vulnerable group that is likely to suffer disproportionately from both direct and indirect adverse health effects of climate change. Pediatric health care professionals should understand these threats, anticipate their effects on children's health, and participate as children's advocates for strong mitigation and adaptation strategies now. Any solutions that address climate change must be developed within the context of overall sustainability (the use of resources by the current generation to meet current needs while ensuring that future generations will be able to meet their needs). Pediatric health care professionals can be leaders in a move away from a traditional focus on disease prevention to a broad, integrated focus on sustainability as synonymous with health. 
This policy statement is supported by a technical report that examines in some depth the nature of the problem of climate change, likely effects on children's health as a result of climate change, and the critical importance of responding promptly and aggressively to reduce activities that are contributing to this change.

  16. Snoring and its management.

    PubMed

    Calhoun, Karen H; Templer, Jerry; Patenaude, Bart

    2006-01-01

    There are numerous strategies, devices and procedures available to treat snoring. The surgical procedures have an overall success rate of 60-70%, but this probably decreases over time, especially if there is weight gain. There are no long-term rigorously-designed studies comparing the various procedures for decreasing snoring.

  17. Application of the string method to the study of critical nuclei in capillary condensation.

    PubMed

    Qiu, Chunyin; Qian, Tiezheng; Ren, Weiqing

    2008-10-21

    We adopt a continuum description for liquid-vapor phase transition in the framework of mean-field theory and use the string method to numerically investigate the critical nuclei for capillary condensation in a slit pore. This numerical approach allows us to determine the critical nuclei corresponding to saddle points of the grand potential function, with the chemical potential prescribed at the outset. The string method locates the minimal energy path (MEP), which is the most probable transition pathway connecting two metastable/stable states in configuration space. From the MEP, the saddle point is determined and the corresponding energy barrier (of the grand potential) is also obtained. Moreover, the MEP shows how the new phase (liquid) grows out of the old phase (vapor) along the most probable transition pathway, from the birth of a critical nucleus to its subsequent expansion. Our calculations run from partial wetting to complete wetting with a variable strength of attractive wall potential. In the latter case, the string method presents a unified way of computing the critical nuclei, from film formation at the solid surface to bulk condensation via a liquid bridge. The present application of the string method to the numerical study of capillary condensation shows the great power of this method in evaluating the critical nuclei in various liquid-vapor phase transitions.

  18. Statistics of concentrations due to single air pollution sources to be applied in numerical modelling of pollutant dispersion

    NASA Astrophysics Data System (ADS)

    Tumanov, Sergiu

    A goodness-of-fit test based on rank statistics was applied to verify the applicability of the Eggenberger-Polya discrete probability law to hourly SO2 concentrations measured in the vicinity of single sources. To this end, the pollutant concentration was treated as an integer-valued quantity, which is acceptable if one properly chooses the unit of measurement (in this case μg m-3) and takes account of the limited accuracy of the measurements. The results of the test being satisfactory, even in the range of upper quantiles, the Eggenberger-Polya law was used in association with numerical modelling to estimate statistical parameters, e.g. quantiles, cumulative probabilities of threshold concentrations being exceeded, and so on, in the grid points of a network covering the area of interest. This requires only accurate estimates of the means and variances of the concentration series, which can readily be obtained through routine air pollution dispersion modelling.
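    Since the Eggenberger-Polya (negative binomial) law is fully specified by its mean and variance, the exceedance probabilities mentioned above follow directly from those two modelled quantities. A minimal sketch with illustrative values (not taken from the paper); the parametrization requires var > mean:

```python
import math

def polya_pmf(k, mean, var):
    """Eggenberger-Polya (negative binomial) pmf at integer k >= 0,
    parametrized by mean and variance (requires var > mean)."""
    p = mean / var                      # "success" probability
    r = mean * mean / (var - mean)      # shape parameter
    log_pmf = (math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1.0 - p))
    return math.exp(log_pmf)

def exceedance_prob(threshold, mean, var):
    """P(X > threshold): probability that a concentration exceeds threshold."""
    return 1.0 - sum(polya_pmf(k, mean, var) for k in range(threshold + 1))

# Hypothetical grid point: mean 40 ug/m3, variance 1600 from dispersion modelling
p_exceed = exceedance_prob(100, mean=40.0, var=1600.0)
```

Evaluated over every grid point of the network, this yields maps of threshold-exceedance probabilities from routine dispersion-model output alone.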

  19. Generic dynamical features of quenched interacting quantum systems: Survival probability, density imbalance, and out-of-time-ordered correlator

    NASA Astrophysics Data System (ADS)

    Torres-Herrera, E. J.; García-García, Antonio M.; Santos, Lea F.

    2018-02-01

    We study numerically and analytically the quench dynamics of isolated many-body quantum systems. Using full random matrices from the Gaussian orthogonal ensemble, we obtain analytical expressions for the evolution of the survival probability, density imbalance, and out-of-time-ordered correlator. They are compared with numerical results for a one-dimensional-disordered model with two-body interactions and shown to bound the decay rate of this realistic system. Power-law decays are seen at intermediate times, and dips below the infinite time averages (correlation holes) occur at long times for all three quantities when the system exhibits level repulsion. The fact that these features are shared by both the random matrix and the realistic disordered model indicates that they are generic to nonintegrable interacting quantum systems out of equilibrium. Assisted by the random matrix analytical results, we propose expressions that describe extremely well the dynamics of the realistic chaotic system at different time scales.

  20. Probability distribution of haplotype frequencies under the two-locus Wright-Fisher model by diffusion approximation.

    PubMed

    Boitard, Simon; Loisel, Patrice

    2007-05-01

    The probability distribution of haplotype frequencies in a population, and the way it is influenced by genetic forces such as recombination, selection, and random drift, is a question of fundamental interest in population genetics. For large populations, the distribution of haplotype frequencies for two linked loci under the classical Wright-Fisher model is practically impossible to compute for numerical reasons. However, the Wright-Fisher process can in such cases be approximated by a diffusion process, and the transition density can then be deduced from the Kolmogorov equations. As no exact solution has been found for these equations, we developed a numerical method based on finite differences to solve them. It applies to transient states and to models including selection or mutation. We show by several tests that this method is accurate for computing the conditional joint density of haplotype frequencies given that no haplotype has been lost. We also show that it is far less time-consuming than other methods such as Monte Carlo simulations.

  1. Estimating state-transition probabilities for unobservable states using capture-recapture/resighting data

    USGS Publications Warehouse

    Kendall, W.L.; Nichols, J.D.

    2002-01-01

    Temporary emigration was identified some time ago as causing potential problems in capture-recapture studies, and in the last five years approaches have been developed for dealing with special cases of this general problem. Temporary emigration can be viewed more generally as involving transitions to and from an unobservable state, and frequently the state itself is one of biological interest (e.g., 'nonbreeder'). Development of models that permit estimation of relevant parameters in the presence of an unobservable state requires either extra information (e.g., as supplied by Pollock's robust design) or the following classes of model constraints: reducing the order of Markovian transition probabilities, imposing a degree of determinism on transition probabilities, removing state specificity of survival probabilities, and imposing temporal constancy of parameters. The objective of the work described in this paper is to investigate estimability of model parameters under a variety of models that include an unobservable state. Beginning with a very general model and no extra information, we used numerical methods to systematically investigate the use of ancillary information and constraints to yield models that are useful for estimation. The result is a catalog of models for which estimation is possible. An example analysis of sea turtle capture-recapture data under two different models showed similar point estimates but increased precision for the model that incorporated ancillary data (the robust design) when compared to the model with deterministic transitions only. This comparison and the results of our numerical investigation of model structures lead to design suggestions for capture-recapture studies in the presence of an unobservable state.

  2. Impact of the infectious period on epidemics

    NASA Astrophysics Data System (ADS)

    Wilkinson, Robert R.; Sharkey, Kieran J.

    2018-05-01

    The duration of the infectious period is a crucial determinant of the ability of an infectious disease to spread. We consider an epidemic model that is network based and non-Markovian, containing classic Kermack-McKendrick, pairwise, message passing, and spatial models as special cases. For this model, we prove a monotonic relationship between the variability of the infectious period (with fixed mean) and the probability that the infection will reach any given subset of the population by any given time. For certain families of distributions, this result implies that epidemic severity is decreasing with respect to the variance of the infectious period. The striking importance of this relationship is demonstrated numerically. We then prove, with a fixed basic reproductive ratio (R0), a monotonic relationship between the variability of the posterior transmission probability (which is a function of the infectious period) and the probability that the infection will reach any given subset of the population by any given time. Thus again, even when R0 is fixed, variability of the infectious period tends to dampen the epidemic. Numerical results illustrate this but indicate the relationship is weaker. We then show how our results apply to message passing, pairwise, and Kermack-McKendrick epidemic models, even when they are not exactly consistent with the stochastic dynamics. For Poissonian contact processes, and arbitrarily distributed infectious periods, we demonstrate how systems of delay differential equations and ordinary differential equations can provide upper and lower bounds, respectively, for the probability that any given individual has been infected by any given time.
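    The dampening effect of infectious-period variability can already be seen at the level of a single contact: with Poissonian contacts at rate beta, the transmission probability is 1 - E[exp(-beta*I)], and by Jensen's inequality this is maximized, for fixed mean, by a degenerate (zero-variance) infectious period. A sketch with hypothetical parameter values, comparing the fixed and exponentially distributed cases in closed form:

```python
import math

def transmission_prob_fixed(beta, mean_I):
    """Per-contact transmission probability for a degenerate (zero-variance)
    infectious period of length mean_I, with Poissonian contacts at rate beta."""
    return 1.0 - math.exp(-beta * mean_I)

def transmission_prob_exponential(beta, mean_I):
    """Same mean infectious period, but exponentially distributed (high
    variability): E[exp(-beta*I)] = 1 / (1 + beta*mean_I)."""
    return beta * mean_I / (1.0 + beta * mean_I)

beta, mu = 0.3, 5.0                      # hypothetical contact rate and mean period
t_fixed = transmission_prob_fixed(beta, mu)
t_exp = transmission_prob_exponential(beta, mu)
```

For any beta > 0 and mu > 0 the fixed-duration case gives the larger transmission probability, in line with the monotonicity result described above.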

  3. Optical rogue-wave-like extreme value fluctuations in fiber Raman amplifiers.

    PubMed

    Hammani, Kamal; Finot, Christophe; Dudley, John M; Millot, Guy

    2008-10-13

    We report experimental observation and characterization of rogue wave-like extreme value statistics arising from pump-signal noise transfer in a fiber Raman amplifier. Specifically, by exploiting Raman amplification with an incoherent pump, the amplified signal is shown to develop a series of temporal intensity spikes whose peak power follows a power-law probability distribution. The results are interpreted using a numerical model of the Raman gain process using coupled nonlinear Schrödinger equations, and the numerical model predicts results in good agreement with experiment.

  4. Non-equilibrium many-body dynamics following a quantum quench

    NASA Astrophysics Data System (ADS)

    Vyas, Manan

    2017-12-01

    We study analytically and numerically the non-equilibrium dynamics of an isolated interacting many-body quantum system following a random quench. We model the system Hamiltonian by Embedded Gaussian Orthogonal Ensemble (EGOE) of random matrices with one plus few-body interactions for fermions. EGOE are paradigmatic models to study the crossover from integrability to chaos in interacting many-body quantum systems. We obtain a generic formulation, based on spectral variances, for describing relaxation dynamics of survival probabilities as a function of rank of interactions. Our analytical results are in good agreement with numerics.

  5. Normal tissue complication probability modelling of tissue fibrosis following breast radiotherapy

    NASA Astrophysics Data System (ADS)

    Alexander, M. A. R.; Brooks, W. A.; Blake, S. W.

    2007-04-01

    Cosmetic late effects of radiotherapy such as tissue fibrosis are increasingly regarded as being of importance. It is generally considered that the complication probability of a radiotherapy plan is dependent on the dose uniformity, and can be reduced by using better compensation to remove dose hotspots. This work aimed to model the effects of improved dose homogeneity on complication probability. The Lyman and relative seriality NTCP models were fitted to clinical fibrosis data for the breast collated from the literature. Breast outlines were obtained from a commercially available Rando phantom using the Osiris system. Multislice breast treatment plans were produced using a variety of compensation methods. Dose-volume histograms (DVHs) obtained for each treatment plan were reduced to simple numerical parameters using the equivalent uniform dose and effective volume DVH reduction methods. These parameters were input into the models to obtain complication probability predictions. The fitted model parameters were consistent with a parallel tissue architecture. Conventional clinical plans generally showed decreasing complication probabilities with increasing compensation sophistication. Extremely homogeneous plans representing idealized IMRT treatments showed increased complication probabilities compared to conventional planning methods, as a result of increased dose to areas receiving sub-prescription doses with conventional techniques.
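    For reference, the Lyman model maps a DVH-reduced dose summary to a complication probability through a probit curve. The sketch below (hypothetical TD50, m and volume-parameter n values, not the fitted values from this work) uses the generalized equivalent uniform dose as the reduction step:

```python
import math

def eud(dvh, n):
    """Generalized equivalent uniform dose from a differential DVH given as
    (dose_Gy, fractional_volume) bins whose volumes sum to 1."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def lyman_ntcp(eud_gy, td50, m):
    """Lyman NTCP: standard normal CDF of t = (EUD - TD50) / (m * TD50)."""
    t = (eud_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical two-bin DVH: half the volume at 45 Gy, half at 55 Gy
ntcp = lyman_ntcp(eud([(45.0, 0.5), (55.0, 0.5)], n=1.0), td50=50.0, m=0.3)
```

The volume parameter n controls how the DVH is collapsed: n near 1 (a parallel architecture, as the fits above suggest) makes EUD close to the mean dose, so sub-prescription cold spots in conventional plans pull the predicted NTCP down.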

  6. Generalized Maximum Entropy

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Stutz, John

    2005-01-01

    A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
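    The classic MaxEnt construction can be made concrete with the standard dice example (a sketch, not the authors' code): for a die constrained to have mean 4.5, the maximum-entropy point probabilities are exponential in the face value, with the Lagrange multiplier fixed by the constraint; here it is found by bisection. The generalized approach described above would then place a density over the constraint value and propagate it through this map.

```python
import math

def maxent_die(target_mean, faces=(1, 2, 3, 4, 5, 6), tol=1e-12):
    """Classic MaxEnt distribution over die faces subject to a mean
    constraint: p_i proportional to exp(lam * x_i), with lam found by
    bisection so that the distribution's mean equals target_mean."""
    def mean_for(lam):
        w = [math.exp(lam * x) for x in faces]
        z = sum(w)
        return sum(x * wi for x, wi in zip(faces, w)) / z
    lo, hi = -50.0, 50.0          # mean_for is monotone increasing in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in faces]
    z = sum(w)
    return [wi / z for wi in w]

probs = maxent_die(4.5)   # point probabilities for the mean-4.5 die
```

Treating `probs` as a function of `target_mean` and averaging it over, say, a Gaussian on the constraint value is exactly the kind of propagation the generalized approach performs.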

  7. A probability model for evaluating the bias and precision of influenza vaccine effectiveness estimates from case-control studies.

    PubMed

    Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A

    2015-05-01

    As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing the bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI, then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care for ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that, in general, estimates from the test-negative design have smaller bias than estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs.
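    The cancellation argument behind the unbiasedness result can be sketched with expected cell counts (hypothetical rates, not the paper's parametrization): care-seeking probabilities that differ by vaccination status drop out of the test-negative odds ratio provided they are identical for influenza and non-influenza ARI.

```python
def tnd_ve_estimate(ve_true, p_flu, p_nonflu, alpha, seek_vac, seek_unvac):
    """Expected test-negative-design VE estimate from expected cell
    probabilities; care seeking may differ by vaccination status but is
    assumed identical for influenza and non-influenza ARI."""
    a = alpha * (1.0 - ve_true) * p_flu * seek_vac      # vaccinated, test-positive
    b = alpha * p_nonflu * seek_vac                     # vaccinated, test-negative
    c = (1.0 - alpha) * p_flu * seek_unvac              # unvaccinated, test-positive
    d = (1.0 - alpha) * p_nonflu * seek_unvac           # unvaccinated, test-negative
    odds_ratio = (a / b) / (c / d)                      # seek_* factors cancel
    return 1.0 - odds_ratio

# Vaccinees seek care more often (0.7 vs 0.5), yet the estimate stays unbiased.
est = tnd_ve_estimate(0.6, 0.05, 0.15, 0.5, 0.7, 0.5)
```

If the care-seeking probabilities differed between influenza and non-influenza ARI (violating the assumption), the factors would no longer cancel and the estimate would be biased.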

  8. Surgical Craniotomy for Intracerebral Haemorrhage.

    PubMed

    Mendelow, A David

    2015-01-01

    Craniotomy is probably indicated for patients with superficial spontaneous lobar supratentorial intracerebral haemorrhage (ICH) when the level of consciousness drops below 13 within the first 8 h of the onset of the haemorrhage. Once the level drops below 9, it is probably too late to consider craniotomy for these patients, so clinical vigilance is paramount. While this statement is only backed up by evidence that is moderately strong, meta-analysis of available data suggests that it is true in the rather limited number of patients with ICH. Meta-analyses like this can often predict the results of future prospective randomised controlled trials a decade or more before the trials are completed and published. Countless such examples exist in the literature, as is the case for thrombolysis in patients with myocardial infarction in the last millennium: meta-analysis determined the efficacy more than a decade BEFORE the last trial (ISIS-2) confirmed the benefit of thrombolysis for myocardial infarction. Careful examination of the meta-analysis' Forest plots in this chapter will demonstrate why this statement is made at the outset. Other meta-analyses of surgery for ICH have also indicated that minimal interventional techniques using topical thrombolysis or endoscopy via burrholes or even twist drill aspiration may be particularly successful for the treatment of supratentorial ICH, especially when the clot is deep seated. Ongoing clinical trials (CLEAR III and MISTIE III) should confirm this in the fullness of time. There are 2 exceptions to these generalisations. First, based on trial evidence, aneurysmal ICH is best treated with surgery. Second, cerebellar ICH represents a special case because of the development of hydrocephalus, which may require expeditious drainage as the intracranial pressure rises. The cerebellar clot will then require evacuation, usually via posterior fossa craniectomy, rather than craniotomy. 
Technical advances suggest that image-guided surgery may improve the completeness of surgical evacuation and outcomes, regardless of which surgical technique is employed. © 2016 S. Karger AG, Basel.

  9. Official Positions for FRAX® clinical regarding international differences from Joint Official Positions Development Conference of the International Society for Clinical Densitometry and International Osteoporosis Foundation on FRAX®.

    PubMed

    Cauley, Jane A; El-Hajj Fuleihan, Ghada; Arabi, Asma; Fujiwara, Saeko; Ragi-Eis, Sergio; Calderon, Andrew; Chionh, Siok Bee; Chen, Zhao; Curtis, Jeffrey R; Danielson, Michelle E; Hanley, David A; Kroger, Heikki; Kung, Annie W C; Lesnyak, Olga; Nieves, Jeri; Pluskiewicz, Wojciech; El Rassi, Rola; Silverman, Stuart; Schott, Anne-Marie; Rizzoli, Rene; Luckey, Marjorie

    2011-01-01

Osteoporosis is a serious worldwide epidemic. Increased risk of fractures is the hallmark of the disease and is associated with increased morbidity, mortality and economic burden. FRAX® is a web-based tool developed by the Sheffield WHO Collaborating Center team that integrates clinical risk factors, femoral neck BMD, and country-specific mortality and fracture data, and calculates the 10-year fracture probability in order to help health care professionals identify patients who need treatment. However, only 31 countries had a FRAX® calculator at the time this paper was accepted for publication. In the absence of a FRAX® model for a particular country, it has been suggested to use a surrogate country for which the epidemiology of osteoporosis most closely approximates the index country. More specific recommendations for clinicians in these countries are not available. In North America, concerns have also been raised regarding the assumptions used to construct the US ethnic-specific FRAX® calculators with respect to the correction factors applied to derive fracture probabilities in Blacks, Asians and Hispanics in comparison to Whites. In addition, questions were raised about calculating fracture risk in other ethnic groups, e.g., Native Americans and First Canadians. In order to provide additional guidance to clinicians, a FRAX® International Task Force was formed to address specific questions raised by physicians in countries without FRAX® calculators who are seeking to integrate FRAX® into their clinical practice. To answer its main questions, the Task Force members conducted appropriate literature reviews and developed preliminary statements that were discussed and graded by a panel of experts at the ISCD-IOF joint conference. The statements approved by the panel of experts are discussed in the current paper. Copyright © 2011. Published by Elsevier Inc.

  10. Nuclear proliferation: Learning from the Iraq experience. Hearing before the Committee on Foreign Relations, United States Senate, One Hundred Second Congress, First Session, October 17 and 23, 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    Most of this hearings record is devoted to brief statements to the committee and prepared statements submitted for the record by: (1) Dr. David Kay, Deputy Leader, IAEA Action Team for Nuclear Inspections; and (2) Dr. Hans Blix, Director General, IAEA. Dr. Kay spent considerable time in Iraq during the seven IAEA inspections of Iraqi facilities between May 14-23, 1991 and October 11-21, 1991. He says (1) it is overwhelmingly clear that Iraq had a clandestine nuclear weapons program of considerable breadth; and (2) there is a very high probability that Iraq is still withholding information from the inspection effort of the IAEA. He concludes that IAEA, with firm backing of the U.N. Security Council and a minimum of constraints, has a substantial proven capacity to carry out inspections. Dr. Blix reviews briefly the history of the IAEA inspection effort, starting with the 1950s' Atoms for Peace Program. He emphasizes that the one factor that enabled IAEA inspectors to find out in 5 months in Iraq what had not been uncovered in 10 years was intelligence information; further, IAEA will make special efforts in the future to obtain such intelligence information.

  11. Probability surveys as an approach for assessing zooplankton community and biomass trends in Lake Superior

    EPA Science Inventory

    Freshwater ecosystems harbor a rich diversity of species and habitats and also provide critical resources to people. The condition of these ecosystems can be degraded by numerous environmental stressors, such as increases in pollution, habitat alteration, introduction of invasive...

  12. Probability of Survival Decision Aid (PSDA)

    DTIC Science & Technology

    2008-03-01

    and weight numeric values were based on the survey data for the U.S. population in the National Health and Nutrition Examination Surveys (NHANES... Study of Channel Swimming. Clin. Sci. 19: 257, 1960. 21. Romet, T. T., C. J. Brooks, S. M. Fairburn, and P. Potter. Immersed clo insulation in

  13. Simulation study of traffic car accidents at a single lane roundabout

    NASA Astrophysics Data System (ADS)

    Echab, H.; Lakouari, N.; Ez-Zahraouy, H.; Benyoussef, A.

    2016-07-01

    In this paper, using the Nagel-Schreckenberg model, we numerically investigate the probability Pac of entering/circulating car accidents occurring at a single-lane roundabout under expanded open boundary conditions. The roundabout consists of N on-ramps (and, respectively, off-ramps). The boundary is controlled by the injecting rates α1, α2 and the extracting rate β. The simulation results show that, depending on the injecting rates, car accidents are more likely to happen when the capacity of the rotary is at its maximum. Moreover, we find that large values of the rotary size L and of the preferential exit probability Pexit improve safety and reduce accidents. However, the use of indicators and increases in β and/or N raise the car accident probability.
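The accident statistics above are driven by the core Nagel-Schreckenberg update rules. The following is a minimal sketch of those rules on a closed ring, with no ramps, accidents, or open boundaries; all names and parameter values here are illustrative and not the paper's setup.

```python
import random

def nasch_step(pos, vel, L, vmax=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg cellular automaton
    on a ring of L cells. pos/vel list the cars' cells and speeds in
    cyclic order; returns the updated lists."""
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % L   # empty cells ahead
        v = min(vel[i] + 1, vmax)                   # 1. accelerate
        v = min(v, gap)                             # 2. brake to keep the gap
        if v > 0 and rng.random() < p_slow:         # 3. random slowdown
            v -= 1
        new_vel.append(v)
    new_pos = [(pos[i] + new_vel[i]) % L for i in range(n)]
    return new_pos, new_vel

# short demo on a 20-cell ring with four cars
random.seed(1)
pos, vel = [0, 5, 10, 15], [0, 0, 0, 0]
for _ in range(200):
    pos, vel = nasch_step(pos, vel, L=20)
```

Because step 2 never lets a car move further than its forward gap, the parallel update can never produce a collision; the paper's model adds the accident probability on top of such dynamics.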

  14. Linking of uniform random polygons in confined spaces

    NASA Astrophysics Data System (ADS)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Karadayi, E.; Saito, M.

    2007-03-01

    In this paper, we study the topological entanglement of uniform random polygons in a confined space. We derive the formula for the mean squared linking number of such polygons. For a fixed simple closed curve in the confined space, we rigorously show that the linking probability between this curve and a uniform random polygon of n vertices is at least 1 − O(1/√n). Our numerical study also indicates that the linking probability between two uniform random polygons (in a confined space), of m and n vertices respectively, is bounded below by 1 − O(1/√(mn)). In particular, the linking probability between two uniform random polygons, both of n vertices, is bounded below by 1 − O(1/n).

  15. Persistence and extinction for a class of stochastic SIS epidemic models with nonlinear incidence rate

    NASA Astrophysics Data System (ADS)

    Teng, Zhidong; Wang, Lei

    2016-06-01

    In this paper, a class of stochastic SIS epidemic models with nonlinear incidence rate is investigated. It is shown that the extinction and persistence of the disease in probability are determined by a threshold value R̃0: if R̃0 < 1 and an additional condition holds, the disease dies out, and if R̃0 > 1 the disease is weakly permanent with probability one. To obtain the permanence in the mean of the disease, a new quantity R̂0 is introduced, and it is proved that if R̂0 > 1 the disease is permanent in the mean with probability one. Furthermore, numerical simulations are presented to illustrate some open problems given in Remarks 1-3 and 5 of this paper.
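As a hedged illustration of the threshold behaviour described above, the sketch below integrates a generic stochastic SIS model with bilinear incidence by Euler-Maruyama; the paper's model uses a nonlinear incidence rate and a noise-corrected threshold, neither of which this toy reproduces. Here the deterministic threshold is simply β/γ.

```python
import math, random

def simulate_sis(beta, gamma, sigma, i0=0.1, T=50.0, dt=0.01, seed=0):
    """Euler-Maruyama path for the infected fraction I of a generic
    stochastic SIS model (an illustrative stand-in, not the paper's
    nonlinear-incidence model):
        dI = I*(beta*(1-I) - gamma)*dt + sigma*I*(1-I)*dW
    Returns I at time T."""
    rng = random.Random(seed)
    i = i0
    for _ in range(int(T / dt)):
        dw = rng.gauss(0.0, math.sqrt(dt))
        i += i * (beta * (1 - i) - gamma) * dt + sigma * i * (1 - i) * dw
        i = min(max(i, 0.0), 1.0)  # clamp to the feasible interval [0, 1]
    return i

# below the threshold beta/gamma the disease dies out; above it, it persists
i_sub = simulate_sis(beta=0.5, gamma=1.0, sigma=0.1)
i_super = simulate_sis(beta=2.0, gamma=1.0, sigma=0.1)
```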

  16. Performance of cellular frequency-hopped spread-spectrum radio networks

    NASA Astrophysics Data System (ADS)

    Gluck, Jeffrey W.; Geraniotis, Evaggelos

    1989-10-01

    Multiple access interference is characterized for cellular mobile networks, in which users are assumed to be Poisson-distributed in the plane and employ frequency-hopped spread-spectrum signaling with transmitter-oriented assignment of frequency-hopping patterns. Exact expressions for the bit error probabilities are derived for binary coherently demodulated systems without coding. Approximations for the packet error probability are derived for coherent and noncoherent systems and these approximations are applied when forward-error-control coding is employed. In all cases, the effects of varying interference power are accurately taken into account according to some propagation law. Numerical results are given in terms of bit error probability for the exact case and throughput for the approximate analyses. Comparisons are made with previously derived bounds and it is shown that these tend to be very pessimistic.
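For context, the textbook starting point for such analyses is the bit error probability of coherent BPSK in additive white Gaussian noise; the paper's exact expressions additionally average over multiple-access interference hits, which this sketch does not attempt.

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def bpsk_bit_error_prob(eb_n0_db):
    """Textbook bit error probability of coherent BPSK in AWGN,
    P_b = Q(sqrt(2 Eb/N0)), with Eb/N0 given in dB. The cited paper's
    multiple-access expressions reduce to this in the interference-free
    case but are otherwise more involved."""
    eb_n0 = 10.0 ** (eb_n0_db / 10.0)
    return q_function(math.sqrt(2.0 * eb_n0))

ber_0db = bpsk_bit_error_prob(0.0)    # about 7.9e-2
ber_10db = bpsk_bit_error_prob(10.0)  # orders of magnitude smaller
```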

  17. A Semi-Analytical Method for the PDFs of A Ship Rolling in Random Oblique Waves

    NASA Astrophysics Data System (ADS)

    Liu, Li-qin; Liu, Ya-liu; Xu, Wan-hai; Li, Yan; Tang, You-gang

    2018-03-01

    The PDFs (probability density functions) and probability of a ship rolling under the random parametric and forced excitations were studied by a semi-analytical method. The rolling motion equation of the ship in random oblique waves was established. The righting arm obtained by the numerical simulation was approximately fitted by an analytical function. The irregular waves were decomposed into two Gauss stationary random processes, and the CARMA (2, 1) model was used to fit the spectral density function of parametric and forced excitations. The stochastic energy envelope averaging method was used to solve the PDFs and the probability. The validity of the semi-analytical method was verified by the Monte Carlo method. The C11 ship was taken as an example, and the influences of the system parameters on the PDFs and probability were analyzed. The results show that the probability of ship rolling is affected by the characteristic wave height, wave length, and the heading angle. In order to provide proper advice for the ship's manoeuvring, the parametric excitations should be considered appropriately when the ship navigates in the oblique seas.

  18. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    PubMed

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria that are suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized under mixed time-delays. Synchronization under the considered impulsive effects is achieved even when the transition probabilities are only partly known. Besides, a multiple integral approach is also proposed to strengthen the results for Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed to handle the coupled neural network with mixed delays, and the impulsive synchronization criteria are then expressed as a set of solvable linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Bending instability in galactic discs: advocacy of the linear theory

    NASA Astrophysics Data System (ADS)

    Rodionov, S. A.; Sotnikova, N. Ya.

    2013-09-01

    We demonstrate that in N-body simulations of isolated disc galaxies there is numerical vertical heating which slowly increases the vertical velocity dispersion and the disc thickness. Even for models with over a million particles in a disc, this heating can be significant. This effect is the same as in the numerical experiments by Sellwood. We also show that in a stellar disc, outside a boxy/peanut bulge if one is present, the saturation level of the bending instability is rather close to the value predicted by the linear theory. We note that the bending instability develops and decays very fast, so it cannot play any role in secular vertical heating. However, the bending instability defines the minimal value of the ratio between the vertical and radial velocity dispersions σz/σR ≈ 0.3 (and so, indirectly, the minimal thickness) which stellar discs in real galaxies may have. We demonstrate that observations confirm the last statement.

  20. A mean spherical model for soft potentials: The hard core revealed as a perturbation

    NASA Technical Reports Server (NTRS)

    Rosenfeld, Y.; Ashcroft, N. W.

    1978-01-01

    The mean spherical approximation for fluids is extended to treat the case of dense systems interacting via soft-potentials. The extension takes the form of a generalized statement concerning the behavior of the direct correlation function c(r) and radial distribution g(r). From a detailed analysis that views the hard core portion of a potential as a perturbation on the whole, a specific model is proposed which possesses analytic solutions for both Coulomb and Yukawa potentials, in addition to certain other remarkable properties. A variational principle for the model leads to a relatively simple method for obtaining numerical solutions.

  1. Proceedings of The 1980 Army Numerical Analysis and Computers Conference (17th) Held at Moffett Field, California on 20-21 February 1980.

    DTIC Science & Technology

    1980-08-01

    relationship would be the solution of Z{G(s) F(s)} … u(nT) g(t) u(t) f(nT−t) u(nT−t) dt … (12) The mean value theorem of the integral … (4.16) corresponds to a positive eigenvalue λ of (2.12) and conversely via the relationships (4.17a)-(4.17b) … Because a is a monotone increasing … statements show how the time-dependent displacements for any location of the foundation can be found; this information is used with relationships for

  2. ESDAPT - APT PROGRAMMING EDITOR AND INTERPRETER

    NASA Technical Reports Server (NTRS)

    Premack, T.

    1994-01-01

    ESDAPT is a graphical programming environment for developing APT (Automatically Programmed Tool) programs for controlling numerically controlled machine tools. ESDAPT has a graphical user interface that provides the user with an APT syntax-sensitive text editor and windows for displaying geometry and tool paths. APT geometry statements can also be created using menus and screen picks. ESDAPT interprets APT geometry statements and displays the results in its view windows. Tool paths are generated by batching the APT source to an APT processor (COSMIC P-APT recommended). The tool paths are then displayed in the view windows. Hardcopy output of the view windows is in color PostScript format. ESDAPT is written in C-language, yacc, lex, and XView for use on Sun4 series computers running SunOS. ESDAPT requires 4Mb of disk space, 7Mb of RAM, and MIT's X Window System, Version 11 Release 4, or OpenWindows version 3 for execution. Program documentation in PostScript format and an executable for OpenWindows version 3 are provided on the distribution media. The standard distribution medium for ESDAPT is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. This program was developed in 1992.

  3. The possibility of a universal declaration of biomedical ethics

    PubMed Central

    Hedayat, K M

    2007-01-01

    Statements on issues in biomedical ethics, purporting to represent international interests, have been put forth by numerous groups. Most of these groups are composed of thinkers in the tradition of European secularism, and do not take into account the values of other ethical systems. One fifth of the world's population is accounted for by Islam, which is a universal religion, with more than 1400 years of scholarship. Although many values are held in common by secular ethical systems and Islam, their inferences are different. The question, “Is it possible to derive a truly universal declaration of biomedical ethics?” is discussed here by examining the value and extent of personal autonomy in Western and Islamic biomedical ethical constructs. These constructs are then tested vis‐à‐vis the issue of abortion. It is concluded that having a universal declaration of biomedical ethics in practice is not possible, although there are many conceptual similarities and agreements between secular and Islamic value systems, unless a radical paradigm shift occurs in segments of the world's deliberative bodies. The appellation “universal” should not be used on deliberative statements unless the ethical values of all major schools of thought are satisfied. PMID:17209104

  4. The level and determinants of mission statement use: a questionnaire survey.

    PubMed

    Desmidt, Sebastian; Prinzie, Anita; Heene, Aimé

    2008-10-01

    Although mission statements are one of the most popular management instruments, little is known about the nature and direction of the presumed relationship between mission statements and organizational performance. In particular, empirical insights into the degree of mission statement use by individual organizational members are insufficient. We address the observed knowledge gap by (a) measuring the level of mission statement use (e.g., explaining the mission statement, making linkages to extant programs or practices, communicating enthusiasm, and adapting the mission statement to the personal work situation) by individual organizational members, and (b) identifying the antecedents that influence mission statement use. Questionnaires were used to collect data from a sample of 510 nurses from three Flemish hospitals. Mission statement use was measured by means of Fairhurst's Management of Meaning Scale. Antecedents of mission statement use were derived from the Theory of Planned Behavior and the mission statement literature. The findings indicate that mission statement use is low on average. Attitude, subjective norm, perceived behavioral control, and formal involvement in mission statement communication proved to be significant determinants of mission statement use and accounted for 43% of the variance. The results of the conducted regression analyses indicate that nurses (a) who have a positive attitude towards the mission statement, (b) who perceive pressure from superiors and colleagues to use the mission statement, (c) who feel they are in control of performing such behavior, and (d) who are formally involved in the mission statement communication processes are more likely to use the mission statement. Furthermore, the results indicated that demographic characteristics are not associated with mission statement use. 
To effectively increase mission statement use, investments should focus on redesigning a work environment that stresses the importance of the organizational mission statement and provides detailed information on the ways that individual organizational members can contribute in realizing the mission statement.

  5. Comparing the basins of attraction for several methods in the circular Sitnikov problem with spheroid primaries

    NASA Astrophysics Data System (ADS)

    Zotos, Euaggelos E.

    2018-06-01

    The circular Sitnikov problem, where the two primary bodies are prolate or oblate spheroids, is numerically investigated. In particular, the basins of convergence on the complex plane are revealed by using a large collection of numerical methods of several orders. We consider four cases, regarding the value of the oblateness coefficient which determines the nature of the roots (attractors) of the system. For all cases we use the iterative schemes for performing a thorough and systematic classification of the nodes on the complex plane. The distributions of the required iterations and of the probability, together with their correlations with the corresponding basins of convergence, are also discussed. Our numerical computations indicate that most of the iterative schemes provide relatively similar convergence structures on the complex plane. However, there are some numerical methods for which the corresponding basins of attraction are extremely complicated, with highly fractal basin boundaries. Moreover, it is proved that the efficiency varies strongly between the numerical methods.

  6. Numerical computation of solar neutrino flux attenuated by the MSW mechanism

    NASA Astrophysics Data System (ADS)

    Kim, Jai Sam; Chae, Yoon Sang; Kim, Jung Dae

    1999-07-01

    We compute the survival probability of an electron neutrino in its flight through the solar core experiencing the Mikheyev-Smirnov-Wolfenstein effect, with all three neutrino species considered. We adopted a hybrid method that uses an accurate approximation formula in the non-resonance region and numerical integration in the non-adiabatic resonance region. The key to our algorithm is to use importance sampling for the neutrino creation energy and position, and to find the optimum radii at which to start and stop the numerical integration. We further developed a parallel algorithm for a message-passing parallel computer. By using the idea of a job token, we have developed a dynamic load-balancing mechanism which is effective under any irregular load distribution.
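The importance-sampling idea mentioned above can be sketched generically: draw from a proposal density concentrated where the integrand matters, and reweight each sample by the ratio of target to proposal density. This toy example (all names and densities are mine, not the solar-model sampler) estimates a mean under an exponential target.

```python
import math, random

def importance_sample_mean(f, target_pdf, proposal_pdf, draw_proposal, n, rng):
    """Estimate E_p[f(X)] by drawing X from a proposal density q and
    reweighting each sample by p(x)/q(x). Generic sketch of the
    importance-sampling technique named in the abstract."""
    acc = 0.0
    for _ in range(n):
        x = draw_proposal(rng)
        acc += f(x) * target_pdf(x) / proposal_pdf(x)
    return acc / n

# demo: E[X] = 1 for X ~ Exp(1), sampled through a wider Exp(0.5) proposal
rng = random.Random(42)
est = importance_sample_mean(
    f=lambda x: x,
    target_pdf=lambda x: math.exp(-x),
    proposal_pdf=lambda x: 0.5 * math.exp(-0.5 * x),
    draw_proposal=lambda r: r.expovariate(0.5),
    n=20000,
    rng=rng,
)
```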

  7. Modelling chemical reactions in dc plasma inside oxygen bubbles in water

    NASA Astrophysics Data System (ADS)

    Takeuchi, N.; Ishii, Y.; Yasuoka, K.

    2012-02-01

    Plasmas generated inside oxygen bubbles in water have been developed for water purification. Zero-dimensional numerical simulations were used to investigate the chemical reactions in plasmas driven by dc voltage. The numerical and experimental results of the concentrations of hydrogen peroxide and ozone in the solution were compared with a discharge current between 1 and 7 mA. Upon increasing the water vapour concentration inside bubbles, we saw from the numerical results that the concentration of hydrogen peroxide increased with discharge current, whereas the concentration of ozone decreased. This finding agreed with the experimental results. With an increase in the discharge current, the heat flux from the plasma to the solution increased, and a large amount of water was probably vaporized into the bubbles.

  8. Clients' interpretation of risks provided in genetic counseling.

    PubMed Central

    Wertz, D C; Sorenson, J R; Heeren, T C

    1986-01-01

    Clients in 544 genetic counseling sessions who were given numeric risks of having a child with a birth defect between 0% and 50% were asked to interpret these numeric risks on a five-point scale, ranging from very low to very high. Whereas clients' modal interpretation varied directly with numeric risks between 0% and 15%, the modal category of client risk interpretation remained "moderate" at risks between 15% and 50%. Uncertainty about normalcy of the next child increased as numeric risk increased, and few clients were willing to indicate that the child would probably or definitely be affected, regardless of the numeric risk. Characteristics associated with clients' "pessimistic" interpretations of risk, identified by stepwise linear regression, included increased numeric risk, in-depth discussion during the counseling session of whether they would have a child, having a living affected child, discussion of the effects of an affected child on relationships with the client's other children, and seriousness of the disorder in question (causing intellectual impairment). Client interpretations are discussed in terms of recent developments in cognitive theory, including heuristics that influence judgments about risks, and implications for genetic counseling. PMID:3752089

  9. Multistage variable probability forest volume inventory. [the Defiance Unit of the Navajo Nation

    NASA Technical Reports Server (NTRS)

    Anderson, J. E. (Principal Investigator)

    1979-01-01

    An inventory scheme based on the use of computer processed LANDSAT MSS data was developed. Output from the inventory scheme provides an estimate of the standing net saw timber volume of a major timber species on a selected forested area of the Navajo Nation. Such estimates are based on the values of parameters currently used for scaled sawlog conversion to mill output. The multistage variable probability sampling appears capable of producing estimates which compare favorably with those produced using conventional techniques. In addition, the reduction in time, manpower, and overall costs lend it to numerous applications.
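Variable probability (probability-proportional-to-size) sampling can be sketched with the classical Hansen-Hurwitz estimator; the code below is a generic single-stage illustration under assumed unit sizes and values, not the multistage LANDSAT design itself.

```python
import random

def pps_estimate_total(y, sizes, n_draws, rng):
    """Hansen-Hurwitz estimator of a population total under
    probability-proportional-to-size sampling with replacement:
    unit i is drawn with p_i = size_i / sum(sizes) and each drawn
    value y_i is inflated by 1/p_i before averaging."""
    total_size = float(sum(sizes))
    probs = [s / total_size for s in sizes]
    draws = rng.choices(range(len(y)), weights=sizes, k=n_draws)
    return sum(y[i] / probs[i] for i in draws) / n_draws

# demo: when y_i is exactly proportional to size_i, every inflated draw
# equals the true total, so the estimate is exact with zero variance
rng = random.Random(0)
sizes = [1.0, 2.0, 3.0, 4.0]
volumes = [2.0, 4.0, 6.0, 8.0]   # hypothetical y_i = 2 * size_i
est = pps_estimate_total(volumes, sizes, n_draws=10, rng=rng)
```

This zero-variance property when size is a good proxy for the quantity of interest is exactly why PPS designs can rival conventional inventories at lower cost.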

  10. Drug-associated pancreatitis: facts and fiction.

    PubMed

    Rünzi, M; Layer, P

    1996-07-01

    In the past, numerous reports on drugs probably causing acute pancreatitis have been published. However, most of these case reports were anecdotal, lacked obvious evidence, and did not present a comprehensive summary. Although drug-associated pancreatitis is rare, it is gaining increasing importance with the introduction of several potent new agents, i.e., anti-acquired immunodeficiency syndrome drugs. The following comprehensive review scrutinizes the evidence present in the world literature on drugs associated with acute or chronic pancreatitis and, based on this, categorizes each as having definite, probable, or possible causality. In addition, explanations for the pathophysiological mechanisms are discussed.

  11. Continuation of probability density functions using a generalized Lyapunov approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baars, S., E-mail: s.baars@rug.nl; Viebahn, J.P., E-mail: viebahn@cwi.nl; Mulder, T.E., E-mail: t.e.mulder@uu.nl

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
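Under the small-noise approximation, the stationary covariance of the linearized dynamics near a stable fixed point solves a Lyapunov equation. The dense sketch below illustrates that step on a tiny system; the paper's contribution is a low-rank iterative solver for a generalized Lyapunov equation, which this toy does not reproduce.

```python
import numpy as np

def lyapunov_covariance(A, B):
    """Stationary covariance C of the linear SDE dx = A x dt + B dW
    near a stable fixed point, solving A C + C A^T + B B^T = 0
    densely via the Kronecker identity
        vec(A C + C A^T) = (I (x) A + A (x) I) vec(C).
    Only viable for small n; large problems need iterative
    (e.g. low-rank) Lyapunov solvers."""
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(I, A) + np.kron(A, I)
    c = np.linalg.solve(K, -(B @ B.T).reshape(-1))
    return c.reshape(n, n)

# demo on a stable 2x2 drift matrix with unit noise (illustrative values)
A = np.array([[-1.0, 0.0], [0.5, -2.0]])
B = np.eye(2)
C = lyapunov_covariance(A, B)
```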

  12. Methods for Combining Payload Parameter Variations with Input Environment

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.; Straayer, J. W.

    1975-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the methods are also presented.
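As a hedged sketch of the extreme-value step, the code below fits a Gumbel distribution to the largest of n Gaussian load peaks using the classical asymptotic constants, then reads off a design limit load at a chosen non-exceedance probability. The constants follow textbook extreme-value theory, not the report's own derivation from mission load simulations.

```python
import math

def gumbel_params_from_peaks(mu_peak, sigma_peak, n_peaks):
    """Approximate Gumbel (location, scale) for the maximum of n_peaks
    i.i.d. Gaussian load peaks, via the classical normalizing constants
    a_n = sqrt(2 ln n),
    b_n = a_n - (ln ln n + ln 4*pi) / (2 a_n).
    Illustrative only."""
    an = math.sqrt(2.0 * math.log(n_peaks))
    bn = an - (math.log(math.log(n_peaks)) + math.log(4.0 * math.pi)) / (2.0 * an)
    return mu_peak + sigma_peak * bn, sigma_peak / an  # (location, scale)

def design_limit_load(location, scale, p_non_exceed):
    """Gumbel quantile: the load exceeded with probability 1 - p_non_exceed,
    from F(x) = exp(-exp(-(x - location)/scale))."""
    return location - scale * math.log(-math.log(p_non_exceed))

# demo with standardized peaks (mu = 0, sigma = 1) and 1000 peaks per mission
loc, scale = gumbel_params_from_peaks(0.0, 1.0, 1000)
load_900 = design_limit_load(loc, scale, 0.900)
load_999 = design_limit_load(loc, scale, 0.999)
```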

  13. Existence, uniqueness and regularity of a time-periodic probability density distribution arising in a sedimentation-diffusion problem

    NASA Technical Reports Server (NTRS)

    Nitsche, Ludwig C.; Nitsche, Johannes M.; Brenner, Howard

    1988-01-01

    The sedimentation and diffusion of a nonneutrally buoyant Brownian particle in vertical fluid-filled cylinder of finite length which is instantaneously inverted at regular intervals are investigated analytically. A one-dimensional convective-diffusive equation is derived to describe the temporal and spatial evolution of the probability density; a periodicity condition is formulated; the applicability of Fredholm theory is established; and the parameter-space regions are determined within which the existence and uniqueness of solutions are guaranteed. Numerical results for sample problems are presented graphically and briefly characterized.

  14. Homogeneous quantum electrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1992-01-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.

  15. Numerical detection of the Gardner transition in a mean-field glass former.

    PubMed

    Charbonneau, Patrick; Jin, Yuliang; Parisi, Giorgio; Rainone, Corrado; Seoane, Beatriz; Zamponi, Francesco

    2015-07-01

    Recent theoretical advances predict the existence, deep into the glass phase, of a novel phase transition, the so-called Gardner transition. This transition is associated with the emergence of a complex free energy landscape composed of many marginally stable sub-basins within a glass metabasin. In this study, we explore several methods to detect numerically the Gardner transition in a simple structural glass former, the infinite-range Mari-Kurchan model. The transition point is robustly located from three independent approaches: (i) the divergence of the characteristic relaxation time, (ii) the divergence of the caging susceptibility, and (iii) the abnormal tail in the probability distribution function of cage order parameters. We show that the numerical results are fully consistent with the theoretical expectation. The methods we propose may also be generalized to more realistic numerical models as well as to experimental systems.

  16. From stochastic processes to numerical methods: A new scheme for solving reaction subdiffusion fractional partial differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angstmann, C.N.; Donnelly, I.C.; Henry, B.I., E-mail: B.Henry@unsw.edu.au

    We have introduced a new explicit numerical method, based on a discrete stochastic process, for solving a class of fractional partial differential equations that model reaction subdiffusion. The scheme is derived from the master equations for the evolution of the probability density of a sum of discrete time random walks. We show that the diffusion limit of the master equations recovers the fractional partial differential equation of interest. This limiting procedure guarantees the consistency of the numerical scheme. The positivity of the solution and stability results are simply obtained, provided that the underlying process is well posed. We also show that the method can be applied to standard reaction–diffusion equations. This work highlights the broader applicability of using discrete stochastic processes to provide numerical schemes for partial differential equations, including fractional partial differential equations.
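The kind of stochastic process underlying such subdiffusion equations can be simulated directly: a random walk with heavy-tailed waiting times spreads sublinearly in time, with mean-squared displacement growing roughly like t**alpha. This is an illustrative simulation of the process, not the paper's numerical scheme.

```python
import random

def ctrw_final_positions(t_max, alpha=0.7, n_walkers=2000, seed=0):
    """Continuous-time random walk with unit +/-1 jumps and Pareto
    waiting times P(tau > t) ~ t^(-alpha), 0 < alpha < 1. In the
    diffusion limit this process yields a subdiffusion equation.
    Returns each walker's position at time t_max."""
    rng = random.Random(seed)
    xs = []
    for _ in range(n_walkers):
        t, x = 0.0, 0
        while True:
            # Pareto(alpha) waiting time with tau >= 1, by inverse transform
            tau = (1.0 - rng.random()) ** (-1.0 / alpha)
            if t + tau > t_max:
                break                      # still waiting when we stop the clock
            t += tau
            x += rng.choice((-1, 1))
        xs.append(x)
    return xs

def msd(xs):
    return sum(x * x for x in xs) / len(xs)

# mean-squared displacement grows sublinearly, roughly like t**alpha
msd_100 = msd(ctrw_final_positions(100.0))
msd_1000 = msd(ctrw_final_positions(1000.0))
```

A tenfold increase in time gives well under a tenfold increase in mean-squared displacement, which is the signature of subdiffusion (normal diffusion would give a tenfold increase).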

  17. Building Resiliency to Childhood Trauma through Arts-Based Learning

    ERIC Educational Resources Information Center

    Smilan, Cathy

    2009-01-01

    Natural disasters are among the numerous events known to have a significant probability of producing trauma in school-age children, given the critical mental, physical, social, and emotional development that occurs during childhood. Studies involving children who have experienced natural disasters point to a significant increase in psychological…

  18. Color Your Classroom II. A Math Curriculum Guide.

    ERIC Educational Resources Information Center

    Mississippi State Dept. of Education, Jackson.

    This math curriculum guide, correlated with the numerical coding of the Math Skills List published by the Migrant Student Record Transfer System, covers 10 learning areas: readiness, number meaning, whole numbers, fractions, decimals, percent, measurement, geometry, probability and statistics, and sets. Each exercise is illustrated by a large…

  19. Numerical Estimation of Information Theoretic Measures for Large Data Sets

    DTIC Science & Technology

    2013-01-30

    probability including a new indifference rule,” J. Inst. of Actuaries Students’ Soc. 73, 285–334 (1947). 7. M. Hutter and M. Zaffalon, “Distribution...Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, Dover Publications, New York (1972). 13. K.B. Oldham et al., An Atlas

  20. Optimal search strategies of space-time coupled random walkers with finite lifetimes

    NASA Astrophysics Data System (ADS)

    Campos, D.; Abad, E.; Méndez, V.; Yuste, S. B.; Lindenberg, K.

    2015-05-01

    We present a simple paradigm for detection of an immobile target by a space-time coupled random walker with a finite lifetime. The motion of the walker is characterized by linear displacements at a fixed speed and exponentially distributed duration, interrupted by random changes in the direction of motion and resumption of motion in the new direction with the same speed. We call these walkers "mortal creepers." A mortal creeper may die at any time during its motion according to an exponential decay law characterized by a finite mean death rate ωm. While still alive, the creeper has a finite mean frequency ω of change of the direction of motion. In particular, we consider the efficiency of the target search process, characterized by the probability that the creeper will eventually detect the target. Analytic results, confirmed numerically, show that there is an ωm-dependent optimal frequency ω = ωopt that maximizes the probability of eventual target detection. We work primarily in one-dimensional (d = 1) domains and examine the role of initial conditions and of finite domain sizes. Numerical results in d = 2 domains confirm the existence of an optimal frequency of change of direction, thereby suggesting that the observed effects are robust to changes in dimensionality. In the d = 1 case, explicit expressions for the probability of target detection in the long time limit are given. In the case of an infinite domain, we compute the detection probability for arbitrary times and study its early- and late-time behavior. We further consider the survival probability of the target in the presence of many independent creepers beginning their motion at the same location and at the same time. We also consider a version of the standard "target problem" in which many creepers start at random locations at the same time.
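The eventual-detection probability described above is easy to estimate by direct simulation. The sketch below is a minimal Monte Carlo in d = 1, not the authors' analytic treatment; the target distance `L`, speed `v`, and rates are illustrative values. A useful sanity check is the bound P_detect ≤ exp(-ωm·L/v), since the creeper must survive at least the ballistic travel time L/v.

```python
import numpy as np

rng = np.random.default_rng(2)

def detection_probability(omega, omega_m, L=5.0, v=1.0, n_trials=20000):
    """Monte Carlo estimate of the probability that a mortal creeper,
    starting at the origin with exponentially distributed run times
    (turning rate omega) and death rate omega_m, ever reaches a target
    at x = L. Parameter values are illustrative."""
    hits = 0
    for _ in range(n_trials):
        x = 0.0
        direction = rng.choice([-1.0, 1.0])
        t_death = rng.exponential(1.0 / omega_m)
        t = 0.0
        while t < t_death:
            run = min(rng.exponential(1.0 / omega), t_death - t)
            x_new = x + direction * v * run
            if x < L <= x_new:  # this run crosses the target
                hits += 1
                break
            x, t = x_new, t + run
            direction = rng.choice([-1.0, 1.0])  # new random direction
    return hits / n_trials

p = detection_probability(omega=0.5, omega_m=0.1)
print(f"detection probability ~ {p:.3f} (bound: {np.exp(-0.1 * 5.0):.3f})")
```

Scanning `omega` while holding `omega_m` fixed would reproduce the nonmonotonic dependence the abstract describes, with a maximum at some ωopt.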
