Science.gov

Sample records for uncertainty principle

  1. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    NASA Astrophysics Data System (ADS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-03-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical inequalities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that lend themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found.
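
    As an aside for readers, the "diverse measures" mentioned above include entropic ones. Below is a minimal sketch, assuming the standard Maassen-Uffink entropic relation for a qubit measured in two mutually unbiased bases; it illustrates the kind of measure-specific relation this paper seeks to unify, not the authors' new measures. The state and angle are arbitrary illustrative choices.

    ```python
    import numpy as np

    def shannon_bits(p):
        """Shannon entropy in bits, skipping zero-probability outcomes."""
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log2(p)))

    # Arbitrary pure qubit state (illustrative).
    theta = 0.3
    psi = np.array([np.cos(theta), np.sin(theta)])

    # Outcome probabilities in the computational (Z) and Hadamard (X) bases.
    pZ = np.abs(psi) ** 2
    Hd = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    pX = np.abs(Hd @ psi) ** 2

    # Maassen-Uffink: H(Z) + H(X) >= -log2 max_{i,j} |<z_i|x_j>|^2 = 1 bit here.
    total = shannon_bits(pZ) + shannon_bits(pX)
    print(f"H(Z) + H(X) = {total:.3f} bits  (bound: 1 bit)")
    assert total >= 1.0 - 1e-9
    ```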

  2. Semiclassical localization and uncertainty principle

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.; Ferri, G. L.; Olivares, F.

    2008-07-01

    Semiclassical localizability problems in phase space constitute a manifestation of the uncertainty principle, one of the cornerstones of our present understanding of Nature. We revisit the subject here within the framework of the celebrated semiclassical Husimi distributions and their associated Wehrl entropy. By recourse to escort distributions, a well-established statistical tool, we show that it is possible to significantly improve on the current phase-space classical-localization power, thus approaching more closely than before the bounds imposed by Husimi's thermal uncertainty relation.
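
    A quick sketch of the escort-distribution device used above: raising a distribution to a power q and renormalizing, P_q(i) = p_i^q / sum_j p_j^q, sharpens it for q > 1 (better localization) and flattens it for q < 1. The weights and q values below are arbitrary illustrative choices.

    ```python
    import numpy as np

    def escort(p, q):
        """Escort distribution P_q(i) = p_i**q / sum_j p_j**q."""
        w = p ** q
        return w / w.sum()

    def shannon(p):
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)))

    p = np.array([0.5, 0.25, 0.15, 0.10])        # illustrative weights
    for q in (0.5, 1.0, 2.0, 4.0):
        e = escort(p, q)
        print(f"q = {q}:  {np.round(e, 3)}  entropy = {shannon(e):.3f}")
    # q > 1 sharpens the distribution (lower entropy, better localization);
    # q < 1 flattens it; q = 1 returns p unchanged.
    ```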

  3. Extended uncertainty from first principles

    NASA Astrophysics Data System (ADS)

    Costa Filho, Raimundo N.; Braga, João P. M.; Lira, Jorge H. S.; Andrade, José S.

    2016-04-01

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arising from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.

  4. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Michael S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
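
    For context, a standard textbook-style derivation (not necessarily the discrete-eigenvalue model of this record) of the minimal length implied by a one-parameter GUP, assuming a state with vanishing mean momentum so that the momentum variance bounds the second moment:

    ```latex
    % Robertson bound for the deformed commutator (assuming <p> = 0):
    \Delta x\,\Delta p \;\ge\; \tfrac{1}{2}\bigl|\langle[\hat{x},\hat{p}]\rangle\bigr|
      \;=\; \frac{\hbar}{2}\bigl(1+\beta\langle\hat{p}^{2}\rangle\bigr)
      \;\ge\; \frac{\hbar}{2}\bigl(1+\beta(\Delta p)^{2}\bigr).
    % Dividing by \Delta p and minimizing the right-hand side over \Delta p:
    \Delta x \;\ge\; \frac{\hbar}{2}\Bigl(\frac{1}{\Delta p}+\beta\,\Delta p\Bigr)
      \;\ge\; \hbar\sqrt{\beta},
    \qquad \text{with the minimum attained at } \Delta p = 1/\sqrt{\beta}.
    ```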

  5. Gamma-Ray Telescope and Uncertainty Principle

    ERIC Educational Resources Information Center

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  6. Disturbance, the uncertainty principle and quantum optics

    NASA Technical Reports Server (NTRS)

    Martens, Hans; Demuynck, Willem M.

    1993-01-01

    It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and the result is applied to a quantum non-demolition measurement using the optical Kerr effect.

  7. Curriculum in Art Education: The Uncertainty Principle.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1989-01-01

    Identifies curriculum as the pivotal link between theory and practice, noting that all stages of curriculum research and development are characterized by elements of uncertainty. States that this uncertainty principle reflects the reality of practice as it mirrors the contradictory nature of art, the pluralism of schools and society, and the…

  8. Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.

    ERIC Educational Resources Information Center

    McKerrow, K. Kelly; McKerrow, Joan E.

    1991-01-01

    The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)

  9. An uncertainty principle for unimodular quantum groups

    SciTech Connect

    Crann, Jason; Kalantar, Mehrdad

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
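
    A minimal numeric check of the finite-abelian-group case behind Hirschman-type bounds, assuming the unitary discrete Fourier transform on Z_N, for which the Maassen-Uffink argument gives H(x) + H(xi) >= log N; the test vector is a random illustrative choice.

    ```python
    import numpy as np

    def shannon_nats(p):
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)))

    rng = np.random.default_rng(0)
    N = 16
    v = rng.normal(size=N) + 1j * rng.normal(size=N)
    v /= np.linalg.norm(v)               # unit vector on Z_N
    w = np.fft.fft(v) / np.sqrt(N)       # unitary DFT, also unit norm

    H_sum = shannon_nats(np.abs(v) ** 2) + shannon_nats(np.abs(w) ** 2)
    print(f"H(x) + H(xi) = {H_sum:.3f}  >=  log N = {np.log(N):.3f}")
    assert H_sum >= np.log(N) - 1e-9
    ```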

  10. A Principle of Uncertainty for Information Seeking.

    ERIC Educational Resources Information Center

    Kuhlthau, Carol C.

    1993-01-01

    Proposes an uncertainty principle for information seeking based on the results of a series of studies that investigated the user's perspective of the information search process. Constructivist theory is discussed as a conceptual framework for studying the user's perspective, and areas for further research are suggested. (Contains 44 references.)…

  11. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, space noncommutativity, Lorentz invariance violation, and quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, concerns about its compatibility with the equivalence principle, the universality of gravitational redshift, free fall, and the law of reciprocal action are addressed.

  12. Hardy Uncertainty Principle, Convexity and Parabolic Evolutions

    NASA Astrophysics Data System (ADS)

    Escauriaza, L.; Kenig, C. E.; Ponce, G.; Vega, L.

    2015-11-01

    We give a new proof of the L^2 version of Hardy's uncertainty principle based on calculus and on its dynamical version for the heat equation. The reasoning relies on new log-convexity properties and the derivation of optimal Gaussian decay bounds for solutions to the heat equation with Gaussian decay at a future time. We extend the result to heat equations with lower-order variable coefficients.
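
    For reference, one standard formulation of the Hardy uncertainty principle revisited here (constants and Fourier conventions vary across references; this sketch uses the 2π-in-the-exponent convention):

    ```latex
    % Hardy's uncertainty principle, one standard form, with the convention
    % \hat{f}(\xi) = \int f(x)\, e^{-2\pi i x \xi}\, dx :
    %
    % If |f(x)| \le C e^{-\pi a x^{2}} and |\hat{f}(\xi)| \le C e^{-\pi b \xi^{2}}, then
    %   ab > 1 \;\Rightarrow\; f \equiv 0, \qquad
    %   ab = 1 \;\Rightarrow\; f(x) = c\, e^{-\pi a x^{2}}.
    ```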

  13. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from string theory, black hole physics and doubly special relativity, and some thought experiments which were suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of gravitational laws, Friedmann equations, minimal time measurement and thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principle, including the universality of gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of composite systems. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. The concern about compatibility with the equivalence principle, the universality of gravitational redshift, free fall and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

  14. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and different time scales. The ultimate origin of such universal quantum stability lies in the fundamental uncertainty principle, which makes the phase space, and hence the spectrum of bounded quantum motion, discrete. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  15. Dilaton cosmology, noncommutativity, and generalized uncertainty principle

    SciTech Connect

    Vakili, Babak

    2008-02-15

    The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. I extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of the GUP, are presented and compared.

  16. Lorentz invariance violation and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag

    2016-01-01

    There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum, and thereby a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the redshift dependence of the Hubble parameter in early-type galaxies. We find that LIV has two types of contributions to the time-of-flight delay Δt comparable with those observations. Although the OPERA measurement of a faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate the relative change Δv in the speed of muon neutrinos in dependence on the redshift z; accordingly, the results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR). An essential ingredient is the confrontation of the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity with the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence is an essential input for reliably confronting our calculations with UHECR data. The sensitivity factor is related to the time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter a that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.

  17. Uncertainty and critical-level population principles.

    PubMed

    Blackorby, C; Bossert, W; Donaldson, D

    1998-02-01

    The authors explore social evaluation with a variable population size in the presence of uncertainty using a timeless framework based upon individual lifetime utilities rather than utilities during a single period. Some standard fixed-population axioms are complemented with an axiom termed the "independence of the utilities of unconcerned individuals," a variable-population version of "separability with respect to unconcerned individuals." Characterizations of expected-utility versions of critical-level generalized utilitarian rules are provided. The principles evaluate lotteries over possible states of the world on the basis of the sum of the expected values of differences between transformed utility levels and a transformed critical level, conditional upon the agents being alive in the states being considered. At the same time, the critical-level utilitarian value functions applied to weighted individual expected utilities can be used. Weights are determined by the anonymity axiom. PMID:12348432

  18. Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael

    1993-01-01

    Heisenberg's uncertainty principle and the derivative notions of indeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research examined. Implications are drawn for research in science education. (PR)

  19. Incorporation of generalized uncertainty principle into Lifshitz field theories

    SciTech Connect

    Faizal, Mir; Majumder, Barun

    2015-06-15

    In this paper, we will incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We will first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. After that we will incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We will observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We will also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.

  20. Open Timelike Curves Violate Heisenberg's Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Pienaar, J. L.; Ralph, T. C.; Myers, C. R.

    2013-02-01

    Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.

  21. Potential Wells and the Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Owens, Constance; Blado, Gardo; Meyers, Vincent

    2014-03-01

    Out of the four fundamental forces, gravity has yet to be unified with the other three. This predicament has kept scientists from being able to explain systems that involve both general relativity (GR) and quantum mechanics (QM). The quest to quantize gravity, in other words to make GR a quantum theory, has been at the forefront of physics research in recent decades. Incorporating gravity into QM changes the laws of ordinary quantum mechanics. Potential wells are a common tool used to study particle behavior in quantum mechanics. At first they were simply theoretical toy models, but in time it was discovered that potential wells could actually be used to model real-life situations, and they have thus proven to be very useful theoretically and experimentally. For example, the double square well (DSW) can be used to model the potential experienced by an electron in a diatomic molecule. DSWs can also be used to study bilayer systems. In this paper we derive the results for the finite square well and the DSW using a form of the generalized uncertainty principle to study and discuss how the incorporation of gravity modifies these results. We also discuss applications and the effects of gravity on quantum tunneling.

  22. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

    Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, the rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  23. Uncertainty principle for proper time and mass

    NASA Astrophysics Data System (ADS)

    Kudaka, Shoju; Matsumoto, Shuichi

    1999-03-01

    We review Bohr's reasoning in the Bohr-Einstein debate on the photon box experiment. The essential point of his reasoning leads us to an uncertainty relation between the proper time and the rest mass of the clock. It is shown that this uncertainty relation can be derived provided we take the fundamental point of view that the proper time should be included as a dynamical variable in the Lagrangian describing the system of the clock. Some problems and some positive aspects of our approach are then discussed.

  24. Thermodynamics of Black Holes and the Symmetric Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Dutta, Abhijit; Gangopadhyay, Sunandan

    2016-06-01

    In this paper, we have investigated the thermodynamics of Schwarzschild and Reissner-Nordström black holes using the symmetric generalised uncertainty principle, which contains correction terms involving momentum and position uncertainty. The mass-temperature relationship and the heat capacity for these black holes have been computed, from which the critical and remnant masses have been obtained. The entropy is found to satisfy the area law up to leading-order logarithmic corrections and corrections of the form A^2 (a new finding of this paper) from the symmetric generalised uncertainty principle.

  25. Microscopic black hole stabilization via the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Vayenas, Constantinos G.; Grigoriou, Dimitrios

    2015-01-01

    Due to the Heisenberg uncertainty principle, gravitational confinement of rotating two- or three-particle systems can lead to microscopic Planckian or sub-Planckian black holes with a size of the order of their Compton wavelength. Some properties of such states are discussed in terms of the Schwarzschild geodesics of general relativity and compared with properties computed via the combination of special relativity, the equivalence principle, Newton's gravitational law and the Compton wavelength. It is shown that the generalized uncertainty principle (GUP) provides a satisfactory fit of the Schwarzschild radius and Compton wavelength of such microscopic, particle-like black holes.

  26. The Uncertainty Principle, Virtual Particles and Real Forces

    ERIC Educational Resources Information Center

    Jones, Goronwy Tudor

    2002-01-01

    This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…

  27. Single-Slit Diffraction and the Uncertainty Principle

    ERIC Educational Resources Information Center

    Rioux, Frank

    2005-01-01

    A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
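
    A minimal numeric sketch of that transform argument, assuming a rectangular aperture of width a (in arbitrary units): the far-field amplitude is the Fourier transform of the aperture, its first zero sits at k = 2*pi/a, and so the product of slit width and central-lobe half-width is 2*pi regardless of a.

    ```python
    import numpy as np

    a = 1.0                                     # slit width (arbitrary units)
    x = np.linspace(-50.0, 50.0, 2 ** 16)
    dx = x[1] - x[0]
    aperture = np.where(np.abs(x) < a / 2, 1.0, 0.0)

    # Momentum-space amplitude via FFT; k = 2*pi*f for numpy's frequency axis.
    phi = np.fft.fftshift(np.fft.fft(aperture)) * dx
    k = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(len(x), d=dx))
    mag = np.abs(phi)                           # ~ |sinc|, first zero at k = 2*pi/a

    # Walk outward from the central peak to the first local minimum (the zero).
    center = int(np.argmin(np.abs(k)))
    first_zero = next(i for i in range(center, len(k) - 1) if mag[i] < mag[i + 1])
    print(f"first zero at k = {k[first_zero]:.3f}; expected 2*pi/a = {2 * np.pi / a:.3f}")
    # Slit width times central-lobe half-width: a * (2*pi/a) = 2*pi, for any a.
    ```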

  28. “Stringy” coherent states inspired by generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Ghosh, Subir; Roy, Pinaki

    2012-05-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of the Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. The Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first-order corrected energy values and undeformed basis states.

  29. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  30. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  31. Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Oppenheim, Jacob N.; Magnasco, Marcelo O.

    2013-01-01

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
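
    A minimal numeric illustration of that 1/(4*pi) bound, assuming RMS (standard-deviation) widths and a Gaussian pulse, the waveform that saturates it; the 2 ms envelope is an arbitrary illustrative choice.

    ```python
    import numpy as np

    sigma = 2.0e-3                            # 2 ms Gaussian envelope (illustrative)
    t = np.linspace(-0.1, 0.1, 2 ** 18)
    step = t[1] - t[0]
    s = np.exp(-t ** 2 / (2.0 * sigma ** 2))

    def rms_width(axis, density):
        """Standard deviation of a (not necessarily normalized) density."""
        density = density / np.trapz(density, axis)
        mean = np.trapz(axis * density, axis)
        return float(np.sqrt(np.trapz((axis - mean) ** 2 * density, axis)))

    S = np.fft.fftshift(np.fft.fft(s)) * step
    f = np.fft.fftshift(np.fft.fftfreq(len(t), d=step))

    dt = rms_width(t, np.abs(s) ** 2)         # RMS duration
    df = rms_width(f, np.abs(S) ** 2)         # RMS bandwidth
    print(f"dt*df = {dt * df:.6f};  bound 1/(4*pi) = {1.0 / (4.0 * np.pi):.6f}")
    ```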

  32. Uncertainty principle for Gabor systems and the Zak transform

    SciTech Connect

    Czaja, Wojciech; Zienkiewicz, Jacek

    2006-12-15

    We show that if g ∈ L²(R) is a generator of a Gabor orthonormal basis with the lattice Z×Z, then its Zak transform Z(g) satisfies ∇Z(g) ∉ L²([0,1)²). This is a generalization and extension of the Balian-Low uncertainty principle.

  33. Quantum black hole in the generalized uncertainty principle framework

    SciTech Connect

    Bina, A.; Moslehi, A.; Jalalzadeh, S.

    2010-01-15

    In this paper we study the effects of the generalized uncertainty principle (GUP) on the canonical quantum gravity of black holes. Through the use of a modified partition function that involves the effects of the GUP, we obtain the thermodynamical properties of the Schwarzschild black hole. We also calculate the Hawking temperature and entropy for the modification of the Schwarzschild black hole in the presence of the GUP.

  34. Weak Values, the Reconstruction Problem, and the Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    de Gosson, Charlyne; de Gosson, Maurice

    2016-03-01

    Closely associated with the notion of a weak value is the problem of reconstructing the post-selected state: this is the so-called reconstruction problem. We show that the reconstruction problem can be solved by inversion of the cross-Wigner transform, using an ancillary state. We thereafter show, using the multidimensional Hardy uncertainty principle, that maximally concentrated cross-Wigner transforms correspond to the case where a weak measurement reduces to an ordinary von Neumann measurement.

  35. Generalized uncertainty principle: implications for black hole complementarity

    NASA Astrophysics Data System (ADS)

    Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han

    2014-12-01

    At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg Uncertainty Principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string-theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large-N rescaling if one assumes only Heisenberg's Uncertainty Principle, the application of the GUP may save complementarity, but only if a certain N-dependence is also assumed. This raises two important questions beyond the scope of this work: whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.

  36. Uncertainty principle of genetic information in a living cell

    PubMed Central

    Strippoli, Pierluigi; Canaider, Silvia; Noferini, Francesco; D'Addabbo, Pietro; Vitale, Lorenza; Facchin, Federica; Lenzi, Luca; Casadei, Raffaella; Carinci, Paolo; Zannotti, Maria; Frabetti, Flavia

    2005-01-01

    Background: Formal description of a cell's genetic information should provide the number of DNA molecules in that cell and their complete nucleotide sequences. We pose the formal problem: can the genome sequence forming the genotype of a given living cell be known with absolute certainty, so that the cell's behaviour (phenotype) can be correlated to that genetic information? To answer this question, we propose a series of thought experiments. Results: We show that the genome sequence of any actual living cell cannot physically be known with absolute certainty, independently of the method used. There is an associated uncertainty, in terms of base pairs, equal to or greater than μs (where μ is the mutation rate of the cell type and s is the cell's genome size). Conclusion: This finding establishes an "uncertainty principle" in genetics for the first time, and its analogy with the Heisenberg uncertainty principle in physics is discussed. The genetic information that makes living cells work is thus better represented by a probabilistic model rather than as a completely defined object. PMID:16197549
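
    The bound itself is simple arithmetic; a sketch with assumed illustrative values for μ and s (both vary widely across cell types):

    ```python
    # Illustrative values only: mutation rate and genome size vary by cell type.
    mu = 1.0e-9     # assumed mutations per base pair per cell division
    s = 3.2e9       # assumed haploid human genome size, in base pairs

    uncertainty_bp = mu * s    # the paper's lower bound, in base pairs
    print(f"sequence uncertainty >= {uncertainty_bp:.1f} bp per division")
    # ~3 bp: no sequencing method, however perfect, can pin down the genome of
    # a dividing cell more tightly than this.
    ```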

  37. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.
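
    A minimal sketch of the threshold phenomenon, assuming the scalar model x' = a·x + b·u with i.i.d. random (a, b), quadratic cost, and the associated stochastic Riccati recursion; the second-moment values are illustrative.

    ```python
    import numpy as np

    def riccati_tail(Ea2, Eab, Eb2, q=1.0, r=1.0, iters=200):
        """Iterate the scalar stochastic Riccati recursion
        K <- q + Ea2*K - (Eab*K)**2 / (r + Eb2*K)
        for x' = a*x + b*u with random (a, b) and stage cost q*x^2 + r*u^2."""
        K = 0.0
        for _ in range(iters):
            K = q + Ea2 * K - (Eab * K) ** 2 / (r + Eb2 * K)
        return K

    # Threshold quantity: alpha = E[a^2] - E[ab]^2 / E[b^2];
    # an infinite-horizon solution exists only if alpha < 1.
    for Ea2 in (1.5, 2.5):                   # with E[ab] = E[b^2] = 1 (illustrative)
        alpha = Ea2 - 1.0
        print(f"alpha = {alpha:.2f}:  K after 200 steps = {riccati_tail(Ea2, 1.0, 1.0):.3e}")
    # alpha = 0.50 settles to a finite cost; alpha = 1.50 diverges.
    ```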

  38. Classical Dynamics Based on the Minimal Length Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang

    2016-02-01

    In this paper we consider the quadratic modification of the Heisenberg algebra and its classical-limit version, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in the β-deformed classical dynamics. Finally, we consider the (α, β)-deformed classical dynamics in which the minimal length uncertainty principle is given by the commutator [x, p] = iħ(1 + αx² + βp²). For two small parameters α, β, we discuss the free fall of a particle and of a composite system in a uniform gravitational field.
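
    A minimal numeric sketch of the deformed classical dynamics, assuming equations of motion xdot = f ∂H/∂p and pdot = -f ∂H/∂x with f = 1 + αx² + βp² (the classical limit of the quoted commutator), applied to free fall from rest; all parameter values are illustrative.

    ```python
    import numpy as np

    # Free fall H = p^2/(2m) + m*g*x under the deformed bracket {x, p} = f,
    # f = 1 + alpha*x^2 + beta*p^2, so xdot = f*p/m and pdot = -f*m*g.
    m, g = 1.0, 9.8
    alpha, beta = 1e-6, 1e-6          # small deformation parameters (illustrative)

    def step(x, p, dt):
        """One midpoint (RK2) step of the deformed equations of motion."""
        f0 = 1.0 + alpha * x ** 2 + beta * p ** 2
        xm = x + 0.5 * dt * f0 * p / m
        pm = p - 0.5 * dt * f0 * m * g
        fm = 1.0 + alpha * xm ** 2 + beta * pm ** 2
        return x + dt * fm * pm / m, p - dt * fm * m * g

    x, p, t, dt = 100.0, 0.0, 0.0, 1e-4   # drop from rest at x = 100 m
    while x > 0.0:
        x, p = step(x, p, dt)
        t += dt
    print(f"deformed fall time: {t:.4f} s   (Newtonian: {np.sqrt(2 * 100.0 / g):.4f} s)")
    ```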

  39. Generalized uncertainty principle in Bianchi type I quantum cosmology

    NASA Astrophysics Data System (ADS)

    Vakili, B.; Sepangi, H. R.

    2007-07-01

    We study a quantum Bianchi type I model in which the dynamical variables of the corresponding minisuperspace obey the generalized Heisenberg algebra. Such a generalized uncertainty principle has its origin in the existence of a minimal length suggested by quantum gravity and string theory. We present approximate analytical solutions to the corresponding Wheeler-DeWitt equation in the limit where the scale factor of the universe is small, and compare the results with the standard commutative and noncommutative quantum cosmology. Similarities and differences of these solutions are also discussed.

  40. Conflict between the Uncertainty Principle and wave mechanics

    NASA Astrophysics Data System (ADS)

    Bourdillon, Antony

    The traveling wave group that is defined on conserved physical values is the vehicle of transmission for a unidirectional photon or free particle having a wide wave front. As a stable wave packet, it expresses internal periodicity combined with group localization. Heisenberg's Uncertainty Principle is precisely derived from it. The wave group demonstrates a serious conflict between the Principle and wave mechanics. Also derived is the phase velocity beyond the horizon set by the speed of light. In this space occurs the reduction of the wave packet that takes place in measurement, which is represented by comparing phase velocities in the direction of propagation with those in the transverse plane. The new description of the wavefunction for the stable free particle or antiparticle contains variables that were previously ignored. Deterministic physics must always appear probabilistic when hidden variables are bypassed. Secondary hidden variables always occur in measurement. The wave group turns out to be probabilistic. It is ubiquitous in physics and has many consequences.

  41. Long-range mutual information and topological uncertainty principle

    NASA Astrophysics Data System (ADS)

    Jian, Chao-Ming; Kim, Isaac; Qi, Xiao-Liang

    Ordered phases in the Landau paradigm can be diagnosed by a local order parameter, whereas topologically ordered phases cannot be detected in such a way. In this paper, we propose long-range mutual information (LRMI) as a unified diagnostic for both conventional long-range order and topological order. Using the LRMI, we characterize orders in (n+1)D gapped systems as m-membrane condensates with 0 ≤ m ≤ n−1. The familiar conventional order and (2+1)D topological orders are respectively identified as 0-membrane and 1-membrane condensates. We propose and study the topological uncertainty principle, which describes the non-commuting nature of non-local order parameters in topological orders.

  42. Scalar field cosmology modified by the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Paliathanasis, Andronikos; Pan, Supriya; Pramanik, Souvik

    2015-12-01

    We consider quintessence scalar field cosmology in which the Lagrangian of the scalar field is modified by the generalized uncertainty principle. We show that the perturbation terms that arise from the deformed algebra are equivalent to the existence of a second scalar field, where the two fields interact in the kinetic part. Moreover, we consider a spatially flat Friedmann-Lemaître-Robertson-Walker spacetime and derive the gravitational field equations. We show that the modified equation-of-state parameter w_GUP can cross the phantom divide line, that is, w_GUP < -1. Furthermore, we derive the field equations in terms of dimensionless parameters; the dynamical system that arises is a singular perturbation system, in which we study the existence of fixed points on the slow manifold. Finally, we perform numerical simulations for some well-known models and show that for these models, with specific initial conditions, the parameter w_GUP crosses the phantom barrier.

  43. Effects of the generalised uncertainty principle on quantum tunnelling

    NASA Astrophysics Data System (ADS)

    Blado, Gardo; Prescott, Trevor; Jennings, James; Ceyanes, Joshuah; Sepulveda, Rafael

    2016-03-01

    In a previous paper (Blado et al 2014 Eur. J. Phys. 35 065011), we showed that quantum gravity effects can be discussed with only a background in non-relativistic quantum mechanics at the undergraduate level by looking at the effect of the generalised uncertainty principle (GUP) on the finite and infinite square wells. In this paper, we derive the GUP corrections to the tunnelling probability of simple quantum mechanical systems which are accessible to undergraduates (alpha decay, simple models of quantum cosmogenesis and gravitational tunnelling radiation) and which employ the WKB approximation, a topic discussed in undergraduate quantum mechanics classes. It is shown that the GUP correction increases the tunnelling probability in each of the examples discussed.
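
    For orientation, a minimal sketch of the undeformed WKB baseline that the GUP correction modifies, using an arbitrary Gaussian barrier in natural units (hbar = m = 1); the correction itself is not implemented here.

    ```python
    import numpy as np

    def wkb_transmission(V, E, x):
        """WKB tunnelling probability T ~ exp(-2 * integral sqrt(2m(V-E)) dx)
        over the classically forbidden region, in natural units hbar = m = 1."""
        integrand = np.sqrt(np.clip(2.0 * (V(x) - E), 0.0, None))
        return float(np.exp(-2.0 * np.trapz(integrand, x)))

    V = lambda x: 5.0 * np.exp(-x ** 2)     # arbitrary Gaussian barrier, height 5
    x = np.linspace(-4.0, 4.0, 4001)
    for E in (1.0, 2.0, 4.0):
        print(f"E = {E}:  T ~ {wkb_transmission(V, E, x):.3e}")
    # The GUP correction discussed above adds a term to the integrand and raises T.
    ```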

  44. Generalized Uncertainty Principle and Thermostatistics: A Semiclassical Approach

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, Mohammad; Pedram, Pouria

    2016-04-01

    We present an exact treatment of the thermodynamics of physical systems in the framework of the generalized uncertainty principle (GUP). Our purpose is to study and compare the consequences of two GUPs, one of which implies a minimal length while the other predicts a minimal length and a maximal momentum. Using a semiclassical method, we exactly calculate the modified internal energies and heat capacities in the presence of generalized commutation relations. We show that the total shift in these quantities depends only on the deformed algebra, not on the system under study. Finally, the modified internal energy for a specific physical system, such as the ideal gas, is obtained in the framework of the two different GUPs.

  45. Molecular Response Theory in Terms of the Uncertainty Principle.

    PubMed

    Harde, Hermann; Grischkowsky, Daniel

    2015-08-27

    We investigate the time response of molecular transitions by observing the pulse reshaping of femtosecond THz pulses propagating through polar vapors. By precisely modeling the pulse interaction with the molecular vapors, we derive detailed insight into this time response after an excitation. The measurements, which were performed by applying the powerful technique of THz time-domain spectroscopy, are analyzed directly in the time domain or, in parallel, in the frequency domain by Fourier transforming the pulses and comparing them with molecular response theory. New analyses of the molecular response allow a generalized unification of the basic collision and line-shape theories of Lorentz, van Vleck-Weisskopf, and Debye described by molecular response theory. In addition, they show that the applied THz experimental setup allows the direct observation of the ultimate time response of molecules to an externally applied electric field in the presence of molecular collisions. This response is limited by the uncertainty principle and is determined by the inverse splitting frequency between adjacent levels. At the same time, this response reflects the transition time of a rotational transition to switch from one molecular state to another or to form a coherent superposition of states oscillating with the splitting frequency. The presented investigations are also of fundamental importance for the description of the far-wing absorption of greenhouse gases like water vapor, carbon dioxide, or methane, which have a dominant influence on the radiative exchange in the far-infrared. PMID:26280761

  46. Effect of the Generalized Uncertainty Principle on post-inflation preheating

    SciTech Connect

    Chemissany, Wissam; Das, Saurya; Ali, Ahmed Farag; Vagenas, Elias C.

    2011-12-01

    We examine effects of the Generalized Uncertainty Principle, predicted by various theories of quantum gravity to replace Heisenberg's uncertainty principle near the Planck scale, on post-inflation preheating in cosmology, and show that it can predict either an increase or a decrease in parametric resonance and a corresponding change in particle production. Possible implications are considered.

  47. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used a single-slit diffraction of a laser beam for measuring the angular width of zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We will assume that the uncertainty in position is the slit width. For the…
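
    Worked numbers for the scheme described, with assumed illustrative values (a 650 nm laser and a 100 μm slit); the wave-number spread is read off from the first diffraction minimum:

    ```latex
    % Assumed illustrative values: slit width a = 100 μm, wavelength λ = 650 nm.
    \Delta x = a, \qquad
    \sin\theta_{1} \approx \frac{\lambda}{a} \quad (\text{first minimum}), \qquad
    \Delta k_x \approx k \sin\theta_{1}
      = \frac{2\pi}{\lambda}\cdot\frac{\lambda}{a} = \frac{2\pi}{a},
    % hence, independently of the numbers chosen,
    \Delta x\,\Delta k_x \approx 2\pi \;\approx\; 6.28 \;\gg\; \tfrac{1}{2},
    \quad\text{consistent with } \Delta x\,\Delta p_x = \hbar\,\Delta x\,\Delta k_x \ge \hbar/2.
    ```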

  48. Entropy of the Randall-Sundrum brane world with the generalized uncertainty principle

    SciTech Connect

    Kim, Wontae; Park, Young-Jai; Kim, Yong-Wan

    2006-11-15

    By introducing the generalized uncertainty principle, we calculate the entropy of the bulk scalar field on the Randall-Sundrum brane background without any cutoff. We obtain an entropy of the massive scalar field proportional to the horizon area. Here, we observe that a mass contribution to the entropy exists, in contrast to all previous results for the usual black hole cases with the generalized uncertainty principle.

  49. Path Integral for Dirac oscillator with generalized uncertainty principle

    SciTech Connect

    Benzair, H.; Boudjedaa, T.; Merad, M.

    2012-12-15

    The propagator for the Dirac oscillator in (1+1) dimensions, with a deformed commutation relation of the Heisenberg algebra, is calculated using the path integral in the quadri-momentum representation. As the mass is related to the momentum, we adapt the space-time transformation method to evaluate the quantum corrections; the latter depend on the point-discretization interval.

  50. Uncertainty Principle--Limited Experiments: Fact or Academic Pipe-Dream?

    ERIC Educational Resources Information Center

    Albergotti, J. Clifton

    1973-01-01

    The question of whether modern experiments are limited by the uncertainty principle or by the instruments used to perform the experiments is discussed. Several key experiments show that the instruments limit our knowledge and the principle remains of strictly academic concern. (DF)

  51. Generalized uncertainty principle and the conformally coupled scalar field quantum cosmology

    NASA Astrophysics Data System (ADS)

    Pedram, Pouria

    2015-03-01

    We exactly solve the Wheeler-DeWitt equation for the closed homogeneous and isotropic quantum cosmology in the presence of a conformally coupled scalar field and in the context of the generalized uncertainty principle. This form of generalized uncertainty principle is motivated by black hole physics and predicts a minimal length uncertainty proportional to the Planck length. We construct wave packets in momentum minisuperspace which closely follow classical trajectories and strongly peak on them upon choosing appropriate initial conditions. Moreover, based on the DeWitt criterion, we obtain wave packets that exhibit singularity-free behavior.

  52. Uncertainty principle for experimental measurements: Fast versus slow probes

    PubMed Central

    Hansmann, P.; Ayral, T.; Tejeda, A.; Biermann, S.

    2016-01-01

    The result of a physical measurement depends on the time scale of the experimental probe. In solid-state systems, this simple quantum mechanical principle has far-reaching consequences: the interplay of several degrees of freedom close to charge, spin or orbital instabilities combined with the disparity of the time scales associated to their fluctuations can lead to seemingly contradictory experimental findings. A particularly striking example is provided by systems of adatoms adsorbed on semiconductor surfaces where different experiments – angle-resolved photoemission, scanning tunneling microscopy and core-level spectroscopy – suggest different ordering phenomena. Using most recent first principles many-body techniques, we resolve this puzzle by invoking the time scales of fluctuations when approaching the different instabilities. These findings suggest a re-interpretation of ordering phenomena and their fluctuations in a wide class of solid-state systems ranging from organic materials to high-temperature superconducting cuprates. PMID:26829902

  53. Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics students' depictions

    NASA Astrophysics Data System (ADS)

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-12-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students’ depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize a picture of students’ descriptions of these key quantum concepts. Data for this study were obtained from semistructured in-depth interviews conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students’ depictions of the concept wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students’ depictions of the concept uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum mechanics uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, a few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction, which cannot be explained within the framework of classical physics, for depicting the wavelike properties of quantum entities. Despite inhospitable conceptions of the uncertainty principle and wave- and particlelike properties of quantum entities in our investigation, the findings presented in this paper are highly consistent with those reported in previous studies. New findings and some implications for instruction and the curricula are discussed.

  54. Symplectic quantization, inequivalent quantum theories, and Heisenberg's principle of uncertainty

    SciTech Connect

    Montesinos, Merced; Torres del Castillo, G.F.

    2004-09-01

    We analyze the quantum dynamics of the nonrelativistic two-dimensional isotropic harmonic oscillator in Heisenberg's picture. Such a system is taken as a toy model to analyze some of the various quantum theories that can be built from the application of Dirac's quantization rule to the various symplectic structures recently reported for this classical system. It is pointed out that these quantum theories are inequivalent in the sense that the mean values for the operators (observables) associated with the same physical classical observable do not agree with each other. The inequivalence does not arise from ambiguities in the ordering of operators but from the fact of having several symplectic structures defined with respect to the same set of coordinates. It is also shown that the uncertainty relations between the fundamental observables depend on the particular quantum theory chosen. It is important to emphasize that these (somehow paradoxical) results emerge from the combination of two paradigms: Dirac's quantization rule and the usual Copenhagen interpretation of quantum mechanics.

  55. Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

    NASA Technical Reports Server (NTRS)

    Terazawa, Hidezumi

    1996-01-01

    The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

  56. Double Special Relativity with a Minimum Speed and the Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Nassif, Cláudio

    The present work searches for an implementation of a new symmetry in spacetime by introducing the idea of an invariant minimum speed scale (V). Such a lowest limit V, being unattainable by particles, represents a fundamental and preferred reference frame connected to a universal background field (a vacuum energy) that breaks Lorentz symmetry. There thus emerges a new principle of symmetry in spacetime at the subatomic level, for very low energies close to the background frame (v ≈ V), providing a fundamental understanding of the uncertainty principle, i.e. the uncertainty relations should emerge from a spacetime with an invariant minimum speed.

  4. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    PubMed

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks. PMID:12758217

  5. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
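
    As a minimal illustration of the two-component bookkeeping described here (assumed PTC 19.1-style additive and root-sum-square combinations; the data are hypothetical):

      # Combine a systematic (bias) limit B with the random precision index of
      # the mean, per the ANSI/ASME PTC 19.1-style models sketched in this record.
      import math
      import statistics

      readings = [10.02, 10.05, 9.98, 10.01, 10.04, 9.99]  # hypothetical data
      B = 0.03       # estimated bias limit, same units as the readings
      t95 = 2.571    # Student's t at 95% for n - 1 = 5 degrees of freedom

      S_mean = statistics.stdev(readings) / math.sqrt(len(readings))
      U_add = B + t95 * S_mean                     # additive (U_ADD) model
      U_rss = math.sqrt(B**2 + (t95 * S_mean)**2)  # root-sum-square (U_RSS) model
      print(f"mean = {statistics.mean(readings):.3f}, "
            f"U_ADD = {U_add:.3f}, U_RSS = {U_rss:.3f}")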

  6. Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions

    ERIC Educational Resources Information Center

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-01-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

  7. Generalized uncertainty principle corrections to the simple harmonic oscillator in phase space

    NASA Astrophysics Data System (ADS)

    Das, Saurya; Robbins, Matthew P. G.; Walton, Mark A.

    2016-01-01

    We compute Wigner functions for the harmonic oscillator including corrections from generalized uncertainty principles (GUPs), and study the corresponding marginal probability densities and other properties. We show that the GUP corrections to the Wigner functions can be significant, and comment on their potential measurability in the laboratory.
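
    For reference, the uncorrected ground-state Wigner function that serves as the baseline for such GUP corrections is easy to tabulate (natural units assumed; the corrections themselves are the record's result and are not reproduced here):

      # Baseline harmonic-oscillator ground-state Wigner function,
      # W(x, p) = exp(-m w x^2 / hbar - p^2 / (m w hbar)) / (pi hbar).
      import numpy as np

      hbar = m = w = 1.0
      x = np.linspace(-4, 4, 201)
      p = np.linspace(-4, 4, 201)
      X, P = np.meshgrid(x, p)
      W0 = np.exp(-m * w * X**2 / hbar - P**2 / (m * w * hbar)) / (np.pi * hbar)

      dx, dp = x[1] - x[0], p[1] - p[0]
      print("normalization:", W0.sum() * dx * dp)  # ~1, as a phase-space density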

  8. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    SciTech Connect

    Tawfik, A.

    2013-07-01

    We investigate the impact of the Generalized Uncertainty Principle (GUP), proposed by approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely consumes the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating completely, implying the existence of remnants at which the specific heat vanishes; the Heisenberg uncertainty principle plays the same role in stabilizing the hydrogen atom. We discuss how the linear GUP approach solves this complete-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without the GUP is not negligible.
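
    For orientation, the standard heuristic linking the Heisenberg uncertainty principle to the Hawking temperature (a textbook estimate, not the record's full calculation) takes the position uncertainty of an emitted quantum to be of the order of the Schwarzschild radius:

      \Delta x \sim \frac{2GM}{c^{2}}, \qquad
      \Delta p \sim \frac{\hbar}{\Delta x} \;\Longrightarrow\;
      T_{H} = \frac{\hbar c^{3}}{8\pi G M k_{B}} .

    A GUP of the form [x, p] = iħ(1 + βp²) modifies this temperature so that the evaporation stalls at a Planck-scale remnant, which is the mechanism the record analyzes.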

  9. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    ERIC Educational Resources Information Center

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…
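
    The method has a simple one-electron prototype (a textbook sketch, not the record's two-electron treatment): localizing an electron within a radius r forces p ~ ħ/r, so

      E(r) \approx \frac{\hbar^{2}}{2mr^{2}} - \frac{Ze^{2}}{4\pi\varepsilon_{0}r},
      \qquad \frac{dE}{dr} = 0 \;\Rightarrow\; r = \frac{a_{0}}{Z}, \quad
      E_{\min} = -Z^{2}\times 13.6\ \mathrm{eV} .

    The record's variational step optimizes analogous parameters for the two-electron helium and Hookean atoms, where the electron-electron repulsion shifts the optimum.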

  10. Certifying Einstein-Podolsky-Rosen steering via the local uncertainty principle

    NASA Astrophysics Data System (ADS)

    Zhen, Yi-Zheng; Zheng, Yu-Lin; Cao, Wen-Fei; Li, Li; Chen, Zeng-Bing; Liu, Nai-Le; Chen, Kai

    2016-01-01

    The uncertainty principle lies at the heart of quantum mechanics, while nonlocality is an intriguing quantum phenomenon that rules out local causal theories. One subtle form of nonlocality is so-called Einstein-Podolsky-Rosen (EPR) steering, which holds the potential for shared-entanglement verification even if the one-sided measurement device is untrusted. However, certifying EPR steering remains a big challenge at present. Here, we employ the local uncertainty relation to provide an experimentally friendly approach to EPR steering verification. We show that the strength of EPR steering is quantitatively linked to the strength of the uncertainty relation, as well as to the amount of entanglement. We also find that the realignment method works for detecting EPR steering of an arbitrary-dimensional system.

  11. Energy distribution of massless particles on black hole backgrounds with generalized uncertainty principle

    SciTech Connect

    Li Zhongheng

    2009-10-15

    We derive new formulas for the spectral energy density and total energy density of massless particles in a general spherically symmetric static metric from a generalized uncertainty principle. Compared with blackbody radiation, the spectral energy density is strongly damped at high frequencies. For large values of r, the spectral energy density diminishes when r grows, but at the event horizon, the spectral energy density vanishes and therefore thermodynamic quantities near a black hole, calculated via the generalized uncertainty principle, do not require any cutoff parameter. We find that the total energy density can be expressed in terms of Hurwitz zeta functions. It should be noted that at large r (low local temperature), the difference between the total energy density and the Stefan-Boltzmann law is too small to be observed. However, as r approaches an event horizon, the effect of the generalized uncertainty principle becomes more and more important, which may be observable. As examples, the spectral energy densities in the background metric of a Schwarzschild black hole and of a Schwarzschild black hole plus quintessence are discussed. It is interesting to note that the maximum of the distribution shifts to higher frequencies when the quintessence equation of state parameter w decreases.

  12. The uncertainty principle enables non-classical dynamics in an interferometer.

    PubMed

    Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko

    2014-01-01

    The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics. PMID:25105741

  13. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    SciTech Connect

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
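
    A toy version of the principle and its uncertainty budget (all numbers and the calibration constant are hypothetical; a real application follows the ASTM guide discussed in the record):

      # Enrichment meter sketch: for an "infinitely thick" uranium item, the
      # net 185.7 keV gamma rate is proportional to U-235 enrichment, E = K * R.
      import math

      K, sigma_K = 6.0e-4, 1.2e-5              # enrichment per (count/s), hypothetical
      gross, bkg, t = 52000.0, 7000.0, 600.0   # peak/background counts, live time (s)

      R_net = (gross - bkg) / t
      sigma_R = math.sqrt(gross + bkg) / t     # Poisson counting statistics

      E = K * R_net
      sigma_E = E * math.sqrt((sigma_R / R_net)**2 + (sigma_K / K)**2)
      print(f"enrichment = {100*E:.2f} +/- {100*sigma_E:.2f} wt% U-235")

    In this example the calibration term sigma_K dominates once counting statistics are good, which is one reason the record emphasizes calibration with errors in predictors.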

  14. Lifespan of rotating black hole in the frame of generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    He, Tangmei; Zhang, Jingyi; Yang, Jinbo; Tan, Hongwei

    2016-01-01

    In this paper, the lifespan of a rotating black hole under the generalized uncertainty principle (GUP) is derived from the corrected radiation energy flux and the first law of black hole thermodynamics. The radiation energy flux indicates that there exist a highest temperature and a minimum mass, both of which are related to the initial mass of the black hole in the final stage of the radiation. The lifespan of the rotating black hole comprises three terms: the dominant term is just the lifespan in flat spacetime; the other two terms are induced by the rotation and by the GUP, respectively.
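
    For comparison, the familiar uncorrected Schwarzschild estimate behind the dominant term (a standard result quoted here for orientation) follows from Stefan-Boltzmann-like mass loss:

      \frac{dM}{dt} \propto -\frac{1}{M^{2}}
      \;\Longrightarrow\;
      \tau \sim \frac{5120\,\pi\,G^{2}M^{3}}{\hbar c^{4}},

    so the lifespan grows as the cube of the initial mass; the rotation- and GUP-induced terms of the record correct this leading behavior.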

  15. Key Rate Available from Mismatched Measurements in the BB84 Protocol and the Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Matsumoto, Ryutaroh; Watanabe, Shun

    We consider mismatched measurements in the BB84 quantum key distribution protocol, in which the measuring bases differ from the transmitting bases. We give a lower bound on the amount of secret key that can be extracted from the mismatched measurements. Our lower bound shows that we can extract a secret key from the mismatched measurements over certain quantum channels, such as the channel over which the Hadamard matrix is applied to each qubit with high probability. Moreover, the entropic uncertainty principle implies that one cannot extract the secret key from both matched and mismatched measurements simultaneously when the standard information reconciliation and privacy amplification procedure is used.
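
    For contrast with the mismatched-basis bound derived here, the familiar matched-basis asymptotic key rate implied by the entropic uncertainty principle is easy to evaluate (a standard bound, not the record's result):

      # Asymptotic BB84 key-rate bound from the entropic uncertainty relation,
      # r >= 1 - h(e_x) - h(e_z), with h the binary entropy in bits.
      import math

      def h(p):
          if p in (0.0, 1.0):
              return 0.0
          return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      def bb84_rate(ex, ez):
          return max(0.0, 1.0 - h(ex) - h(ez))

      print(bb84_rate(0.03, 0.03))  # ~0.61 secret bits per sifted bit
      print(bb84_rate(0.11, 0.11))  # ~0: near the ~11% error threshold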

  16. The Symplectic Camel and the Uncertainty Principle: The Tip of an Iceberg?

    NASA Astrophysics Data System (ADS)

    de Gosson, Maurice A.

    2009-02-01

    We show that the strong form of Heisenberg’s inequalities due to Robertson and Schrödinger can be formally derived using only classical considerations. This is achieved using a statistical tool known as the “minimum volume ellipsoid” together with the notion of symplectic capacity, which we view as a topological measure of uncertainty invariant under Hamiltonian dynamics. This invariant provides the right measurement tool to define what “quantum scale” is. We take the opportunity to discuss the principle of the symplectic camel, which is at the origin of the definition of symplectic capacities, and which provides an interesting link between classical and quantum physics.
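
    In standard notation, the strong form referred to here is the Robertson-Schrödinger inequality

      \sigma_{x}^{2}\,\sigma_{p}^{2} \;\ge\; \left(\frac{\hbar}{2}\right)^{2} + \sigma_{xp}^{2},
      \qquad
      \sigma_{xp} = \tfrac{1}{2}\langle \hat{x}\hat{p} + \hat{p}\hat{x}\rangle - \langle \hat{x}\rangle\langle \hat{p}\rangle,

    which reduces to the textbook Heisenberg relation when the covariance term vanishes.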

  1. J-holomorphic maps and the uncertainty principle in geometric quantum mechanics

    NASA Astrophysics Data System (ADS)

    Sanborn, Barbara

    The theory of geometric quantum mechanics describes a quantum system as a Hamiltonian dynamical system, with a complex projective Hilbert space as its phase space. The Kähler structure of the projective space provides quantum mechanics with a Riemannian metric in addition to the symplectic structure characteristic of classical mechanics. By including aspects of the symplectic topology of the quantum phase space, the geometric theory is extended and enriched. In particular, the quantum uncertainty principle is naturally expressed as an inequality from J-holomorphic map theory.

  2. Generalized uncertainty principle in f(R) gravity for a charged black hole

    SciTech Connect

    Said, Jackson Levi; Adami, Kristian Zarb

    2011-02-15

    Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this, the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

  3. Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

    SciTech Connect

    Tallacchini, Mariachiara. E-mail: mariachiara.tallacchini@unimi.it

    2005-09-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

  4. Covariant energy–momentum and an uncertainty principle for general relativity

    SciTech Connect

    Cooperstock, F.I.; Dupre, M.J.

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. -- Highlights:
    • We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity.
    • Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum.
    • Localized energy via the Ricci integral is consistent with the energy localization hypothesis.
    • New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving.
    • Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong-gravity extreme.

  5. Constraints on the Generalized Uncertainty Principle from black-hole thermodynamics

    NASA Astrophysics Data System (ADS)

    Gangopadhyay, Sunandan; Dutta, Abhijit; Faizal, Mir

    2015-10-01

    In this paper, we calculate the modification to the thermodynamics of a Schwarzschild black hole in higher dimensions due to the Generalized Uncertainty Principle (GUP). We use the fact that the leading-order corrections to the entropy of a black hole has to be logarithmic in nature to restrict the form of the GUP. We observe that in six dimensions, the usual GUP produces the correct form for the leading-order corrections to the entropy of a black hole. However, in five and seven dimensions a linear GUP, which is obtained by a combination of DSR with the usual GUP, is needed to produce the correct form of the corrections to the entropy of a black hole. Finally, we demonstrate that in five dimensions, a new form of GUP containing quadratic and cubic powers of the momentum also produces the correct form for the leading-order corrections to the entropy of a black hole.

  6. Quantum corrections to the thermodynamics of Schwarzschild-Tangherlini black hole and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Feng, Z. W.; Li, H. L.; Zu, X. T.; Yang, S. Z.

    2016-04-01

    We investigate the thermodynamics of the Schwarzschild-Tangherlini black hole in the context of the generalized uncertainty principle (GUP). The corrections to the Hawking temperature, entropy and heat capacity are obtained via the modified Hamilton-Jacobi equation. These modifications show that the GUP changes the evolution of the Schwarzschild-Tangherlini black hole. Specifically, the GUP effect becomes significant when the radius or mass of the black hole approaches the order of the Planck scale: the black hole stops radiating, leaving a remnant. The Planck-scale remnant can be confirmed through analysis of the heat capacity. These phenomena imply that the GUP may provide a way to resolve the information paradox. Besides, we also investigate the possibility of observing such a black hole at the Large Hadron Collider (LHC), and the results demonstrate that such a black hole cannot be produced at current LHC energies.

  7. Minimal length uncertainty principle and the trans-Planckian problem of black hole physics

    NASA Astrophysics Data System (ADS)

    Brout, R.; Gabriel, Cl.; Lubo, M.; Spindel, Ph.

    1999-02-01

    The minimal length uncertainty principle of Kempf, Mangano and Mann (KMM), as derived from a mutilated quantum commutator between coordinate and momentum, is applied to describe the modes and wave packets of Hawking particles evaporated from a black hole. The trans-Planckian problem is successfully confronted in that the Hawking particle no longer hugs the horizon at arbitrarily close distances. Rather, the mode of Schwarzschild frequency ω deviates from the conventional trajectory when the coordinate r is given by |r − 2M| ≈ β_H ω/2π in units of the nonlocal distance legislated into the uncertainty relation. Wave packets straddle the horizon and spread out to fill the whole nonlocal region. The charge carried by the packet (in the sense of the amount of "stuff" carried by the Klein-Gordon field) is not conserved in the nonlocal region and rapidly decreases to zero as time decreases. Read in the forward temporal direction, the nonlocal region thus is the seat of production of the Hawking particle and its partner. The KMM model was inspired by string theory, for which the mutilated commutator has been proposed to describe an effective theory of high-momentum scattering of zero-mass modes. It is here interpreted in terms of the dissipation which gives rise to the Hawking particle, into a reservoir of other modes (of as yet unknown origin). On this basis it is conjectured that the Bekenstein-Hawking entropy finds its origin in the fluctuations of fields extending over the nonlocal region.
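
    For context, the KMM deformation and the minimal length it implies read (a standard result, quoted for orientation):

      [\hat{x}, \hat{p}] = i\hbar\left(1 + \beta\hat{p}^{2}\right)
      \;\Longrightarrow\;
      \Delta x\,\Delta p \ge \frac{\hbar}{2}\left(1 + \beta(\Delta p)^{2} + \beta\langle\hat{p}\rangle^{2}\right)
      \;\Longrightarrow\;
      \Delta x_{\min} = \hbar\sqrt{\beta},

    the minimum being attained at (Δp)² = 1/β for vanishing ⟨p⟩.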

  8. Principle and Uncertainty Quantification of an Experiment Designed to Infer Actinide Neutron Capture Cross-Sections

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatorre; G. Imel; R. Pardo; F. Kondev; M. Paul

    2010-01-01

    An integral reactor physics experiment devoted to inferring higher-actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties related to the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL’s Advanced Test Reactor and, after a given time, determine the amount of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross-sections to be inferred, since the two are related by the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectroscopy (AMS) facility located at ANL. While AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes – even to A > 200. Because the detection limit of AMS is orders of magnitude lower than that of standard mass spectroscopy techniques, more transmutation products can be measured and, potentially, more cross-sections can be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
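
    A deliberately oversimplified version of the inference (single capture step, negligible burnup and decay; all numbers hypothetical) shows how an AMS atom ratio plus a known fluence yields an energy-integrated cross-section:

      # Toy transmutation inference: N_daughter / N_parent ~ sigma * phi * t
      # for a thin sample and short irradiation. Real analyses solve the full
      # transmutation equations and propagate flux and ratio uncertainties.
      BARN = 1e-24                 # cm^2
      phi = 1.0e14                 # assumed neutron flux, n/(cm^2 s)
      t = 55 * 24 * 3600.0         # 55-day irradiation, in seconds
      ratio = 2.6e-3               # hypothetical AMS-measured atom ratio

      sigma = ratio / (phi * t)    # cm^2
      print(f"one-group capture cross-section ~ {sigma / BARN:.2f} b")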

  9. Principles for Robust On-orbit Uncertainties Traceable to the SI (Invited)

    NASA Astrophysics Data System (ADS)

    Shirley, E. L.; Dykema, J. A.; Fraser, G. T.; Anderson, J.

    2009-12-01

    Climate-change research requires space-based measurements of the Earth’s spectral radiance, reflectance, and atmospheric properties with unprecedented accuracy. Increases in measurement accuracy would improve and accelerate the quantitative determination of decadal climate change. The increases would also permit attribution of climate change to anthropogenic causes and foster understanding of climate evolution on an accelerated time scale. Beyond merely answering key questions about global climate change, accurate measurements would also be of benefit by testing and refining climate models to enhance and quantify their predictive value. Accurate measurements imply traceability to the SI system of units. In this regard, traceability is a property of the result of a measurement, or the value of a standard, whereby it can be related to international standards through an unbroken chain of comparisons, all having stated (and realistic) uncertainties. SI-traceability allows one to compare measurements independent of locale, time, or sensor. In this way, SI-traceability alleviates the urgency to maintain a false assurance of measurement accuracy by having an unbroken time series of observations continually adjusted so that measurement results obtained with a given instrument match the measurement results of its recent predecessors. Moreover, to make quantitative inferences from measurement results obtained in various contexts, which might range, for instance, from radiometry to atmospheric chemistry, having SI-traceability throughout all work is essential. One can derive principles for robust claims of SI-traceability from lessons learned by the scientific community. In particular, National Measurement Institutes (NMIs), such as NIST, use several strategies in their realization of practical SI-traceable measurements of the highest accuracy: (1.) basing ultimate standards on fundamental physical phenomena, such as the Quantum Hall resistance, instead of measurement artifacts; (2.) developing a variety of approaches to measure a given physical quantity; (3.) conducting intercomparisons of measurements performed by different institutions; (4.) perpetually seeking complete understanding of all sources of measurement bias and uncertainty; (5.) rigorously analyzing measurement uncertainties; and (6.) maintaining a high level of transparency that permits peer review of measurement practices. It is imperative to establish SI-traceability at the beginning of an environmental satellite program. This includes planning for system-level pre-launch and, in particular, on-orbit instrument calibration. On-orbit calibration strategies should be insensitive to reasonably expected perturbations that arise during launch or on orbit, and one should employ strategies to validate on-orbit traceability. As a rule, optical systems with simple designs tend to be more amenable to robust calibration schemes.

  10. Femtoscopic scales in p + p and p + Pb collisions in view of the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Braun-Munzinger, P.; Karpenko, Iu. A.; Sinyukov, Yu. M.

    2013-08-01

    A method for quantum corrections of Hanbury-Brown/Twiss (HBT) interferometric radii produced by semi-classical event generators is proposed. These corrections account for the basic indistinguishability and mutual coherence of closely located emitters caused by the uncertainty principle. A detailed analysis is presented for pion interferometry in p + p collisions at LHC energy (√s = 7 TeV). A prediction is also presented of pion interferometric radii for p + Pb collisions at √s = 5.02 TeV. The hydrodynamic/hydrokinetic model with the UrQMD cascade as 'afterburner' is utilized for this aim. It is found that quantum corrections to the interferometry radii improve significantly the event generator results, which typically overestimate the experimental radii of small systems. A successful description of the interferometry structure of p + p collisions within the corrected hydrodynamic model requires the study of the problem of the thermalization mechanism, still a fundamental issue for ultrarelativistic A + A collisions, and also for high-multiplicity p + p and p + Pb events.

  11. f(R)-modified gravity, Wald entropy, and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Hammad, Fayal

    2015-08-01

    Wald's entropy formula allows one to find the entropy of a black hole's event horizon within any diffeomorphism-invariant theory of gravity. When applied to general relativity, the formula yields the Bekenstein-Hawking result but, for any other gravitational action that departs from the Hilbert action, the resulting entropy acquires an additional multiplicative factor that depends on the global geometry of the background spacetime. On the other hand, the generalized uncertainty principle (GUP) has recently been used extensively to investigate corrections to the Bekenstein-Hawking entropy formula, with the conclusion that the latter always comes multiplied by a factor that depends on the area of the event horizon. We show, by considering the case of an f(R)-modified gravity, that the usual black hole entropy derivation based on the GUP might be modified in such a way that the two methods yield the same corrections to the Bekenstein-Hawking formula. The procedure turns out to be an interesting method for seeking modified gravity theories. Two different versions of the GUP are used, and it is found that only one of them yields a viable modified gravity model. Conversely, it is possible to find a general formulation of the GUP that would reproduce the Wald entropy formula for any f(R) theory of gravity.

  12. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    PubMed

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction. PMID:26022837
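
    The tradeoff invoked here is easy to see numerically (illustrative parameters; not the study's stimuli): shortening a tone pulse broadens its spectrum, which is what limits pitch identification for sub-cycle sounds.

      # Gabor tradeoff demo: spectral width of a 1 kHz tone pulse vs duration.
      import numpy as np

      fs, f0 = 44100, 1000.0
      for cycles in (8.0, 2.0, 0.5):
          n = max(2, int(fs * cycles / f0))
          pulse = np.sin(2 * np.pi * f0 * np.arange(n) / fs)
          spec = np.abs(np.fft.rfft(pulse, 1 << 16))
          freqs = np.fft.rfftfreq(1 << 16, 1 / fs)
          half = spec >= spec.max() / 2
          bw = freqs[half].max() - freqs[half].min()  # rough half-height width
          print(f"{cycles:4} cycles -> spectral width ~ {bw:7.1f} Hz")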

  13. A Dark Energy Model with Generalized Uncertainty Principle in the Emergent, Intermediate and Logamediate Scenarios of the Universe

    NASA Astrophysics Data System (ADS)

    Ghosh, Rahul; Chattopadhyay, Surajit; Debnath, Ujjal

    2012-02-01

    This work is motivated by the work of Kim et al. (Mod. Phys. Lett. A 23:3049, 2008), which considered the equation-of-state parameter for the new agegraphic dark energy based on the generalized uncertainty principle coexisting with dark matter without interaction. In this work, we have considered the same dark energy interacting with dark matter in the emergent, intermediate and logamediate scenarios of the universe. Also, we have investigated the statefinder, kerk and lerk parameters in all three scenarios under this interaction. The energy density and pressure for the new agegraphic dark energy based on the generalized uncertainty principle have been calculated and their behaviors investigated. The evolution of the equation-of-state parameter has been analyzed in the interacting and non-interacting situations in all three scenarios. The graphical analysis shows that the dark energy behaves like quintessence in the logamediate expansion and like phantom in the emergent and intermediate expansions of the universe.

  14. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon, without any cutoff and without any constraint on the bulk's configuration, in contrast with the usual uncertainty principle. The system's density of states and free energy are convergent in the neighborhood of the horizon. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position, Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  15. Living with uncertainty: from the precautionary principle to the methodology of ongoing normative assessment

    NASA Astrophysics Data System (ADS)

    Dupuy, Jean-Pierre; Grinbaum, Alexei

    2005-03-01

    The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to the paralysis of action. What is needed is taking seriously the reality of the future. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (notion of moral luck). To cite this article: J.-P. Dupuy, A. Grinbaum, C. R. Geoscience 337 (2005).

  16. The precautionary principle and international conflict over domestic regulation: mitigating uncertainty and improving adaptive capacity.

    PubMed

    Oye, K A

    2005-01-01

    Disputes over invocation of precaution in the presence of uncertainty are building. This essay finds: (1) analysis of past WTO panel decisions and current EU-US regulatory conflicts suggests that appeals to scientific risk assessment will not resolve emerging conflicts; (2) Bayesian updating strategies, with commitments to modify policies as information emerges, may ameliorate conflicts over precaution in environmental and security affairs. PMID:16304935

  17. The effect of generalized uncertainty principle on square well, a case study

    SciTech Connect

    Ma, Meng-Sen; Zhao, Ren

    2014-08-15

    According to a special case (β = 0) of the generalized uncertainty relation, we derive the energy eigenvalues of the infinite potential well. It is shown that the obtained energy levels differ from the usual result by some correction terms. The correction terms of the energy eigenvalues are independent of all parameters except α, but the eigenstates depend on two additional parameters besides α.
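
    For reference, the uncorrected spectrum that the record's α-dependent terms modify is the textbook infinite-well result

      E_{n} = \frac{n^{2}\pi^{2}\hbar^{2}}{2mL^{2}}, \qquad n = 1, 2, \ldots,

    for a well of width L; the precise form of the GUP corrections is the record's result and is not reproduced here.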

  18. Phase-space noncommutative extension of the Robertson-Schrödinger formulation of Ozawa's uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Dias, Nuno Costa; Prata, João Nuno

    2015-03-01

    We revisit Ozawa's uncertainty principle (OUP) in the framework of noncommutative (NC) quantum mechanics. We derive a matrix version of the OUP accommodating any NC structure in the phase space, and compute NC corrections to lowest order for two measurement interactions, namely the backaction-evading quadrature amplifier and noiseless quadrature transducers. These NC corrections alter the nature of the measurement interaction, as a noiseless interaction may acquire noise, and an interaction of independent intervention may become dependent on the object system. However, the most striking result is that noncommutativity may lead to a violation of the OUP itself. The NC corrections for the backaction-evading quadrature amplifier reveal a new term which may potentially be amplified in such a way that the violation of the OUP becomes experimentally testable. On the other hand, the NC corrections to the noiseless quadrature transducer show an incompatibility of this model with NC quantum mechanics. We discuss the implications of this incompatibility for NC quantum mechanics and for Ozawa's uncertainty principle.
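
    For reference, the commutative Ozawa relation that the record deforms reads, for error ε(A) and disturbance η(B),

      \varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
      \;\ge\; \frac{1}{2}\,\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr|,

    which is weaker than the naive product bound ε(A)η(B) ≥ ½|⟨[A,B]⟩| and admits measurement models that evade the latter.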

  19. Chaos and the way of Zen: psychiatric nursing and the 'uncertainty principle'.

    PubMed

    Barker, P J

    1996-08-01

    The biological sciences have been dominated by 'classicist' science, predicated on the post-Enlightenment belief that a real world exists which behaves according to notions of causality and consistency. Although medicine, and by implication psychiatric nursing, derives its explanatory power from such a science, much of its focus, illness, is not amenable to causal explanation or prediction. The theoretical developments of the 'new physics' have been used to redefine science and, as a result, have challenged traditional constructions of reality. The new physics is usually framed in terms of the physical world, or used to construe consciousness. In this paper I consider the implications of chaos, a relative of the new physics, for psychiatric nursing practice. As nursing appears to crave a 'certainty principle' to govern the theoretical underpinnings of practice, this study considers how chaos might contribute to a metaparadigm of nursing. PMID:8997984

  20. Our Electron Model vindicates Schrödinger's Incomplete Results and Requires Restatement of Heisenberg's Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    McLeod, David; McLeod, Roger

    2008-04-01

    The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy, physiology of hearing, 1961, performed pressure-input Rect x experiments that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves the sign sense of EMF vectors, and is hard-wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

  1. Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators

    SciTech Connect

    Marchiolli, M.A.; Mendonça, P.E.M.F.

    2013-09-15

    We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. -- Highlights:
    • Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators.
    • Determination of new restrictions upon the selective process of signals and wavelet bases.
    • Demonstration of looser bounds interpolating between the tightest bound and the Massar–Spindel inequality.
    • Construction of finite ground states properly describing the tightest bound.
    • Establishment of an important connection with the discrete Weyl function.

  2. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    NASA Technical Reports Server (NTRS)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, Δt. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width ΔE, it is observed that the other member's wave packet collapses upon coincidence detection to a duration Δt, such that ΔE·Δt ≈ h/2π, where this duration Δt is an inner time in the sense of Aharonov and Bohm. We have measured Δt by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.

  3. About the Heisenberg's uncertainty principle and the determination of effective optical indices in integrated photonics at high sub-wavelength regime

    NASA Astrophysics Data System (ADS)

    Bêche, B.; Gaviot, E.

    2016-04-01

    Within Heisenberg's uncertainty principle, the impact of the uncertainty inequalities on the theory of integrated photonics in the sub-wavelength regime is explicitly discussed. More especially, the uncertainty of the effective index values in nanophotonics at sub-wavelength regime, the effective index being defined as the eigenvalue of the overall opto-geometric problem in integrated photonics, appears to stem directly from Heisenberg's uncertainty. An apt formula is obtained, allowing us to conclude that the uncertainty on the eigenvalue called the effective optical index, or propagation constant, is inversely proportional to the spatial dimensions of a given nanostructure, yielding a transfer of fuzziness onto the relevant eigenvalues below a specific limiting volume.

  4. On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Lapshin, Aleksey

    2013-04-01

    The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT, due to the action of Heisenberg's uncertainty principle, exhibits weak spatial localization, which manifests in the following: firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (window), is used for this purpose. A narrow enough window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, this results in significant uncertainty in frequency. If one chooses a wide enough window, it will, according to the same principle, increase the time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to establish the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at the current moment of time: it is possible to speak only about a range of frequencies. Besides, it is impossible to specify precisely the time moment of the presence of this or that frequency: it is possible to speak only about a time frame. It is this feature that imposes major constraints on the applicability of the STFT. In spite of the fact that the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, there is a possibility to analyze any signal using an alternative approach: the multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis. Thanks to it, low frequencies can be shown in more detail with respect to frequency, and high ones in more detail with respect to time. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3-d models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the application of the wavelet transform to calculate the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
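
    The window tradeoff described above can be checked directly (a self-contained numerical sketch, not the authors' geodetic code): for a Gaussian window the product of time spread and angular-frequency spread is pinned near 1/2, so narrowing one necessarily broadens the other.

      # Time-frequency tradeoff for Gaussian STFT windows of varying width.
      import numpy as np

      fs = 1000.0
      t = np.arange(-4, 4, 1 / fs)
      for s in (0.02, 0.1, 0.5):                  # window std dev, seconds
          w = np.exp(-t**2 / (2 * s**2))
          W = np.abs(np.fft.fftshift(np.fft.fft(w)))**2
          omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(t.size, 1 / fs))
          sigma_t = np.sqrt(np.sum(t**2 * w**2) / np.sum(w**2))
          sigma_w = np.sqrt(np.sum(omega**2 * W) / np.sum(W))
          print(f"s = {s:4.2f} s: sigma_t * sigma_omega = {sigma_t * sigma_w:.3f}")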

  5. Adding a strategic edge to human factors/ergonomics: principles for the management of uncertainty as cornerstones for system design.

    PubMed

    Grote, Gudela

    2014-01-01

    It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation human factors/ergonomics based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore. PMID:23622735

  6. The special theory of Brownian relativity: equivalence principle for dynamic and static random paths and uncertainty relation for diffusion.

    PubMed

    Mezzasalma, Stefano A

    2007-03-15

    The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected. PMID:17223124

  7. Universal Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Gour, Gilad

    2014-03-01

    Uncertainty relations are a distinctive characteristic of quantum theory that imposes intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring non-commuting observables. However, I will show here that there is no fundamental reason for using entropies as quantifiers; in fact, any functional relation that characterizes the uncertainty of the measurement outcomes can be used to define an uncertainty relation. Starting from a simple assumption that any measure of uncertainty is non-decreasing under mere relabeling of the measurement outcomes, I will show that Schur-concave functions are the most general uncertainty quantifiers. I will then introduce a novel fine-grained uncertainty relation written in terms of a majorization relation, which generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary measures of uncertainty. This infinite family of uncertainty relations includes all the known entropic uncertainty relations, but is not limited to them. In this sense, the relation is universally valid and captures the essence of the uncertainty principle in quantum theory. This talk is based on a joint work with Shmuel Friedland and Vlad Gheorghiu. This research is supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada and by the Pacific Institute for Mathematical Sciences (PIMS).

  8. Two new kinds of uncertainty relations

    NASA Technical Reports Server (NTRS)

    Uffink, Jos

    1994-01-01

    We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.

  9. Uncertainty and nonseparability

    NASA Astrophysics Data System (ADS)

    de La Torre, A. C.; Catuogno, P.; Ferrando, S.

    1989-06-01

    A quantum covariance function is introduced whose real and imaginary parts are related to the independent contributions to the uncertainty principle: noncommutativity of the operators and nonseparability. It is shown that factorizability of states is a sufficient but not necessary condition for separability. It is suggested that all quantum effects could be considered to be a consequence of nonseparability alone.

  10. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  11. Measurement uncertainty.

    PubMed

    Bartley, David; Lidén, Göran

    2008-08-01

    The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined componentwise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can often be easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning can sometimes be given to this uncertainty in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. "Without a measureless and perpetual uncertainty, the drama of human life would be destroyed." (Winston Churchill) PMID:18573808

  12. Entropic uncertainty relations for multiple measurements

    NASA Astrophysics Data System (ADS)

    Liu, Shang; Mu, Liang-Zhu; Fan, Heng

    2015-04-01

    We present the entropic uncertainty relations for multiple measurement settings which demonstrate the uncertainty principle of quantum mechanics. Those uncertainty relations are obtained for both cases with and without the presence of quantum memory, and can be proven by a unified method. Our results recover some well known entropic uncertainty relations for two observables, which show the uncertainties about the outcomes of two incompatible measurements. The bounds of those relations which quantify the extent of the uncertainty take concise forms and are easy to calculate. Those uncertainty relations might play important roles in the foundations of quantum theory. Potential experimental demonstration of those entropic uncertainty relations is discussed.
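
    For the two-observable special case that these relations recover, the Maassen-Uffink bound H(Z) + H(X) >= 1 for the mutually unbiased sigma_z and sigma_x bases of a qubit can be checked numerically; a minimal sketch (a standard textbook illustration, not code from the paper):

        import numpy as np

        rng = np.random.default_rng(3)

        def shannon(p):
            p = p[p > 1e-12]
            return float(-(p * np.log2(p)).sum())

        Z = np.eye(2)                                 # sigma_z eigenbasis (columns)
        X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # sigma_x eigenbasis (columns)

        for _ in range(3):
            psi = rng.normal(size=2) + 1j * rng.normal(size=2)
            psi /= np.linalg.norm(psi)                # random pure qubit state
            pZ = np.abs(Z.conj().T @ psi) ** 2        # outcome distributions
            pX = np.abs(X.conj().T @ psi) ** 2
            print(f"H(Z) + H(X) = {shannon(pZ) + shannon(pX):.3f}  (bound: 1)")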

  13. Uncertainties in Long-Term Geologic Offset Rates of Faults: General Principles Illustrated With Data From California and Other Western States

    NASA Astrophysics Data System (ADS)

    Bird, P.

    2006-12-01

    Because the slip rates of seismic faults are highly variable, a better target for statistical estimation is the long-term offset rate, which can be defined as the rate of one component of the slip which would be measured between any two times when fault-plane shear tractions are equal. The probability density function for the sum of elastic offset plus fault slip offset since a particular geologic event includes uncertainties associated with changes in elastic strain between that event and the present, which are estimated from the sizes of historic earthquake offsets on other faults of similar type. The probability density function for the age of a particular geologic event may be non-Gaussian, especially if it is determined from cross-cutting relations, or from radiocarbon or cosmogenic-nuclide ages containing inheritance. Two alternate convolution formulas relating the distributions for offset and age give the probability density function for long-term offset rate; these are computed for most published cases of dated offset features along active faults in California and other western states. After defining a probabilistic measure of disagreement between two long-term offset rate distributions measured on the same fault section, I investigate how disagreement varies with geologic time (difference in age of the offset features) and with publication type (primary, secondary, or tertiary). Patterns of disagreement suggest that at least 4.3% of offset rates in primary literature are incorrect (due to failure to span the whole fault, undetected complex initial shapes of offset features, or faulty correlation in space or in geologic time) or unrepresentative (due to variations in offset rate along the trace). Tertiary (third-hand) literature sources have a higher error rate of 14.5%. In the western United States, it appears that rates from offset features as old as 3 Ma can be averaged without introducing age-dependent bias. Offsets of older features can and should be used as well, but it is necessary to make allowance for the increased risk, rising rapidly to 48%, that they are inapplicable to neotectonics. Based on these results, best-estimate combined probability density functions are computed for the long-term offset rates of all active faults in California and other conterminous western states, and described in tables using several scalar measures. Of 849 active and potentially active faults in the conterminous western United States, only 48 are "well-constrained" (having combined probability density functions for long-term offset rate in which the width of the 95%-confidence range is smaller than the median). It appears to require about 4 offset features to give an even chance of achieving a well-constrained combined rate, and at least 7 offset features to guarantee it.
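
    The convolution of the offset and age distributions can be approximated by simple Monte Carlo sampling; the sketch below uses entirely hypothetical numbers for one dated offset feature, with a lognormal age standing in for a non-Gaussian date:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        offset = rng.normal(120.0, 10.0, n)             # offset, metres (hypothetical)
        age = rng.lognormal(np.log(12_000.0), 0.15, n)  # age, years (skewed, hypothetical)
        rate = offset / age * 1000.0                    # long-term offset rate, mm/yr

        lo, med, hi = np.percentile(rate, [2.5, 50.0, 97.5])
        print(f"offset rate: {med:.1f} mm/yr (95% range {lo:.1f}-{hi:.1f})")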

  14. Theoretical analysis of uncertainty visualizations

    NASA Astrophysics Data System (ADS)

    Zuk, Torre; Carpendale, Sheelagh

    2006-01-01

    Although a number of theories and principles have been developed to guide the creation of visualizations, it is not always apparent how to apply the knowledge in these principles. We describe the application of perceptual and cognitive theories for the analysis of uncertainty visualizations. General principles from Bertin, Tufte, and Ware are outlined and then applied to the analysis of eight different uncertainty visualizations. The theories provided a useful framework for analysis of the methods, and provided insights into the strengths and weaknesses of various aspects of the visualizations.

  15. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  16. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimating measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different ways of taking uncertainty into account in compliance assessment are explained.
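
    A minimal GUM-style sketch of one such strategy, combining assumed type A and type B standard uncertainties in quadrature and expanding with a coverage factor k = 2 (all budget entries are hypothetical):

        import math

        budget = {  # standard uncertainties, mg/L (hypothetical)
            "repeatability (type A)": 0.8,
            "calibration standard (type B)": 0.5,
            "volumetric glassware (type B)": 0.3,
        }
        u_c = math.sqrt(sum(u ** 2 for u in budget.values()))  # combined standard uncertainty
        U = 2.0 * u_c                                          # expanded uncertainty, k = 2 (~95 %)
        print(f"u_c = {u_c:.2f} mg/L, U (k=2) = {U:.2f} mg/L")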

  17. Quantal localization and the uncertainty principle

    SciTech Connect

    Leopold, J.G.; Richards, D.

    1988-09-01

    We give a dynamical explanation for the localization of the wave function for the one-dimensional hydrogen atom, with the Coulomb singularity, in a high-frequency electric field, which leads to a necessary condition for classical dynamics to be valid. Numerical tests confirm the accuracy of the condition. Our analysis is relevant to the comparison between the classical and quantal dynamics of the kicked rotor and standard map.

  18. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  19. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    Uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations, the operator form concerning the variances of physical observables and the entropy form related to entropic quantities. Both these forms are inequalities involving pairwise observables, and are found to be nontrivial to incorporate multiple observables. In this work we introduce a new form of uncertainty relation which may give out complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  1. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  3. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  5. Rényi entropy uncertainty relation for successive projective measurements

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-06-01

    We investigate the uncertainty principle for two successive projective measurements on a single quantum system in terms of Rényi entropy. Our results cover a large family of entropic (including Shannon-entropy) uncertainty relations with an optimal lower bound. We compare our relation with other formulations of the uncertainty principle for two spin observables measured on a pure qubit state. It is shown that the lower bound of our uncertainty relation is tighter.
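
    A small helper showing how the order parameter enters (the outcome distributions below are hypothetical, for illustration only; alpha -> 1 recovers the Shannon case the relation includes):

        import numpy as np

        def renyi(p, alpha):
            """Rényi entropy of order alpha, in bits."""
            p = np.asarray(p, dtype=float)
            if abs(alpha - 1.0) < 1e-9:   # Shannon limit
                q = p[p > 0]
                return float(-(q * np.log2(q)).sum())
            return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

        p_first = [0.85, 0.15]   # outcomes of the first measurement (hypothetical)
        p_second = [0.50, 0.50]  # outcomes of the second measurement (hypothetical)
        for a in (0.5, 1.0, 2.0):
            total = renyi(p_first, a) + renyi(p_second, a)
            print(f"alpha={a}: H_a(first) + H_a(second) = {total:.3f}")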

  6. Majorization formulation of uncertainty in quantum mechanics

    SciTech Connect

    Partovi, M. Hossein

    2011-11-15

    Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.
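
    The partial order at the heart of this formulation is easy to state in code: p majorizes q when every partial sum of p's components, sorted in decreasing order, dominates the corresponding partial sum of q's. A minimal sketch:

        import numpy as np

        def majorizes(p, q):
            """True if probability vector p majorizes q (p is the 'less uncertain' one)."""
            p = np.sort(np.asarray(p, dtype=float))[::-1]
            q = np.sort(np.asarray(q, dtype=float))[::-1]
            return bool(np.all(np.cumsum(p) >= np.cumsum(q) - 1e-12))

        print(majorizes([0.7, 0.2, 0.1], [0.4, 0.35, 0.25]))  # True: peaked beats flatter
        print(majorizes([1/3, 1/3, 1/3], [0.5, 0.3, 0.2]))    # False: uniform is majorized by all

    Any Schur-concave measure (every quasientropic measure in particular) is monotone under this order, which is why a single majorization bound generates a whole family of scalar uncertainty relations.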

  7. Improved entropic uncertainty relations and information exclusion relations

    NASA Astrophysics Data System (ADS)

    Coles, Patrick J.; Piani, Marco

    2014-02-01

    The uncertainty principle can be expressed in entropic terms, also taking into account the role of entanglement in reducing uncertainty. The information exclusion principle bounds instead the correlations that can exist between the outcomes of incompatible measurements on one physical system, and a second reference system. We provide a more stringent formulation of both the uncertainty principle and the information exclusion principle, with direct applications for, e.g., the security analysis of quantum key distribution, entanglement estimation, and quantum communication. We also highlight a fundamental distinction between the complementarity of observables in terms of uncertainty and in terms of information.

  8. Generalized Entropic Uncertainty Relations with Tsallis' Entropy

    NASA Technical Reports Server (NTRS)

    Portesi, M.; Plastino, A.

    1996-01-01

    A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.

  9. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
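
    A short illustration of Latin Hypercube Sampling on a toy linear model (the model and parameter bounds are invented for illustration; scipy.stats.qmc provides the sampler):

        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=3, seed=0)
        u = sampler.random(n=100)                             # stratified points in [0, 1)^3
        x = qmc.scale(u, l_bounds=[0, 0, 0], u_bounds=[1, 2, 3])

        y = x[:, 0] + 2.0 * x[:, 1] - x[:, 2]                 # toy model y = a + 2b - c
        print(f"LHS: mean = {y.mean():.3f} (exact 1.0), sd = {y.std(ddof=1):.3f}")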

  10. Entropic uncertainty relations under the relativistic motion

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2013-10-01

    The uncertainty principle bounds our ability to simultaneously predict two incompatible observables of a quantum particle. Assisted by a quantum memory to store the particle, this uncertainty could be reduced and quantified by a new Entropic Uncertainty Relation (EUR). In this Letter, we explore how the relativistic motion of the system would affect the EUR in two sample scenarios. First, we show that the Unruh effect of an accelerating particle would surely increase the uncertainty if the system and particle are entangled initially. On the other hand, the entanglement could be generated from nonuniform motion once the Unruh decoherence is prevented by utilizing the cavity. We show that, in an uncertainty game between an inertial cavity and a nonuniformly accelerated one, the uncertainty evolves periodically with respect to the duration of the acceleration segment. Therefore, with properly chosen cavity parameters, the uncertainty bound could be protected. Implications of our results for gravitation are also discussed.

  12. Uncertainty in the Classroom--Teaching Quantum Physics

    ERIC Educational Resources Information Center

    Johansson, K. E.; Milstead, D.

    2008-01-01

    The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how…

  13. Entropic uncertainty relations in multidimensional position and momentum spaces

    SciTech Connect

    Huang Yichen

    2011-05-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

  14. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

  15. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  16. Principled leadership.

    PubMed

    Lexa, Frank James

    2010-07-01

    Leadership is not just a set of activities; it is also about vision and character. Principles matter: for you, for your coworkers, and for the group or institution you serve. Individuals and groups can succeed only through a climate of commitment and trust. Your integrity and principled leadership are the cornerstones for building an effective team. Following principles doesn't mean that you will win every time, but having a plan and sticking to it even in tough times is a strong element of long-term success. PMID:20630389

  17. Abolishing the maximum tension principle

    NASA Astrophysics Data System (ADS)

    Dąbrowski, Mariusz P.; Gohar, H.

    2015-09-01

    We find a series of example theories for which the relativistic limit of maximum tension F_max = c^4/(4G), represented by the entropic force, can be abolished. Among them are the varying-constants theories, some generalized entropy models applied to both cosmological and black-hole horizons, as well as some generalized uncertainty principle models.

  18. Angular performance measure for tighter uncertainty relations

    SciTech Connect

    Hradil, Z.; Rehacek, J.; Klimov, A. B.; Rigas, I.; Sanchez-Soto, L. L.

    2010-01-15

    The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance measure that allows for tighter uncertainty relations for angle and angular momentum. The differences from previous bounds can be significant for particular states and indeed may be amenable to experimental measurement with present technology.

  19. The physical origins of the uncertainty theorem

    NASA Astrophysics Data System (ADS)

    Giese, Albrecht

    2013-10-01

    The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

  20. Principles of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Landé, Alfred

    2013-10-01

    Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ (x) and σ (p); 11. Complementarity; 12. Mathematical relation between ρ (x) and σ (p) for free particles; 13. General relation between ρ (q) and σ (p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ (t) and σ (є); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψp (q) and Xq (p); 39. Differential equation for фβ (q); 40. The general probability amplitude Φβ' (Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schrödinger's equation for non-conservative systems; 46. Pertubation theory; 47. Orthogonality, normalization and Hermitian conjugacy; 48. General matrix elements; Part IV. The Principle of Correspondence: 49. Contact transformations in classical mechanics; 50. Point transformations; 51. Contact transformations in quantum mechanics; 52. Constants of motion and angular co-ordinates; 53. Periodic orbits; 54. De Broglie and Schrödinger function; correspondence to classical mechanics; 55. Packets of probability; 56. Correspondence to hydrodynamics; 57. Motion and scattering of wave packets; 58. Formal correspondence between classical and quantum mechanics; Part V. Mathematical Appendix: Principle of Invariance: 59. The general theorem of transformation; 60. Operator calculus; 61. Exchange relations; three criteria for conjugacy; 62. First method of canonical transformation; 63. Second method of canonical transformation; 64. Proof of the transformation theorem; 65. Invariance of the matrix elements against unitary transformations; 66. Matrix mechanics; Index of literature; Index of names and subjects.

  1. Psychosomatic Principles

    PubMed Central

    Cleghorn, R. A.

    1965-01-01

    There are four lines of development that might be called psychosomatic principles. The first represents the work initiated by Claude Bernard, Cannon, and others, in neurophysiology and endocrinology in relationship to stress. The second is the application of psychoanalytic formulations to the understanding of illness. The third is in the development of the social sciences, particularly anthropology, social psychology and sociology with respect to the emotional life of man, and, fourth, there is an increased application of epidemiological techniques to the understanding and incidence of disease and its causes. These principles can be applied to the concepts of comprehensive medicine and they bid fair to be unifying and helpful in its study. This means that future practitioners, as well as those working in the field of psychosomatic medicine, are going to have to have a much more precise knowledge of the influence of emotions on bodily processes. PMID:14259334

  2. Radar principles

    NASA Technical Reports Server (NTRS)

    Sato, Toru

    1989-01-01

    Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.

  3. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  5. Role of information theoretic uncertainty relations in quantum theory

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-04-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson-Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson-Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  7. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
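
    The componentwise bookkeeping described above reduces to products and a quadrature sum; a sketch with entirely hypothetical sensitivities and measurement uncertainties:

        import math

        components = {  # property: (sensitivity, measurement uncertainty), hypothetical
            "aerosol optical depth": (25.0, 0.01),
            "single scattering albedo": (30.0, 0.03),
            "asymmetry parameter": (10.0, 0.02),
            "surface albedo": (15.0, 0.01),
        }
        contribs = {k: s * u for k, (s, u) in components.items()}  # W m-2 each
        total = math.sqrt(sum(c ** 2 for c in contribs.values()))
        for k, c in sorted(contribs.items(), key=lambda kv: -kv[1]):
            print(f"{k:>26s}: {c:5.2f} W m-2")
        print(f"{'total (quadrature)':>26s}: {total:5.2f} W m-2")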

  8. Reproducibility and uncertainty of wastewater turbidity measurements.

    PubMed

    Joannis, C; Ruban, G; Gromaire, M-C; Chebbo, G; Bertrand-Krajewski, J-L

    2008-01-01

    Turbidity monitoring is a valuable tool for operating sewer systems, but it is often considered a somewhat tricky parameter for assessing water quality, because measured values depend on the model of sensor, and even on the operator. This paper details the main components of the uncertainty in turbidity measurements with a special focus on reproducibility, and provides guidelines for improving the reproducibility of measurements in wastewater relying on proper calibration procedures. Calibration appears to be the main source of uncertainties, and proper procedures must account for uncertainties in standard solutions as well as nonlinearity of the calibration curve. With such procedures, the uncertainty and reproducibility of field measurements can be kept below 5% or 25 FAU. On the other hand, reproducibility has no meaning if different measuring principles (attenuation vs. nephelometry) or very different wavelengths are used. PMID:18520026

  9. Regulating under uncertainty: newsboy for exposure limits.

    PubMed

    Cooke, Roger M; Macdonell, Margaret

    2008-06-01

    Setting action levels or limits for health protection is complicated by uncertainty in the dose-response relation across a range of hazards and exposures. To address this issue, we consider the classic newsboy problem. The principles used to manage uncertainty for that case are applied to two stylized exposure examples, one for high dose and high dose rate radiation and the other for ammonia. Both incorporate expert judgment on uncertainty quantification in the dose-response relationship. The mathematical technique of probabilistic inversion also plays a key role. We propose a coupled approach, whereby scientists quantify the dose-response uncertainty using techniques such as structured expert judgment with performance weights and probabilistic inversion, and stakeholders quantify associated loss rates. PMID:18643816
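
    In the newsboy framing, the optimal limit sits at a critical fractile of the uncertain dose-response threshold, F(L) = c_over / (c_over + c_under). A sketch with invented loss rates and an invented lognormal threshold distribution (not the paper's elicited quantities):

        from scipy import stats

        c_under = 9.0  # loss rate per unit of under-protection (hypothetical)
        c_over = 1.0   # loss rate per unit of over-protection (hypothetical)
        fractile = c_over / (c_over + c_under)         # = 0.1

        threshold = stats.lognorm(s=0.5, scale=100.0)  # uncertain harm threshold, ppm (hypothetical)
        limit = threshold.ppf(fractile)
        print(f"set exposure limit at the {100 * fractile:.0f}th percentile: {limit:.1f} ppm")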

  10. Uncertainty in audiometer calibration

    NASA Astrophysics Data System (ADS)

    Aurélio Pedroso, Marcos; Gerges, Samir N. Y.; Gonçalves, Armando A., Jr.

    2004-02-01

    The objective of this work is to present a metrology study necessary for the accreditation of audiometer calibration procedures at the National Brazilian Institute of Metrology Standardization and Industrial Quality—INMETRO. A model for the calculation of measurement uncertainty was developed. Metrological aspects relating to audiometer calibration, traceability and measurement uncertainty were quantified through comparison between results obtained at the Industrial Noise Laboratory—LARI of the Federal University of Santa Catarina—UFSC and the Laboratory of Electric/acoustics—LAETA of INMETRO. Similar metrological performance of the measurement system used in both laboratories was obtained, indicating that the interlaboratory results are compatible with the expected values. The uncertainty calculation was based on the documents: EA-4/02 Expression of the Uncertainty of Measurement in Calibration (European Co-operation for Accreditation 1999 EA-4/02 p 79) and Guide to the Expression of Uncertainty in Measurement (International Organization for Standardization 1993 1st edn, corrected and reprinted in 1995, Geneva, Switzerland). Some sources of uncertainty were calculated theoretically (uncertainty type B) and other sources were measured experimentally (uncertainty type A). The global value of uncertainty calculated for the sound pressure levels (SPLs) is similar to that given by other calibration institutions. The results of uncertainty related to measurements of SPL were compared with the maximum uncertainties Umax given in the standard IEC 60645-1: 2001 (International Electrotechnical Commission 2001 IEC 60645-1 Electroacoustics—Audiological Equipment—Part 1:—Pure-Tone Audiometers).

  11. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  12. Uncertainty quantification for Markov chain models

    NASA Astrophysics Data System (ADS)

    Meidani, Hadi; Ghanem, Roger

    2012-12-01

    Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
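
    A minimal sketch of the idea: place a probability measure on the rows of the transition matrix and propagate it through the chain by Monte Carlo. With no constraints beyond row normalization, the maximum-entropy measure over each row is the flat Dirichlet; the non-flat concentrations used here, and the 3-state disease chain itself, are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        x0 = np.array([1.0, 0.0, 0.0])  # start: everyone susceptible
        n_samples, step = 5000, 10
        p_infected = np.empty(n_samples)
        for k in range(n_samples):
            P = np.vstack([             # rows: S, I, R (concentrations hypothetical)
                rng.dirichlet([8.0, 1.5, 0.5]),
                rng.dirichlet([0.5, 6.0, 3.5]),
                rng.dirichlet([0.2, 0.3, 9.5]),
            ])
            p_infected[k] = (x0 @ np.linalg.matrix_power(P, step))[1]
        print(f"P(infected at step {step}): {p_infected.mean():.3f} +/- {p_infected.std():.3f}")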

  13. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-04-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, including for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
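
    A compressed version of the Monte Carlo approach for one signature, the runoff ratio, under assumed multiplicative errors in rainfall and flow (all series and error magnitudes are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(42)
        days = 365
        rain_obs = rng.gamma(0.4, 10.0, days)  # "observed" rainfall, mm/day (synthetic)
        flow_obs = 0.4 * rain_obs              # "observed" flow, mm/day (synthetic)

        n = 2000
        rr = np.empty(n)
        for i in range(n):
            rain = rain_obs * rng.normal(1.0, 0.10, days)  # ~10 % rainfall error (assumed)
            flow = flow_obs * rng.normal(1.0, 0.15, days)  # ~15 % rating error (assumed)
            rr[i] = flow.sum() / rain.sum()                # runoff-ratio signature
        lo, hi = np.percentile(rr, [2.5, 97.5])
        print(f"runoff ratio: {np.median(rr):.3f} (95% interval {lo:.3f}-{hi:.3f})")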

  15. Geometrical Uncertainties in Radiotherapy

    NASA Astrophysics Data System (ADS)

    Remeijer, Peter

    Geometrical uncertainties are a fact in any radiotherapy practice. Lasers can be misaligned, patients are mobile and the definition of the target volume is not always very easy. To deal with these uncertainties a safety margin is applied, i.e. a larger volume than the target itself is treated. In this chapter we will discuss common sources of geometrical uncertainties and how to compute these safety margins.
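
    One widely cited recipe for such margins is van Herk's M = 2.5 * Sigma + 0.7 * sigma, where Sigma and sigma combine the systematic and random standard deviations in quadrature; whether this chapter uses that particular recipe is an assumption here, and the component values below are hypothetical:

        import math

        systematic = {"setup": 2.0, "organ motion": 1.5, "delineation": 2.5}  # mm, hypothetical
        random_ = {"setup": 2.5, "organ motion": 2.0}                         # mm, hypothetical

        Sigma = math.sqrt(sum(v ** 2 for v in systematic.values()))
        sigma = math.sqrt(sum(v ** 2 for v in random_.values()))
        margin = 2.5 * Sigma + 0.7 * sigma
        print(f"Sigma = {Sigma:.1f} mm, sigma = {sigma:.1f} mm, margin = {margin:.1f} mm")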

  16. Generalized uncertainty principle and self-adjoint operators

    SciTech Connect

    Balasubramanian, Venkat; Das, Saurya; Vagenas, Elias C.

    2015-09-15

    In this work we explore the self-adjointness of the GUP-modified momentum and Hamiltonian operators over different domains. In particular, we utilize von Neumann's theorem for symmetric operators in order to determine whether the momentum and Hamiltonian operators are self-adjoint, or whether they have self-adjoint extensions over the given domain. In addition, a simple example of the Hamiltonian operator describing a particle in a box is given. The solutions of the boundary conditions that describe the self-adjoint extensions of the specific Hamiltonian operator are obtained.

  17. Phase-space noncommutative formulation of Ozawa's uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

    2014-08-01

    Ozawa's measurement-disturbance relation is generalized to a phase-space noncommutative extension of quantum mechanics. It is shown that the measurement-disturbance relations have additional terms for backaction evading quadrature amplifiers and for noiseless quadrature transducers. Several distinctive features appear as a consequence of the noncommutative extension: measurement interactions which are noiseless, and observables which are undisturbed by a measurement, or of independent intervention in ordinary quantum mechanics, may acquire noise, become disturbed by the measurement, or no longer be an independent intervention in noncommutative quantum mechanics. It is also found that there can be states which violate Ozawa's universal noise-disturbance trade-off relation, but verify its noncommutative deformation.

  19. The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment

    ERIC Educational Resources Information Center

    Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea

    2010-01-01

    An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…

  20. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.

  1. Physics and Operational Research: measure of uncertainty via Nonlinear Programming

    NASA Astrophysics Data System (ADS)

    Davizon-Castillo, Yasser A.

    2008-03-01

    Physics and Operational Research interact in interdisciplinary problems arising in Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming), subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review also addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case of study through a novel way of quantifying uncertainty.
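
    The abstract's pairing of uncertainty relations with NLP machinery can be illustrated with a toy program: minimize a hypothetical spread cost subject to the Heisenberg relation imposed as an equality constraint, letting an SLSQP solver handle the Lagrangian/KKT conditions. The objective and units are invented for illustration.

```python
# Toy nonlinear program: minimize dx**2 + dp**2 subject to the equality
# uncertainty relation dx*dp = hbar/2 (hbar set to 1 for illustration).
# SLSQP handles the Lagrangian/KKT machinery internally.
import numpy as np
from scipy.optimize import minimize

HBAR = 1.0

objective = lambda v: v[0] ** 2 + v[1] ** 2          # v = (dx, dp)
constraint = {"type": "eq", "fun": lambda v: v[0] * v[1] - HBAR / 2}

res = minimize(objective, x0=[1.0, 1.0], method="SLSQP",
               constraints=[constraint],
               bounds=[(1e-9, None), (1e-9, None)])
dx, dp = res.x
print(f"dx = {dx:.4f}, dp = {dp:.4f}, dx*dp = {dx * dp:.4f}")
# KKT stationarity predicts dx = dp = sqrt(hbar/2) at the optimum.
```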

  2. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
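
    Since PPD depends only on PMV, the propagation step can be sketched by Monte Carlo. The PPD(PMV) formula below is the one commonly quoted from ISO 7730; the assumed PMV value and its standard uncertainty are illustrative only.

```python
# Monte Carlo propagation of an assumed PMV uncertainty into PPD, using
# the PPD(PMV) formula commonly quoted from ISO 7730. The input standard
# uncertainty (0.2 PMV units) is purely illustrative.
import numpy as np

def ppd(pmv):
    """Predicted percentage dissatisfied as a function of PMV."""
    return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

rng = np.random.default_rng(0)
pmv_samples = rng.normal(loc=0.5, scale=0.2, size=100_000)
ppd_samples = ppd(pmv_samples)

print(f"PPD at PMV=0.5: {ppd(0.5):.1f}%")
print(f"MC mean PPD:    {ppd_samples.mean():.1f}%")   # differs: ppd is nonlinear
print(f"95% interval:   {np.percentile(ppd_samples, 2.5):.1f}%"
      f" to {np.percentile(ppd_samples, 97.5):.1f}%")
```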

  3. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  4. [The precautionary principle and the environment].

    TOXLINE Toxicology Bibliographic Information

    de Cózar Escalante JM

    2005-03-01

    The precautionary principle is a response to uncertainty in the face of risks to health or the environment. In general, it involves taking measures to avoid potential harm, despite lack of scientific certainty. In recent years it has been applied, not without difficulties, as a legal and political principle in many countries, particularly on the European and International level. In spite of the controversy, the precautionary principle has become an integral component of a new paradigm for the creation of public policies needed to meet today's challenges and those of the future.

  5. [The precautionary principle and the environment].

    PubMed

    de Cózar Escalante, José Manuel

    2005-01-01

    The precautionary principle is a response to uncertainty in the face of risks to health or the environment. In general, it involves taking measures to avoid potential harm, despite lack of scientific certainty. In recent years it has been applied, not without difficulties, as a legal and political principle in many countries, particularly on the European and International level. In spite of the controversy, the precautionary principle has become an integral component of a new paradigm for the creation of public policies needed to meet today's challenges and those of the future. PMID:15913050

  6. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  7. Intolerance of Uncertainty

    PubMed Central

    Beier, Meghan L.

    2015-01-01

    Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700

  8. Image restoration, uncertainty, and information.

    PubMed

    Yu, F T

    1969-01-01

    Some of the physical interpretations about image restoration are discussed. From the theory of information, the unrealizability of an inverse filter can be explained by degradation of information, which is due to distortion on the recorded image. Image restoration is a time and space problem, which can be recognized from the theory of relativity (the problem of image restoration is related to Heisenberg's uncertainty principle in quantum mechanics). A detailed discussion of the relationship between information and energy is given. Two general results may be stated: (1) the restoration of the image from the distorted signal is possible only if it satisfies the detectability condition; however, the restored image can, at best, only approach the maximum allowable time criterion. (2) The restoration of an image by superimposing the distorted signal (due to smearing) is a physically unrealizable method; however, this restoration procedure may be achieved by the expenditure of an infinite amount of energy. PMID:20072171

  9. Uncertainty relations and precession of perihelion

    NASA Astrophysics Data System (ADS)

    Scardigli, Fabio; Casadio, Roberto

    2016-03-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the perihelion precession for planets in the solar system, and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.
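
    For orientation, the one-parameter form of the GUP usually assumed in such analyses reads as follows; this is the generic textbook form, not necessarily the exact convention of the paper.

```latex
% Commonly used one-parameter form of the Generalized Uncertainty
% Principle; \beta is the deformation parameter constrained by the data.
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \beta\,(\Delta p)^2\,\right]
```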

  10. Controlling entropic uncertainty bound through memory effects

    NASA Astrophysics Data System (ADS)

    Karpat, Göktuğ; Piilo, Jyrki; Maniscalco, Sabrina

    2015-09-01

    One of the defining traits of quantum mechanics is the uncertainty principle which was originally expressed in terms of the standard deviation of two observables. Alternatively, it can be formulated using entropic measures, and can also be generalized by including a memory particle that is entangled with the particle to be measured. Here we consider a realistic scenario where the memory particle is an open system interacting with an external environment. Through the relation of conditional entropy to mutual information, we provide a link between memory effects and the rate of change of conditional entropy controlling the lower bound of the entropic uncertainty relation. Our treatment reveals that the memory effects stemming from the non-Markovian nature of quantum dynamical maps directly control the lower bound of the entropic uncertainty relation in a general way, independently of the specific type of interaction between the memory particle and its environment.
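
    The memory-assisted entropic uncertainty relation underlying this work is usually quoted in the form below (Berta et al.), where S denotes the von Neumann conditional entropy; the notation is generic and may differ from the paper's.

```latex
% Entropic uncertainty relation with quantum memory B for observables Q, R
% measured on particle A; c = max_{i,j} |<q_i|r_j>|^2 is the basis overlap.
S(Q|B) + S(R|B) \;\ge\; \log_2\frac{1}{c} + S(A|B)
```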

  11. The Uncertainty Relation for Quantum Propositions

    NASA Astrophysics Data System (ADS)

    Zizzi, Paola

    2013-01-01

    Logical propositions with the fuzzy modality "Probably" are shown to obey an uncertainty principle very similar to that of Quantum Optics. In the case of such propositions, the partial truth values are in fact probabilities. The corresponding assertions in the metalanguage have complex assertion degrees which can be interpreted as probability amplitudes. In the logical case, the uncertainty relation is about the assertion degree, which plays the role of the phase, and the total number of atomic propositions, which plays the role of the number of modes. In analogy with coherent states in quantum physics, we define as "quantum coherent propositions" those which minimize the above logical uncertainty relation. Finally, we show that there is only one kind of compound quantum-coherent proposition: "cat state" propositions.

  12. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications. PMID:22042902

  13. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  14. Adaptive framework for uncertainty analysis in electromagnetic field measurements.

    PubMed

    Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano

    2015-04-01

    Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has internationally been adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28% in measurement uncertainty. PMID:25143178

  15. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  16. Evaluating prediction uncertainty

    SciTech Connect

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
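
    A minimal sketch of the two ingredients named above, replicated Latin hypercube sampling and a variance-ratio importance indicator; the toy model, the input ranges, and the crude binning estimator are stand-ins for the report's methodology.

```python
# Sketch: replicated Latin hypercube sampling of model inputs plus a
# variance-ratio importance indicator. The toy model is hypothetical.
import numpy as np
from scipy.stats import qmc

def model(x):
    """Toy prediction: nonlinear in x0, quadratic in x1, weak in x2."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.05 * x[:, 2]

n_reps, n_samples, dim = 20, 200, 3
rng = np.random.default_rng(1)

# Replicated LHS: independent hypercube designs over [0, 1]^dim.
inputs, outputs = [], []
for _ in range(n_reps):
    x = qmc.LatinHypercube(d=dim, seed=rng).random(n_samples)
    inputs.append(x)
    outputs.append(model(x))
x_all, y_all = np.vstack(inputs), np.concatenate(outputs)

total_var = y_all.var()
print(f"prediction variance: {total_var:.3f}")

# Crude variance-ratio indicator: variance explained by binning one input.
for j in range(dim):
    bins = np.digitize(x_all[:, j], np.linspace(0, 1, 11)[1:-1])
    cond_means = np.array([y_all[bins == b].mean() for b in range(10)])
    counts = np.array([(bins == b).sum() for b in range(10)])
    explained = np.average((cond_means - y_all.mean()) ** 2, weights=counts)
    print(f"input {j}: variance ratio ~ {explained / total_var:.2f}")
```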

  17. Communicating scientific uncertainty.

    PubMed

    Fischhoff, Baruch; Davis, Alex L

    2014-09-16

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  18. A Generalized Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Chen, Zhengli; Liang, Lili; Li, Haojing; Wang, Wenhua

    2015-08-01

    By using a generalization of the Wigner-Yanase-Dyson skew information, a quantity is introduced in this paper for every Hilbert-Schmidt operator A on a Hilbert space H, and a related uncertainty relation is established. The obtained inequality generalizes a known uncertainty relation. Moreover, a negative answer to a conjecture posed in Dou and Du (Int. J. Theor. Phys. 53, 952-958, 2014) is given by a counterexample.

  19. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements varying over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we also propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainties. Robust optimization was first introduced in the operations research literature; it is a framework that incorporates information about the uncertainty sets for the parameters in the optimization model. Even though robust optimization originated from tackling the uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving some examples of how the robust optimization framework can be applied to common network planning problems under uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to other network planning problems.
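
    As a flavor of the robust-optimization framework advocated here, the toy sketch below sizes two links at minimum cost against a box uncertainty set on demand; for simple lower-bound constraints the robust counterpart collapses to protecting against the worst-case demand. All names and numbers are invented.

```python
# Toy robust counterpart: choose capacities x for two links at minimum
# cost, subject to serving any demand vector d in a box uncertainty set.
# For constraints of the form x_i >= d_i, the robust counterpart reduces
# to x_i >= max d_i over the uncertainty set.
from scipy.optimize import linprog

cost = [3.0, 5.0]                        # per-unit capacity cost per link
demand_box = [(8.0, 12.0), (2.0, 6.0)]   # (min, max) demand per link

worst_case = [hi for (_, hi) in demand_box]
res = linprog(c=cost, bounds=[(wc, None) for wc in worst_case])
print("robust capacities:", res.x, " cost:", res.fun)
```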

  20. Visualization of Uncertainty

    NASA Astrophysics Data System (ADS)

    Jones, P. W.; Strelitz, R. A.

    2012-12-01

    The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by that density field. The problem thus devolves into a minimization. Computing such a spatial decomposition is O(N*N), and it can be done iteratively, making updates over time both easy and fast. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartograms familiar to the information visualization community from depictions of quantities such as voting results per state. Furthermore, one can dispense with the mesh or edges entirely, replacing them with symbols or glyphs at the generating points (effectively the centers of the polygons). This methodology readily admits rigorous statistical analysis using standard components found in R, and is thus entirely compatible with the visualization packages we use (VisIt and/or ParaView), the language we use (Python), and the UVCDAT environment that provides the programmer and analyst workbench. We will demonstrate the power and effectiveness of this methodology in climate studies. We will further argue that our method of defining (or predicting) values in a region has many advantages over the traditional visualization notion of value at a point.

  1. Uncertainties in risk tolerability

    SciTech Connect

    Cassidy, K.

    1995-12-31

    The management of risk is now recognized as central to the effective and efficient operation of industry and commerce and is widely practiced. Risk Management has economic, political and human dimensions, which in all cases involve pivotal judgments relating to the acceptability or (as appropriate) tolerability of the criteria which underpin the executive decisions and actions in the risk management process. How robust are the techniques used to arrive at such judgments? And how can existing variations in tolerability criteria be explained or justified? The developing methodologies contain many uncertainties (for example, selection of failure cases from a range of possibilities; failure possibilities in each case; scale of modeling and consequence uncertainties; model validation; parameter values of the models used; uncertainties in enhancing and mitigating factors). How far do these uncertainties affect the validity of risk management decisions? And how sensitive are these decisions to aspects of uncertainty? How far do the influences affecting public perception of the type, nature and magnitude of any risks affect the nature of risk management? (For example, issues such as voluntary vs involuntary exposure; natural vs man-made risks, perceptions of personal control, familiarity, perceptions of benefit or disbenefit, the nature of the hazard, the nature of the threat, the special vulnerability of sensitive groups, public perceptions of comparators, reversibility of effects, all may be felt to influence significantly the decision making process.) Expression and communication of risk (particularly methods of calculating and expressing societal risk) may compound any problems.

  2. Measurement uncertainty relations

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-01

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  3. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  4. Serenity in political uncertainty.

    PubMed

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being are resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  5. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  6. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances, one of which is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given, and an optimal lower bound for the weighted sum of the variances is obtained in the general quantum situation. PMID:26984295

  7. The value of uncertainty.

    PubMed

    Feldman, Michael

    2013-01-01

    The author discusses some of the characteristics of Roy Schafer's contributions to psychoanalysis that he finds most valuable, such as his openness to uncertainty, his anti-reductive view of analytic constructions, his unique formulation of the analyst's role, and his close attention to how the patient engenders particular emotional reactions in the analyst. The author also presents a clinical vignette illustrating the value of the analyst's tolerance of uncertainty in the face of the patient's push for interpretations, explanations, and reassurance. PMID:23457099

  8. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement for many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of the state and covariance as well as all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter*, with the former able to maintain a proper characterization of the uncertainty for up to *ten times as long* as the latter. The filter correction step also furnishes a statistically rigorous *prediction error* which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.

  9. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  10. Measurement Uncertainty Estimation in Amperometric Sensors: A Tutorial Review

    PubMed Central

    Helm, Irja; Jalukse, Lauri; Leito, Ivo

    2010-01-01

    This tutorial focuses on measurement uncertainty estimation in amperometric sensors (both for liquid and gas-phase measurements). The main uncertainty sources are reviewed and their contributions are discussed in relation to the principles of operation of the sensors, measurement conditions and properties of the measured samples. The discussion is illustrated by case studies based on the two major approaches for uncertainty evaluation: the ISO GUM modeling approach and the Nordtest approach. This tutorial is expected to be of interest to workers in different fields of science who use measurements with amperometric sensors and need to evaluate the uncertainty of the obtained results but are new to the concept of measurement uncertainty. The tutorial is also expected to help make measurement results more accurate. PMID:22399887

  11. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore, the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improving radiation safety for space missions.

  12. Quasar uncertainty study

    SciTech Connect

    Khatib-Rahbar, M.; Park, C.; Davis, R.; Nourbakhsh, H.; Lee, M.; Cazzoli, E.; Schmidt, E.

    1986-10-01

    Over the last decade, substantial development and progress have been made in the understanding of the nature of severe accidents and associated fission product release and transport. As part of this continuing effort, the United States Nuclear Regulatory Commission (USNRC) sponsored the development of the Source Term Code Package (STCP), which models core degradation, fission product release from the damaged fuel, and the subsequent migration of the fission products from the primary system to the containment and finally to the environment. The objectives of the QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors) program are: (1) to address the uncertainties associated with input parameters and phenomenological models used in the STCP; and (2) to define reasonable and technically defensible parameter ranges and modelling assumptions for use in the STCP. The uncertainties in the radiological releases to the environment can be defined as the degree of current knowledge associated with the magnitude, the timing, duration, and other pertinent characteristics of the release following a severe nuclear reactor accident. These uncertainties can be quantified by probability density functions (PDFs), using the Source Term Code Package as the physical model. An attempt will also be made to address the phenomenological issues not adequately modeled by the STCP, using more advanced, mechanistic models.

  13. Reciprocity and uncertainty.

    PubMed

    Bereby-Meyer, Yoella

    2012-02-01

    Guala points to a discrepancy between strong negative reciprocity observed in the lab and the way cooperation is sustained "in the wild." This commentary suggests that in lab experiments, strong negative reciprocity is limited when uncertainty exists regarding the players' actions and intentions. Thus, costly punishment is indeed a limited mechanism for sustaining cooperation in an uncertain environment. PMID:22289307

  14. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given the uncertainties in dating, calibration, and modeling.

  15. Hybrid uncertainty theory

    SciTech Connect

    Oblow, E.M.

    1985-05-13

    A hybrid uncertainty theory for artificial intelligence problems combining the strengths of fuzzy-set theory and Dempster/Shafer theory is presented. The basic operations for combining uncertain information are given with an indication of their applicability in expert systems and robot planning problems.

  16. Clearinghouse: hope and uncertainty.

    PubMed

    1997-06-01

    The field of post-traumatic stress syndrome, as it relates to disease survival and HIV/AIDS, is the subject of books, papers, and research. This reference section lists material related to patient outlook and despair, living with uncertainty, loss and grief, and survival mechanisms. Research contacts in Fremont, CA and New York City are listed. PMID:11364543

  17. The equivalence principle in a quantum world

    NASA Astrophysics Data System (ADS)

    Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre

    2015-09-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).

  18. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  19. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy. PMID:10174798

  20. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for phase uncertainty, whose magnitude is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.

  1. Uncertainty quantification for Markov chain models.

    PubMed

    Meidani, Hadi; Ghanem, Roger

    2012-12-01

    Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated. PMID:23278037
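
    A minimal sketch of the workflow: place a probability measure on the rows of the transition matrix and push the ensemble through a quantity of interest. Dirichlet-distributed rows below are a stand-in for the paper's maximum-entropy measure, and the three-state chain is a toy version of the disease-spread example.

```python
# Uncertainty quantification for a Markov chain: sample random transition
# matrices (Dirichlet rows as a stand-in for the maximum-entropy measure)
# and examine the induced spread in a quantity of interest.
import numpy as np

rng = np.random.default_rng(7)
n_draws, n_steps = 2000, 30

# Nominal 3-state chain (S, I, R); the concentration encodes confidence.
nominal = np.array([[0.90, 0.10, 0.00],
                    [0.00, 0.70, 0.30],
                    [0.00, 0.00, 1.00]])
concentration = 50.0

qoi = []  # probability of being Recovered after n_steps, starting from S
for _ in range(n_draws):
    P = np.vstack([
        rng.dirichlet(concentration * row + 1e-3) for row in nominal
    ])
    dist = np.array([1.0, 0.0, 0.0])
    for _ in range(n_steps):
        dist = dist @ P
    qoi.append(dist[2])

lo, hi = np.percentile(qoi, [2.5, 97.5])
print(f"P(recovered after {n_steps} steps): "
      f"median {np.median(qoi):.3f}, 95% band [{lo:.3f}, {hi:.3f}]")
```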

  2. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    SciTech Connect

    Amoroso, Richard L.

    2010-12-22

    For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string-theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, M_{4±}C_4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF-pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror-symmetric spacetime backcloth.

  3. Intuitions, principles and consequences.

    PubMed

    Shaw, A B

    2001-02-01

    Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences. PMID:11233371

  4. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

  5. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  6. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is NTE, MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE but Actual MGA (e.g. nominal), then MGA values are reduced by A5 values and A5 is representative of the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic Mass + MGA Mass. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
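
    A small worked example of this bookkeeping follows; all masses and MGA fractions are hypothetical.

```python
# Worked example of the mass bookkeeping described above; every number
# here is hypothetical. Basic mass carries no embedded margin; MGA is
# applied per item from an approved depletion schedule.
items = {
    # name: (basic_mass_kg, mga_fraction from the approved schedule)
    "structure": (120.0, 0.12),
    "avionics":  (35.0,  0.20),
    "harness":   (18.0,  0.30),
}

basic_total = sum(b for b, _ in items.values())
predicted_total = sum(b * (1.0 + f) for b, f in items.values())
aggregate_mga_pct = (predicted_total - basic_total) / basic_total * 100

for name, (b, f) in items.items():
    print(f"{name:10s} basic {b:6.1f} kg  predicted {b * (1 + f):6.1f} kg")
print(f"aggregate MGA = {aggregate_mga_pct:.1f}%")
```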

  7. Multiresolutional models of uncertainty generation and reduction

    NASA Technical Reports Server (NTRS)

    Meystel, A.

    1989-01-01

    Kolmogorov's axiomatic principles of probability theory are reconsidered in the scope of their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified in order to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies algorithms of learning which are organized in three groups reflecting the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by and consistent with the multiresolutional model of knowledge representation which is reflected in the structure of models and the algorithms of learning.

  8. The 4P Approach to Dealing with Scientific Uncertainty.

    ERIC Educational Resources Information Center

    Costanza, Robert; Cornwell, Laura

    1992-01-01

    Suggests a new approach to environmental protection that requires users of environmental resources to post a bond adequate to cover uncertain future environmental damages. Summarized as the "precautionary polluter pays principle," or the 4P approach, it shifts the burden of proof and the cost of uncertainty from the public to the resource user.…

  9. Separability conditions from the Landau-Pollak uncertainty relation

    SciTech Connect

    Vicente, Julio I. de; Sanchez-Ruiz, Jorge

    2005-05-15

    We obtain a collection of necessary (sufficient) conditions for a bipartite system of qubits to be separable (entangled), which are based on the Landau-Pollak formulation of the uncertainty principle. These conditions are tested and compared with previously stated criteria by applying them to states whose separability limits are already known. Our results are also extended to multipartite and higher-dimensional systems.
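
    For reference, the Landau-Pollak relation on which these separability conditions are built is usually stated as follows (the notation is generic and may differ from the paper's).

```latex
% Landau-Pollak relation: P_A, P_B are the maximal outcome probabilities
% of observables A, B in state |psi>, and c = max_{i,j} |<a_i|b_j>| is
% the maximal overlap between their eigenbases.
\arccos\sqrt{P_A} + \arccos\sqrt{P_B} \;\ge\; \arccos c
```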

  10. Principles of Modern Soccer.

    ERIC Educational Resources Information Center

    Beim, George

    This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…

  12. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  13. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2 °C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  14. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs Sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
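
    BHPMF layers the taxonomic hierarchy and Gibbs sampling on top of probabilistic matrix factorization; the sketch below shows only the plain PMF core, gradient ascent on the log-posterior of the latent factors for a sparse matrix, with all dimensions and rates invented.

```python
# Plain probabilistic matrix factorization for a sparse trait matrix:
# observed entries ~ N(u_i . v_j, sigma^2), Gaussian priors on factors.
# BHPMF additionally ties the u's through the taxonomic hierarchy and
# samples them with MCMC; none of that is reproduced here.
import numpy as np

rng = np.random.default_rng(3)
n_species, n_traits, rank = 50, 8, 4

# Synthetic ground truth and a 60%-missing observation mask.
U_true = rng.normal(size=(n_species, rank))
V_true = rng.normal(size=(n_traits, rank))
X = U_true @ V_true.T + 0.1 * rng.normal(size=(n_species, n_traits))
mask = rng.random(X.shape) < 0.4        # True where observed

U = 0.1 * rng.normal(size=(n_species, rank))
V = 0.1 * rng.normal(size=(n_traits, rank))
lam, lr = 0.05, 0.02                    # prior precision, learning rate

for epoch in range(300):
    R = np.where(mask, X - U @ V.T, 0.0)   # residuals on observed cells
    U += lr * (R @ V - lam * U)            # gradient step on log-posterior
    V += lr * (R.T @ U - lam * V)

rmse = np.sqrt(np.mean((U @ V.T - X)[~mask] ** 2))
print(f"held-out RMSE: {rmse:.3f}")
```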

  15. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
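
    As a minimal illustration of the shift from deterministic least squares to calibration under uncertainty, the sketch below samples a posterior over a calibration parameter with a random-walk Metropolis sampler; the model, prior, and noise level are all invented for the example and stand in for the richer formulations the report surveys:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def model(theta, x):
        """Hypothetical computer model with one calibration parameter."""
        return np.exp(-theta * x)

    x = np.linspace(0, 2, 15)
    y_obs = model(0.8, x) + rng.normal(scale=0.05, size=x.size)  # synthetic data
    sigma = 0.05  # assumed combined model + experimental error

    def log_post(theta):
        if not 0 < theta < 5:          # flat prior on (0, 5)
            return -np.inf
        r = y_obs - model(theta, x)
        return -0.5 * np.sum((r / sigma) ** 2)

    # Random-walk Metropolis: a posterior distribution instead of one best fit.
    theta, samples = 1.0, []
    for _ in range(20000):
        prop = theta + rng.normal(scale=0.05)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)

    post = np.array(samples[5000:])    # discard burn-in
    print(f"theta = {post.mean():.3f} +/- {post.std():.3f}")
    ```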

  16. Equivalence of wave-particle duality to entropic uncertainty.

    PubMed

    Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie

    2014-01-01

    Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter. PMID:25524138

  17. Nonequivalence of equivalence principles

    NASA Astrophysics Data System (ADS)

    Di Casola, Eolo; Liberati, Stefano; Sonego, Sebastiano

    2015-01-01

    Equivalence principles played a central role in the development of general relativity. Furthermore, they have provided operative procedures for testing the validity of general relativity, or constraining competing theories of gravitation. This has led to a flourishing of different, and inequivalent, formulations of these principles, with the undesired consequence that often the same name, "equivalence principle," is associated with statements having a quite different physical meaning. In this paper, we provide a precise formulation of the several incarnations of the equivalence principle, clarifying their uses and reciprocal relations. We also discuss their possible role as selecting principles in the design and classification of viable theories of gravitation.

  18. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  19. Using Models that Incorporate Uncertainty

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.

    2002-01-01

    In this article, the author discusses the use in policy analysis of models that incorporate uncertainty. He believes that all models should consider incorporating uncertainty, but that at the same time it is important to understand that sampling variability is not usually the dominant driver of uncertainty in policy analyses. He also argues that…

  20. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-Shui

    2015-06-01

    The Heisenberg uncertainty principle shows that no one can specify the values of the non-commuting canonically conjugated variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail.
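
    For the simplest special case (two measurements, no quantum memory), the bound being tightened here is the Maassen-Uffink entropic uncertainty relation, which is easy to check numerically; the qubit state and measurement bases below are arbitrary choices for illustration:

    ```python
    import numpy as np

    def shannon(p):
        """Shannon entropy in bits, ignoring zero entries."""
        p = p[p > 1e-12]
        return -np.sum(p * np.log2(p))

    # Qubit state and two incompatible measurements: Z and X bases.
    psi = np.array([np.cos(0.3), np.sin(0.3)], dtype=complex)
    Z = np.eye(2, dtype=complex)                       # eigenvectors as columns
    X = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    pZ = np.abs(Z.conj().T @ psi) ** 2                 # outcome probabilities
    pX = np.abs(X.conj().T @ psi) ** 2

    # Maassen-Uffink: H(Z) + H(X) >= -log2 max_{i,j} |<z_i|x_j>|^2
    c = np.max(np.abs(Z.conj().T @ X) ** 2)
    print(shannon(pZ) + shannon(pX), ">=", -np.log2(c))
    ```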

  1. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    PubMed Central

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of the non-commuting canonically conjugated variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail. PMID:26118488

  3. Position-momentum uncertainty relations based on moments of arbitrary order

    SciTech Connect

    Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.

    2011-05-15

    The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.

  4. Schwarzschild mass uncertainty

    NASA Astrophysics Data System (ADS)

    Davidson, Aharon; Yellin, Ben

    2014-02-01

    Applying Dirac's procedure to r-dependent constrained systems, we derive a reduced total Hamiltonian, resembling an upside down harmonic oscillator, which generates the Schwarzschild solution in the mini super-spacetime. Associated with the now r-dependent Schrödinger equation is a tower of localized Guth-Pi-Barton wave packets, orthonormal and non-singular, admitting equally spaced average-'energy' levels. Our approach is characterized by a universal quantum mechanical uncertainty structure which enters the game already at the flat spacetime level, and accompanies the massive Schwarzschild sector for any arbitrary mean mass. The average black hole horizon surface area is linearly quantized.

  5. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary for multimodal distributions. The uncertainty of the multi-valued data can also be interpreted by the number of peaks and the widths of the peaks as shown by the PDF walls.
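
    A minimal sketch of the PDF-wall construction for a single transect, using a Gaussian kernel density estimate at each grid cell; the data, evaluation grid, and mode-counting rule are invented for the example:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(2)

    # Toy multi-valued map: 10 grid cells along one row, each holding
    # 200 possible outcomes (e.g., ensemble members) for the variable.
    row = [rng.normal(loc=20 + c, scale=1 + 0.3 * c, size=200) for c in range(10)]

    grid = np.linspace(15, 35, 120)
    wall = np.array([gaussian_kde(cell)(grid) for cell in row])  # one PDF per cell

    # 'Roughness' as defined in the abstract: number of peaks (modes) per PDF.
    peaks = [int((np.diff(np.sign(np.diff(pdf))) < 0).sum()) for pdf in wall]
    print("modes per cell along the transect:", peaks)
    ```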

  6. The precautionary principle and ecological hazards of genetically modified organisms.

    PubMed

    Giampietro, Mario

    2002-09-01

    This paper makes three points relevant to the application of the precautionary principle to the regulation of GMOs. i) The unavoidable arbitrariness in the application of the precautionary principle reflects a deeper epistemological problem affecting scientific analyses of sustainability. This requires understanding the difference between the concepts of "risk", "uncertainty" and "ignorance". ii) When dealing with evolutionary processes it is impossible to ban uncertainty and ignorance from scientific models. Hence, traditional risk analysis (probability distributions and exact numerical models) becomes powerless. Other forms of scientific knowledge (general principles or metaphors) may be useful alternatives. iii) The existence of ecological hazards per se should not be used as a reason to stop innovations altogether. However, the precautionary principle entails that scientists move away from the concept of "substantive rationality" (trying to indicate to society optimal solutions) to that of "procedural rationality" (trying to help society to find "satisficing" solutions). PMID:12436844

  7. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  8. Improvement of Statistical Decisions under Parametric Uncertainty

    NASA Astrophysics Data System (ADS)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  9. Uncertainty relations as Hilbert space geometry

    NASA Technical Reports Server (NTRS)

    Braunstein, Samuel L.

    1994-01-01

    Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position, and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable as well as merely what is allowed.

  10. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  11. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).

  12. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.

  13. Physical principles of hearing

    NASA Astrophysics Data System (ADS)

    Martin, Pascal

    2015-10-01

    The following sections are included: * Psychophysical properties of hearing * The cochlear amplifier * Mechanosensory hair cells * The "critical" oscillator as a general principle of auditory detection * Bibliography

  14. The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.

    PubMed

    Antonopoulou, Lila; van Meurs, Philip

    2003-11-01

    The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as it is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof of this principle has been produced based on the "cause-effect" model. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges: the case of "mad cow" disease, the case of the production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health. PMID:14585517

  15. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses reliability issues in loss assessment following strong earthquakes, with worldwide systems applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for taking decisions about search and rescue operations and offering humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the disaster scope. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence on the reliability of expected loss estimations at regional and global scale of uncertainties in strong-event parameter determination by alert seismological surveys; in the simulation models used at all stages, from estimating shaking intensity to assessing the damage to different elements at risk; and in the databases on different elements at risk, such as population and building stock distribution, as well as critical facilities characteristics.

  16. Interference of macroscopic beams on a beam splitter: phase uncertainty converted into photon-number uncertainty

    NASA Astrophysics Data System (ADS)

    Spasibko, K. Yu; Töppel, F.; Iskhakov, T. Sh; Stobińska, M.; Chekhova, M. V.; Leuchs, G.

    2014-01-01

    Squeezed-vacuum twin beams, commonly generated through parametric down-conversion, are known to have perfect photon-number correlations. According to the Heisenberg principle, this is accompanied by a huge uncertainty in their relative phase. By overlapping bright twin beams on a beam splitter, we convert phase fluctuations into photon-number fluctuations and observe this uncertainty as a typical ‘U-shape’ of the output photon-number distribution. This effect, although reported for atomic ensembles and giving hope for phase super-resolution, has never been observed for light beams. The shape of the normalized photon-number difference distribution is similar to the one that would be observed for high-order Fock states. It can be also mimicked by classical beams with artificially mixed phase, but without any perspective for phase super-resolution. The probability distribution at the beam splitter output can be used for filtering macroscopic superpositions at the input.
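
    The U-shape has a simple classical-field caricature: two equally bright beams with a uniformly random relative phase interfere on a 50:50 beam splitter, so the normalized photon-number difference follows cos(phase) and piles up near the extremes (an arcsine law). The toy simulation below reproduces only that shape; the photon numbers are assumed, and none of the quantum twin-beam correlations are modeled:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    N = 10**6                                  # mean photon number per input beam
    phase = rng.uniform(0, 2 * np.pi, 100000)  # uniformly random relative phase

    # 50:50 beam splitter: mean output intensities N*(1 +/- cos(phase)),
    # with Poissonian shot noise added at each detector.
    n1 = rng.poisson(N * (1 + np.cos(phase)))
    n2 = rng.poisson(N * (1 - np.cos(phase)))

    d = (n1 - n2) / (n1 + n2)                  # normalized photon-number difference

    hist, _ = np.histogram(d, bins=20, range=(-1, 1), density=True)
    print(np.round(hist, 2))                   # largest values at both ends: the U-shape
    ```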

  17. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally through a suitable uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2(c). Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion on the black hole firewall paradox in the context of the entropic uncertainty relation is given.

  18. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  19. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1971-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite. GEOS-C will be tracked by a number of the conventional satellite tracking systems, as well as by two advanced systems; a satellite-to-satellite tracking system and lasers capable of decimeter accuracies which are being developed in connection with the Goddard Earth and Ocean Dynamics Applications program. The discussion is organized in terms of a specific type of GEOS-C orbit which would satisfy a number of scientific objectives including the study of the gravitational field by means of both the altimeter and the satellite-to-satellite tracking system, studies of tides, and the Gulf Stream meanders.

  20. Direct tests of measurement uncertainty relations: what it takes.

    PubMed

    Busch, Paul; Stevens, Neil

    2015-02-20

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables. PMID:25763941

  1. Principled Grammar Teaching

    ERIC Educational Resources Information Center

    Batstone, Rob; Ellis, Rod

    2009-01-01

    A key aspect of the acquisition of grammar for second language learners involves learning how to make appropriate connections between grammatical forms and the meanings which they typically signal. We argue that learning form/function mappings involves three interrelated principles. The first is the Given-to-New Principle, where existing world…

  2. The genetic difference principle.

    PubMed

    Farrelly, Colin

    2004-01-01

    In the newly emerging debates about genetics and justice three distinct principles have begun to emerge concerning what the distributive aim of genetic interventions should be. These principles are: genetic equality, a genetic decent minimum, and the genetic difference principle. In this paper, I examine the rationale of each of these principles and argue that genetic equality and a genetic decent minimum are ill-equipped to tackle what I call the currency problem and the problem of weight. The genetic difference principle is the most promising of the three principles and I develop this principle so that it takes seriously the concerns of just health care and distributive justice in general. Given the strains on public funds for other important social programmes, the costs of pursuing genetic interventions and the nature of genetic interventions, I conclude that a more lax interpretation of the genetic difference principle is appropriate. This interpretation stipulates that genetic inequalities should be arranged so that they are to the greatest reasonable benefit of the least advantaged. Such a proposal is consistent with prioritarianism and provides some practical guidance for non-ideal societies--that is, societies that do not have the endless amount of resources needed to satisfy every requirement of justice. PMID:15186680

  3. Principles of learning.

    PubMed

    Voith, V L

    1986-12-01

    This article discusses some general principles of learning as well as possible constraints and how such principles can apply to horses. A brief review is presented of experiments that were designed to assess learning in horses. The use of behavior modification techniques to treat behavior problems in horses is discussed and several examples of the use of these techniques are provided. PMID:3492241

  5. Hamilton's Principle for Beginners

    ERIC Educational Resources Information Center

    Brun, J. L.

    2007-01-01

    I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…

  6. The anthropic principle

    NASA Astrophysics Data System (ADS)

    Rosen, Joe

    1985-04-01

    The anthropic principle states that the fact of existence of intelligent beings may be a valid explanation of why the universe and laws of physics are as they are. The origin and some of the deeper implications of the principle are investigated. The discussion involves considerations of physics and metaphysics, unified schemes and holism, the nature of physical explanation, realism and idealism, and symmetry.

  7. Uncertainty As Knowledge: Harnessing Ambiguity and Uncertainty into Policy Constraints

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.

    2014-12-01

    There are numerous sources of uncertainty that impact policy decisions relating to climate change: There is scientific uncertainty, as for example encapsulated in estimates of climate sensitivity. There is policy uncertainty, which arises when mitigation efforts are erratic or are reversed (as recently happened in Australia). There is also technological uncertainty, which affects the mitigation pathway. How can policy decisions be informed in light of these multiple sources of uncertainty? We propose an "ordinal" approach that relies on comparisons such as "greater than" or "less than", which can help sidestep disagreement about specific parameter estimates (e.g., climate sensitivity). To illustrate, recent analyses (Lewandowsky et al., 2014, Climatic Change) have shown that the magnitude of uncertainty about future temperature increases is directly linked with the magnitude of future risk: the greater the uncertainty, the greater the risk of mitigation failure (defined as exceeding a carbon budget for a predetermined threshold). Here we extend this approach to other sources of uncertainty, with a particular focus on "ambiguity" or "second-order" uncertainty, which arises when there is dissent among experts.
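
    The cited ordinal result, that wider uncertainty implies a higher risk of exceeding a threshold whenever the central estimate lies below it, is easy to reproduce in a toy Monte Carlo; the lognormal form and every number below are assumptions made purely for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    threshold = 2.0                        # warming threshold (degrees C)

    # Hold the median warming fixed below the threshold and widen the spread:
    # the exceedance probability (risk of mitigation failure) grows monotonically.
    for sigma in (0.2, 0.4, 0.6):          # log-scale spreads (assumed values)
        warming = rng.lognormal(mean=np.log(1.8), sigma=sigma, size=10**6)
        p = np.mean(warming > threshold)
        print(f"sigma = {sigma}: P(exceed {threshold} C) = {p:.2f}")
    ```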

  8. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concepts of flow functions, the application of the Reynolds lubrication equation, non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.

  9. Principlism and communitarianism.

    PubMed

    Callahan, D

    2003-10-01

    The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The author finds the more serious problem to be its blocking function. Discussing the four scenarios, the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help. PMID:14519838

  10. Uncertainty in Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2006-12-01

    Uncertainty is a part of our life, and society has to deal with it, even though it is sometimes difficult to estimate. This is particularly true in seismic hazard assessment for large events, such as the mega-tsunami in Southeast Asia and the great New Madrid earthquakes in the central United States. There are two types of uncertainty in seismic hazard assessment: temporal and spatial. Temporal uncertainty describes distribution of the events in time and is estimated from the historical records, while spatial uncertainty describes distribution of physical measurements generated at a specific point by the events and is estimated from the measurements at the point. These uncertainties are of different characteristics and generally considered separately in hazard assessment. For example, temporal uncertainty (i.e., the probability of exceedance in a period) is considered separately from spatial uncertainty (a confidence level of physical measurement) in flood hazard assessment. Although estimating spatial uncertainty in seismic hazard assessment is difficult because there are not enough physical measurements (i.e., ground motions), it can be supplemented by numerical modeling. For example, the ground motion uncertainty or tsunami uncertainty at a point of interest has been estimated from numerical modeling. Estimating temporal uncertainty is particularly difficult, especially for large earthquakes, because there are not enough instrumental, historical, and geological records. Therefore, the temporal and spatial uncertainties in seismic hazard assessment are of different characteristics and should be determined separately. Probabilistic seismic hazard analysis (PSHA), the most widely used method to assess seismic hazard for various aspects of public and financial policy, uses spatial uncertainty (ground motion uncertainty) to extrapolate temporal uncertainty (ground motion occurrence), however. This extrapolation, or so-called ergodic assumption, is caused by a mathematical error in hazard calculation of PSHA: incorrectly equating the conditional exceedance probability of the ground-motion attenuation relationship (a function) to the exceedance probability of the ground-motion uncertainty (a variable). An alternative approach has been developed to correct the error and to determine temporal and spatial uncertainties separately.

  11. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. Here, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  12. A Robust Approach to Inventory Optimization Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Aswal, Abhilasha; Srinivasa Prasanna, G. N.

    2009-10-01

    In this paper we present an application of robust optimization to inventory optimization under uncertainty. We represent uncertainty in a constraint-based framework derived from basic economic principles. This approach offers the ability to use information-theoretic concepts to quantify the amount of information used in the optimization. The results are shown to correspond to classical models such as EOQ in simple cases. Moreover, the presented model easily incorporates more realistic constraints, which are complicated and not easily handled by the classical models.
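
    For the simple-case correspondence mentioned above, here is the classical EOQ together with one hedged way to handle interval uncertainty in demand; the interval and the minimax-ratio rule are illustrative stand-ins for the authors' constraint-based sets, not their method:

    ```python
    import math

    def eoq(demand, order_cost, holding_cost):
        """Classical economic order quantity: sqrt(2 * D * S / H)."""
        return math.sqrt(2 * demand * order_cost / holding_cost)

    # Demand known only to lie in [D_lo, D_hi] (assumed numbers). Choosing the
    # geometric mean of the endpoint EOQs minimizes the worst-case ratio of
    # realized cost to optimal cost, since that ratio depends only on |ln q - ln q*|.
    D_lo, D_hi, S, H = 800.0, 1200.0, 50.0, 2.0
    q_lo, q_hi = eoq(D_lo, S, H), eoq(D_hi, S, H)
    q_robust = math.sqrt(q_lo * q_hi)
    print(round(q_lo, 1), round(q_robust, 1), round(q_hi, 1))
    ```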

  13. Participatory Development Principles and Practice: Reflections of a Western Development Worker.

    ERIC Educational Resources Information Center

    Keough, Noel

    1998-01-01

    Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

  14. Uncertainty and Anticipation in Anxiety

    PubMed Central

    Grupe, Dan W.; Nitschke, Jack B.

    2014-01-01

    Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact, and thus results in anxiety. Here, we focus the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations to the neural instantiation of these processes result in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199

  15. Evaluating uncertainty in simulation models

    SciTech Connect

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
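
    The variance-based importance measures mentioned here can be estimated with a standard pick-freeze Monte Carlo scheme; the toy model and sample sizes below are invented for the sketch:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def model(x):
        """Toy simulation: y = x0 + 2*x1 + x0*x2, inputs independent N(0, 1)."""
        return x[:, 0] + 2 * x[:, 1] + x[:, 0] * x[:, 2]

    n, d = 200000, 3
    A, B = rng.normal(size=(n, d)), rng.normal(size=(n, d))
    yA, yB = model(A), model(B)

    # First-order Sobol index S_i: the share of output variance explained by
    # input i alone (pick-freeze estimator).
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]            # freeze coordinate i at A's value
        S_i = np.mean(yA * (model(ABi) - yB)) / yA.var()
        print(f"S_{i} = {S_i:.2f}")    # roughly 1/6, 4/6, 0 for this model
    ```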

  16. Uncertainty analysis of thermoreflectance measurements

    NASA Astrophysics Data System (ADS)

    Yang, Jia; Ziade, Elbara; Schmidt, Aaron J.

    2016-01-01

    We derive a generally applicable formula to calculate the precision of multi-parameter measurements that apply least squares algorithms. This formula, which accounts for experimental noise and uncertainty in the controlled model parameters, is then used to analyze the uncertainty of thermal property measurements with pump-probe thermoreflectance techniques. We compare the uncertainty of time domain thermoreflectance and frequency domain thermoreflectance (FDTR) when measuring bulk materials and thin films, considering simultaneous measurements of various combinations of thermal properties, including thermal conductivity, heat capacity, and thermal boundary conductance. We validate the uncertainty analysis using Monte Carlo simulations on data from FDTR measurements of an 80 nm gold film on fused silica.
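
    A minimal version of the Monte Carlo check described here: refit many synthetic noisy realizations with least squares and read the parameter uncertainty off the scatter of the fits. The two-parameter decay model and noise level are stand-ins for the actual thermal model:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(6)

    def model(t, k, C):
        """Stand-in two-parameter model; a real analysis would use the thermal model."""
        return np.exp(-k * t) / C

    t = np.linspace(0.1, 5, 40)
    y_true = model(t, 1.2, 2.0)
    noise = 0.002                          # assumed measurement noise level

    # Monte Carlo validation of least-squares precision.
    fits = []
    for _ in range(500):
        y = y_true + rng.normal(scale=noise, size=t.size)
        popt, _ = curve_fit(model, t, y, p0=[1.0, 1.5])
        fits.append(popt)

    fits = np.array(fits)
    print("k =", fits[:, 0].mean().round(4), "+/-", fits[:, 0].std().round(4))
    print("C =", fits[:, 1].mean().round(4), "+/-", fits[:, 1].std().round(4))
    ```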

  17. Managing Uncertainty in Data and Models: UncertWeb

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Cornford, D.; Pebesma, E. J.

    2010-12-01

    There is an increasing recognition that issues of quality, error and uncertainty are central concepts to both scientific progress and practical decision making. Recent moves towards evidence-driven policy and complex, uncertain scientific investigations into climate change and its likely impacts have heightened the awareness that uncertainty is critical in linking our observations and models to reality. The most natural, principled framework is provided by Bayesian approaches, which recognise a variety of sources of uncertainty such as aleatory (variability), epistemic (lack of knowledge) and possibly ontological (lack of agreed definitions). Most current information models used in the geosciences do not fully support the communication of uncertain results, although some do provide limited support for quality information in metadata. With the UncertWeb project (http://www.uncertweb.org), involving statisticians, geospatial and application scientists and informaticians, we are developing a framework for representing and communicating uncertainty in observational data and models which builds on existing standards such as the Observations and Measurements conceptual model, and related Open Geospatial Consortium and ISO standards, to allow the communication and propagation of uncertainty in chains of model services. A key component is the description of uncertainties in observational data, based on a revised version of UncertML, a conceptual model and encoding for representing uncertain quantities. In this talk we will describe how we envisage using UncertML with existing standards to describe the uncertainty in observational data and how this uncertainty information can then be propagated through subsequent analysis. We will highlight some of the tools which we are developing within UncertWeb to support the management of uncertainty in web based geoscientific applications.

  18. Group environmental preference aggregation: the principle of environmental justice

    SciTech Connect

    Davos, C.A.

    1986-01-01

    The aggregation of group environmental preference presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as environmental justice, is established based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be so decided that their supporters will least mind being anyone at random in the new environment. The application of the principle is also discussed. Its only information requirement is a ranking of alternative choices by each interested party. 25 references.

  19. Uncertainty analysis in RECCAP

    NASA Astrophysics Data System (ADS)

    Enting, I. G.

    2010-12-01

    The Global Carbon Project RECCAP exercise aims to produce regional analyses of net carbon fluxes between the atmosphere and the land and ocean carbon systems. The project aims to synthesise multiple sources of information from modelling, inversions and inventory studies. A careful analysis of uncertainty is essential, both for the final synthesis and for assuring consistency in the process of combining disparate inputs. A unifying approach is to treat the overall analysis as a process of statistical estimation. The broadest-scale grouping of approaches is 'top-down' vs. 'bottom-up' techniques, but each of these needs to be further partitioned. Top-down approaches generally take the form of inversions, using measurements of carbon dioxide concentrations to either deduce surface fluxes or deduce parameters in spatially-explicit process-based models. These two types of inversion will have somewhat different statistical characteristics, but each will achieve only limited spatial resolution due to the ill-conditioned nature of the inversion. Bottom-up techniques aim to resolve great spatial detail. They comprise both census-type studies (mainly for anthropogenic emissions) and modelling studies with remotely-sensed data to provide spatially and temporally explicit forcing or constraints. Again, these two types of approach are likely to have quite different statistical characteristics. An important issue in combining information is consistency between definitions used for the disparate components. Cases where there is significant potential for ambiguity include wildfire and delayed responses to land-use change. A particular concern is the potential for 'double counting' when combining bottom-up estimates with the results of inversion techniques that have incorporated Bayesian constraints using the same data as is used in the bottom-up estimates. The communication of the distribution of uncertainty in one time and two space dimensions poses particular challenges. Temporal variability can be usefully characterised in terms of long-term trends, seasonal cycles and irregular variability. Additional choices need to be made concerning the frequency ranges that define each of these components. Spatial resolution remains problematic, with the diffuse boundaries of top-down approaches failing to match the sharp boundaries from bottom-up techniques.

  20. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, wave patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  1. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  2. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  3. Principles of Technology Spinoffs.

    ERIC Educational Resources Information Center

    Hammer, Douglas E.; Thode, Brad

    1989-01-01

    The authors discuss "Principles of Technology," a standard introduction to technology in many secondary schools. They suggest the possibility of introducing teacher-developed spinoff activities into the curriculum and provide several examples. (CH)

  4. Global ethics and principlism.

    PubMed

    Gordon, John-Stewart

    2011-09-01

    This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together. PMID:22073817

  5. Multi-band pyrometer uncertainty analysis and improvement

    NASA Astrophysics Data System (ADS)

    Yang, Yongjun; Zhang, Xuecong; Cai, Jing; Wang, Zhongyu

    2010-12-01

    From the ratio of the energies radiated by the measured surface in multiple spectral bands, the 'true' temperature can be calculated by a multi-band pyrometer. A multi-band pyrometer has many advantages: it is hardly affected by the emissivity of the measured surface or by environmental radiation, and it offers a higher signal-to-noise ratio and higher temperature measurement accuracy. This paper introduces the principle of a multi-band pyrometer, and the uncertainty of the measurement result is evaluated using the Monte Carlo Method (MCM). The result shows that the accuracy of the effective wavelength is the largest source of uncertainty, the other main source being the reference temperature. When an ordinary blackbody furnace with continuously adjustable temperature is used to provide the reference temperature and calibrate the effective wavelength, the respective uncertainty components are 2.17 K and 2.48 K, and the combined standard uncertainty is 3.30 K. A new calibration method is introduced: the effective wavelength is calibrated by a monochromator, and the reference temperature is provided by a fixed-point blackbody furnace. The uncertainty components are then reduced to 0.73 K and 0.12 K, respectively, and the measurement uncertainty decreases to 0.74 K, enhancing the temperature measurement accuracy.
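
    The underlying principle for the two-band (ratio) case, and an MCM-style propagation in the spirit of the paper, can be sketched as follows; the wavelengths, temperature, and all uncertainty magnitudes are assumed for illustration and are not the paper's values:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    C2 = 1.4388e-2                                 # second radiation constant (m K)

    def ratio_temperature(R, lam1, lam2):
        """Two-band ratio pyrometry under the Wien approximation, gray surface."""
        return C2 * (1 / lam2 - 1 / lam1) / (np.log(R) - 5 * np.log(lam2 / lam1))

    lam1, lam2, T_true = 0.65e-6, 0.90e-6, 1500.0  # wavelengths (m), temperature (K)
    R_true = (lam2 / lam1) ** 5 * np.exp(C2 / T_true * (1 / lam2 - 1 / lam1))

    # Monte Carlo propagation: perturb the signal ratio and the two effective
    # wavelengths, then read the temperature uncertainty off the spread.
    n = 100000
    R = R_true * (1 + rng.normal(scale=0.002, size=n))   # ratio measurement noise
    l1 = lam1 + rng.normal(scale=0.5e-9, size=n)         # wavelength calibration
    l2 = lam2 + rng.normal(scale=0.5e-9, size=n)
    T = ratio_temperature(R, l1, l2)
    print(f"T = {T.mean():.1f} K, u(T) = {T.std():.2f} K")
    ```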

  6. Uncertainties in the Astronomical Ephemeris as Constraints on New Physics

    NASA Astrophysics Data System (ADS)

    Warecki, Zoey; Overduin, J.

    2014-01-01

    Most extensions of the standard model of particle physics predict composition-dependent violations of the universality of free fall (equivalence principle). We test this idea using observational uncertainties in mass, range and mean motion for the Moon and planets, as well as orbit uncertainties for Trojan asteroids and Saturnian satellites. For suitable pairs of solar-system bodies, we derive linearly independent constraints on relative difference in gravitational and inertial mass from modifications to Kepler's third law, the migration of stable Lagrange points, and orbital polarization (the Nordtvedt effect). These constraints can be combined with data on bulk composition to extract limits on violations of the equivalence principle for individual elements relative to one another. These limits are weaker than those from laboratory experiments, but span a much larger volume in composition space.

  7. New intelligent power quality analyzer and dynamic uncertainty research

    NASA Astrophysics Data System (ADS)

    Feng, Xu-gang; Zhang, Jia-yan; Fei, Ye-tai

    2010-08-01

    This paper presents a novel intelligent power quality analyzer that analyzes the collected dynamic data using the modern uncertainty principle. The analyzer consists of components for data acquisition, communication, display, storage, and so on, and has several advantages, including strong computing ability, good on-line performance, large storage capacity, high precision, and a user-friendly interface. In addition, the reliability of the measurement results is evaluated according to international standards; the uncertainty principles of international metrology are adopted for the evaluation of an electrical energy quality analyzer for the first time, offering evidence in support of an improved GB (national standard) code.

  8. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an 'initial uncertainty' number that can be used to estimate the 'back end' uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  9. Designing for Uncertainty: Three Approaches

    ERIC Educational Resources Information Center

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  10. Hydrology, society, change and uncertainty

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, Demetris

    2014-05-01

    Heraclitus, who taught that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial. Also, decision making under certainty is mostly trivial. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure, and credulous toward a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is compounded by a misconception in the scientific community that confuses science with uncertainty elimination. However, recognizing that uncertainty is inevitable and tightly connected with change will help to appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

  11. Housing Uncertainty and Childhood Impatience

    ERIC Educational Resources Information Center

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  12. Planning ATES systems under uncertainty

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions form a complex adaptive system, for which agent-based modelling provides a useful analysis framework. This study therefore explores the interactions between endogenous ATES adoption processes and the relative performance of different planning schemes, using an agent-based adoption model coupled with a hydrologic model of the subsurface. The models are parameterized to simulate typical operating conditions for ATES systems in a dense urban area. Furthermore, uncertainties relating to planning parameters, adoption processes, and climatic conditions are explicitly considered using exploratory modelling techniques. Results are therefore presented for the performance of different planning policies over a broad range of plausible scenarios.

  13. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.
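
    The "associated variable" described above behaves like a variance-stabilizing transform. A textbook transform with exactly this property is theta = arcsin(sqrt(p)), whose standard error is about 1/(2*sqrt(N)) regardless of p; the abstract does not name the transform it uses, so the sketch below is only an illustration of the idea, not Summhammer's construction.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 10_000        # runs feeding each probability estimate
      trials = 2_000    # repetitions used to measure the spread

      for p in (0.1, 0.5, 0.9):
          counts = rng.binomial(N, p, trials)
          theta = np.arcsin(np.sqrt(counts / N))  # associated variable
          # The spread depends only on N, not on p:
          print(f"p={p}: sd={theta.std():.5f}  vs  1/(2 sqrt(N))={0.5/np.sqrt(N):.5f}")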

  14. A variational principle in optics.

    PubMed

    Rubinstein, Jacob; Wolansky, Gershon

    2004-11-01

    We derive a new variational principle in optics. We first formulate the principle for paraxial waves and then generalize it to arbitrary waves. The new principle, unlike the Fermat principle, concerns both the phase and the intensity of the wave. In particular, the principle provides a method for finding the ray mapping between two surfaces in space from information on the wave's intensity there. We show how to apply the new principle to the problem of phase reconstruction from intensity measurements. PMID:15535374

  15. Estimating uncertainties in watershed studies

    NASA Astrophysics Data System (ADS)

    Campbell, John; Yanai, Ruth; Green, Mark

    2011-06-01

    Quantifying Uncertainty in Ecosystem Studies (QUEST) Workshop: Uncertainty in Hydrologic Fluxes of Elements at the Small Watershed Scale; Boston, Massachusetts, 14-15 March 2011; Small watersheds have been used widely to quantify chemical fluxes and cycling in terrestrial ecosystems for about the past half century. The small watershed approach has been valuable in characterizing hydrologic and nutrient budgets, for instance, in estimating the net gain or loss of solutes in response to disturbance. However, the uncertainty in these ecosystem budget calculations is generally ignored. Without uncertainty estimates in watershed studies, it is difficult to evaluate the significance of observed differences between watersheds or changes in budgets over time, and erroneous conclusions may be drawn. The historical lack of attention given to uncertainty has been due at least in part to the lack of appropriate analytical tools and approaches. The issue of uncertainty has been confronted more rigorously in other disciplines, yet the advances made have not been comprehensively applied to biogeochemical input-output budgets. In recent years, there has been growing recognition that estimates of uncertainty are essential for coming to sound scientific conclusions, identifying which budget components most need improvement, and developing more efficient monitoring strategies, thereby maximizing information gained per unit cost.

  16. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations in the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done in which the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
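
    Under the lognormal model described above, the normalization spread of replicate 24h results is summarized by the natural logarithm of their geometric standard deviation, which is simply the standard deviation of the log-results. A minimal sketch follows; the replicate values are made up, not Mayak data, and the counting-statistics component that the paper subtracts out is ignored here.

      import numpy as np

      def ln_gsd(replicates):
          """Natural log of the geometric standard deviation of replicates."""
          return np.log(np.asarray(replicates, dtype=float)).std(ddof=1)

      # Three replicate 24h urine excretion results (arbitrary activity units):
      print(ln_gsd([1.10, 0.85, 1.02]))  # compare with the reported 0.31-0.35 range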

  18. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  19. The traveltime holographic principle

    NASA Astrophysics Data System (ADS)

    Huang, Yunsong; Schuster, Gerard T.

    2015-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the 'traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
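
    In a homogeneous medium the triangle inequality gives τsp <= τsq + τqp, with equality when q lies on the ray from s to p, so the interior time can be recovered as τpq = max over boundary sources s of (τsp - τsq). The abstract does not spell out the formula, so this max-difference form is an assumption used only for the constant-velocity check below.

      import numpy as np

      v = 2.0                                        # constant velocity
      p = np.array([0.3, -0.2])                      # interior point p
      q = np.array([-0.4, 0.5])                      # interior point q

      # Sources s on a circular boundary B (radius 2) enclosing p and q:
      a = np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False)
      s = 2.0 * np.column_stack([np.cos(a), np.sin(a)])

      tau_sp = np.linalg.norm(s - p, axis=1) / v     # exterior traveltimes
      tau_sq = np.linalg.norm(s - q, axis=1) / v

      tau_interf = np.max(tau_sp - tau_sq)           # interferometric estimate
      tau_direct = np.linalg.norm(p - q) / v         # ground truth
      print(tau_interf, tau_direct)                  # agree to grid resolution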

  20. Applying the four principles.

    PubMed

    Macklin, R

    2003-10-01

    Gillon is correct that the four principles provide a sound and useful way of analysing moral dilemmas. As he observes, the approach using these principles does not provide a unique solution to dilemmas. This can be illustrated by alternatives to Gillon's own analysis of the four case scenarios. In the first scenario, a different set of factual assumptions could yield a different conclusion about what is required by the principle of beneficence. In the second scenario, although Gillon's conclusion is correct, what is open to question is his claim that what society regards as the child's best interest determines what really is in the child's best interest. The third scenario shows how it may be reasonable for the principle of beneficence to take precedence over autonomy in certain circumstances, yet like the first scenario, the ethical conclusion relies on a set of empirical assumptions and predictions of what is likely to occur. The fourth scenario illustrates how one can draw different conclusions based on the importance given to the precautionary principle. PMID:14519836

  1. Uncertainty of testing methods--what do we (want to) know?

    PubMed

    Paparella, Martin; Daneshian, Mardas; Hornek-Gausterer, Romana; Kinzl, Maximilian; Mauritz, Ilse; Mühlegger, Simone

    2013-01-01

    It is important to stimulate innovation for regulatory testing methods. Scrutinizing the knowledge of the (un)certainty of data from current standard in vivo methods could foster interest in new testing approaches. Since standard in vivo data often are used as reference data for model development, improved accounting for uncertainty would also support the validation of new in vitro and in silico methods, as well as the definition of acceptance criteria for the new methods. Hazard and risk estimates that are transparent about their uncertainty could further support the 3Rs, since they may help focus additional information requirements on the aspects of highest uncertainty. Here we provide an overview of the various types of uncertainties in quantitative and qualitative terms and suggest improving this knowledge base. We also reference principal concepts on how to use uncertainty information for improved hazard characterization and development of new testing methods. PMID:23665803

  2. Uncertainties in large space systems

    NASA Technical Reports Server (NTRS)

    Fuh, Jon-Shen

    1988-01-01

    Uncertainties of a large space system (LSS) can be deterministic or stochastic in nature. The former may result in, for example, an energy spillover problem by which the interaction between unmodeled modes and controls may cause system instability. The stochastic uncertainties are responsible for mode localization and estimation errors, etc. We will address the effects of uncertainties on structural model formulation, use of available test data to verify and modify analytical models before orbiting, and how the system model can be further improved in the on-orbit environment.

  3. Uncertainty relations for characteristic functions

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz; Tasca, D. S.; Walborn, S. P.

    2016-02-01

    We present the uncertainty relation for the characteristic functions (ChUR) of the quantum mechanical position and momentum probability distributions. This inequality is more general than the Heisenberg uncertainty relation and is saturated in two extreme cases for wave functions described by periodic Dirac combs. We further discuss a broad spectrum of applications of the ChUR; in particular, we constrain quantum optical measurements involving general detection apertures and provide the uncertainty relation that is relevant for loop quantum cosmology. A method to measure the characteristic function directly using an auxiliary qubit is also briefly discussed.

  4. Managing uncertainty in family practice.

    PubMed Central

    Biehn, J.

    1982-01-01

    Because patients present in the early stages of undifferentiated problems, the family physician often faces uncertainty, especially in diagnosis and management. The physician's uncertainty may be unacceptable to the patient and may lead to inappropriate use of diagnostic procedures. The problem is intensified by the physician's hospital training, which emphasizes mastery of available knowledge and decision-making based on certainty. Strategies by which a physician may manage uncertainty include (a) a more open doctor-patient relationship, (b) understanding the patient's reason for attending the office, (c) a thorough assessment of the problem, (d) a commitment to reassessment and (e) appropriate consultation. PMID:7074488

  5. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principles of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total-power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is that of applied statistics.

  6. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.
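
    A stripped-down version of the disparity statistics described above: within one interrogation window, the mean disparity indicates a systematic error and the dispersion of the disparity vectors estimates the random error. The quadrature combination at the end is an assumption made for this illustration; the published estimator may differ in detail.

      import numpy as np

      def disparity_uncertainty(disparity):
          """Per-component error estimates from particle-image disparity vectors.

          disparity: (N, 2) array of residual particle-pair distances (pixels).
          """
          d = np.asarray(disparity, dtype=float)
          bias = d.mean(axis=0)                             # systematic component
          random = d.std(axis=0, ddof=1) / np.sqrt(len(d))  # random component
          return bias, random, np.hypot(bias, random)       # assumed combination

      # Example: 40 disparity vectors with a small systematic x-shift.
      rng = np.random.default_rng(2)
      d = rng.normal([0.08, 0.0], 0.25, size=(40, 2))
      print(disparity_uncertainty(d))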

  7. Principles of Optics

    NASA Astrophysics Data System (ADS)

    Born, Max; Wolf, Emil

    1999-10-01

    Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.

  8. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  9. Non-scalar uncertainty: Uncertainty in dynamic systems

    NASA Technical Reports Server (NTRS)

    Martinez, Salvador Gutierrez

    1992-01-01

    The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology; applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful to describe the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found wiser to get an approximate solution to an accurate model than to get a precise solution to a model constrained by simplifying assumptions. Precision has a very heavy cost in present physical models, but this formalism allows the trade between uncertainty and simplicity. It was found that modeling reality sometimes requires that state transition probabilities be manipulated as non-scalar quantities, finding in the end that there is always a transformation to get back to scalar probability.

  10. The Bayesian brain: phantom percepts resolve sensory uncertainty.

    PubMed

    De Ridder, Dirk; Vanneste, Sven; Freeman, Walter

    2014-07-01

    Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises why the brain creates these false percepts in the absence of an external stimulus. The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they are not worth conscious processing. Unpredictable things, on the other hand, are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience and stress and essential for stimulus detection. Associated with sensory cortex hyperactivity and decreased inhibition or map plasticity, this will result in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation. In conclusion, the Bayesian updating of knowledge via active sensory exploration of the environment, driven by the Shannonian free-energy principle, provides an explanation for the generation of phantom percepts as a way to reduce uncertainty, to make sense of the world. PMID:22516669

  11. Uncertainty analysis of thermoreflectance measurements.

    PubMed

    Yang, Jia; Ziade, Elbara; Schmidt, Aaron J

    2016-01-01

    We derive a generally applicable formula to calculate the precision of multi-parameter measurements that apply least squares algorithms. This formula, which accounts for experimental noise and uncertainty in the controlled model parameters, is then used to analyze the uncertainty of thermal property measurements with pump-probe thermoreflectance techniques. We compare the uncertainty of time domain thermoreflectance and frequency domain thermoreflectance (FDTR) when measuring bulk materials and thin films, considering simultaneous measurements of various combinations of thermal properties, including thermal conductivity, heat capacity, and thermal boundary conductance. We validate the uncertainty analysis using Monte Carlo simulations on data from FDTR measurements of an 80 nm gold film on fused silica. PMID:26827342
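
    The abstract does not reproduce its formula, but the textbook precision estimate for least-squares fits is the parameter covariance (J^T J)^-1 s^2, built from the Jacobian J of the model and the residual variance s^2. The sketch below applies that standard result to a toy two-parameter model as a stand-in for the paper's more general expression, which additionally accounts for uncertainty in the controlled model parameters.

      import numpy as np

      def lsq_parameter_sigma(J, residuals):
          """Standard errors of least-squares parameters: sqrt(diag((J^T J)^-1 s^2))."""
          J = np.asarray(J, dtype=float)
          r = np.asarray(residuals, dtype=float)
          s2 = (r @ r) / (J.shape[0] - J.shape[1])  # residual variance estimate
          cov = np.linalg.inv(J.T @ J) * s2         # parameter covariance
          return np.sqrt(np.diag(cov))

      # Toy model y = a*exp(-x) + b: Jacobian columns are d/da and d/db.
      x = np.linspace(0.0, 3.0, 50)
      J = np.column_stack([np.exp(-x), np.ones_like(x)])
      rng = np.random.default_rng(3)
      residuals = rng.normal(0.0, 0.02, x.size)     # stand-in post-fit residuals
      print(lsq_parameter_sigma(J, residuals))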

  12. Scientific Uncertainty: An Industry Perspective.

    ERIC Educational Resources Information Center

    Perhac, Ralph

    1986-01-01

    Discusses the uncertainties inherent in assessing the nature and extent of any damage that might be attributed to acidic deposition. Probes associated dilemmas related to decisions involving control strategies, and indicates societal and legislative roles for solving this problem. (ML)

  13. Fundamental uncertainties in lung counting.

    PubMed

    Kramer, Gary H; Hauck, Barry M

    2007-10-01

    The HML has investigated the uncertainty introduced into an activity estimate from a lung count by 1) replicate counts and 2) lung set variability. Replicate counts in the HML seem to be affected only by random statistics, as the uncertainty can be predicted by Monte Carlo simulations. The findings from the lung set variability experiments suggest that a lung set has an unquantified uncertainty on its activity that adds a component to the uncertainty on the counting efficiency, and ultimately the activity estimate: lung sets can differ by as much as 30% at 17.5 keV, or about 13% at 185.7 keV, when one is expecting only a 3% difference. PMID:17846529

  14. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  15. Climate targets: Values and uncertainty

    NASA Astrophysics Data System (ADS)

    Lempert, Robert J.

    2015-10-01

    Policymakers know that the risks associated with climate change mean they need to cut greenhouse-gas emissions. But uncertainty surrounding the likelihood of different scenarios makes choosing specific policies difficult.

  16. Visualizing uncertainty about the future.

    PubMed

    Spiegelhalter, David; Pearson, Mike; Short, Ian

    2011-09-01

    We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge. PMID:21903802

  17. Haplotype uncertainty in association studies.

    PubMed

    Mensah, F K; Gilthorpe, M S; Davies, C F; Keen, L J; Adamson, P J; Roman, E; Morgan, G J; Bidwell, J L; Law, G R

    2007-05-01

    Inferring haplotypes from genotype data is commonly undertaken in population genetic association studies. Within such studies the importance of accounting for uncertainty in the inference of haplotypes is well recognised. We investigate the effectiveness of correcting for uncertainty using simple methods based on the output provided by the PHASE haplotype inference methodology. In case-control analyses investigating non-Hodgkin lymphoma and haplotypes associated with immune regulation we find little effect of making adjustment for uncertainty in inferred haplotypes. Using simulation we introduce a higher degree of haplotype uncertainty than was present in our study data. The simulation represents two genetic loci, physically close on a chromosome, forming haplotypes. Considering a range of allele frequencies, degrees of linkage between the loci, and frequency of missing genotype data, we detail the characteristics of genetic regions which may be susceptible to the influence of haplotype uncertainty. Within our evaluation we find that bias is avoided by considering haplotype probabilities or using multiple imputation, provided that for each of these methods haplotypes are inferred separately for case and control populations; furthermore using multiple imputation provides the facility to incorporate haplotype uncertainty in the estimation of confidence intervals. We discuss the implications of our findings within the context of the complexity of haplotype inference for larger marker rich regions as would typically be encountered in genetic analyses. PMID:17323369
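
    The multiple-imputation option mentioned above can be sketched as: draw a haplotype assignment for each subject from its posterior probabilities, analyze each completed data set, then pool the estimates with Rubin's rules (which is where the haplotype uncertainty enters the confidence intervals). Everything below is illustrative; the per-subject posterior probabilities are assumed to come from a tool such as PHASE.

      import numpy as np

      def rubin_pool(estimates, variances):
          """Pool per-imputation estimates/variances with Rubin's rules."""
          m = len(estimates)
          within = np.mean(variances)
          between = np.var(estimates, ddof=1)
          return np.mean(estimates), np.sqrt(within + (1 + 1/m) * between)

      rng = np.random.default_rng(4)
      n, m = 200, 20                                  # subjects, imputations
      probs = rng.dirichlet([2, 1, 1], size=n)        # posterior haplotype probs
      y = rng.normal(size=n)                          # phenotype (illustrative)

      ests, vars_ = [], []
      for _ in range(m):
          h = np.array([rng.choice(3, p=pr) for pr in probs])  # imputed haplotypes
          x = (h == 0).astype(float)                  # carrier indicator
          beta = np.cov(x, y)[0, 1] / x.var(ddof=1)   # simple regression slope
          se2 = y.var(ddof=1) / (x.var(ddof=1) * n)   # its approximate variance
          ests.append(beta)
          vars_.append(se2)

      print(rubin_pool(ests, vars_))                  # pooled estimate, std error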

  18. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  19. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  20. The Idiom Principle Revisited

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  1. Business Principles 201.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education, Winnipeg.

    This teaching guide consists of guidelines for conducting a secondary-level course on business principles. Intended as part of an office skills or accounting/data processing program, the course provides the management viewpoint toward the planning and operation of a business. First, the goals and objectives of the course are outlined. Provided…

  2. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

  3. STANDARD SETTING PRINCIPLES

    EPA Science Inventory

    The basis for setting drinking water standards has not changed much in principle during the past decade, but the procedure for creating them in an open manner has caused the United States, at least, to go through a much more elaborate process to obtain approval and support from t...

  5. First Principles of Instruction.

    ERIC Educational Resources Information Center

    Merrill, M. David

    2002-01-01

    Examines instructional design theories and elaborates principles about when learning is promoted, i.e., when learners are engaged in solving real-world problems, when existing knowledge is activated as a foundation for new knowledge, and when new knowledge is demonstrated to the learner, applied by the learner, and integrated into the learner's…

  6. Principles of Cancer Screening.

    PubMed

    Pinsky, Paul F

    2015-10-01

    Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike. PMID:26315516

  8. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  9. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  10. Matters of Principle.

    ERIC Educational Resources Information Center

    Martz, Carlton

    1999-01-01

    This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

  11. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

  12. Uncertainty in perception and the Hierarchical Gaussian Filter.

    PubMed

    Mathys, Christoph D; Lomakina, Ekaterina I; Daunizeau, Jean; Iglesias, Sandra; Brodersen, Kay H; Friston, Karl J; Stephan, Klaas E

    2014-01-01

    In its full sense, perception rests on an agent's model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the Hierarchical Gaussian Filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF's hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder-Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient, but at the same time intuitive, framework for the resolution of perceptual uncertainty in behaving agents. PMID:25477800
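
    The update equations themselves are not reproduced in the abstract, so the sketch below is only a toy in the same spirit: a two-level Gaussian filter whose lower-level belief is updated by a precision-weighted prediction error, with the effective learning rate modulated by a higher-level volatility belief. It is a hand-rolled illustration, not the published HGF.

      import numpy as np

      def toy_hierarchical_filter(observations, noise_var=1.0, vol_rate=0.05):
          """Two-level Gaussian belief updating (HGF-flavored toy)."""
          mu, var, log_vol = 0.0, 1.0, 0.0
          track = []
          for y in observations:
              var += np.exp(log_vol)           # volatility inflates uncertainty
              pe = y - mu                      # prediction error
              s = var + noise_var              # predictive variance
              mu += (var / s) * pe             # precision-weighted update
              var *= noise_var / s             # posterior variance
              log_vol += vol_rate * (pe**2 / s - 1.0)  # surprise raises volatility
              track.append(mu)
          return np.array(track)

      rng = np.random.default_rng(5)
      truth = np.concatenate([np.zeros(100), 3.0 * np.ones(100)])  # hidden jump
      obs = truth + rng.normal(0.0, 1.0, truth.size)
      print(toy_hierarchical_filter(obs)[[99, 120, 199]])  # adapts after the jump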

  14. Study on uncertainty of geospatial semantic Web services composition based on broker approach and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Yang, Xiaodong; Cui, Weihong; Liu, Zhen; Ouyang, Fucheng

    2008-10-01

    The Semantic Web has a major weakness: it lacks a principled means to represent and reason about uncertainty. The same weakness is present in service composition approaches such as BPEL4WS and the Semantic Description Model. We analyze the uncertainty of geospatial Web service composition by mining the knowledge in historical composition records, based on a broker approach and Bayesian networks. A sample scenario in this paper shows this approach to be effective and efficient.

  15. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  16. Uncertainty Quantification in Solidification Modelling

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2015-06-01

    Numerical models have been used to simulate solidification processes, to gain insight into physical phenomena that cannot be observed experimentally. Often validation of such models has been done through comparison to a few or even a single experiment, in which agreement is dependent on both model and experimental uncertainty. As a first step to quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady-state 1D solidification model of continuous casting of weld filler rod. This model, which includes conduction, advection, and release of latent heat, was developed for use in uncertainty quantification of the calculated positions of the liquidus and solidus and of the solidification time. Using this model, a Smolyak sparse grid algorithm constructed a response surface that fits model outputs based on the range of uncertainty in the inputs to the model. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and the sensitivities of the inputs. This process was done for a linear fraction solid-temperature relationship, for which there is an analytical solution, and for a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
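
    The workflow above (sample the uncertain inputs, fit a response surface, push a large sample through the cheap surrogate to obtain output PDFs) can be sketched generically. Plain random sampling and a quadratic surrogate stand in here for the paper's Smolyak sparse grid, and the model function is a made-up placeholder, since the actual solidification model is not reproduced in the abstract.

      import numpy as np

      def model(latent_heat, conductivity):
          """Placeholder for the solidification model's output (e.g. freeze time)."""
          return 10.0 * latent_heat / conductivity

      rng = np.random.default_rng(6)

      # 1) Sample the uncertain inputs over assumed ranges:
      L = rng.uniform(0.9, 1.1, 200)   # latent heat, +/- 10 percent
      k = rng.uniform(0.8, 1.2, 200)   # conductivity, +/- 20 percent
      t = model(L, k)

      # 2) Fit a quadratic response surface t ~ f(L, k):
      basis = lambda a, b: np.column_stack(
          [np.ones_like(a), a, b, a * b, a**2, b**2])
      coef, *_ = np.linalg.lstsq(basis(L, k), t, rcond=None)

      # 3) Reuse the cheap surrogate for a large sample -> output PDF and stats:
      Lm = rng.uniform(0.9, 1.1, 100_000)
      km = rng.uniform(0.8, 1.2, 100_000)
      tm = basis(Lm, km) @ coef
      pdf, edges = np.histogram(tm, bins=50, density=True)  # empirical PDF
      print(tm.mean(), tm.std())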

  17. Communicating Uncertainties on Climate Change

    NASA Astrophysics Data System (ADS)

    Planton, S.

    2009-09-01

    The term 'uncertainty' is confusing in common language, since in one of its most usual senses it refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion through the media to a wide public. From its scientific basis to impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Faced with this diversity of approaches, communicating uncertainties on climate change is thus a great challenge. It is also complicated by the diversity of the targets of the communication on climate change, from stakeholders and policy makers to the wide public. We will present the process chosen by the IPCC in order to communicate uncertainties in its assessment reports, taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples aiming to illustrate how to avoid the above-mentioned ambiguity when dealing with this kind of communication.

  18. Principles of Natural Photosynthesis.

    PubMed

    Krewald, Vera; Retegan, Marius; Pantazis, Dimitrios A

    2016-01-01

    Nature relies on a unique and intricate biochemical setup to achieve sunlight-driven water splitting. Combined experimental and computational efforts have produced significant insights into the structural and functional principles governing the operation of the water-oxidizing enzyme Photosystem II in general, and of the oxygen-evolving manganese-calcium cluster at its active site in particular. Here we review the most important aspects of biological water oxidation, emphasizing current knowledge on the organization of the enzyme, the geometric and electronic structure of the catalyst, and the role of calcium and chloride cofactors. The combination of recent experimental work on the identification of possible substrate sites with computational modeling have considerably limited the possible mechanistic pathways for the critical O-O bond formation step. Taken together, the key features and principles of natural photosynthesis may serve as inspiration for the design, development, and implementation of artificial systems. PMID:26099285

  19. Principles of magnetodynamic chemotherapy.

    PubMed

    Babincová, M; Leszczynska, D; Sourivong, P; Babinec, P; Leszczynski, J

    2004-01-01

    The basic principles of a novel method of cancer treatment are explained. The method is based on the thermal activation of an inactive prodrug encapsulated in magnetoliposomes via the Néel and Brown effects of inductive heating of subdomain superparamagnetic particles to sufficiently high temperatures. This principle may be combined with targeted drug delivery (using a constant magnetic field) and controlled release (using a high-frequency magnetic field) of an activated drug entrapped in magnetoliposomes. Using this method, a drug may be applied very selectively at a particular site in the organism, and this procedure may be repeated several times using, e.g., stealth magnetoliposomes, which circulate in the bloodstream for several days. Moreover, the magnetoliposomes concentrated by an external constant magnetic field in the tumor vasculature may lead to embolic lesions and necrosis of the tumor body, and the heat produced for thermal activation of the drug further enhances the effect of chemotherapy by local hyperthermic treatment of neoplastic cells. PMID:14975506

  20. Common Principles and Multiculturalism

    PubMed Central

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have been discussing the suitable tools to determine which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720

  1. Common principles and multiculturalism.

    PubMed

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has produced the misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, and although it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, are consistent with this idea. PMID:23908720

  2. The precautionary principle in environmental science.

    PubMed Central

    Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M

    2001-01-01

    Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy. PMID:11673114

  3. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...

  4. Computational principles of memory.

    PubMed

    Chaudhuri, Rishidev; Fiete, Ila

    2016-02-23

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506

  5. Principles of dielectrics

    SciTech Connect

    Scaife, B.K.P.

    1989-01-01

    This paper focuses on the basic principles of the theory of dielectrics, concentrating on fundamentals including the relevant areas of electrostatics. The author takes a completely classical approach, avoiding quantum mechanics altogether. The electrostatic field in free space, multipole-moment fluctuations, the generalized Kubo equation, the thermodynamics of electrostriction, and the incremental permittivity tensor are among the specific topics examined. Extensive appendices, specific journal references, and a general bibliography are included. Intended for advanced undergraduates in the physical sciences.

  6. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  7. Teaching professionalism: general principles.

    PubMed

    Cruess, Richard L; Cruess, Sylvia R

    2006-05-01

    There are educational principles that apply to the teaching of professionalism during undergraduate education and postgraduate training. It is axiomatic that there is a single cognitive base that applies with increasing moral force as students enter medical school, progress to residency or registrar training, and enter practice. While parts of this body of knowledge are easier to teach and learn at different stages of an individual's career, it remains a definable whole at all times and should be taught as such. While the principle remains that self-reflection on theoretical and real issues encountered in the life of a student, resident or practitioner is essential to the acquisition of experiential learning and the incorporation of the values and behaviors of the professional, the opportunities to provide situations where this can take place will change as an individual progresses through the system, as will the sophistication of the level of learning. Teaching the cognitive base of professionalism and providing opportunities for the internalization of its values and behaviors are the cornerstones of the organization of the teaching of professionalism at all levels. Situated learning theory appears to provide practical guidance as to how this may be implemented. While the application of this theory will vary with the type of curriculum, the institutional culture and the resources available, the principles outlined should remain constant. PMID:16753716

  8. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, Δϕ_HL ∝ 1/N̄_T, where N̄_T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ ∝ 1/N̄_T^k, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility to overcome Δϕ_HL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.
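
    The role of estimator bias can be illustrated numerically. The following sketch illustrates the general point rather than reproducing the Rivas-Luis scheme: the fringe model, photon number, and shrinkage weight are all assumptions. A strongly biased estimator that shrinks toward a prior guess shows a tiny spread at the particular phase value matching its prior, which is exactly the situation in which apparent sub-Heisenberg performance can arise.

        import numpy as np

        rng = np.random.default_rng(0)
        phi_true = 1.0            # true phase (rad); equals the prior guess below
        N, trials = 1000, 2000    # photons per run, number of repeated runs

        # Simple fringe model: each photon gives outcome 1 with prob (1+cos(phi))/2.
        p = (1 + np.cos(phi_true)) / 2
        phat = rng.binomial(N, p, size=trials) / N

        # Fringe-inversion estimator (approximately unbiased away from fringe edges).
        phi_inv = np.arccos(np.clip(2 * phat - 1, -1, 1))

        # Strongly biased estimator: shrink hard toward a prior guess phi0.
        phi0, w = 1.0, 0.99
        phi_shrunk = w * phi0 + (1 - w) * phi_inv

        for name, est in [("inversion", phi_inv), ("shrunk", phi_shrunk)]:
            rmse = np.sqrt(np.mean((est - phi_true) ** 2))
            print(f"{name:9s} spread={est.std():.2e} rmse={rmse:.2e} bias={est.mean() - phi_true:+.2e}")
        # The shrunk estimator "beats" the shot-noise spread only because phi0
        # happens to equal the true phase; elsewhere its bias dominates the error.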

  9. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685
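
    The incentive mechanism can be seen in a back-of-the-envelope pivotality calculation. This is a stylized sketch; the loss, cost, and threshold distribution below are invented for illustration and are not taken from the experiments.

        L = 10.0     # loss to each player if total contributions miss the threshold
        cost = 1.0   # cost of contributing one more unit

        # Known threshold: the unit that completes it averts the loss with
        # certainty, so at the margin contributing pays -> coordination game.
        benefit_known = L * 1.0

        # Threshold uniformly uncertain over 20 possible levels: one extra unit
        # is exactly pivotal with probability only 1/20.
        benefit_uncertain = L * (1 / 20)

        print(benefit_known > cost)      # True  -> cooperation is self-enforcing
        print(benefit_uncertain > cost)  # False -> incentive collapses (dilemma)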

  10. Climate negotiations under scientific uncertainty.

    PubMed

    Barrett, Scott; Dannenberg, Astrid

    2012-10-23

    How does uncertainty about "dangerous" climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners' dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners' dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  11. Uncertainty in Integrative Structural Modeling

    PubMed Central

    Schneidman-Duhovny, Dina; Pellarin, Riccardo; Sali, Andrej

    2014-01-01

    Integrative structural modelling uses multiple types of input information and proceeds in four stages: (i) gathering information, (ii) designing model representation and converting information into a scoring function, (iii) sampling good-scoring models, and (iv) analyzing models and information. In the first stage, uncertainty originates from data that are sparse, noisy, ambiguous, or derived from heterogeneous samples. In the second stage, uncertainty can originate from a representation that is too coarse for the available information or a scoring function that does not accurately capture the information. In the third stage, the major source of uncertainty is insufficient sampling. In the fourth stage, clustering, cross-validation, and other methods are used to estimate the precision and accuracy of the models and information. PMID:25173450

  12. Uncertainty formulations for multislit interferometry

    NASA Astrophysics Data System (ADS)

    Biniok, Johannes C. G.

    2014-12-01

    In the context of (far-field) multislit interferometry we investigate the utility of two formulations of uncertainty in accounting for the complementarity of spatial localization and fringe width. We begin with a characterization of the relevant observables and general considerations regarding the suitability of different types of measures. The detailed analysis shows that both of the discussed uncertainty formulations yield qualitatively similar results, confirming that they correctly capture the relevant tradeoff. One approach, based on an idea of Aharonov and co-workers, is intuitively appealing and relies on a modification of the Heisenberg uncertainty relation. The other approach, developed by Uffink and Hilgevoord for single- and double-slit experiments, is readily applied to multislits. However, it is found that one of the underlying concepts requires generalization and that the choice of parameters requires more care than previously recognized.

  13. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
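
    A minimal sketch of the kind of Monte Carlo uncertainty and sensitivity analysis such a plan prescribes; the three-factor dose model and its distributions below are invented placeholders, not HEDR parameters.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000

        # Toy dose model: dose = release * transport factor * intake, all uncertain.
        release = rng.lognormal(np.log(1.0), 0.5, n)     # source term (arbitrary units)
        transport = rng.lognormal(np.log(1e-6), 0.8, n)  # atmospheric dilution
        intake = rng.uniform(0.5, 1.5, n)                # individual behavior factor
        dose = release * transport * intake

        print("median:", np.median(dose), " 95th pct:", np.percentile(dose, 95))

        # Crude sensitivity ranking: rank correlation of each input with the output.
        def ranks(a):
            return a.argsort().argsort()

        for name, x in [("release", release), ("transport", transport), ("intake", intake)]:
            r = np.corrcoef(ranks(x), ranks(dose))[0, 1]
            print(f"{name:9s} rank correlation with dose: {r:.2f}")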

  14. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  15. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
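
    For a smooth response function, the first-order (delta-method) rule Var[f(X)] ≈ (f'(μ))² Var(X) quantifies how input variation is transmitted to the output. The sketch below, using an arbitrary cubic response chosen for illustration, checks the approximation against Monte Carlo.

        import numpy as np

        mu, sigma = 2.0, 0.1          # input mean and standard deviation
        f = lambda x: x**3            # response function
        fprime = lambda x: 3 * x**2

        # First-order transmitted variance: Var[f(X)] ~ (f'(mu) * sigma)^2.
        var_firstorder = (fprime(mu) * sigma) ** 2

        rng = np.random.default_rng(0)
        y = f(rng.normal(mu, sigma, 1_000_000))
        print(var_firstorder, y.var())   # both ~1.44, agreeing to first order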

  16. Uncertainties in climate data sets

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1992-01-01

    Climate diagnostics are constructed from either analyzed fields or from observational data sets. Those that have been commonly used are normally considered ground truth. However, in most of these collections, errors and uncertainties exist which are generally ignored due to the consistency of usage over time. Examples of uncertainties and errors are described in NMC and ECMWF analyses and in satellite observational sets-OLR, TOVS, and SMMR. It is suggested that these errors can be large, systematic, and not negligible in climate analysis.

  17. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent an information. This dissertation concerns merely discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  18. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent an information. This dissertation concerns merely discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle

  19. Uncertainty in 3D gel dosimetry

    NASA Astrophysics Data System (ADS)

    De Deene, Yves; Jirasek, Andrew

    2015-01-01

    Three-dimensional (3D) gel dosimetry has a unique role to play in safeguarding conformal radiotherapy treatments as the technique can cover the full treatment chain and provides the radiation oncologist with the integrated dose distribution in 3D. It can also be applied to benchmark new treatment strategies such as image guided and tracking radiotherapy techniques. A major obstacle that has hindered the wider dissemination of gel dosimetry in radiotherapy centres is a lack of confidence in the reliability of the measured dose distribution. Uncertainties in 3D dosimeters are attributed to both dosimeter properties and scanning performance. In polymer gel dosimetry with MRI readout, discrepancies in dose response of large polymer gel dosimeters versus small calibration phantoms have been reported which can lead to significant inaccuracies in the dose maps. The sources of error in polymer gel dosimetry with MRI readout are well understood and it has been demonstrated that with a carefully designed scanning protocol, the overall uncertainty in absolute dose that can currently be obtained falls within 5% on an individual voxel basis, for a minimum voxel size of 5 mm3. However, several research groups have chosen to use polymer gel dosimetry in a relative manner by normalizing the dose distribution towards an internal reference dose within the gel dosimeter phantom. 3D dosimetry with optical scanning has also been mostly applied in a relative way, although in principle absolute calibration is possible. As the optical absorption in 3D dosimeters is less dependent on temperature it can be expected that the achievable accuracy is higher with optical CT. The precision in optical scanning of 3D dosimeters depends to a large extend on the performance of the detector. 3D dosimetry with X-ray CT readout is a low contrast imaging modality for polymer gel dosimetry. Sources of error in x-ray CT polymer gel dosimetry (XCT) are currently under investigation and include inherent limitations in dosimeter homogeneity, imaging performance, and errors induced through post-acquisition processing. This overview highlights a number of aspects relating to uncertainties in polymer gel dosimetry.

  20. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  1. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…
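
    A standard way to compare candidate measurement techniques is to propagate each measured quantity's uncertainty into the final result. The sketch below does this for the ideal projectile range R = v² sin(2θ)/g; the launch speed, angle, and their uncertainties are made-up classroom numbers, and the analytic estimate is checked by Monte Carlo.

        import numpy as np

        g = 9.81
        v, sv = 4.8, 0.1                             # launch speed (m/s) and uncertainty
        th, sth = np.deg2rad(40.0), np.deg2rad(1.0)  # launch angle (rad) and uncertainty

        R = v**2 * np.sin(2 * th) / g

        # First-order propagation for independent uncertainties.
        dRdv = 2 * v * np.sin(2 * th) / g
        dRdth = 2 * v**2 * np.cos(2 * th) / g
        sR = np.hypot(dRdv * sv, dRdth * sth)
        print(f"R = {R:.3f} +/- {sR:.3f} m")

        # Monte Carlo check of the analytic estimate.
        rng = np.random.default_rng(0)
        vs, ths = rng.normal(v, sv, 100_000), rng.normal(th, sth, 100_000)
        print("MC spread:", (vs**2 * np.sin(2 * ths) / g).std())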

  2. Uncertainties in radiation flow experiments

    NASA Astrophysics Data System (ADS)

    Fryer, C. L.; Dodd, E.; Even, W.; Fontes, C. J.; Greeff, C.; Hungerford, A.; Kline, J.; Mussack, K.; Tregillis, I.; Workman, J. B.; Benstead, J.; Guymer, T. M.; Moore, A. S.; Morton, J.

    2016-03-01

    Although the fundamental physics behind radiation and matter flow is understood, many uncertainties remain in the exact behavior of macroscopic fluids in systems ranging from pure turbulence to coupled radiation hydrodynamics. Laboratory experiments play an important role in studying this physics to allow scientists to test their macroscopic models of these phenomena. However, because the fundamental physics is well understood, precision experiments are required to validate existing codes already tested by a suite of analytic, manufactured and convergence solutions. To conduct such high-precision experiments requires a detailed understanding of the experimental errors and the nature of their uncertainties on the observed diagnostics. In this paper, we study the uncertainties plaguing many radiation-flow experiments, focusing on those using a hohlraum (dynamic or laser-driven) source and a foam-density target. This study focuses on the effect these uncertainties have on the breakout time of the radiation front. We find that, even if the errors in the initial conditions and numerical methods are Gaussian, the errors in the breakout time are asymmetric, leading to a systematic bias in the observed data. We must understand these systematics to produce the high-precision experimental results needed to study this physics.
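
    The asymmetry mechanism is easy to reproduce in a toy model: push symmetric Gaussian input errors through a nonlinear breakout-time scaling and the output error distribution comes out skewed. The power-law scaling and error sizes below are illustrative assumptions, not the paper's calibrated values.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 50_000

        # Symmetric Gaussian errors on foam density and drive temperature.
        rho = rng.normal(1.0, 0.03, n)   # relative foam density
        T = rng.normal(1.0, 0.02, n)     # relative drive temperature

        # Toy scaling: breakout time grows with density, falls steeply with drive.
        t_breakout = rho**2 / T**4

        err = t_breakout - 1.0
        skew = np.mean((err - err.mean())**3) / err.std()**3
        print(f"mean shift = {err.mean():+.4f}, skewness = {skew:+.2f}")
        # Nonzero mean shift and skewness: Gaussian inputs, asymmetric output errors.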

  3. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  4. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between

  5. Quantification of entanglement via uncertainties

    SciTech Connect

    Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S.

    2007-03-15

    We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.
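
    The flavor of such a variance-based measure can be shown in a few lines: for two qubits, sum the variances of the six local spin observables. A pure product state attains the minimum, while a Bell state attains the maximum. This is a minimal sketch in the spirit of the total-variance measure, not the authors' full construction.

        import numpy as np

        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
        sz = np.array([[1, 0], [0, -1]], dtype=complex)
        I2 = np.eye(2, dtype=complex)

        def total_variance(psi):
            # Sum of variances of the six local spin observables in state |psi>.
            obs = [np.kron(s, I2) for s in (sx, sy, sz)] + \
                  [np.kron(I2, s) for s in (sx, sy, sz)]
            return sum(((psi.conj() @ A @ A @ psi) - (psi.conj() @ A @ psi) ** 2).real
                       for A in obs)

        product = np.kron([1, 0], [1, 0]).astype(complex)       # |00>
        bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
        print(total_variance(product))  # 4.0 -> minimum over pure two-qubit states
        print(total_variance(bell))     # 6.0 -> maximal local uncertainty: entangled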

  6. The physical principles of quantum mechanics. A critical review

    NASA Astrophysics Data System (ADS)

    Strocchi, F.

    2012-01-01

    The standard presentation of the principles of quantum mechanics is critically reviewed, both from the experimental/operational point of view and with respect to the requirements of mathematical consistency and logical economy. A simpler and more physically motivated formulation is discussed. The existence of noncommuting observables, which characterizes quantum mechanics with respect to classical mechanics, is related to operationally testable complementarity relations, rather than to uncertainty relations. The drawbacks of Dirac's argument for canonical quantization are avoided by a more geometrical approach.

  7. Principles of tendon transfers.

    PubMed

    Coulet, B

    2016-04-01

    Tendon transfers are carried out to restore functional deficits by rerouting the remaining intact muscles. Transfers are highly attractive in the context of hand surgery because of the possibility of restoring the patient's ability to grip. In palsy cases, tendon transfers are only used when a neurological procedure is contraindicated or has failed. The strategy used to restore function follows a common set of principles, no matter the nature of the deficit. The first step is to clearly distinguish between deficient muscles and muscles that could be transferred. Next, the type of palsy will dictate the scope of the program and the complexity of the gripping movements that can be restored. Based on this reasoning, a surgical strategy that matches the means (transferable muscles) with the objectives (functions to restore) will be established and clearly explained to the patient. Every paralyzed hand can be described using three parameters: 1) deficient segments: wrist, thumb and long fingers; 2) mechanical performance of the muscle groups being revived: high energy (wrist extension and finger flexion, which require strong transfers with long excursion) versus low energy (wrist flexion and finger extension, which are less demanding mechanically because in some cases they can be accomplished through gravity alone); 3) condition of the two primary motors in the hand: extrinsics (flexors and extensors) and intrinsics (facilitator). No matter the type of palsy, the transfer surgery follows the same technical principles: exposure, release, fixation, tensioning and rehabilitation. By performing an in-depth analysis of each case and by following strict technical principles, tendon transfer surgery leads to reproducible results; this allows the surgeon to establish clear objectives for the patient preoperatively. PMID:27117119

  8. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.
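
    A minimal sketch of the access-matrix idea described above, with subject, object, and right names invented for illustration: the reference monitor looks up the (subject, object) cell of the matrix before granting each attempted access, with the subject identity supplied by the system rather than by the process itself.

        # Minimal access-matrix check (names and rights are illustrative).
        access_matrix = {
            ("p1", "file_a"): {"read", "write"},
            ("p1", "file_b"): {"read"},
            ("p2", "file_b"): {"read", "write"},
        }

        def attempt(subject: str, obj: str, right: str) -> bool:
            # The system, not the process, supplies `subject`, so it cannot be forged.
            return right in access_matrix.get((subject, obj), set())

        print(attempt("p1", "file_b", "write"))   # False -> access denied
        print(attempt("p2", "file_b", "write"))   # True  -> access granted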

  9. Principles of smile design

    PubMed Central

    Bhuvaneswaran, Mohan

    2010-01-01

    An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result does not depend on looks alone. Our ultimate goal as clinicians is to achieve a pleasing composition in the smile by creating an arrangement of various esthetic elements. This article reviews the various principles that govern the art of smile designing. The literature search was done using PubMed and Medline. The article provides the reader with the basic knowledge needed to produce a functional, stable smile. PMID:21217950

  10. Principles of electromagnetic theory

    SciTech Connect

Kovetz, A.H.

    1990-01-01

    This book emphasizes the fundamental understanding of the laws governing the behavior of charge and current carrying bodies. Electromagnetism is presented as a classical theory, based, like mechanics, on principles that are independent of the atomic constitution of matter. This book is unique among electromagnetic texts in its treatment of the precise manner in which electromagnetism is linked to mechanics and thermodynamics. Applications include electrostriction, piezoelectricity, ferromagnetism, superconductivity, thermoelectricity, magnetohydrodynamics, radiation from charged particles, electromagnetic wave propagation and guided waves. There are many worked examples of dynamical and thermal effects of electromagnetic fields, and of effects resulting from the motion of bodies.

  11. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

  12. Principles for School Drug Education

    ERIC Educational Resources Information Center

    Meyer, Lois

    2004-01-01

    This document presents a revised set of principles for school drug education. The principles for drug education in schools comprise an evolving framework that has proved useful over a number of decades in guiding the development of effective drug education. The first edition of "Principles for Drug Education in Schools" (Ballard et al. 1994) has…

  13. Academic Principles: A Brief Introduction

    ERIC Educational Resources Information Center

    Association of American Universities, 2013

    2013-01-01

    For many decades certain core principles have guided the conduct of teaching, research, and scholarship at American universities, as well as the ways in which these institutions are governed. There is ample evidence that these principles have strongly contributed to the quality of American universities. The principles have also made these…

  14. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

  15. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry over the past 30 years, one major deficiency remains: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as candidate solutions to this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly the rules are believed is constructed. From this, the idea of how well a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
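
    A small worked example of the rough-set step, with a toy symptom table invented for illustration: equivalence classes under the condition attributes yield a lower approximation, from which certain rules are read off, and an upper approximation, which yields only possible rules.

        from collections import defaultdict

        # Toy table: (temperature, headache) -> diagnosis.
        conds = {"p1": ("high", "yes"), "p2": ("high", "yes"), "p3": ("high", "no"),
                 "p4": ("low", "no"), "p5": ("high", "no")}
        decision = {"p1": "flu", "p2": "flu", "p3": "flu",
                    "p4": "healthy", "p5": "healthy"}

        # Equivalence classes of objects indistinguishable by the condition attributes.
        classes = defaultdict(set)
        for obj, desc in conds.items():
            classes[desc].add(obj)

        X = {o for o, d in decision.items() if d == "flu"}   # target concept
        lower = {o for c in classes.values() if c <= X for o in c}
        upper = {o for c in classes.values() if c & X for o in c}

        print("certain  (lower):", sorted(lower))   # ['p1', 'p2'] -> certain rules
        print("possible (upper):", sorted(upper))   # adds boundary cases p3, p5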

  16. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
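
    The basic move is simple for statistics that are monotone in each data point: evaluate them at the all-lower and all-upper endpoint configurations to bound the result. A minimal sketch with made-up intervals follows; statistics such as the variance need combinatorial algorithms instead.

        # Each measurement is known only to lie in [lo, hi].
        data = [(1.2, 1.5), (0.9, 1.1), (1.3, 1.9), (1.0, 1.4), (1.1, 1.3)]
        n = len(data)
        lows = sorted(a for a, _ in data)
        highs = sorted(b for _, b in data)

        # Mean and median are monotone in every coordinate, so their extreme
        # values occur at the all-low and all-high configurations.
        mean_bounds = (sum(lows) / n, sum(highs) / n)
        median_bounds = (lows[n // 2], highs[n // 2])
        print("mean   in", mean_bounds)
        print("median in", median_bounds)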

  17. Principles of Safety Pharmacology

    PubMed Central

    Pugsley, M K; Authier, S; Curtis, M J

    2008-01-01

    Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice, the burden of proof and to highlight areas of intensive activity, such as testing for drug-induced rare event liability and the challenge of testing the safety of so-called biologics (antibodies, gene therapy, and so on). PMID:18604233

  18. [Principles of callus distraction].

    PubMed

    Hankemeier, S; Bastian, L; Gosling, T; Krettek, C

    2004-10-01

    Callus distraction is based on the principle of regenerating bone by continuous distraction of proliferating callus tissue. It has become the standard treatment of significant leg shortening and large bone defects. Due to many problems and complications, exact preoperative planning, operative technique and careful postoperative follow-up are essential. External fixators can be used for all indications of callus distraction. However, due to pin tract infections, pain and loss of mobility caused by soft tissue transfixation, fixators are applied in patients with open growth plates, simultaneous lengthening with continuous deformity corrections, and increased risk of infection. Distraction over an intramedullary nail allows removal of the external fixator at the end of distraction before callus consolidation (monorail method). The intramedullary nail protects newly formed callus tissue and reduces the risk of axial deviation and refractures. Recently developed, fully intramedullary lengthening devices eliminate fixator-associated complications and accelerate return to normal daily activities. This review describes principles of callus distraction, potential complications and their management. PMID:15452653

  19. Principle of relative locality

    SciTech Connect

    Amelino-Camelia, Giovanni; Freidel, Laurent; Smolin, Lee; Kowalski-Glikman, Jerzy

    2011-10-15

    We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

  20. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state-of-the-art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation of the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the on-going organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends, funding opportunities and to discuss the future of structural dynamics.

  1. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
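
    The structure of such an evaluation can be sketched by Monte Carlo: add a rare residual-human-error term to a conventional uncertainty contribution and see how much the combined standard uncertainty grows. The probability and magnitude below are invented, standing in for the paper's expert-elicited values.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Conventional (instrumental) contribution to a pH result: u = 0.02.
        base = rng.normal(0.0, 0.02, n)

        # Residual human-error risk: with probability 1% an error of typical
        # size 0.10 pH units slips past the laboratory quality system.
        slips = rng.binomial(1, 0.01, n) * rng.normal(0.0, 0.10, n)

        total = base + slips
        print(f"u(base) = {base.std():.4f},  u(total) = {total.std():.4f}")
        # The human-error term inflates the budget noticeably but does not dominate.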

  2. Alignment uncertainty and genomic analysis.

    PubMed

    Wong, Karen M; Suchard, Marc A; Huelsenbeck, John P

    2008-01-25

    The statistical methods applied to the analysis of genomic data do not account for uncertainty in the sequence alignment. Indeed, the alignment is treated as an observation, and all of the subsequent inferences depend on the alignment being correct. This may not have been too problematic for many phylogenetic studies, in which the gene is carefully chosen for, among other things, ease of alignment. However, in a comparative genomics study, the same statistical methods are applied repeatedly on thousands of genes, many of which will be difficult to align. Using genomic data from seven yeast species, we show that uncertainty in the alignment can lead to several problems, including different alignment methods resulting in different conclusions. PMID:18218900

  3. Uncertainty in flood risk mapping

    NASA Astrophysics Data System (ADS)

    Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo

    2014-05-01

    A flood is a sharp increase of water level or volume in rivers and seas, caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms is studied. In this context, flooding occurs when the water rises above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the resulting flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can translate into erroneous risk predictions. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, which depends on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated peak flow and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic. With this methodology a fuzzy number is obtained for the peak flow, which indicates all possible peak flow values and the possibility of their occurrence. To produce the LCM, a supervised soft classifier is used to classify a satellite image, and a possibility distribution is assigned to the pixels. These extra data provide additional land cover information at the pixel level and allow the assessment of the classification uncertainty, which is then considered in the identification of the parameter uncertainty used to compute the peak flow. The proposed approach was applied to produce vulnerability and risk maps that integrate uncertainty for the urban area of Leiria, Portugal. A SPOT-4 satellite image and DEMs of the region were used, and the peak flow was computed using the Soil Conservation Service method. The HEC-HMS, HEC-RAS, Matlab and ArcGIS software programs were used. The analysis of the results for the presented case study establishes the order of magnitude of the uncertainty in the watershed peak flow value and identifies the areas most susceptible to flood risk.
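
    The fuzzy-arithmetic step can be sketched with alpha-cuts: represent each uncertain parameter as a triangular fuzzy number, take its interval at each membership level, and push the intervals through a monotone peak-flow formula. For compactness the sketch uses the rational method Q = 0.278·C·i·A in place of the Soil Conservation Service computation used in the paper, and all parameter values are invented.

        import numpy as np

        def alpha_cut(a, m, b, alpha):
            # Interval of the triangular fuzzy number (a, m, b) at level alpha.
            return a + alpha * (m - a), b - alpha * (b - m)

        for alpha in np.linspace(0.0, 1.0, 6):
            C = alpha_cut(0.30, 0.40, 0.50, alpha)   # runoff coefficient
            i = alpha_cut(20.0, 25.0, 35.0, alpha)   # rainfall intensity (mm/h)
            A = alpha_cut(9.5, 10.0, 10.5, alpha)    # catchment area (km^2)
            # Q = 0.278*C*i*A (m^3/s) is increasing in each input, so the
            # interval endpoints map directly onto the peak-flow interval.
            q_lo = 0.278 * C[0] * i[0] * A[0]
            q_hi = 0.278 * C[1] * i[1] * A[1]
            print(f"alpha={alpha:.1f}: Q in [{q_lo:5.1f}, {q_hi:5.1f}] m^3/s")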

  4. Evaluation of process inventory uncertainties

    SciTech Connect

    Roberts, N.J.

    1980-01-01

    This paper discusses the determination of some of the process inventory uncertainties in the Fast Flux Test Facility (FFTF) process line at the Los Alamos Scientific Laboratory (LASL) Plutonium Processing Facility (TA-55). A brief description of the FFTF process is given, along with a more detailed look at the peroxide precipitation and re-dissolution (PR) process. Emphasis is placed on the identification of the product and sidestreams from the unit processes, as they have application to the accountability measurements. The method of measurement of each of the product and sidestreams and their associated uncertainties are discussed. Some typical data for the PR process are presented, along with a discussion of the data. The data presented are based on our operating experience and data on file in the TA-55 Nuclear Material Accountability System (PF/LASS).

  5. Credible Software and Simulation Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; Nixon, David (Technical Monitor)

    1998-01-01

    The utility of software depends primarily on its reliability and performance, whereas its significance depends solely on its credibility for the intended use. The credibility of simulations confirms the credibility of software. The level of veracity and the level of validity of simulations determine the degree of credibility of simulations. The process of assessing this credibility in fields such as computational mechanics (CM) differs from that followed by the Defense Modeling and Simulation Office in operations research. Verification and validation (V&V) of CM simulations is not the same as V&V of CM software. Uncertainty is the measure of simulation credibility. Designers who use software are concerned with the management of simulation uncertainty. Terminology and concepts are presented with a few examples from computational fluid dynamics.

  6. Principles of Induction Accelerators

    NASA Astrophysics Data System (ADS)

Briggs, Richard J.

    The basic concepts involved in induction accelerators are introduced in this chapter. The objective is to provide a foundation for the more detailed coverage of key technology elements and specific applications in the following chapters. A wide variety of induction accelerators are discussed in the following chapters, from the high current linear electron accelerator configurations that have been the main focus of the original developments, to circular configurations like the ion synchrotrons that are the subject of more recent research. The main focus in the present chapter is on the induction module containing the magnetic core that plays the role of a transformer in coupling the pulsed power from the modulator to the charged particle beam. This is the essential common element in all these induction accelerators, and an understanding of the basic processes involved in its operation is the main objective of this chapter. (See [1] for a useful and complementary presentation of the basic principles in induction linacs.)

  7. Kepler and Mach's Principle

    NASA Astrophysics Data System (ADS)

    Barbour, Julian

    The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.

  8. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
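
    As a concrete instance of the GUM-style treatment discussed here, a Type B evaluation encodes an expert's statement that a quantity lies within ±a with no preferred value as a rectangular distribution with standard uncertainty a/√3, which then combines in quadrature with a Type A term. This is a textbook-style sketch; the readings and the ±0.05 bound are invented.

        import math

        # Type A: standard uncertainty of the mean of repeated indications.
        readings = [10.03, 10.05, 9.98, 10.02, 10.04]
        n = len(readings)
        mean = sum(readings) / n
        s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
        u_a = s / math.sqrt(n)

        # Type B: an expert states a correction lies in +/-0.05 with nothing
        # favoring any value -> rectangular distribution, u = a / sqrt(3).
        u_b = 0.05 / math.sqrt(3)

        u_c = math.sqrt(u_a**2 + u_b**2)   # combined standard uncertainty
        print(f"u_A = {u_a:.4f}, u_B = {u_b:.4f}, u_c = {u_c:.4f}")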

  9. Systems-based guiding principles for risk modeling, planning, assessment, management, and communication.

    PubMed

    Haimes, Yacov Y

    2012-09-01

    This article is grounded on the premise that the complex process of risk assessment, management, and communication, when applied to systems of systems, should be guided by universal systems-based principles. It is written from the perspective of systems engineering with the hope and expectation that the principles introduced here will be supplemented and complemented by principles from the perspectives of other disciplines. Indeed, there is no claim that the following 10 guiding principles constitute a complete set; rather, the intent is to initiate a discussion on this important subject that will incrementally lead us to a more complete set of guiding principles. The 10 principles are as follows: First Principle: Holism is the common denominator that bridges risk analysis and systems engineering. Second Principle: The process of risk modeling, assessment, management, and communication must be systemic and integrated. Third Principle: Models and state variables are central to quantitative risk analysis. Fourth Principle: Multiple models are required to represent the essence of the multiple perspectives of complex systems of systems. Fifth Principle: Meta-modeling and subsystems integration must be derived from the intrinsic states of the system of systems. Sixth Principle: Multiple conflicting and competing objectives are inherent in risk management. Seventh Principle: Risk analysis must account for epistemic and aleatory uncertainties. Eighth Principle: Risk analysis must account for risks of low probability with extreme consequences. Ninth Principle: The time frame is central to quantitative risk analysis. Tenth Principle: Risk analysis must be holistic, adaptive, incremental, and sustainable, and it must be supported with appropriate data collection, metrics with which to measure efficacious progress, and criteria on the basis of which to act. The relevance and efficacy of each guiding principle is demonstrated by applying it to the U.S. Federal Aviation Administration complex Next Generation (NextGen) system of systems. PMID:22548671

  10. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

  11. Age models and their uncertainties

    NASA Astrophysics Data System (ADS)

    Marwan, N.; Rehfeld, K.; Goswami, B.; Breitenbach, S. F. M.; Kurths, J.

    2012-04-01

    The usefulness of a proxy record is largely dictated by the accuracy and precision of its age model, i.e., its depth-age relationship. Only if age model uncertainties are minimized can correlations or lead-lag relations be reliably studied. Moreover, due to different dating strategies (14C, U-series, OSL dating, or counting of varves), dating errors or diverging age models lead to difficulties in comparing different palaeo proxy records. Uncertainties in the age model are even more important if exact dating is necessary to calculate, e.g., data series of fluxes or rates (like dust flux records or pollen deposition rates). Several statistical approaches exist to handle the dating uncertainties themselves and to estimate the age-depth relationship. Nevertheless, linear interpolation is still the most commonly used method for age modeling. The uncertainty of a certain event at a given time due to dating errors is often completely neglected. Here we demonstrate the importance of considering dating errors and their implications for the interpretation of variations in palaeo-climate proxy records from stalagmites (U-series dated). We present a simple approach for estimating age models and their confidence levels based on Monte Carlo methods and non-linear interpolation. This novel algorithm also allows for removing age reversals. Our approach delivers a time series of a proxy record with a value range for each age depth, also, if desired, on an equidistant time axis. The algorithm is implemented in interactive scripts for use with MATLAB®, Octave, and FreeMat.
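
    The Monte Carlo idea described above can be sketched compactly; the dated depths, ages, and errors below are invented. Sample the dated horizons from their error distributions, discard draws with age reversals, interpolate each surviving draw, and read confidence bands off the ensemble.

        import numpy as np

        depths = np.array([0.0, 12.0, 30.0, 55.0])         # dated depths (cm)
        age_mu = np.array([100.0, 850.0, 2100.0, 4300.0])  # mean ages (years BP)
        age_sd = np.array([30.0, 60.0, 80.0, 120.0])       # dating errors (1 sigma)

        grid = np.linspace(0.0, 55.0, 111)                 # depths to evaluate
        rng = np.random.default_rng(1)

        draws = []
        while len(draws) < 2000:
            ages = rng.normal(age_mu, age_sd)
            if np.all(np.diff(ages) > 0):                  # reject age reversals
                draws.append(np.interp(grid, depths, ages))

        ens = np.array(draws)
        lo, med, hi = np.percentile(ens, [2.5, 50, 97.5], axis=0)
        print(f"depth {grid[60]:.1f} cm: {med[60]:.0f} yr BP  [{lo[60]:.0f}, {hi[60]:.0f}]")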

  12. Dynamical principles in neuroscience

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  14. Fault Management Guiding Principles

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

    Regardless of the mission type: deep space or low Earth orbit, robotic or human spaceflight, Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the maturity of FM as an engineering discipline, which lags behind that of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and systems engineering organizational structure, to the relationship between FM designs and mission risk, to the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  15. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs), to describe the dynamics of how a disease may spread within a population and to enable the rational design of intervention strategies that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in the extent to which the outcomes/predictions of such models can be trusted. Hence, there is a need to develop approaches by which mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and that characterize the confidence in the (possibly multiple) model outcomes/predictions when such retrospective analysis cannot be performed. In this paper, we outline an approach to developing uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
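
    The paper's approach rests on formal probabilistic model checking against a spatio-temporal logic specification; that machinery is not reproduced here. The sketch below substitutes a bare-bones statistical-model-checking loop in the spirit of the abstract's argument: a Monte Carlo estimate of the probability that a classic SIR model satisfies a bounded-time property (infected prevalence stays below 20% for 160 days) under uncertain parameters, with a Hoeffding bound on the estimate. The SIR model, the parameter priors, and the property are all illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

def sir(t, y, beta, gamma):
    """Classic SIR dynamics in fractions of the population."""
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

def property_holds(beta, gamma, i_max=0.20):
    """Temporal property: infected fraction stays below i_max over [0, 160] days."""
    sol = solve_ivp(sir, (0, 160), [0.999, 0.001, 0.0],
                    args=(beta, gamma), dense_output=True)
    return np.all(sol.sol(np.linspace(0, 160, 801))[1] < i_max)

# Uncertain parameters (illustrative priors, not calibrated values)
n = 2000
betas = rng.uniform(0.25, 0.45, n)
gammas = rng.uniform(0.10, 0.20, n)
sat = np.mean([property_holds(b, g) for b, g in zip(betas, gammas)])
eps = np.sqrt(np.log(2 / 0.05) / (2 * n))   # Hoeffding 95% half-width
print(f"P(property) ~= {sat:.3f} +/- {eps:.3f}")
```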

  16. Word learning under infinite uncertainty.

    PubMed

    Blythe, Richard A; Smith, Andrew D M; Smith, Kenny

    2016-06-01

    Language learners must learn the meanings of many thousands of words, despite those words occurring in complex environments in which infinitely many meanings might be inferred by the learner as a word's true meaning. This problem of infinite referential uncertainty is often attributed to Willard Van Orman Quine. We provide a mathematical formalisation of an ideal cross-situational learner attempting to learn under infinite referential uncertainty, and identify conditions under which word learning is possible. As Quine's intuitions suggest, learning under infinite uncertainty is in fact possible, provided that learners have some means of ranking candidate word meanings in terms of their plausibility; furthermore, our analysis shows that this ranking could in fact be exceedingly weak, implying that constraints which allow learners to infer the plausibility of candidate word meanings could themselves be weak. This approach lifts the burden of explanation from 'smart' word learning constraints in learners, and suggests a programme of research into weak, unreliable, probabilistic constraints on the inference of word meaning in real word learners. PMID:26927884
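
    As a toy finite analogue of the paper's analysis (which treats genuinely infinite uncertainty mathematically), the following sketch simulates an idealized cross-situational learner: each exposure pairs the word with its true meaning plus many spurious candidate meanings, and the learner intersects candidate sets across situations. The meaning-space and candidate-set sizes are invented; the point is only that convergence is fast even when per-situation referential uncertainty is enormous.

```python
import numpy as np

rng = np.random.default_rng(42)

M = 100_000       # size of the meaning space (finite stand-in for "infinite")
C = 1_000         # candidate meanings inferred per exposure
TRUE = 0          # index of the word's true meaning

def exposures_to_learn():
    """Count exposures until the cross-situational intersection is a singleton."""
    candidates = None
    for t in range(1, 100):
        # Each situation: the true meaning plus C-1 spurious candidates.
        situation = set(rng.choice(np.arange(1, M), size=C - 1, replace=False))
        situation.add(TRUE)
        candidates = situation if candidates is None else candidates & situation
        if len(candidates) == 1:
            return t
    return None

trials = [exposures_to_learn() for _ in range(200)]
print(f"median exposures to converge: {np.median(trials):.0f}")
```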

  17. Uncertainty propagation in nuclear forensics.

    PubMed

    Pommé, S; Jerome, S M; Venchiarutti, C

    2014-07-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined. PMID:24607529
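
    A minimal sketch of the atom-ratio chronometry described above, under the textbook assumption of a pure parent at separation time: the daughter/parent atom ratio is inverted numerically for the age, and the age uncertainty follows by first-order propagation of the measured ratio's uncertainty. The nuclide pair, half-lives, and measured values are illustrative only and do not come from the paper (which also treats activity ratios and half-life uncertainty).

```python
import numpy as np
from scipy.optimize import brentq

LN2 = np.log(2)

def atom_ratio(t, lam_p, lam_d):
    """Daughter/parent atom ratio at time t for a pure parent at t = 0."""
    return lam_p / (lam_d - lam_p) * (1.0 - np.exp(-(lam_d - lam_p) * t))

def age_from_ratio(r, lam_p, lam_d):
    """Invert the ratio equation numerically for the age (years)."""
    return brentq(lambda t: atom_ratio(t, lam_p, lam_d) - r, 1e-6, 1e6)

# Illustrative 234U -> 230Th chronometer (half-lives in years; consult current
# evaluated nuclear data before any real use).
lam_p = LN2 / 245_500.0
lam_d = LN2 / 75_380.0
r, u_r = 8.0e-5, 0.2e-5          # hypothetical measured atom ratio, 1-sigma

t_hat = age_from_ratio(r, lam_p, lam_d)
# First-order propagation: u_t = |dt/dR| * u_R, derivative taken numerically.
dt_dr = (age_from_ratio(r + 1e-8, lam_p, lam_d) - t_hat) / 1e-8
print(f"age = {t_hat:.1f} y, u = {abs(dt_dr) * u_r:.1f} y")
```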

  18. Rotating Torsion Balance Tests of the Weak Equivalence Principle

    NASA Astrophysics Data System (ADS)

    Wagner, Todd A.

    We used a rotating torsion balance to make the most precise laboratory search for equivalence-principle violation. We used a beryllium-aluminum composition dipole to complement our previous measurement with a beryllium-titanium composition dipole. We improved the tilt stability of the apparatus and reduced the temperature-gradient feed-through to improve the uncertainty by 30% compared to our beryllium-titanium result. Using the beryllium-aluminum test bodies, we found η⊕ = (−1.3 ± 1.2) × 10⁻¹³. The combined limits using both test-body pairs generally limit any new equivalence-principle-violating force that couples to ordinary neutral matter. We also measured test bodies with compositions that mimic the difference in composition between the Earth and Moon to provide a model-independent weak equivalence principle limit of η_CD = (1.2 ± 1.1) × 10⁻¹³ for comparison with lunar laser ranging strong equivalence principle measurements. The combined lunar laser ranging and weak equivalence principle measurements limit equivalence-principle violation for gravitational binding energy to ≤ 6 × 10⁻⁴ at 1σ.

  19. Accounting for Calibration Uncertainty in Detectors for High-Energy Astrophysics

    NASA Astrophysics Data System (ADS)

    Xu, Jin

    Systematic instrumental uncertainties in astronomical analyses have generally been ignored in data analysis due to the lack of robust principled methods, though the importance of incorporating instrumental calibration uncertainty is widely recognized by both users and instrument builders. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. Lee et al. (2011) introduced a so-called pragmatic Bayesian method to address this problem. The method is "pragmatic" in that it introduces an ad hoc technique that simplifies computation by assuming that the current data are not useful in narrowing the uncertainty for the calibration product, i.e., that the prior and posterior distributions for the calibration products are the same. In this thesis, we focus on incorporating calibration uncertainty into a principled Bayesian X-ray spectral analysis; specifically, we account for uncertainty in the so-called effective area curve and the photon redistribution matrix. X-ray spectral analysis models the distribution of the energies of X-ray photons emitted from an astronomical source. The effective area curve of an X-ray detector describes its sensitivity as a function of the energy of incoming photons, and the photon redistribution matrix describes the probability distribution of the recorded (discrete) energy of a photon as a function of the true (discretized) energy. Starting with the effective area curve, we follow Lee et al. (2011) and use a principal component analysis (PCA) to efficiently represent the uncertainty. Here, however, we leverage this representation to enable a principled, fully Bayesian method to account for calibration uncertainty in high-energy spectral analysis. For the photon redistribution matrix, we first model each conditional distribution as a normal distribution and then apply PCA to the parameters describing the normal models. This results in an efficient low-dimensional summary of the uncertainty in the redistribution matrix. Our methods for both calibration products are compared with standard analysis techniques and the pragmatic Bayesian method of Lee et al. (2011). The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product; we demonstrate this for the effective area curve. In this way, our fully Bayesian approach can yield more accurate and efficient estimates of the source parameters, as well as valid estimates of their uncertainty. Moreover, the fully Bayesian approach is the only method that allows us to make a valid inference about the effective area curve itself, quantifying which possible curves are most consistent with the data.
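
    A minimal sketch of the PCA representation step, using a synthetic stand-in for the calibration ensemble (real effective-area replicates come from the instrument calibration team): the ensemble is decomposed by SVD, the few components capturing most of the variance are kept, and plausible curves are drawn as the mean plus normally weighted components. Everything numeric here is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for a calibration ensemble: replicate effective-area
# curves on an energy grid.
energy = np.linspace(0.3, 10.0, 300)                        # keV
nominal = 500 * np.exp(-0.5 * ((energy - 1.5) / 2.5) ** 2)  # cm^2, illustrative
curves = nominal * (1 + 0.05 * rng.standard_normal((1000, 1))
                    * np.sin(energy / 3.0))                 # correlated errors

mean = curves.mean(axis=0)
U, s, Vt = np.linalg.svd(curves - mean, full_matrices=False)
k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.99) + 1  # keep 99% variance
# Scale components so unit-normal weights reproduce the sample covariance
comps = (s[:k, None] * Vt[:k]) / np.sqrt(len(curves) - 1)

def sample_curve():
    """Draw one plausible effective-area curve from the PCA representation."""
    return mean + rng.standard_normal(k) @ comps

print(f"{k} components retained; sampled curve at 1.5 keV: "
      f"{sample_curve()[np.searchsorted(energy, 1.5)]:.1f} cm^2")
```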

  20. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack of knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e., the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced-order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
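
    As an illustration of the sparse-surrogate idea, with plain LASSO standing in for the Bayesian compressive sensing used in the work, the sketch below fits a degree-2 polynomial surrogate to a cheap stand-in for an expensive model with many inputs, from fewer runs than basis terms. The stand-in model, dimensions, and regularization strength are all assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)

def expensive_model(x):
    """Stand-in for a costly land-model run; only a few inputs matter."""
    return 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

d, n_train = 20, 60                    # many inputs, few affordable runs
X = rng.uniform(-1, 1, (n_train, d))
y = expensive_model(X)

# Total-degree-2 polynomial basis (a simple stand-in for a PC basis on [-1,1])
basis = PolynomialFeatures(degree=2, include_bias=True)
A = basis.fit_transform(X)             # 60 runs x 231 regressors: underdetermined
surrogate = Lasso(alpha=1e-3, max_iter=50_000).fit(A, y)

X_test = rng.uniform(-1, 1, (1000, d))
err = surrogate.predict(basis.transform(X_test)) - expensive_model(X_test)
nnz = np.sum(np.abs(surrogate.coef_) > 1e-6)
print(f"{nnz} active terms of {A.shape[1]}; test RMSE = {np.sqrt(np.mean(err**2)):.4f}")
```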

  1. Confronting uncertainty in peatland ecohydrology

    NASA Astrophysics Data System (ADS)

    Morris, P. J.; Waddington, J. M.; Baird, A. J.; Belyea, L. R.

    2011-12-01

    Background and Rationale: Peatlands are heavily water-controlled systems; long-term peat accumulation relies on slow organic matter decay in cool, saturated soil conditions. This interdependence of ecological, hydrological and biogeochemical processes makes peatlands prime examples of ecohydrological systems. Peatland ecohydrology exhibits a number of facets of complexity in the form of multiple mutual interdependencies between physical and biological processes and structures. Uncertainty as to the underlying mechanisms that control complex systems arises from a wide variety of sources; in this paper we explore three types of uncertainty in reference to peatland ecohydrology. 1) Parameterization. Analysis of complex systems such as peatlands lends itself naturally to a simulation modelling approach. An obvious source of uncertainty under a modelling approach is that of parameterization. A central theme in modelling studies is often that of sensitivity analysis: parameters to which model behavior is sensitive must be understood with high fidelity; in less sensitive areas of a model a greater level of uncertainty may be tolerated. Using a simple peatland water-budget model we demonstrate the importance of separating uncertainty from sensitivity. Using a Monte Carlo approach to analyze the model's behavior we identify those parameters that are both uncertain and to which the model's behavior is sensitive, and which therefore exhibit the most pressing need for further research. 2) Model structure. A more subtle form of uncertainty surrounds the assumed algorithmic structure of a model. We analyze the behavior of a simple ecohydrological model of long-term peatland development. By sequentially switching different feedbacks on and off we demonstrate that the level of complexity represented in the model is of central importance to the model's behavior, distinct from parameterization. 3) Spatial heterogeneity. We examine the role of horizontal spatial heterogeneity by extending the 1-D model used in section (2) to include a horizontal dimension. The spatially-explicit model simulates the growth of a domed bog over 5,000 years using the same equations, algorithmic structures and parameter values as the one-dimensional model. However, the behavior of the two models' two state variables (peat thickness, central water-table depth) is substantially different. The inclusion of spatial heterogeneity therefore not only leads to the prediction of spatial structures that simply cannot be represented in 1-D models, but also exerts an independent effect on state variables. This finding adds weight to the argument that spatial interactions play a non-trivial role in governing the behaviour of ecohydrological systems, and that failure to take account of spatial heterogeneity may fundamentally undermine models of ecohydrological systems. Synthesis: We demonstrate how exploring and confronting sources of uncertainty in peatland ecohydrology may be used to reduce the complexity of these and other systems, and to identify clearly the most urgent priorities for future observational research.

  2. Optimal uncertainty quantification with model uncertainty and legacy data

    NASA Astrophysics Data System (ADS)

    Kamga, P.-H. T.; Li, B.; McKerns, M.; Nguyen, L. H.; Ortiz, M.; Owhadi, H.; Sullivan, T. J.

    2014-12-01

    We present an optimal uncertainty quantification (OUQ) protocol for systems that are characterized by an existing physics-based model and for which only legacy data is available, i.e., no additional experimental testing of the system is possible. Specifically, the OUQ strategy developed in this work consists of using the legacy data to establish, in a probabilistic sense, the level of error of the model, or modeling error, and to subsequently use the validated model as a basis for the determination of probabilities of outcomes. The quantification of modeling uncertainty specifically establishes, to a specified confidence, the probability that the actual response of the system lies within a certain distance of the model. Once the extent of model uncertainty has been established in this manner, the model can be conveniently used to stand in for the actual or empirical response of the system in order to compute probabilities of outcomes. To this end, we resort to the OUQ reduction theorem of Owhadi et al. (2013) in order to reduce the computation of optimal upper and lower bounds on probabilities of outcomes to a finite-dimensional optimization problem. We illustrate the resulting UQ protocol by means of an application concerned with the response to hypervelocity impact of 6061-T6 Aluminum plates by Nylon 6/6 impactors at impact velocities in the range of 5-7 km/s. The hypervelocity impact application demonstrates the remarkable ability of the legacy-data OUQ protocol to process diverse information on the system and to supply rigorous bounds on system performance under realistic, and less than ideal, scenarios.

  3. Scientific basis for the Precautionary Principle

    SciTech Connect

    Vineis, Paolo. E-mail: p.vineis@imperial.ac.uk

    2005-09-01

    The Precautionary Principle is based on two general criteria: (a) appropriate public action should be taken in response to limited, but plausible and credible, evidence of likely and substantial harm; (b) the burden of proof is shifted from demonstrating the presence of risk to demonstrating the absence of risk. Not much has been written about the scientific basis of the precautionary principle, apart from the uncertainty that characterizes epidemiologic research on chronic disease and the use of surrogate evidence when human evidence cannot be provided. It is proposed in this paper that a new scientific paradigm, based on the theory of evolution, is emerging; this might offer stronger support to the need for precaution in the regulation of environmental risks. Environmental hazards do not consist only in direct attacks on the integrity of DNA or other macromolecules. They can consist in changes that take place as early as in utero and that condition disease risks many years later. Also, environmental exposures can act as 'stressors', inducing hypermutability (the mutator phenotype) as an adaptive response. Finally, environmental changes should be evaluated against the background of a not-so-easily modifiable genetic make-up, inherited from a period in which humans were mainly hunter-gatherers and had dietary habits very different from current ones.

  4. Dosimetric Uncertainties: Magnetic Field Coupling to Peripheral Nerve.

    PubMed

    Kavet, Robert

    2015-12-01

    The International Commission on Non-ionizing Radiation Protection (ICNIRP) and the Institute for Electrical and Electronic Engineers (IEEE) have established magnetic field exposure limits for the general public between 400 Hz (ICNIRP)/759 Hz (IEEE) and 100 kHz to protect against adverse effects associated with peripheral nerve stimulation (PNS). Despite apparent common purpose and similarly stated principles, the two sets of limits diverge between 3.35-100 kHz by a factor of about 7.7 with respect to PNS. To address the basis for this difference and the more general issue of dosimetric uncertainty, this paper combines experimental data of PNS thresholds derived from human subjects exposed to magnetic fields together with published estimates of induced in situ electric field PNS thresholds to evaluate dosimetric relationships of external magnetic fields to induced fields at the threshold of PNS and the uncertainties inherent to such relationships. The analyses indicate that the logarithmic range of magnetic field thresholds constrains the bounds of uncertainty of in situ electric field PNS thresholds and coupling coefficients related to the peripheral nerve (the coupling coefficients define the dosimetric relationship of external field to induced electric field). The general public magnetic field exposure limit adopted by ICNIRP uses a coupling coefficient that falls above the bounds of dosimetric uncertainty, while IEEE's is within the bounds of uncertainty toward the lower end of the distribution. The analyses illustrate that dosimetric estimates can be derived without reliance on computational dosimetry and the associated values of tissue conductivity. With the limits now in place, investigative efforts would be required if a field measurement were to exceed ICNIRP's magnetic field limit (the reference level), even when there is a virtual certainty that the dose limit (the basic restriction) has not been exceeded. The constraints on the range of coupling coefficients described in this paper could facilitate a re-evaluation of ICNIRP and IEEE dose and exposure limits and possibly lead toward harmonization. PMID:26509623

  5. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained. PMID:26688360

  6. [Comparison of three approaches for uncertainty estimation].

    PubMed

    Marini, R D; Chiap, P; Boulanger, B; Rudaz, S; Rozet, E; Crommen, J; Hubert, P

    2006-01-01

    Three different approaches for estimating the measurement uncertainty of the same analytical method were compared, namely validation, robustness and inter-laboratory studies. The uncertainty obtained from the robustness study predicted well the uncertainty of the inter-laboratory study. On the other hand, the uncertainty estimate obtained from the validation study is lower than those obtained with the two other approaches, but is still acceptable as long as the analytical method is used in a single laboratory. PMID:16700155

  7. Regarding Uncertainty in Teachers and Teaching

    ERIC Educational Resources Information Center

    Helsing, Deborah

    2007-01-01

    The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective…

  9. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zo E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty

  10. The Principle of General Tovariance

    NASA Astrophysics Data System (ADS)

    Heunen, C.; Landsman, N. P.; Spitters, B.

    2008-06-01

    We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's interpretation of quantum mechanics, i.e. his doctrine of classical concepts and his principle of complementarity. An appropriate mathematical language for combining these ideas is topos theory, a framework earlier proposed for physics by Isham and collaborators. Our principle of general tovariance states that any mathematical structure appearing in the laws of physics must be definable in an arbitrary topos (with natural numbers object) and must be preserved under so-called geometric morphisms. This principle identifies geometric logic as the mathematical language of physics and restricts the constructions and theorems to those valid in intuitionism: neither Aristotle's principle of the excluded third nor Zermelo's Axiom of Choice may be invoked. Subsequently, our equivalence principle states that any algebra of observables (initially defined in the topos Sets) is empirically equivalent to a commutative one in some other topos.

  11. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  12. Great Lakes Literacy Principles

    NASA Astrophysics Data System (ADS)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  13. Principles of alternative gerontology.

    PubMed

    Bilinski, Tomasz; Bylak, Aneta; Zadrag-Tecza, Renata

    2016-04-01

    Surveys of taxonomic groups of animals have shown that, contrary to the opinion of most gerontologists, aging is not a genuine trait. The process of aging is not universal and its mechanisms have not been widely conserved among species. All life forms are subject to extrinsic and intrinsic destructive forces. Destructive effects of stochastic events are visible only when allowed by the specific life program of an organism. Effective life programs of immortality and high longevity eliminate the impact of unavoidable damage. Organisms that are capable of agametic reproduction are biologically immortal. Mortality of an organism is clearly associated with terminal specialisation in sexual reproduction. The longevity phenotype that is not accompanied by symptoms of senescence has been observed in those groups of animals that continue to increase their body size after reaching sexual maturity. This is the result of the enormous regeneration abilities of both of the above-mentioned groups. Senescence is observed when: (i) an organism, as a matter of its life program, switches off the expression of existing growth and regeneration programs, as in the case of imago formation in insect development; or (ii) particular programs of growth and regeneration of progenitors are irreversibly lost, either partially or in their entirety, as in mammals and birds. PMID:27017907

  14. Principles of Bioremediation Assessment

    NASA Astrophysics Data System (ADS)

    Madsen, E. L.

    2001-12-01

    Although microorganisms have successfully and spontaneously maintained the biosphere since its inception, industrialized societies now produce undesirable chemical compounds at rates that outpace naturally occurring microbial detoxification processes. This presentation provides an overview of both the complexities of contaminated sites and methodological limitations in environmental microbiology that impede the documentation of biodegradation processes in the field. An essential step toward attaining reliable bioremediation technologies is the development of criteria which prove that microorganisms in contaminated field sites are truly active in metabolizing contaminants of interest. These criteria, which rely upon genetic, biochemical, physiological, and ecological principles and apply to both in situ and ex situ bioremediation strategies include: (i) internal conservative tracers; (ii) added conservative tracers; (iii) added radioactive tracers; (iv) added isotopic tracers; (v) stable isotopic fractionation patterns; (vi) detection of intermediary metabolites; (vii) replicated field plots; (viii) microbial metabolic adaptation; (ix) molecular biological indicators; (x) gradients of coreactants and/or products; (xi) in situ rates of respiration; (xii) mass balances of contaminants, coreactants, and products; and (xiii) computer modeling that incorporates transport and reactive stoichiometries of electron donors and acceptors. The ideal goal is achieving a quantitative understanding of the geochemistry, hydrogeology, and physiology of complex real-world systems.

  15. Cryogenic Equivalence Principle Experiment

    NASA Technical Reports Server (NTRS)

    Everitt, C. W. F.; Worden, P. W.

    1985-01-01

    The purpose of this project is to test the equivalence of inertial and passive gravitational mass in an Earth-orbiting satellite. A ground-based experiment is now well developed. It consists of comparing the motions of two cylindrical test masses suspended in precision superconducting magnetic bearings and free to move along the horizontal (axis) direction. The masses are made of niobium and lead-plated aluminum. A position detector based on a SQUID magnetometer measures the differential motion between the masses. The periods of the masses are matched by adjustment of the position detector until the system is insensitive to common-mode signals, so that the experiment is less sensitive to seismic vibration. The apparatus is contained in a twelve-inch helium dewar suspended in a vibration isolation stand. The stand achieves 30 dB isolation from horizontal motions between 0.1 and 60 Hz by simulating the motion of a 200-meter-long pendulum with an air bearing. With this attenuation of seismic noise and a common-mode rejection ratio of 10⁵ in the differential mode, the ground-based apparatus should have a sensitivity to equivalence principle violations of one part in 10¹³; the satellite version might have a sensitivity of one part in 10¹⁷.

  16. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. PMID:21175720
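
    The article's results are analytical; the following simulation sketch merely illustrates the headline effect for the log-normal case: a decisionmaker estimates parameters from n observations and sets the threshold at the fitted 99th percentile, and the realized failure frequency exceeds the nominal 1%. Sample sizes and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Work on the log scale: for a log-normal risk factor, exceedance probabilities
# are unchanged by the monotone exp transform.
mu, sig = 0.0, 1.0        # true parameters (unknown to the decisionmaker)
p_nom = 0.01              # nominal failure probability the threshold should achieve
n, trials = 30, 20_000    # data-set size per trial, number of trials

failures = 0
for _ in range(trials):
    logs = rng.normal(mu, sig, n)
    mu_hat, sig_hat = logs.mean(), logs.std(ddof=1)
    # Threshold set at the fitted distribution's (1 - p_nom) quantile
    threshold = mu_hat + sig_hat * 2.3263      # z_{0.99}
    failures += rng.normal(mu, sig) > threshold  # next-period outcome

print(f"nominal p = {p_nom}, realized frequency = {failures / trials:.4f}")
```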

  17. Error models for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Josset, L.; Scheidt, C.; Lunati, I.

    2012-12-01

    In groundwater modeling, uncertainty in the permeability field leads to a stochastic description of the aquifer system, in which the quantities of interest (e.g., groundwater fluxes or contaminant concentrations) are considered as stochastic variables and described by their probability density functions (PDF) or by a finite number of quantiles. Uncertainty quantification is often evaluated using Monte Carlo simulations, which employ a large number of realizations. As this leads to prohibitive computational costs, techniques have to be developed to keep the problem computationally tractable. The Distance-based Kernel Method (DKM) [1] limits the computational cost of the uncertainty quantification by reducing the stochastic space: first, the realizations are clustered based on the response of a proxy; then, the full model is solved only for a subset of realizations defined by the clustering, and the quantiles are estimated from this limited number of realizations. Here, we present a slightly different strategy that employs an approximate model rather than a proxy: we use the Multiscale Finite Volume method (MsFV) [2,3] to compute an approximate solution for each realization and to obtain a first assessment of the PDF. In this context, DKM is then used to identify a subset of realizations for which the exact model is solved and compared with the solution of the approximate model. This allows highlighting and correcting possible errors introduced by the approximate model, while keeping full statistical information on the ensemble of realizations. Here, we test several strategies to compute the model error, correct the approximate model, and achieve an optimal PDF estimation. We present a case study in which we predict the breakthrough curve of an ideal tracer for an ensemble of realizations generated via Multiple Point Direct Sampling [4] with a training image obtained from a 2D section of the Herten permeability field [5]. [1] C. Scheidt and J. Caers, "Representing spatial uncertainty using distances and kernels", Math. Geosci. (2009). [2] P. Jenny et al., "Multi-scale finite-volume method for elliptic problems in subsurface flow simulation", J. Comput. Phys., 187(1) (2003). [3] I. Lunati and S. H. Lee, "An operator formulation of the multiscale finite-volume method with correction function", Multiscale Model. Simul., 8(1) (2009). [4] G. Mariethoz, P. Renard, and J. Straubhaar, "The Direct Sampling method to perform multiple-point geostatistical simulations", Water Resour. Res., 46 (2010). [5] P. Bayer et al., "Three-dimensional high resolution fluvio-glacial aquifer analog", J. Hydrol., 405 (2011).
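
    A minimal sketch of the DKM selection step described above, with synthetic proxy responses standing in for MsFV runs and a trivial function standing in for the exact flow model: realizations are clustered in proxy space, the full model is run only on cluster medoids, and quantiles are estimated with cluster-size weights. The cluster count, the data, and the k-means/medoid choice are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

n_real = 500
# Proxy response per realization (e.g., a cheap/approximate breakthrough curve
# summarized at a few times); synthetic stand-in here.
latent = rng.normal(size=(n_real, 1))
proxy = np.hstack([latent + 0.1 * rng.normal(size=(n_real, 1)) for _ in range(5)])

def full_model(i):
    """Stand-in for the expensive flow simulation of realization i."""
    return float(latent[i, 0] + 0.05 * rng.normal())

k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(proxy)
quantile_samples, weights = [], []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    # Medoid: the member closest to the cluster centre in proxy space
    medoid = members[np.argmin(
        np.linalg.norm(proxy[members] - km.cluster_centers_[c], axis=1))]
    quantile_samples.append(full_model(medoid))
    weights.append(len(members) / n_real)

order = np.argsort(quantile_samples)
cdf = np.cumsum(np.array(weights)[order])
p10 = np.array(quantile_samples)[order][np.searchsorted(cdf, 0.1)]
print(f"estimated P10 from {k} full-model runs (of {n_real} realizations): {p10:.3f}")
```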

  18. Uncertainties in debris growth predictions

    SciTech Connect

    McKnight, D.S.

    1991-01-10

    The growth of artificial space debris in Earth orbit may pose a significant hazard to satellites in the future, though the collision hazard to operational spacecraft is presently manageable. The stability of the environment is dependent on the growth of debris from satellite deployment, mission operations, and fragmentation events. Growth trends of the trackable on-orbit population are investigated, highlighting the complexities and limitations of using the data that support this modeling. The debris produced by breakup events may be a critical aspect of the present and future environment. As a result, growth predictions produced using existing empirically-based models may have large, possibly even unacceptable, uncertainties.

  19. Congress probes climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Simarski, Lynn Teo

    Policymakers are demanding information about climate change faster than it can be turned out by scientists. This conflict between politics and science was debated at a recent congressional hearing on priorities in global change research. On October 8 and 10, panels of scientists that included AGU president-elect Ralph J. Cicerone of the University of California attempted to identify scientific uncertainties in global warming research before the House Science Committee's Subcommittee on Science.“Decisionmakers provided with incomplete information are left with the problem of choosing among options where the consequences of a wrong choice could be disastrous,” said subcommittee chair Rick Boucher (D-Va.).

  20. The inconstant "principle of constancy".

    PubMed

    Kanzer, M

    1983-01-01

    A review of the principle of constancy, as it appeared in Freud's writings, shows that it was inspired by his clinical observations, first with Breuer in the field of cathartic therapy and then through experiences in the early usage of psychoanalysis. The recognition that memories repressed in the unconscious created increasing tension, and that this was relieved with dischargelike phenomena when the unconscious was made conscious, was the basis for his claim to originality in this area. The two principles of "neuronic inertia" that Freud expounded in the Project (1895) are found to offer the key to the ambiguous definition of the principle of constancy he was to offer in later years. The "original" principle, which sought the complete discharge of energy (or elimination of stimuli), became the forerunner of the death drive; the "extended" principle achieved balances that were relatively constant, but succumbed in the end to complete discharge. This was the predecessor of the life drives. The relation between the constancy and pleasure-unpleasure principles was maintained for twenty-five years largely on an empirical basis which invoked the concept of psychophysical parallelism between "quantity" and "quality." As the links between the two principles were weakened by clinical experiences attendant upon the growth of ego psychology, a revision of the principle of constancy was suggested, and it was renamed the Nirvana principle. Actually, it was shifted from alignment with the "extended" principle of inertia to the original, so that "constancy" was incongruously identified with self-extinction. The former basis for the constancy principle, the extended principle of inertia, became identified with Eros. Only a few commentators seem aware of this radical transformation, which has been overlooked in the Standard Edition of Freud's writings. Physiological biases in the history and conception of the principle of constancy are noted in the Standard Edition. The historical antecedents of the principle of constancy, especially in relation to the teachings and influence of J. F. Herbart (1776-1841), do much to bridge the gap between psychological and neurophysiological aspects of Freud's ideas about constancy and its associated doctrine, psychic determinism. Freud's later teachings about the Nirvana principle and Eros suggest a continuum of "constancies" embodied in the structural and functional development of the mental apparatus as it evolves from primal unity with the environment (e.g., the mother-child unit) and differentiates in patterns that organize the inner and outer worlds in relation to each other. PMID:6681436

  1. Validation of an Experimentally Derived Uncertainty Model

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Cox, D. E.; Balas, G. J.; Juang, J.-N.

    1996-01-01

    The results show that uncertainty models can be obtained directly from system identification data by using a minimum norm model validation approach. The error between the test data and an analytical nominal model is modeled as a combination of unstructured additive and structured input multiplicative uncertainty. Robust controllers which use the experimentally derived uncertainty model show significant stability and performance improvements over controllers designed with assumed ad hoc uncertainty levels. Use of the identified uncertainty model also allowed a strong correlation between design predictions and experimental results.

  2. Visualization of Information Uncertainty: Progress and Challenges

    NASA Astrophysics Data System (ADS)

    Pham, Binh; Streit, Alex; Brown, Ross

    Information uncertainty, which is inherent in many real-world applications, brings more complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2) to review and assess current progress; and (3) to discuss remaining research challenges. We focus on four areas: (1) information uncertainty modelling; (2) visualisation techniques; (3) management of information uncertainty modelling, propagation and visualisation; and (4) the uptake of uncertainty visualisation in application domains.

  3. Forest management under uncertainty for multiple bird population objectives

    USGS Publications Warehouse

    Moore, C.T.; Plummer, W.T.; Conroy, M.J.

    2005-01-01

    We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest-interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management, as sketched below. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
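
    The belief-updating step described above is ordinary Bayesian model weighting. A minimal sketch follows, with invented predictive distributions for two alternative models and a single hypothetical monitoring observation; none of the numbers come from the study.

```python
import numpy as np
from scipy.stats import norm

# Two alternative models of woodpecker response to a management action,
# expressed as predictive distributions (illustrative numbers).
models = {
    "strong response": norm(loc=12.0, scale=3.0),   # predicted population change
    "weak response":   norm(loc=3.0,  scale=3.0),
}
prior = {name: 0.5 for name in models}              # complete uncertainty

observed_change = 9.0                               # one cycle of monitoring data

# Bayes update: posterior weight proportional to prior x likelihood
post = {name: prior[name] * m.pdf(observed_change) for name, m in models.items()}
z = sum(post.values())
for name in models:
    print(f"{name}: prior {prior[name]:.2f} -> posterior {post[name] / z:.2f}")
```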

  4. Toward an uncertainty budget for measuring nanoparticles by AFM

    NASA Astrophysics Data System (ADS)

    Delvallée, A.; Feltin, N.; Ducourtieux, S.; Trabelsi, M.; Hochepied, J. F.

    2016-02-01

    This article reports on the evaluation of an uncertainty budget associated with the measurement of the mean diameter of a nanoparticle (NP) population by Atomic Force Microscopy. The measurement principle consists in measuring the heights of a spherical-like NP population to determine the mean diameter and the size distribution. This method assumes that the NPs are well dispersed on the substrate and isolated enough to avoid measurement errors due to agglomeration. Since the measurement is directly impacted by the substrate roughness, the NPs were deposited on a mica sheet presenting very low roughness. A complete metrological characterization of the instrument has been carried out and the main error sources have been evaluated. The measuring method has been tested on a population of SiO2 NPs. Homemade software has been used to build the height distribution histogram, taking into account only isolated NPs. Finally, the uncertainty budget including the main components has been established for the mean diameter measurement of this NP population. The most important components of this uncertainty budget are the calibration along the Z-axis, the influence of the scanning speed, and the vertical noise level.
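
    In the spirit of the budget described above, the sketch below combines hypothetical standard-uncertainty components (repeatability from the height histogram, plus assumed values for Z-axis calibration, scanning speed, vertical noise, and substrate roughness) in quadrature, as in a GUM-style analysis. None of the component values are from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical AFM height measurements of isolated NPs (nm)
heights = rng.normal(60.0, 4.0, 150)
mean_d = heights.mean()

# Illustrative standard-uncertainty components (nm); real values come from the
# instrument characterization, not from this sketch.
u = {
    "repeatability": heights.std(ddof=1) / np.sqrt(len(heights)),
    "z calibration": 0.60,        # traceability of the Z-axis scale
    "scanning speed": 0.25,       # bias study vs. scan rate
    "vertical noise": 0.15,       # background noise floor
    "substrate roughness": 0.20,
}
u_c = np.sqrt(sum(v**2 for v in u.values()))  # combined standard uncertainty
print(f"mean diameter = {mean_d:.1f} nm, expanded U(k=2) = {2 * u_c:.1f} nm")
```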

  5. Performance of Trajectory Models with Wind Uncertainty

    NASA Technical Reports Server (NTRS)

    Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

    2009-01-01

    Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
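
    A minimal sketch of the spread-based idea, with invented numbers: the standard deviation across time-lagged forecasts of along-track wind serves as the uncertainty estimate for each route segment, and segment errors are combined into an along-track position uncertainty under an independence assumption. Both the independence assumption and all values are illustrative, not the paper's method in detail.

```python
import numpy as np

# Hypothetical along-track wind forecasts (m/s) valid at the same time from
# successively older model cycles (a time-lagged ensemble), per route segment.
lagged_forecasts = np.array([
    [42.0, 44.5, 41.0, 45.5, 43.0],   # segment 1: five lagged members
    [30.0, 28.5, 33.0, 31.5, 29.0],   # segment 2
    [15.0, 18.0, 14.5, 17.0, 16.0],   # segment 3
])
seg_time = np.array([1200.0, 900.0, 1500.0])           # seconds in each segment

wind_mean = lagged_forecasts.mean(axis=1)
wind_spread = lagged_forecasts.std(axis=1, ddof=1)     # uncertainty proxy

# Along-track position error accumulates as (wind error) x (time exposed);
# assume independent segment errors and sum variances.
pos_sigma = np.sqrt(np.sum((wind_spread * seg_time) ** 2))
print(f"mean winds {wind_mean.round(1)} m/s; "
      f"1-sigma along-track position uncertainty ~ {pos_sigma / 1000:.1f} km")
```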

  6. The legal status of uncertainty

    NASA Astrophysics Data System (ADS)

    Ferraris, L.; Miozzo, D.

    2009-09-01

    Civil protection authorities attach great importance to scientific assessment through the widespread use of mathematical models implemented to prevent and mitigate the effects of natural hazards. These models, however, are far from deterministic; moreover, the uncertainty that characterizes them plays an important role in the scheme of prevention of natural hazards. We are, in fact, presently experiencing a detrimental increase in legal actions taken against civil protection authorities who, relying on the forecasts of mathematical models, fail to protect the population. It is our profound concern that civilians have been granted the right to be protected, by any means and to the same extent, both from natural hazards and from the wrongful behaviour of those who should guarantee individual safety. At the same time, however, a dangerous overcriminalization could have a negative impact on the civil protection system, inducing a defensive behaviour which is costly and ineffective. A few case studies are presented in which the role of uncertainty in numerical predictions is made evident and discussed. Scientists thus need to help policymakers agree on sound procedures that recognize the real level of unpredictability. Hence, we suggest the creation of an international and interdisciplinary committee, with the aim of bringing politics, jurisprudence and science into dialogue, to find common solutions to a common problem.

  7. Uncertainty Quantification and Transdimensional Inversion

    NASA Astrophysics Data System (ADS)

    Sambridge, M.; Hawkins, R.

    2014-12-01

    Over recent years, transdimensional inference methods have grown in popularity and found applications in fields ranging from solid Earth geophysics to geochemistry. In all applications of inversion, assumptions are made about the nature of the model parametrisation, complexity, and data noise characteristics, and results can depend significantly on those assumptions. Often these are in the form of fixed choices imposed a priori, e.g., in the grid size of the model or the noise level in the data. A transdimensional approach allows these assumptions to be relaxed by incorporating relevant parameters as unknowns in the inference problem; e.g., the number of model parameters becomes a variable, as do the form of the basis functions and the variance of the data noise. In this way, uncertainty due to parameterisation effects or data noise choices may be incorporated into the inference process. Probabilistic sampling techniques such as birth-death Markov chain Monte Carlo and the reversible jump algorithm allow sampling over complex posterior probability density functions, providing information on constraint, trade-offs, and uncertainty in the unknowns. This talk will present a review of transdimensional inference and its application in geophysical inversion, and highlight some emerging trends such as multi-scale McMC, parallel tempering, and sequential McMC, which hold the promise of further extending the range of problems where these methods are practical.

  8. [The pediatrician, his laboratory and its uncertainty].

    PubMed

    Boulat, Olivier

    2002-12-01

    Four aspects of uncertainty linked to the medical analysis laboratory are discussed. 1) Uncertainty and the laboratory test: the post-test probability of a diagnosis is intimately linked to the ability to establish the pre-test probability clinically. 2) Uncertainty and the patient: reference values are most often taken from the literature; the analytical methodology as well as the reference population should be carefully checked. 3) Uncertainty and the laboratory result: this uncertainty diminishes if preanalytical conditions are standardized and the analytical imprecision is known. The analytical imprecision is given by the coefficient of variation (CV) of the internal quality control, and the CV is used to calculate the critical difference. 4) Uncertainty and the practitioner: to efficiently reduce this uncertainty in the case of unusual questions of medical practice, a network should be established between the practitioner, the medical specialists and the scientific specialists of the laboratory. PMID:12611193
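    For concreteness, the critical difference referred to in point 3 is conventionally computed from the CVs as follows (a standard formula, not quoted from the paper; the example values are invented):

        import math

        def critical_difference(cv_analytical, cv_within_subject, z=1.96):
            """Two-sided 95% critical difference between two successive
            results, in percent of the measured value."""
            return math.sqrt(2.0) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

        # Hypothetical 3% analytical CV and 6% within-subject biological CV:
        print(f"{critical_difference(3.0, 6.0):.1f} %")   # about 18.6 %

    Two successive results must differ by more than this amount before the change can be considered significant rather than attributable to analytical and biological variation.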

  9. Principles of Instructed Language Learning

    ERIC Educational Resources Information Center

    Ellis, Rod

    2005-01-01

    This article represents an attempt to draw together findings from a range of second language acquisition studies in order to formulate a set of general principles for language pedagogy. These principles address such issues as the nature of second language (L2) competence (as formulaic and rule-based knowledge), the contributions of both focus on…

  10. Ideario Educativo (Principles of Education).

    ERIC Educational Resources Information Center

    Consejo Nacional Tecnico de la Educacion (Mexico).

    This document is an English-language abstract (approximately 1,500 words) which discusses an overall educational policy for Mexico based on Constitutional principles and those of humanism. The basic principles that should guide Mexican education as seen by the National Technical Council for Education are the following: (1) love of country; (2)…

  11. Multimedia Principle in Teaching Lessons

    ERIC Educational Resources Information Center

    Kari Jabbour, Khayrazad

    2012-01-01

    Multimedia learning principle occurs when we create mental representations from combining text and relevant graphics into lessons. This article discusses the learning advantages that result from adding multimedia learning principle into instructions; and how to select graphics that support learning. There is a balance that instructional designers…

  12. Principles of Play for Soccer

    ERIC Educational Resources Information Center

    Ouellette, John

    2004-01-01

    Soccer coaches must understand the principles of play if they want to succeed. The principles of play are the rules of action that support the basic objectives of soccer and the foundation of a soccer coaching strategy. They serve as a set of permanent criteria that coaches can use to evaluate the efforts of their team. In this article, the author…

  13. Meaty Principles for Environmental Educators.

    ERIC Educational Resources Information Center

    Rockcastle, V. N.

    1985-01-01

    Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)

  14. Precautionary principle in international law.

    PubMed

    Saladin, C

    2000-01-01

    The deregulatory nature of trade rules frequently brings them into conflict with the precautionary principle. These rules dominate debate over the content and legal status of the precautionary principle at the international level. The World Trade Organization (WTO), because of its power in settling disputes, is a key player. Many States are concerned to define the precautionary principle consistent with WTO rules, which generally means defining it as simply a component of risk analysis. At the same time, many States, especially environmental and public health policymakers, see the principle as the legal basis for preserving domestic and public health measures in the face of deregulatory pressures from the WTO. The precautionary principle has begun to acquire greater content and to move into the operative articles of legally binding international agreements. It is important to continue this trend. PMID:11114120

  15. [Dignity, founding principle of law].

    PubMed

    Mathieu, Bertrand

    2010-09-01

    The principle of dignity made a noted appearance in the legal field on the occasion of the adoption of the first texts concerning bioethics. There is in fact an obvious correlation between the need to provide a framework for certain practices and the principle of human dignity. This recognition, which can be seen in international and European law as much as in national law, is marked by certain ambiguities as to its meaning and its impact. So this principle should be subjected to a legal analysis. From this point of view, it presents three main characteristics, it is a matrix principle, which cannot be waived and it constitutes an objective right. Today, beyond its formal recognition, the effectiveness of the principle of dignity is weakened by a tendency to give prevalence to the requirement of freedom, as a subjective right. Beyond the ideological debate on this issue, it is the protection of the individual that is at stake. PMID:21456303

  16. The uncertainty of local flow parameters during inundation flow over complex topographies with elevation errors

    NASA Astrophysics Data System (ADS)

    Tsubaki, Ryota; Kawahara, Yoshihisa

    2013-04-01

    Summary: Since topographical data obtained from LiDAR (Light Detection and Ranging) measurements are superior in resolution and accuracy to conventional geospatial data, aerial LiDAR has been widely used for obtaining geospatial information over the last decade. However, digital terrain models made from LiDAR data retain some degree of uncertainty as a result of the measurement principles and the operational limitations of LiDAR surveying. LiDAR cannot precisely measure topographical elements such as ground undulation covered by vegetation, curbstones, etc. Such instrumental and physical uncertainties may affect the estimated result of an inundation flow simulation, yet how much, and in what way, these topographical uncertainties affect calculated results is not well understood. To evaluate the effect of topographical uncertainty on the calculated inundation flow, three representative terrains were prepared that included errors in elevation. The introduced topographical uncertainty was generated using a fractal algorithm in order to represent the spatial structure of the elevation uncertainty. Inundation flows over the model terrains were then calculated with an unstructured finite-volume flow model solving the shallow water equations. The sensitivity of the elevation uncertainty on the calculated inundation propagation, especially the local flow velocity, was evaluated. The predictability of inundation flow over complex topography is discussed, as well as its relationship to topographical features.
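    A minimal sketch of one common way to generate such a fractal error surface, via spectral synthesis (the paper's exact algorithm and parameter values are not reproduced here; all numbers are assumptions):

        import numpy as np

        def fractal_error_field(n=256, beta=2.5, sigma=0.15, seed=0):
            """Random field with power spectrum ~ k**(-beta), rescaled to a
            standard deviation of sigma metres, to be added to the DEM."""
            rng = np.random.default_rng(seed)
            kx = np.fft.fftfreq(n)[:, None]
            ky = np.fft.fftfreq(n)[None, :]
            k = np.hypot(kx, ky)
            k[0, 0] = 1.0                      # avoid division by zero at DC
            amplitude = k ** (-beta / 2.0)
            phase = np.exp(2j * np.pi * rng.random((n, n)))
            field = np.fft.ifft2(amplitude * phase).real
            field -= field.mean()
            return sigma * field / field.std()

        errors = fractal_error_field()         # one realization of elevation error
        print(f"std of introduced error: {errors.std():.3f} m")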

  17. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Accounting for uncertainties is necessary to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory consists of a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through a fuzzification process, followed by the main process, known as mapping, where mapping means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.

  18. Application of fuzzy system theory in addressing the presence of uncertainties

    SciTech Connect

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-03

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Accounting for uncertainties is necessary to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory consists of a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through a fuzzification process, followed by the main process, known as mapping, where mapping means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
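    A minimal sketch of the fuzzification / extension-principle / defuzzification chain for a single fuzzy parameter, using alpha-cuts and a deliberately simple monotone response in place of a finite element model (all names and numbers are invented for illustration):

        import numpy as np

        def response(E):
            """Hypothetical monotone response, e.g. a deflection ~ 1/E."""
            return 1.0e6 / E

        def propagate(peak, lo, hi, levels=5):
            """Propagate a triangular fuzzy number through response()."""
            out = []
            for alpha in np.linspace(0.0, 1.0, levels):
                a = lo + alpha * (peak - lo)      # alpha-cut interval bounds
                b = hi - alpha * (hi - peak)
                vals = (response(a), response(b)) # monotone: extremes at ends
                out.append((alpha, min(vals), max(vals)))
            return out

        # Fuzzy Young's modulus: peak 200 GPa on the support [180, 220] GPa.
        for alpha, lo_r, hi_r in propagate(200e9, 180e9, 220e9):
            print(f"alpha={alpha:.2f}: [{lo_r:.3e}, {hi_r:.3e}]")

    The resulting family of intervals is the fuzzy output; a centroid or similar rule then defuzzifies it to a single crisp value.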

  19. An Inconvenient Principle

    NASA Astrophysics Data System (ADS)

    Bellac, Michel Le

    2014-11-01

    At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well-localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle subjected to an electromagnetic field, the Lorentz force. In 1905, Einstein put an end to this dichotomous wave/particle view and launched two revolutions in physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces into mechanics a quantity characteristic of optics, the velocity of light. However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".

  20. Quantum principles and free particles. [evaluation of partitions

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The quantum principles that establish the energy levels and degeneracies needed to evaluate the partition functions are explored. The uncertainty principle is associated with the dual wave-particle nature of the model used to describe quantized gas particles. The Schroedinger wave equation is presented as a generalization of Maxwell's wave equation; the former applies to all particles while the Maxwell equation applies to the special case of photon particles. The size of the quantum cell in phase space and the representation of momentum as a space derivative operator follow from the uncertainty principle. A consequence of this is that steady-state problems that are space-time dependent for the classical model become only space dependent for the quantum model and are often easier to solve. The partition function is derived for quantized free particles and, at normal conditions, the result is the same as that given by the classical phase integral. The quantum corrections that occur at very low temperatures or high densities are derived. These corrections for the Einstein-Bose gas qualitatively describe the condensation effects that occur in liquid helium, but are unimportant for most practical purposes otherwise. However, the corrections for the Fermi-Dirac gas are important because they quantitatively describe the behavior of high-density conduction electron gases in metals and explain the zero point energy and low specific heat exhibited in this case.
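    For reference, the classical-limit result alluded to above is the standard translational partition function for a free particle in a volume V (textbook form, not quoted from the report):

        q_{\mathrm{tr}} \;=\; V \left( \frac{2 \pi m k_{B} T}{h^{2}} \right)^{3/2}

    The Einstein-Bose and Fermi-Dirac corrections modify this result only at very low temperatures or high densities, as the abstract notes.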

  1. Does a String-Particle Dualism Indicate the Uncertainty Principle's Philosophical Dichotomy?

    NASA Astrophysics Data System (ADS)

    Mc Leod, David; Mc Leod, Roger

    2007-04-01

    String theory may allow resonances of neutrino-wave-strings to account for all experimentally detected phenomena. Particle theory logically, and physically, provides an alternate, contradictory dualism. Is it contradictory to symbolically and simultaneously state that λp = h, but, the product of position and momentum must be greater than, or equal to, the same (scaled) Planck's constant? Our previous electron and positron models require `membrane' vibrations of string-linked neutrinos, in closed loops, to behave like traveling waves, Tws, intermittently metamorphosing into alternately ascending and descending standing waves, Sws, between the nodes, which advance sequentially through 360 degrees. Accumulated time passages as Tws detail required ``loop currents'' supplying magnetic moments. Remaining time partitions into the Sws' alternately ascending and descending phases: the physical basis of the experimentally established 3D modes of these ``particles.'' Waves seem to indicate that point mass cannot be required to exist instantaneously at one point; Mott's and Sneddon's Wave Mechanics says that a constant, [mass], is present. String-like resonances may also account for homeopathy's efficacy, dark matter, and constellations' ``stick-figure projections,'' as indicated by some traditional cultures, all possibly involving neutrino strings. To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2007.NES07.C2.5

  2. Heisenberg's uncertainty principle for simultaneous measurement of positive-operator-valued measures

    NASA Astrophysics Data System (ADS)

    Miyadera, Takayuki; Imai, Hideki

    2008-11-01

    A limitation on the simultaneous measurement of two arbitrary positive-operator-valued measures is discussed. In general, simultaneous measurement of two noncommutative observables is only approximately possible. Following Werner’s formulation, we introduce a distance between observables to quantify the accuracy of measurement. We derive an inequality that relates the achievable accuracy to the noncommutativity between the two observables. As a byproduct, a necessary condition for two positive-operator-valued measures to be simultaneously measurable is obtained.

  3. Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.

    ERIC Educational Resources Information Center

    Wassermann, Selma

    2001-01-01

    Argues that reliance on the outcome of quantitative standardized tests to assess student performance is a misplaced quest for certainty in an uncertain world. Reviews and lauds a Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill. (PKP)

  4. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  5. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of the current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  6. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine the precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. A treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated; often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified, and the effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
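    The propagation step mentioned at the outset is conventionally the first-order law of propagation of uncertainty; stated in LaTeX for a result y = f(x_1, ..., x_n) with correlated inputs (standard formula, not quoted from the paper):

        u_{c}^{2}(y) \;=\; \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_{i}} \right)^{2} u^{2}(x_{i})
        \;+\; 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \frac{\partial f}{\partial x_{i}} \frac{\partial f}{\partial x_{j}} \, u(x_{i}, x_{j})

    The covariance terms u(x_i, x_j) are exactly what the treatment of correlated measurement precision error developed in the paper supplies.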

  7. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
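    A minimal sketch of the linear analysis step (first-order propagation of a posterior parameter covariance into a prediction; the sensitivities and covariance below are invented, not the Yucca Mountain values):

        import numpy as np

        y = np.array([0.8, -0.3, 0.1])      # d(prediction)/d(parameter) row
        C = np.diag([0.04, 0.09, 0.25])     # posterior parameter covariance
        var_pred = y @ C @ y
        print(f"prediction standard deviation: {np.sqrt(var_pred):.3f}")

    Repeating the calculation with one parameter's variance reduced shows that parameter's contribution to the predictive uncertainty, which is also the basis for ranking the worth of existing and yet-to-be-gathered observations.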

  8. The Role of Uncertainty in Climate Science

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  9. On Assessing Uncertainty, Variability and Feedback

    NASA Astrophysics Data System (ADS)

    Tsanis, I. K.

    2009-04-01

    Environmental projects consist of three phases: (a) a fast-track phase, (b) an enrichment phase and (c) a synthesis phase. The fast-track phase aims at providing quantitative input to the scenario-building process and gaining experience with the data flows between the work packages. The enrichment phase, which also includes the scenario-building process, provides new and more detailed trends of the enriched storylines. Sources of feedback information are discussed. This paper studies assessment methods for (a) the variability and (b) the uncertainty of data to be used for scenario building and modelling, and (c) end-user feedback information regarding preliminary data and model output. Some of the major sources of uncertainty in environmental variables related to water resources and hydrology science are described. A classification of uncertainty is presented according to the empirical quality of data and its sources of uncertainty. Since there is a distinction between attributes of objects depending on their variability in space and time, there is a need to consider spatial and temporal dependence (autocorrelation) alongside the uncertainty models at individual locations. In addition, uncertainties in environmental data combine with modelling uncertainties, leading to uncertain model predictions; that is, uncertainties propagate through models. Sources of modelling uncertainty are discussed, namely (a) uncertainties in input data (measurement, interpolation/extrapolation and re-scaling errors) and (b) uncertainties in models: in the model structure (conceptual or logical uncertainties), in the model parameters and in the solution of the model. Finally, examples of uncertainty in the variable of measured point and estimated areal precipitation are presented.

  10. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.

  11. Induction of models under uncertainty

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter

    1986-01-01

    This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; particularly, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one aimed at keeping the theory structure as simple as possible, while still reflecting any structure discovered in the data.
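    A minimal sketch of the decision rule (Bayes' theorem over two competing class hypotheses; the hypotheses, priors and data are invented for illustration):

        import numpy as np

        # P(feature present | hypothesis) under two candidate theories.
        likelihoods = {"one_class": 0.5, "two_classes": 0.9}
        priors = {"one_class": 0.7, "two_classes": 0.3}  # simplicity bias

        data = [1, 1, 0, 1, 1, 1]       # observed feature occurrences

        def posterior(priors, likelihoods, data):
            post = {}
            for h, prior in priors.items():
                q = likelihoods[h]
                like = np.prod([q if d else 1.0 - q for d in data])
                post[h] = prior * like
            total = sum(post.values())
            return {h: v / total for h, v in post.items()}

        print(posterior(priors, likelihoods, data))

    The prior here plays exactly the pragmatic role described above: it penalizes the more elaborate theory unless the data earn it.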

  12. Groundwater Optimal Management Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Karatzas, George P.

    One of the latest developments dealing with noisy or incomplete data in mathematical programming is the robust optimization approach. This approach is based on a scenario-based description of the data and yields a solution that is less sensitive to realizations of the data under the different scenarios. The objective function considers the violations of the constraints under each scenario and incorporates them into the formulation through penalty `weights'. In the area of groundwater management, the robust optimization approach has been used to incorporate uncertainty into the model by considering a multiple-scenario description of the hydraulic conductivity field. The focus of the present study is to determine an effective methodology for selecting the scenarios as well as the `weights' in the most effective manner.
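    A minimal sketch of such a scenario-weighted objective (the scenario set, penalty weight and units are invented for illustration):

        import numpy as np

        def robust_objective(pump_rate, scenarios, weight=50.0):
            """scenarios: (probability, max safe rate) pairs, one per
            hydraulic-conductivity realization."""
            cost = -pump_rate              # maximize supply = minimize -rate
            penalty = sum(p * max(0.0, pump_rate - cap) for p, cap in scenarios)
            return cost + weight * penalty

        scenarios = [(0.5, 120.0), (0.3, 100.0), (0.2, 80.0)]
        rates = np.linspace(50.0, 150.0, 201)
        best = min(rates, key=lambda q: robust_objective(q, scenarios))
        print(f"robust pumping rate: {best:.1f} m3/d")

    Raising the penalty weight pushes the solution toward the most pessimistic scenario; lowering it recovers the expected-value design.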

  13. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker

  14. Self as the feminine principle.

    PubMed

    Weisstub, E B

    1997-07-01

    In analytical psychology, ego is associated with consciousness and the masculine principle. Although the feminine principle generally characterizes the unconscious, it was not assigned a psychic structure equivalent to the ego. This paper proposes a model of the psyche where self and ego are the major modes of psychic experience. The self as the 'being' mode represents the feminine principle and functions according to primary process; the ego represents 'doing', the masculine principle and secondary process. Feminine and masculine principles are considered to be of equal significance in both men and women and are not limited to gender. Jung's concept of the self is related to the Hindu metaphysical concepts of Atman and Brahman, whose source was the older Aryan nature-oriented, pagan religion. The prominence of self in analytical psychology and its predominantly 'feminine' symbolism can be understood as Jung's reaction to the psychoanalytic emphasis on ego and to Freud's 'patriarchal' orientation. In Kabbalah, a similar development took place when the feminine principle of the Shekinah emerged in a central, redemptive role, as a mythic compensation to the overtly patriarchal Judaic religion. In the proposed model of the psyche neither ego nor self represents the psychic totality. The interplay of both psychic modes/principles constitutes the psyche and the individuation process. PMID:9246929

  15. Solving navigational uncertainty using grid cells on robots.

    PubMed

    Milford, Michael J; Wiles, Janet; Wyeth, Gordon F

    2010-01-01

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our study to be a starting point for animal experiments that test navigation in perceptually ambiguous environments. PMID:21085643

  16. A Bayesian Foundation for Individual Learning Under Uncertainty

    PubMed Central

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J.; Stephan, Klaas E.

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory. PMID:21629826

  17. A bayesian foundation for individual learning under uncertainty.

    PubMed

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J; Stephan, Klaas E

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory. PMID:21629826

  18. Concepts and Practice of Verification, Validation, and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Oberkampf, W. L.

    2014-12-01

    Verification and validation (V&V) are the primary means to assess the numerical and physics modeling accuracy, respectively, in computational simulation. Code verification assesses the reliability of the software coding and the numerical algorithms used in obtaining a solution, while solution verification addresses numerical error estimation of the computational solution of a mathematical model for a specified set of initial and boundary conditions. Validation assesses the accuracy of the mathematical model as compared to experimentally measured response quantities of the system being modeled. As these experimental data are typically available only for simplified subsystems or components of the system, model validation commonly provides limited ability to assess model accuracy directly. Uncertainty quantification (UQ), specifically in regard to predictive capability of a mathematical model, attempts to characterize and estimate the total uncertainty for conditions where no experimental data are available. Specific sources of uncertainty that can impact the total predictive uncertainty are: the assumptions and approximations in the formulation of the mathematical model, the error incurred in the numerical solution of the discretized model, the information available for stochastic input data for the system, and the extrapolation of the mathematical model to conditions where no experimental data are available. This presentation will briefly discuss the principles and practices of VVUQ from both the perspective of computational modeling and simulation, as well as the difficult issue of estimating predictive capability. Contrasts will be drawn between weak and strong code verification testing, and model validation as opposed to model calibration. Closing remarks will address what needs to be done to improve the value of information generated by computational simulation for improved decision-making.

  19. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for comparison of the measurement accuracy of existing or newly developed PIV interrogation algorithms. The database is publicly available on the website www.piv.de/uncertainty.

  20. The principle of pooled calibrations and outlier retainment elucidates optimum performance of ion chromatography.

    PubMed

    Andersen, Jens E T; Mikolajczak, Maria; Wojtachnio-Zawada, Katarzyna Olga; Nicolajsen, Henrik Vigan

    2012-11-01

    A principle for quality assurance of ion chromatography (IC) is presented. Since the majority of scientists and customers are interested in the determination of the true amount of analyte in real samples, the focus of attention should be directed towards the concept of accuracy rather than focussing on precision. By exploiting the principle of pooled calibrations and retainment of all outliers, it was possible to obtain full correspondence between calibration uncertainty and repetition uncertainty, which for the first time evidences statistical control in experiments with ion chromatography. Anions of bromide were analysed and the results were subjected to quality assurance (QA). It was found that the limit of quantification (LOQ) was significantly underestimated, by up to a factor of 30, with respect to the determination of concentrations of unknowns. The concepts of a lower limit of analysis (LLA) and an upper limit of analysis (ULA) were found to provide more acceptable limits for reliable analysis with a limited number of repetitions. An excellent correspondence was found between calibration uncertainty and repetition uncertainty. These findings comply with earlier investigations of method validations, where it was found that the principle of pooled calibrations provides a more realistic picture of the analytical performance, with the drawback, however, that generally higher levels of uncertainty must be accepted compared to contemporary literature values. The implications for analytical chemistry in general, and for method validations in particular, are discussed. PMID:23040989
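    A minimal sketch of the pooled-calibration computation (pool the raw points of several runs, retain apparent outliers, fit once, and take the uncertainty of an unknown from the pooled scatter); the data are simulated, not the paper's bromide measurements:

        import numpy as np

        rng = np.random.default_rng(2)
        conc = np.tile([1.0, 2.0, 5.0, 10.0, 20.0], 4)        # 4 pooled runs
        signal = 3.0 * conc + 0.5 + rng.normal(0.0, 0.6, conc.size)

        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        s_pooled = np.sqrt(np.sum(resid**2) / (conc.size - 2))

        y_unknown = 31.0
        x_hat = (y_unknown - intercept) / slope
        u_x = s_pooled / slope       # simplified first-order uncertainty
        print(f"x = {x_hat:.2f} +/- {u_x:.2f}")

    Because the fit pools the run-to-run scatter, the resulting uncertainty reflects repetition conditions rather than the optimistic within-run scatter of a single calibration.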

  1. Interpretation of the extreme physical information principle in terms of shift information

    SciTech Connect

    Vstovsky, G.V. )

    1995-02-01

    It is shown that Fisher information (FI) can be considered as a limiting case of a related form of Kullback information, the shift information (SI). The compatibility of the use of SI with a basic physical principle of uncertainty is demonstrated. The scope of FI-based theory is extended to the nonlinear Klein-Gordon equation.

  2. Fine-grained uncertainty relation and biased nonlocal games in bipartite and tripartite systems

    NASA Astrophysics Data System (ADS)

    Dey, Ansuman; Pramanik, T.; Majumdar, A. S.

    2013-01-01

    The fine-grained uncertainty relation can be used to discriminate among classical, quantum, and superquantum correlations based on their strength of nonlocality, as has been shown for bipartite and tripartite systems with unbiased measurement settings. Here we consider the situations when two and three parties choose settings with bias for playing certain nonlocal games. We show analytically that while the fine-grained uncertainty principle is still able to distinguish classical, quantum, and superquantum correlations for biased settings corresponding to certain ranges of the biasing parameters, the above-mentioned discrimination is not manifested for all biasing.

  3. Damage assessment of the truss system with uncertainty using frequency response function based damage identification method

    NASA Astrophysics Data System (ADS)

    Zhao, Jie; DeSmidt, Hans; Yao, Wei

    2015-04-01

    A novel vibration-based damage identification methodology for truss systems with mass and stiffness uncertainties is proposed and demonstrated. The approach utilizes the damage-induced changes of frequency response functions (FRFs) to assess the severity and location of structural damage in the system. The damage identification algorithm is developed based on least-squares and Newton-Raphson methods. The dynamical model of the system is built using the finite element method and the Lagrange principle, while the crack model is based on fracture mechanics. The method is demonstrated via numerical examples for a truss system to show its effectiveness in detecting damage when both stiffness and mass uncertainties exist in the system.
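    A minimal sketch of the identification loop on a single-degree-of-freedom surrogate (damage modelled as a stiffness reduction, estimated from FRF residuals by a damped Newton-type least-squares iteration; the paper's truss finite element and crack models are not reproduced):

        import numpy as np

        m, k0, c = 1.0, 100.0, 2.0
        omega = np.linspace(5.0, 15.0, 40)

        def frf(damage):
            k = k0 * (1.0 - damage)          # damage reduces stiffness
            return 1.0 / (k - m * omega**2 + 1j * c * omega)

        measured = frf(0.15)                 # synthetic "damaged" measurement

        d = 0.0
        for _ in range(30):
            r = np.concatenate([(frf(d) - measured).real,
                                (frf(d) - measured).imag])
            eps = 1e-6                       # finite-difference sensitivity
            dfrf = (frf(d + eps) - frf(d)) / eps
            J = np.concatenate([dfrf.real, dfrf.imag])
            d = np.clip(d - 0.5 * (J @ r) / (J @ J), 0.0, 0.8)  # damped step
        print(f"identified damage: {d:.4f}")   # approaches the true 0.15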

  4. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of the probability of interruption (PI) and thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with the PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays, and (2) epistemic (state-of-knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated-variable dependence in the equation for PI.
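    A minimal sketch of the aleatory sampling (detection probabilities and delays drawn from distributions, PI estimated as the fraction of trials in which the response force arrives before the adversary finishes); the layer structure and all distribution parameters are invented:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Three protection layers: Beta-distributed detection probabilities
        # and lognormal delay times (seconds).
        p_det = rng.beta([8.0, 5.0, 6.0], [2.0, 5.0, 4.0], size=(n, 3))
        delays = rng.lognormal(mean=[3.0, 3.4, 3.2], sigma=0.3, size=(n, 3))
        response_time = rng.normal(90.0, 15.0, size=n)

        detected = rng.random((n, 3)) < p_det
        any_det = detected.any(axis=1)
        first = np.argmax(detected, axis=1)    # first layer that detects
        # Task time the adversary still needs from that layer onward.
        tail = np.cumsum(delays[:, ::-1], axis=1)[:, ::-1]
        remaining = np.take_along_axis(tail, first[:, None], axis=1)[:, 0]

        pi = np.mean(any_det & (response_time < remaining))
        print(f"PI ~ {pi:.3f}")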

  5. The uncertainty of the half-life

    NASA Astrophysics Data System (ADS)

    Pommé, S.

    2015-06-01

    Half-life measurements of radionuclides are undeservedly perceived as ‘easy’ and the experimental uncertainties are commonly underestimated. Data evaluators, scanning the literature, are faced with bad documentation, lack of traceability, incomplete uncertainty budgets and discrepant results. Poor control of uncertainties has implications for the end-user community, ranging from limitations on the accuracy and reliability of nuclear-based analytical techniques to the fundamental question of whether half-lives are invariable or not. This paper addresses some of these issues from the viewpoints of the user community and of the decay data provider. It addresses the propagation of the half-life uncertainty in activity measurements and discusses different types of half-life measurements, typical parameters influencing their uncertainty, a tool to propagate the uncertainties and suggestions for a more complete reporting style. Problems and solutions are illustrated with striking examples from the literature.
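    The propagation referred to above follows directly from the decay law A(t) = A_0 2^{-t/T_{1/2}}; differentiating with respect to the half-life gives (a standard result, stated here in LaTeX):

        \frac{u(A)}{A} \;=\; \frac{t \ln 2}{T_{1/2}^{2}} \, u\!\left(T_{1/2}\right)

    so the relative uncertainty of a decay-corrected activity grows linearly with the elapsed time t, which is why long decay corrections put such a premium on well-characterized half-lives.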

  6. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
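    A minimal sketch of the probability-box representation (bounds on the payoff CDF translate into interval bounds on the expected payoff; the grid and CDF bounds are invented for illustration):

        import numpy as np

        payoff = np.linspace(0.0, 10.0, 101)
        # Hypothetical p-box: the unknown payoff CDF lies between these.
        cdf_lo = np.clip((payoff - 3.0) / 5.0, 0.0, 1.0)  # stochastically larger
        cdf_hi = np.clip((payoff - 1.0) / 5.0, 0.0, 1.0)  # stochastically smaller

        def expectation_from_cdf(x, F):
            """E[X] = x_min + integral of (1 - F) over the support."""
            s = 1.0 - F
            return x[0] + float(np.sum(np.diff(x) * (s[:-1] + s[1:]) / 2.0))

        e_hi = expectation_from_cdf(payoff, cdf_lo)   # upper expectation
        e_lo = expectation_from_cdf(payoff, cdf_hi)   # lower expectation
        print(f"expected payoff in [{e_lo:.2f}, {e_hi:.2f}]")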

  7. Incorporating Forecast Uncertainty in Utility Control Center

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in the existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty, such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near-real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for the transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).

  8. Optimality principles for the visual code

    NASA Astrophysics Data System (ADS)

    Pitkow, Xaq

    One way to try to make sense of the complexities of our visual system is to hypothesize that evolution has developed nearly optimal solutions to the problems organisms face in the environment. In this thesis, we study two such principles of optimality for the visual code. In the first half of this dissertation, we consider the principle of decorrelation. Influential theories assert that the center-surround receptive fields of retinal neurons remove spatial correlations present in the visual world. It has been proposed that this decorrelation serves to maximize information transmission to the brain by avoiding transfer of redundant information through optic nerve fibers of limited capacity. While these theories successfully account for several aspects of visual perception, the notion that the outputs of the retina are less correlated than its inputs has never been directly tested at the site of the putative information bottleneck, the optic nerve. We presented visual stimuli with naturalistic image correlations to the salamander retina while recording responses of many retinal ganglion cells using a microelectrode array. The output signals of ganglion cells are indeed decorrelated compared to the visual input, but the receptive fields are only partly responsible. Much of the decorrelation is due to the nonlinear processing by neurons rather than the linear receptive fields. This form of decorrelation dramatically limits information transmission. Instead of improving coding efficiency we show that the nonlinearity is well suited to enable a combinatorial code or to signal robust stimulus features. In the second half of this dissertation, we develop an ideal observer model for the task of discriminating between two small stimuli which move along an unknown retinal trajectory induced by fixational eye movements. The ideal observer is provided with the responses of a model retina and guesses the stimulus identity based on the maximum likelihood rule, which involves sums over all random walk trajectories. These sums can be implemented in a biologically plausible way. The necessary ingredients are: neurons modeled as a cascade of a linear filter followed by a static nonlinearity, a recurrent network with additive and multiplicative interactions between neurons, and divisive global inhibition. This architecture implements Bayesian inference by representing likelihoods as neural activity which can then diffuse through the recurrent network and modulate the influence of later information. We also develop approximation methods for characterizing the performance of the ideal observer. We find that the effect of positional uncertainty is essentially to slow the acquisition of signal. The time scaling is related to the size of the uncertainty region, which is in turn related to both the signal strength and the statistics of the fixational eye movements. These results imply that localization cues should determine the slope of the performance curve in time.

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, a methodology developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model, along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
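
    The validation step described above can be sketched in a few lines: draw samples from a hypothetical elicited distribution and push them through a simplified deposition model. Everything below (the lognormal parameters, the exponential washout form, the rain duration) is illustrative and not the study's actual parameterization.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical lognormal elicited distribution for a washout coefficient
        # (parameters are illustrative, not the study's elicited values).
        lam = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=10_000)  # 1/s

        rain_duration = 3600.0  # seconds of plume passage through rain (assumed)

        # Simple exponential washout: fraction of airborne material deposited.
        frac_deposited = 1.0 - np.exp(-lam * rain_duration)

        # Percentiles of the propagated output distribution, which can then be
        # compared with the aggregated elicited distribution for the output.
        print(np.percentile(frac_deposited, [5, 50, 95]))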

  10. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is increasingly becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the total dose actually delivered to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: The method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients, combined with evaluation of the sensitivity of the registration metric with respect to variations of the coefficients. It was evaluated on patient data deformed according to a breathing model, for which the ground truth of the deformation, and hence the actual dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. The method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
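
    The underlying intuition, that mapped-dose error is largest where registration uncertainty meets a steep dose gradient, can be illustrated with a first-order one-dimensional sketch. This is a hypothetical simplification, not the authors' b-spline coefficient sensitivity evaluation; the dose profile, the uncertainty map, and the 2 Gy flagging threshold are all assumed.

        import numpy as np

        x = np.linspace(0.0, 100.0, 501)            # mm, voxel positions
        dose = 60.0 / (1.0 + np.exp(-(x - 50.0)))   # toy sigmoid dose falloff, Gy

        # Hypothetical spatially varying registration uncertainty (mm), e.g.
        # larger in low-contrast regions where the metric is insensitive.
        sigma_x = 1.0 + 2.0 * np.exp(-((x - 55.0) / 10.0) ** 2)

        grad = np.gradient(dose, x)                 # dose gradient, Gy/mm
        dose_error_est = np.abs(grad) * sigma_x     # first-order mapped-dose error, Gy

        # Flag voxels bearing a significant risk of inaccurate dose accumulation.
        risky = dose_error_est > 2.0
        print(f"{risky.sum()} of {x.size} voxels flagged")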

  11. Uncertainty Analysis in Environmental Modeling Made Easy

    NASA Astrophysics Data System (ADS)

    Pappenberger, Florian; Harvey, Hamish; Beven, Keith; Hall, Jim

    2007-01-01

    Uncertainty analysis assesses the uncertainty in numerical model outputs that arises from ambiguity in model structures, parameters, boundary conditions, and evaluation data. An analysis of the impact of uncertainties should be undertaken in every environmental modeling exercise. Many techniques exist, however, and each requires an investment of time and resources to learn. The potential analyst, faced with the difficult question of which technique is best to use, may simply be put off.

  12. Notes on the effect of dose uncertainty

    SciTech Connect

    Morris, M.D.

    1987-01-01

    The apparent dose-response relationship between amount of exposure to acute radiation and level of mortality in humans is affected by uncertainties in the dose values. It is apparent that one of the greatest concerns regarding the human data from Hiroshima and Nagasaki is the unexpectedly shallow slope of the dose response curve. This may be partially explained by uncertainty in the dose estimates. Some potential effects of dose uncertainty on the apparent dose-response relationship are demonstrated.

  13. Get Provoked: Applying Tilden's Principles.

    ERIC Educational Resources Information Center

    Shively, Carol A.

    1995-01-01

    This address given to the Division of Interpretation, Yellowstone National Park, Interpretive Training, June 1993, examines successes and failures in interpretive programs for adults and children in light of Tilden's principles. (LZ)

  14. Updated uncertainty budgets for NIST thermocouple calibrations

    NASA Astrophysics Data System (ADS)

    Meyer, C. W.; Garrity, K. M.

    2013-09-01

    We have recently updated the uncertainty budgets for calibrations in the NIST Thermocouple Calibration Laboratory. The purpose for the updates has been to 1) revise the estimated values of the relevant uncertainty elements to reflect the current calibration facilities and methods, 2) provide uncertainty budgets for every standard calibration service offered, and 3) make the uncertainty budgets more understandable to customers by expressing all uncertainties in units of temperature (°C) rather than emf. We have updated the uncertainty budgets for fixed-point calibrations of type S, R, and B thermocouples and comparison calibrations of type R and S thermocouples using a type S reference standard. In addition, we have constructed new uncertainty budgets for comparison calibrations of type B thermocouples using a type B reference standard as well as using both a type S and type B reference standard (for calibration over a larger range). We have updated the uncertainty budgets for comparison calibrations of base-metal thermocouples using a type S reference standard and alternately using a standard platinum resistance thermometer reference standard. Finally, we have constructed new uncertainty budgets for comparison tests of noble-metal and base-metal thermoelements using a type S reference standard. A description of these updates is presented in this paper.
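
    Expressing an emf uncertainty in units of temperature, as the updated budgets do, amounts to dividing by the thermocouple's sensitivity (Seebeck coefficient) at the temperature of interest. A minimal sketch with illustrative numbers, not NIST budget values:

        # Convert an emf standard uncertainty into units of temperature by
        # dividing by the sensitivity S = dE/dT at the temperature of interest.
        u_emf = 0.4   # standard uncertainty of measured emf, microvolts (assumed)
        S = 11.4      # approximate type S sensitivity near 1000 C, microvolts per C

        u_T = u_emf / S
        print(f"u(T) = {u_T:.3f} C")   # about 0.035 C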

  15. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the reference, are meant to supplement the presentation given at this conference.

  16. Equivalence principle and gravitational redshift.

    PubMed

    Hohensee, Michael A; Chu, Steven; Peters, Achim; Müller, Holger

    2011-04-15

    We investigate leading order deviations from general relativity that violate the Einstein equivalence principle in the gravitational standard model extension. We show that redshift experiments based on matter waves and clock comparisons are equivalent to one another. Consideration of torsion balance tests, along with matter-wave, microwave, optical, and Mössbauer clock tests, yields comprehensive limits on spin-independent Einstein equivalence principle-violating standard model extension terms at the 10(-6) level. PMID:21568541

  17. Equivalence Principle and Gravitational Redshift

    SciTech Connect

    Hohensee, Michael A.; Chu, Steven; Mueller, Holger; Peters, Achim

    2011-04-15

    We investigate leading order deviations from general relativity that violate the Einstein equivalence principle in the gravitational standard model extension. We show that redshift experiments based on matter waves and clock comparisons are equivalent to one another. Consideration of torsion balance tests, along with matter-wave, microwave, optical, and Mössbauer clock tests, yields comprehensive limits on spin-independent Einstein equivalence principle-violating standard model extension terms at the 10^-6 level.

  18. Testing the strong equivalence principle by radio ranging

    NASA Technical Reports Server (NTRS)

    Canuto, V. M.; Goldman, I.; Shapiro, I. I.

    1984-01-01

    Planetary range data offer the most promising means to test the validity of the Strong Equivalence Principle (SEP). Analytical expressions for the perturbation in the 'range' expected from an SEP violation predicted by the 'variation-of-G' method and by the 'two-times' approach are derived and compared. The dominant term in both expressions is quadratic in time. Analysis of existing range data should allow a determination of the coefficient of this term with a one-standard-deviation uncertainty of about 1 part in 100 billion/yr.

  19. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates has recently been published, wherein the experimentally obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that reproduces the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered. Lastly, we show that while the capability of the multispecies measurement presents a step change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data on global combustion properties are still necessary to predict fuel combustion.
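
    A minimal sketch of the spectral idea behind such methods, under stated assumptions: a single uncertain, normalized rate parameter is treated as a standard normal germ, the response is expanded in probabilists' Hermite polynomials fitted by regression, and the mean and variance are read directly off the coefficients. The toy model below is a hypothetical stand-in for a kinetic simulation, not the paper's model.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermevander

        rng = np.random.default_rng(1)

        def toy_model(xi):
            # Hypothetical stand-in for a combustion observable driven by one
            # normalized uncertain rate parameter xi ~ N(0, 1).
            return np.log(1.0 + np.exp(0.8 * xi + 0.1 * xi**2))

        order = 4
        xi = rng.standard_normal(2000)
        y = toy_model(xi)

        # Regression fit of probabilists' Hermite coefficients c_0..c_order.
        V = hermevander(xi, order)
        c, *_ = np.linalg.lstsq(V, y, rcond=None)

        # E[He_n(xi)^2] = n!, so mean and variance follow from the coefficients.
        mean = c[0]
        var = sum(c[n]**2 * factorial(n) for n in range(1, order + 1))
        print(mean, var)    # compare against y.mean(), y.var()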

  20. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (ESTSC)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some processes for which input is uncertain and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using the Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample and the user-supplied process model analyses the sample. The SUNS post processor displays statistical results from any existing file that contains sampled input and output values.
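
    The loose-coupling pattern described here, generate a statistical sample, let the user's process model consume it, then post-process the sampled inputs and outputs, can be sketched as follows. The process model and its parameter distributions are stand-ins for the user-supplied code, not part of SUNS itself.

        import numpy as np

        rng = np.random.default_rng(42)

        # 1. Generate the required statistical sample of uncertain inputs.
        n = 1000
        inputs = {
            "k": rng.normal(0.30, 0.03, n),      # assumed uncertain rate constant
            "T": rng.uniform(280.0, 300.0, n),   # assumed uncertain temperature
        }

        # 2. The user-supplied process model analyses the sample (in SUNS this
        #    would be an external code reading the generated sample file).
        def process_model(k, T):
            return k * np.exp(-1500.0 / T)

        outputs = process_model(inputs["k"], inputs["T"])

        # 3. Post-process: display statistics from sampled inputs and outputs.
        print("mean", outputs.mean(), "std", outputs.std())
        print("5/50/95 percentiles:", np.percentile(outputs, [5, 50, 95]))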

  1. Uncertainty reasoning in expert systems

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik

    1993-01-01

    Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
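
    The three choices the abstract lists, representations for words like 'big' and 'small', operations for 'and'/'or', and a method to turn fuzzy recommendations into a precise control, appear explicitly in even the smallest fuzzy controller. A sketch using one conventional set of choices (triangular memberships, minimum for 'and', centroid defuzzification); all shapes and thresholds are assumed for illustration.

        import numpy as np

        def tri(x, a, b, c):
            # Triangular membership function rising on [a, b], falling on [b, c].
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        v, d = 80.0, 15.0                        # velocity (km/h) and distance (m)

        # (1) Representations for 'big velocity' and 'small distance' (assumed).
        mu_big_v = tri(v, 40.0, 100.0, 160.0)
        mu_small_d = tri(d, -1.0, 0.0, 50.0)     # left edge below 0 acts as a shoulder

        # (2) 'and' as minimum, one common choice among the possible t-norms.
        firing = min(mu_big_v, mu_small_d)

        # (3) Defuzzify: clip the 'brake hard' output set and take its centroid.
        brake = np.linspace(0.0, 1.0, 101)
        mu_out = np.minimum(tri(brake, 0.5, 1.0, 1.5), firing)
        command = np.sum(brake * mu_out) / np.sum(mu_out)
        print(f"brake command: {command:.2f}")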

  2. Estimating epistemic and aleatory uncertainties during hydrologic modeling: An information theoretic approach

    NASA Astrophysics Data System (ADS)

    Gong, Wei; Gupta, Hoshin V.; Yang, Dawen; Sricharan, Kumar; Hero, Alfred O.

    2013-04-01

    With growing interest in understanding the magnitudes and sources of uncertainty in hydrological modeling, the difficult problem of characterizing model structure adequacy is now attracting considerable attention. Here, we examine this problem via a model-structure-independent approach based in information theory. In particular, we (a) discuss how to assess and compute the information content in multivariate hydrological data, (b) present practical methods for quantifying the uncertainty and shared information in data while accounting for heteroscedasticity, (c) show how these tools can be used to estimate the best achievable predictive performance of a model (for a system given the available data), and (d) show how model adequacy can be characterized in terms of the magnitude and nature of its aleatory uncertainty that cannot be diminished (and is resolvable only up to specification of its density), and its epistemic uncertainty that can, in principle, be suitably resolved by improving the model. An illustrative modeling example is provided using catchment-scale data from three river basins, the Leaf and Chunky River basins in the United States and the Chuzhou basin in China. Our analysis shows that the aleatory uncertainty associated with making catchment simulations using this data set is significant (~50%). Further, estimated epistemic uncertainties of the HyMod, SAC-SMA, and Xinanjiang model hypotheses indicate that considerable room for model structural improvements remains.
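
    A sketch of the kind of computation that underlies point (c): estimate the information shared between forcing and response data from histograms, then read the conditional entropy H(y|x) as the residual (aleatory) uncertainty that no model driven by x alone can remove. Real applications use more careful estimators (e.g., k-nearest-neighbor) and treat heteroscedasticity explicitly; the synthetic data below are purely illustrative.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic stand-ins for forcing (x) and observed response (y).
        x = rng.gamma(2.0, 2.0, 20_000)
        y = 0.6 * x + rng.normal(0.0, 1.0 + 0.2 * x)   # heteroscedastic noise

        def entropy(hist):
            p = hist / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        hx, _ = np.histogram(x, bins=50)
        hy, _ = np.histogram(y, bins=50)
        hxy, _, _ = np.histogram2d(x, y, bins=50)

        # Mutual information I(x;y) = H(x) + H(y) - H(x,y); then
        # H(y|x) = H(y) - I(x;y) bounds the best achievable performance.
        mi = entropy(hx) + entropy(hy) - entropy(hxy.ravel())
        print(f"I(x;y) = {mi:.2f} bits, H(y|x) = {entropy(hy) - mi:.2f} bits")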

  3. Uncertainty estimates for the gravimetric primary flow standards of the MRF

    SciTech Connect

    Park, J.T.; Behring, K.A. II; Grimley, T.A.

    1995-12-31

    Two gravimetric flow standards for mass flowrate are in operation for the calibration of high capacity flowmeters with natural gas. Both systems measure mass electronically from scales which operate on a gyroscopic principle. The gravimetric provers are an integral part of the Gas Research Institute (GRI) Metering Research Facility (MRF) and can provide a direct primary calibration for any conventional gas flowmeter. The smaller system is attached to the Low Pressure Loop (LPL) with an operating pressure of 0.14 to 1.4 MPa (20 to 200 psia) and a flowrate up to 4.6 kg/s (10 lbm/s). The larger system is connected to the High Pressure Loop (HPL) with pressures of 1.4 to 10 MPa (200 to 1,455 psia) and flows to 43 kg/s (95 lbm/s). The performance of these two standards and their estimated uncertainties are described. The optimal total uncertainty in mass flowrate is ±0.01% and ±0.02%, respectively, for the LPL and HPL. The actual uncertainty is dependent on the operating conditions and is primarily a function of the operating pressure and flowrate. Uncertainty estimates are provided on the calibration of turbine meters and sonic nozzles. The largest uncertainty in the calibration of flowmeters is the uncertainty in theoretical models for density and the measurement of natural gas composition.

  4. Position-momentum uncertainty relations in the presence of quantum memory

    SciTech Connect

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  5. Position-momentum uncertainty relations in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-01

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  6. Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods

    SciTech Connect

    Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

    2013-12-01

    The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters, expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations, CILTS should be considered as having something like a 'factor of two' uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
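
    The first-principles part of such an analysis reduces, for the ideal single-zone steady-state case, to combining the relative uncertainties of injection rate, measured concentration, and zone volume in quadrature. The magnitudes below are illustrative only; field biases from assumption violations come on top of this floor.

        import numpy as np

        # Ideal single-zone steady state: air exchange rate A = F / (C * V),
        # with F the tracer emission rate, C the measured concentration, and
        # V the house volume (values and uncertainties assumed).
        F, u_F = 5.0e-6, 0.25e-6   # emission rate (5% relative uncertainty)
        C, u_C = 2.0e-9, 0.2e-9    # concentration (10%)
        V, u_V = 350.0, 20.0       # volume, m^3 (~6%)

        A = F / (C * V)
        rel_A = np.sqrt((u_F / F)**2 + (u_C / C)**2 + (u_V / V)**2)
        print(f"A = {A:.2f} (units follow the inputs), "
              f"relative standard uncertainty {100 * rel_A:.0f}%")

    With these assumed inputs the combined relative uncertainty comes out near 13%, consistent with the 10-15% ideal-case figure quoted above.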

  7. The GUM revision: the Bayesian view toward the expression of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Lira, I.

    2016-03-01

    The ‘Guide to the Expression of Uncertainty in Measurement’ (GUM) has been in use for more than 20 years, serving its purposes worldwide at all levels of metrology, from scientific to industrial and commercial applications. However, the GUM presents some inconsistencies, both internally and with respect to its two later Supplements. For this reason, the Joint Committee for Guides in Metrology, which is responsible for these documents, has decided that a major revision of the GUM is needed. This will be done by following the principles of Bayesian statistics, a concise summary of which is presented in this article. Those principles should be useful in physics and engineering laboratory courses that teach the fundamentals of data analysis and measurement uncertainty evaluation.
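
    The Bayesian treatment the revision follows can be shown in its simplest conjugate form: a Gaussian prior for the measurand combined with a Gaussian likelihood from repeated indications yields a Gaussian posterior whose standard deviation plays the role of the standard uncertainty. All numbers below are illustrative.

        import numpy as np

        # Prior knowledge of the measurand (e.g., from a previous calibration).
        mu0, s0 = 100.0, 0.50      # prior mean and standard deviation (assumed)

        # Independent indications with known instrument standard deviation.
        obs = np.array([100.42, 100.35, 100.47, 100.38])
        s = 0.20
        n, xbar = obs.size, obs.mean()

        # Conjugate Gaussian update: precisions add, means combine by weight.
        w0, w = 1.0 / s0**2, n / s**2
        post_var = 1.0 / (w0 + w)
        post_mean = post_var * (w0 * mu0 + w * xbar)

        print(f"estimate {post_mean:.3f}, "
              f"standard uncertainty {np.sqrt(post_var):.3f}")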

  8. Two additional principles for determining which species to monitor.

    PubMed

    Wilson, Howard B; Rhodes, Jonathan R; Possingham, Hugh P

    2015-11-01

    Monitoring to detect population declines is widespread, but also costly. There is, consequently, a need to optimize monitoring to maximize cost-effectiveness. Here we develop a quantitative decision analysis framework for how to optimally allocate resources for monitoring among species. By keeping the framework simple, we analytically establish two new principles about which species are optimal to monitor for detecting declines: (1) those that lie on the boundary between species being allocated resources for conservation action and species that are not and (2) those with the greatest uncertainty in whether they are declining. These two principles are in addition to other factors that are also important in monitoring decisions, such as complementarity. We demonstrate the efficacy of these principles when other factors are not present, and show how the two principles can be combined. This analysis demonstrates that the most cost-effective species to monitor are ones where the information gained from monitoring is most likely to change the allocation of funds for action, not necessarily the most vulnerable or endangered. We suggest these results are general and apply to all ecological monitoring, not just of biological species: monitoring and information are only valuable when they are likely to change how people act. PMID:27070020

  9. DO MODEL UNCERTAINTY WITH CORRELATED INPUTS

    EPA Science Inventory

    The effect of correlation among the input parameters and variables on the output uncertainty of the Streeter-Phelps water quality model is examined. Three uncertainty analysis techniques are used: sensitivity analysis, first-order error analysis, and Monte Carlo simulation. Modifie...
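
    A sketch of the Monte Carlo technique with correlated inputs: sample correlated rate coefficients from a joint normal distribution and compare the output spread against the independent case. The dissolved-oxygen deficit formula is the classical Streeter-Phelps expression, but all parameter values and the correlation level are assumed for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        def do_deficit(kd, ka, L0=10.0, D0=1.0, t=2.0):
            # Streeter-Phelps dissolved-oxygen deficit at travel time t (days);
            # L0, D0, and t are illustrative, not the study's values.
            return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
                   + D0 * np.exp(-ka * t)

        mean = np.array([0.3, 0.7])      # deoxygenation kd, reaeration ka (1/day)
        sd = np.array([0.03, 0.07])

        for r in (0.0, 0.8):             # independent vs. correlated inputs
            cov = np.array([[sd[0]**2, r * sd[0] * sd[1]],
                            [r * sd[0] * sd[1], sd[1]**2]])
            kd, ka = rng.multivariate_normal(mean, cov, 50_000).T
            ok = ka - kd > 1e-3          # guard the removable singularity ka == kd
            D = do_deficit(kd[ok], ka[ok])
            print(f"input correlation {r}: mean {D.mean():.3f}, sd {D.std():.3f}")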

  10. Identifying uncertainties in Arctic climate change projections

    NASA Astrophysics Data System (ADS)

    Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

    2013-06-01

    Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate. We also find that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

  11. Uncertainty--We Do Need It.

    ERIC Educational Resources Information Center

    Cothern, C. Richard; Cothern, Margaret Fogt

    1980-01-01

    The precision of measurements in today's society is discussed and is related to the range of uncertainty or variation of measurement. Numerous examples provide insight into the margin of error in any measurement. The issue of uncertainty is particularly applicable to levels of toxic chemicals in the environment. (SA)

  12. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
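
    Both questions are convergence studies: statistical error from a finite number of rays falls roughly as 1/sqrt(N) at growing cost, and interpolation error falls as the depth grid is refined. A generic sketch of both tables using a toy estimator and a toy dose-vs-depth curve; this is not the radiation transport codes themselves.

        import numpy as np

        rng = np.random.default_rng(5)

        # Ray-count convergence: a Monte Carlo mean converges as 1/sqrt(N),
        # so each added decimal of accuracy costs roughly 100x more rays.
        true_value = 2.0
        for n_rays in (10, 100, 1_000, 10_000):
            est = (true_value + rng.standard_normal(n_rays)).mean()
            print(f"{n_rays:6d} rays: error {abs(est - true_value):.4f}")

        # Grid convergence: interpolation error over a toy dose-vs-depth curve
        # shrinks as the number of tabulated shield thicknesses grows.
        def dose(depth):
            return np.exp(-depth / 3.0) * (1.0 + 0.5 * np.sin(depth))

        fine = np.linspace(0.0, 10.0, 2001)
        for n_pts in (5, 9, 17, 33):
            grid = np.linspace(0.0, 10.0, n_pts)
            err = np.max(np.abs(np.interp(fine, grid, dose(grid)) - dose(fine)))
            print(f"{n_pts:3d} thicknesses: max interpolation error {err:.4f}")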

  13. The Stock Market: Risk vs. Uncertainty.

    ERIC Educational Resources Information Center

    Griffitts, Dawn

    2002-01-01

    This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…

  14. Uncertainty Propagation in an Ecosystem Nutrient Budget.

    EPA Science Inventory

    New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of fr...

  15. Estimating the uncertainty in underresolved nonlinear dynamics

    SciTech Connect

    Chorin, Alexandre; Hald, Ole

    2013-06-12

    The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.

  16. Uncertainty and Engagement with Learning Games

    ERIC Educational Resources Information Center

    Howard-Jones, Paul A.; Demetriou, Skevi

    2009-01-01

    Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.…

  17. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  18. Micro-Pulse Lidar Signals: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

    2002-01-01

    Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse (laser-detector cross-talk) and overlap (poor focusing in the near range, less than 6 km). Accurate correction of both afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range, the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is uncertainty in the pulse energy. However, if the laser energy is low, then during mid-day, high solar background levels can significantly reduce the signal-to-noise ratio of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.

  19. Critical analysis of uncertainties during particle filtration

    NASA Astrophysics Data System (ADS)

    Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

    2012-09-01

    Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of microspheres suspension have the greatest influence on the overall CSU in filter permeability which excellently agrees with results obtained from Monte Carlo simulations. Two model parameters "maximum critical retention concentration" and "minimum injection velocity" and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces experimental "critical retention concentration vs velocity"-data better than the second model which contains the total electrostatic force whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data.
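
    For a Darcy-type permeability k = Q·mu·L/(A·dp), the law of propagation of uncertainties with independent inputs gives a relative combined standard uncertainty in quadrature, which can be cross-checked by Monte Carlo in the way the authors describe. All input values and uncertainties below are illustrative, not the paper's.

        import numpy as np

        rng = np.random.default_rng(9)

        # Inputs: value and standard uncertainty (illustrative magnitudes).
        Q,  u_Q  = 2.0e-8, 4.0e-10   # volumetric flowrate, m^3/s (2%)
        mu, u_mu = 1.0e-3, 3.0e-5    # dynamic viscosity, Pa*s (3%)
        L,  u_L  = 0.05,   2.0e-4    # filter length, m
        A,  u_A  = 2.0e-4, 1.0e-6    # cross-sectional area, m^2
        dp, u_dp = 5.0e4,  2.5e2     # pressure drop, Pa

        k = Q * mu * L / (A * dp)
        rel = np.sqrt((u_Q/Q)**2 + (u_mu/mu)**2 + (u_L/L)**2
                      + (u_A/A)**2 + (u_dp/dp)**2)
        print(f"k = {k:.3e} m^2, combined standard uncertainty = {100*rel:.1f}%")

        # Monte Carlo cross-check of the analytic propagation.
        n = 100_000
        ks = (rng.normal(Q, u_Q, n) * rng.normal(mu, u_mu, n)
              * rng.normal(L, u_L, n)
              / (rng.normal(A, u_A, n) * rng.normal(dp, u_dp, n)))
        print(f"Monte Carlo: {100 * ks.std() / ks.mean():.1f}%")

    With these assumed magnitudes the viscosity and flowrate terms dominate the quadrature sum, mirroring the paper's finding about which inputs matter most.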

  20. Critical analysis of uncertainties during particle filtration.

    PubMed

    Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

    2012-09-01

    Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of microspheres suspension have the greatest influence on the overall CSU in filter permeability which excellently agrees with results obtained from Monte Carlo simulations. Two model parameters "maximum critical retention concentration" and "minimum injection velocity" and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces experimental "critical retention concentration vs velocity"-data better than the second model which contains the total electrostatic force whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data. PMID:23020418

  1. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  2. Uncertainty Analysis by the "Worst Case" Method.

    ERIC Educational Resources Information Center

    Gordon, Roy; And Others

    1984-01-01

    Presents a new method of uncertainty propagation which concentrates on the calculation of upper and lower limits (the "worst cases"), bypassing absolute and relative uncertainties. Includes advantages of this method and its use in freshmen laboratories, advantages of the traditional method, and a numerical example done by both methods. (JN)

  3. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  4. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  5. Microform calibration uncertainties of Rockwell diamond indenters

    SciTech Connect

    Song, J.F.; Rudder, F.F. Jr.; Vorburger, T.V.; Smith, J.H.

    1995-09-01

    The Rockwell hardness test is a mechanical testing method for evaluating a property of metal products. National and international comparisons of Rockwell hardness tests show significant differences. Uncertainties in the geometry of the Rockwell diamond indenters are largely responsible for these differences. By using a stylus instrument, with a series of calibration and check standards, and calibration and uncertainty calculation procedures, the authors have calibrated the microform geometric parameters of Rockwell diamond indenters. These calibrations are traceable to fundamental standards. The expanded uncertainties are ±0.3 µm for the least-squares radius; ±0.01° for the cone angle; and ±0.025 for the holder axis alignment calibrations. Under ISO and NIST guidelines for expressing measurement uncertainties, the calibration and uncertainty calculation procedure, error sources, and uncertainty components are described, and the expanded uncertainties are calculated. The instrumentation and calibration procedure also allow the measurement of profile deviation from the least-squares radius and cone flank straightness. The surface roughness and the shape of the spherical tip of the diamond indenter can also be explored and quantified. The calibration approach makes it possible to quantify the uncertainty, uniformity, and reproducibility of Rockwell diamond indenter microform geometry, as well as to unify the Rockwell hardness standards, through fundamental measurements rather than performance comparisons.
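
    The least-squares radius is the radius of the circle that best fits the measured stylus profile points. A standard algebraic (Kasa) circle fit illustrates the computation on synthetic profile data; the nominal radius, arc extent, and noise level are assumed, and the NIST procedure additionally evaluates cone angle, alignment, and flank straightness.

        import numpy as np

        rng = np.random.default_rng(11)

        # Synthetic stylus profile of a nominally 200 um spherical tip arc.
        R_true = 200.0                       # um
        theta = np.linspace(-0.6, 0.6, 200)  # rad, arc actually traced
        x = R_true * np.sin(theta) + rng.normal(0.0, 0.05, theta.size)
        y = R_true * np.cos(theta) + rng.normal(0.0, 0.05, theta.size)

        # Kasa fit: solve x^2 + y^2 = 2*a*x + 2*b*y + c in least squares,
        # where (a, b) is the center and c = R^2 - a^2 - b^2.
        M = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        a, b, c = np.linalg.lstsq(M, x**2 + y**2, rcond=None)[0]
        R = np.sqrt(c + a**2 + b**2)

        resid = np.hypot(x - a, y - b) - R   # profile deviation from LS radius
        print(f"R = {R:.2f} um, rms deviation = {resid.std():.3f} um")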

  6. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  7. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  11. Assessing uncertainties in urban drainage models

    NASA Astrophysics Data System (ADS)

    Deletic, A.; Dotto, C. B. S.; McCarthy, D. T.; Kleidorfer, M.; Freni, G.; Mannina, G.; Uhl, M.; Henrichs, M.; Fletcher, T. D.; Rauch, W.; Bertrand-Krajewski, J. L.; Tait, S.

    The current state of knowledge regarding uncertainties in urban drainage models is poor. This is in part due to the lack of clarity in the way model uncertainty analyses are conducted and how the results are presented and used. There is a need for a common terminology and a conceptual framework for describing and estimating uncertainties in urban drainage models. Practical tools for the assessment of model uncertainties across a range of urban drainage models also need to be developed. This paper, produced by the International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, is a contribution to the development of a harmonised framework for defining and assessing uncertainties in the field of urban drainage modelling. The sources of uncertainties in urban drainage models and their links are initially mapped out. This is followed by an evaluation of each source, including a discussion of its definition and an evaluation of methods that could be used to assess its overall importance. Finally, an approach for a Global Assessment of Modelling Uncertainties (GAMU) is proposed, which presents a new framework for mapping and quantifying sources of uncertainty in urban drainage models.

  12. Nonclassicality in phase-number uncertainty relations

    SciTech Connect

    Matia-Hernando, Paloma; Luis, Alfredo

    2011-12-15

    We show that there are nonclassical states with lesser joint fluctuations of phase and number than any classical state. This is rather paradoxical, since one would expect classical coherent states to always be of minimum uncertainty. The same result is obtained when we replace phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using the variance and the Holevo relation.

  13. The water sensitive city: principles for practice.

    PubMed

    Wong, T H F; Brown, R R

    2009-01-01

    With the widespread realisation of the significance of climate change, urban communities are increasingly seeking to ensure resilience to future uncertainties in urban water supplies, yet change seems slow, with many cities facing ongoing investment in the conventional approach. This is because transforming cities into more sustainable urban water cities, or Water Sensitive Cities, requires a major overhaul of the hydro-social contract that underpins conventional approaches. This paper provides an overview of the emerging research and practice focused on system resilience and principles of sustainable urban water management. Three key pillars that need to underpin the development and practice of a Water Sensitive City are proposed: (i) access to a diversity of water sources underpinned by a diversity of centralised and decentralised infrastructure; (ii) provision of ecosystem services for the built and natural environment; and (iii) socio-political capital for sustainability and water sensitive behaviours. While there is not one example in the world of a Water Sensitive City, there are cities that lead on distinct and varying attributes of the water sensitive approach, and examples from Australia and Singapore are presented. PMID:19657162

  14. Uncertainties in derived temperature-height profiles

    NASA Technical Reports Server (NTRS)

    Minzner, R. A.

    1974-01-01

    Nomographs were developed for relating uncertainty in temperature T to uncertainty in the observed height profiles of both pressure p and density rho. The relative uncertainty delta T/T is seen to depend not only upon the relative uncertainties delta p/p or delta rho/rho, and to a small extent upon the value of T or H, but primarily upon the sampling-height increment Delta h, the height increment between successive observations of p or rho. For a fixed value of delta p/p, the value of delta T/T varies inversely with Delta h. No limit exists in the fineness of usable height resolution of T which may be derived from densities, while a fine height resolution in pressure-height data leads to temperatures with unacceptably large uncertainties.
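
    The inverse dependence on the sampling-height increment can be made concrete with the hypsometric equation: the layer-mean temperature follows from two pressure observations as T = (g/R) Delta h / ln(p1/p2), and independent fractional pressure errors delta p/p at the two levels propagate as delta T/T = sqrt(2) (delta p/p) / ln(p1/p2), which is approximately sqrt(2) (H/Delta h)(delta p/p) for scale height H. A numeric sketch with illustrative values (0.5% pressure errors, 7 km scale height):

        import numpy as np

        g, Rd = 9.80665, 287.05    # gravity (m/s^2), dry-air gas constant (J/kg/K)
        p1 = 500e2                 # lower-level pressure, Pa (assumed)
        H = 7000.0                 # rough pressure scale height, m (assumed)
        rel_p = 0.005              # assumed 0.5% pressure error at each level

        for dh in (200.0, 1000.0, 5000.0):        # sampling-height increment, m
            p2 = p1 * np.exp(-dh / H)
            T = (g / Rd) * dh / np.log(p1 / p2)   # hypsometric layer-mean T
            rel_T = np.sqrt(2.0) * rel_p / np.log(p1 / p2)
            print(f"dh = {dh:6.0f} m: T = {T:.0f} K, dT/T = {100*rel_T:.1f}%")

    The printed relative temperature uncertainty drops from roughly 25% at a 200 m increment to about 1% at 5000 m, reproducing the inverse scaling with Delta h described above.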

  15. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. PV systems show over the lifetime of their exposure a gradual decline that depends on many different factors, such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty that includes measurement uncertainty and instrumentation drift is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important consideration for degradation rates is to avoid instrumentation that drifts over time in the field. For instance, a drifting irradiance sensor, a problem that can be avoided through regular calibration, can lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
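
    A minimal Monte Carlo sketch of the drift effect: fit a linear degradation rate to noisy monthly performance data, with and without a slow sensor drift superimposed. The true rate, noise level, and drift magnitude are assumed for illustration; the point is that drift biases the fitted rate by its full magnitude while barely changing the statistical spread.

        import numpy as np

        rng = np.random.default_rng(13)

        years = np.arange(0.0, 10.0, 1.0 / 12.0)   # monthly data over 10 years
        true_rate = -0.8                           # assumed true degradation, %/yr

        def fitted_rate(drift_per_year):
            # Performance ratio (%) with noise plus an irradiance-sensor drift
            # that masquerades as system change.
            pr = 100.0 + true_rate * years + rng.normal(0.0, 1.0, years.size)
            pr += drift_per_year * years
            return np.polyfit(years, pr, 1)[0]     # slope of the linear fit

        for drift in (0.0, 0.5):
            rates = np.array([fitted_rate(drift) for _ in range(1000)])
            print(f"sensor drift {drift:+.1f} %/yr -> "
                  f"fitted rate {rates.mean():.2f} +/- {rates.std():.2f} %/yr")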

  16. Contending with uncertainty in conservation management decisions.

    PubMed

    McCarthy, Michael A

    2014-08-01

    Efficient conservation management is particularly important because current spending is estimated to be insufficient to conserve the world's biodiversity. However, efficient management is confounded by uncertainty that pervades conservation management decisions. Uncertainties exist in objectives, dynamics of systems, the set of management options available, the influence of these management options, and the constraints on these options. Probabilistic and nonprobabilistic quantitative methods can help contend with these uncertainties. The vast majority of these account for known epistemic uncertainties, with methods optimizing the expected performance or finding solutions that achieve minimum performance requirements. Ignorance and indeterminacy continue to confound environmental management problems. While quantitative methods to account for uncertainty must aid decisions if the underlying models are sufficient approximations of reality, whether such models are sufficiently accurate has not yet been examined. PMID:25138920

  17. Contending with uncertainty in conservation management decisions

    PubMed Central

    McCarthy, Michael A

    2014-01-01

    Efficient conservation management is particularly important because current spending is estimated to be insufficient to conserve the world's biodiversity. However, efficient management is confounded by uncertainty that pervades conservation management decisions. Uncertainties exist in objectives, dynamics of systems, the set of management options available, the influence of these management options, and the constraints on these options. Probabilistic and nonprobabilistic quantitative methods can help contend with these uncertainties. The vast majority of these account for known epistemic uncertainties, with methods optimizing the expected performance or finding solutions that achieve minimum performance requirements. Ignorance and indeterminacy continue to confound environmental management problems. While quantitative methods to account for uncertainty must aid decisions if the underlying models are sufficient approximations of reality, whether such models are sufficiently accurate has not yet been examined. PMID:25138920

  18. Habitable zone dependence on stellar parameter uncertainties

    SciTech Connect

    Kane, Stephen R.

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the dependence of HZ boundary uncertainties on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
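
    A simplified sketch of the translation step: with an HZ edge at distance d = sqrt(L/S_eff), a stellar luminosity uncertainty maps directly into an edge-distance uncertainty via Monte Carlo. The effective fluxes are held constant here for brevity (in full treatments they also vary with effective temperature), and all stellar values are illustrative.

        import numpy as np

        rng = np.random.default_rng(17)

        # HZ edge distance d = sqrt(L / S_eff), with S_eff the effective stellar
        # flux at the boundary (illustrative values, in units of the solar flux).
        S_inner, S_outer = 1.1, 0.36

        L, u_L = 0.85, 0.12               # stellar luminosity and uncertainty, L_sun
        Ls = rng.normal(L, u_L, 100_000)
        Ls = Ls[Ls > 0]                   # guard against unphysical draws

        for name, S in (("inner", S_inner), ("outer", S_outer)):
            d = np.sqrt(Ls / S)
            lo, mid, hi = np.percentile(d, [16, 50, 84])
            print(f"{name} HZ edge: {mid:.3f} AU (+{hi-mid:.3f}/-{mid-lo:.3f})")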

  19. The legal status of Uncertainty

    NASA Astrophysics Data System (ADS)

    Altamura, M.; Ferraris, L.; Miozzo, D.; Musso, L.; Siccardi, F.

    2011-03-01

    An exponential improvement of numerical weather prediction (NWP) models was observed during the last decade (Lynch, 2008). Civil Protection (CP) systems exploited meteorological services in order to redeploy their actions towards the prediction and prevention of events rather than towards an exclusively response-oriented mechanism1. Nevertheless, experience tells us that NWP models, even if assisted by real-time observations, are far from being deterministic. Complications frequently emerge in medium- to long-range forecasting, which is subject to sudden modifications. On the other hand, short-term forecasts, if seen through the lens of criminal trials2, are to the same extent scarcely reliable (Molini et al., 2009). One particular episode related to wrong forecasts, in the Italian panorama, has deeply frightened CP operators: the NWP model in force missed a meteorological adversity which, in fact, caused death and dealt severe damage in the province of Vibo Valentia (2006). This event turned into a widely discussed trial, lasting over three years, brought against those who held the legal position of guardianship within the CP. A first set of data is now available showing that, concomitant with the trial of Vibo Valentia, the number of alerts issued rose almost threefold. We sustain the hypothesis that the beginning of the process of overcriminalization (Husak, 2008) of CPs is currently increasing the number of false alerts, with the consequent effect of weakening alert perception and response by the citizenry (Breznitz, 1984). The common misunderstanding of such an issue, i.e. the inherent uncertainty in weather predictions, mainly by prosecutors and judges, and generally by those who deal with law and justice, is creating the basis for defensive behaviour3 within CPs. This paper intends, thus, to analyse the social and legal relevance of uncertainty in the process of issuing meteo-hydrological alerts by CPs. Footnotes: 1 The Italian Civil Protection has been working in this direction since 1992 (L. 225/92). An example of this effort is clearly given by the Prime Minister Decree (DPCM 20/12/2001 "Linee guida relative ai piani regionali per la programmazione delle attivita' di previsione, prevenzione e lotta attiva contro gli incendi boschivi - Guidelines for regional plans for the planning of prediction, prevention and forest fire fighting activities") that, already in 2001, emphasized that "the most appropriate approach to pursue the preservation of forests is to promote and encourage prediction and prevention activities rather than giving priority to the emergency phase focused on fire-fighting". 2 Supreme Court of the United States, In re Winship (No. 778), argued: 20 January 1970, decided: 31 March 1970: proof beyond a reasonable doubt, which is required by the Due Process Clause in criminal trials, is among the "essentials of due process and fair treatment". 3 In Kessler and McClellan (1996): "Defensive medicine is a potentially serious social problem: if fear of liability drives health care providers to administer treatments that do not have worthwhile medical benefits, then the current liability system may generate inefficiencies much larger than the costs of compensating malpractice claimants".

  20. Forecast communication through the newspaper Part 2: perceptions of uncertainty

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.

    2015-04-01

    In the first part of this review, I defined the media filter and how it can operate to frame and blame the forecaster for losses incurred during an environmental disaster. In this second part, I explore the meaning and role of uncertainty when a forecast, and its basis, is communicated through the response and decision-making chain to the newspaper, especially during a rapidly evolving natural disaster with far-reaching business, political, and societal impacts. Within the media-based communication system, there remains a fundamental disconnect in the definition of uncertainty and the interpretation of the delivered forecast between various stakeholders. The definition and use of uncertainty differ especially between scientific, media, business, and political stakeholders. This is a serious problem for the scientific community when delivering forecasts to the public through the press. As reviewed in Part 1, the media filter can result in a negative frame, which is itself a result of bias, slant, spin, and agenda setting introduced during passage of the forecast and its uncertainty through the media filter. The result is invariably one of anger and fury, which causes loss of credibility and blaming of the forecaster. Generation of a negative frame can be aided by opacity of the decision-making process that the forecast is used to support. The impact of the forecast will be determined during passage through the decision-making chain, where the precautionary principle and cost-benefit analysis, for example, will likely be applied. Choice of forecast delivery format, vehicle of communication, syntax of delivery, and lack of follow-up measures can further cause the forecast and its role to be misrepresented. Follow-up measures to negative frames may include appropriately worded press releases and conferences that target forecast misrepresentation or misinterpretation in an attempt to swing the slant back in favor of the forecaster. Review of the meteorological, public health, media studies, social science, and psychology literature opens up a vast and interesting library that is not obvious to the volcanologist at first glance. It shows that forecasts and their uncertainty can be phrased, delivered, and followed up in a manner that reduces the chance of message distortion. The mass-media delivery vehicle requires careful tracking because the potential for forecast distortion can result in a frame that the scientific response is "absurd", "confused", "shambolic", or "dysfunctional". This can help set up a "frightened", "frustrated", "angry", even "furious" reaction to the forecast and forecaster.