Science.gov

Sample records for uncertainty principle

  1. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Michael S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
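
For readers new to the topic, the inequality usually meant by "generalized uncertainty principle" can be stated compactly. The quadratic form and the parameter β below follow the common minimal-length convention (e.g. Kempf-Mangano-Mann) and are not necessarily the exact convention of this paper:

```latex
% Heisenberg bound and its common quadratic GUP deformation
% (\beta > 0 is a model-dependent parameter):
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + \beta\,(\Delta p)^2\Bigr)
% Minimizing the right-hand side over \Delta p gives a minimal length:
\Delta x_{\min} = \hbar\sqrt{\beta}
```

Wave packets that saturate this inequality are the "minimum uncertainty wave packets" in the abstract's sense.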

  2. Gamma-Ray Telescope and Uncertainty Principle

    ERIC Educational Resources Information Center

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  3. Finite Frames and Graph Theoretic Uncertainty Principles

    NASA Astrophysics Data System (ADS)

    Koprowski, Paul J.

The subject of analytical uncertainty principles is an important field within harmonic analysis, quantum physics, and electrical engineering. We explore uncertainty principles in the context of the graph Fourier transform, and we prove additive results analogous to the multiplicative version of the classical uncertainty principle. We establish additive uncertainty principles for finite Parseval frames. Lastly, we examine the feasibility region of simultaneous values of the norms of a graph differential operator acting on a function f ∈ ℓ²(G) and its graph Fourier transform.
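
As background, the graph Fourier transform referred to here is expansion in the eigenbasis of the graph Laplacian. A minimal runnable sketch, using an invented 5-vertex path graph as the example (not data from the thesis):

```python
import numpy as np

# Graph Fourier transform (GFT): expand a signal f in the eigenbasis
# of the graph Laplacian L = D - A. Graph and diagnostics are illustrative.

# Adjacency matrix of a small path graph on 5 vertices
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0

D = np.diag(A.sum(axis=1))          # degree matrix
L = D - A                           # combinatorial graph Laplacian
eigvals, U = np.linalg.eigh(L)      # Fourier basis = Laplacian eigenvectors

f = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # a delta: maximally localized on a vertex
f_hat = U.T @ f                          # graph Fourier transform of f

# Norms match (U is orthogonal), mirroring the Parseval-frame setting
assert np.isclose(np.linalg.norm(f), np.linalg.norm(f_hat))

# A vertex-localized signal has a spread-out spectral representation
print("vertex domain   |f|^2 :", np.round(f**2, 3))
print("spectral domain |f^|^2:", np.round(f_hat**2, 3))
```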

  4. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  5. Disturbance, the uncertainty principle and quantum optics

    NASA Technical Reports Server (NTRS)

    Martens, Hans; de Muynck, Willem M.

    1993-01-01

    It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.

  6. Extrapolation, uncertainty factors, and the precautionary principle.

    PubMed

    Steel, Daniel

    2011-09-01

    This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards.

  7. Curriculum in Art Education: The Uncertainty Principle.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1989-01-01

    Identifies curriculum as the pivotal link between theory and practice, noting that all stages of curriculum research and development are characterized by elements of uncertainty. States that this uncertainty principle reflects the reality of practice as it mirrors the contradictory nature of art, the pluralism of schools and society, and the…

  8. Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.

    ERIC Educational Resources Information Center

    McKerrow, K. Kelly; McKerrow, Joan E.

    1991-01-01

    The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)

  9. An uncertainty principle for unimodular quantum groups

    SciTech Connect

    Crann, Jason; Kalantar, Mehrdad

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
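
The abelian prototype being generalized here is Hirschman's inequality on ℝ. A compact statement, assuming the unitary Fourier convention and natural logarithms (equality holds for Gaussians):

```latex
% Hirschman's entropic uncertainty principle on the abelian group \mathbb{R},
% with \hat f(\xi) = \int f(x)\, e^{-2\pi i x \xi}\, dx, \ \|f\|_2 = 1,
% and h(\rho) = -\int \rho \ln \rho the differential entropy:
h\bigl(|f|^2\bigr) + h\bigl(|\hat f|^2\bigr) \;\ge\; \ln\frac{e}{2}
% The paper extends bounds of this type to unimodular locally compact
% quantum groups, with the Haar weight playing the role of Lebesgue measure.
```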

  10. A Principle of Uncertainty for Information Seeking.

    ERIC Educational Resources Information Center

    Kuhlthau, Carol C.

    1993-01-01

    Proposes an uncertainty principle for information seeking based on the results of a series of studies that investigated the user's perspective of the information search process. Constructivist theory is discussed as a conceptual framework for studying the user's perspective, and areas for further research are suggested. (Contains 44 references.)…

  11. Geometric formulation of the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bosyk, G. M.; Osán, T. M.; Lamberti, P. W.; Portesi, M.

    2014-03-01

    A geometric approach to formulate the uncertainty principle between quantum observables acting on an N-dimensional Hilbert space is proposed. We consider the fidelity between a density operator associated with a quantum system and a projector associated with an observable, and interpret it as the probability of obtaining the outcome corresponding to that projector. We make use of fidelity-based metrics such as angle, Bures, and root infidelity to propose a measure of uncertainty. The triangle inequality allows us to derive a family of uncertainty relations. In the case of the angle metric, we recover the Landau-Pollak inequality for pure states and show, in a natural way, how to extend it to the case of mixed states in arbitrary dimension. In addition, we derive and compare alternative uncertainty relations when using other known fidelity-based metrics.

  12. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, space noncommutativity, Lorentz invariance violation, and quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, concerns about its compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action are addressed.

  13. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

In this paper, we review some highlights from string theory, black hole physics and doubly special relativity, along with thought experiments suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), which assumes a modified dispersion relation and therefore allows a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Saleker-Wigner inequalities, the entropic nature of gravitational laws, Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of composite systems. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. The concern about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

  14. Tightening the uncertainty principle for stochastic currents

    NASA Astrophysics Data System (ADS)

    Polettini, Matteo; Lazarescu, Alexandre; Esposito, Massimiliano

    2016-11-01

We connect two recent advances in the stochastic analysis of nonequilibrium systems: the (loose) uncertainty principle for the currents, which states that statistical errors are bounded by thermodynamic dissipation, and the analysis of thermodynamic consistency of the currents in the light of symmetries. Employing the large deviation techniques presented by Gingrich et al. [Phys. Rev. Lett. 116, 120601 (2016), 10.1103/PhysRevLett.116.120601] and Pietzonka, Barato, and Seifert [Phys. Rev. E 93, 052145 (2016), 10.1103/PhysRevE.93.052145], we provide a short proof of the loose uncertainty principle, and prove a tighter uncertainty relation for a class of thermodynamically consistent currents J. Our bound involves a measure of partial entropy production, that we interpret as the least amount of entropy that a system sustaining current J can possibly produce, at a given steady state. We provide a complete mathematical discussion of quadratic bounds which allows one to determine which are optimal, and finally we argue that the relationship for the Fano factor of the entropy production rate var σ/mean σ ≥ 2 is the most significant realization of the loose bound. We base our analysis both on the formalism of diffusions, and of Markov jump processes in the light of Schnakenberg's cycle analysis.

  15. Tightening the uncertainty principle for stochastic currents.

    PubMed

    Polettini, Matteo; Lazarescu, Alexandre; Esposito, Massimiliano

    2016-11-01

We connect two recent advances in the stochastic analysis of nonequilibrium systems: the (loose) uncertainty principle for the currents, which states that statistical errors are bounded by thermodynamic dissipation, and the analysis of thermodynamic consistency of the currents in the light of symmetries. Employing the large deviation techniques presented by Gingrich et al. [Phys. Rev. Lett. 116, 120601 (2016), 10.1103/PhysRevLett.116.120601] and Pietzonka, Barato, and Seifert [Phys. Rev. E 93, 052145 (2016), 10.1103/PhysRevE.93.052145], we provide a short proof of the loose uncertainty principle, and prove a tighter uncertainty relation for a class of thermodynamically consistent currents J. Our bound involves a measure of partial entropy production, that we interpret as the least amount of entropy that a system sustaining current J can possibly produce, at a given steady state. We provide a complete mathematical discussion of quadratic bounds which allows one to determine which are optimal, and finally we argue that the relationship for the Fano factor of the entropy production rate var σ/mean σ ≥ 2 is the most significant realization of the loose bound. We base our analysis both on the formalism of diffusions, and of Markov jump processes in the light of Schnakenberg's cycle analysis.
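
The loose bound in the two records above can be checked numerically on the simplest thermodynamically consistent model. A minimal Monte Carlo sketch, assuming a continuous-time biased random walk with invented jump rates kp and km (not a system from the paper):

```python
import numpy as np

# Check of the "loose" uncertainty principle for currents,
# var(J)/<J>^2 >= 2/(sigma*t), on a continuous-time biased random walk
# with forward/backward rates kp, km. Rates and horizon are illustrative.

rng = np.random.default_rng(0)
kp, km, T, samples = 2.0, 0.5, 50.0, 200_000

# For this process the forward and backward jump counts over [0, T]
# are independent Poisson variables, so sampling is exact.
n_plus = rng.poisson(kp * T, size=samples)
n_minus = rng.poisson(km * T, size=samples)
J = n_plus - n_minus                       # integrated current

sigma = (kp - km) * np.log(kp / km)        # entropy production rate (k_B units)
lhs = J.var() / J.mean() ** 2              # squared relative error of the current
rhs = 2.0 / (sigma * T)                    # dissipation bound
print(f"var/mean^2 = {lhs:.4f} >= 2/(sigma*T) = {rhs:.4f} : {lhs >= rhs}")
```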

  16. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, on the finite and different time scales only. The ultimate origin of such a universal quantum stability is in the fundamental uncertainty principle which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  17. Quantum randomness certified by the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Vallone, Giuseppe; Marangon, Davide G.; Tomasin, Marco; Villoresi, Paolo

    2014-11-01

We present an efficient method to extract the amount of true randomness that can be obtained by a quantum random number generator (QRNG). By repeating the measurements of a quantum system and by swapping between two mutually unbiased bases, a lower bound on the achievable true randomness can be evaluated. The bound is obtained thanks to the uncertainty principle of complementary measurements applied to min-entropy and max-entropy. We tested our method with two different QRNGs by using trains of qubits or ququarts and demonstrated the scalability toward practical applications.
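
For context, the complementary-measurement bound invoked here is of Maassen-Uffink type; stated for Shannon entropies (the paper's actual certification uses the analogous min-/max-entropy relation):

```latex
% Maassen–Uffink entropic uncertainty relation for two measurements with
% eigenbases \{|a\rangle\}, \{|b\rangle\}; c is the maximal overlap:
H(A) + H(B) \;\ge\; \log_2\frac{1}{c},
\qquad c = \max_{a,b}\,|\langle a|b\rangle|^2
% For mutually unbiased bases in dimension d, c = 1/d, so the bound is \log_2 d;
% the min-/max-entropy analogue reads H_{\min} + H_{\max} \ge \log_2(1/c).
```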

  18. Lorentz invariance violation and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag

    2016-01-01

There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum, and thereby a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic particles arriving from distant sources at Planck energy can be derived. A first comparison is made with recent observations of the Hubble parameter as a function of redshift in early-type galaxies. We find that Lorentz invariance violation (LIV) has two types of contributions to the time-of-flight delay Δt comparable with those observations. Although the OPERA measurement of a faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate the relative change Δv in the speed of the muon neutrino in dependence on the redshift z; accordingly, such results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR), relating an approach that combines string theory, loop quantum gravity, black hole physics and doubly special relativity with one that assumes a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence is an essential input for reliably confronting our calculations with UHECR data. The sensitivity factor is related to the time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter a, which characterizes the generalized uncertainty principle, have to be fixed in related physical systems such as gamma-ray bursts.

  19. Robertson-Schrödinger formulation of Ozawa's uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

    2015-07-01

A more general measurement-disturbance uncertainty principle is presented in a Robertson-Schrödinger formulation. It is shown to be stronger and to have nicer properties than Ozawa's uncertainty relations; in particular, it is invariant under symplectic transformations. It is also shown that there are states of the probe (measuring device) that saturate the matrix formulation of the measurement-disturbance uncertainty principle.
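
For reference, the Robertson-Schrödinger inequality underlying this formulation reads:

```latex
% Robertson–Schrödinger inequality for observables A, B in a state \psi:
\sigma_A^2\,\sigma_B^2 \;\ge\;
\Bigl(\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\Bigr)^2
+ \Bigl(\tfrac{1}{2i}\langle[A,B]\rangle\Bigr)^2
% Dropping the first (covariance) term recovers the weaker Robertson bound
% \sigma_A \,\sigma_B \ge \tfrac{1}{2}\,|\langle[A,B]\rangle|.
```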

  20. Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael

    1993-01-01

    Heisenberg's uncertainty principle and the derivative notions of interdeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research examined. Implications are drawn for research in science education. (PR)

  1. Incorporation of generalized uncertainty principle into Lifshitz field theories

    NASA Astrophysics Data System (ADS)

    Faizal, Mir; Majumder, Barun

    2015-06-01

In this paper, we will incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We will first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. After that we will incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We will observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We will also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.

  2. Incorporation of generalized uncertainty principle into Lifshitz field theories

    SciTech Connect

    Faizal, Mir; Majumder, Barun

    2015-06-15

In this paper, we will incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We will first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. After that we will incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We will observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We will also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.

  3. Open timelike curves violate Heisenberg's uncertainty principle.

    PubMed

    Pienaar, J L; Ralph, T C; Myers, C R

    2013-02-08

    Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.

  4. Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?

    ERIC Educational Resources Information Center

    Robertson, Bill

    2016-01-01

    Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…

  5. The uncertainty principle determines the nonlocality of quantum mechanics.

    PubMed

    Oppenheim, Jonathan; Wehner, Stephanie

    2010-11-19

    Two central concepts of quantum mechanics are Heisenberg's uncertainty principle and a subtle form of nonlocality that Einstein famously called "spooky action at a distance." These two fundamental features have thus far been distinct concepts. We show that they are inextricably and quantitatively linked: Quantum mechanics cannot be more nonlocal with measurements that respect the uncertainty principle. In fact, the link between uncertainty and nonlocality holds for all physical theories. More specifically, the degree of nonlocality of any theory is determined by two factors: the strength of the uncertainty principle and the strength of a property called "steering," which determines which states can be prepared at one location given a measurement at another.

  6. Quantum theory of the generalised uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bruneton, Jean-Philippe; Larena, Julien

    2017-04-01

We extend significantly previous works on the Hilbert space representations of the generalized uncertainty principle (GUP) in 3+1 dimensions of the form [X_i, P_j] = iF_{ij}, where F_{ij} = f(P²)δ_{ij} + g(P²)P_iP_j for arbitrary functions f and g; however, we restrict our study to the case of commuting X's. We focus in particular on the symmetries of the theory, and on the minimal length that emerges in some cases. We first show that, at the algebraic level, there exists an unambiguous mapping between the GUP with a deformed quantum algebra and a quadratic Hamiltonian, and a standard Heisenberg algebra of operators with an aquadratic Hamiltonian, provided the boost sector of the symmetries is modified accordingly. The theory can also be mapped to a completely standard quantum mechanics with standard symmetries, but with momentum-dependent position operators. Next, we investigate the Hilbert space representations of these algebraically equivalent models, and focus specifically on whether they exhibit a minimal length. We carry out the functional analysis of the various operators involved, and show that the appearance of a minimal length critically depends on the relationship between the generators of translations and the physical momenta. In particular, because this relationship is preserved by the algebraic mapping presented in this paper, when a minimal length is present in the standard GUP, it is also present in the corresponding aquadratic Hamiltonian formulation, despite the perfectly standard algebra of this model. In general, a minimal length requires bounded generators of translations, i.e. a specific kind of quantization of space, and this depends on the precise shape of the function f defined previously. This result provides an elegant and unambiguous classification of which universal quantum gravity corrections lead to the emergence of a minimal length.

  7. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  8. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical showing, e.g., a direct proportionality for the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.

  9. Entanglement, Identical Particles and the Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Rigolin, Gustavo

    2016-08-01

    A new uncertainty relation (UR) is obtained for a system of N identical pure entangled particles if we use symmetrized observables when deriving the inequality. This new expression can be written in a form where we identify a term which explicitly shows the quantum correlations among the particles that constitute the system. For the particular cases of two and three particles, making use of the Schwarz inequality, we obtain new lower bounds for the UR that are different from the standard one.

  10. Risks, scientific uncertainty and the approach of applying precautionary principle.

    PubMed

    Lo, Chang-fa

    2009-03-01

The paper intends to clarify the nature and aspects of risks and scientific uncertainty, and to elaborate an approach for applying the precautionary principle in order to handle risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both in situations where an international treaty has admitted the precautionary principle and in situations where there is no international treaty admitting the precautionary principle or enumerating the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of the measure to cope with the potential risk and to avoid excessive measures.

  11. An uncertainty principle underlying the functional architecture of V1.

    PubMed

    Barbieri, Davide; Citti, Giovanna; Sanguinetti, Gonzalo; Sarti, Alessandro

    2012-01-01

    We present a model of the morphology of orientation maps in V1 based on the uncertainty principle of the SE(2) group. Starting from the symmetries of the cortex, suitable harmonic analysis instruments are used to obtain coherent states in the Fourier domain as minimizers of the uncertainty. Cortical activities related to orientation maps are then obtained by projection on a suitable cortical Fourier basis.

  12. The Uncertainty Principle, Virtual Particles and Real Forces

    ERIC Educational Resources Information Center

    Jones, Goronwy Tudor

    2002-01-01

    This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…

  13. Single-Slit Diffraction and the Uncertainty Principle

    ERIC Educational Resources Information Center

    Rioux, Frank

    2005-01-01

A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
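
A minimal numerical companion to this record, assuming the standard rect-aperture model (the slit width and grid below are invented): the FFT of the aperture gives the sinc² diffraction pattern, and the width to the first minimum reproduces the textbook Δx·Δp ≈ 2πℏ estimate.

```python
import numpy as np

# Single-slit diffraction as a coordinate/momentum Fourier pair (hbar = 1).
# Grid size and slit width a are illustrative choices.

N, Lbox = 2**14, 200.0
x = np.linspace(-Lbox / 2, Lbox / 2, N, endpoint=False)
dx = x[1] - x[0]
a = 2.0                                          # slit width (Delta x ~ a)

psi = np.where(np.abs(x) <= a / 2, 1.0, 0.0)     # uniform aperture function
psi = psi / (np.linalg.norm(psi) * np.sqrt(dx))  # normalize: integral |psi|^2 dx = 1

# Momentum-space amplitude via FFT; |phi|^2 is the sinc^2 pattern
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
prob_p = np.abs(phi) ** 2

p_zero = 2 * np.pi / a                           # first diffraction minimum
i0 = np.argmin(np.abs(p - p_zero))
print("slit width * first-zero momentum =", a * p_zero, "(= 2*pi, hbar = 1)")
print("relative intensity at predicted first zero:", prob_p[i0] / prob_p.max())
```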

  14. Heisenberg uncertainty principles for an oscillatory integral operator

    NASA Astrophysics Data System (ADS)

    Castro, L. P.; Guerra, R. C.; Tuan, N. M.

    2017-01-01

    The main aim of this work is to obtain Heisenberg uncertainty principles for a specific oscillatory integral operator which representatively exhibits different parameters on their sine and cosine phase components. Additionally, invertibility theorems, Parseval type identities and Plancherel type theorems are also obtained.

  15. Relativistic Particle in Electromagnetic Fields with a Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Merad, M.; Zeroual, F.; Falek, M.

    2012-05-01

    In this paper, we propose to solve the relativistic Klein-Gordon and Dirac equations subjected to the action of a uniform electromagnetic field with a generalized uncertainty principle in the momentum space. In both cases, the energy eigenvalues and their corresponding eigenfunctions are obtained. The limit case is then deduced for a small parameter of deformation.

  16. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  17. The 'Herbivory Uncertainty Principle': application in a cerrado site.

    PubMed

    Gadotti, C A; Batalha, M A

    2010-05-01

Researchers may alter the ecology of their studied organisms, even carrying out apparently beneficial activities, as in herbivory studies, when they may alter herbivory damage. We tested whether visit frequency altered herbivory damage, as predicted by the 'Herbivory Uncertainty Principle'. In a cerrado site, we established 80 quadrats, in which we sampled all woody individuals. We used four visit frequencies (high, medium, low, and control), quantifying, at the end of three months, herbivory damage for each species in each treatment. We did not corroborate the 'Herbivory Uncertainty Principle', since visiting frequency did not alter herbivory damage, at least when the whole plant community was taken into account. However, when we analysed each species separately, four out of 11 species presented significant differences in herbivory damage, suggesting that researchers are not independent of their measurements. The principle could be tested in other ecological studies in which it may occur, such as those on animal behaviour, human ecology, population dynamics, and conservation.

  18. “Stringy” coherent states inspired by generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Ghosh, Subir; Roy, Pinaki

    2012-05-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.

  19. Entropic Law of Force, Emergent Gravity and the Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Santos, M. A.; Vancea, I. V.

The entropic formulation of inertia and gravity relies on quantum, geometrical and informational arguments. The fact that the results are completely classical is misleading. In this paper, we argue that the entropic formulation provides new insights into the quantum nature of inertia and gravity. We use the entropic postulate to determine the quantum uncertainty in the law of inertia and in the law of gravity in Newtonian mechanics, special relativity and general relativity. These results are obtained by considering the most general quantum property of matter, represented by the uncertainty principle, and by postulating an expression for the uncertainty of the entropy such that: (i) it is the simplest quantum generalization of the postulate of the variation of the entropy and (ii) it reduces to the variation of the entropy in the absence of the uncertainty.

  20. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
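
The scalar example can be reproduced in a few lines. A sketch under the usual i.i.d.-parameter assumptions, with invented statistics; the convergence condition E[a²] − E[ab]²/E[b²] < 1 is the threshold in question:

```python
# Uncertainty threshold principle, scalar LQ sketch:
# dynamics x' = a x + b u with random a, b; stage cost x^2 + u^2.
# Value iteration for V(x) = K x^2; K converges only while
# E[a^2] - E[ab]^2 / E[b^2] < 1.

def iterate_K(Ea2, Eab, Eb2, steps=200):
    """Backward dynamic-programming recursion for the cost coefficient K."""
    K = 1.0
    for _ in range(steps):
        K = 1.0 + K * Ea2 - (K * Eab) ** 2 / (1.0 + K * Eb2)
    return K

for var_a in (0.09, 1.0, 2.25):      # increasing parameter uncertainty
    Ea2 = 1.0 + var_a                # E[a] = 1, var(a) = var_a
    Eab = 1.0                        # a, b independent, E[b] = 1
    Eb2 = 1.0 + 0.25                 # var(b) = 0.25
    stat = Ea2 - Eab**2 / Eb2        # threshold statistic (must be < 1)
    print(f"var(a)={var_a:<5} threshold stat={stat:.3f} "
          f"K after 200 steps={iterate_K(Ea2, Eab, Eb2):.3e}")
```

With these numbers, the first case converges to a finite K while the other two grow without bound, which is exactly the threshold behavior the note describes.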

  1. Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Oppenheim, Jacob N.; Magnasco, Marcelo O.

    2013-01-01

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.

  2. Human time-frequency acuity beats the Fourier uncertainty principle.

    PubMed

    Oppenheim, Jacob N; Magnasco, Marcelo O

    2013-01-25

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4 π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple "linear filter" models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
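
A quick numerical restatement of the bound tested in the two records above: for a Gaussian pulse, the product of the temporal and spectral standard deviations equals 1/(4π) exactly. The grid and pulse scale below are arbitrary choices.

```python
import numpy as np

# Verify Delta t * Delta f = 1/(4*pi) for a Gaussian pulse, where the
# widths are standard deviations of |s(t)|^2 and |S(f)|^2.

N, T = 2**16, 400.0
t = np.linspace(-T / 2, T / 2, N, endpoint=False)
dt = t[1] - t[0]

tau = 3.0                                # Gaussian envelope scale (arbitrary)
s = np.exp(-t**2 / (2 * tau**2))

S = np.fft.fftshift(np.fft.fft(s)) * dt  # spectrum (phase irrelevant below)
f = np.fft.fftshift(np.fft.fftfreq(N, d=dt))

def spread(axis, density):
    """Standard deviation of a density sampled on a uniform grid."""
    prob = density / density.sum()
    mean = (axis * prob).sum()
    return np.sqrt((((axis - mean) ** 2) * prob).sum())

dt_eff = spread(t, np.abs(s) ** 2)
df_eff = spread(f, np.abs(S) ** 2)
print("Delta t * Delta f =", dt_eff * df_eff)
print("1/(4*pi)          =", 1 / (4 * np.pi))
```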

  3. Remarks on Generalized Uncertainty Principle Induced from Constraint System

    NASA Astrophysics Data System (ADS)

    Eune, Myungseok; Kim, Wontae

    2014-12-01

The extended commutation relations for the generalized uncertainty principle (GUP) have been based on the assumption of a minimal length in position. Instead of this assumption, we start with a constrained Hamiltonian system described by the conventional Poisson algebra and then impose appropriate second-class constraints on this system. Consequently, we can show that the consistent Dirac brackets for this system are nothing but the extended commutation relations describing the GUP.

  4. Uncertainty principle for Gabor systems and the Zak transform

    SciTech Connect

    Czaja, Wojciech; Zienkiewicz, Jacek

    2006-12-15

We show that if g ∈ L²(ℝ) is a generator of a Gabor orthonormal basis with the lattice ℤ×ℤ, then its Zak transform Z(g) satisfies ∇Z(g) ∉ L²([0,1)²). This is a generalization and extension of the Balian-Low uncertainty principle.
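
The classical statement being generalized is the Balian-Low theorem; in its usual form:

```latex
% Balian–Low theorem: if the Gabor system
% \{ e^{2\pi i m x} g(x-n) : m, n \in \mathbb{Z} \}
% is an orthonormal basis of L^2(\mathbb{R}), then
\left( \int_{\mathbb{R}} |x\,g(x)|^2 \, dx \right)
\left( \int_{\mathbb{R}} |\xi\,\hat g(\xi)|^2 \, d\xi \right) = \infty
% i.e. the generator g cannot be well localized in time and frequency
% simultaneously.
```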

  5. The uncertainty principle - A simplified review of the four versions

    NASA Astrophysics Data System (ADS)

    Jijnasu, Vasudeva

    2016-08-01

The complexity of the historical confusions around different versions of the uncertainty principle, in addition to the increasing technicality of physics in general, has made its affairs predominantly accessible only to specialists. Consequently, the clarity that has dawned upon physicists over the decades regarding quantum uncertainty remains mostly imperceptible to general readers, students, philosophers and even non-expert scientists. In an attempt to weaken this barrier, the article presents a summary of this technical subject, focusing on the prime case of the position-momentum pair, as modestly and informatively as possible. This includes a crisp analysis of the historical as well as the latest developments. In the process, the article provides arguments to show that the usually sidelined version of uncertainty, the intrinsic 'unsharpness' or 'indeterminacy', forms the basis for all the other three versions, and subsequently presents its hard philosophical implications.

  6. Generalized uncertainty principle: implications for black hole complementarity

    NASA Astrophysics Data System (ADS)

    Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han

    2014-12-01

At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg uncertainty principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large-N rescaling if one assumes only Heisenberg's uncertainty principle, the application of the GUP may save complementarity, but only if a certain N-dependence is also assumed. This raises two important questions beyond the scope of this work, i.e., whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.

  7. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.

  8. The Precautionary Principle and statistical approaches to uncertainty.

    PubMed

    Keiding, Niels; Budtz-Jørgensen, Esben

    2004-01-01

    The central challenge from the Precautionary Principle to statistical methodology is to help delineate (preferably quantitatively) the possibility that some exposure is hazardous, even in cases where this is not established beyond reasonable doubt. The classical approach to hypothesis testing is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model uncertainty: usually these procedures assume that the class of models describing dose/response is known with certainty; this assumption is, however, often violated, perhaps particularly often when epidemiological data form the source of the risk assessment, and regulatory authorities have occasionally resorted to some average based on competing models. The recent methodology of the Bayesian model averaging might be a systematic version of this, but is this an arena for the Precautionary Principle to come into play?

  9. Quantum black hole and the modified uncertainty principle

    NASA Astrophysics Data System (ADS)

    Majumder, Barun

    2011-07-01

    Recently Ali et al. (2009) [13] proposed a Generalized Uncertainty Principle (or GUP) with a linear term in momentum (accompanied by Planck length). Inspired by this idea we examine the Wheeler-DeWitt equation for a Schwarzschild black hole with a modified Heisenberg algebra which has a linear term in momentum. We found that the leading contribution to mass comes from the square root of the quantum number n which coincides with Bekenstein's proposal. We also found that the mass of the black hole is directly proportional to the quantum number n when quantum gravity effects are taken into consideration via the modified uncertainty relation but it reduces the value of mass for a particular value of the quantum number.

  10. Nondivergent classical response functions from uncertainty principle: quasiperiodic systems.

    PubMed

    Kryvohuz, Maksym; Cao, Jianshu

    2005-01-08

Time-divergence in linear and nonlinear classical response functions can be removed by taking a phase-space average within the quantized uncertainty volume O(ℏⁿ) around the microcanonical energy surface. For a quasiperiodic system, the replacement of the microcanonical distribution density in the classical response function with the quantized uniform distribution density results in agreement of quantum and classical expressions through Heisenberg's correspondence principle: each matrix element ⟨u|α(t)|v⟩ corresponds to the (u−v)th Fourier component of α(t) evaluated along the classical trajectory with mean action (J_u + J_v)/2. Numerical calculations for one- and two-dimensional systems show good agreement between quantum and classical results. The generalization to the case of N degrees of freedom is made. Thus, phase-space averaging within the quantized uncertainty volume provides a useful way to establish the classical-quantum correspondence for the linear and nonlinear response functions of a quasiperiodic system.

  11. Nonequilibrium fluctuation-dissipation inequality and nonequilibrium uncertainty principle.

    PubMed

    Fleming, C H; Hu, B L; Roura, Albert

    2013-07-01

    The fluctuation-dissipation relation is usually formulated for a system interacting with a heat bath at finite temperature, and often in the context of linear response theory, where only small deviations from the mean are considered. We show that for an open quantum system interacting with a nonequilibrium environment, where temperature is no longer a valid notion, a fluctuation-dissipation inequality exists. Instead of being proportional, quantum fluctuations are bounded below by quantum dissipation, whereas classically the fluctuations vanish at zero temperature. The lower bound of this inequality is exactly satisfied by (zero-temperature) quantum noise and is in accord with the Heisenberg uncertainty principle, in both its microscopic origins and its influence upon systems. Moreover, it is shown that there is a coupling-dependent nonequilibrium fluctuation-dissipation relation that determines the nonequilibrium uncertainty relation of linear systems in the weak-damping limit.

  12. Factorization in the quantum mechanics with the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang

    2015-07-01

In this paper, we discuss the quantum mechanics with the generalized uncertainty principle (GUP) where the commutation relation is given by [x̂, p̂] = iℏ(1 + αp̂ + βp̂²). For this algebra, we obtain the eigenfunction of the momentum operator. We also study the GUP corrected quantum particle in a box. Finally, we apply the factorization method to the harmonic oscillator in the presence of a minimal observable length and obtain the energy eigenvalues by applying the perturbation method.
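
A sketch of the perturbative statement in this record, assuming the common minimal-length representation in which the deformed algebra (taking α = 0) adds a (β/3m)p₀⁴ term to the Hamiltonian; coefficient conventions vary between papers. The check compares first-order perturbation theory with exact diagonalization in a truncated Fock basis (ℏ = m = ω = 1):

```python
import numpy as np

# GUP-corrected harmonic oscillator: H = H0 + (beta/3) p0^4 with canonical
# p0 (a common minimal-length representation; conventions differ by paper).
# beta and the basis size are illustrative choices.

N, beta = 120, 1e-3
n = np.arange(N)

a = np.diag(np.sqrt(n[1:]), k=1)          # annihilation operator in Fock basis
p0 = 1j * np.sqrt(0.5) * (a.T - a)        # canonical momentum (Hermitian)
H = np.diag(n + 0.5) + (beta / 3.0) * np.linalg.matrix_power(p0, 4).real

E = np.linalg.eigvalsh(H)                 # exact (numerically) in truncated basis

# First-order perturbation theory, using <n| p0^4 |n> = (1/4)(6n^2 + 6n + 3)
E_pert = n + 0.5 + (beta / 3.0) * 0.25 * (6 * n**2 + 6 * n + 3)

for k in range(4):
    print(f"n={k}: diagonalized={E[k]:.6f}  first-order={E_pert[k]:.6f}")
```

For small β the two columns agree to several decimals, which is the perturbative regime the abstract refers to.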

  13. Generalized uncertainty principle in Bianchi type I quantum cosmology

    NASA Astrophysics Data System (ADS)

    Vakili, B.; Sepangi, H. R.

    2007-07-01

We study a quantum Bianchi type I model in which the dynamical variables of the corresponding minisuperspace obey the generalized Heisenberg algebra. Such a generalized uncertainty principle has its origin in the existence of a minimal length suggested by quantum gravity and string theory. We present approximate analytical solutions to the corresponding Wheeler-DeWitt equation in the limit where the scale factor of the universe is small and compare the results with the standard commutative and noncommutative quantum cosmology. Similarities and differences of these solutions are also discussed.

  14. Uncertainty, imprecision, and the precautionary principle in climate change assessment.

    PubMed

    Borsuk, M E; Tomassini, L

    2005-01-01

    Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
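
The decision rule discussed here is easy to illustrate. A toy sketch with an invented ambiguity set of three damage distributions and an invented cost table (none of it from the paper): each action receives an interval of expected costs, and the precautionary criterion minimizes the upper endpoint.

```python
import numpy as np

# Imprecise-probability decision sketch: a set of candidate distributions
# replaces a single distribution, so each action gets lower and upper
# expected costs. All numbers are invented for illustration.

# P(outcome | model); outcomes: 0 = mild, 1 = moderate, 2 = severe warming
ambiguity_set = np.array([
    [0.6, 0.3, 0.1],   # optimistic model
    [0.4, 0.4, 0.2],   # central model
    [0.2, 0.4, 0.4],   # pessimistic model
])

# cost[action, outcome]; actions: abate heavily, abate moderately, do nothing
cost = np.array([
    [3.0, 3.0, 3.0],
    [1.5, 2.5, 5.0],
    [0.5, 2.0, 9.0],
])

expected = cost @ ambiguity_set.T            # expected cost per (action, model)
lower, upper = expected.min(axis=1), expected.max(axis=1)

for i, (lo, up) in enumerate(zip(lower, upper)):
    print(f"action {i}: expected cost in [{lo:.2f}, {up:.2f}]")
print("precautionary choice (minimum upper expectation): action",
      int(np.argmin(upper)))
```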

  15. Effects of the generalised uncertainty principle on quantum tunnelling

    NASA Astrophysics Data System (ADS)

    Blado, Gardo; Prescott, Trevor; Jennings, James; Ceyanes, Joshuah; Sepulveda, Rafael

    2016-03-01

    In a previous paper (Blado et al 2014 Eur. J. Phys. 35 065011), we showed that quantum gravity effects can be discussed with only a background in non-relativistic quantum mechanics at the undergraduate level by looking at the effect of the generalised uncertainty principle (GUP) on the finite and infinite square wells. In this paper, we derive the GUP corrections to the tunnelling probability of simple quantum mechanical systems which are accessible to undergraduates (alpha decay, simple models of quantum cosmogenesis and gravitational tunnelling radiation) and which employ the WKB approximation, a topic discussed in undergraduate quantum mechanics classes. It is shown that the GUP correction increases the tunnelling probability in each of the examples discussed.
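
For reference, the uncorrected WKB transmission probability that the paper's GUP terms modify is:

```latex
% WKB tunnelling probability through a barrier V(x) > E between the
% classical turning points x_1 and x_2:
T \;\approx\; \exp\!\left( -\frac{2}{\hbar} \int_{x_1}^{x_2}
\sqrt{2m\,\bigl(V(x) - E\bigr)}\; dx \right)
% The paper's result: the GUP-corrected integrand increases T in each
% of the examples considered.
```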

  16. Generalized uncertainty principle and analogue of quantum gravity in optics

    NASA Astrophysics Data System (ADS)

    Braidotti, Maria Chiara; Musslimani, Ziad H.; Conti, Claudio

    2017-01-01

The design of optical systems capable of processing and manipulating ultra-short pulses and ultra-focused beams is highly challenging, with far-reaching fundamental and technological applications. One key obstacle routinely encountered while implementing sub-wavelength optical schemes is how to overcome the limitations set by standard Fourier optics. A strategy to overcome these difficulties is to utilize the concept of a generalized uncertainty principle (G-UP), which was originally developed to study quantum gravity. In this paper we propose to use the concept of the G-UP within the framework of optics to show that the generalized Schrödinger equation describing short pulses and ultra-focused beams predicts the existence of a minimal spatial or temporal scale, which in turn implies the existence of maximally localized states. Using a Gaussian wavepacket with complex phase, we derive the corresponding generalized uncertainty relation and its maximally localized states. Furthermore, we numerically show that the presence of nonlinearity helps the system to reach its maximal localization. Our results may trigger further theoretical and experimental tests for practical applications and analogues of fundamental physical theories.

  17. Molecular Response Theory in Terms of the Uncertainty Principle.

    PubMed

    Harde, Hermann; Grischkowsky, Daniel

    2015-08-27

We investigate the time response of molecular transitions by observing the pulse reshaping of femtosecond THz pulses propagating through polar vapors. By precisely modeling the pulse interaction with the molecular vapors, we derive detailed insight into this time response after an excitation. The measurements, which were performed by applying the powerful technique of THz time-domain spectroscopy, are analyzed directly in the time domain or, in parallel, in the frequency domain by Fourier transforming the pulses and comparing them with the molecular response theory. New analyses of the molecular response allow a generalized unification of the basic collision and line-shape theories of Lorentz, van Vleck-Weisskopf, and Debye described by molecular response theory. In addition, they show that the applied THz experimental setup allows the direct observation of the ultimate time response of molecules to an externally applied electric field in the presence of molecular collisions. This response is limited by the uncertainty principle and is determined by the inverse splitting frequency between adjacent levels. At the same time, this response reflects the transition time of a rotational transition to switch from one molecular state to another or to form a coherent superposition of states oscillating with the splitting frequency. The presented investigations are also of fundamental importance for the description of the far-wing absorption of greenhouse gases like water vapor, carbon dioxide, or methane, which have a dominant influence on the radiative exchange in the far-infrared.

  18. Effect of the Generalized Uncertainty Principle on post-inflation preheating

    SciTech Connect

    Chemissany, Wissam; Das, Saurya; Ali, Ahmed Farag; Vagenas, Elias C.

    2011-12-01

    We examine effects of the Generalized Uncertainty Principle, predicted by various theories of quantum gravity to replace the Heisenberg's uncertainty principle near the Planck scale, on post inflation preheating in cosmology, and show that it can predict either an increase or a decrease in parametric resonance and a corresponding change in particle production. Possible implications are considered.

  19. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used a single-slit diffraction of a laser beam for measuring the angular width of zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We will assume that the uncertainty in position is the slit width. For the…

  20. Robertson-Schrödinger-type formulation of Ozawa's noise-disturbance uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

    2014-04-01

    In this work we derive a matrix formulation of a noise-disturbance uncertainty relation, which is akin to the Robertson-Schrödinger uncertainty principle. Our inequality is stronger than Ozawa's uncertainty principle and takes noise-disturbance correlations into account. Moreover, we show that for certain types of measurement interactions it is covariant with respect to linear symplectic transformations of the noise and disturbance operators. Finally, we also study the tightness of our matrix inequality.

  1. On different types of uncertainties in the context of the precautionary principle.

    PubMed

    Aven, Terje

    2011-10-01

    Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean.

  2. Generalized uncertainty principle as a consequence of the effective field theory

    NASA Astrophysics Data System (ADS)

    Faizal, Mir; Ali, Ahmed Farag; Nassar, Ali

    2017-02-01

    We will demonstrate that the generalized uncertainty principle exists because of the derivative expansion in effective field theories. This is because, in the framework of effective field theories, the minimum measurable length scale has to be integrated away to obtain the low-energy effective action. We will analyze the deformation of a massive free scalar field theory by the generalized uncertainty principle, and demonstrate that the minimum measurable length scale corresponds to a second, more massive scale in the theory, which has been integrated away. We will also analyze CFT operators dual to this deformed scalar field theory, and observe that the scaling of the new CFT operators indicates that they are dual to this more massive scale in the theory. We will use holographic renormalization to explicitly calculate the renormalized boundary action with counterterms for this scalar field theory deformed by the generalized uncertainty principle, and show that the generalized uncertainty principle contributes to the matter conformal anomaly.

  3. The precautionary principle in times of intermingled uncertainty and risk: some regulatory complexities.

    PubMed

    van Asselt, M B A; Vos, E

    2005-01-01

    This article explores the use of the precautionary principle in situations of intermingled uncertainty and risk. It analyses how the so-called uncertainty paradox works out by examining the Pfizer case. It reveals regulatory complexities that result from contradictions in precautionary thinking. In conclusion, a plea is made for embedment of uncertainty information, while stressing the need to rethink regulatory reform in the broader sense.

  4. Path Integral for Dirac oscillator with generalized uncertainty principle

    SciTech Connect

    Benzair, H.; Boudjedaa, T.; Merad, M.

    2012-12-15

    The propagator for the Dirac oscillator in (1+1) dimensions, with the Heisenberg commutation relation deformed, is calculated using a path integral in the four-momentum representation. Since the mass is related to the momentum, we adapt the space-time transformation method to evaluate the quantum corrections, which depend on the point-discretization interval.

  5. Uncertainty Principle--Limited Experiments: Fact or Academic Pipe-Dream?

    ERIC Educational Resources Information Center

    Albergotti, J. Clifton

    1973-01-01

    The question of whether modern experiments are limited by the uncertainty principle or by the instruments used to perform the experiments is discussed. Several key experiments show that the instruments limit our knowledge and the principle remains of strictly academic concern. (DF)

  6. Uncertainty principle for experimental measurements: Fast versus slow probes

    PubMed Central

    Hansmann, P.; Ayral, T.; Tejeda, A.; Biermann, S.

    2016-01-01

    The result of a physical measurement depends on the time scale of the experimental probe. In solid-state systems, this simple quantum mechanical principle has far-reaching consequences: the interplay of several degrees of freedom close to charge, spin or orbital instabilities, combined with the disparity of the time scales associated with their fluctuations, can lead to seemingly contradictory experimental findings. A particularly striking example is provided by systems of adatoms adsorbed on semiconductor surfaces, where different experiments – angle-resolved photoemission, scanning tunneling microscopy and core-level spectroscopy – suggest different ordering phenomena. Using the most recent first-principles many-body techniques, we resolve this puzzle by invoking the time scales of fluctuations when approaching the different instabilities. These findings suggest a re-interpretation of ordering phenomena and their fluctuations in a wide class of solid-state systems ranging from organic materials to high-temperature superconducting cuprates. PMID:26829902

  8. Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics students' depictions

    NASA Astrophysics Data System (ADS)

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-12-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize a picture of students' descriptions of these key quantum concepts. Data for this study were obtained from semistructured in-depth interviews conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students' depictions of the concept wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students' depictions of the concept uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum mechanical uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, a few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction, which cannot be explained within the framework of classical physics, for depicting the wavelike properties of quantum entities. Despite inhospitable conceptions of the uncertainty principle and wave- and particlelike properties of quantum entities in our investigation, the findings presented in this paper are highly consistent with those reported in previous studies. New findings and some implications for instruction and the…

  9. Nonlinear Schrödinger equation from generalized exact uncertainty principle

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz

    2016-09-01

    Inspired by the generalized uncertainty principle, which adds gravitational effects to the standard description of quantum uncertainty, we extend the exact uncertainty principle approach by Hall and Reginatto (2002 J. Phys. A: Math. Gen. 35 3289), and obtain a (quasi)nonlinear Schrödinger equation. This quantum evolution equation of unusual form, enjoys several desired properties like separation of non-interacting subsystems or plane-wave solutions for free particles. Starting with the harmonic oscillator example, we show that every solution of this equation respects the gravitationally induced minimal position uncertainty proportional to the Planck length. Quite surprisingly, our result successfully merges the core of classical physics with non-relativistic quantum mechanics in its extremal form. We predict that the commonly accepted phenomenon, namely a modification of a free-particle dispersion relation due to quantum gravity might not occur in reality.

  10. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
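
    The scalar example can be sketched numerically (a minimal sketch with assumed parameter values, not the authors' derivation): iterating the stochastic Riccati recursion for the cost-to-go shows convergence only while a quantifiable measure of parameter uncertainty stays below one.

      import numpy as np

      def cost_to_go(Ea2, Eab, Eb2, q=1.0, r=1.0, steps=300):
          # backward iteration of the stochastic Riccati recursion for
          # x_{k+1} = a*x_k + b*u_k with random a, b:
          # s <- q + s*E[a^2] - s^2*E[ab]^2 / (r + s*E[b^2])
          s = 0.0
          for _ in range(steps):
              s = q + s * Ea2 - (s * Eab) ** 2 / (r + s * Eb2)
          return s

      a_mean, b_mean, var_b = 1.0, 1.0, 0.0
      for var_a in (0.1, 0.5, 1.2):
          Ea2 = a_mean**2 + var_a
          Eab = a_mean * b_mean          # a and b assumed uncorrelated
          Eb2 = b_mean**2 + var_b
          m = Ea2 - Eab**2 / Eb2         # uncertainty threshold quantity
          print(f"var_a={var_a}: m={m:.2f} ->",
                "converges" if m < 1.0 else "diverges",
                f"(s after 300 steps: {cost_to_go(Ea2, Eab, Eb2):.3g})")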

  11. Zero-point energies, the uncertainty principle, and positivity of the quantum Brownian density operator.

    PubMed

    Tameshtit, Allan

    2012-04-01

    High-temperature and white-noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white-noise approximation is avoided, it is shown that if the zero-point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle.

  12. Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

    NASA Technical Reports Server (NTRS)

    Terazawa, Hidezumi

    1996-01-01

    The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

  13. The uncertainty principle enables non-classical dynamics in an interferometer.

    PubMed

    Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko

    2014-08-08

    The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics.

  14. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    PubMed

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action; uncertainty requires action; and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which the PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  15. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
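
    As a minimal illustration of the kind of combination the standard describes (assumed numbers, not the paper's radiometry examples), a systematic estimate B and a random component S can be combined by either an additive or a root-sum-square model:

      import math

      B = 0.30      # systematic (bias) uncertainty estimate, assumed units
      S = 0.12      # standard deviation of the mean from repeated readings
      t95 = 2.0     # Student-t coverage factor for ~95 %, large sample assumed

      U_add = B + t95 * S                      # additive model (U_ADD)
      U_rss = math.sqrt(B**2 + (t95 * S)**2)   # root-sum-square model (U_RSS)
      print(f"U_ADD = {U_add:.3f}, U_RSS = {U_rss:.3f}")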

  17. Generalized uncertainty principle corrections to the simple harmonic oscillator in phase space

    NASA Astrophysics Data System (ADS)

    Das, Saurya; Robbins, Matthew P. G.; Walton, Mark A.

    2016-01-01

    We compute Wigner functions for the harmonic oscillator including corrections from generalized uncertainty principles (GUPs), and study the corresponding marginal probability densities and other properties. We show that the GUP corrections to the Wigner functions can be significant, and comment on their potential measurability in the laboratory.
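
    As a baseline for the objects discussed above, a minimal sketch (assuming hbar = m = omega = 1, and omitting the GUP corrections themselves) constructs the uncorrected ground-state Wigner function and checks its position marginal numerically:

      import numpy as np

      x = np.linspace(-5, 5, 201)
      p = np.linspace(-5, 5, 201)
      X, P = np.meshgrid(x, p, indexing="ij")

      W = np.exp(-(X**2 + P**2)) / np.pi      # ground-state Wigner function
      dx, dp = x[1] - x[0], p[1] - p[0]

      rho_x = W.sum(axis=1) * dp              # position marginal, equals |psi_0(x)|^2
      norm = rho_x.sum() * dx
      var_x = (x**2 * rho_x).sum() * dx / norm
      print(f"norm = {norm:.4f} (exact 1), <x^2> = {var_x:.4f} (exact 0.5)")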

  18. Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions

    ERIC Educational Resources Information Center

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-01-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

  19. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    SciTech Connect

    Tawfik, A.

    2013-07-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by some approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As evaporation consumes the entire black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating completely: it implies the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves this complete-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without GUP is not negligible.
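
    The remnant mechanism can be sketched in Planck units; the quadratic GUP form, the O(1) parameter, and the temperature normalization below are illustrative assumptions rather than the paper's linear-GUP result.

      import numpy as np

      beta = 1.0                                 # GUP parameter, assumed O(1)
      M = np.linspace(np.sqrt(beta), 10.0, 500)  # remnant mass sits at M = sqrt(beta)
      dx = 2.0 * M                               # position uncertainty ~ horizon radius

      # saturate dx*dp = 1 + beta*dp^2 and take the branch recovering dp = 1/dx
      dp = (dx - np.sqrt(dx**2 - 4.0 * beta)) / (2.0 * beta)
      T_gup = dp / (4.0 * np.pi)                 # normalized so beta -> 0 gives 1/(8*pi*M)
      T_heisenberg = 1.0 / (8.0 * np.pi * M)

      print(f"T at remnant: {T_gup[0]:.4f} (finite); "
            f"Heisenberg-only value there: {T_heisenberg[0]:.4f}")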

  20. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    ERIC Educational Resources Information Center

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…

  1. Integrating Leonardo da Vinci's principles of demonstration, uncertainty, and cultivation in contemporary nursing education.

    PubMed

    Story, Lachel; Butts, Janie

    2014-03-01

    Nurses today are facing an ever-changing health care system. Stimulated by health care reform and limited resources, nursing education is being challenged to prepare nurses for this uncertain environment. Looking to the past can offer possible solutions to the issues nursing education is confronting. Seven principles of da Vincian thinking have been identified (Gelb, 2004). As a follow-up to an exploration of the curiosità principle (Butts & Story, 2013), this article explores the three principles of dimostrazione, sfumato, and corporalita. Nursing faculty can set the stage for a meaningful educational experience through these principles of demonstration (dimostrazione), uncertainty (sfumato), and cultivation (corporalita). Preparing nurses not only to manage but also to flourish in the current health care environment will enhance both the nurse's and the patient's experience.

  2. The most general form of deformation of the Heisenberg algebra from the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Masood, Syed; Faizal, Mir; Zaz, Zaid; Ali, Ahmed Farag; Raza, Jamil; Shah, Mushtaq B.

    2016-12-01

    In this paper, we propose the most general form of the deformation of the Heisenberg algebra motivated by the generalized uncertainty principle. This deformation of the Heisenberg algebra will deform all quantum mechanical systems. The form of the generalized uncertainty principle used to motivate these results is itself motivated by space fractional quantum mechanics and non-locality in quantum mechanical systems. We also analyse a specific limit of this generalized deformation for a one-dimensional system; in that limit, a nonlocal deformation of the momentum operator generates a local deformation of all one-dimensional quantum mechanical systems. We analyse the low-energy effects of this deformation on a harmonic oscillator, Landau levels, the Lamb shift, and a potential barrier. We also demonstrate that this deformation leads to a discretization of space.

  3. Recent Advances in Compressed Sensing: Discrete Uncertainty Principles and Fast Hyperspectral Imaging

    DTIC Science & Technology

    2015-03-26

    …medical imaging, e.g., magnetic resonance imaging (MRI). Since the early 1980s, MRI has granted doctors the ability to distinguish between healthy tissue… chemical composition of a star. Conventional hyperspectral cameras are slow. Different methods of hyperspectral imaging either require time to process… (THESIS, March 2015, Megan E. Lewis, Second…)

  4. Genome wide expression profiling of angiogenic signaling and the Heisenberg uncertainty principle.

    PubMed

    Huber, Peter E; Hauser, Kai; Abdollahi, Amir

    2004-11-01

    Genome-wide DNA expression profiling, coupled with antibody array experiments using endostatin to probe the angiogenic signaling network in human endothelial cells, was performed. The results reveal constraints on the measuring process that are of a similar kind to those implied by the uncertainty principle of quantum mechanics as described by Werner Heisenberg. We describe this analogy and argue for its heuristic utility in the conceptualization of angiogenesis as an important step in tumor formation.

  5. Energy distribution of massless particles on black hole backgrounds with generalized uncertainty principle

    SciTech Connect

    Li Zhongheng

    2009-10-15

    We derive new formulas for the spectral energy density and total energy density of massless particles in a general spherically symmetric static metric from a generalized uncertainty principle. Compared with blackbody radiation, the spectral energy density is strongly damped at high frequencies. For large values of r, the spectral energy density diminishes when r grows, but at the event horizon, the spectral energy density vanishes and therefore thermodynamic quantities near a black hole, calculated via the generalized uncertainty principle, do not require any cutoff parameter. We find that the total energy density can be expressed in terms of Hurwitz zeta functions. It should be noted that at large r (low local temperature), the difference between the total energy density and the Stefan-Boltzmann law is too small to be observed. However, as r approaches an event horizon, the effect of the generalized uncertainty principle becomes more and more important, which may be observable. As examples, the spectral energy densities in the background metric of a Schwarzschild black hole and of a Schwarzschild black hole plus quintessence are discussed. It is interesting to note that the maximum of the distribution shifts to higher frequencies when the quintessence equation of state parameter w decreases.
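
    The qualitative high-frequency damping can be mimicked in a few lines; the Gaussian suppression factor below is an illustrative stand-in for the paper's GUP-modified spectrum, not its exact form (natural units, local temperature set to unity):

      import numpy as np

      w = np.linspace(1e-3, 40.0, 4000)                # frequency grid, T = 1
      dw = w[1] - w[0]
      planck = w**3 / (np.pi**2 * (np.exp(w) - 1.0))   # blackbody spectral energy density

      lam = 0.05                                  # assumed strength of the damping
      damped = planck * np.exp(-lam * w**2)       # strong suppression at high frequency

      ratio = (damped.sum() * dw) / (planck.sum() * dw)
      print(f"total energy density: {ratio:.1%} of the Stefan-Boltzmann value")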

  6. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE PAGES

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
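
    A minimal Monte Carlo sketch of such UQ (illustrative numbers, not the case-study data or the ASTM procedure) propagates counting statistics and calibration uncertainty through an assumed linear enrichment meter relation between the 186-keV net peak rate and the U-235 enrichment:

      import numpy as np

      rng = np.random.default_rng(0)
      N = 100_000

      t = 600.0                                       # live time (s), assumed exact
      counts = rng.poisson(40_000, N).astype(float)   # 186-keV net peak counts
      rate = counts / t                               # counting statistics propagated

      # assumed calibration E = a*rate + b, with independent Gaussian uncertainties
      a = rng.normal(0.070, 0.001, N)                 # % enrichment per (count/s)
      b = rng.normal(0.000, 0.020, N)                 # % enrichment offset
      E = a * rate + b

      print(f"U-235 enrichment = {E.mean():.3f} % +/- {E.std(ddof=1):.3f} % (1 sigma)")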

  8. Generalized uncertainty principle in f(R) gravity for a charged black hole

    SciTech Connect

    Said, Jackson Levi; Adami, Kristian Zarb

    2011-02-15

    Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this, the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini black holes.

  9. Using the uncertainty principle to design simple interactions for targeted self-assembly.

    PubMed

    Edlund, E; Lindgren, O; Jacobi, M Nilsson

    2013-07-14

    We present a method that systematically simplifies isotropic interactions designed for targeted self-assembly. The uncertainty principle is used to show that an optimal simplification is achieved by a combination of heat kernel smoothing and Gaussian screening of the interaction potential in real and reciprocal space. We use this method to analytically design isotropic interactions for the self-assembly of complex lattices and of materials with functional properties. The derived interactions are simple enough to narrow the gap between theory and the experimental implementation of theoretically designed self-assembling materials.
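
    The simplification step admits a compact sketch; the "designed" potential below is a toy oscillatory function, and the smoothing and screening widths are assumptions, not the authors' optimized values.

      import numpy as np

      r = np.linspace(0.0, 20.0, 2048)
      dr = r[1] - r[0]
      V = np.cos(2.0 * r) * np.exp(-0.1 * r)          # toy "designed" pair potential

      # heat-kernel (Gaussian) smoothing in real space
      sigma_r = 0.5
      g = np.exp(-0.5 * (np.arange(-30, 31) * dr / sigma_r) ** 2)
      V_smooth = np.convolve(V, g / g.sum(), mode="same")

      # Gaussian screening in reciprocal space
      k = 2.0 * np.pi * np.fft.rfftfreq(r.size, dr)
      sigma_k = 4.0
      V_simple = np.fft.irfft(np.fft.rfft(V_smooth) * np.exp(-0.5 * (k / sigma_k) ** 2),
                              n=r.size)
      print(abs(V).max(), abs(V_simple).max())        # the simplified potential is milder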

  10. Modified uncertainty principle from the free expansion of a Bose-Einstein condensate

    NASA Astrophysics Data System (ADS)

    Castellanos, Elías; Escamilla-Rivera, Celia

    2017-01-01

    In this paper, we present a theoretical and numerical analysis of the free expansion of a Bose-Einstein condensate, where we assume that the single particle energy spectrum is deformed due to a possible quantum structure of spacetime. Also, we consider the presence of interparticle interactions in order to study more realistic and specific scenarios. The modified free velocity expansion of the condensate leads in a natural way to a modification of the uncertainty principle, which allows us to investigate some possible features of the Planck scale regime in low-energy earth-based experiments.

  11. Before and beyond the precautionary principle: epistemology of uncertainty in science and law.

    PubMed

    Tallacchini, Mariachiara

    2005-09-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that "[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation". By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

  13. Covariant energy–momentum and an uncertainty principle for general relativity

    SciTech Connect

    Cooperstock, F.I.; Dupre, M.J.

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum.
    Highlights:
    • We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity.
    • Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum.
    • Localized energy via the Ricci integral is consistent with the energy localization hypothesis.
    • New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving.
    • Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong gravity extreme.

  14. Massive vector particles tunneling from black holes influenced by the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Li, Xiang-Qian

    2016-12-01

    This study considers the generalized uncertainty principle, which incorporates the central idea of large extra dimensions, to investigate the processes involved when massive spin-1 particles tunnel from Reissner-Nordström and Kerr black holes under the effects of quantum gravity. For the black hole, the quantum gravity correction decelerates the increase in temperature. Up to O(1/M_f^2), the corrected temperatures are affected by the mass and angular momentum of the emitted vector bosons. In addition, the temperature of the Kerr black hole becomes uneven due to rotation. When the mass of the black hole approaches the order of the higher-dimensional Planck mass M_f, it stops radiating and yields a black hole remnant.

  15. Galilean and Lorentz Transformations in a Space with Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Tkachuk, V. M.

    2016-12-01

    We consider a space with a Generalized Uncertainty Principle (GUP), which can be obtained in the frame of deformed commutation relations. In the space with GUP we have found transformations relating the coordinates and times of moving and rest frames of reference to first order in the parameter of deformation. In the non-relativistic case we find the deformed Galilean transformation, which is a rotation in Euclidean space-time. This transformation is similar to the Lorentz one, but written for Euclidean space-time, where the speed of light is replaced by some velocity related to the parameter of deformation. We show that for a relativistic particle in the space with GUP, the coordinates of the rest and moving frames of reference satisfy the Lorentz transformation with some effective speed of light.

  16. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  17. A Pseudo-Quantum Triad: Schrödinger's Equation, the Uncertainty Principle, and the Heisenberg Group

    NASA Astrophysics Data System (ADS)

    de Gosson, Maurice A.

    2012-05-01

    We show that the paradigmatic quantum triad "Schrödinger equation-Uncertainty principle-Heisenberg group" emerges mathematically from classical mechanics. In the case of the Schrödinger equation, this is done by extending the metaplectic representation of linear Hamiltonian flows to arbitrary flows; for the Heisenberg group this follows from a careful analysis of the notion of phase of a Lagrangian manifold, and for the uncertainty principle it suffices to use tools from multivariate statistics together with the theory of John's minimum volume ellipsoid. Thus, the mathematical structure needed to make quantum mechanics emerge already exists in classical mechanics.

  18. Principle and Uncertainty Quantification of an Experiment Designed to Infer Actinide Neutron Capture Cross-Sections

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatorre; G. Imel; R. Pardo; F. Kondev; M. Paul

    2010-01-01

    An integral reactor physics experiment designed to infer higher actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aiming at quantifying the uncertainties related to the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amount of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross-sections to be inferred, since the relation between the two is given by the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectrometry (AMS) facility located at ANL. While AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes – even to A > 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectrometry techniques, more transmutation products can be measured and, potentially, more cross-sections can be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
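
    For a single capture reaction under a constant one-group flux, the transmutation bookkeeping behind the inference collapses to one line; the sketch below uses illustrative values, not MANTRA data.

      import numpy as np

      phi = 1.0e14 * 1.0e-24    # flux in n/(barn*s): 1e14 n/cm^2/s, 1 barn = 1e-24 cm^2
      t = 3.15e7                # irradiation time ~ 1 year (s), assumed

      N_initial = 1.00          # normalized parent atom density before irradiation
      N_final = 0.96            # density measured after irradiation (e.g. by AMS), assumed

      # depletion N(t) = N(0) * exp(-sigma * phi * t)  =>  solve for sigma
      sigma = -np.log(N_final / N_initial) / (phi * t)
      print(f"inferred one-group cross-section ~ {sigma:.1f} barn")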

  19. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    PubMed

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction.
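
    Gabor's tradeoff behind these measurements can be illustrated directly (the tone frequency, sampling rate, and durations are assumptions): shortening a pulse below a few cycles spreads its spectral energy, so the nominal frequency becomes progressively ill-defined.

      import numpy as np

      fs, f0 = 44_100, 1_000.0           # sample rate (Hz) and tone frequency, assumed
      for cycles in (8.0, 2.0, 0.5):
          n = max(int(fs * cycles / f0), 2)
          tone = np.sin(2 * np.pi * f0 * np.arange(n) / fs)
          spec = np.abs(np.fft.rfft(tone, 1 << 16)) ** 2   # zero-padded power spectrum
          freqs = np.fft.rfftfreq(1 << 16, 1 / fs)
          centroid = (freqs * spec).sum() / spec.sum()
          spread = np.sqrt(((freqs - centroid) ** 2 * spec).sum() / spec.sum())
          print(f"{cycles:4.1f} cycles: centroid {centroid:8.1f} Hz, spread {spread:8.1f} Hz")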

  20. Revisiting the Calculation of I/V Profiles in Molecular Junctions Using the Uncertainty Principle.

    PubMed

    Ramos-Berdullas, Nicolás; Mandado, Marcos

    2014-04-17

    Some years ago, Ortiz and Seminario (J. Chem. Phys. 2007, 127, 111106/1-3) proposed a simple and direct approach to obtain I/V profiles from the combination of ab initio equilibrium electronic structure calculations and the uncertainty principle, as an alternative or complement to more sophisticated nonequilibrium Green's function methods. In this work, we revisit the fundamentals of this approach and reformulate the expression for the electric current accordingly. In our revision, by analogy with the spontaneous decay process in electronic transitions, the current is calculated from the relaxation from the "polarized" state induced by the external electric field to the electronic ground state. The electric current is obtained from the total charge transferred through the molecule and the corresponding electronic energy relaxation. The electric current expression proposed is more general than the previous expression employed by Ortiz and Seminario, where the charge variation must be tested among different slabs of atoms at the contact. This new approach has been tested on benzene-1,4-dithiolate attached to different gold clusters that represent the contact with the electrodes. Analysis of the total electron deformation density induced by the external electric voltage, and of properties associated with the electron deformation orbitals, supports the conclusions obtained from the I/V profiles.

  1. The optimisation approach of ALARA in nuclear practice: an early application of the precautionary principle. Scientific uncertainty versus legal uncertainty.

    PubMed

    Lierman, S; Veuchelen, L

    2005-01-01

    The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other view thinks the effects are underestimated. Both views have good scientific arguments in their favour. Both the nuclear industry and nuclear medicine have had to deal with this controversy for many decades. One can argue that the optimisation approach of keeping effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because of these stochastic effects, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field, and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues will be dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law, and what lessons can be learned from the field of radiation protection? 2. What can ALARA contribute to the discussion of the Precautionary Principle, and vice versa, in particular as far as legal sanctions and liability are concerned? It will be shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law.

  2. A violation of the uncertainty principle implies a violation of the second law of thermodynamics.

    PubMed

    Hänggi, Esther; Wehner, Stephanie

    2013-01-01

    Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature.

  3. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy of a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle rather than the usual uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon, without any cutoff and without any constraint on the bulk's configuration. The system's density of states and free energy are convergent in the neighborhood of the horizons. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  4. Application of Risk Management and Uncertainty Concepts and Methods for Ecosystem Restoration: Principles and Best Practice

    DTIC Science & Technology

    2012-08-01

    The Louisiana Coastal Area (LCA) Conditionally Authorized Ecosystem Restoration projects are large-scale restoration projects seeking to restore… at how USACE planners addressed the risk of sea-level rise with respect to the White Ditch LCA restoration project, the ERDC-SAND2 model was used to… the various alternatives to this source of uncertainty. Risk and uncertainty were similarly evaluated for the other five LCA projects. In several of…

  5. Quantifying uncertainties in first-principles alloy thermodynamics using cluster expansions

    NASA Astrophysics Data System (ADS)

    Aldegunde, Manuel; Zabaras, Nicholas; Kristensen, Jesper

    2016-10-01

    The cluster expansion is a popular surrogate model for alloy modeling to avoid costly quantum mechanical simulations. As its practical implementations require approximations, its use trades efficiency for accuracy. Furthermore, the coefficients of the model need to be determined from some known data set (training set). These two sources of error, if not quantified, decrease the confidence we can put in the results obtained from the surrogate model. This paper presents a framework for the determination of the cluster expansion coefficients using a Bayesian approach, which allows for the quantification of uncertainties in the predictions. In particular, a relevance vector machine is used to automatically select the most relevant terms of the model while retaining an analytical expression for the predictive distribution. This methodology is applied to two binary alloys, SiGe and MgLi, including the temperature dependence in their effective cluster interactions. The resulting cluster expansions are used to calculate the uncertainty in several thermodynamic quantities: ground state line, including the uncertainty in which structures are thermodynamically stable at 0 K, phase diagrams and phase transitions. The uncertainty in the ground state line is found to be of the order of meV/atom, showing that the cluster expansion is reliable to ab initio level accuracy even with limited data. We found that the uncertainty in the predicted phase transition temperature increases when including the temperature dependence of the effective cluster interactions. Also, the use of the bond stiffness versus bond length approximation to calculate temperature dependent properties from a reduced set of alloy configurations showed similar uncertainty to the approach where all training configurations are considered but at a much reduced computational cost.
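
    The Bayesian step admits a compact sketch: with a Gaussian prior on the effective cluster interactions and Gaussian noise, the predictive mean and variance for a new configuration are analytic. The correlation matrix below is a random stand-in for real cluster correlation functions, not data from SiGe or MgLi.

      import numpy as np

      rng = np.random.default_rng(1)
      n_train, n_clusters = 40, 8
      X = rng.uniform(-1, 1, (n_train, n_clusters))   # cluster correlations (toy)
      w_true = rng.normal(0, 0.1, n_clusters)         # "true" ECIs (eV), toy
      y = X @ w_true + rng.normal(0, 0.005, n_train)  # training energies + noise

      alpha, noise = 1.0, 0.005                       # prior precision, noise level
      A = alpha * np.eye(n_clusters) + X.T @ X / noise**2   # posterior precision
      mean = np.linalg.solve(A, X.T @ y) / noise**2         # posterior mean of ECIs
      cov = np.linalg.inv(A)                                # posterior covariance

      x_new = rng.uniform(-1, 1, n_clusters)          # a new configuration
      pred = x_new @ mean
      var = noise**2 + x_new @ cov @ x_new            # predictive variance
      print(f"E_pred = {pred:.4f} eV +/- {np.sqrt(var):.4f} eV")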

  6. The precautionary principle and international conflict over domestic regulation: mitigating uncertainty and improving adaptive capacity.

    PubMed

    Oye, K A

    2005-01-01

    Disputes over invocation of precaution in the presence of uncertainty are building. This essay finds: (1) analysis of past WTO panel decisions and current EU-US regulatory conflicts suggests that appeals to scientific risk assessment will not resolve emerging conflicts; (2) Bayesian updating strategies, with commitments to modify policies as information emerges, may ameliorate conflicts over precaution in environmental and security affairs.

  7. Living with uncertainty: from the precautionary principle to the methodology of ongoing normative assessment

    NASA Astrophysics Data System (ADS)

    Dupuy, Jean-Pierre; Grinbaum, Alexei

    2005-03-01

    The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to the paralysis of action. What is needed is taking seriously the reality of the future. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (notion of moral luck).

  8. The effect of generalized uncertainty principle on square well, a case study

    SciTech Connect

    Ma, Meng-Sen; Zhao, Ren

    2014-08-15

    Starting from a special case (β = 0) of the generalized uncertainty relation, we derive the energy eigenvalues of the infinite potential well. The obtained energy levels differ from the usual result by correction terms, and these correction terms depend on no parameter other than α. The eigenstates, however, depend on two additional parameters besides α.
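
    For orientation, the uncorrected textbook levels that the correction terms modify, with the α-dependent part left schematic (the paper's exact coefficients are not reproduced here):

      % standard infinite square well of width L; GUP corrections enter at O(alpha)
      E_n^{(0)} = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad
      E_n = E_n^{(0)} + \mathcal{O}(\alpha)\ \text{correction terms}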

  9. The uncertainty principle and a semiclassical nonlinear differential equation formulation for bound states

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Bhattacharyya, K.

    2000-10-01

    The kinship of a simple variational scheme involving the uncertainty product with a prevalent semiclassical nonlinear differential equation approach for finding energies of stationary states is established. This leads to a transparent physical interpretation of the embedded parameters in the latter approach, providing additionally a lower bound to the integration constant. The domain of applicability of this strategy is also extended to encompass neighbouring states. Other advantages of the simpler alternative route are stressed. Pilot calculations demonstrate nicely the efficacy of the endeavour.

  10. Phase-space noncommutative extension of the Robertson-Schrödinger formulation of Ozawa's uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Dias, Nuno Costa; Prata, João Nuno

    2015-03-01

    We revisit Ozawa's uncertainty principle (OUP) in the framework of noncommutative (NC) quantum mechanics. We derive a matrix version of the OUP accommodating any NC structure in the phase space, and compute NC corrections to lowest order for two measurement interactions, namely the backaction evading quadrature amplifier and noiseless quadrature transducers. These NC corrections alter the nature of the measurement interaction, as a noiseless interaction may acquire noise, and an interaction of independent intervention may become dependent on the object system. However, the most striking result is that noncommutativity may lead to a violation of the OUP itself. The NC corrections for the backaction evading quadrature amplifier reveal a new term which may potentially be amplified in such a way that the violation of the OUP becomes experimentally testable. On the other hand, the NC corrections to the noiseless quadrature transducer show an incompatibility of this model with NC quantum mechanics. We discuss the implications of this incompatibility for NC quantum mechanics and for Ozawa's uncertainty principle.

  11. Some consequences of the generalized uncertainty principle induced ultraviolet wave-vector cutoff in one-dimensional quantum mechanics

    NASA Astrophysics Data System (ADS)

    Sailer, K.; Péli, Z.; Nagy, S.

    2013-04-01

    A projection method is proposed to treat the one-dimensional Schrödinger equation for a single particle when the generalized uncertainty principle (GUP) generates an ultraviolet (UV) wave-vector cutoff. The existence of a unique coordinate representation called the naive one is derived from the one-parameter family of discrete coordinate representations. In this bandlimited quantum mechanics a continuous potential is reconstructed from discrete sampled values observed by means of a particle in maximally localized states. It is shown that bandlimitation modifies the speed of the center and the spreading time of a Gaussian wave packet moving in free space. Indication is found that GUP accompanied by bandlimitation may cause departures of the low-lying energy levels of a particle in a box from those in ordinary quantum mechanics to be much less suppressed than commonly thought when GUP without bandlimitation is at work.
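
    For reference, the ordinary (non-bandlimited) baseline that these results modify is the textbook spreading of a free Gaussian packet of initial width Δx_0:

      \Delta x(t) = \Delta x_0 \sqrt{1 + \left(\frac{\hbar t}{2 m \Delta x_0^{2}}\right)^{2}}

    with characteristic spreading time t_s = 2 m \Delta x_0^{2}/\hbar; the abstract's claim is that bandlimitation alters both the packet's centre speed and this time scale.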

  12. Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators

    SciTech Connect

    Marchiolli, M.A.; Mendonça, P.E.M.F.

    2013-09-15

    We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. Highlights: • Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators. • Determination of new restrictions upon the selective process of signals and wavelet bases. • Demonstration of looser bounds interpolating between the tightest bound and the Massar–Spindel inequality. • Construction of finite ground states properly describing the tightest bound. • Establishment of an important connection with the discrete Weyl function.

  13. Niels Bohr's discussions with Albert Einstein, Werner Heisenberg, and Erwin Schroedinger: the origins of the principles of uncertainty and complementarity

    SciTech Connect

    Mehra, J.

    1987-05-01

    In this paper, the main outlines of the discussions of Niels Bohr with Albert Einstein, Werner Heisenberg, and Erwin Schroedinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory - formulated in fall 1926 by Dirac, London, and Jordan - Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such - formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930 - were continued during the next decades. All these aspects are briefly summarized.

  14. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    NASA Technical Reports Server (NTRS)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that delta E times delta t is approximately equal to Planck's constant/2 pi, where this duration delta t is an inner time, in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
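
    To put rough numbers on the collapse relation quoted above (delta E times delta t approximately Planck's constant/2 pi), the sketch below assumes an illustrative 1 nm filter at 800 nm; these are not the experiment's actual parameters.

      # Illustrative arithmetic only; the filter values are assumptions.
      from scipy.constants import hbar, h, c

      lam = 800e-9                    # filter centre wavelength (m), assumed
      dlam = 1e-9                     # filter bandwidth (m), assumed
      dE = h * c * dlam / lam**2      # energy width passed by the filter (J)
      dt = hbar / dE                  # implied duration of the partner's collapsed packet (s)
      print(f"delta_t ~ {dt:.2e} s")  # ~3e-13 s, i.e. a few hundred femtoseconds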

  15. Adding a strategic edge to human factors/ergonomics: principles for the management of uncertainty as cornerstones for system design.

    PubMed

    Grote, Gudela

    2014-01-01

    It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that, in order to change this situation, human factors/ergonomics-based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties it brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore.

  16. The special theory of Brownian relativity: equivalence principle for dynamic and static random paths and uncertainty relation for diffusion.

    PubMed

    Mezzasalma, Stefano A

    2007-03-15

    The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected.

  17. Niels Bohr's discussions with Albert Einstein, Werner Heisenberg, and Erwin Schrödinger: The origins of the principles of uncertainty and complementarity

    NASA Astrophysics Data System (ADS)

    Mehra, Jagdish

    1987-05-01

    In this paper, the main outlines of the discussions of Niels Bohr with Albert Einstein, Werner Heisenberg, and Erwin Schrödinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory—formulated in fall 1926 by Dirac, London, and Jordan—Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such—formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930—were continued during the next decades. All these aspects are briefly summarized.

  18. Uncertainty Analysis Principles and Methods

    DTIC Science & Technology

    2007-09-01

    122-07, September 2007 … L_n = Φ^(-1)((1 + P_n)/2)  (2-13), where P_n = 1 − 1/(2n). Any points that lie outside x̄ ± L_n·s_n are rejected. 2.2.7 … If the sensor is a transistor, the number of junctions is 2. … 3.1.35 Operator Bias. Error due to the perception or influence of a human operator or other …

  19. Pauli effects in uncertainty relations

    NASA Astrophysics Data System (ADS)

    Toranzo, I. V.; Sánchez-Moreno, P.; Esquivel, R. O.; Dehesa, J. S.

    2014-10-01

    In this Letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

  20. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and thorough understanding about expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems to be a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics and too narrow uncertainty ranges will judge little or no area as having similar climate, while too few indicators and too wide uncertainty ranges will flag overly large regions as climatically similar, which may not be correct. Similarity cannot be explored just by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, like maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved. For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague …
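
    A minimal sketch of the matching logic described above, assuming invented indicator names and tolerance half-widths (the real tool works on gridded regional climate model output):

      # Hypothetical indicators and tolerances, for illustration only.
      def is_climate_twin(future, candidate, tolerances):
          """True if every indicator of `candidate` lies within the
          uncertainty range around the `future` climate of interest."""
          return all(abs(future[k] - candidate[k]) <= tolerances[k] for k in tolerances)

      future_site = {"t_mean": 12.8, "t_max_summer": 31.5, "precip_annual": 620.0}
      current_sites = {
          "Bologna": {"t_mean": 13.1, "t_max_summer": 31.0, "precip_annual": 650.0},
          "Tromso":  {"t_mean": 3.0,  "t_max_summer": 16.0, "precip_annual": 1000.0},
      }
      tol = {"t_mean": 0.5, "t_max_summer": 1.0, "precip_annual": 50.0}

      twins = [name for name, clim in current_sites.items()
               if is_climate_twin(future_site, clim, tol)]
      print(twins)  # ['Bologna'] with these tolerances

    Widening the tolerances admits more (and less similar) twins; narrowing them may leave none, which is exactly the balance the abstract describes.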

  1. On the relativity and uncertainty of distance, time, and energy measurements by man. (1) Derivation of the Weber psychophysical law from the Heisenberg uncertainty principle applied to a superconductive biological detector. (2) The reverse derivation. (3) A human theory of relativity.

    PubMed

    Cope, F W

    1981-01-01

    The Weber psychophysical law, which describes much experimental data on perception by man, is derived from the Heisenberg uncertainty principle on the assumption that human perception occurs by energy detection by superconductive microregions within man. This suggests that psychophysical perception by man might be considered merely a special case of physical measurement in general. The reverse derivation, i.e., derivation of the Heisenberg principle from the Weber law, may be of even greater interest. It suggests that physical measurements could be regarded as relative to the perceptions by the detectors within man. Thus one may develop a "human" theory of relativity that could have the advantage of eliminating hidden assumptions by forcing physical theories to conform more completely to the measurements made by man rather than to concepts that might not accurately describe nature.

  2. Mutually Exclusive Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan

    2016-11-01

    The uncertainty principle is one of the characteristic properties of quantum theory, based on incompatibility. Apart from the incompatibility relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as to multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds.
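
    For reference, the Maccone-Pati stronger uncertainty relation that this work builds on can be written (in one common form) as

      \Delta A^{2} + \Delta B^{2} \ \ge\ \pm i\,\langle[A,B]\rangle
        + \bigl|\langle\psi|(A \pm iB)|\psi^{\perp}\rangle\bigr|^{2},

    where |ψ⊥⟩ is any state orthogonal to the system state |ψ⟩ and the sign is chosen so that the right-hand side is positive; the bound stays nontrivial even when ⟨[A,B]⟩ vanishes.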

  3. Mutually Exclusive Uncertainty Relations.

    PubMed

    Xiao, Yunlong; Jing, Naihuan

    2016-11-08

    The uncertainty principle is one of the characteristic properties of quantum theory, based on incompatibility. Apart from the incompatibility relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as to multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds.

  4. Mutually Exclusive Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan

    2016-01-01

    The uncertainty principle is one of the characteristic properties of quantum theory, based on incompatibility. Apart from the incompatibility relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as to multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds. PMID:27824161

  5. Reflections on uncertainty in risk assessment and risk management by the Society of Environmental Toxicology and Chemistry (SETAC) precautionary principle workgroup.

    PubMed

    Sanderson, H; Stahl, C H; Irwin, R; Rogers, M D

    2005-01-01

    Quantitative uncertainty assessments and the distribution of risk are under scrutiny, and significant criticism has been made of null hypothesis testing when careful consideration of Type I (false positive) and Type II (false negative) error rates has not been taken into account. An alternative method, equivalence testing, is discussed, yielding more transparency and potentially more precaution in quantifiable uncertainty assessments. With thousands of chemicals needing regulation in the near future and low public trust in the regulatory process, decision models with transparency and learning processes are required to manage this task. Adaptive, iterative, learning decision-making tools and processes can help decision makers evaluate the significance of Type I or Type II errors on decision alternatives and can reduce the risk of committing Type III errors (accurate answers to the wrong questions). Simplistic cost-benefit-based decision-making tools do not incorporate the complex interconnectedness characterizing environmental risks, nor do they enhance learning, participation, or include social values and ambiguity. Hence, better decision-making tools are required, and MIRA is an attempt to include some of these critical aspects.

  6. Two new kinds of uncertainty relations

    NASA Technical Reports Server (NTRS)

    Uffink, Jos

    1994-01-01

    We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.

  7. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  8. Uncertainty in QSAR predictions.

    PubMed

    Sahlin, Ullrika

    2013-03-01

    It is relevant to consider uncertainty in individual predictions when quantitative structure-activity (or property) relationships (QSARs) are used to support decisions of high societal concern. Successful communication of uncertainty in the integration of QSARs in chemical safety assessment under the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system can be facilitated by a common understanding of how to define, characterise, assess and evaluate uncertainty in QSAR predictions. A QSAR prediction is, compared to experimental estimates, subject to added uncertainty that comes from the use of a model instead of empirically-based estimates. A framework is provided to aid the distinction between different types of uncertainty in a QSAR prediction: quantitative, i.e. for regressions related to the error in a prediction and characterised by a predictive distribution; and qualitative, by expressing our confidence in the model for predicting a particular compound based on a quantitative measure of predictive reliability. It is possible to assess a quantitative (i.e. probabilistic) predictive distribution, given the supervised learning algorithm, the underlying QSAR data, a probability model for uncertainty and a statistical principle for inference. The integration of QSARs into risk assessment may be facilitated by the inclusion of the assessment of predictive error and predictive reliability into the "unambiguous algorithm", as outlined in the second OECD principle.
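
    As a hedged sketch of the quantitative side described above (a predictive distribution obtained from a supervised learner, training data and a probability model), the toy below uses a Gaussian process regressor; the descriptors and activities are random placeholders, and this is not REACH/OECD tooling.

      # Toy probabilistic QSAR-style prediction; all data are placeholders.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(1)
      X = rng.normal(size=(40, 5))                               # molecular descriptors
      y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.1, size=40)  # activities

      gp = GaussianProcessRegressor(alpha=0.1**2, normalize_y=True).fit(X, y)
      mean, std = gp.predict(rng.normal(size=(1, 5)), return_std=True)
      print(f"predicted activity: {mean[0]:.2f} +/- {std[0]:.2f}")
      # A large std flags a compound outside the model's reliable domain.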

  9. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.

  10. Laser triangulation: fundamental uncertainty in distance measurement.

    PubMed

    Dorsch, R G; Häusler, G; Herrmann, J M

    1994-03-01

    We discuss the uncertainty limit in distance sensing by laser triangulation. The uncertainty in distance measurement of laser triangulation sensors and other coherent sensors is limited by speckle noise. Speckle arises because of the coherent illumination in combination with rough surfaces. A minimum limit on the distance uncertainty is derived through speckle statistics. This uncertainty is a function of wavelength, observation aperture, and speckle contrast in the spot image. Surprisingly, it is the same distance uncertainty that we obtained from a single-photon experiment and from Heisenberg's uncertainty principle. Experiments confirm the theory. An uncertainty principle connecting lateral resolution and distance uncertainty is introduced. Design criteria for a sensor with minimum distance uncertainty are determined: small temporal coherence, small spatial coherence, a large observation aperture.
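
    The speckle limit described above is often quoted in the following form; treat the exact prefactor and angle conventions as an assumption rather than the paper's own statement:

      # Commonly quoted speckle-limited distance uncertainty for triangulation.
      import math

      def triangulation_uncertainty(C, lam, sin_u_obs, sin_theta):
          """C: speckle contrast (0..1); lam: wavelength (m);
          sin_u_obs: observation aperture; sin_theta: triangulation angle."""
          return C * lam / (2 * math.pi * sin_u_obs * sin_theta)

      # Example: full contrast, 670 nm laser, sin(u)=0.05, 30 degree triangulation
      dz = triangulation_uncertainty(1.0, 670e-9, 0.05, math.sin(math.radians(30)))
      print(f"dz ~ {dz:.1e} m")  # a few micrometres

    The design criteria in the abstract follow directly: lowering the contrast C (less coherence) or enlarging the observation aperture reduces dz.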

  11. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  12. Nab: Measurement principles, apparatus and uncertainties

    NASA Astrophysics Data System (ADS)

    Počanić, Dinko; Alarcon, R.; Alonzi, L. P.; Baeßler, S.; Balascuta, S.; Bowman, J. D.; Bychkov, M. A.; Byrne, J.; Calarco, J. R.; Cianciolo, V.; Crawford, C.; Frlež, E.; Gericke, M. T.; Greene, G. L.; Grzywacz, R. K.; Gudkov, V.; Hersman, F. W.; Klein, A.; Martin, J.; Page, S. A.; Palladino, A.; Penttilä, S. I.; Rykaczewski, K. P.; Wilburn, W. S.; Young, A. R.; Young, G. R.; Nab Collaboration

    2009-12-01

    The Nab collaboration will perform a precise measurement of a, the electron-neutrino correlation parameter, and b, the Fierz interference term in neutron beta decay, in the Fundamental Neutron Physics Beamline at the SNS, using a novel electric/magnetic field spectrometer and detector design. The experiment is aiming at the 10^-3 accuracy level in Δa/a, and will provide an independent measurement of λ = G_A/G_V, the ratio of axial-vector to vector coupling constants of the nucleon. Nab also plans to perform the first ever measurement of b in neutron decay, which will provide an independent limit on the tensor weak coupling.

  13. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.

  14. Entropic uncertainty relations and their applications

    NASA Astrophysics Data System (ADS)

    Coles, Patrick J.; Berta, Mario; Tomamichel, Marco; Wehner, Stephanie

    2017-01-01

    Heisenberg's uncertainty principle forms a fundamental element of quantum mechanics. Uncertainty relations in terms of entropies were initially proposed to deal with conceptual shortcomings in the original formulation of the uncertainty principle and, hence, play an important role in quantum foundations. More recently, entropic uncertainty relations have emerged as the central ingredient in the security analysis of almost all quantum cryptographic protocols, such as quantum key distribution and two-party quantum cryptography. This review surveys entropic uncertainty relations that capture Heisenberg's idea that the results of incompatible measurements are impossible to predict, covering both finite- and infinite-dimensional measurements. These ideas are then extended to incorporate quantum correlations between the observed object and its environment, allowing for a variety of recent, more general formulations of the uncertainty principle. Finally, various applications are discussed, ranging from entanglement witnessing to wave-particle duality to quantum cryptography.

  15. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-03

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropic form, related to entropic quantities. Both forms are inequalities involving pairwise observables, and are nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.

  16. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  17. Majorization formulation of uncertainty in quantum mechanics

    SciTech Connect

    Partovi, M. Hossein

    2011-11-15

    Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.

  18. Generalized Entropic Uncertainty Relations with Tsallis' Entropy

    NASA Technical Reports Server (NTRS)

    Portesi, M.; Plastino, A.

    1996-01-01

    A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.

  19. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.

  20. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  1. Clarifying types of uncertainty: when are models accurate, and uncertainties small?

    PubMed

    Cox, Louis Anthony Tony

    2011-10-01

    Professor Aven has recently noted the importance of clarifying the meaning of terms such as "scientific uncertainty" for use in risk management and policy decisions, such as when to trigger application of the precautionary principle. This comment examines some fundamental conceptual challenges for efforts to define "accurate" models and "small" input uncertainties by showing that increasing uncertainty in model inputs may reduce uncertainty in model outputs; that even correct models with "small" input uncertainties need not yield accurate or useful predictions for quantities of interest in risk management (such as the duration of an epidemic); and that accurate predictive models need not be accurate causal models.

  2. Uncertainty in the Classroom--Teaching Quantum Physics

    ERIC Educational Resources Information Center

    Johansson, K. E.; Milstead, D.

    2008-01-01

    The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how…

  3. Entropic uncertainty relations in multidimensional position and momentum spaces

    SciTech Connect

    Huang Yichen

    2011-05-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

  4. Calculating Measurement Uncertainties for Mass Spectrometry Data

    NASA Astrophysics Data System (ADS)

    Essex, R. M.; Goldberg, S. A.

    2006-12-01

    A complete and transparent characterization of measurement uncertainty is fundamentally important to the interpretation of analytical results. We have observed that the calculation and reporting of uncertainty estimates for isotopic measurements from a variety of analytical facilities are inconsistent, making it difficult to compare and evaluate data. Therefore, we recommend an approach to uncertainty estimation that has been adopted by US national metrology facilities and is becoming widely accepted within the analytical community. This approach is outlined in the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM). The GUM approach to uncertainty estimation includes four major steps: 1) Specify the measurand; 2) Identify uncertainty sources; 3) Quantify components by determining the standard uncertainty (u) for each component; and 4) Calculate the combined standard uncertainty (u_c) by using established propagation laws to combine the various components. To obtain a desired confidence level, the combined standard uncertainty is multiplied by a coverage factor (k) to yield an expanded uncertainty (U). To be consistent with the GUM principles, it is also necessary to create an uncertainty budget, which is a listing of all the components comprising the uncertainty and their relative contribution to the combined standard uncertainty. In mass spectrometry, Step 1 is normally the determination of an isotopic ratio for a particular element. Step 2 requires the identification of the many potential sources of measurement variability and bias, including: gain, baseline, cup efficiency, Schottky noise, counting statistics, CRM uncertainties, yield calibrations, linearity calibrations, run conditions, and filament geometry. Then an equation expressing the relationship of all of the components to the measurement value must be written. To complete Step 3, these potential sources of uncertainty must be characterized (Type A or Type B) and quantified. This information
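
    A minimal numeric sketch of steps 3 and 4 and of the uncertainty budget, with made-up component names, standard uncertainties and sensitivities for an isotope-ratio measurand:

      # Illustrative GUM-style combination; all component values are assumptions.
      import math

      components = {                    # name: (standard uncertainty u_i, sensitivity c_i)
          "counting statistics": (2.0e-5, 1.0),
          "gain calibration":    (1.5e-5, 1.0),
          "CRM value":           (1.0e-5, 1.0),
      }
      u_c = math.sqrt(sum((c * u) ** 2 for u, c in components.values()))
      k = 2                             # coverage factor, ~95 % for a normal distribution
      print(f"u_c = {u_c:.2e}, U = k*u_c = {k * u_c:.2e}")

      # Uncertainty budget: relative contribution of each component
      for name, (u, c) in components.items():
          print(f"{name:22s} {100 * (c * u) ** 2 / u_c ** 2:5.1f} %")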

  5. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2015-11-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from [Berta et al., Nat. Phys. 6, 659 (2010)] is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the "uncertainty witness" lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from [Coles et al., Phys. Rev. Lett. 108, 210405 (2012)] makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM Quantum Experience and find reasonable agreement between our predictions and experimental outcomes.

  6. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.

  7. Correction of harmonic motion and Kepler orbit based on the minimal momentum uncertainty

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang; Hassanabadi, Hassan

    2017-03-01

    In this paper we consider the deformed Heisenberg uncertainty principle with a minimal uncertainty in momentum, which is called the minimal momentum uncertainty principle (MMUP). We consider the MMUP in D dimensions and its classical analogue. Using these, we investigate the MMUP effect for harmonic motion and the Kepler orbit.

  8. Angular performance measure for tighter uncertainty relations

    SciTech Connect

    Hradil, Z.; Rehacek, J.; Klimov, A. B.; Rigas, I.; Sanchez-Soto, L. L.

    2010-01-15

    The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance measure that allows for tighter uncertainty relations for angle and angular momentum. The differences with previous bounds can be significant for particular states and indeed may be amenable to experimental measurement with present technology.

  9. The uncertainties in estimating measurement uncertainties

    SciTech Connect

    Clark, J.P.; Shull, A.H.

    1994-07-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties.

  10. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes, by G. Box and M. Gunzburger respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  11. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so that they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

  12. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  13. Entanglement and discord assisted entropic uncertainty relations under decoherence

    NASA Astrophysics Data System (ADS)

    Yao, ChunMei; Chen, ZhiHua; Ma, ZhiHao; Severini, Simone; Serafini, Alessio

    2014-09-01

    The uncertainty principle is a crucial aspect of quantum mechanics. It has been shown that quantum entanglement, as well as more general notions of correlations such as quantum discord, can relax or tighten the entropic uncertainty relation in the presence of an ancillary system. We explore the behaviour of entropic uncertainty relations for a system of two qubits, one of which is subject to several forms of independent quantum noise, in both Markovian and non-Markovian regimes. The uncertainties and their lower bounds, identified by the entropic uncertainty relations, increase under independent local unital Markovian noisy channels, but they may decrease under non-unital channels. The behaviour of the uncertainties (and lower bounds) exhibits periodic oscillations due to correlation dynamics under independent non-Markovian reservoirs. In addition, we compare different entropic uncertainty relations in several special cases and find that discord-tightened entropic uncertainty relations offer in general a better estimate of the uncertainties in play.

  14. The physical origins of the uncertainty theorem

    NASA Astrophysics Data System (ADS)

    Giese, Albrecht

    2013-10-01

    The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

  15. Articulating and responding to uncertainties in clinical research.

    PubMed

    Djulbegovic, Benjamin

    2007-01-01

    This paper introduces a taxonomy of clinical uncertainties and argues that the choice of scientific method should match the underlying level of uncertainty. The clinical trial is one such method, aiming to resolve clinical uncertainties. Whenever possible these uncertainties should be quantified. The paper further shows that the still-ongoing debate about the usage of "equipoise" vs. "uncertainty principle" vs. "indifference" as an entry criterion to clinical trials actually refers to the question "whose uncertainty counts". This question is intimately linked to the control of the research agenda, which is not quantifiable and hence is not solvable to the equal acceptability of all interested parties. The author finally shows that there is a predictable relation between [acknowledgement of] uncertainty (the moral principle) on which trials are based and the ultimate outcomes of clinical trials. That is, [acknowledgement of] uncertainty determines a pattern of success in medicine and drives clinical discoveries.

  16. Uncertainty under quantum measures and quantum memory

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing

    2017-04-01

    The uncertainty principle restricts potential information one gains about physical properties of the measured particle. However, if the particle is prepared in entanglement with a quantum memory, the corresponding entropic uncertainty relation will vary. Based on the knowledge of correlations between the measured particle and quantum memory, we have investigated the entropic uncertainty relations for two and multiple measurements and generalized the lower bounds on the sum of Shannon entropies without quantum side information to those that allow quantum memory. In particular, we have obtained generalization of Kaniewski-Tomamichel-Wehner's bound for effective measures and majorization bounds for noneffective measures to allow quantum side information. Furthermore, we have derived several strong bounds for the entropic uncertainty relations in the presence of quantum memory for two and multiple measurements. Finally, potential applications of our results to entanglement witnesses are discussed via the entropic uncertainty relation in the absence of quantum memory.

  17. Uncertainty for Part Density Determination: An Update

    SciTech Connect

    Valdez, Mario Orlando

    2016-12-14

    Accurate and precise density measurement by hydrostatic weighing requires the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these fluid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided, the original example is revised with the updated derivations, and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
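
    A hedged sketch of the sensitivity-analysis idea, using the simplified Archimedes relation with air buoyancy neglected (the report itself treats the air density more carefully); all numbers are illustrative:

      # rho = W_air * rho_water / (W_air - W_water); air buoyancy neglected.
      import math

      W_a, u_Wa = 50.000, 0.001        # weight in air (g), standard uncertainty
      W_w, u_Ww = 44.400, 0.001        # weight in water (g)
      rho_w, u_rw = 0.99820, 0.00002   # water density (g/cm^3) at bath temperature

      d = W_a - W_w
      rho = W_a * rho_w / d

      # Sensitivity coefficients (partial derivatives of rho):
      c_Wa = -rho_w * W_w / d**2
      c_Ww = W_a * rho_w / d**2
      c_rw = W_a / d

      u_rho = math.sqrt((c_Wa * u_Wa)**2 + (c_Ww * u_Ww)**2 + (c_rw * u_rw)**2)
      print(f"rho = {rho:.4f} +/- {u_rho:.4f} g/cm^3")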

  18. Uncertainty relations for general unitary operators

    NASA Astrophysics Data System (ADS)

    Bagchi, Shrobona; Pati, Arun Kumar

    2016-10-01

    We derive several uncertainty relations for two arbitrary unitary operators acting on physical states of a Hilbert space. We show that our bounds are tighter in various cases than the ones existing in the current literature. Using the uncertainty relation for the unitary operators, we obtain the tight state-independent lower bound for the uncertainty of two Pauli observables and anticommuting observables in higher dimensions. With regard to the minimum-uncertainty states, we derive the minimum-uncertainty state equation by the analytic method and relate this to the ground-state problem of the Harper Hamiltonian. Furthermore, the higher-dimensional limit of the uncertainty relations and minimum-uncertainty states are explored. From an operational point of view, we show that the uncertainty in the unitary operator is directly related to the visibility of quantum interference in an interferometer where one arm of the interferometer is affected by a unitary operator. This shows a principle of preparation uncertainty, i.e., for any quantum system, the amount of visibility for two general noncommuting unitary operators is nontrivially upper bounded.

  19. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  20. Uncertainty relation for mutual information

    NASA Astrophysics Data System (ADS)

    Schneeloch, James; Broadbent, Curtis J.; Howell, John C.

    2014-12-01

    We postulate the existence of a universal uncertainty relation between the quantum and classical mutual informations between pairs of quantum systems. Specifically, we propose that the sum of the classical mutual information, determined by two mutually unbiased pairs of observables, never exceeds the quantum mutual information. We call this the complementary-quantum correlation (CQC) relation and prove its validity for pure states, for states with one maximally mixed subsystem, and for all states when one measurement is minimally disturbing. We provide results of a Monte Carlo simulation suggesting that the CQC relation is generally valid. Importantly, we also show that the CQC relation represents an improvement to an entropic uncertainty principle in the presence of a quantum memory, and that it can be used to verify an achievable secret key rate in the quantum one-time pad cryptographic protocol.
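
    The pure-state case of the proposed CQC relation is easy to probe numerically: draw random two-qubit pure states, compute the classical mutual informations for the mutually unbiased pairs Z(x)Z and X(x)X, and compare their sum with the quantum mutual information, which for a pure state equals twice the entanglement entropy. The basis choice and sampling below are illustrative, not the authors' simulation:

        import numpy as np

        rng = np.random.default_rng(0)

        def shannon(p):
            p = p[p > 1e-12]
            return -(p * np.log2(p)).sum()

        def classical_mi(pj):
            # I(A:B) = H(A) + H(B) - H(A,B) for a joint distribution pj[a, b]
            return shannon(pj.sum(1)) + shannon(pj.sum(0)) - shannon(pj.ravel())

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Z -> X basis change

        violations = 0
        for _ in range(2000):
            v = rng.normal(size=4) + 1j * rng.normal(size=4)
            M = (v / np.linalg.norm(v)).reshape(2, 2)  # amplitudes M[a, b]
            lam = np.linalg.svd(M, compute_uv=False) ** 2
            iq = 2 * shannon(lam)                      # quantum MI of a pure state
            pz = np.abs(M) ** 2                        # joint outcome probs, Z (x) Z
            px = np.abs(H @ M @ H) ** 2                # joint outcome probs, X (x) X
            if classical_mi(pz) + classical_mi(px) > iq + 1e-9:
                violations += 1

        print("violations of I_Z + I_X <= I_quantum in 2000 pure states:", violations)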

  1. Aspects of complementarity and uncertainty

    NASA Astrophysics Data System (ADS)

    Vathsan, Radhika; Qureshi, Tabish

    2016-08-01

    The two-slit experiment with quantum particles provides many insights into the behavior of quantum mechanics, including Bohr’s complementarity principle. Here, we analyze Einstein’s recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert-Greenberger-Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.
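
    For equal path amplitudes and pure detector states |d1>, |d2>, the fringe visibility is V = |<d1|d2>| and the optimal which-way distinguishability is D = sqrt(1 - |<d1|d2>|^2), so the Englert-Greenberger-Yasin bound D^2 + V^2 <= 1 is saturated. A quick numerical illustration under those simplifying assumptions (not the authors' derivation):

        import numpy as np

        rng = np.random.default_rng(2)

        def rand_state():
            v = rng.normal(size=2) + 1j * rng.normal(size=2)
            return v / np.linalg.norm(v)

        d1, d2 = rand_state(), rand_state()   # pure which-way detector states
        V = abs(np.vdot(d1, d2))              # fringe visibility
        D = np.sqrt(1 - V**2)                 # optimal path distinguishability

        print(f"D^2 + V^2 = {D**2 + V**2:.12f}")   # = 1: the EGY bound is saturated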

  2. Role of information theoretic uncertainty relations in quantum theory

    SciTech Connect

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
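
    A concrete member of this family is the Maassen-Uffink-type Renyi relation H_alpha(A) + H_beta(B) >= -2 log2 c for conjugate indices 1/alpha + 1/beta = 2, where c is the largest overlap between the two measurement bases (c = 1/sqrt(2) for qubit Z and X, giving a bound of 1). The sketch below checks this on a random qubit state; it illustrates the entropy class, not the paper's two-energy-level model:

        import numpy as np

        def renyi(p, alpha):
            # Renyi entropy H_alpha(p) = log2(sum p_i^alpha) / (1 - alpha)
            return np.log2((p ** alpha).sum()) / (1.0 - alpha)

        rng = np.random.default_rng(3)
        v = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi = v / np.linalg.norm(v)

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        pZ = np.abs(psi) ** 2          # Z-basis outcome probabilities
        pX = np.abs(H @ psi) ** 2      # X-basis outcome probabilities

        alpha = 2.0
        beta = alpha / (2 * alpha - 1)           # conjugate index: 1/a + 1/b = 2
        lhs = renyi(pZ, alpha) + renyi(pX, beta)
        print(f"H_2(Z) + H_2/3(X) = {lhs:.4f}  (bound: -2*log2(1/sqrt(2)) = 1)")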

  3. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m^-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to uncertainty arising from some individual properties.
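
    The combination step described above, in which each contribution is the product of a sensitivity and a measurement uncertainty and the contributions are summed in quadrature, can be sketched as follows (every sensitivity and uncertainty value below is a hypothetical placeholder, not a number from the study):

        import math

        # Hypothetical sensitivities dDRF/dx (W m^-2 per unit of each property)
        # and measurement uncertainties u(x); placeholders, not study values.
        contributions = {
            "aerosol optical depth":    (-25.0, 0.02),
            "single scattering albedo": ( 20.0, 0.03),
            "asymmetry parameter":      ( 10.0, 0.02),
            "surface albedo":           ( 15.0, 0.01),
        }

        u_terms = {name: abs(s) * u for name, (s, u) in contributions.items()}
        total = math.sqrt(sum(v ** 2 for v in u_terms.values()))

        for name, v in sorted(u_terms.items(), key=lambda kv: -kv[1]):
            print(f"{name:26s} u = {v:.3f} W m^-2")
        print(f"combined standard uncertainty u(DRF) = {total:.3f} W m^-2")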

  4. Principles of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Landé, Alfred

    2013-10-01

    Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ (x) and σ (p); 11. Complementarity; 12. Mathematical relation between ρ (x) and σ (p) for free particles; 13. General relation between ρ (q) and σ (p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ (t) and σ (є); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψp (q) and Xq (p); 39. Differential equation for фβ (q); 40. The general probability amplitude Φβ' (Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schr

  5. Optimising uncertainty in physical sample preparation.

    PubMed

    Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger

    2005-11-01

    Uncertainty associated with the result of a measurement can be dominated by the physical sample preparation stage of the measurement process. In view of this, the Optimised Uncertainty (OU) methodology has been further developed to allow the optimisation of the uncertainty from this source, in addition to that from the primary sampling and the subsequent chemical analysis. This new methodology for the optimisation of physical sample preparation uncertainty (u(prep), estimated as s(prep)) is applied for the first time to a case study of myclobutanil in retail strawberries. An increase in expenditure (+7865%) on the preparatory process was advised in order to reduce s(prep) by the recommended 69%. This reduction is desirable given the predicted overall saving, under optimised conditions, of 33,000 pounds sterling per batch. This new methodology has been shown to provide guidance on the appropriate distribution of resources between the three principal stages of a measurement process, including physical sample preparation.

  6. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing body of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  7. Parametric uncertainty in nanoscale optical dimensional measurements.

    PubMed

    Potzick, James; Marx, Egon

    2012-06-10

    Image modeling establishes the relation between an object and its image when an optical microscope is used to measure the dimensions of an object of size comparable to the illumination wavelength. It accounts for the influence of all of the parameters that can affect the image and relates the apparent feature width (FW) in the image to the true FW of the object. The values of these parameters, however, have uncertainties, and these uncertainties propagate through the model and lead to parametric uncertainty in the FW measurement, a key component of the combined measurement uncertainty. The combined uncertainty is required in order to decide if the result is adequate for its intended purpose and to ascertain if it is consistent with other results. The parametric uncertainty for optical photomask measurements derived using an edge intensity threshold approach has been described previously; this paper describes an image library approach to this issue and shows results for optical photomask metrology over a FW range of 10 nm to 8 µm using light of wavelength 365 nm. The principles will be described; a one-dimensional image library will be used; the method of comparing images, along with a simple interpolation method, will be explained; and results will be presented. This method is easily extended to any kind of imaging microscope and to more dimensions in parameter space. It is more general than the edge threshold method and leads to markedly different uncertainties for features smaller than the wavelength.

  8. Quantum preparation uncertainty and lack of information

    NASA Astrophysics Data System (ADS)

    Rozpędek, Filip; Kaniewski, Jędrzej; Coles, Patrick J.; Wehner, Stephanie

    2017-02-01

    The quantum uncertainty principle famously predicts that there exist measurements that are inherently incompatible, in the sense that their outcomes cannot be predicted simultaneously. In contrast, no such uncertainty exists in the classical domain, where all uncertainty results from ignorance about the exact state of the physical system. Here, we critically examine the concept of preparation uncertainty and ask whether similarly in the quantum regime, some of the uncertainty that we observe can actually also be understood as a lack of information (LOI), albeit a lack of quantum information. We answer this question affirmatively by showing that for the well known measurements employed in BB84 quantum key distribution (Bennett and Brassard 1984 Int. Conf. on Computer System and Signal Processing), the amount of uncertainty can indeed be related to the amount of available information about additional registers determining the choice of the measurement. We proceed to show that also for other measurements the amount of uncertainty is in part connected to a LOI. Finally, we discuss the conceptual implications of our observation to the security of cryptographic protocols that make use of BB84 states.

  9. Radar principles

    NASA Technical Reports Server (NTRS)

    Sato, Toru

    1989-01-01

    Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.
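
    For a single point target the radar equation takes the familiar monostatic form P_r = P_t G^2 lambda^2 sigma / ((4 pi)^3 R^4); atmospheric radars use modified forms for distributed targets, but the point-target version already shows the transmitted-to-received power relation. A sketch with hypothetical numbers:

        import math

        def received_power(p_t, gain, wavelength, sigma, r):
            # Monostatic point-target radar equation:
            #   P_r = P_t G^2 lambda^2 sigma / ((4 pi)^3 R^4)
            return p_t * gain**2 * wavelength**2 * sigma / ((4 * math.pi)**3 * r**4)

        # Hypothetical VHF atmospheric-radar-like numbers, for illustration only:
        p_r = received_power(p_t=1.0e6,       # transmitted power [W]
                             gain=1.0e4,      # antenna gain (40 dB, linear)
                             wavelength=6.4,  # [m], roughly a 46.5 MHz carrier
                             sigma=1.0,       # radar cross section [m^2]
                             r=1.0e4)         # range [m]
        print(f"received power: {p_r:.3e} W")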

  10. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
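
    A stripped-down version of the proposed Monte Carlo approach: perturb the flow series with a plausible observational error model, recompute the signature for each realization, and read an uncertainty interval off the resulting distribution. The synthetic flow data and the 10% multiplicative error model below are illustrative assumptions, not the paper's setup:

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic daily flow series standing in for observed data [m^3/s]
        q_obs = np.exp(rng.normal(1.0, 0.8, 365))

        def signature(q):
            # Example signature: high-flow/low-flow ratio Q5/Q95
            # (Q5 = flow exceeded 5% of the time = 95th percentile of values)
            return np.percentile(q, 95) / np.percentile(q, 5)

        # Monte Carlo: multiplicative observation errors, 10% standard deviation
        samples = [signature(q_obs * rng.normal(1.0, 0.10, q_obs.size))
                   for _ in range(5000)]
        lo, hi = np.percentile(samples, [2.5, 97.5])
        print(f"Q5/Q95 = {signature(q_obs):.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")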

  11. [Ethics, empiricism and uncertainty].

    PubMed

    Porz, R; Zimmermann, H; Exadaktylos, A K

    2011-01-01

    Accidents can lead to difficult boundary situations. Such situations often take place in emergency units. The medical team thus often and inevitably faces professional uncertainty in their decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in treatment or withdrawal of treatment. It does not need to be covered in evidence-based arguments, especially as some singular situations of individual tragedies cannot be grasped in terms of evidence-based medicine.

  12. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  13. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
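
    Because PPD depends only on PMV, the second propagation stage is easy to sketch with Monte Carlo using the ISO 7730 relation PPD = 100 - 95 exp(-0.03353 PMV^4 - 0.2179 PMV^2). The assumed PMV value and standard uncertainty below are placeholders; in practice they would come from propagating the measured quantities through the PMV formula itself:

        import numpy as np

        def ppd(pmv):
            # ISO 7730: predicted percentage dissatisfied as a function of PMV
            return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

        rng = np.random.default_rng(5)
        pmv_samples = rng.normal(0.5, 0.1, 100_000)  # hypothetical PMV = 0.5 +/- 0.1

        ppd_samples = ppd(pmv_samples)
        print(f"PPD = {ppd_samples.mean():.2f} %  u(PPD) = {ppd_samples.std(ddof=1):.2f} %")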

  14. Intolerance of Uncertainty

    PubMed Central

    Beier, Meghan L.

    2015-01-01

    Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700

  15. Economic uncertainty and econophysics

    NASA Astrophysics Data System (ADS)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework, in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  16. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  17. Generalized uncertainty principle and self-adjoint operators

    SciTech Connect

    Balasubramanian, Venkat; Das, Saurya; Vagenas, Elias C.

    2015-09-15

    In this work we explore the self-adjointness of the GUP-modified momentum and Hamiltonian operators over different domains. In particular, we utilize von Neumann's theorem for symmetric operators in order to determine whether the momentum and Hamiltonian operators are self-adjoint or not, or whether they have self-adjoint extensions over the given domain. In addition, a simple example of the Hamiltonian operator describing a particle in a box is given. The solutions of the boundary conditions that describe the self-adjoint extensions of the specific Hamiltonian operator are obtained.

  18. Phase-space noncommutative formulation of Ozawa's uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

    2014-08-01

    Ozawa's measurement-disturbance relation is generalized to a phase-space noncommutative extension of quantum mechanics. It is shown that the measurement-disturbance relations have additional terms for backaction evading quadrature amplifiers and for noiseless quadrature transducers. Several distinctive features appear as a consequence of the noncommutative extension: measurement interactions which are noiseless, and observables which are undisturbed by a measurement, or of independent intervention in ordinary quantum mechanics, may acquire noise, become disturbed by the measurement, or no longer be an independent intervention in noncommutative quantum mechanics. It is also found that there can be states which violate Ozawa's universal noise-disturbance trade-off relation, but verify its noncommutative deformation.

  19. Principles and Algorithms for Causal Reasoning with Uncertainty

    DTIC Science & Technology

    1989-05-01

    example, if it is reasonable to assume that a light bulb will be working from one day to the next, and this assumption is treated as certain, then the reasoner can decide that it is also certain that the light bulb will be working for a year, or a decade. Host: Don't need many light bulbs that way... about invincible light bulbs. Host: So what do you do in the thesis? Robbie: I start a lot of cars, or at least I try to. See, the qualification

  20. The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment

    ERIC Educational Resources Information Center

    Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea

    2010-01-01

    An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…

  1. Time crystals from minimum time uncertainty

    NASA Astrophysics Data System (ADS)

    Faizal, Mir; Khalil, Mohammed M.; Das, Saurya

    2016-01-01

    Motivated by the Generalized Uncertainty Principle, covariance, and a minimum measurable time, we propose a deformation of the Heisenberg algebra and show that this leads to corrections to all quantum mechanical systems. We also demonstrate that such a deformation implies a discrete spectrum for time. In other words, time behaves like a crystal. As an application of our formalism, we analyze the effect of such a deformation on the rate of spontaneous emission in a hydrogen atom.

  2. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.

  3. Fragility, uncertainty, and healthcare.

    PubMed

    Rogers, Wendy A; Walker, Mary J

    2016-02-01

    Medicine seeks to overcome one of the most fundamental fragilities of being human, the fragility of good health. No matter how robust our current state of health, we are inevitably susceptible to future illness and disease, while current disease serves to remind us of various frailties inherent in the human condition. This article examines the relationship between fragility and uncertainty with regard to health, and argues that there are reasons to accept rather than deny at least some forms of uncertainty. In situations of current ill health, both patients and doctors seek to manage this fragility through diagnoses that explain suffering and provide some certainty about prognosis as well as treatment. However, both diagnosis and prognosis are inevitably uncertain to some degree, leading to questions about how much uncertainty health professionals should disclose, and how to manage when diagnosis is elusive, leaving patients in uncertainty. We argue that patients can benefit when they are able to acknowledge, and appropriately accept, some uncertainty. Healthy people may seek to protect the fragility of their good health by undertaking preventative measures including various tests and screenings. However, these attempts to secure oneself against the onset of biological fragility can cause harm by creating rather than eliminating uncertainty. Finally, we argue that there are good reasons for accepting the fragility of health, along with the associated uncertainties.

  4. Optimal Universal Uncertainty Relations

    PubMed Central

    Li, Tao; Xiao, Yunlong; Ma, Teng; Fei, Shao-Ming; Jing, Naihuan; Li-Jost, Xianqing; Wang, Zhi-Xi

    2016-01-01

    We study universal uncertainty relations and present a method called the joint probability distribution diagram to improve the majorization bounds constructed independently in [Phys. Rev. Lett. 111, 230401 (2013)] and [J. Phys. A 46, 272002 (2013)]. The results give rise to state-independent uncertainty relations satisfied by any nonnegative Schur-concave function. On the other hand, a remarkable recent result on entropic uncertainty relations is the direct-sum majorization relation. In this paper, we illustrate our bounds by showing how they provide a complement to that in [Phys. Rev. A 89, 052115 (2014)]. PMID:27775010

  5. Communicating scientific uncertainty.

    PubMed

    Fischhoff, Baruch; Davis, Alex L

    2014-09-16

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science.

  6. Uncertainty in chemistry.

    PubMed

    Menger, Fredric M

    2010-09-01

    It might come as a disappointment to some chemists, but just as there are uncertainties in physics and mathematics, there are some chemistry questions we may never know the answer to either, suggests Fredric M. Menger.

  7. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  8. Uncertainty in bulk-liquid hydrodynamics and biofilm dynamics creates uncertainties in biofilm reactor design.

    PubMed

    Boltz, J P; Daigger, G T

    2010-01-01

    While biofilm reactors may be classified as one of seven different types, the design of each is unified by fundamental biofilm principles. It follows that state-of-the-art design of each biofilm reactor type is subject to the same uncertainties (although the degree of uncertainty may vary). This paper describes unifying biofilm principles and uncertainties of importance in biofilm reactor design. This approach to biofilm reactor design represents a shift from the historical approach which was based on empirical criteria and design formulations. The use of such design criteria was largely due to inherent uncertainty over reactor-scale hydrodynamics and biofilm dynamics, which correlate with biofilm thickness, structure and function. An understanding of two fundamental concepts is required to rationally design biofilm reactors: bioreactor hydrodynamics and biofilm dynamics (with particular emphasis on mass transfer resistances). Bulk-liquid hydrodynamics influences biofilm thickness control, surface area, and development. Biofilm dynamics influences biofilm thickness, structure and function. While the complex hydrodynamics of some biofilm reactors such as trickling filters and biological filters have prevented the widespread use of fundamental biofilm principles and mechanistic models in practice, reactors utilizing integrated fixed-film activated sludge or moving bed technology provide a bulk-liquid hydrodynamic environment allowing for their application. From a substrate transformation perspective, mass transfer in biofilm reactors defines the primary difference between suspended growth and biofilm systems: suspended growth systems are kinetically (i.e., biomass) limited and biofilm reactors are primarily diffusion (i.e., biofilm growth surface area) limited.

  9. Conundrums with uncertainty factors.

    PubMed

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets.
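
    The singularity argument can be made concrete with a toy simulation: if the same animal-to-human factor is applied to both the subchronic and chronic response rates, the four logged rates satisfy an exact linear constraint and their covariance matrix is singular. The distributions below are arbitrary placeholders; only the shared-factor structure matters:

        import numpy as np

        rng = np.random.default_rng(8)
        n = 100_000

        # Toy model: logged response rates for animal/human x subchronic/chronic,
        # with the same animal-to-human factor for both dosing regimes.
        log_as = rng.normal(0.0, 1.0, n)   # animal, subchronic
        log_ac = rng.normal(0.5, 1.0, n)   # animal, chronic
        log_ua = rng.normal(1.0, 0.5, n)   # animal-to-human uncertainty factor
        log_hs = log_as + log_ua           # human, subchronic
        log_hc = log_ac + log_ua           # human, chronic

        C = np.cov([log_as, log_ac, log_hs, log_hc])
        print("smallest covariance eigenvalue:", np.linalg.eigvalsh(C)[0])
        # ~0: the constraint log_hs - log_as = log_hc - log_ac holds exactly, so
        # the covariance is singular and any analysis based on it conditions on
        # a set of probability zero.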

  10. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, the varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. The failure to include the uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that can be insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses the fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under uncertainties brought by the fluctuations in topology to meet the requirement that the network remains intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  11. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  12. SU(2) uncertainty limits

    NASA Astrophysics Data System (ADS)

    Shabbir, Saroosh; Björk, Gunnar

    2016-05-01

    Although progress has been made recently in defining nontrivial uncertainty limits for the SU(2) group, a description of the intermediate states bound by these limits remains lacking. In this paper we enumerate possible uncertainty relations for the SU(2) group that involve all three observables and that are, moreover, invariant under SU(2) transformations. We demonstrate, however, that these relations, even taken as a group, do not provide sharp, saturable bounds. To find sharp bounds, we systematically calculate the variance of the SU(2) operators for all pure states belonging to the N = 2 and N = 3 polarization excitation manifolds (corresponding to spin 1 and spin 3/2). Lastly, and perhaps counter to expectation, we note that even pure states can reach the maximum uncertainty limit.
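
    The variance scan is straightforward to reproduce in outline: for spin s the SU(2) operators satisfy Delta Sx^2 + Delta Sy^2 + Delta Sz^2 = s(s+1) - |<S>|^2, so the invariant variance sum ranges from s (spin-coherent states) to s(s+1) (states with vanishing mean spin). A sketch for the N = 2 (spin-1) manifold using random pure states, offered as an illustration rather than the authors' systematic calculation:

        import numpy as np

        s2 = 1 / np.sqrt(2)
        Sx = s2 * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)
        Sy = s2 * np.array([[0, -1j, 0], [1j, 0, -1j], [0, 1j, 0]])
        Sz = np.diag([1.0, 0.0, -1.0]).astype(complex)

        def var(psi, A):
            mean = np.vdot(psi, A @ psi).real
            return np.vdot(psi, A @ (A @ psi)).real - mean**2

        rng = np.random.default_rng(6)
        totals = []
        for _ in range(5000):
            v = rng.normal(size=3) + 1j * rng.normal(size=3)
            psi = v / np.linalg.norm(v)
            totals.append(var(psi, Sx) + var(psi, Sy) + var(psi, Sz))

        # For spin s the invariant sum is s(s+1) - |<S>|^2, i.e. within [s, s(s+1)]
        print(f"min = {min(totals):.3f}  max = {max(totals):.3f}  (expected within [1, 2])")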

  13. Uncertainty and Dimensional Calibrations

    PubMed Central

    Doiron, Ted; Stoup, John

    1997-01-01

    The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules. Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty statement is to inform the reader of how sure the writer is that the answer is in a certain range. This report explains how we have implemented these rules for dimensional calibrations of nine different types of gages: gage blocks, gage wires, ring gages, gage balls, roundness standards, optical flats, indexing tables, angle blocks, and sieves. PMID:27805114

  14. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  15. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  16. Uncertainty and Dimensional Calibrations.

    PubMed

    Doiron, Ted; Stoup, John

    1997-01-01

    The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules. Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty statement is to inform the reader of how sure the writer is that the answer is in a certain range. This report explains how we have implemented these rules for dimensional calibrations of nine different types of gages: gage blocks, gage wires, ring gages, gage balls, roundness standards, optical flats, indexing tables, angle blocks, and sieves.

  17. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement of many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten

  18. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances, and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given, and an optimal lower bound for the weighted sum of the variances is obtained in the general quantum situation. PMID:26984295
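
    One Maccone-Pati-type sum-of-variances bound reads Delta A^2 + Delta B^2 >= i<[A,B]> + |<psi_perp|(A - iB)|psi>|^2 for any |psi_perp> orthogonal to |psi>; the psi_perp term keeps the bound nontrivial where commutator-only bounds collapse. A numerical spot check with Pauli observables on a random qubit, offered as a sketch of the variance-sum idea rather than the weighted relations derived in the paper:

        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]])

        def var(psi, A):
            mean = np.vdot(psi, A @ psi).real
            return np.vdot(psi, A @ (A @ psi)).real - mean**2

        rng = np.random.default_rng(7)
        v = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi = v / np.linalg.norm(v)
        # The (up to phase) unique state orthogonal to a qubit |psi> = (a, b):
        perp = np.array([-np.conj(psi[1]), np.conj(psi[0])])

        lhs = var(psi, X) + var(psi, Y)
        comm = X @ Y - Y @ X
        rhs = (1j * np.vdot(psi, comm @ psi)).real \
              + abs(np.vdot(perp, (X - 1j * Y) @ psi)) ** 2
        print(f"Delta X^2 + Delta Y^2 = {lhs:.4f} >= {rhs:.4f}")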

  19. Measurement Uncertainty Estimation in Amperometric Sensors: A Tutorial Review

    PubMed Central

    Helm, Irja; Jalukse, Lauri; Leito, Ivo

    2010-01-01

    This tutorial focuses on measurement uncertainty estimation in amperometric sensors (both for liquid and gas-phase measurements). The main uncertainty sources are reviewed and their contributions are discussed in relation to the principles of operation of the sensors, measurement conditions and properties of the measured samples. The discussion is illustrated by case studies based on the two major approaches for uncertainty evaluation: the ISO GUM modeling approach and the Nordtest approach. This tutorial is expected to be of interest to workers in different fields of science who use measurements with amperometric sensors and need to evaluate the uncertainty of the obtained results but are new to the concept of measurement uncertainty. The tutorial is also intended to be educational, helping to make measurement results more accurate. PMID:22399887

  20. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  1. Uncertainty in NIST Force Measurements.

    PubMed

    Bartel, Tom

    2005-01-01

    This paper focuses upon the uncertainty of force calibration measurements at the National Institute of Standards and Technology (NIST). The uncertainty of the realization of force for the national deadweight force standards at NIST is discussed, as well as the uncertainties associated with NIST's voltage-ratio measuring instruments and with the characteristics of transducers being calibrated. The combined uncertainty is related to the uncertainty of dissemination for force transfer standards sent to NIST for calibration.

  2. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. Realistically assessing uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low'. The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  3. Uncertainties in successive measurements

    NASA Astrophysics Data System (ADS)

    Distler, Jacques; Paban, Sonia

    2013-06-01

    When you measure an observable, A, in quantum mechanics, the state of the system changes. This, in turn, affects the quantum-mechanical uncertainty in some noncommuting observable, B. The standard uncertainty relation puts a lower bound on the uncertainty of B in the initial state. What is relevant for a subsequent measurement of B, however, is the uncertainty of B in the postmeasurement state. We re-examine this problem, both in the case where A has a pure point spectrum and in the case where A has a continuous spectrum. In the latter case, the need to include a finite detector resolution, as part of what it means to measure such an observable, has dramatic implications for the result of successive measurements. Ozawa [Phys. Rev. A 67, 042105 (2003)] proposed an inequality satisfied in the case of successive measurements. Among our results, we show that his inequality is ineffective (can never come close to being saturated). For the cases of interest, we compute a sharper lower bound.

  4. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  5. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore the uncertainties in risk projections become a major safety concern and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age and gender specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because pharmaceutical countermeasures are expected to play only a limited role, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  6. Information content of systems as a physical principle

    NASA Astrophysics Data System (ADS)

    Czekaj, L.; Horodecki, M.; Horodecki, P.; Horodecki, R.

    2017-02-01

    To explain the conceptual gap between classical and quantum and other, hypothetical descriptions of the world, several principles have been proposed. So far, none of these principles has explicitly included the uncertainty relation. Here we introduce an information content principle (ICP) which represents a constrained uncertainty principle. The principle, by taking into account the encoding and decoding properties of a single physical system, is capable of separating both classical and quantum theory from a number of potential physical theories, including hidden variable theories. The ICP, which is satisfied by both classical and quantum theory, states that the amount of nonredundant information which may be extracted from a given system is bounded by the perfectly decodable information content of the system. We show that the ICP allows one to single out theories in which correlations cannot be stronger than Tsirelson's bound. We also show how to apply the principle to composite systems, ruling out some theories even though their elementary constituents behave quantum mechanically.

  7. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  8. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase-uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.

  9. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  10. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.
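
    A minimal numerical sketch of the qualitative effect reported here: when the loss contains an effort cost on corrections, the optimal correction for a fixed observed error shrinks as target uncertainty grows. The scoring-region half-width and effort weight are invented for illustration; this is a stand-in model, not the authors' fitted one.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def optimal_correction(error, sigma, half_width=0.5, effort_cost=0.2):
    """Expected loss = -(probability of landing inside the scoring region)
    + a quadratic effort cost on the correction. `error` is the observed
    visual error, `sigma` the posterior uncertainty about the true target."""
    def expected_loss(c):
        residual = error - c  # residual error after correcting by c
        p_hit = (norm.cdf(half_width, residual, sigma)
                 - norm.cdf(-half_width, residual, sigma))
        return -p_hit + effort_cost * c**2
    return minimize_scalar(expected_loss, bounds=(0.0, 2 * error),
                           method="bounded").x

for sigma in (0.2, 0.5, 1.0, 2.0):  # increasing target uncertainty
    c = optimal_correction(error=1.0, sigma=sigma)
    print(f"sigma={sigma:.1f}  correction={c:.2f} (of 1.0 error)")
```

    With low uncertainty the correction approaches the full observed error; with high uncertainty the hit probability barely responds to correcting, so the effort term dominates and the correction stays partial, mirroring the minimal intervention principle.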

  11. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    SciTech Connect

    Amoroso, Richard L.

    2010-12-22

    For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy dependent spacetime metric, M̂4±C4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

  12. Uncertainty relations and approximate quantum error correction

    NASA Astrophysics Data System (ADS)

    Renes, Joseph M.

    2016-09-01

    The uncertainty principle can be understood as constraining the probability of winning a game in which Alice measures one of two conjugate observables, such as position or momentum, on a system provided by Bob, and he is to guess the outcome. Two variants are possible: either Alice tells Bob which observable she measured, or he has to furnish guesses for both cases. Here I derive uncertainty relations for both, formulated directly in terms of Bob's guessing probabilities. For the former these relate to the entanglement that can be recovered by action on Bob's system alone. This gives an explicit quantum circuit for approximate quantum error correction using the guessing measurements for "amplitude" and "phase" information, implicitly used in the recent construction of efficient quantum polar codes. I also find a relation on the guessing probabilities for the latter game, which has application to wave-particle duality relations.

  13. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Amoroso, Richard L.

    2010-12-01

    For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy dependent spacetime metric, M̂4±C4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

  14. Radar stage uncertainty

    USGS Publications Warehouse

    Fulford, J.M.; Davies, W.J.

    2005-01-01

    The U.S. Geological Survey is investigating the performance of radars used for stage (or water-level) measurement. This paper presents a comparison of estimated uncertainties and data for radar water-level measurements with float, bubbler, and wire weight water-level measurements. The radar sensor was also temperature-tested in a laboratory. The uncertainty estimates indicate that radar measurements are more accurate than uncorrected pressure sensors at higher water stages, but are less accurate than pressure sensors at low stages. Field data at two sites indicate that radar sensors may have a small negative bias. Comparison of field radar measurements with wire weight measurements found that the radar tends to measure slightly lower values as stage increases. Copyright ASCE 2005.

  15. How Uncertain is Uncertainty?

    NASA Astrophysics Data System (ADS)

    Vámos, Tibor

    The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainty, and consequently a critical epistemic review of historical and recent approaches, computational methods, and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism, and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing the models of uncertainty, e.g. their statistical, other physical, and psychological background, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.

  16. Uncertainties in transpiration estimates.

    PubMed

    Coenders-Gerrits, A M J; van der Ent, R J; Bogaard, T A; Wang-Erlandsson, L; Hrachowitz, M; Savenije, H H G

    2014-02-13

    Arising from S. Jasechko et al. Nature 496, 347-350 (2013); doi:10.1038/nature11983. How best to assess the respective importance of plant transpiration over evaporation from open waters, soils and short-term storage such as tree canopies and understories (interception) has long been debated. On the basis of data from lake catchments, Jasechko et al. conclude that transpiration accounts for 80-90% of total land evaporation globally (Fig. 1a). However, another choice of input data, together with more conservative accounting of the related uncertainties, reduces and widens the transpiration ratio estimate to 35-80%. Hence, climate models do not necessarily conflict with observations, but more measurements on the catchment scale are needed to reduce the uncertainty range. There is a Reply to this Brief Communications Arising by Jasechko, S. et al. Nature 506, http://dx.doi.org/10.1038/nature12926 (2014).

  17. Multiresolutional models of uncertainty generation and reduction

    NASA Technical Reports Server (NTRS)

    Meystel, A.

    1989-01-01

    Kolmogorov's axiomatic principles of probability theory are reconsidered in the scope of their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified in order to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies algorithms of learning which are organized in three groups reflecting the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by, and consistent with, the multiresolutional model of knowledge representation which is reflected in the structure of models and the algorithms of learning.

  18. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is an NTE (not-to-exceed), MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE (e.g. nominal), then MGA values are reduced by A5 values and A5 is representative of the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass x assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
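
    The abstract's relations translate directly into code; a minimal sketch in which the subsystem names, masses, and MGA fractions are invented for illustration:

```python
def predicted_mass(basic_mass_kg, mga_fraction):
    """Predicted Mass = Basic Mass + MGA, where MGA = Basic Mass times the
    assessed percentage from the approved MGA depletion schedule."""
    return basic_mass_kg * (1.0 + mga_fraction)

# Hypothetical subsystem list: (basic mass in kg, assessed MGA fraction).
subsystems = {"structure": (120.0, 0.12), "avionics": (35.0, 0.25),
              "propulsion": (80.0, 0.08)}

basic_total = sum(m for m, _ in subsystems.values())
predicted_total = sum(predicted_mass(m, g) for m, g in subsystems.values())

# Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
aggregate_mga = (predicted_total - basic_total) / basic_total
print(f"basic={basic_total:.1f} kg  predicted={predicted_total:.1f} kg  "
      f"aggregate MGA={aggregate_mga:.1%}")
```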

  19. Aggregating and Communicating Uncertainty.

    DTIC Science & Technology

    1980-04-01

    means for identifying and communicating uncertainty. The appended bibliography includes Ajzen, Icek, "Intuitive Theories of Events and the Effects of Base-Rate Information on Prediction," cited for the finding that subjects attend "to the criterion while disregarding valid but noncausal information."

  20. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2°C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  1. Variants of Uncertainty

    DTIC Science & Technology

    1981-05-15

    Variants of Uncertainty. Daniel Kahneman, University of British Columbia; Amos Tversky, Stanford University. May 15, 1981. The surviving bibliographic fragments include Dennett, D.C., Brainstorms (Hassocks: Harvester, 1979), cited for a view of the mind "in which different parts have access to different data, assign them different weights and hold different views of the situation."

  2. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
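
    For contrast with the CUU approach, here is a minimal sketch of the classical least-squares calibration the abstract criticizes, on an invented one-parameter model with synthetic data; the comment marks where CUU departs from it.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy computer model with one parameter theta (illustrative only).
def model(theta, x):
    return np.exp(-theta * x)

x_obs = np.linspace(0.0, 2.0, 20)
rng = np.random.default_rng(0)
y_obs = model(1.3, x_obs) + rng.normal(0.0, 0.02, x_obs.size)  # synthetic data

# Classical calibration: minimize the squared model-data misfit. This treats
# the model as the "true" deterministic representation of reality; CUU would
# instead put error bars on both model and data, e.g. via a Bayesian
# posterior over theta plus a model-discrepancy term.
fit = least_squares(lambda th: model(th[0], x_obs) - y_obs, x0=[0.5])
print("calibrated theta:", fit.x[0])
```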

  3. Uncertainty Quantification in Aeroelasticity

    NASA Astrophysics Data System (ADS)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  4. Equivalence of wave-particle duality to entropic uncertainty.

    PubMed

    Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie

    2014-12-19

    Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter.
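
    The min- and max-entropies invoked here are simple to compute for a measurement outcome distribution. A sketch in bits, without the conditioning on a quantum memory that the full entropic WPDR treatment involves; the example distributions are invented:

```python
import numpy as np

def min_entropy(p):
    """H_min(P) = -log2 max_i p_i (bits)."""
    p = np.asarray(p, dtype=float)
    return -np.log2(p.max())

def max_entropy(p):
    """Renyi order-1/2 entropy, H_max(P) = 2*log2(sum_i sqrt(p_i)) (bits)."""
    p = np.asarray(p, dtype=float)
    return 2.0 * np.log2(np.sqrt(p[p > 0]).sum())

# Outcome distribution for measuring |0> in the Hadamard (+/-) basis:
p = [0.5, 0.5]
print(min_entropy(p), max_entropy(p))                 # both 1 bit when uniform
# A biased example:
print(min_entropy([0.9, 0.1]), max_entropy([0.9, 0.1]))
```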

  5. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary statistic for multimodal distributions. The uncertainty of the multi
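
    A minimal sketch of the per-cell density estimation and the peak-count "roughness" summary described above, using a Gaussian kernel density estimate on invented ensemble data (cell count, sample sizes, and distributions are all illustrative):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
xs = np.linspace(-3, 3, 200)  # support on which each PDF is evaluated

# Hypothetical multi-valued map row: 60 possible values at each of 8 cells.
cells = [rng.normal(rng.uniform(-1, 1), rng.uniform(0.2, 0.8), 60)
         for _ in range(8)]

for i, values in enumerate(cells):
    pdf = gaussian_kde(values)(xs)        # smooth density estimate per cell
    # "Roughness" = number of interior peaks (modes) in the estimated PDF.
    peaks = np.sum((pdf[1:-1] > pdf[:-2]) & (pdf[1:-1] > pdf[2:]))
    print(f"cell {i}: roughness (modes) = {peaks}")
```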

  6. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1972-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite, and from the longer range viewpoint afforded by the Geopause concept. Attention is focused on methods for short-arc tracking which are essentially geometric in nature. One uses combinations of lasers and collocated cameras. The other method relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

  7. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-Shui

    2015-06-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting, canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail.

  8. Position-momentum uncertainty relations based on moments of arbitrary order

    SciTech Connect

    Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.

    2011-05-15

    The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
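
    For orientation, the lowest-order (second-moment) case that this work generalizes can be checked numerically: a 1D Gaussian packet saturates the bound on the Heisenberg product. The grid, packet width, and units (hbar = 1) are choices made here, not taken from the paper.

```python
import numpy as np

# Check the second-moment inequality <x^2><p^2> >= 1/4 (hbar = 1,
# zero-mean state) for a normalized 1D Gaussian wave packet.
x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]
sigma = 1.7
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

x2 = np.sum(x**2 * psi**2) * dx             # <x^2> = sigma^2
p2 = np.sum(np.gradient(psi, dx)**2) * dx   # <p^2> = int |dpsi/dx|^2 dx
print(f"<x^2><p^2> = {x2 * p2:.4f}  (bound 0.25, saturated by Gaussians)")
```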

  9. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory.

    PubMed

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-06-29

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting, canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail.
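
    For reference, the standard two-measurement, no-memory bound that results like these tighten and extend is easy to evaluate. A sketch of the Maassen-Uffink entropic bound for a pair of qubit bases (the bases chosen here are the usual textbook example, not taken from the paper):

```python
import numpy as np

def mu_bound(basis_a, basis_b):
    """Maassen-Uffink bound: H(A) + H(B) >= -log2 max_{i,j} |<a_i|b_j>|^2
    for measurements in two orthonormal bases (columns), without memory."""
    overlaps = np.abs(basis_a.conj().T @ basis_b) ** 2
    return -np.log2(overlaps.max())

# Computational basis vs Hadamard basis of a qubit: mutually unbiased,
# so the bound takes its maximal value log2(d) = 1 bit.
comp = np.eye(2, dtype=complex)
had = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
print(mu_bound(comp, had))   # 1.0
```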

  10. Uncertainty bounds using sector theory

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Schmidt, David K.

    1989-01-01

    An approach based on sector-stability theory can furnish a description of the uncertainty associated with the frequency response of a model, given sector-bounds on the individual parameters of the model. The application of the sector-based approach to the formulation of useful uncertainty descriptions for linear, time-invariant multivariable systems is presently explored, and the approach is applied to two generic forms of parameter uncertainty in order to investigate its advantages and limitations. The results obtained show that sector-uncertainty bounds can be used to evaluate the impact of parameter uncertainties on the frequency response of the design model.

  11. Intuitions, principles and consequences

    PubMed Central

    Shaw, A

    2001-01-01

    Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences. Key Words: Intuitions • principles • consequences • utilitarianism PMID:11233371

  12. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta"—spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  13. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
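
    The growth of kinetic imprecision at low temperature mentioned here is commonly encoded as an uncertainty factor that increases away from 298 K, in the style of the JPL kinetics evaluations. A sketch with invented parameter values; the functional form is the conventional one, not a result from this paper:

```python
import numpy as np

def rate_uncertainty_factor(f298, g, T):
    """Temperature-dependent uncertainty factor for a rate coefficient,
    in the style of the JPL kinetics evaluations:
        f(T) = f(298) * exp(|g| * |1/T - 1/298|),
    where f298 is the factor at 298 K and g reflects uncertainty in E/R."""
    return f298 * np.exp(abs(g) * abs(1.0 / T - 1.0 / 298.0))

# Illustrative values: a reaction known to a factor of 1.2 at 298 K with
# g = 100 K becomes markedly less certain at Antarctic temperatures.
for T in (298.0, 260.0, 230.0):
    print(f"T = {T:.0f} K  ->  f(T) = {rate_uncertainty_factor(1.2, 100.0, T):.3f}")
```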

  14. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  15. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa.

  16. Is the Precautionary Principle Really Incoherent?

    PubMed

    Boyer-Kassem, Thomas

    2017-02-28

    The Precautionary Principle has been an increasingly important principle in international treaties since the 1980s. Through varying formulations, it states that when an activity can lead to a catastrophe for human health or the environment, measures should be taken to prevent it even if the cause-and-effect relationship is not fully established scientifically. The Precautionary Principle has been critically discussed from many sides. This article concentrates on a theoretical argument by Peterson (2006) according to which the Precautionary Principle is incoherent with other desiderata of rational decision making, and thus cannot be used as a decision rule that selects an action among several ones. I claim here that Peterson's argument fails to establish the incoherence of the Precautionary Principle, by attacking three of its premises. I argue (i) that Peterson's treatment of uncertainties lacks generality, (ii) that his Archimedian condition is problematic for incommensurability reasons, and (iii) that his explication of the Precautionary Principle is not adequate. This leads me to conjecture that the Precautionary Principle can be envisaged as a coherent decision rule, again.

  17. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  18. Uncertainty relations as Hilbert space geometry

    NASA Technical Reports Server (NTRS)

    Braunstein, Samuel L.

    1994-01-01

    Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position, and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable as well as merely what is allowed.

  19. Uncertainty and precaution in environmental management.

    PubMed

    Krayer von Krauss, M; van Asselt, M B A; Henze, M; Ravetz, J; Beck, M B

    2005-01-01

    In this paper, two different visions of the relationship between science and policy are contrasted with one another: the "modern" vision and the "precautionary" vision. Conditions which must apply in order to invoke the Precautionary Principle are presented, as are some of the main challenges posed by the principle. The following central question remains: If scientific certainty cannot be provided, what may then justify regulatory interventions, and what degree of intervention is justifiable? The notion of "quality of information" is explored, and it is emphasized that there can be no absolute definition of good or bad quality. Collective judgments of quality are only possible through deliberation on the characteristics of the information, and on the relevance of the information to the policy context. Reference to a relative criterion therefore seems inevitable and legal complexities are to be expected. Uncertainty is presented as a multidimensional concept, reaching far beyond the conventional statistical interpretation of the concept. Of critical importance is the development of methods for assessing qualitative categories of uncertainty. Model quality assessment should observe the following rationale: identify a model that is suited to the purpose, yet bears some reasonable resemblance to the "real" phenomena. In this context, "purpose" relates to the policy and societal contexts in which the assessment results are to be used. It is therefore increasingly agreed that judgment of the quality of assessments necessarily involves the participation of non-modellers and non-scientists. A challenging final question is: How to use uncertainty information in policy contexts? More research is required in order to answer this question.

  20. Quantum Uncertainty Considerations for Gravitational Lens Interferometry

    NASA Astrophysics Data System (ADS)

    Doyle, Laurance R.; Carico, D. P.

    2007-05-01

    The measurement of the gravitational lens delay time between light paths has relied, to date, on the source having sufficient variability to allow photometric variations from each path to be compared. However, the delay times of many gravitational lenses cannot be measured because the intrinsic source amplitude variations are too small to be detectable. At the fundamental quantum mechanical level, such photometric “time stamps” allow which-path knowledge, removing the ability to obtain an interference pattern. However, if the two paths can be made equal (zero time delay) then interference can occur. We describe an interferometric approach to measuring gravitational lens delay times using a “quantum-eraser/restorer” approach, whereby the travel times along the two paths may be rendered measurably equal. Energy and time being non-commuting observables, constraints on the photon energy in the energy-time uncertainty principle, via adjustments of the width of the radio bandpass, dictate the uncertainty of the time delay and therefore whether the path taken along one or the other gravitational lens geodesic is “knowable.” If one starts with an interference pattern, for example, which-path information returns when the bandpass is broadened (constraints on the energy are relaxed) to the point where the uncertainty principle allows a knowledge of the arrival time to better than the gravitational lens delay time itself, at which point the interference pattern will disappear. We discuss the near-term feasibility of such measurements in light of current narrow-band radio detectors and known short time-delay gravitational lenses.
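
    An order-of-magnitude version of the bandpass constraint described above: to keep photon arrival times unknowable to within the lens delay tau, the bandpass must satisfy roughly dnu < 1/(2*pi*tau). The delay values below are hypothetical, and the factor of 2*pi is a conventional choice for the energy-time relation, not a figure from the paper.

```python
import math

def max_bandpass_hz(delay_seconds):
    """Rough upper limit on the bandpass width such that arrival-time
    uncertainty (dt ~ 1/(2*pi*dnu)) exceeds the lens delay."""
    return 1.0 / (2.0 * math.pi * delay_seconds)

for delay_days in (0.1, 1.0, 30.0):   # hypothetical short lens delays
    tau = delay_days * 86400.0        # days -> seconds
    print(f"{delay_days:5.1f} d  ->  bandpass < {max_bandpass_hz(tau):.2e} Hz")
```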

  1. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of strong-earthquake loss assessment when worldwide systems are applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the disaster scope. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in strong-event parameter determination by alert seismological surveys, and of the simulation models used at all stages, from estimating shaking intensity

  2. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification on the uncertainty bound for particular observer, therefore it could be witnessed by proper uncertainty game experimentally. We first investigate an uncertainty game between a free falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of static observer, which is dependent on the mass of the black hole, the distance of observer from event horizon, and the mode frequency of quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound with other uncertainty probe, e.g., time-energy uncertainty. In an alternative game between two static players, we show that quantum information of qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 ⁡ c. Numerically estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion on the black hole firewall paradox in the context of entropic uncertainty relation is given.

  3. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

  4. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  5. DOD ELAP Lab Uncertainties

    DTIC Science & Technology

    2012-03-01

    Accreditation and training programs discussed include IPV6, NLLAP, and NEFAP. Certification bodies are accredited to ISO/IEC 17021 for management system certification, covering bodies that certify to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US automotive), etc. The DoD QSM 4.2 standard and ISO/IEC 17025:2005 each address measurement uncertainty. Presented at the 9th Annual DoD Environmental Monitoring and Data Quality (EDMQ) Workshop, held 26-29 March 2012 in La Jolla, CA.

  6. Generalized uncertainty relations

    NASA Astrophysics Data System (ADS)

    Herdegen, Andrzej; Ziobro, Piotr

    2017-04-01

    The standard uncertainty relations (UR) in quantum mechanics are typically used for unbounded operators (like the canonical pair). This implies the need for the control of the domain problems. On the other hand, the use of (possibly bounded) functions of basic observables usually leads to more complex and less readily interpretable relations. In addition, UR may turn trivial for certain states if the commutator of observables is not proportional to a positive operator. In this letter we consider a generalization of standard UR resulting from the use of two, instead of one, vector states. The possibility to link these states to each other in various ways adds additional flexibility to UR, which may compensate some of the above-mentioned drawbacks. We discuss applications of the general scheme, leading not only to technical improvements, but also to interesting new insight.

  7. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  8. Medical decisions under uncertainty.

    PubMed

    Carmi, A

    1993-01-01

    The court applies the criteria of the reasonable doctor and common practice in order to consider the behaviour of a defendant physician. The meaning of our demand that the doctor expects that his or her acts or omissions will bring about certain implications is that, according to the present circumstances and subject to the limited knowledge of the common practice, the course of certain events or situations in the future may be assumed in spite of the fog of uncertainty which surrounds us. The miracles and wonders of creation are concealed from us, and we are not aware of the way and the nature of our bodily functioning. Therefore, there seems to be no way to avoid mistakes, because in several cases the correct diagnosis cannot be determined even with the most advanced application of all information available. Doctors find it difficult to admit that they grope in the dark. They wish to form clear and accurate diagnoses for their patients. The fact that their profession is faced with innumerable and unavoidable risks and mistakes is hard to swallow, and many of them claim that in their everyday work this does not happen. They should not content themselves by changing their style. A radical metamorphosis is needed. They should not be tempted to formulate their diagnoses in 'neutral' statements in order to be on the safe side. Uncertainty should be accepted and acknowledged by the profession and by the public at large as a human phenomenon, as an integral part of any human decision, and as a clear characteristic of any legal or medical diagnosis.(ABSTRACT TRUNCATED AT 250 WORDS)

  9. Direct tests of measurement uncertainty relations: what it takes.

    PubMed

    Busch, Paul; Stevens, Neil

    2015-02-20

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation, are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables.

  10. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  11. Impact of discharge data uncertainty on nutrient load uncertainty

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorous and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorous and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
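
    A schematic version of this propagation chain, with a generic power-law rating curve Q = a(h - h0)^b standing in for the sampled rating-curve realisations (the Voting Point method itself is not reproduced here); all parameter distributions, stage, and concentration data are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Uncertain rating-curve parameters: one draw per curve realisation.
n_curves = 2000
a  = rng.normal(5.0, 0.5, n_curves)
b  = rng.normal(1.8, 0.1, n_curves)
h0 = rng.normal(0.2, 0.05, n_curves)

stage = rng.uniform(0.5, 3.0, 365)    # hypothetical daily stage (m)
conc = rng.uniform(0.02, 0.08, 365)   # hypothetical daily P conc. (mg/l)

# Propagate each rating-curve realisation to an annual load estimate.
loads = []
for i in range(n_curves):
    q = a[i] * np.clip(stage - h0[i], 0.0, None) ** b[i]  # discharge (m3/s)
    loads.append(np.sum(q * conc * 86.4))  # kg/day: (m3/s)*(mg/l)*86.4

print("annual P load, 5th/50th/95th percentile (kg):",
      np.percentile(loads, [5, 50, 95]).round(0))
```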

  12. Uncertainty quantification in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Rizzi, Francesco

    This dissertation focuses on uncertainty quantification (UQ) in molecular dynamics (MD) simulations. The application of UQ to molecular dynamics is motivated by the broad uncertainty characterizing MD potential functions and by the complexity of the MD setting, where even small uncertainties can be amplified to yield large uncertainties in the model predictions. Two fundamental, distinct sources of uncertainty are investigated in this work, namely parametric uncertainty and intrinsic noise. Intrinsic noise is inherently present in the MD setting, due to fluctuations originating from thermal effects. Averaging methods can be exploited to reduce the fluctuations, but due to finite sampling, this effect cannot be completely filtered, thus yielding a residual uncertainty in the MD predictions. Parametric uncertainty, on the contrary, is introduced in the form of uncertain potential parameters, geometry, and/or boundary conditions. We address the UQ problem in both its main components, namely the forward propagation, which aims at characterizing how uncertainty in model parameters affects selected observables, and the inverse problem, which involves the estimation of target model parameters based on a set of observations. The dissertation highlights the challenges arising when parametric uncertainty and intrinsic noise combine to yield non-deterministic, noisy MD predictions of target macroscale observables. Two key probabilistic UQ methods, namely Polynomial Chaos (PC) expansions and Bayesian inference, are exploited to develop a framework that enables one to isolate the impact of parametric uncertainty on the MD predictions and, at the same time, properly quantify the effect of the intrinsic noise. Systematic applications to a suite of problems of increasing complexity lead to the observation that an uncertain PC representation built via Bayesian regression is the most suitable model for the representation of uncertain MD predictions of target observables in the
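
    A minimal sketch of the forward-propagation ingredient: a one-dimensional polynomial chaos expansion of an observable that depends on a standard-normal uncertain parameter, built by Gauss-Hermite projection. The toy observable is invented and stands in for an MD prediction; the dissertation's Bayesian treatment of intrinsic noise is not shown.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def pce_coefficients(f, order, n_quad=40):
    """Coefficients c_n of f(xi) = sum_n c_n He_n(xi), xi ~ N(0,1), via
    Gauss-Hermite quadrature (probabilists' Hermite polynomials He_n,
    orthogonal with <He_m He_n> = n! under the standard normal)."""
    x, w = He.hermegauss(n_quad)          # nodes/weights, weight exp(-x^2/2)
    return np.array([
        np.sum(w * f(x) * He.hermeval(x, [0] * n + [1]))
        / (np.sqrt(2 * np.pi) * factorial(n))
        for n in range(order + 1)
    ])

# Toy "observable" depending on one uncertain parameter (illustrative only).
f = lambda xi: np.exp(0.3 * xi)
c = pce_coefficients(f, order=4)
mean = c[0]
variance = sum(c[n] ** 2 * factorial(n) for n in range(1, 5))
print(f"PC mean = {mean:.4f} (exact {np.exp(0.045):.4f}), "
      f"PC variance = {variance:.4f}")
```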

  13. Uncertainty and Surprise: An Introduction

    NASA Astrophysics Data System (ADS)

    McDaniel, Reuben R.; Driebe, Dean J.

    Much of the traditional scientific and applied scientific work in the social and natural sciences has been built on the supposition that the unknowability of situations is the result of a lack of information. This has led to an emphasis on uncertainty reduction through ever-increasing information seeking and processing, including better measurement and observational instrumentation. Pending uncertainty reduction through better information, efforts are devoted to uncertainty management and hierarchies of controls. A central goal has been the avoidance of surprise.

  14. Higher-order uncertainty relations

    NASA Astrophysics Data System (ADS)

    Wünsche, A.

    2006-07-01

    Using the non-negativity of Gram determinants of arbitrary order, we derive higher-order uncertainty relations for the symmetric uncertainty matrices of corresponding order n > 2 for n Hermitean operators (n = 2 is the usual case). The special cases of third-order and fourth-order uncertainty relations are considered in detail. The obtained third-order uncertainty relations are applied to the Lie groups SU(1,1) with three Hermitean basis operators (K1,K2,K0) and SU(2) with three Hermitean basis operators (J1,J2,J3) where, in particular, the group-coherent states of Perelomov type and of Barut-Girardello type for SU(1,1) and the spin or atomic coherent states for SU(2) are investigated. The uncertainty relations for the determinant of the third-order uncertainty matrix are satisfied with the equality sign for coherent states and this determinant becomes vanishing for the Perelomov type of coherent states for SU(1,1) and SU(2). As an example of the application of fourth-order uncertainty relations, we consider the canonical operators (Q1,P1,Q2,P2) of two boson modes and the corresponding uncertainty matrix formed by the operators of the corresponding mean deviations, taking into account the correlations between the two modes. In two mathematical appendices, we prove the non-negativity of the determinant of correlation matrices of arbitrary order and clarify the principal structure of higher-order uncertainty relations.
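
    The backbone of these relations, the non-negativity of Gram determinants, can be checked numerically. A sketch for the third-order case with spin-3/2 operators and a random pure state (spin-3/2 is chosen so the three deviation vectors can be linearly independent; for spin coherent states the determinant vanishes, as the abstract notes):

```python
import numpy as np

rng = np.random.default_rng(3)

def spin_ops(j):
    """Spin-j operators (hbar = 1) in the |j, m> basis, m = j..-j."""
    m = np.arange(j, -j - 1, -1)
    jz = np.diag(m).astype(complex)
    # <m+1|J+|m> = sqrt(j(j+1) - m(m+1)) on the superdiagonal.
    raise_elems = np.sqrt(j * (j + 1) - m[1:] * (m[1:] + 1))
    jp = np.diag(raise_elems, k=1).astype(complex)
    return (jp + jp.conj().T) / 2, (jp - jp.conj().T) / (2j), jz

ops = spin_ops(1.5)  # spin-3/2 (4-dimensional Hilbert space)

psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)  # random pure state

# Gram matrix G_ij = <psi|dJ_i dJ_j|psi> of the deviation vectors dJ_i|psi>,
# with dJ_i = J_i - <J_i>. A Gram matrix is positive semidefinite, so
# det(G) >= 0: the non-negativity underlying the higher-order relations.
vecs = [(J - (psi.conj() @ J @ psi) * np.eye(4)) @ psi for J in ops]
G = np.array([[vi.conj() @ vj for vj in vecs] for vi in vecs])
print("det G =", np.linalg.det(G).real, "(>= 0)")
```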

  15. Simplified propagation of standard uncertainties

    SciTech Connect

    Shull, A.H.

    1997-06-09

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or can one calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
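
    A sketch of the subgroup idea: relative standard uncertainties combine in quadrature across multiplicative factors, absolute standard uncertainties across additive terms. The prepared-standard numbers below are invented, and this simple quadrature assumes the independent, first-order conditions the paper states.

```python
import math

def combine_relative(*rel_uncertainties):
    """Multiplicative factors: combine relative standard uncertainties
    in quadrature; the result is itself a relative uncertainty."""
    return math.sqrt(sum(u**2 for u in rel_uncertainties))

def combine_absolute(*abs_uncertainties):
    """Additive terms: combine absolute standard uncertainties in
    quadrature; the result is in the units of the quantity itself."""
    return math.sqrt(sum(u**2 for u in abs_uncertainties))

# Hypothetical prepared standard: concentration = mass / (v1 + v2).
u_v_total = combine_absolute(0.05, 0.02)   # ml, two additive volumes
v_total = 100.0 + 10.0                     # ml
mass, u_mass = 0.5500, 0.0002              # g

rel_u = combine_relative(u_mass / mass, u_v_total / v_total)
conc = mass / v_total * 1000               # mg/ml
print(f"conc = {conc:.3f} mg/ml  +/- {conc * rel_u:.4f} mg/ml (k=1)")
```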

  16. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. As a result, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  17. Principlism and communitarianism.

    PubMed

    Callahan, D

    2003-10-01

    The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The more serious problem the author finds to be its blocking function. Discussing the four scenarios the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help.

  18. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  19. Uncertainties in Arctic Precipitation

    NASA Astrophysics Data System (ADS)

    Majhi, I.; Alexeev, V. A.; Cherry, J. E.; Cohen, J. L.; Groisman, P. Y.

    2012-12-01

    Arctic precipitation measurements are riddled with biases, and addressing this problem is imperative. Our study compares various datasets for the region of Siberia, analyzes their biases, and notes the caution needed when using them. Five sources of data were used, ranging from NOAA's products (the raw data and Bogdanova's correction) and Yang's correction technique to two reanalysis products (ERA-Interim and NCEP). The reanalysis datasets performed better for some months than Yang's product, which tends to overestimate precipitation, and the raw dataset, which tends to underestimate it. The sources of bias vary from topography to wind to missing data. The final three products chosen show higher biases during the winter and spring seasons. Emphasis on equations which incorporate blizzards, blowing snow, and higher wind speeds is necessary for regions influenced by any or all of these factors. Bogdanova's correction technique is the most robust of all the datasets analyzed and gives the most reasonable results. One of our future goals is to analyze the impact of precipitation uncertainties on water budget analysis for the Siberian rivers.

  20. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  1. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  2. Uncertainties in nuclear fission data

    NASA Astrophysics Data System (ADS)

    Talou, Patrick; Kawano, Toshihiko; Chadwick, Mark B.; Neudecker, Denise; Rising, Michael E.

    2015-03-01

    We review the current status of our knowledge of nuclear fission data, and quantify uncertainties related to each fission observable whenever possible. We also discuss the roles that theory and experiment play in reducing those uncertainties, contributing to the improvement of our fundamental understanding of the nuclear fission process as well as of evaluated nuclear data libraries used in nuclear applications.

  3. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  4. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  5. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and of other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in the autonomous energy efficiency improvement (AEEI). The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean
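
    The statistical construction described above can be sketched in a few lines: estimate variability and cross-country correlation from historical growth data, then sample from the fitted distribution to drive emissions projections. The numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical labor-productivity growth rates (% per year)
# for two countries; rows are years.
hist = np.array([[1.8, 2.2], [0.9, 1.4], [2.4, 2.0],
                 [1.2, 1.1], [1.9, 2.3], [1.5, 1.7]])

mu = hist.mean(axis=0)            # expected growth paths
cov = np.cov(hist, rowvar=False)  # variability and correlation of the drivers

# Sample joint growth-rate trajectories for an uncertainty analysis
draws = rng.multivariate_normal(mu, cov, size=10_000)
print("sampled means      :", draws.mean(axis=0))
print("sampled correlation:", np.corrcoef(draws, rowvar=False)[0, 1])
```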

  6. Equivalence theorem of uncertainty relations

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2017-01-01

    We present an equivalence theorem to unify the two classes of uncertainty relations, i.e. the variance-based ones and the entropic forms, showing that the entropy of an operator in a quantum system can be built from the variances of a set of commutative operators. This means that an uncertainty relation in the language of entropy may be mapped onto a variance-based one, and vice versa. Employing the equivalence theorem, alternative formulations of entropic uncertainty relations are obtained for the qubit system that are stronger than the existing ones in the literature, and variance-based uncertainty relations for spin systems are reached from the corresponding entropic uncertainty relations.

  7. The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.

    PubMed

    Antonopoulou, Lila; van Meurs, Philip

    2003-11-01

    The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as is argued here, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof based on the "cause-effect" model has been produced. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges, including the case of "mad cow" disease and the case of the production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health.

  8. Participatory Development Principles and Practice: Reflections of a Western Development Worker.

    ERIC Educational Resources Information Center

    Keough, Noel

    1998-01-01

    Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

  9. Instructional Software Design Principles.

    ERIC Educational Resources Information Center

    Hazen, Margret

    1985-01-01

    Discusses learner/computer interaction, learner control, sequencing of instructional events, and graphic screen design as effective principles for the design of instructional software, including tutorials. (MBR)

  10. Generalized Uncertainty Relation and Hawking Radiation of the Black Hole

    NASA Astrophysics Data System (ADS)

    Zhao, Ren; Zhang, Lichun; Wu, Yueqin; Li, Huaifan

    2009-08-01

    Recently, there has been much attention devoted to the correction to the black hole radiation spectrum and the quantum corrections to the Bekenstein-Hawking entropy. In particular, many researchers have focused on the coefficient of the logarithmic term in the black hole entropy correction. In this paper, we calculate the radiation spectrum of an arbitrary-dimensional Schwarzschild black hole after taking the generalized uncertainty principle into account. The correction to the Bekenstein-Hawking entropy is derived.
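
    For orientation, the generalized uncertainty principle is commonly written as Delta_x * Delta_p >= (hbar/2)(1 + beta * Delta_p^2); the paper's conventions may differ in detail. Under that standard form, a short symbolic check recovers the minimal length hbar*sqrt(beta):

```python
import sympy as sp

dp, beta, hbar = sp.symbols("Delta_p beta hbar", positive=True)

# GUP lower bound on the position uncertainty:
# Delta_x >= (hbar/2) * (1/Delta_p + beta*Delta_p)
dx_min = (hbar / 2) * (1 / dp + beta * dp)

# Minimize over Delta_p to obtain the minimal observable length
dp_star = sp.solve(sp.diff(dx_min, dp), dp)[0]
print(dp_star)                                 # 1/sqrt(beta)
print(sp.simplify(dx_min.subs(dp, dp_star)))   # hbar*sqrt(beta)
```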

  11. The source dilemma hypothesis: Perceptual uncertainty contributes to musical emotion.

    PubMed

    Bonin, Tanor L; Trainor, Laurel J; Belyk, Michel; Andrews, Paul W

    2016-09-01

    Music can evoke powerful emotions in listeners. Here we provide the first empirical evidence that the principles of auditory scene analysis and evolutionary theories of emotion are critical to a comprehensive theory of musical emotion. We interpret these data in light of a theoretical framework termed "the source dilemma hypothesis," which predicts that uncertainty in the number, identity or location of sound objects elicits unpleasant emotions by presenting the auditory system with an incoherent percept, thereby motivating listeners to resolve the auditory ambiguity. We describe two experiments in which source location and timbre were manipulated to change uncertainty in the auditory scene. In both experiments, listeners rated tonal and atonal melodies with congruent auditory scene cues as more pleasant than melodies with incongruent auditory scene cues. These data suggest that music's emotive capacity relies in part on the perceptual uncertainty it produces regarding the auditory scene.

  12. Improved uncertainty relation in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan; Fei, Shao-Ming; Li-Jost, Xianqing

    2016-12-01

    Berta et al's uncertainty principle in the presence of quantum memory (Berta et al 2010 Nat. Phys. 6 659) reveals uncertainties with quantum side information between the observables. In the recent important work of Coles and Piani (2014 Phys. Rev. A 89 022112), the entropic sum is controlled by the first and second maximum overlaps between the two projective measurements. We generalize the entropic uncertainty relation in the presence of quantum memory and find the exact dependence on all d largest overlaps between two measurements on any d-dimensional Hilbert space. Our bound is rigorously shown to be strictly tighter than previous entropic bounds in the presence of quantum memory, and it has potential applications to quantum cryptography with entanglement witnesses and to quantum key distribution.

  13. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore, an alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed, and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
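
    A minimal Monte Carlo sketch of the Poisson lognormal model described above: a Poisson counting component is multiplied by a lognormal normalization component whose ln(GSD) is taken from the paper's 0.31-0.35 range. The count value is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

counts = 400          # hypothetical net counts for a single urine measurement
ln_gsd_norm = 0.33    # lognormal normalization uncertainty (paper: 0.31-0.35)

counting = rng.poisson(counts, n) / counts          # relative counting fluctuation
normalization = rng.lognormal(0.0, ln_gsd_norm, n)  # normalization fluctuation
total = counting * normalization                    # combined relative result

# ln(GSD) of the combined measurement uncertainty
print("ln(GSD_total) =", np.log(total).std(ddof=1))
```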

  14. Credible Computations: Standard and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

    1995-01-01

    The discipline of computational fluid dynamics (CFD) is at a crossroad. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is consistency and uniformity in this process and in the interpretation of its outcome. The key element in establishing credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: a computer code, in terms of its logic, numerics, and fluid dynamics, and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled by verification and validation of these requirements. The verification process assesses whether the problem is solved correctly, and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and its results highlight which computed and integrated quantities need to be determined accurately and which quantities do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties

  15. Error-disturbance uncertainty relations in neutron spin measurements

    NASA Astrophysics Data System (ADS)

    Sponar, Stephan

    2016-05-01

    Heisenberg's uncertainty principle, in its formulation in terms of uncertainties intrinsic to any quantum system, has been rigorously proven and demonstrated in various quantum systems. Nevertheless, Heisenberg's original formulation of the uncertainty principle was given in terms of a reciprocal relation between the error of a position measurement and the disturbance thereby induced on a subsequent momentum measurement. However, a naive generalization of a Heisenberg-type error-disturbance relation to arbitrary observables is not valid. An alternative, universally valid relation was derived by Ozawa in 2003. Though universally valid, Ozawa's relation is not optimal. Recently, Branciard derived a tight error-disturbance uncertainty relation (EDUR) describing the optimal trade-off between error and disturbance under certain conditions. Here, we report a neutron-optical experiment that records the error of a spin-component measurement, as well as the disturbance caused on another spin component, to test EDURs. We demonstrate that Heisenberg's original EDUR is violated, that Ozawa's and Branciard's EDURs are valid over a wide range of experimental parameters, and that Branciard's relation is tight.

  16. Uncertainty relations for characteristic functions

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz; Tasca, D. S.; Walborn, S. P.

    2016-02-01

    We present the uncertainty relation for the characteristic functions (ChUR) of the quantum mechanical position and momentum probability distributions. This inequality is more general than the Heisenberg uncertainty relation and is saturated in two extreme cases for wave functions described by periodic Dirac combs. We further discuss a broad spectrum of applications of the ChUR; in particular, we constrain quantum optical measurements involving general detection apertures and provide the uncertainty relation that is relevant for loop quantum cosmology. A method to measure the characteristic function directly using an auxiliary qubit is also briefly discussed.

  17. Assessing uncertainty in physical constants

    NASA Astrophysics Data System (ADS)

    Henrion, Max; Fischhoff, Baruch

    1986-09-01

    Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.

  18. Uncertainty of testing methods--what do we (want to) know?

    PubMed

    Paparella, Martin; Daneshian, Mardas; Hornek-Gausterer, Romana; Kinzl, Maximilian; Mauritz, Ilse; Mühlegger, Simone

    2013-01-01

    It is important to stimulate innovation for regulatory testing methods. Scrutinizing our knowledge of the (un)certainty of data from current standard in vivo methods could foster interest in new testing approaches. Since standard in vivo data often are used as reference data for model development, improved accounting of uncertainty also would support the validation of new in vitro and in silico methods, as well as the definition of acceptance criteria for the new methods. Hazard and risk estimates that are transparent about their uncertainty could further support the 3Rs, since they may help focus additional information requirements on the aspects of highest uncertainty. Here we provide an overview of the various types of uncertainty in quantitative and qualitative terms and suggest improving this knowledge base. We also reference principal concepts on how to use uncertainty information for improved hazard characterization and the development of new testing methods.

  19. Assessment Principles and Tools

    PubMed Central

    Golnik, Karl C.

    2014-01-01

    The goal of ophthalmology residency training is to produce competent ophthalmologists. Competence can only be determined by appropriately assessing resident performance. There are accepted guiding principles that should be applied to competence assessment methods. These principles are enumerated herein and ophthalmology-specific assessment tools that are available are described. PMID:24791100

  20. Basic principles in the radiation dosimetry of nuclear medicine.

    PubMed

    Stabin, Michael; Xu, Xie George

    2014-05-01

    The basic principles of the use of radiation dosimetry in nuclear medicine are reviewed. The basic structure of the main mathematical equations is given, and formal dosimetry systems are discussed. An extensive overview of the history and current status of anthropomorphic models (phantoms) is given. The sources and magnitudes of uncertainties in calculated internal dose estimates are reviewed.

  1. Group environmental preference aggregation: the principle of environmental justice

    SciTech Connect

    Davos, C.A.

    1986-01-01

    The aggregation of group environmental preferences presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as environmental justice, is established here based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be decided such that their supporters would least mind being anyone at random in the new environment. The application of the principle is also discussed. Its only information requirement is a ranking of alternative choices by each interested party. 25 references.

  2. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed constructions, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin-film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation, non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.

  3. Estimations of uncertainties of frequencies

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Nicoletti, Jean-Marc; Morgenthaler, Stephan

    2016-10-01

    Diverse variable phenomena in the Universe are periodic. Astonishingly, many of the periodic signals present in stars have timescales coinciding with human ones (from minutes to years). The periods of signals often have to be deduced from time series which are irregularly sampled and sparse; furthermore, correlations between the brightness measurements and their estimated uncertainties are common. The uncertainty of the frequency estimation is reviewed. We explore the astronomical and statistical literature, for both regular and irregular samplings. The frequency uncertainty depends on the signal-to-noise ratio, the frequency, and the observational timespan. The shape of the light curve should also intervene, since sharp features such as exoplanet transits, stellar eclipses, and the rising branches of pulsating stars give stringent constraints. We propose several procedures (parametric and nonparametric) to estimate the uncertainty of the frequency, which are subsequently tested against simulated data to assess their performance.
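
    One nonparametric procedure along the lines sketched above can be mimicked by brute force: simulate noisy, irregularly sampled sinusoids, refit the frequency for each noise realization, and take the scatter of the fitted frequencies as the uncertainty. This is an illustrative sketch, not one of the paper's procedures; all values are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
f_true, amp, sigma, T = 1.234, 1.0, 0.3, 100.0  # frequency, amplitude, noise, timespan
t = np.sort(rng.uniform(0.0, T, 200))           # irregular sampling times

def best_frequency(y, t, f_grid):
    """Grid search over frequency; amplitude/phase by linear least squares."""
    best_f, best_chi2 = None, np.inf
    for f in f_grid:
        X = np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        chi2 = np.sum((y - X @ coef) ** 2)
        if chi2 < best_chi2:
            best_f, best_chi2 = f, chi2
    return best_f

f_grid = np.linspace(1.22, 1.25, 601)
fits = [best_frequency(amp * np.sin(2 * np.pi * f_true * t)
                       + sigma * rng.normal(size=t.size), t, f_grid)
        for _ in range(100)]
print("empirical frequency uncertainty:", np.std(fits))
```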

  4. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections.

  5. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the
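
    The statistical core of the disparity approach can be sketched as follows: within one interrogation window, the mean of the particle-image disparity vectors indicates a systematic error, while their dispersion yields the random error. This is a simplified sketch of that principle, not the authors' full estimator, and the input data are synthetic:

```python
import numpy as np

def disparity_statistics(disparity):
    """Per-component error indicators from particle-image disparity vectors.

    disparity: (N, 2) residual match distances (pixels) in one window.
    Returns (systematic, random) error estimates.
    """
    d = np.asarray(disparity, dtype=float)
    systematic = d.mean(axis=0)                       # mean disparity -> bias
    random = d.std(axis=0, ddof=1) / np.sqrt(len(d))  # dispersion of the mean
    return systematic, random

# Synthetic disparities: a small bias plus random scatter
rng = np.random.default_rng(3)
disp = rng.normal(loc=[0.05, -0.02], scale=0.10, size=(40, 2))
bias, rand = disparity_statistics(disp)
print("systematic error (px):", bias)
print("random error (px)    :", rand)
```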

  6. Dynamical Realism and Uncertainty Propagation

    NASA Astrophysics Data System (ADS)

    Park, Inkwan

    In recent years, Space Situational Awareness (SSA) has become increasingly important as the number of tracked Resident Space Objects (RSOs) continues to grow. One of the most significant technical discussions in SSA is how to propagate state uncertainty in a manner consistent with the highly nonlinear dynamical environment. Various methods have been proposed to propagate uncertainty accurately by capturing the nonlinearity of the dynamical system. We notice that all of these methods focus on describing the dynamical system as precisely as possible from a mathematical perspective. This study proposes a new perspective based on understanding the dynamics of the evolution of uncertainty itself. We expect that profound insights into the dynamical system could open the possibility of developing a new method for accurate uncertainty propagation. These considerations lead naturally to the goals of the study. First, we investigate the most dominant factors in the evolution of uncertainty in order to realize the dynamical system more rigorously. Second, we aim at developing a new method, based on the first investigation, that enables efficient orbit uncertainty propagation while maintaining accuracy. We eliminate the short-period variations from the dynamical system, yielding what we call a simplified dynamical system (SDS), to investigate the most dominant factors. To achieve this goal, the Lie transformation method is introduced, since this transformation can define the solutions for each variation separately. From the first investigation, we conclude that the secular variations, including the long-period variations, are dominant for the propagation of uncertainty, i.e., short-period variations are negligible. We then develop the new method by combining the SDS with a higher-order nonlinear expansion method, namely state transition tensors (STTs). The new method retains the advantages of the SDS and the STTs and propagates

  7. Thermodynamic and relativistic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Artamonov, A. A.; Plotnikov, E. M.

    2017-01-01

    The thermodynamic uncertainty relation (UR) was verified experimentally. The experiments have shown the validity of the quantum analogue of the zeroth law of stochastic thermodynamics in the form of the saturated Schrödinger UR. We have also proposed a new type of UR for relativistic mechanics. These relations allow us to consider macroscopic phenomena within the limits of the ratio of the uncertainty relations for different physical quantities.

  8. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  9. Pharmacological Fingerprints of Contextual Uncertainty

    PubMed Central

    Ruge, Diane; Stephan, Klaas E.

    2016-01-01

    Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses. PMID:27846219

  10. Uncertainty of empirical correlation equations

    NASA Astrophysics Data System (ADS)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
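
    A compact sketch of the GLS step described above: the covariance C of the input data is propagated into the fitted parameters via cov(p) = (X^T C^-1 X)^-1, and from there into any derived quantity. The model and data here are toys, not the IAPWS-95 equation:

```python
import numpy as np

def gls(X, y, C):
    """Generalised least squares: parameter estimates and their covariance."""
    Ci = np.linalg.inv(C)
    cov_p = np.linalg.inv(X.T @ Ci @ X)   # parameter covariance matrix
    p = cov_p @ (X.T @ Ci @ y)
    return p, cov_p

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 20)
X = np.column_stack([np.ones_like(x), x, x ** 2])   # quadratic model

# Correlated measurement errors (exponential correlation kernel)
C = 0.01 * 0.5 ** np.abs(np.subtract.outer(x, x) * 20)
y = X @ np.array([1.0, -2.0, 3.0]) + rng.multivariate_normal(np.zeros(x.size), C)

p, cov_p = gls(X, y, C)
print("parameters            :", p)
print("standard uncertainties:", np.sqrt(np.diag(cov_p)))

# Uncertainty of a derived quantity f(p) = model value at x = 1
g = np.array([1.0, 1.0, 1.0])   # gradient of f with respect to p
print("u(f) =", np.sqrt(g @ cov_p @ g))
```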

  11. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  12. The Bayesian brain: phantom percepts resolve sensory uncertainty.

    PubMed

    De Ridder, Dirk; Vanneste, Sven; Freeman, Walter

    2014-07-01

    Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises why the brain creates these false percepts in the absence of an external stimulus. The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they are not worthy of conscious processing. Unpredictable things, on the other hand, are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience and stress and essential for stimulus detection. Associated with sensory cortex hyperactivity and decreased inhibition or map plasticity, this will result in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation. In conclusion, the

  13. Uncertainty in Bohr's response to the Heisenberg microscope

    NASA Astrophysics Data System (ADS)

    Tanona, Scott

    2004-09-01

    In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because his understanding of the physics, in terms of nonseparability and the correspondence principle, led him to it. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.

  14. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.

  15. Uncertainty in perception and the Hierarchical Gaussian Filter.

    PubMed

    Mathys, Christoph D; Lomakina, Ekaterina I; Daunizeau, Jean; Iglesias, Sandra; Brodersen, Kay H; Friston, Karl J; Stephan, Klaas E

    2014-01-01

    In its full sense, perception rests on an agent's model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the Hierarchical Gaussian Filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF's hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder-Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient-but at the same time intuitive-framework for the resolution of perceptual uncertainty in behaving agents.

  16. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance of the physical measurements performed with nondestructive assay instruments throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and, furthermore, motivate developers to revisit the treatment of measurement uncertainty.

  17. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  18. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  19. Intolerance of uncertainty in emotional disorders: What uncertainties remain?

    PubMed

    Shihata, Sarah; McEvoy, Peter M; Mullan, Barbara Ann; Carleton, R Nicholas

    2016-06-01

    The current paper presents a future research agenda for intolerance of uncertainty (IU), which is a transdiagnostic risk and maintaining factor for emotional disorders. In light of the accumulating interest and promising research on IU, it is timely to emphasize the theoretical and therapeutic significance of IU, as well as to highlight what remains unknown about IU across areas such as development, assessment, behavior, threat and risk, and relationships to cognitive vulnerability factors and emotional disorders. The present paper was designed to provide a synthesis of what is known and unknown about IU, and, in doing so, proposes broad and novel directions for future research to address the remaining uncertainties in the literature.

  20. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  1. Multiplatform application for calculating a combined standard uncertainty using a Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Niewinski, Marek; Gurnecki, Pawel

    2016-12-01

    The paper presents a new computer program for calculating a combined standard uncertainty. It implements the algorithm described in JCGM 101:2008, which is concerned with the use of a Monte Carlo method as an implementation of the propagation of distributions for uncertainty evaluation. The accuracy of the calculation is ensured by using high-quality random number generators. The paper describes the main principles of the program and compares the obtained results with example problems presented in JCGM Supplement 1.
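
    The propagation-of-distributions idea in JCGM 101:2008 is easy to sketch: draw from the input distributions, push the draws through the measurement model, and read the combined standard uncertainty and a coverage interval off the output sample. The model below is a made-up example, not one from the Supplement:

```python
import numpy as np

rng = np.random.default_rng(5)
M = 1_000_000                      # number of Monte Carlo trials

# Hypothetical measurement model: Y = X1 * X2 / X3
x1 = rng.normal(10.0, 0.1, M)      # Gaussian input
x2 = rng.uniform(0.95, 1.05, M)    # rectangular (Type B) input
x3 = rng.normal(2.0, 0.02, M)
y = x1 * x2 / x3

print("estimate     :", y.mean())
print("combined u(y):", y.std(ddof=1))
print("95% interval :", np.percentile(y, [2.5, 97.5]))
```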

  2. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGES

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; ...

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.

  3. Uncertainty Quantification for Airfoil Icing

    NASA Astrophysics Data System (ADS)

    DeGennaro, Anthony Matteo

    Ensuring the safety of airplane flight in icing conditions is an important and active arena of research in the aerospace community. Notwithstanding the research, development, and legislation aimed at certifying airplanes for safe operation, an analysis of the effects of icing uncertainties on certification quantities of interest is generally lacking. The central objective of this thesis is to examine and analyze problems in airfoil ice accretion from the standpoint of uncertainty quantification. We focus on three distinct areas: user-informed, data-driven, and computational uncertainty quantification. In the user-informed approach to uncertainty quantification, we discuss important canonical icing classifications and show how these categories can be modeled using a few shape parameters. We then investigate the statistical effects of these parameters. In the data-driven approach, we build statistical models of airfoil ice shapes from databases of actual ice shapes, and quantify the effects of these parameters. Finally, in the computational approach, we investigate the effects of uncertainty in the physics of the ice accretion process, by perturbing the input to an in-house numerical ice accretion code that we develop in this thesis.

  4. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  5. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, Δϕ_HL ∝ 1/N̄_T, where N̄_T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ ∝ 1/N̄_T^k, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility to overcome Δϕ_HL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.

  6. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  7. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  8. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
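
    A small sketch of transmitted variation (illustrative only, not from the presentation): input variation is pushed through a nonlinear response function by Monte Carlo and compared with the first-order delta-method approximation |f'(mu)| * sigma:

```python
import numpy as np

def response(x):
    """Hypothetical nonlinear response function."""
    return np.exp(0.5 * x)

mu_x, sigma_x = 1.0, 0.2
rng = np.random.default_rng(6)

# Monte Carlo: input variation transmitted to the output
x = rng.normal(mu_x, sigma_x, 100_000)
sd_mc = response(x).std(ddof=1)

# First-order approximation: |f'(mu)| * sigma
h = 1e-6
deriv = (response(mu_x + h) - response(mu_x - h)) / (2 * h)
print("transmitted SD (Monte Carlo):", sd_mc)
print("transmitted SD (delta)      :", abs(deriv) * sigma_x)
```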

  9. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  10. Awe, uncertainty, and agency detection.

    PubMed

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  11. An uncertainty relation in terms of generalized metric adjusted skew information and correlation measure

    NASA Astrophysics Data System (ADS)

    Fan, Ya-Jing; Cao, Huai-Xin; Meng, Hui-Xian; Chen, Liang

    2016-12-01

    The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. In this paper, we prove a Schrödinger-type uncertainty relation in terms of generalized metric adjusted skew information and correlation measure by using operator monotone functions, which reads

        U_\rho^{(g,f)}(A)\,U_\rho^{(g,f)}(B) \ge \frac{f(0)^2\,l}{k}\,\bigl|\mathrm{Corr}_\rho^{s(g,f)}(A,B)\bigr|^2

    for some operator monotone functions f and g, all n-dimensional observables A, B and a non-singular density matrix ρ. As applications, we derive some new uncertainty relations for Wigner-Yanase skew information and Wigner-Yanase-Dyson skew information.
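
    For reference, the skew-information quantities named above have their standard definitions (quoted from the general literature, not from this paper):

        I_\rho(A) = -\tfrac{1}{2}\,\mathrm{Tr}\bigl([\rho^{1/2},A]^2\bigr) \quad \text{(Wigner-Yanase)},

        I_\rho^{\alpha}(A) = -\tfrac{1}{2}\,\mathrm{Tr}\bigl([\rho^{\alpha},A]\,[\rho^{1-\alpha},A]\bigr), \quad 0<\alpha<1 \quad \text{(Wigner-Yanase-Dyson)}.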

  12. Fine-grained lower limit of entropic uncertainty in the presence of quantum memory.

    PubMed

    Pramanik, T; Chowdhury, P; Majumdar, A S

    2013-01-11

    The limitation on obtaining precise outcomes of measurements performed on two noncommuting observables of a particle as set by the uncertainty principle in its entropic form can be reduced in the presence of quantum memory. We derive a new entropic uncertainty relation based on fine graining, which leads to an ultimate limit on the precision achievable in measurements performed on two incompatible observables in the presence of quantum memory. We show that our derived uncertainty relation tightens the lower bound set by entropic uncertainty for members of the class of two-qubit states with maximally mixed marginals, while accounting for the recent experimental results using maximally entangled pure states and mixed Bell-diagonal states. An implication of our uncertainty relation on the security of quantum key generation protocols is pointed out.
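
    The memory-assisted entropic relation that such fine-grained bounds refine is, in its usual form (Berta et al.; stated here for context, not taken from this paper),

        S(X|B) + S(Z|B) \ge \log_2\frac{1}{c} + S(A|B), \qquad c = \max_{x,z}\,\bigl|\langle x|z\rangle\bigr|^2 ,

    where the conditional entropy S(A|B) can be negative for entangled states, which is what allows the bound to drop below its memoryless value.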

  13. A Principle of Intentionality.

    PubMed

    Turner, Charles K

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett's model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone.

  14. Applying the four principles.

    PubMed

    Macklin, R

    2003-10-01

    Gillon is correct that the four principles provide a sound and useful way of analysing moral dilemmas. As he observes, the approach using these principles does not provide a unique solution to dilemmas. This can be illustrated by alternatives to Gillon's own analysis of the four case scenarios. In the first scenario, a different set of factual assumptions could yield a different conclusion about what is required by the principle of beneficence. In the second scenario, although Gillon's conclusion is correct, what is open to question is his claim that what society regards as the child's best interest determines what really is in the child's best interest. The third scenario shows how it may be reasonable for the principle of beneficence to take precedence over autonomy in certain circumstances, yet like the first scenario, the ethical conclusion relies on a set of empirical assumptions and predictions of what is likely to occur. The fourth scenario illustrates how one can draw different conclusions based on the importance given to the precautionary principle.

  15. A Principle of Intentionality

    PubMed Central

    Turner, Charles K.

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett’s model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone. PMID:28223954

  16. Principles of multisensory behavior.

    PubMed

    Otto, Thomas U; Dassy, Brice; Mamassian, Pascal

    2013-04-24

    The combined use of multisensory signals is often beneficial. Based on neuronal recordings in the superior colliculus of cats, three basic rules were formulated to describe the effectiveness of multisensory signals: the enhancement of neuronal responses to multisensory compared with unisensory signals is largest when signals occur at the same location ("spatial rule"), when signals are presented at the same time ("temporal rule"), and when signals are rather weak ("principle of inverse effectiveness"). These rules are also considered with respect to multisensory benefits as observed with behavioral measures, but do they capture these benefits best? To uncover the principles that rule benefits in multisensory behavior, we here investigated the classical redundant signal effect (RSE; i.e., the speedup of response times in multisensory compared with unisensory conditions) in humans. Based on theoretical considerations using probability summation, we derived two alternative principles to explain the effect. First, the "principle of congruent effectiveness" states that the benefit in multisensory behavior (here the speedup of response times) is largest when behavioral performance in corresponding unisensory conditions is similar. Second, the "variability rule" states that the benefit is largest when performance in corresponding unisensory conditions is unreliable. We then tested these predictions in two experiments, in which we manipulated the relative onset and the physical strength of distinct audiovisual signals. Our results, which are based on a systematic analysis of response time distributions, show that the RSE follows these principles very well, thereby providing compelling evidence in favor of probability summation as the underlying combination rule.
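
    A minimal simulation of the probability-summation (race) logic behind these two principles, with hypothetical reaction-time distributions:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        def rse(mu_a, sd_a, mu_v, sd_v):
            """Mean speedup when the faster of two unisensory responses wins."""
            rt_a = rng.normal(mu_a, sd_a, n)   # e.g. auditory RTs (ms)
            rt_v = rng.normal(mu_v, sd_v, n)   # e.g. visual RTs (ms)
            rt_av = np.minimum(rt_a, rt_v)     # redundant condition: race
            return min(rt_a.mean(), rt_v.mean()) - rt_av.mean()

        print(rse(300, 30, 300, 30))  # similar unisensory performance: large gain
        print(rse(300, 30, 400, 30))  # dissimilar performance: gain shrinks
        print(rse(300, 60, 300, 60))  # more variable performance: gain grows

    The first two calls illustrate the "principle of congruent effectiveness", the first and third the "variability rule".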

  17. Assessment of uncertainty in chemical models by Bayesian probabilities: Why, when, how?

    NASA Astrophysics Data System (ADS)

    Sahlin, Ullrika

    2015-07-01

    A prediction of a chemical property or activity is subject to uncertainty. Which types of uncertainty to consider, whether to account for them in a differentiated manner, and with which methods, depends on the practical context. In chemical modelling, general guidance on the assessment of uncertainty is hindered by the high variety of underlying modelling algorithms, high-dimensionality problems, the acknowledgement of both qualitative and quantitative dimensions of uncertainty, and the fact that statistics offers alternative principles for uncertainty quantification. Here, a view of the assessment of uncertainty in predictions is presented with the aim of overcoming these issues. The assessment sets out to quantify uncertainty representing error in predictions and is based on probability modelling of errors, where uncertainty is measured by Bayesian probabilities. Even though well motivated, the choice to use Bayesian probabilities is a challenge to statistics and chemical modelling. Fully Bayesian modelling, Bayesian meta-modelling and bootstrapping are discussed as possible approaches. Deciding how to assess uncertainty is an active choice, and should not be constrained by tradition or by a lack of validated and reliable ways of doing it.
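
    Of the approaches listed, bootstrapping is the simplest to sketch; a minimal illustration with a hypothetical linear property model:

        import numpy as np

        rng = np.random.default_rng(2)
        x = rng.uniform(0, 10, 40)                 # hypothetical descriptor
        y = 1.5 * x + rng.normal(0, 1.0, 40)       # hypothetical measured property

        def fit_predict(xs, ys, x_new):
            slope, intercept = np.polyfit(xs, ys, 1)
            return slope * x_new + intercept

        # Resample the training set to get a distribution over predictions
        preds = []
        for _ in range(2000):
            idx = rng.integers(0, len(x), len(x))
            preds.append(fit_predict(x[idx], y[idx], 7.0))

        lo, hi = np.percentile(preds, [2.5, 97.5])
        print(f"prediction {np.mean(preds):.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")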

  18. Uncertainty analysis of BTDM-SFSW based fiber-optic time transfer

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Wu, Guiling; Li, Xinwan; Chen, Jianping

    2017-02-01

    In this paper, the uncertainty of time transfer based on a bidirectional time division multiplexing transmission over a single fiber with the same wavelength (BTDM-SFSW) is investigated via theoretical models and experimental measurements. According to the principle and system architecture, the uncertainty evaluation schemes for BTDM-SFSW based time transfer are presented and the uncertainty sources are identified accordingly. In order to avoid the effect of the temperature-dependent fiber delay, the measured time intervals at two sites are regarded as a whole to obtain an overall uncertainty of time interval measurements. For the uncertainty of time transfer modem calibration, aside from the type A uncertainty obtained under the applied calibration scheme, the system reproducibility against practical operation and the contribution of optical power-dependent receiving delays are also included. A mathematical model considering fiber dispersion, polarization mode dispersion (PMD) and the Sagnac effect is established to evaluate the uncertainty from the fiber link. The characteristics of the uncertainty sources in a long-distance fiber-optic time transfer testbed are then explored in detail. The combined expanded uncertainties with a coverage factor of 2 are calculated and experimentally validated over various non-calibrated fiber extensions.
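
    The combination step follows the usual GUM recipe: with independent uncertainty components u_i (time-interval measurement, modem calibration, fiber link, and so on), the combined and expanded uncertainties are

        u_c = \sqrt{\textstyle\sum_i u_i^2}, \qquad U = k\,u_c \quad (k = 2,\ \text{approximately 95\% coverage}).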

  19. A framework for geometric quantification and forecasting of cost uncertainty for aerospace innovations

    NASA Astrophysics Data System (ADS)

    Schwabe, Oliver; Shehab, Essam; Erkoyuncu, John

    2016-07-01

    Quantification and forecasting of cost uncertainty for aerospace innovations is challenged by conditions of small data, which arise from having few measurement points, little prior experience, unknown history, low data quality, and conditions of deep uncertainty. The literature suggests that no frameworks exist which specifically address cost estimation under such conditions. In order to give contemporary cost estimating techniques an innovative perspective for addressing such challenges, a framework based on the principles of spatial geometry is described. The framework consists of a method for visualising cost uncertainty and a dependency model for quantifying and forecasting cost uncertainty. Cost uncertainty is taken to represent manifested and unintended future cost variance that occurs with a probability of 100% but in an unknown quantity; innovative starting conditions are considered to exist when no verified and accurate cost model is available. The shape of the data is used as an organising principle, and the geometrical symmetry of cost variance point clouds is used for the quantification of cost uncertainty. The results of the investigation suggest that the uncertainty of a cost estimate at any future point in time may be determined by the geometric symmetry of the cost variance data in its point cloud form at the time of estimation. Recommendations for future research include using the framework to determine the "most likely values" of estimates in Monte Carlo simulations and generalising the dependency model introduced. Future work is also recommended to reduce the framework limitations noted.

  20. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  1. Uncertainty Can Increase Explanatory Credibility

    DTIC Science & Technology

    2013-08-01

    metacognitive cue to infer their conversational partner’s depth of processing. Keywords: explanations, confidence, uncertainty, collaborative reasoning...scope, i.e., those that account for only observed phenomena (Khemlani, Sussman, & Oppenheimer, 2011). These preferences show that properties intrinsic...Fischhoff, & Phillips, 1982; Lindley, 1982; McClelland & Bolger, 1994). Much of the research on subjective confidence addresses how individuals

  2. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  3. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  4. Saccade Adaptation and Visual Uncertainty

    PubMed Central

    Souto, David; Gegenfurtner, Karl R.; Schütz, Alexander C.

    2016-01-01

    Visual uncertainty may affect saccade adaptation in two complementary ways. First, an ideal adaptor should take into account the reliability of visual information when determining the amount of correction, predicting that increasing visual uncertainty should decrease adaptation rates. We tested this by comparing observers' direction discrimination and adaptation rates in an intra-saccadic-step paradigm. Second, clearly visible target steps may generate a slower adaptation rate, since the error can be attributed to an external cause instead of an internal change in the visuo-motor mapping that needs to be compensated. We tested this prediction by measuring saccade adaptation to different step sizes. Most remarkably, we found little correlation between estimates of visual uncertainty and adaptation rates, and no slower adaptation rates with more visible step sizes. Additionally, we show that for low contrast targets backward steps are perceived as stationary after the saccade, but that adaptation rates are independent of contrast. We suggest that the saccadic system uses different position signals for adapting dysmetric saccades and for generating a trans-saccadic stable visual percept, explaining why saccade adaptation is found to be independent of visual uncertainty. PMID:27252635

  5. The face of uncertainty eats.

    PubMed

    Corwin, Rebecca L W

    2011-09-01

    The idea that foods rich in fat and sugar may be addictive has generated much interest, as well as controversy, among both scientific and lay communities. Recent research indicates that fatty and sugary food in and of itself is not addictive. Rather, the food and the context in which it is consumed interact to produce an addiction-like state. One of the contexts that appears to be important is the intermittent opportunity to consume foods rich in fat and sugar in environments where food is plentiful. Animal research indicates that, under these conditions, intake of the fatty sugary food escalates across time and binge-type behavior develops. However, the mechanisms that account for the powerful effect of intermittency on ingestive behavior have only begun to be elucidated. In this review, it is proposed that intermittency stimulates appetitive behavior that is associated with uncertainty regarding what, when, and how much of the highly palatable food to consume. Uncertainty may stimulate consumption of optional fatty and sugary treats due to differential firing of midbrain dopamine neurons, activation of the stress axis, and involvement of orexin signaling. In short, uncertainty may produce an aversive state that bingeing on palatable food can alleviate, however temporarily. "Food addiction" may not be "addiction" to food at all; it may be a response to uncertainty within environments of food abundance.

  6. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces are examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principles of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is that of applied statistics.

  7. Basic Principles of Chromatography

    NASA Astrophysics Data System (ADS)

    Ismail, Baraem; Nielsen, S. Suzanne

    Chromatography has a great impact on all areas of analysis and, therefore, on the progress of science in general. Chromatography differs from other methods of separation in that a wide variety of materials, equipment, and techniques can be used. [Readers are referred to references (1-19) for general and specific information on chromatography.]. This chapter will focus on the principles of chromatography, mainly liquid chromatography (LC). Detailed principles and applications of gas chromatography (GC) will be discussed in Chap. 29. In view of its widespread use and applications, high-performance liquid chromatography (HPLC) will be discussed in a separate chapter (Chap. 28). The general principles of extraction are first described as a basis for understanding chromatography.

  8. Structural Damage Assessment under Uncertainty

    NASA Astrophysics Data System (ADS)

    Lopez Martinez, Israel

    Structural damage assessment has applications in the majority of engineering structures and mechanical systems ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring system are to ascertain the condition of a structure and to provide an evaluation of changes as a function of time, as well as providing an early warning of an unsafe condition. There are many structural health monitoring and assessment techniques developed for research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and the development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of a vibration-based monitoring system is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed to be used for continuous monitoring of structures. The robust damage detection approach developed is based on an ensemble of dimensional reduction algorithms for improved damage-sensitive feature extraction. For damage localization, the determination of an experimental structural model was performed based on output-only modal analysis. An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are

  9. Elemental principles of t-topos

    NASA Astrophysics Data System (ADS)

    Kato, G.

    2004-11-01

    In this paper, a sheaf-theoretic approach toward fundamental problems in quantum physics is made. For example, the particle-wave duality depends upon whether or not a presheaf is evaluated at a specified object. The t-topos theoretic interpretations of double-slit interference, uncertainty principle(s), and the EPR-type non-locality are given. As will be explained, there is more than one type of uncertainty principle: the absolute uncertainty principle coming from the direct limit object corresponding to the refinements of coverings, the uncertainty coming from a micromorphism of shortest observable states, and the uncertainty of the observation image. A sheaf-theoretic approach to quantum gravity has been made by Isham-Butterfield in Found. Phys. 30 (2000) 1707, and by Raptis, based on abstract differential geometry, in Mallios A. and Raptis I., Int. J. Theor. Phys. 41 (2002), gr-qc/0110033; Mallios A., Remarks on "singularities" (2002), gr-qc/0202028; and Mallios A. and Raptis I., Int. J. Theor. Phys. 42 (2003) 1479, gr-qc/0209048. See also the preprint The translocal depth-structure of space-time, Connes' "Points, Speaking to Each Other", and the (complex) structure of quantum theory, for another approach relevant to ours. Special axioms of the t-topos formulation are: i) the usual linear-time concept is interpreted as the image of the presheaf (associated with time) evaluated at an object of a t-site (i.e., a category with a Grothendieck topology), and an object of this t-site, which is said to be a generalized time period, may be regarded as a hidden variable; and ii) every object (in a particle ur-state) of microcosm (or of macrocosm) is regarded as the microcosm (or macrocosm) component of a product category for a presheaf evaluated at an object in the t-site. The fundamental category Ŝ is defined as the category of ∏_{α∈Δ} C_α-valued presheaves on the t-site S, where Δ is an index set. The study of topological properties of S with respect to the nature of

  10. Uncertainty in 3D gel dosimetry

    NASA Astrophysics Data System (ADS)

    De Deene, Yves; Jirasek, Andrew

    2015-01-01

    Three-dimensional (3D) gel dosimetry has a unique role to play in safeguarding conformal radiotherapy treatments as the technique can cover the full treatment chain and provides the radiation oncologist with the integrated dose distribution in 3D. It can also be applied to benchmark new treatment strategies such as image guided and tracking radiotherapy techniques. A major obstacle that has hindered the wider dissemination of gel dosimetry in radiotherapy centres is a lack of confidence in the reliability of the measured dose distribution. Uncertainties in 3D dosimeters are attributed to both dosimeter properties and scanning performance. In polymer gel dosimetry with MRI readout, discrepancies in dose response of large polymer gel dosimeters versus small calibration phantoms have been reported, which can lead to significant inaccuracies in the dose maps. The sources of error in polymer gel dosimetry with MRI readout are well understood, and it has been demonstrated that with a carefully designed scanning protocol, the overall uncertainty in absolute dose that can currently be obtained falls within 5% on an individual voxel basis, for a minimum voxel size of 5 mm³. However, several research groups have chosen to use polymer gel dosimetry in a relative manner by normalizing the dose distribution towards an internal reference dose within the gel dosimeter phantom. 3D dosimetry with optical scanning has also been mostly applied in a relative way, although in principle absolute calibration is possible. As the optical absorption in 3D dosimeters is less dependent on temperature, it can be expected that the achievable accuracy is higher with optical CT. The precision in optical scanning of 3D dosimeters depends to a large extent on the performance of the detector. 3D dosimetry with X-ray CT readout is a low contrast imaging modality for polymer gel dosimetry. Sources of error in X-ray CT polymer gel dosimetry (XCT) are currently under investigation and include inherent

  11. [Ethical principles in electroconvulsive therapy].

    PubMed

    Richa, S; De Carvalho, W

    2016-12-01

    Electroconvulsive therapy (ECT) is a therapeutic technique invented in 1935 but only really developed after World War II, spreading widely until the mid-1960s. The origins of the technique, and some forms of stigma including its portrayal in films, have contributed widely to making it suspect from a moral point of view. The ethical principles that support treatment by ECT are those governing any action in psychiatry and rest, on the one hand, on the founding principles of bioethics: autonomy, beneficence, non-maleficence, and justice; and, on the other hand, on information about the technique and consent to this type of care.

  12. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  13. Stereo-particle image velocimetry uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric

  14. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
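
    The flavor of interval statistics is easy to sketch (my illustration, not the report's algorithms): descriptive statistics become set-valued, and some are much harder to compute than others.

        # Each measurement is an interval [lo, hi] rather than a point estimate.
        data = [(1.0, 1.4), (2.1, 2.2), (0.8, 1.5), (1.9, 2.6)]  # hypothetical

        los = [lo for lo, hi in data]
        his = [hi for lo, hi in data]

        # The sample mean over all point choices within the intervals is itself
        # an interval, bounded tightly by the means of the endpoints.
        n = len(data)
        print(f"mean in [{sum(los)/n:.3f}, {sum(his)/n:.3f}]")

        # Variance bounds are harder: the upper bound is NP-hard for heavily
        # overlapping intervals, which is why computability as a function of
        # overlap and interval-width regularity matters in such analyses.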

  15. Visualizing Flow of Uncertainty through Analytical Processes.

    PubMed

    Wu, Yingcai; Yuan, Guo-Xun; Ma, Kwan-Liu

    2012-12-01

    Uncertainty can arise in any stage of a visual analytics process, especially in data-intensive applications with a sequence of data transformations. Additionally, throughout the process of multidimensional, multivariate data analysis, uncertainty due to data transformation and integration may split, merge, increase, or decrease. This dynamic characteristic along with other features of uncertainty pose a great challenge to effective uncertainty-aware visualization. This paper presents a new framework for modeling uncertainty and characterizing the evolution of the uncertainty information through analytical processes. Based on the framework, we have designed a visual metaphor called uncertainty flow to visually and intuitively summarize how uncertainty information propagates over the whole analysis pipeline. Our system allows analysts to interact with and analyze the uncertainty information at different levels of detail. Three experiments were conducted to demonstrate the effectiveness and intuitiveness of our design.

  16. Error-disturbance uncertainty relations studied in neutron optics

    NASA Astrophysics Data System (ADS)

    Sponar, Stephan; Sulyok, Georg; Demirel, Bulent; Hasegawa, Yuji

    2016-09-01

    Heisenberg's uncertainty principle is probably the most famous statement of quantum physics, and its essential aspects are well described by formulations in terms of standard deviations. However, a naive Heisenberg-type error-disturbance relation is not valid. An alternative universally valid relation was derived by Ozawa in 2003. Though universally valid, Ozawa's relation is not optimal. Recently, Branciard derived a tight error-disturbance uncertainty relation (EDUR), describing the optimal trade-off between error and disturbance. Here, we report a neutron-optical experiment that records the error of a spin-component measurement, as well as the disturbance caused on another spin-component, to test EDURs. We demonstrate that Heisenberg's original EDUR is violated, and that Ozawa's and Branciard's EDURs are valid in a wide range of experimental parameters, applying a new measurement procedure referred to as the two-state method.
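
    For orientation, with error ε(A), disturbance η(B), and standard deviations σ(A), σ(B), the naive Heisenberg-type relation and Ozawa's universally valid replacement read

        \varepsilon(A)\,\eta(B) \ge \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr| \quad \text{(naive, violable)},

        \varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \ge \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr| \quad \text{(Ozawa)}.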

  17. Robust optimization of nonlinear impulsive rendezvous with uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, YaZhong; Yang, Zhen; Li, HengNian

    2014-04-01

    The optimal rendezvous trajectory designs in many current research efforts do not incorporate the practical uncertainties into the closed loop of the design. A robust optimization design method for a nonlinear rendezvous trajectory with uncertainty is proposed in this paper. One performance index related to the variances of the terminal state error is termed the robustness performance index, and a two-objective optimization model (including the minimum characteristic velocity and the minimum robustness performance index) is formulated on the basis of the Lambert algorithm. A multi-objective, non-dominated sorting genetic algorithm is employed to obtain the Pareto optimal solution set. It is shown that the proposed approach can be used to quickly obtain several inherent principles of the rendezvous trajectory by taking practical errors into account. Furthermore, this approach can identify the most preferable design space in which a specific solution for the actual application of the rendezvous control should be chosen.

  18. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry over the past 30 years, one major deficiency remains: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as ways to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly each rule is believed is constructed. From this, the degree to which a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
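
    A minimal sketch of the rough-set half of this combination (a toy universe, not the paper's implementation): lower approximations generate the certain rules, upper approximations the possible rules.

        # Universe partitioned into equivalence classes of indiscernible objects.
        classes = [{1, 2}, {3, 4}, {5}]   # toy indiscernibility partition
        target = {1, 2, 3}                # concept to approximate

        lower = set().union(*(c for c in classes if c <= target))  # certain
        upper = set().union(*(c for c in classes if c & target))   # possible

        print(lower)   # {1, 2}: certain rules fire here
        print(upper)   # {1, 2, 3, 4}: possible rules fire here
        # The boundary region upper - lower = {3, 4} is where graded belief
        # measures attach to the extracted fuzzy rules.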

  19. Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification

    SciTech Connect

    Li, Chunyuan; Stevens, Andrew J.; Chen, Changyou; Pu, Yunchen; Gan, Zhe; Carin, Lawrence

    2016-08-10

    Learning the representation of shape cues in 2D & 3D objects for recognition is a fundamental task in computer vision. Deep neural networks (DNNs) have shown promising performance on this task. Due to the large variability of shapes, accurate recognition relies on good estimates of model uncertainty, ignored in traditional training of DNNs, typically learned via stochastic optimization. This paper leverages recent advances in stochastic gradient Markov Chain Monte Carlo (SG-MCMC) to learn weight uncertainty in DNNs. It yields principled Bayesian interpretations for the commonly used Dropout/DropConnect techniques and incorporates them into the SG-MCMC framework. Extensive experiments on 2D & 3D shape datasets and various DNN models demonstrate the superiority of the proposed approach over stochastic optimization. Our approach yields higher recognition accuracy when used in conjunction with Dropout and Batch-Normalization.
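
    The core of such SG-MCMC samplers is compact; a minimal stochastic gradient Langevin dynamics (SGLD) step (a sketch of the generic technique, not the paper's exact sampler):

        import numpy as np

        rng = np.random.default_rng(3)

        def sgld_step(theta, grad_log_prior, grad_log_lik, N, n, eps):
            """One SGLD update. The minibatch gradient is rescaled by N/n and
            Gaussian noise of variance eps is injected, so the iterates become
            approximate posterior samples over weights rather than a single
            point estimate; this is the learned weight uncertainty."""
            grad = grad_log_prior(theta) + (N / n) * grad_log_lik(theta)
            noise = rng.normal(0.0, np.sqrt(eps), theta.shape)
            return theta + 0.5 * eps * grad + noise

    Collecting theta across iterations (after burn-in, with a decaying step size eps) gives the sample set used to estimate predictive uncertainty.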

  20. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  1. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)
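
    The discrete-variable idea is concrete for refraction at a flat interface: the travel time is an ordinary function of the crossing point x, and setting its derivative to zero yields Snell's law with no calculus of variations.

        t(x) = \frac{\sqrt{a^2 + x^2}}{v_1} + \frac{\sqrt{b^2 + (d-x)^2}}{v_2},
        \qquad t'(x) = 0 \;\Rightarrow\; \frac{\sin\theta_1}{v_1} = \frac{\sin\theta_2}{v_2}.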

  2. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  3. The Idiom Principle Revisited

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  4. Principles of Cancer Screening.

    PubMed

    Pinsky, Paul F

    2015-10-01

    Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike.

  5. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  6. Principles of sound ecotoxicology.

    PubMed

    Harris, Catherine A; Scott, Alexander P; Johnson, Andrew C; Panter, Grace H; Sheahan, Dave; Roberts, Mike; Sumpter, John P

    2014-03-18

    We have become progressively more concerned about the quality of some published ecotoxicology research. Others have also expressed concern. It is not uncommon for basic, but extremely important, factors to apparently be ignored. For example, exposure concentrations in laboratory experiments are sometimes not measured, and hence there is no evidence that the test organisms were actually exposed to the test substance, let alone at the stated concentrations. To try to improve the quality of ecotoxicology research, we suggest 12 basic principles that should be considered, not at the point of publication of the results, but during the experimental design. These principles range from carefully considering essential aspects of experimental design through to accurately defining the exposure, as well as unbiased analysis and reporting of the results. Although not all principles will apply to all studies, we offer these principles in the hope that they will improve the quality of the science that is available to regulators. Science is an evidence-based discipline and it is important that we and the regulators can trust the evidence presented to us. Significant resources often have to be devoted to refuting the results of poor research when those resources could be utilized more effectively.

  7. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
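
    The propagation step behind such adjoint-based estimates is the standard first-order "sandwich" rule (a generic technique, not specific to this report): with relative sensitivities S_i of a response R to nuclear data \sigma_i and a relative covariance matrix C,

        \left(\frac{u_R}{R}\right)^2 = \mathbf{S}^{\mathsf{T}}\,\mathbf{C}\,\mathbf{S},
        \qquad S_i = \frac{\sigma_i}{R}\,\frac{\partial R}{\partial \sigma_i},

    where adjoint theory supplies the S_i implicitly, without rerunning the transport calculation for each parameter.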

  8. A Stronger Multi-observable Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Song, Qiu-Cheng; Li, Jun-Li; Peng, Guang-Xiong; Qiao, Cong-Feng

    2017-03-01

    Uncertainty relation lies at the heart of quantum mechanics, characterizing the incompatibility of non-commuting observables in the preparation of quantum states. An important question is how to improve the lower bound of uncertainty relation. Here we present a variance-based sum uncertainty relation for N incompatible observables stronger than the simple generalization of an existing uncertainty relation for two observables. Further comparisons of our uncertainty relation with other related ones for spin-1/2 and spin-1 particles indicate that the obtained uncertainty relation gives a better lower bound.

  9. Adjoint-Based Uncertainty Quantification with MCNP

    NASA Astrophysics Data System (ADS)

    Seifried, Jeffrey Edwin

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  10. A Stronger Multi-observable Uncertainty Relation

    PubMed Central

    Song, Qiu-Cheng; Li, Jun-Li; Peng, Guang-Xiong; Qiao, Cong-Feng

    2017-01-01

    Uncertainty relation lies at the heart of quantum mechanics, characterizing the incompatibility of non-commuting observables in the preparation of quantum states. An important question is how to improve the lower bound of uncertainty relation. Here we present a variance-based sum uncertainty relation for N incompatible observables stronger than the simple generalization of an existing uncertainty relation for two observables. Further comparisons of our uncertainty relation with other related ones for spin-1/2 and spin-1 particles indicate that the obtained uncertainty relation gives a better lower bound. PMID:28317917

  11. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state-of-the-art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation from the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the on-going organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends and funding opportunities and to discuss the future of structural dynamics.

  12. On the quantum mechanical solutions with minimal length uncertainty

    NASA Astrophysics Data System (ADS)

    Shababi, Homa; Pedram, Pouria; Chung, Won Sang

    2016-06-01

    In this paper, we study two generalized uncertainty principles (GUPs), [X,P] = iℏ(1 + βP^{2j}) and [X,P] = iℏ(1 + βP^2 + kβ^2P^4), which imply minimal measurable lengths. Using two momentum representations, for the former GUP we find eigenvalues and eigenfunctions of the free particle and the harmonic oscillator in terms of generalized trigonometric functions. Also, for the latter GUP, we obtain quantum mechanical solutions of a particle in a box and the harmonic oscillator. Finally, we investigate the statistical properties of the harmonic oscillator, including the partition function, internal energy, and heat capacity, in the context of the first GUP.
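
    For the first commutator the minimal length follows directly from the implied uncertainty relation (standard GUP algebra, shown here for j = 1):

        \Delta X\,\Delta P \ge \frac{\hbar}{2}\left(1 + \beta(\Delta P)^2 + \beta\langle P\rangle^2\right)
        \;\Rightarrow\; (\Delta X)_{\min} = \hbar\sqrt{\beta} \quad (\langle P\rangle = 0).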

  13. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2008-06-30

    model (such as Gaussian, spherical or exponential) typically used in geostatistics, we define the robust variogram model as the median regression curve...variogram model estimation We define the robust variogram model as the median regression curve of the residual difference squares for station pairs of...develop methodologies that improve location uncertainties in the presence of correlated, systematic model errors and non-Gaussian measurement errors. We

  14. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 μm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  15. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.

  16. Uncertainty propagation in nuclear forensics.

    PubMed

    Pommé, S; Jerome, S M; Venchiarutti, C

    2014-07-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined.
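
    A minimal numeric sketch of the atom-ratio age equation and its first-order uncertainty propagation (generic decay mathematics with hypothetical inputs, not the paper's worked pairs):

        import numpy as np

        def age_from_ratio(R, lam_p, lam_d):
            """Age t from the daughter/parent atom ratio R, assuming a clean
            separation (no daughter at t = 0):
            R(t) = lam_p/(lam_d - lam_p) * (1 - exp(-(lam_d - lam_p)*t))."""
            dl = lam_d - lam_p
            return -np.log(1.0 - R * dl / lam_p) / dl

        def age_uncertainty(R, u_R, lam_p, lam_d):
            """First-order propagation: u(t) = |dt/dR| * u(R)."""
            dl = lam_d - lam_p
            return u_R / (lam_p * (1.0 - R * dl / lam_p))

        lam_p, lam_d = 1e-4, 5e-2   # hypothetical decay constants (1/yr)
        R, u_R = 1.5e-3, 2e-5       # hypothetical atom ratio and uncertainty
        t = age_from_ratio(R, lam_p, lam_d)
        u_t = age_uncertainty(R, u_R, lam_p, lam_d)
        print(f"age = {t:.1f} yr +/- {u_t:.2f} yr")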

  17. Blade tip timing (BTT) uncertainties

    NASA Astrophysics Data System (ADS)

    Russhard, Pete

    2016-06-01

    Blade Tip Timing (BTT) is an alternative technique for characterising blade vibration in which non-contact timing probes (e.g. capacitance or optical probes), typically mounted on the engine casing (figure 1), are used to measure the time at which a blade passes each probe. This time is compared with the time at which the blade would have passed the probe if it had been undergoing no vibration. For a number of years the aerospace industry has been sponsoring research into Blade Tip Timing technologies as tools to obtain rotor blade tip deflections. These efforts have been successful in demonstrating the potential of the technology, but have rarely produced quantitative data together with a traceable value for measurement uncertainty. BTT technologies have been developed under a cloak of secrecy by the gas turbine OEMs because of the competitive advantage offered if the technique could be shown to work. BTT measurements are sensitive to many variables, and there is a need to quantify the measurement uncertainty of the complete technology and to define a set of guidelines as to how BTT should be applied to different vehicles. The data shown in figure 2 were developed from a US government sponsored program that brought together four different tip timing systems and a gas turbine engine test. Comparisons showed that the systems were just capable of obtaining measurements within a ±25% uncertainty band when compared with strain gauges, even when using the same input data sets.

  18. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.

  19. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    NASA Technical Reports Server (NTRS)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design versus a design of well understood heritage equipment would be greater. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to uncertainty changes is not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  1. The Principle of Maximum Conformality

    SciTech Connect

    Brodsky, Stanley J.; Di Giustino, Leonardo (SLAC)

    2011-04-05

    A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q of the order of a typical momentum transfer Q in the process, and then to vary it over the range Q/2 to 2Q. This procedure is clearly problematic, since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods for setting the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and thus is a completely separate issue from renormalization scale setting. The Principle of Maximum Conformality (PMC) provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. Eliminating the renormalization scheme ambiguity with the PMC will not only increase the precision of QCD tests but also increase the sensitivity of colliders to new physics beyond the Standard Model.

  2. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (e.g., mass flow and fan efficiency calculation uncertainties).

  3. Principles of Glacier Mechanics

    NASA Astrophysics Data System (ADS)

    Waddington, Edwin D.

    Glaciers are awesome in size and move at a majestic pace, and they frequently occupy spectacular mountainous terrain. Naturally, many Earth scientists are attracted to glaciers. Some of us are even fortunate enough to make a career of studying glacier flow. Many others work on the large, flat polar ice sheets where there is no scenery. As a leader of one of the foremost research projects now studying the flow of mountain glaciers (Storglaciären, Sweden), Roger Hooke is well qualified to describe the principles of glacier mechanics. Principles of Glacier Mechanics is written for upper-level undergraduate students and graduate students with an interest in glaciers and the landforms that glaciers produce. While most of the examples in the text are drawn from valley glacier studies, much of the material is also relevant to "glacier flatland" on the polar ice sheets.

  4. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    PubMed

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  5. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
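
    A minimal sketch of the Polynomial Chaos surrogate idea in one dimension (a cheap stand-in function replaces the expensive model; all names and values are illustrative, not the CLM workflow):

    ```python
    import numpy as np
    from numpy.polynomial import hermite_e as H   # probabilists' Hermite polynomials

    rng = np.random.default_rng(1)

    def expensive_model(x):
        # Stand-in for a costly climate-model response to one uncertain parameter.
        return np.sin(x) + 0.1 * x**2

    # A few "sparse" training runs, as if each were an expensive simulation.
    x_train = rng.standard_normal(12)
    y_train = expensive_model(x_train)

    # Fit PC coefficients by least squares on a degree-4 Hermite basis
    # (appropriate for a standard-normal input).
    V = H.hermevander(x_train, 4)                 # Vandermonde matrix of He_0..He_4
    coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

    # Cheap propagation: evaluate the surrogate on many input samples.
    x_mc = rng.standard_normal(100_000)
    y_mc = H.hermeval(x_mc, coef)
    print(f"surrogate mean = {y_mc.mean():.4f}, std = {y_mc.std():.4f}")
    print(f"PC mean (coefficient of He_0) = {coef[0]:.4f}")
    ```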

  6. Principles of plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Hutchinson, Ian H.

    The physical principles, techniques, and instrumentation of plasma diagnostics are examined in an introduction and reference work for students and practicing scientists. Topics addressed include basic plasma properties, magnetic diagnostics, plasma particle flux, and refractive-index measurements. Consideration is given to EM emission by free and bound electrons, the scattering of EM radiation, and ion processes. Diagrams, drawings, graphs, sample problems, and a glossary of symbols are provided.

  7. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...

  8. Software Engineering Principles.

    DTIC Science & Technology

    1980-07-01

    ...but many differences as well. Project goal: develop a family of military message systems using current software engineering principles; provide a useful product to... The hard copy could then be manually scanned, distributed, and logged. SMP would be useful in developing and testing MP. It would provide minimal... design decisions. C. Alternative ways to develop the program: 1. Start from scratch. 2. Start with Stage 3. 3. Scan line by line and make required changes.

  9. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  10. Prediction and uncertainty in associative learning: examining controlled and automatic components of learned attentional biases.

    PubMed

    Luque, David; Vadillo, Miguel A; Le Pelley, Mike E; Beesley, Tom

    2017-08-01

    It has been suggested that attention is guided by two factors that operate during associative learning: a predictiveness principle, by which attention is allocated to the best predictors of outcomes, and an uncertainty principle, by which attention is allocated to learn about the less known features of the environment. Recent studies have shown that predictiveness-driven attention can operate rapidly and in an automatic way to exploit known relationships. The corresponding characteristics of uncertainty-driven attention, on the other hand, remain unexplored. In two experiments we examined whether both predictiveness and uncertainty modulate attentional processing in an adaptation of the dot probe task. This task provides a measure of automatic orientation to cues during associative learning. The stimulus onset asynchrony of the probe display was manipulated in order to explore temporal characteristics of predictiveness- and uncertainty-driven attentional effects. Results showed that the predictive status of cues determined selective attention, with faster attentional capture to predictive than to non-predictive cues. In contrast, the level of uncertainty slowed down responses to the probe regardless of the predictive status of the cues. Both predictiveness- and uncertainty-driven attentional effects were very rapid (at 250 ms from cue onset) and were automatically activated.

  11. Heisenberg's observability principle

    NASA Astrophysics Data System (ADS)

    Wolff, Johanna

    2014-02-01

    Werner Heisenberg's 1925 paper 'Quantum-theoretical re-interpretation of kinematic and mechanical relations' marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be 'founded exclusively upon relationships between quantities which in principle are observable'. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of 'observability' along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.

  12. Probing Mach's principle

    NASA Astrophysics Data System (ADS)

    Annila, Arto

    2012-06-01

    The principle of least action in its original form à la Maupertuis is used to explain geodetic and frame-dragging precessions, which are customarily accounted for by curved space-time in general relativity. The least-time equations of motion agree with observations and are also in concert with general relativity. Yet according to the least-time principle, gravitation does not relate to the mathematical metric of space-time, but to a tangible energy density embodied by photons. The density of free space is in balance with the total mass of the Universe, in accord with the Planck law. Likewise, a local photon density and its phase distribution are in balance with the mass and charge distribution of a local body. Here gravitational force is understood as an energy density difference that will diminish when the oppositely polarized pairs of photons co-propagate from the energy-dense system of bodies to the energy-sparse system of the surrounding free space. Thus when the body changes its state of motion, the surrounding energy density must accommodate the change. The concurrent resistance in restructuring the surroundings, ultimately involving the entire Universe, is known as inertia. The all-around propagating energy density couples everything with everything else in accord with Mach's principle.

  13. Principled Missing Data Treatments.

    PubMed

    Lang, Kyle M; Little, Todd D

    2016-04-04

    We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
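
    A brief sketch of multiple imputation with pooling by Rubin's rules, using scikit-learn's IterativeImputer as a stand-in for a dedicated MI package (synthetic data; not the authors' software):

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(2)
    X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=200)
    X[rng.random(200) < 0.2, 1] = np.nan            # 20% missing on variable 2

    m = 20                                          # number of imputations
    estimates, variances = [], []
    for i in range(m):
        imp = IterativeImputer(sample_posterior=True, random_state=i)
        Xi = imp.fit_transform(X)
        col = Xi[:, 1]
        estimates.append(col.mean())
        variances.append(col.var(ddof=1) / len(col))  # within-imputation variance of the mean

    # Rubin's rules: total variance = within + (1 + 1/m) * between.
    Q = np.mean(estimates)
    W = np.mean(variances)
    B = np.var(estimates, ddof=1)
    T = W + (1 + 1 / m) * B
    print(f"pooled mean = {Q:.3f}, pooled SE = {np.sqrt(T):.3f}")
    ```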

  14. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained.

  15. Accounting for Calibration Uncertainty in Detectors for High-Energy Astrophysics

    NASA Astrophysics Data System (ADS)

    Xu, Jin

    Systematic instrumental uncertainties in astronomical analyses have generally been ignored in data analysis due to the lack of robust principled methods, though the importance of incorporating instrumental calibration uncertainty is widely recognized by both users and instrument builders. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. Lee et al. (2011) introduced a so-called pragmatic Bayesian method to address this problem. The method is "pragmatic" in that it introduces an ad hoc technique that simplifies computation by assuming that the current data are not useful in narrowing the uncertainty for the calibration product, i.e., that the prior and posterior distributions for the calibration products are the same. In the thesis, we focus on incorporating calibration uncertainty into a principled Bayesian X-ray spectral analysis; specifically, we account for uncertainty in the so-called effective area curve and the photon redistribution matrix. X-ray spectral analysis models the distribution of the energies of X-ray photons emitted from an astronomical source. The effective area curve of an X-ray detector describes its sensitivity as a function of the energy of incoming photons, and the photon redistribution matrix describes the probability distribution of the recorded (discrete) energy of a photon as a function of the true (discretized) energy. Starting with the effective area curve, we follow Lee et al. (2011) and use a principal component analysis (PCA) to efficiently represent the uncertainty. Here, however, we leverage this representation to enable a principled, fully Bayesian method to account for calibration uncertainty in high-energy spectral analysis. For the photon redistribution matrix, we first model each conditional distribution as a normal distribution and then apply PCA to the parameters describing the normal models. This results in an
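
    The flavor of the PCA representation can be sketched as follows, with a synthetic curve ensemble standing in for the calibration products (all values illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    energy = np.linspace(0.3, 10.0, 300)            # keV grid (illustrative)

    # Synthetic ensemble standing in for plausible effective-area curves.
    base = 500 * np.exp(-0.5 * ((np.log(energy) - 0.5) / 0.9) ** 2)
    ensemble = base * (1 + 0.05 * rng.standard_normal((1000, 1))
                         + 0.03 * rng.standard_normal((1000, 1)) * np.log(energy))

    # PCA via SVD of the mean-subtracted ensemble.
    mean_curve = ensemble.mean(axis=0)
    U, s, Vt = np.linalg.svd(ensemble - mean_curve, full_matrices=False)

    k = 2                                           # keep the leading components
    explained = (s[:k] ** 2).sum() / (s ** 2).sum()
    print(f"{k} components explain {100 * explained:.1f}% of the variance")

    # Draw a new replicate curve: mean + sum_j e_j * (s_j / sqrt(n)) * PC_j,
    # where s_j / sqrt(n) approximates the std along component j.
    n = ensemble.shape[0]
    draw = mean_curve + (rng.standard_normal(k) * s[:k] / np.sqrt(n)) @ Vt[:k]
    ```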

  16. The physical principles of quantum mechanics. A critical review

    NASA Astrophysics Data System (ADS)

    Strocchi, F.

    2012-01-01

    The standard presentation of the principles of quantum mechanics is critically reviewed both from the experimental/operational point of view and with respect to the requirements of mathematical consistency and logical economy. A simpler and more physically motivated formulation is discussed. The existence of non-commuting observables, which characterizes quantum mechanics with respect to classical mechanics, is related to operationally testable complementarity relations, rather than to uncertainty relations. The drawbacks of Dirac's argument for canonical quantization are avoided by a more geometrical approach.

  17. Regarding Uncertainty in Teachers and Teaching

    ERIC Educational Resources Information Center

    Helsing, Deborah

    2007-01-01

    The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective…

  18. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  19. Identifying the Rhetoric of Uncertainty Reduction.

    ERIC Educational Resources Information Center

    Williams, David E.

    Offering a rhetorical perspective of uncertainty reduction, this paper (1) discusses uncertainty reduction theory and dramatism; (2) identifies rhetorical strategies inherent in C. W. Berger and R. J. Calabrese's theory; (3) extends predicted outcome value to influenced outcome value; and (4) argues that the goal of uncertainty reduction and…

  20. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
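
    A quick simulation of the headline effect for a log-normal risk factor (parameter choices are invented): the expected failure frequency under plug-in parameter estimates exceeds the nominal level:

    ```python
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(4)
    eps = 0.01                 # nominal (required) failure probability
    n = 30                     # sample size used to estimate the parameters
    trials = 20_000
    z = 2.3263478740408408     # Phi^{-1}(1 - eps) for eps = 0.01

    exceed = np.empty(trials)
    for i in range(trials):
        logs = rng.standard_normal(n)        # true risk factor: log-normal(0, 1)
        mu_hat, sig_hat = logs.mean(), logs.std(ddof=1)
        threshold = mu_hat + z * sig_hat     # plug-in control threshold (log scale)
        # True exceedance probability of the chosen threshold:
        exceed[i] = 1.0 - 0.5 * (1.0 + erf(threshold / sqrt(2.0)))

    print(f"nominal failure probability: {eps:.4f}")
    print(f"expected failure probability with estimated parameters: {exceed.mean():.4f}")
    ```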

  1. Medical Need, Equality, and Uncertainty.

    PubMed

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality.

  2. Uncertainty Relation and Inseparability Criterion

    NASA Astrophysics Data System (ADS)

    Goswami, Ashutosh K.; Panigrahi, Prasanta K.

    2017-02-01

    We investigate the Peres-Horodecki positive partial transpose criterion in the context of conserved quantities and derive a condition of inseparability for a composite bipartite system depending only on the dimensions of its subsystems, which leads to a bi-linear entanglement witness for the two qubit system. A separability inequality using generalized Schrodinger-Robertson uncertainty relation taking suitable operators, has been derived, which proves to be stronger than the bi-linear entanglement witness operator. In the case of mixed density matrices, it identically distinguishes the separable and non separable Werner states.

  3. Classification: Purposes, Principles, Progress, Prospects

    ERIC Educational Resources Information Center

    Sokal, Robert R.

    1974-01-01

    Clustering and other new techniques have changed classificatory principles and practice in many sciences. Discussed are defintions, purposes of classification, principles of classification, and recent trends. (Author/RH)

  4. Use of Combined Uncertainty of Pesticide Residue Results for Testing Compliance with Maximum Residue Limits (MRLs).

    PubMed

    Farkas, Zsuzsa; Slate, Andrew; Whitaker, Thomas B; Suszter, Gabriella; Ambrus, Árpád

    2015-05-13

    The uncertainty of pesticide residue levels in crops due to sampling, estimated for 106 individual crops and 24 crop groups from residue data obtained from supervised trials, was adjusted with a factor of 1.3 to accommodate the larger variability of residues under normal field conditions. Further adjustment may be necessary in the case of mixed lots. The combined uncertainty of residue data, including the contribution of sampling, is used to calculate an action limit that should not be exceeded when compliance with maximum residue limits is certified as part of premarketing self-control programs. In contrast, for testing the compliance of marketed commodities, the residues measured in composite samples should be greater than or equal to a decision limit calculated only from the combined uncertainty of the laboratory phase of the residue determination. Options for minimizing the combined uncertainty of measured residues are discussed. The principles described are also applicable to other chemical contaminants.
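
    A sketch of the arithmetic with invented numbers; the particular limit formulas below are one common construction and are an assumption here, not reproduced from the paper or the underlying standards:

    ```python
    import math

    # Hypothetical values, for illustration of the logic only.
    MRL = 0.50                   # maximum residue limit (mg/kg)
    u_sampling_rel = 0.28        # relative sampling uncertainty from supervised trials
    u_lab_rel = 0.15             # relative laboratory (analytical) uncertainty
    k = 1.645                    # coverage factor for ~95% one-sided confidence

    # Adjust sampling uncertainty for normal field conditions, then combine.
    u_sampling_rel *= 1.3
    u_combined_rel = math.hypot(u_sampling_rel, u_lab_rel)

    # Pre-marketing self-control: stay below an action limit under the MRL
    # (assumed form: MRL shrunk by the expanded combined uncertainty).
    action_limit = MRL / (1 + k * u_combined_rel)
    # Enforcement on marketed lots: act only above a decision limit over the MRL,
    # based on the laboratory uncertainty alone (assumed form).
    decision_limit = MRL * (1 + k * u_lab_rel)

    print(f"combined relative uncertainty = {u_combined_rel:.3f}")
    print(f"action limit = {action_limit:.3f} mg/kg, "
          f"decision limit = {decision_limit:.3f} mg/kg")
    ```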

  5. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system for laboratory measurement of diffusion pump performance is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, the contribution of each component and test step to the final uncertainty is studied. Using the differential method, a mathematical model for the systematic uncertainty transfer function is established. Finally, in a case study, the combined uncertainties of manual and automatic operation are compared with each other (6.11% and 5.87%, respectively). The reasonableness and practicality of this newly developed automatic testing system are demonstrated.
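
    For a pure product/quotient functional model, the differential (GUM) method reduces to adding relative uncertainties in quadrature. A sketch with a hypothetical model S = (dV/dt) · p0/p and invented values (the actual functional model follows ISO 1608, not this simplification):

    ```python
    import numpy as np

    # Illustrative GUM-style combination for a pumping-speed model
    # S = (dV/dt) * p0 / p, i.e. a volume rate referred to the test-dome pressure p.
    vals = {"dVdt": 2.0e-6, "p0": 101325.0, "p": 0.40}     # m^3/s, Pa, Pa
    rels = {"dVdt": 0.02,   "p0": 0.001,    "p": 0.05}     # relative std uncertainties

    S = vals["dVdt"] * vals["p0"] / vals["p"]
    # For a product/quotient model, relative uncertainties add in quadrature.
    u_rel = np.sqrt(sum(r**2 for r in rels.values()))
    print(f"S = {S:.4f} m^3/s, relative combined uncertainty = {100 * u_rel:.2f}%")
    ```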

  6. Performance of Trajectory Models with Wind Uncertainty

    NASA Technical Reports Server (NTRS)

    Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

    2009-01-01

    Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread among the various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate that this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
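
    A minimal sketch of the time-lagged-ensemble idea with invented numbers (not the RUC data): the spread of forecasts valid at the same time serves as the uncertainty estimate, which then scales into a trajectory uncertainty:

    ```python
    import numpy as np

    # Hypothetical wind forecasts valid at the same time, issued at successive
    # hourly cycles (a time-lagged ensemble), at one grid point (u-component, m/s).
    lagged_forecasts = np.array([12.1, 11.4, 13.0, 12.6, 11.8, 12.3])

    # Spread of the lagged members as the uncertainty estimate for the latest forecast.
    wind_estimate = lagged_forecasts[-1]
    wind_sigma = lagged_forecasts.std(ddof=1)

    # Translate into along-track trajectory uncertainty over a prediction horizon.
    horizon_s = 1800.0                       # 30-minute look-ahead
    position_sigma = wind_sigma * horizon_s  # first-order: position error ~ wind error * time
    print(f"wind = {wind_estimate:.1f} +/- {wind_sigma:.1f} m/s; "
          f"along-track sigma after 30 min ~ {position_sigma / 1000:.1f} km")
    ```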

  7. Innovative surgery and the precautionary principle.

    PubMed

    Meyerson, Denise

    2013-12-01

    Surgical innovation involves practices, such as new devices, technologies, procedures, or applications, which are novel and untested. Although innovative practices are believed to offer an improvement on the standard surgical approach, they may prove to be inefficacious or even dangerous. This article considers how surgeons considering innovation should reason in the conditions of uncertainty that characterize innovative surgery. What attitude to the unknown risks of innovative surgery should they take? The answer to this question involves value judgments about the acceptability of risk taking when satisfactory scientific information is not available. This question has been confronted in legal contexts, where risk aversion in the form of the precautionary principle has become increasingly influential as a regulatory response to innovative technologies that pose uncertain future hazards. This article considers whether it is appropriate to apply a precautionary approach when making decisions about innovative surgery.

  8. Differentiate climate change uncertainty from other uncertainty sources for water quality modeling with Bayesian framework

    NASA Astrophysics Data System (ADS)

    Jiang, S.; Liu, M.; Rode, M.

    2011-12-01

    Prediction of water quality under future climate change is always associated with significant uncertainty resulting from the use of climate models and stochastic weather generators. This future-related uncertainty is usually mixed with the intrinsic uncertainty arising from model structure and parameterization, which is also present when modeling past and current events. For an effective water quality management policy, the uncertainty sources have to be differentiated and quantified separately. This work applies a Bayesian framework in two steps to quantify the climate change uncertainty as input uncertainty and the parameter uncertainty, respectively. The HYPE model (Hydrological Predictions for the Environment) from SMHI is applied to simulate the nutrient (N, P) sources in Weida, a 100 km2 agricultural lowland catchment in Germany. The results show that climate change shifts the uncertainty space in terms of the probability density function (PDF), and that a large portion of the future uncertainty is not covered by the current uncertainty.

  9. Principles of Environmental Chemistry

    NASA Astrophysics Data System (ADS)

    Hathaway, Ruth A.

    2007-07-01

    Roy M. Harrison, Editor; RSC Publishing; ISBN 0854043713; x + 363 pp.; 2006; $69.95. Environmental chemistry is an interdisciplinary science that includes chemistry of the air, water, and soil. Although it may be confused with green chemistry, which deals with potential pollution reduction, environmental chemistry is the scientific study of the chemical and biochemical principles that occur in nature. Therefore, it is the study of the sources, reactions, transport, effects, and fates of chemical species in the air, water, and soil environments, and the effect of human activity on them. Environmental chemistry not only explores each of these environments, but also closely examines the interfaces and boundaries where the environments intersect.

  10. Complex Correspondence Principle

    SciTech Connect

    Bender, Carl M.; Meisinger, Peter N.; Hook, Daniel W.; Wang Qinghai

    2010-02-12

    Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

  11. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.

  12. [Catastrophe medicine. General principles].

    PubMed

    Linde, H

    1980-10-17

    Catastrophe medicine is the organization of care for mass casualties under difficult conditions. The present article is concerned with competence and organization in the event of a catastrophe and describes the phasic course of a catastrophe situation. The most important elements of effective first aid measures, screening, shock and pain treatment, and initial surgical treatment, as well as the principles of ballistics, are dealt with. Particular attention is given to the evacuation of emergency patients from the scene of the catastrophe. A request is made for "catastrophe medicine" to be included by the medical faculties and educational institutes in their courses of study for paramedical personnel.

  13. Principles of Liquid Chromatography

    NASA Astrophysics Data System (ADS)

    Bakalyar, Stephen R.

    This article reviews the basic principles of high performance liquid chromatography (HPLC). The introductory section provides an overview of the HPLC technique, placing it in historical context and discussing the elementary facts of the separation mechanism. The next section discusses the nature of resolution, describing the two principal aspects, zone center separation and zone spreading. The third section takes a detailed look at how HPLC is used in practice to achieve a separation. It discusses the three key variables that need to be adjusted: retention, efficiency, and selectivity. A fourth section is concerned with various relationships of practical importance: flow rate, temperature, and pressure. A final section discusses future trends in HPLC.

  14. Principles of electromagnetic theory

    SciTech Connect

    Kovetz, A.H.

    1990-01-01

    This book emphasizes the fundamental understanding of the laws governing the behavior of charge- and current-carrying bodies. Electromagnetism is presented as a classical theory, based, like mechanics, on principles that are independent of the atomic constitution of matter. This book is unique among electromagnetic texts in its treatment of the precise manner in which electromagnetism is linked to mechanics and thermodynamics. Applications include electrostriction, piezoelectricity, ferromagnetism, superconductivity, thermoelectricity, magnetohydrodynamics, radiation from charged particles, electromagnetic wave propagation and guided waves. There are many worked examples of dynamical and thermal effects of electromagnetic fields, and of effects resulting from the motion of bodies.

  15. Aspects of universally valid Heisenberg uncertainty relation

    NASA Astrophysics Data System (ADS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2013-01-01

    A numerical illustration of a universally valid Heisenberg uncertainty relation, which was proposed recently, is presented by using the experimental data on spin-measurements by J. Erhart et al. [Nat. Phys. 8, 185 (2012)]. This uncertainty relation is closely related to a modified form of the Arthurs-Kelly uncertainty relation, which is also tested by the spin-measurements. The universally valid Heisenberg uncertainty relation always holds, but both the modified Arthurs-Kelly uncertainty relation and the Heisenberg error-disturbance relation proposed by Ozawa, which was analyzed in the original experiment, fail in the present context of spin-measurements, and the cause of their failure is identified with the assumptions of unbiased measurement and disturbance. It is also shown that all the universally valid uncertainty relations are derived from Robertson's relation and thus the essence of the uncertainty relation is exhausted by Robertson's relation, as is widely accepted.
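
    For reference, Robertson's relation, from which the abstract says all universally valid uncertainty relations derive, bounds the product of standard deviations of two observables A and B in a state |ψ⟩:

    ```latex
    \sigma_A \, \sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle \psi | [A, B] | \psi \rangle\bigr|,
    \qquad [A, B] = AB - BA .
    ```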

  16. Path planning under spatial uncertainty.

    PubMed

    Wiener, Jan M; Lafon, Matthieu; Berthoz, Alain

    2008-04-01

    In this article, we present experiments studying path planning under spatial uncertainty. In the main experiment, the participants' task was to navigate the shortest possible path to find an object hidden in one of four places and to bring it to the final destination. The probability of finding the object (probability matrix) was different for each of the four places and varied between conditions. Given such uncertainty about the object's location, planning a single path is not sufficient. Participants had to generate multiple consecutive plans (metaplans), for example: if the object is found in A, proceed to the destination; if the object is not found, proceed to B; and so on. The optimal solution depends on the specific probability matrix. In each condition, participants learned a different probability matrix and were then asked to report the optimal metaplan. Results demonstrate effective integration of the probabilistic information about the object's location during planning. We present a hierarchical planning scheme that could account for participants' behavior, as well as for systematic errors and differences between conditions.

  17. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

  18. Principles for School Drug Education

    ERIC Educational Resources Information Center

    Meyer, Lois

    2004-01-01

    This document presents a revised set of principles for school drug education. The principles for drug education in schools comprise an evolving framework that has proved useful over a number of decades in guiding the development of effective drug education. The first edition of "Principles for Drug Education in Schools" (Ballard et al.…

  19. Revealing a quantum feature of dimensionless uncertainty in linear and quadratic potentials by changing potential intervals

    NASA Astrophysics Data System (ADS)

    Kheiri, R.

    2016-09-01

    As an undergraduate exercise, in an article (2012 Am. J. Phys. 80 780-14), quantum and classical uncertainties for dimensionless variables of position and momentum were evaluated in three potentials: infinite well, bouncing ball, and harmonic oscillator. While the original quantum uncertainty products depend on ħ and the number of states (n), a dimensionless approach makes the comparison between quantum uncertainty and classical dispersion possible by excluding ħ. But the question is whether the uncertainty still remains dependent on the quantum number n. In the above-mentioned article, there lies this contrast; on the one hand, the dimensionless quantum uncertainty of the potential box approaches classical dispersion only in the limit of large quantum numbers (n → ∞), consistent with the correspondence principle. On the other hand, similar evaluations for the bouncing ball and harmonic oscillator potentials are equal to their classical counterparts independent of n. This equality may hide the quantum feature of low energy levels. In the current study, we change the potential intervals in order to make them symmetric for the linear potential and non-symmetric for the quadratic potential. As a result, it is shown in this paper that the dimensionless quantum uncertainty of these potentials in the new potential intervals is expressed in terms of the quantum number n. In other words, the uncertainty requires the correspondence principle in order to approach the classical limit. Therefore, it can be concluded that the dimensionless analysis, as a useful pedagogical method, does not take away the quantum feature of the n-dependence of quantum uncertainty in general. Moreover, our numerical calculations include the higher powers of the position for the potentials.

  20. Uncertainty quantification for quantum chemical models of complex reaction networks.

    PubMed

    Proppe, Jonny; Husch, Tamara; Simm, Gregor N; Reiher, Markus

    2016-12-22

    For the quantitative understanding of complex chemical reaction mechanisms, it is, in general, necessary to accurately determine the corresponding free energy surface and to solve the resulting continuous-time reaction rate equations for a continuous state space. For a general (complex) reaction network, it is computationally hard to fulfill these two requirements. However, it is possible to approximately address these challenges in a physically consistent way. On the one hand, it may be sufficient to consider approximate free energies if a reliable uncertainty measure can be provided. On the other hand, a highly resolved time evolution may not be necessary to still determine quantitative fluxes in a reaction network if one is interested in specific time scales. In this paper, we present discrete-time kinetic simulations in discrete state space taking free energy uncertainties into account. The method builds upon thermo-chemical data obtained from electronic structure calculations in a condensed-phase model. Our kinetic approach supports the analysis of general reaction networks spanning multiple time scales, which is here demonstrated for the example of the formose reaction. An important application of our approach is the detection of regions in a reaction network which require further investigation, given the uncertainties introduced by both approximate electronic structure methods and kinetic models. Such cases can then be studied in greater detail with more sophisticated first-principles calculations and kinetic simulations.
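
    A sketch of the core propagation step under assumed Eyring kinetics (the free-energy values are hypothetical, and this is not the formose network itself): sampling the free-energy uncertainty yields a distribution of rate constants and derived quantities such as a first-order half-life:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Eyring rate from a free energy of activation: k = (kB*T/h) * exp(-dG / (R*T)).
    kB_h = 2.083661912e10       # kB/h in 1/(s*K)
    R = 8.314462618e-3          # gas constant in kJ/(mol*K)
    T = 298.15

    dG_mean, dG_sigma = 85.0, 4.0   # kJ/mol; a ~1 kcal/mol-scale method uncertainty

    # Sample the free-energy uncertainty and propagate to a first-order half-life.
    dG = rng.normal(dG_mean, dG_sigma, 100_000)
    k = kB_h * T * np.exp(-dG / (R * T))
    t_half = np.log(2.0) / k

    lo, hi = np.percentile(t_half, [2.5, 97.5])
    print(f"rate constant spans {k.min():.2e} .. {k.max():.2e} 1/s")
    print(f"95% interval for the half-life: {lo:.2e} .. {hi:.2e} s")
    ```

    Even a few kJ/mol of free-energy uncertainty spreads the rate constant over orders of magnitude, which is why flagging uncertain regions of a network matters.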

  1. Forest management under uncertainty for multiple bird population objectives

    USGS Publications Warehouse

    Moore, C.T.; Plummer, W.T.; Conroy, M.J.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
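
    The model-weight update described here is Bayes' rule over the alternative models; a minimal sketch with invented predictions and a single monitoring observation:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Two alternative models predict next year's woodpecker count response to a
    # management action; monitoring data then update our belief in each model.
    # All numbers are invented for illustration.
    priors = np.array([0.5, 0.5])
    predicted_means = np.array([40.0, 55.0])     # model A vs. model B predictions
    predicted_sd = 8.0

    observed = 52.0                              # monitored response
    likelihoods = norm.pdf(observed, loc=predicted_means, scale=predicted_sd)

    posteriors = priors * likelihoods            # Bayes' rule, then normalize
    posteriors /= posteriors.sum()
    print(f"updated model weights: A = {posteriors[0]:.2f}, B = {posteriors[1]:.2f}")
    ```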

  2. Revisiting Tversky's diagnosticity principle

    PubMed Central

    Evers, Ellen R. K.; Lakens, Daniël

    2013-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

  3. [Principles of callus distraction].

    PubMed

    Hankemeier, S; Bastian, L; Gosling, T; Krettek, C

    2004-10-01

    Callus distraction is based on the principle of regenerating bone by continuous distraction of proliferating callus tissue. It has become the standard treatment of significant leg shortening and large bone defects. Due to many problems and complications, exact preoperative planning, operative technique and careful postoperative follow-up are essential. External fixators can be used for all indications of callus distraction. However, due to pin tract infections, pain and loss of mobility caused by soft tissue transfixation, fixators are applied in patients with open growth plates, simultaneous lengthening with continuous deformity corrections, and increased risk of infection. Distraction over an intramedullary nail allows removal of the external fixator at the end of distraction before callus consolidation (monorail method). The intramedullary nail protects newly formed callus tissue and reduces the risk of axial deviation and refractures. Recently developed, fully intramedullary lengthening devices eliminate fixator-associated complications and accelerate return to normal daily activities. This review describes principles of callus distraction, potential complications and their management.

  4. Principle of relative locality

    SciTech Connect

    Amelino-Camelia, Giovanni; Freidel, Laurent; Smolin, Lee; Kowalski-Glikman, Jerzy

    2011-10-15

    We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

  5. Principles of safety pharmacology.

    PubMed

    Pugsley, M K; Authier, S; Curtis, M J

    2008-08-01

    Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into a safety margin calculation, and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice and the burden of proof, and to highlight areas of intensive activity (such as testing for drug-induced rare event liability and the challenge of testing the safety of so-called biologics: antibodies, gene therapy, and so on).

  6. Application of fuzzy system theory in addressing the presence of uncertainties

    SciTech Connect

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-03

    In this paper, the combination of fuzzy system theory with finite element methods is presented and discussed as a way of dealing with uncertainties. Uncertainties need to be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory involves a number of processes, starting with converting crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulation results show that the proposed method produces more conservative results compared with the conventional finite element method.
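
    A compact sketch of the extension-principle step via alpha-cut interval propagation (a standard numerical realization; the triangular input and the response function below are invented, not taken from the paper):

    ```python
    import numpy as np

    def alpha_cut(a, b, c, alpha):
        """Alpha-cut interval of a triangular fuzzy number (a, b, c)."""
        return a + alpha * (b - a), c - alpha * (c - b)

    def extend(f, tri, alphas, grid=401):
        """Zadeh's extension principle via alpha-cut interval propagation:
        for each alpha, the image of f over the input cut gives the output cut."""
        cuts = []
        for alpha in alphas:
            lo, hi = alpha_cut(*tri, alpha)
            x = np.linspace(lo, hi, grid)
            y = f(x)
            cuts.append((alpha, y.min(), y.max()))
        return cuts

    # Example: propagate a triangular fuzzy modulus E = (190, 200, 215) through a
    # monotone response; values are illustrative only.
    alphas = [0.0, 0.5, 1.0]
    for alpha, lo, hi in extend(lambda E: 1.0e6 / E, (190.0, 200.0, 215.0), alphas):
        print(f"alpha = {alpha:.1f}: output in [{lo:.1f}, {hi:.1f}]")
    ```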

  7. An integrated approach for addressing uncertainty in the delineation of groundwater management areas

    NASA Astrophysics Data System (ADS)

    Sousa, Marcelo R.; Frind, Emil O.; Rudolph, David L.

    2013-05-01

    Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible.
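
    The precautionary merge of scenario plumes can be sketched as a cell-wise maximum over capture probabilities (synthetic arrays standing in for actual model output):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Capture-probability plumes on a common grid for three recharge scenarios
    # (synthetic arrays standing in for model output).
    plumes = rng.random((3, 50, 50))   # values in [0, 1]: P(capture) per cell, per scenario

    # Precautionary merge: protect a cell if ANY plausible scenario implicates it,
    # i.e. take the cell-wise maximum capture probability across scenarios.
    merged = np.maximum.reduce(plumes)

    # Delineate the management area at a chosen probability threshold.
    protect = merged >= 0.95
    print(f"cells protected: {protect.sum()} of {protect.size}")
    ```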

  8. An integrated approach for addressing uncertainty in the delineation of groundwater management areas.

    PubMed

    Sousa, Marcelo R; Frind, Emil O; Rudolph, David L

    2013-05-01

    Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible.

  9. On the Impact of Uncertainty in Initial Conditions of Hydrologic Models on Prediction

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.

    2015-12-01

    Determining the initial conditions for predictive models remains a challenge due to the uncertainty in measurement/identification of the state variables at the scale of interest. However, the characterization of uncertainty in initial conditions has arguably attracted less attention compared with other sources of uncertainty in hydrologic modelling (e.g., parameter, data, and structural uncertainty). This is perhaps because it is commonly believed that: (1) hydrologic systems (relatively rapidly) forget their initial conditions over time, and (2) other sources of uncertainty (e.g., in data) are dominant. This presentation revisits the basic principles of the theory of nonlinear dynamical systems in the context of hydrologic systems. Through simple example case studies, we demonstrate how and under what circumstances different hydrologic processes represent a range of attracting limit sets in their evolution trajectory in state space over time, including fixed points, limit cycles (periodic behaviour), tori (quasi-periodic behaviour), and strange attractors (chaotic behaviour). Furthermore, the propagation (or dissipation) of uncertainty in initial conditions of several hydrologic models through time, under any of the possible attracting limit sets, is investigated. This study highlights that there are definite situations in hydrology where uncertainty in initial conditions remains of significance. The results and insights gained have important implications for hydrologic modelling under non-stationarity in climate and environment.

  10. Risk management and the precautionary principle: a fuzzy logic model.

    PubMed

    Cameron, Enrico; Peloso, Gian Francesco

    2005-08-01

    The aim of this article is to illustrate a procedure for applying the precautionary principle within a strategy for reducing the possibility of underestimating the effective risk caused by a phenomenon, product, or process, and of adopting insufficient risk reduction measures or overlooking their need. We start by simply defining risk as the product of the numerical expression of the adverse consequences of an event and the likelihood of its occurrence, or the likelihood that such consequences will occur. Uncertainty in likelihood estimates and several key concepts inherent to the precautionary principle, such as sufficient certainty, prevention, and desired level of protection, are represented as fuzzy sets. The strategy described may be viewed as a simplified example of a precautionary decision process; it has been chiefly conceived as a theoretical contribution to the debate concerning the precautionary principle, the quantification of its application, and the formal approach to such problems.
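
    A minimal sketch of the kind of fuzzy comparison the article describes: the risk estimate and the desired level of protection are represented as fuzzy sets, and the height of their overlap (a possibility measure) indicates how plausibly the risk is tolerable. The triangular shapes and all numbers are assumptions for illustration, not the authors' model.

      import numpy as np

      def tri(x, a, m, b):
          """Membership of x in a triangular fuzzy set (a, m, b)."""
          return np.clip(np.minimum((x - a) / (m - a), (b - x) / (b - m)), 0.0, 1.0)

      x = np.linspace(0.0, 10.0, 2001)

      risk = tri(x, 2.0, 4.0, 7.0)        # hypothetical fuzzy risk estimate
      tolerable = tri(x, 0.0, 1.0, 3.0)   # hypothetical desired level of protection

      # Possibility that the risk is tolerable: height of the overlap of the two sets.
      possibility_acceptable = float(np.max(np.minimum(risk, tolerable)))
      print(possibility_acceptable)       # low values argue for precautionary measures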

  11. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    PubMed

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty, and which factors have been considered in the vendor's assignment of uncertainty, is critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
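
    A minimal sketch, under GUM-style first-order assumptions, of how the factors listed above (purity, mass measurement, solvent addition) can combine into the uncertainty of a certified concentration c = purity x mass / volume: relative standard uncertainties of a product combine in root-sum-square, and a coverage factor k = 2 gives the expanded uncertainty. All numerical values are hypothetical.

      import math

      purity, mass_mg, volume_mL = 0.998, 10.00, 10.00   # assumed example values

      # Relative standard uncertainties of each factor (hypothetical):
      u_rel = {
          "purity (incl. water/solvent/inorganics)": 0.0015,
          "mass (balance, weighing technique)": 0.0008,
          "volume (solvent addition, density)": 0.0012,
      }

      c = purity * mass_mg / volume_mL                     # mg/mL
      u_c_rel = math.sqrt(sum(u ** 2 for u in u_rel.values()))  # RSS for a product
      U_rel = 2.0 * u_c_rel                                # expanded uncertainty, k = 2

      print(f"c = {c:.4f} mg/mL, U = +/-{U_rel * c:.4f} mg/mL (k=2)")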

  12. Uncertainty in gridded CO2 emissions estimates

    NASA Astrophysics Data System (ADS)

    Hogue, Susannah; Marland, Eric; Andres, Robert J.; Marland, Gregg; Woodard, Dawn

    2016-05-01

    We are interested in the spatial distribution of fossil-fuel-related emissions of CO2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from the use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent with uncertainty increasing as grid size decreases. Uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.
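
    The scale dependence noted above can be reproduced in a toy Monte Carlo: independent multiplicative cell-level errors largely cancel when cells are aggregated, so relative uncertainty falls as grid size grows. The grid, the lognormal error model, and the roughly 150% cell-level spread are illustrative assumptions, not the data set's actual error structure.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical 8x8 grid of "true" emissions and 1000 proxy-based estimates
      # with independent cell-level multiplicative errors (~150 % spread).
      true = rng.lognormal(mean=0.0, sigma=2.0, size=(8, 8))
      est = true * rng.lognormal(mean=0.0, sigma=np.log(2.5), size=(1000, 8, 8))

      def rel_spread(block):
          """Relative 1-sigma spread after aggregating block x block cells."""
          agg = est.reshape(1000, 8 // block, block, 8 // block, block).sum(axis=(2, 4))
          return float((agg.std(axis=0) / agg.mean(axis=0)).mean())

      for block in (1, 2, 4, 8):            # 1 = native cells, 8 = whole domain
          print(f"{block}x{block} aggregation: relative spread {rel_spread(block):.2f}")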

  13. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
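
    A generic sketch of the propagation step described above: first-order (Taylor) propagation of a covariance matrix of correlated measured inputs through a defining functional expression, u_y = sqrt(J Sigma J^T). The dynamic-pressure example and the covariance values are assumptions, not from the paper.

      import numpy as np

      def propagate(f, x, cov, eps=1.0e-6):
          """First-order propagation of input covariance through y = f(x)."""
          x = np.asarray(x, dtype=float)
          J = np.zeros_like(x)
          for i in range(x.size):                  # central-difference Jacobian
              dx = np.zeros_like(x)
              dx[i] = eps * max(1.0, abs(x[i]))
              J[i] = (f(x + dx) - f(x - dx)) / (2.0 * dx[i])
          return f(x), float(np.sqrt(J @ cov @ J))

      # Example: dynamic pressure q = 0.5 * rho * v**2 from correlated rho and v.
      f = lambda x: 0.5 * x[0] * x[1] ** 2
      cov = np.array([[1.0e-4, 2.0e-4],            # var(rho), cov(rho, v)
                      [2.0e-4, 4.0e-3]])           # cov(v, rho), var(v)
      q, u_q = propagate(f, [1.225, 30.0], cov)
      print(f"q = {q:.1f} Pa, u(q) = {u_q:.2f} Pa")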

  14. Optimal invasive species management under multiple uncertainties.

    PubMed

    Kotani, Koji; Kakinaka, Makoto; Matsuda, Hiroyuki

    2011-09-01

    The management programs for invasive species have been proposed and implemented in many regions of the world. However, practitioners and scientists have not reached a consensus on how to control them yet. One reason is the presence of various uncertainties associated with the management. To give some guidance on this issue, we characterize the optimal strategy by developing a dynamic model of invasive species management under uncertainties. In particular, focusing on (i) growth uncertainty and (ii) measurement uncertainty, we identify how these uncertainties affect optimal strategies and value functions. Our results suggest that a rise in growth uncertainty causes the optimal strategy to involve more restrained removals and the corresponding value function to shift up. Furthermore, we also find that a rise in measurement uncertainty affects optimal policies in a highly complex manner, but their corresponding value functions generally shift down as measurement uncertainty rises. Overall, a rise in growth uncertainty can be beneficial, while a rise in measurement uncertainty brings about an adverse effect, which implies the potential gain of precisely identifying the current stock size of invasive species.

  15. The Role of Uncertainty in Climate Science

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  16. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.
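
    The coupling entropy itself is specific to the paper, but it builds on ordinal (permutation) patterns. The sketch below computes plain permutation entropy, a standard building block of such permutation-based measures; it is not the authors' exact estimator.

      import itertools
      import math
      import numpy as np

      def permutation_entropy(x, order=3):
          """Shannon entropy of ordinal patterns of length `order` in series x (bits)."""
          x = np.asarray(x)
          patterns = {p: 0 for p in itertools.permutations(range(order))}
          for i in range(len(x) - order + 1):
              patterns[tuple(np.argsort(x[i:i + order]))] += 1
          total = sum(patterns.values())
          return -sum(c / total * math.log2(c / total) for c in patterns.values() if c)

      rng = np.random.default_rng(1)
      print(permutation_entropy(rng.normal(size=5000)))           # near log2(6) ~ 2.58
      print(permutation_entropy(np.sin(np.arange(5000) * 0.01)))  # much lower: regular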

  17. Induction of models under uncertainty

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter

    1986-01-01

    This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; particularly, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one aimed at keeping the theory structure as simple as possible, while still reflecting any structure discovered in the data.
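
    A minimal sketch of the central mechanism, Bayes' theorem deciding between alternative class hypotheses, with a prior that favours the simpler theory. The data, the two-class split, and the 0.7/0.3 "Occam" prior are hypothetical, not the paper's robot example.

      import numpy as np
      from math import lgamma, log

      def betaln(a, b):
          return lgamma(a) + lgamma(b) - lgamma(a + b)

      def log_marginal(k, n, a=1.0, b=1.0):
          """log P(data | H) for a Bernoulli rate with a Beta(a, b) prior."""
          return betaln(a + k, b + n - k) - betaln(a, b)

      data = np.array([1, 1, 0, 1, 1, 1, 0, 1, 1, 1])

      logm_h1 = log_marginal(data.sum(), data.size)       # H1: one class, one rate
      logm_h2 = (log_marginal(data[:5].sum(), 5)          # H2: two classes with
                 + log_marginal(data[5:].sum(), 5))       # distinct rates (assumed split)

      # Simplicity-favouring prior over theories, then normalized posterior.
      log_post = np.array([log(0.7) + logm_h1, log(0.3) + logm_h2])
      post = np.exp(log_post - log_post.max())
      post /= post.sum()
      print({"H1 (one class)": post[0], "H2 (two classes)": post[1]})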

  18. [Uncertainty, agency relationships and miscommunication].

    PubMed

    Ruiz Moreno, J

    1999-04-01

    The author finds it curious that, whenever he has tried to explain what an agency relationship is, everyone understands the concept with ease ("It's common sense," they usually respond), and yet agency relationships are ignored by those with managerial responsibilities in the majority of businesses, in any industrial or service field. This observation introduces the fundamental purpose of the article: after reading it, any professional should have a clearer idea of such important concepts as uncertainty, agency relationships, and asymmetrical information, which are needed to understand how costs function and how they are generated inside a health organization. Using a British Airways flight as a comparison, it is difficult not to understand these terms in their practical application to the health field.

  19. Uncertainty and instream flow standards

    USGS Publications Warehouse

    Castleberry, D.; Cech, J.; Erman, D.; Hankin, D.; Healey, M.; Kondolf, M.; Mengel, M.; Mohr, M.; Moyle, P.; Nielsen, Jennifer; Speed, T.; Williams, J.

    1996-01-01

    Several years ago, Science published an important essay (Ludwig et al. 1993) on the need to confront the scientific uncertainty associated with managing natural resources. The essay did not discuss instream flow standards explicitly, but its arguments apply. At an April 1995 workshop in Davis, California, all 12 participants agreed that currently no scientifically defensible method exists for defining the instream flows needed to protect particular species of fish or aquatic ecosystems (Williams, in press). We also agreed that acknowledging this fact is an essential step in dealing rationally and effectively with the problem. Practical necessity and the protection of fishery resources require that new instream flow standards be established and that existing standards be revised. However, if standards cannot be defined scientifically, how can this be done? We join others in recommending the approach of adaptive management. Applied to instream flow standards, this approach involves at least three elements.

  20. Uncertainties in Transfer Impedance Calculations

    NASA Astrophysics Data System (ADS)

    Schippers, H.; Verpoorte, J.

    2016-05-01

    The shielding effectiveness of the metal braids of cables is governed by the geometry and the materials of the braid, and it can be characterised by the transfer impedance of the metal braid. Analytical models for the transfer impedance in general contain two components, one representing diffusion of electromagnetic energy through the metal braid, and a second representing leakage of magnetic fields through the braid. Possible sources of uncertainty in the modelling are inaccurate input data (for instance, the exact braid diameter or wire diameter is not known) and imperfections in the computational model. The aim of the present paper is to estimate the effects of variations in the input data on the calculated transfer impedance.
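
    A hedged sketch of the kind of input-variation study the paper describes, using the classic solid-shield diffusion approximation plus a leakage inductance term rather than the paper's braid model. The thickness and mutual-inductance values, their spreads, and the 5 mm shield radius are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      mu0, rho_cu = 4e-7 * np.pi, 1.72e-8       # permeability (H/m), Cu resistivity (ohm m)
      omega = 2.0 * np.pi * 1.0e6               # angular frequency at 1 MHz

      # Uncertain inputs (means and spreads are assumptions):
      n = 10_000
      t = rng.normal(0.20e-3, 0.01e-3, n)       # shield wall thickness (m), ~5 % spread
      M = rng.normal(1.0e-9, 0.2e-9, n)         # hole-leakage mutual inductance (H/m)

      delta = np.sqrt(2.0 * rho_cu / (omega * mu0))   # skin depth (m)
      R0 = rho_cu / (2.0 * np.pi * 5.0e-3 * t)        # DC resistance per unit length
      g = (1.0 + 1.0j) / delta                        # complex propagation factor

      # Transfer impedance: diffusion term + leakage term, sampled over inputs.
      Zt = R0 * g * t / np.sinh(g * t) + 1j * omega * M
      mag = np.abs(Zt) * 1e3                          # mohm/m

      print(f"|Zt| at 1 MHz: median {np.median(mag):.2f} mohm/m, "
            f"95 % interval [{np.percentile(mag, 2.5):.2f}, "
            f"{np.percentile(mag, 97.5):.2f}] mohm/m")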

  1. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker

  2. Genetics and psychiatry: a proposal for the application of the precautionary principle.

    PubMed

    Porteri, Corinna

    2013-08-01

    The paper suggests an application of the precautionary principle to the use of genetics in psychiatry focusing on scientific uncertainty. Different levels of uncertainty are taken into consideration--from the acknowledgement that the genetic paradigm is only one of the possible ways to explain psychiatric disorders, via the difficulties related to the diagnostic path and genetic methods, to the value of the results of studies carried out in this field. Considering those uncertainties, some measures for the use of genetics in psychiatry are suggested. Some of those measures are related to the conceptual limits of the genetic paradigm; others are related to present knowledge and should be re-evaluated.

  3. System level electrochemical principles

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1985-01-01

    Traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has responded with increased use of electronics and with electrochemical couples that minimize the difficulties associated with corrective measures for reducing cell-to-cell capacity dispersion. Actively cooled bipolar concepts are described that represent attractive alternative system concepts. They are projected to have higher energy densities and lower volumes than current concepts, should be easier to scale from one capacity to another, and should have a closer cell-to-cell capacity balance. These newer storage system concepts are easier to manage since they are designed as a fully integrated battery. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.

  4. Kepler and Mach's Principle

    NASA Astrophysics Data System (ADS)

    Barbour, Julian

    The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.

  5. Principles of Induction Accelerators

    NASA Astrophysics Data System (ADS)

    Briggs*, Richard J.

    The basic concepts involved in induction accelerators are introduced in this chapter. The objective is to provide a foundation for the more detailed coverage of key technology elements and specific applications in the following chapters. A wide variety of induction accelerators are discussed in the following chapters, from the high current linear electron accelerator configurations that have been the main focus of the original developments, to circular configurations like the ion synchrotrons that are the subject of more recent research. The main focus in the present chapter is on the induction module containing the magnetic core that plays the role of a transformer in coupling the pulsed power from the modulator to the charged particle beam. This is the essential common element in all these induction accelerators, and an understanding of the basic processes involved in its operation is the main objective of this chapter. (See [1] for a useful and complementary presentation of the basic principles in induction linacs.)

  6. Basic Principles in Oncology

    NASA Astrophysics Data System (ADS)

    Vogl, Thomas J.

    The evolving field of interventional oncology can only be considered as a small integrative part in the complex area of oncology. The new field of interventional oncology needs a standardization of the procedures, the terminology, and criteria to facilitate the effective communication of ideas and appropriate comparison between treatments and new integrative technology. In principle, ablative therapy is a part of locoregional oncological therapy and is defined either as chemical ablation using ethanol or acetic acid, or thermotherapies such as radiofrequency, laser, microwave, and cryoablation. All these new evolving therapies have to be exactly evaluated and an adequate terminology has to be used to define imaging findings and pathology. All the different technologies and evaluated therapies have to be compared, and the results have to be analyzed in order to improve the patient outcome.

  7. Uncertainty quantification for holographic interferographic images

    NASA Astrophysics Data System (ADS)

    Centauri, Laurie Ann

    Current comparison methods for experimental and simulated holographic interferometric images are qualitative in nature. Previous comparisons of holographic interferometric images with computational fluid dynamics (CFD) simulations for validation have been performed qualitatively through visual comparison by a data analyst. By validating the experiments and CFD simulations in a quantifiable manner using a consistency analysis, the validation becomes a repeatable process that gives a consistency measure and a range of inputs over which the experiments and CFD simulations give consistent results. The uncertainty in four holographic interferometric experiments was quantified for use in a data collaboration with CFD simulations for the purpose of validation. The model uncertainty from image processing, the measurement uncertainty from experimental data variation, and the scenario uncertainty from bias and parameter uncertainty were quantified. The scenario uncertainty was determined through comparison with an analytical solution at the helium inlet (height x = 0), including the uncertainty in the experimental parameters from historical weather data. The model uncertainty was calculated through a Box-Behnken sensitivity analysis on three image-processing code parameters. Measurement uncertainty was determined through a statistical analysis of the time-average and standard deviation of the interference fringe positions. An experimental design matrix of CFD simulations was performed by Weston Eldredge using a Box-Behnken design with helium velocity, temperature, and air co-flow velocity as parameters, to provide simulated measurements for the Data Collaboration dataset. Over 3,200 holographic interferometric images were processed through the course of this study. When each permutation of these images is taken into account through all the image-processing steps, the total number of images processed is over 13,000. Probability

  8. Dynamical principles in neuroscience

    SciTech Connect

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-15

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?.

  9. Fault Management Guiding Principles

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

    Regardless of the mission type: deep space or low Earth orbit, robotic or human spaceflight, Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of the supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the maturity of FM as an engineering discipline, which lags behind that of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles cover considerations for integrating FM into the project and SE organizational structure, the relationship between FM designs and mission risk, and the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  10. Dynamical principles in neuroscience

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  11. Solving navigational uncertainty using grid cells on robots.

    PubMed

    Milford, Michael J; Wiles, Janet; Wyeth, Gordon F

    2010-11-11

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our

  12. A Bayesian Foundation for Individual Learning Under Uncertainty

    PubMed Central

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J.; Stephan, Klaas E.

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory. PMID

  13. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive uncertainty technique to evaluate uncertainty, resulting in a more realistic evaluation of PI, thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated variable dependence in the equation for PI.

  14. Assessment of SFR Wire Wrap Simulation Uncertainties

    SciTech Connect

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David; Swiler, Laura P.

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility.

  15. Sensitivity analysis of uncertainty in model prediction.

    PubMed

    Russi, Trent; Packard, Andrew; Feeley, Ryan; Frenklach, Michael

    2008-03-27

    Data Collaboration is a framework designed to make inferences from experimental observations in the context of an underlying model. In prior studies, the methodology was applied to prediction with chemical kinetics models, consistency of a reaction system, and discrimination among competing reaction models. The present work advances Data Collaboration by developing sensitivity analysis of uncertainty in model prediction with respect to uncertainty in experimental observations and model parameters. Evaluation of sensitivity coefficients is performed alongside the solution of the general optimization ansatz of Data Collaboration. The obtained sensitivity coefficients allow one to determine which experiment/parameter uncertainty contributes the most to the uncertainty in model prediction, rank such effects, consider new or even hypothetical experiments to perform, and combine the uncertainty analysis with the cost of uncertainty reduction, thereby providing guidance in selecting an experimental/theoretical strategy for community action.

  16. Evacuation decision-making: process and uncertainty

    SciTech Connect

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process, and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified, concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions, and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a ''precautionary'' evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs.

  17. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  18. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.

  19. Incorporating Forecast Uncertainty in Utility Control Center

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).

  20. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  1. Whitepaper on Uncertainty Quantification for MPACT

    SciTech Connect

    Williams, Mark L.

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However, MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.

  2. Uncertainty quantification of effective nuclear interactions

    SciTech Connect

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-02

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  3. Calculation of Measurement Uncertainty Using Prior Information

    PubMed Central

    Phillips, S. D.; Estler, W. T.; Levenson, M. S.; Eberhardt, K. R.

    1998-01-01

    We describe the use of Bayesian inference to include prior information about the value of the measurand in the calculation of measurement uncertainty. Typical examples show this can, in effect, reduce the expanded uncertainty by up to 85 %. The application of the Bayesian approach to proving workpiece conformance to specification (as given by international standard ISO 14253-1) is presented and a procedure for increasing the conformance zone by modifying the expanded uncertainty guard bands is discussed. PMID:28009370
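
    A minimal sketch of the effect described above for the simplest conjugate case: a Gaussian prior on the measurand combined with a Gaussian measurement shrinks the posterior standard uncertainty, and with it the expanded uncertainty. All numbers are hypothetical.

      import math

      mu0, sigma0 = 10.000, 0.004   # prior mean and standard deviation (hypothetical)
      y, sigma = 10.006, 0.010      # measured value and standard uncertainty

      # Conjugate Gaussian update: posterior precision is the sum of precisions.
      w0, w = 1.0 / sigma0**2, 1.0 / sigma**2
      mu_post = (w0 * mu0 + w * y) / (w0 + w)
      sigma_post = math.sqrt(1.0 / (w0 + w))

      print(f"posterior: {mu_post:.4f} +/- {2 * sigma_post:.4f} (k=2)")
      print(f"vs. measurement alone: {y:.4f} +/- {2 * sigma:.4f} (k=2)")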

  4. A Strategy for Uncertainty Visualization Design

    DTIC Science & Technology

    2009-10-01

    ...the design a clearer understanding. As an example application of the UVDS, it is applied to current research regarding uncertainty visualization for the Canadian...TMs) aimed at creating foundational documents on the topic of uncertainty visualization which can be used in defence applications. In addition, the

  5. UK coastal flood risk; understanding the uncertainty

    NASA Astrophysics Data System (ADS)

    Lewis, Matt; Bates, Paul; Horsburgh, Kevin; Smith, Ros

    2010-05-01

    The sensitivity of flood risk mapping to the major sources of future climate uncertainty was investigated by propagating these uncertainties through a LISFLOOD inundation model of a significant flood event on the North Somerset coast, in the west of the UK. The largest source of uncertainty was found to be the global mean sea level (MSL) rise range of 18-59 cm by 2100 (as reported by the Intergovernmental Panel on Climate Change), with an approximate upper limit of 1 m. Therefore, MSL rise uncertainty needs to be quantified in future flood risk predictions. However, the uncertainty of the storm tide height along the coastline (i.e. the maximum water level at the coast excluding wave effects) was also found to significantly affect our results. Our evidence suggests that the current flood mapping approach of forcing the inundation model with an extreme water level of constant return period is incorrect. We present a new technique based on the spatial characteristics of real events, which provides a more reliable spatial treatment of the storm tide uncertainty. The uncertainty of the land roughness coefficients (0.018-0.09 for the study area, depending upon land use), used within the inundation model to control flood wave propagation, was found to affect inundation extents, especially for larger inundation events. However, the sensitivity to roughness uncertainty was found to be much smaller than that to other factors, such as MSL rise uncertainty. We present the results of propagating these uncertainties through an inundation model and develop probabilistic techniques to quantify these sources of future flood risk uncertainty. Keywords: future flood risk, uncertainty, inundation, LISFLOOD, sea-level rise, climate change
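
    A minimal sketch of propagating the MSL rise range quoted above through a simple exceedance calculation. The triangular distribution shape, the storm tide level, and the defence crest height are assumptions for illustration; the study's actual method propagates uncertainty through the LISFLOOD inundation model.

      import numpy as np

      rng = np.random.default_rng(3)

      # 2100 mean-sea-level rise (m): triangular over the 18-59 cm range quoted
      # above, with the ~1 m value as an upper limit (distribution shape assumed).
      mslr = rng.triangular(0.18, 0.40, 1.00, size=100_000)

      # Hypothetical defence: a present-day extreme storm tide of 7.2 m against a
      # 7.6 m crest; the defence is exceeded when storm tide + MSL rise > crest.
      storm_tide, crest = 7.2, 7.6
      p_exceed = float(np.mean(storm_tide + mslr > crest))
      print(f"probability the defence level is exceeded by 2100: {p_exceed:.2f}")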

  6. Uncertainty propagation within the UNEDF models

    NASA Astrophysics Data System (ADS)

    Haverinen, T.; Kortelainen, M.

    2017-04-01

    The parameters of the nuclear energy density functional have to be adjusted to experimental data. As a result they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and the proton matter radius for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.

  7. One-parameter class of uncertainty relations based on entropy power.

    PubMed

    Jizba, Petr; Ma, Yue; Hayes, Anthony; Dunningham, Jacob A

    2016-06-01

    We use the concept of entropy power to derive a one-parameter class of information-theoretic uncertainty relations for pairs of conjugate observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order statistics uncertainty relations, which allows one in principle to determine the shape of the underlying information-distribution function by measuring the relevant entropy powers. We illustrate the capability of this class by discussing two examples: superpositions of vacuum and squeezed states and the Cauchy-type heavy-tailed wave function.
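
    For reference, the standard definitions behind this abstract (paraphrased from the general literature, not quoted from the paper): the entropy power of a random variable X with differential entropy h(X), and the entropy-power form of the uncertainty relation for conjugate x and p (with hbar = 1), which implies and is stronger than the variance form:

      N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)},
      \qquad
      \sigma_x^2\, \sigma_p^2 \;\ge\; N(x)\, N(p) \;\ge\; \tfrac{1}{4} \quad (\hbar = 1).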

  8. One-parameter class of uncertainty relations based on entropy power

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Ma, Yue; Hayes, Anthony; Dunningham, Jacob A.

    2016-06-01

    We use the concept of entropy power to derive a one-parameter class of information-theoretic uncertainty relations for pairs of conjugate observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order statistics uncertainty relations, which allows one in principle to determine the shape of the underlying information-distribution function by measuring the relevant entropy powers. We illustrate the capability of this class by discussing two examples: superpositions of vacuum and squeezed states and the Cauchy-type heavy-tailed wave function.

  9. Manufacturing uncertainty: contested science and the protection of the public's health and environment.

    PubMed

    Michaels, David; Monforton, Celeste

    2005-01-01

    Opponents of public health and environmental regulations often try to "manufacture uncertainty" by questioning the validity of scientific evidence on which the regulations are based. Though most identified with the tobacco industry, this strategy has also been used by producers of other hazardous products. Its proponents use the label "junk science" to ridicule research that threatens powerful interests. This strategy of manufacturing uncertainty is antithetical to the public health principle that decisions be made using the best evidence available. The public health system must ensure that scientific evidence is evaluated in a manner that assures the public's health and environment will be adequately protected.

  10. Signal-locality, uncertainty, and the subquantum H-theorem. II

    NASA Astrophysics Data System (ADS)

    Valentini, Antony

    1991-08-01

    In the pilot-wave formulation, signal-locality and the uncertainty principle are shown to be valid only for the equilibrium distribution P = |Ψ|² (which arises from the subquantum H-theorem proved earlier). The H-theorem then explains the emergence of effective locality and uncertainty from a deeper nonlocal and deterministic theory. In order to explain the present uneasy "peaceful coexistence" (or "conspiracy") between relativity and quantum theory, we suggest that a subquantum analogue of Boltzmann's heat death has actually happened in the real universe.

  11. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution that is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of the dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric with respect to the same variations. It was evaluated on patient data deformed using a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
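
    A one-dimensional toy version of the paper's central point, that registration uncertainty translates into dose-mapping error in proportion to the local dose gradient. The dose profile and the spatially varying geometric uncertainty are assumptions; the actual method evaluates sensitivities of the b-spline coefficients and the registration metric.

      import numpy as np

      x = np.linspace(0.0, 100.0, 1001)                # mm, 1-D toy geometry
      dose = 60.0 / (1.0 + np.exp((x - 50.0) / 2.0))   # Gy, steep fall-off near 50 mm

      # Geometric registration uncertainty (1-sigma, mm), assumed spatially varying.
      sigma_u = 0.5 + 1.5 * np.exp(-((x - 30.0) / 10.0) ** 2)

      # First-order estimate: dose-mapping uncertainty ~ |dose gradient| * geometric sigma.
      grad = np.gradient(dose, x)
      sigma_dose = np.abs(grad) * sigma_u

      print(f"max dose-mapping uncertainty: {sigma_dose.max():.2f} Gy "
            f"at x = {x[np.argmax(sigma_dose)]:.1f} mm")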

  12. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project; it contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  13. Sticking to its principles.

    PubMed

    1992-03-27

    Planned Parenthood says that rather than accept the Bush administration's gag rule it will give up federal funding of its operations. The gag rule forbids professionals at birth control clinics from even referring to abortion as an option to a pregnant woman, much less recommending one. President Bush has agreed to a policy which allows physicians, but no one else at clinics, to discuss abortion in at least some cases. In his view, according to White House officials, this was an admitted attempt to straddle the issue. Why he would want to straddle is understandable. The right wing of his party, which has always been suspicious of Mr. Bush, is pushing him to uphold what it regards as the Reagan legacy on this issue. The original gag rule, which prevented even physicians from discussing abortion as an option in almost all cases, was issued in the last president's second term and upheld last year by the Supreme Court. Give Planned Parenthood credit for sticking to its principles. A lot of recipients of all sorts of federal funds want it both ways: take the money but not accept federal policy guidelines. When they find they can't, many "rise above principle," take the money and adjust policy accordingly. It is not going to be easy for Planned Parenthood now. Federal funds account for a significant portion of the organization's budget. Planned Parenthood of Maryland, for example, gets about $500,000 a year from the federal government, or about 12-13% of its total budget. It will either have to cut back on its services, increase its fundraising from other sources, or charge women more for services--or all of those things. This is not the end of the story. It is certainly not the end of the political story. Pat Buchanan said of the new regulations, "I like the old position, to be quite candid." Thank goodness he never won a primary. George Bush would not have moved even as far as he did on the gag rule. There will be a lot of agreement with the Buchanan view at the

  14. The Principles of War Reconsidered

    DTIC Science & Technology

    2009-06-01

    Africanus, Julius Caesar, Genghis Khan, Suleiman the Magnificent, Jomini, Douhet, and Giap (John M. Collins, Military Strategy: Principles, Practices...Alexander, Hannibal and Caesar have all acted on the same principles. To keep your forces united, to be vulnerable at no point, to bear down with...and circumstances. Originally, the principles were not expressed by the warriors (most simply did not have the time to write – Caesar being a notable

  15. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will
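
    A hedged, minimal illustration of a polynomial chaos expansion of the kind named above (the paper's actual kinetic models and data are far richer): a toy response to a log-normal rate multiplier is expanded in probabilists' Hermite polynomials by least-squares collocation, and the mean and variance are read off the coefficients. The response function and the 10% rate spread are assumptions.

        # Minimal 1-D polynomial-chaos sketch: expand a model response in
        # Hermite polynomials of a standard normal "germ" and recover the
        # response statistics from the expansion coefficients.
        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermevander

        rng = np.random.default_rng(1)

        def model(k):
            return 1.0 / k                  # toy response to a rate multiplier

        xi = rng.standard_normal(2000)      # standard-normal germ
        k = np.exp(0.1 * xi)                # log-normal uncertain rate factor
        order = 4
        Phi = hermevander(xi, order)        # probabilists' Hermite basis He_n
        coef, *_ = np.linalg.lstsq(Phi, model(k), rcond=None)

        # <He_n^2> = n! under the standard normal weight
        norms = np.array([math.factorial(n) for n in range(order + 1)])
        mean, var = coef[0], np.sum(coef[1:] ** 2 * norms[1:])
        print(f"PCE mean = {mean:.4f}, PCE std = {var ** 0.5:.4f}")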

  16. Heisenberg's uncertainty principle for simultaneous measurement of positive-operator-valued measures

    NASA Astrophysics Data System (ADS)

    Miyadera, Takayuki; Imai, Hideki

    2008-11-01

    A limitation on the simultaneous measurement of two arbitrary positive-operator-valued measures is discussed. In general, simultaneous measurement of two noncommutative observables is only approximately possible. Following Werner’s formulation, we introduce a distance between observables to quantify the accuracy of measurement. We derive an inequality that relates the achievable accuracy to the noncommutativity of the two observables. As a byproduct, a necessary condition for two positive-operator-valued measures to be simultaneously measurable is obtained.
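
    As context (not the paper's inequality, which is stated for general POVMs), the noncommutativity that obstructs joint measurability can be quantified numerically; a minimal sketch for two sharp qubit observables:

        # Compute the operator norm of the commutator [A, B], a standard
        # quantifier of the noncommutativity that limits joint measurement.
        import numpy as np

        sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli sigma_x
        sz = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli sigma_z

        comm = sx @ sz - sz @ sx
        norm = np.linalg.norm(comm, 2)       # largest singular value
        print(f"||[sigma_x, sigma_z]|| = {norm:.3f}")    # = 2, maximal for qubits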

  17. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  18. Certainty in Heisenberg's uncertainty principle: Revisiting definitions for estimation errors and disturbance

    NASA Astrophysics Data System (ADS)

    Dressel, Justin; Nori, Franco

    2014-02-01

    We revisit the definitions of error and disturbance recently used in error-disturbance inequalities derived by Ozawa and others by expressing them in the reduced system space. The interpretation of the definitions as mean-squared deviations relies on an implicit assumption that is generally incompatible with the Bell-Kochen-Specker-Spekkens contextuality theorems, and which results in averaging the deviations over a non-positive-semidefinite joint quasiprobability distribution. For unbiased measurements, the error admits a concrete interpretation as the dispersion in the estimation of the mean induced by the measurement ambiguity. We demonstrate how to directly measure not only this dispersion but also every observable moment with the same experimental data, and thus demonstrate that perfect distributional estimations can have nonzero error according to this measure. We conclude that the inequalities using these definitions do not capture the spirit of Heisenberg's eponymous inequality, but do indicate a qualitatively different relationship between dispersion and disturbance that is appropriate for ensembles being probed by all outcomes of an apparatus. To reconnect with the discussion of Heisenberg, we suggest alternative definitions of error and disturbance that are intrinsic to a single apparatus outcome. These definitions naturally involve the retrodictive and interdictive states for that outcome, and produce complementarity and error-disturbance inequalities that have the same form as the traditional Heisenberg relation.
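
    For reference, the Ozawa-type relation whose error ε and disturbance η definitions the paper scrutinizes is commonly written, with σ denoting the state's standard deviations, as:

        \varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|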

  19. Guidelines for Risk and Uncertainty Analysis in Water Resources Planning. Volume 1. Principles - With Technical Appendices

    DTIC Science & Technology

    1992-03-01

    of such loss (3) the product of the amount that may be lost and the probability of losing it." The Risk and Insurance Management Society (RIMS) in...for Water Resources, August 1989. Rescher, N. Risk Washington D.C.: University Press of America, 1983. Risk and Insurance Management Society. Risk

  20. Does a String-Particle Dualism Indicate the Uncertainty Principle's Philosophical Dichotomy?

    NASA Astrophysics Data System (ADS)

    Mc Leod, David; Mc Leod, Roger

    2007-04-01

    String theory may allow resonances of neutrino-wave-strings to account for all experimentally detected phenomena. Particle theory logically, and physically, provides an alternate, contradictory dualism. Is it contradictory to state, symbolically and simultaneously, that λp = h while the product of position and momentum must be greater than or equal to the same (scaled) Planck's constant? Our previous electron and positron models require `membrane' vibrations of string-linked neutrinos, in closed loops, to behave like traveling waves, Tws, intermittently metamorphosing into alternately ascending and descending standing waves, Sws, between the nodes, which advance sequentially through 360 degrees. Accumulated time passages as Tws detail required ``loop currents'' supplying magnetic moments. The remaining time partitions into the Sws' alternately ascending and descending phases: the physical basis of the experimentally established 3D modes of these ``particles.'' Waves seem to indicate that point mass cannot be required to exist instantaneously at one point; Mott's and Sneddon's Wave Mechanics says that a constant, [mass], is present. String-like resonances may also account for homeopathy's efficacy, dark matter, and constellations' ``stick-figure projections,'' as indicated by some traditional cultures, all possibly involving neutrino strings. To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2007.NES07.C2.5

  1. Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.

    ERIC Educational Resources Information Center

    Wassermann, Selma

    2001-01-01

    Argues that reliance on the outcomes of quantitative standardized tests to assess student performance is a misplaced quest for certainty in an uncertain world. Reviews and lauds a Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill.…

  2. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  3. Using Principles of Programmed Instruction

    ERIC Educational Resources Information Center

    Huffman, Harry

    1971-01-01

    Although programmed instruction in accounting is available, it is limited in scope and in acceptance. Teachers, however, may apply principles of programming to the individualizing of instruction. (Author)

  4. Vedic principles of therapy.

    PubMed

    Boyer, R W

    2012-01-01

    This paper introduces Vedic principles of therapy as a holistic integration of healing and human development. The most integrative aspect is a "consciousness-based" approach in which the bottom line of the mind is consciousness itself, accessed by transcending mental activity to its simplest ground state. This directly contrasts with "unconscious-based" approaches, which hold that the basis of conscious mind is the unconscious, such as analytic, humanistic, and cognitive-behavioral approaches. Although not presented as a specific therapeutic approach, interventions associated with this Vedic approach have extensive support in the applied research literature. A brief review of experimental research toward a general model of mind, and of cutting-edge developments in quantum physics toward nonlocal mind, shows a convergence on the ancient Vedic model of mind. Comparisons with contemporary therapies further show that the simplicity, subtlety, and holistic nature of the Vedic approach represent a significant advance over approaches which have overlooked the fundamental ground state of the mind.

  5. Bateman's principle and immunity.

    PubMed Central

    Rolff, Jens

    2002-01-01

    The immunocompetence handicap hypothesis (ICHH) of Folstad and Karter has inspired a large number of studies that have tried to understand the causal basis of parasite-mediated sexual selection. Even though this hypothesis is based on the double function of testosterone, a hormone restricted to vertebrates, studies of invertebrates have tended to provide central support for specific predictions of the ICHH. I propose an alternative hypothesis that explains many of the findings without relying on testosterone or other biochemical feedback loops. This alternative is based on Bateman's principle, that males gain fitness by increasing their mating success whilst females increase fitness through longevity because their reproductive effort is much higher. Consequently, I predict that females should invest more in immunity than males. The extent of this dimorphism is determined by the mating system and the genetic correlation between males and females in immune traits. In support of my arguments, I mainly use studies on insects that share innate immunity with vertebrates and have the advantage that they are easier to study. PMID:11958720

  6. Great Lakes Literacy Principles

    NASA Astrophysics Data System (ADS)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  7. Principles of alternative gerontology

    PubMed Central

    Bilinski, Tomasz; Bylak, Aneta; Zadrag-Tecza, Renata

    2016-01-01

    Surveys of taxonomic groups of animals have shown that, contrary to the opinion of most gerontologists, aging is not a genuine trait. The process of aging is not universal and its mechanisms have not been widely conserved among species. All life forms are subject to extrinsic and intrinsic destructive forces. Destructive effects of stochastic events are visible only when allowed by the specific life program of an organism. Effective life programs of immortality and high longevity eliminate the impact of unavoidable damage. Organisms that are capable of agametic reproduction are biologically immortal. Mortality of an organism is clearly associated with terminal specialisation in sexual reproduction. The longevity phenotype that is not accompanied by symptoms of senescence has been observed in those groups of animals that continue to increase their body size after reaching sexual maturity. This is the result of enormous regeneration abilities of both of the above-mentioned groups. Senescence is observed when: (i) an organism by principle switches off the expression of existing growth and regeneration programs, as in the case of imago formation in insect development; (ii) particular programs of growth and regeneration of progenitors are irreversibly lost, either partially or in their entirety, in mammals and birds. “We can't solve problems by using the same kind of thinking we used when we created them.” (Ascribed to Albert Einstein) PMID:27017907

  8. Basic principles of stability.

    PubMed

    Egan, William; Schofield, Timothy

    2009-11-01

    An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Statistical tools such as least squares regression analysis should be employed to model potency decay. The use of such tools provides incentive to properly design vaccine stability studies, while holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics such as Arrhenius behavior help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. Design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
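
    A minimal sketch of the Arrhenius extrapolation mentioned above, with invented numbers (the activation energy, rates, and temperatures are all assumptions, and zero-order decay is assumed for the shelf-life estimate):

        # Extrapolate a degradation rate measured at an accelerated
        # temperature down to the storage temperature via Arrhenius scaling.
        import math

        R = 8.314            # gas constant [J/(mol K)]
        Ea = 80e3            # assumed activation energy [J/mol]
        k_accel = 0.020      # assumed potency loss per month at 37 C
        T_accel, T_store = 310.15, 278.15    # 37 C and 5 C in kelvin

        k_store = k_accel * math.exp(-Ea / R * (1.0 / T_store - 1.0 / T_accel))
        print(f"predicted loss rate at 5 C: {k_store:.5f} per month")

        shelf_life = 0.20 / k_store          # months until 20% potency loss
        print(f"rough shelf life: {shelf_life:.0f} months")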

  9. Uncertainty reasoning in expert systems

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik

    1993-01-01

    Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
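
    A minimal sketch of the three choices enumerated above, using common defaults rather than the optimal, group-theoretically derived ones the abstract refers to: triangular membership functions, minimum for "and", and centroid defuzzification. All shapes and limits are invented.

        # One fuzzy rule, "if velocity is big and distance is small, brake
        # hard", evaluated with common default choices.
        import numpy as np

        def tri(x, a, b, c):
            # triangular membership function peaking at b
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        velocity, distance = 80.0, 15.0          # crisp inputs (invented)
        big_v = tri(velocity, 40, 100, 160)      # (1) representation of "big"
        small_d = tri(distance, 0, 0.001, 40)    #     and of "small"
        fire = min(big_v, small_d)               # (2) "and" as minimum

        brake = np.linspace(0.0, 1.0, 101)       # candidate brake commands
        hard = tri(brake, 0.5, 1.0, 1.5)         # output set "brake hard"
        clipped = np.minimum(hard, fire)         # rule activation
        u = np.sum(brake * clipped) / np.sum(clipped)   # (3) centroid defuzzify
        print(f"rule strength = {fire:.2f}, brake command = {u:.2f}")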

  10. Communicating Storm Surge Forecast Uncertainty

    NASA Astrophysics Data System (ADS)

    Troutman, J. A.; Rhome, J.

    2015-12-01

    When it comes to tropical cyclones, storm surge is often the greatest threat to life and property along the coastal United States. The coastal population density has dramatically increased over the past 20 years, putting more people at risk. Informing emergency managers, decision-makers, and the public about the potential for wind-driven storm surge, however, has been extremely difficult. Recently, the Storm Surge Unit at the National Hurricane Center in Miami, Florida has developed a prototype experimental storm surge watch/warning graphic to help communicate this threat more effectively by identifying areas most at risk for life-threatening storm surge. This prototype is the initial step in the transition toward a NWS storm surge watch/warning system and highlights the inundation levels that have a 10% chance of being exceeded. The guidance for this product is the Probabilistic Hurricane Storm Surge (P-Surge) model, which predicts the probability of various storm surge heights by statistically evaluating numerous SLOSH model simulations. Questions remain, however, as to whether exceedance values other than 10% may be of equal importance to forecasters. P-Surge data from Hurricane Arthur (2014) is used to ascertain the practicality of incorporating other exceedance data into storm surge forecasts. Extracting forecast uncertainty information by analyzing P-Surge exceedances overlaid with track and wind intensity forecasts proves to be beneficial for forecasters and decision support.
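
    A minimal sketch of how such exceedance levels come out of an ensemble (the surge distribution here is invented; P-Surge's actual machinery is far richer): the level with a 10% chance of being exceeded is the 90th percentile, and other exceedance probabilities follow the same way.

        # Exceedance levels from an ensemble of simulated surge heights.
        import numpy as np

        rng = np.random.default_rng(7)
        surge_ft = rng.lognormal(mean=1.0, sigma=0.5, size=10000)  # toy ensemble

        for p in (10, 25, 50):               # exceedance probabilities [%]
            level = np.percentile(surge_ft, 100 - p)
            print(f"{p:2d}% chance of exceeding {level:.1f} ft")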

  11. [The beginning of the first principles: the anthropic principle].

    PubMed

    González de Posada, Francisco

    2004-01-01

    The nowadays classical Anthropic Principle is put both in the historical perspective of the traditional problem of "the place of man in the Universe", and in the confluence of several scientific "border" issues, some of which, due to their problematical nature, are also subject of philosophical analysis. On the one hand, the scientific uses of the Principle, related to the initial and constitutional conditions of "our Universe", are enumerated, as they are supposedly necessary for the appearance and consequent development of Life--up to Man--. On the other, an organized collection of the principles of today's Physics is synthetically exhibited. The object of this work is to determine the intrinsic scientific nature of the Anthropic Principle, and the role it plays in the global frame of the principles of Physics (Astrophysics, Astrobiology and Cosmology).

  12. The need for model uncertainty analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  13. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  14. Uncertainty and Engagement with Learning Games

    ERIC Educational Resources Information Center

    Howard-Jones, Paul A.; Demetriou, Skevi

    2009-01-01

    Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.…

  15. Uncertainties in the JPL planetary ephemeris

    NASA Astrophysics Data System (ADS)

    Folkner, W. M.

    2011-10-01

    The numerically integrated planetary ephemerides by JPL, IMCCE, and IPA are largely based on the same observation set and dynamical models. The differences between ephemerides are expected to be consistent within uncertainties. Uncertainties in the orbits of the major planets and the dwarf planet Pluto based on recent analysis at JPL are described.

  16. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  17. Methods of Dealing with Uncertainty: Panel Presentation.

    ERIC Educational Resources Information Center

    Pearce, Frank C.

    Rising energy costs, changing tax bases, increasing numbers of non-traditional students, and ever changing educational technology point to the fact that community college administrators will have to accept uncertainty as a normal planning component. Rather than ignoring uncertainty, a tendency that was evidenced in the reluctance of administrators…

  18. Uncertainty Propagation for Terrestrial Mobile Laser Scanner

    NASA Astrophysics Data System (ADS)

    Mezian, c.; Vallet, Bruno; Soheilian, Bahman; Paparoditis, Nicolas

    2016-06-01

    Laser scanners are used more and more in mobile mapping systems. They provide 3D point clouds that are used for object reconstruction and for registration of the system. For both of these applications, uncertainty analysis of the 3D points is of great interest but is rarely investigated in the literature. In this paper we present a complete pipeline that takes into account all sources of uncertainty and allows a covariance matrix to be computed for each 3D point. The sources of uncertainty are the laser scanner, the calibration of the scanner with respect to the vehicle, and the direct georeferencing system. We assume that all uncertainties are Gaussian. The variances of the laser scanner measurements (two angles and one distance) are usually provided by the manufacturer. This is also the case for integrated direct georeferencing devices. Residuals of the calibration process were used to estimate the covariance matrix of the 6D transformation between the laser scanner and the vehicle system. Knowing the variances of all sources of uncertainty, we applied the uncertainty propagation technique to compute the variance-covariance matrix of every 3D point obtained. Such an uncertainty analysis makes it possible to estimate the impact of different laser scanners and georeferencing devices on the quality of the obtained 3D points. The obtained uncertainty values were illustrated using error ellipsoids on different datasets.
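
    A hedged sketch of the per-point propagation described above, reduced to the sensor term only (the range/angle variances are invented; the calibration and georeferencing terms would enter the same way, as extra blocks in the measurement covariance):

        # Linearize the spherical-to-Cartesian conversion and push the
        # assumed-Gaussian sensor covariance through the Jacobian:
        # C_xyz = J C_m J^T, one point at a time.
        import numpy as np

        r, theta, phi = 25.0, 0.3, 1.1               # range [m], two angles [rad]
        C_m = np.diag([0.008**2, 2e-4**2, 2e-4**2])  # invented sensor variances

        ct, st = np.cos(theta), np.sin(theta)
        cp, sp = np.cos(phi), np.sin(phi)
        point = np.array([r * ct * cp, r * ct * sp, r * st])

        # Jacobian of (x, y, z) with respect to (r, theta, phi)
        J = np.array([[ct * cp, -r * st * cp, -r * ct * sp],
                      [ct * sp, -r * st * sp,  r * ct * cp],
                      [st,       r * ct,       0.0]])

        C_xyz = J @ C_m @ J.T                        # 3x3 covariance of the point
        print("point [m]:", np.round(point, 3))
        print("1-sigma x, y, z [m]:", np.round(np.sqrt(np.diag(C_xyz)), 4))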

  19. Estimating the uncertainty in underresolved nonlinear dynamics

    SciTech Connect

    Chorin, Alexandre; Hald, Ole

    2013-06-12

    The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.

  20. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently, a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
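
    A minimal sketch of the ray-count trade-off under the usual Monte Carlo assumption that sampling error falls as 1/sqrt(N) (the 1000-ray error and the error budget are invented):

        # Pick the smallest ray count meeting an error budget, assuming the
        # relative error scales as 1/sqrt(N).
        import math

        sigma1 = 0.05     # assumed relative error observed with N0 = 1000 rays
        target = 0.01     # required relative error
        N0 = 1000

        N_needed = N0 * (sigma1 / target) ** 2
        print(f"rays needed for {target:.0%} error: ~{math.ceil(N_needed):,}")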

  1. Uncertainty Propagation in an Ecosystem Nutrient Budget.

    EPA Science Inventory

    New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of freedom ...
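
    A minimal sketch of the classical propagation step for a budget that sums independent terms (all values invented): standard errors combine in quadrature.

        # Net flux and its standard error for a sum of independent budget terms.
        import math

        terms = {"river_in": (120.0, 15.0), "atmos_in": (30.0, 8.0),
                 "export_out": (-95.0, 12.0), "burial_out": (-20.0, 5.0)}

        net = sum(v for v, _ in terms.values())
        se = math.sqrt(sum(s**2 for _, s in terms.values()))  # independence assumed
        print(f"net flux = {net:.1f} +/- {se:.1f} (1 SE)")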

  2. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relevant to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  3. Identifying uncertainties in Arctic climate change projections

    NASA Astrophysics Data System (ADS)

    Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

    2013-06-01

    Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

  4. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  5. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  6. Nonclassicality in phase-number uncertainty relations

    SciTech Connect

    Matia-Hernando, Paloma; Luis, Alfredo

    2011-12-15

    We show that there are nonclassical states with lesser joint fluctuations of phase and number than any classical state. This is rather paradoxical since one would expect classical coherent states to be always of minimum uncertainty. The same result is obtained when we replace phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using variance and Holevo relation.

  7. The Stock Market: Risk vs. Uncertainty.

    ERIC Educational Resources Information Center

    Griffitts, Dawn

    2002-01-01

    This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  11. The inconstant "principle of constancy".

    PubMed

    Kanzer, M

    1983-01-01

    A review of the principle of constancy, as it appeared in Freud's writings, shows that it was inspired by his clinical observations, first with Breuer in the field of cathartic therapy and then through experiences in the early usage of psychoanalysis. The recognition that memories repressed in the unconscious created increasing tension, and that this was relieved with dischargelike phenomena when the unconscious was made conscious, was the basis for his claim to originality in this area. The two principles of "neuronic inertia" Freud expounded in the Project (1895) are found to offer the key to the ambiguous definition of the principle of constancy he was to offer in later years. The "original" principle, which sought the complete discharge of energy (or elimination of stimuli), became the forerunner of the death drive; the "extended" principle achieved balances that were relatively constant, but succumbed in the end to complete discharge. This was the predecessor of the life drives. The relation between the constancy and pleasure-unpleasure principles was maintained for twenty-five years largely on an empirical basis which invoked the concept of psychophysical parallelism between "quantity" and "quality." As the links between the two principles were weakened by clinical experiences attendant upon the growth of ego psychology, a revision of the principle of constancy was suggested, and it was renamed the Nirvana principle. Actually it was shifted from alignment with the "extended" principle of inertia to the original, so that "constancy" was incongruously identified with self-extinction. The former basis for the constancy principle, the extended principle of inertia, became identified with Eros. Only a few commentators seem aware of this radical transformation, which has been overlooked in the Standard Edition of Freud's writings. Physiological biases in the history and conception of the principle of constancy are noted in the Standard Edition. The historical

  12. Position-momentum uncertainty relations in the presence of quantum memory

    SciTech Connect

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are measured in terms of entropies, which provide a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove the security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes, not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies the security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
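
    For context, the finite-dimensional quantum-memory relation (Berta et al.) that results of this kind generalize is commonly quoted as follows, where |ψ_j⟩ and |φ_k⟩ are the eigenvectors of the two observables R and S measured on system A, and B is the quantum memory:

        H(R|B) + H(S|B) \;\ge\; \log_2 \frac{1}{c} + H(A|B), \qquad c = \max_{j,k} \bigl| \langle \psi_j | \phi_k \rangle \bigr|^2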

  13. Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods

    SciTech Connect

    Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

    2013-12-01

    The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst violations of its assumptions, CILTS should be considered as having something like a "factor of two" uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
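
    A first-principles sketch of why the error budget behaves this way (all numbers invented): the CILTS estimate of the effective ventilation flow is emission rate over mean tracer concentration, so relative errors combine in quadrature.

        # Effective ventilation flow Q = F / C and its relative uncertainty.
        import math

        F = 0.50e-6          # assumed tracer emission rate [m^3/h]
        C = 5.0e-9           # assumed mean tracer volume fraction [-]
        rel_F, rel_C = 0.05, 0.10    # assumed relative uncertainties

        Q = F / C                            # effective flow [m^3/h]
        rel_Q = math.sqrt(rel_F**2 + rel_C**2)
        print(f"Q = {Q:.0f} m3/h +/- {100 * rel_Q:.0f}%")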

  14. Contending with uncertainty in conservation management decisions

    PubMed Central

    McCarthy, Michael A

    2014-01-01

    Efficient conservation management is particularly important because current spending is estimated to be insufficient to conserve the world's biodiversity. However, efficient management is confounded by uncertainty that pervades conservation management decisions. Uncertainties exist in objectives, dynamics of systems, the set of management options available, the influence of these management options, and the constraints on these options. Probabilistic and nonprobabilistic quantitative methods can help contend with these uncertainties. The vast majority of these account for known epistemic uncertainties, with methods optimizing the expected performance or finding solutions that achieve minimum performance requirements. Ignorance and indeterminacy continue to confound environmental management problems. While quantitative methods to account for uncertainty must aid decisions if the underlying models are sufficient approximations of reality, whether such models are sufficiently accurate has not yet been examined. PMID:25138920

  15. Quantifying uncertainty in future ocean carbon uptake

    NASA Astrophysics Data System (ADS)

    Dunne, John P.

    2016-10-01

    Attributing uncertainty in ocean carbon uptake between societal trajectory (scenarios), Earth System Model construction (structure), and inherent natural variation in climate (internal) is critical to make progress in identifying, understanding, and reducing those uncertainties. In the present issue of Global Biogeochemical Cycles, Lovenduski et al. (2016) disentangle these drivers of uncertainty in ocean carbon uptake over time and space and assess the resulting implications for the emergence timescales of structural and scenario uncertainty over internal variability. Such efforts are critical for establishing realizable and efficient monitoring goals and prioritizing areas of continued model development. Under recently proposed climate stabilization targets, such efforts to partition uncertainty also become increasingly critical to societal decision-making in the context of carbon stabilization.

  16. Habitable zone dependence on stellar parameter uncertainties

    SciTech Connect

    Kane, Stephen R.

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as the region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify how uncertainties in the host star's parameters translate into uncertainties in the HZ boundaries. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
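
    A hedged sketch of the translation step, using the common scaling of HZ boundary distance with sqrt(L/S_eff) (the stellar luminosity, its uncertainty, and the effective-flux values are invented; the paper's HZ prescription may differ):

        # First-order propagation: d = sqrt(L / S_eff) implies
        # (dd / d) = 0.5 * (dL / L).
        import math

        L, dL = 0.65, 0.10           # luminosity and 1-sigma error [L_sun]
        for name, S_eff in (("inner", 1.05), ("outer", 0.35)):
            d = math.sqrt(L / S_eff)             # boundary distance [AU]
            dd = 0.5 * d * (dL / L)              # propagated uncertainty
            print(f"{name} HZ edge: {d:.2f} +/- {dd:.2f} AU")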

  17. Uncertainties in derived temperature-height profiles

    NASA Technical Reports Server (NTRS)

    Minzner, R. A.

    1974-01-01

    Nomographs were developed for relating the uncertainty in temperature T to the uncertainty in the observed height profiles of both pressure p and density ρ. The relative uncertainty δT/T is seen to depend not only upon the relative uncertainties δp/p or δρ/ρ, and to a small extent upon the value of T or H, but primarily upon the sampling-height increment Δh, the height increment between successive observations of p or ρ. For a fixed value of δp/p, the value of δT/T varies inversely with Δh. No limit exists in the fineness of usable height resolution of T which may be derived from densities, while a fine height resolution in pressure-height data leads to temperatures with unacceptably large uncertainties.
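
    One plausible reconstruction of the scaling behind these statements, from the hypsometric equation for the mean temperature of a layer bounded by two pressure observations (g is gravity, R the specific gas constant for air; the √2 assumes independent, equal relative pressure errors):

        \bar T = \frac{g\,\Delta h}{R\,\ln(p_1/p_2)}, \qquad \frac{\delta T}{T} \approx \frac{\sqrt{2}}{\ln(p_1/p_2)}\,\frac{\delta p}{p} \approx \frac{\sqrt{2}\,R\,\bar T}{g\,\Delta h}\,\frac{\delta p}{p}

    so that for fixed δp/p the relative temperature uncertainty indeed varies inversely with Δh.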

  18. Multicandidate Elections: Aggregate Uncertainty in the Laboratory*

    PubMed Central

    Bouton, Laurent; Castanheira, Micael; Llorente-Saguer, Aniol

    2015-01-01

    The rational-voter model is often criticized on the grounds that two of its central predictions (the paradox of voting and Duverger’s Law) are at odds with reality. Recent theoretical advances suggest that these empirically unsound predictions might be an artifact of an (arguably unrealistic) assumption: the absence of aggregate uncertainty about the distribution of preferences in the electorate. In this paper, we propose direct empirical evidence of the effect of aggregate uncertainty in multicandidate elections. Adopting a theory-based experimental approach, we explore whether aggregate uncertainty indeed favors the emergence of non-Duverger’s law equilibria in plurality elections. Our experimental results support the main theoretical predictions: sincere voting is a predominant strategy under aggregate uncertainty, whereas without aggregate uncertainty, voters massively coordinate their votes behind one candidate, who wins almost surely. PMID:28298811

  19. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their lifetime of exposure, PV systems show a gradual decline that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to conduct a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor, which can be avoided through regular calibration, can lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
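
    A minimal sketch of the Monte Carlo idea (all magnitudes invented): simulate performance series with a known decline, noise, and a slowly drifting sensor, refit the trend many times, and examine the spread of the fitted rates.

        # Spread of fitted degradation rates when a random linear sensor
        # drift is superimposed on the true decline.
        import numpy as np

        rng = np.random.default_rng(42)
        years = np.arange(0.0, 10.0, 1.0 / 12.0)    # monthly data, 10 years
        true_rate = -0.5e-2                          # -0.5%/year true decline

        rates = []
        for _ in range(2000):
            drift = rng.uniform(-0.1e-2, 0.1e-2) * years   # sensor drift
            noise = rng.normal(0.0, 0.01, years.size)
            perf = 1.0 + true_rate * years + drift + noise
            rates.append(np.polyfit(years, perf, 1)[0])    # fitted slope

        rates = np.asarray(rates)
        print(f"mean fitted rate {100 * rates.mean():+.2f} %/yr, "
              f"spread {100 * rates.std():.2f} %/yr (true {100 * true_rate:+.2f})")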

  20. Principles of periodontology.

    PubMed

    Dentino, Andrew; Lee, Seokwoo; Mailhot, Jason; Hefti, Arthur F

    2013-02-01

    Periodontal diseases are among the most common diseases affecting humans. Dental biofilm is a contributor to the etiology of most periodontal diseases. It is also widely accepted that immunological and inflammatory responses to biofilm components are manifested by signs and symptoms of periodontal disease. The outcome of such interaction is modulated by risk factors (modifiers), either inherent (genetic) or acquired (environmental), significantly affecting the initiation and progression of different periodontal disease phenotypes. While definitive genetic determinants responsible for either susceptibility or resistance to periodontal disease have yet to be identified, many factors affecting the pathogenesis have been described, including smoking, diabetes, obesity, medications, and nutrition. Currently, periodontal diseases are classified based upon clinical disease traits using radiographs and clinical examination. Advances in genomics, molecular biology, and personalized medicine may result in new guidelines for unambiguous disease definition and diagnosis in the future. Recent studies have implied relationships between periodontal diseases and systemic conditions. Answering critical questions regarding host-parasite interactions in periodontal diseases may provide new insight in the pathogenesis of other biomedical disorders. Therapeutic efforts have focused on the microbial nature of the infection, as active treatment centers on biofilm disruption by non-surgical mechanical debridement with antimicrobial and sometimes anti-inflammatory adjuncts. The surgical treatment aims at gaining access to periodontal lesions and correcting unfavorable gingival/osseous contours to achieve a periodontal architecture that will provide for more effective oral hygiene and periodontal maintenance. In addition, advances in tissue engineering have provided innovative means to regenerate/repair periodontal defects, based upon principles of guided tissue regeneration and utilization