The Conditional Uncertainty Principle
Gilad Gour; Varun Narasimhachar; Andrzej Grudka; Michał Horodecki; Waldemar Kłobus; Justyna Łodyga
2015-06-23
The uncertainty principle, which states that certain sets of quantum-mechanical measurements have a minimal joint uncertainty, has many applications in quantum cryptography. But in such applications, it is important to consider the effect of a (sometimes adversarially controlled) memory that can be correlated with the system being measured: The information retained by such a memory can in fact diminish the uncertainty of measurements. Uncertainty conditioned on a memory was considered in the past by Berta et al. (Ref. 1), who found a specific uncertainty relation in terms of the von Neumann conditional entropy. But this entropy is not the only measure that can be used to quantify conditional uncertainty. In the spirit of recent work by several groups (Refs. 2--6), here we develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent form. Our formalism is built around a mathematical relation that we call conditional majorization. We define and characterize conditional majorization, and use it to develop tools for the construction of measures of the conditional uncertainty of individual measurements, and also of the joint conditional uncertainty of sets of measurements. We demonstrate the use of this framework by deriving measure-independent conditional uncertainty relations of two types: (1) A lower bound on the minimal joint uncertainty that two remote parties (Bob and Eve) have about the outcome of a given pair of measurements performed by a third remote party (Alice), conditioned on arbitrary measurements that Bob and Eve make on their own systems. This lower bound is independent of the initial state shared by the three parties; (2) An initial state--dependent lower bound on the minimal joint uncertainty that Bob has about Alice's pair of measurements in a bipartite setting, conditioned on Bob's quantum system.
Uncertainty Principle Respects Locality
Dongsheng Wang
2015-04-19
The notion of nonlocality implicitly suggests that some kind of spooky action at a distance exists in nature; however, the validity of quantum mechanics has been well tested to date. In this work it is argued that the notion of nonlocality is physically improper, and that the basic principle of locality in nature is well respected by quantum mechanics, namely through the uncertainty principle. We show that the quantum bound on the Clauser, Horne, Shimony, and Holt (CHSH) inequality can be recovered from the uncertainty relation in a multipartite setting. We further argue that the super-quantum correlation demonstrated by the nonlocal box is not physically comparable with the quantum one. The origin of the quantum structure of nature still remains to be explained; some post-quantum theory that is more complete in some sense than quantum mechanics is possible and need not be a hidden-variable theory.
Uncertainty Principles and Sparse Representation in Overcomplete Dictionaries
Hirn, Matthew
Lecture slides. Outline: uncertainty principles; sparse representation in overcomplete dictionaries; the time/frequency dictionary; two orthonormal bases; arbitrary dictionaries.
Gerbes and Heisenberg's Uncertainty Principle
J. M. Isidro
2006-03-31
We prove that a gerbe with a connection can be defined on classical phase space, taking the U(1)-valued phase of certain Feynman path integrals as Cech 2-cocycles. A quantisation condition on the corresponding 3-form field strength is proved to be equivalent to Heisenberg's uncertainty principle.
Review on Generalized Uncertainty Principle
Abdel Nasser Tawfik; Abdel Magied Diab
2015-09-22
Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
Directional Uncertainty Principle for Quaternion Fourier Transform
Eckhard Hitzer
2013-06-06
This paper derives a new directional uncertainty principle for quaternion valued functions subject to the quaternion Fourier transformation. This can be generalized to establish directional uncertainty principles in Clifford geometric algebras with quaternion subalgebras. We demonstrate this with the example of a directional spacetime algebra function uncertainty principle related to multivector wave packets.
Quantum Mechanics and the Generalized Uncertainty Principle
Jang Young Bang; Micheal S. Berger
2006-11-30
The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
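As a reminder of the standard result this abstract builds on (our sketch, not the paper's specific discrete model): the quadratic GUP implies a smallest achievable position uncertainty.

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta\,\Delta p^{2}\right)
\quad\Longrightarrow\quad
\Delta x \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta p}+\beta\,\Delta p\right)
```

Minimizing the right-hand side over Δp (the minimum occurs at Δp = 1/√β) gives Δx_min = ħ√β, the minimal length that minimum-uncertainty wave packets saturate.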
Gamma-Ray Telescope and Uncertainty Principle
ERIC Educational Resources Information Center
Shivalingaswamy, T.; Kagali, B. A.
2012-01-01
Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…
Disturbance, the uncertainty principle and quantum optics
NASA Technical Reports Server (NTRS)
Martens, Hans; Demuynck, Willem M.
1993-01-01
It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.
The Species Delimitation Uncertainty Principle
Adams, Byron J.
2001-01-01
If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874
The uncertainty principle as an evolutionary engine.
Matsuno, K
1992-01-01
Heisenberg's uncertainty principle in quantum mechanics underlies the genesis of evolutionary variability. When the uncertainty principle is coupled with the incontrovertible principle of the conservation of energy and material resources, there appears an uncertainty relationship between local fluctuations in the quantities to be conserved on a global scale and the rate of their local variation. Since the local fluctuations are accompanied by the non-vanishing rate of variation because of the uncertainty relationship, they generate subsequent fluctuations. Generativity latent in the uncertainty relationship is non-random and ubiquitous all through various evolutionary stages from abiotic synthesis of monomers and polymers up to the emergence of behavior-induced variability of organisms. PMID:1457736
Generalized uncertainty principle and black hole entropy
NASA Astrophysics Data System (ADS)
Ren, Zhao; Sheng-Li, Zhang
2006-10-01
Recently, much attention has been devoted to resolving the quantum corrections to the Bekenstein-Hawking black hole entropy. In particular, many researchers have taken a keen interest in the coefficient of the logarithmic term of the black hole entropy correction. In this Letter, we calculate the correction to the black hole entropy by utilizing the generalized uncertainty principle, and obtain the correction terms of entropy, temperature and energy caused by the generalized uncertainty principle. We calculate the Cardy-Verlinde formula after considering the correction. In our calculation, we assume only that the Bekenstein-Hawking area theorem remains valid after considering the generalized uncertainty principle, and introduce no other assumptions. Throughout, the physical idea is clear and the calculation is simple. It offers a new way of studying the corrections caused by the generalized uncertainty principle to the black hole thermodynamic quantities of complicated spacetimes.
Noncommutativity, generalized uncertainty principle and FRW cosmology
A. Bina; K. Atazadeh; S. Jalalzadeh
2007-09-23
We consider the effects of noncommutativity and the generalized uncertainty principle on the FRW cosmology with a scalar field. We show that the cosmological constant problem and the removability of the initial curvature singularity find natural solutions in this scenario.
Uncertainty Principle: Classic and Quantum Aspects
N. V. Brazovskaja; V. Ye Brazovsky
2003-06-25
Some aspects of the application of the uncertainty principle to the interaction of radiation with matter are surveyed. An adjustment procedure is proposed for calculating values of the electromagnetic energy in quantum field theory.
Curriculum in Art Education: The Uncertainty Principle.
ERIC Educational Resources Information Center
Sullivan, Graeme
1989-01-01
Identifies curriculum as the pivotal link between theory and practice, noting that all stages of curriculum research and development are characterized by elements of uncertainty. States that this uncertainty principle reflects the reality of practice as it mirrors the contradictory nature of art, the pluralism of schools and society, and the…
Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.
ERIC Educational Resources Information Center
McKerrow, K. Kelly; McKerrow, Joan E.
1991-01-01
The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)
An uncertainty principle for unimodular quantum groups
Crann, Jason [School of Mathematics and Statistics, Carleton University, Ottawa, Ontario K1S 5B6 (Canada); Université Lille 1 - Sciences et Technologies, UFR de Mathématiques, Laboratoire de Mathématiques Paul Painlevé - UMR CNRS 8524, 59655 Villeneuve d'Ascq Cédex (France); Kalantar, Mehrdad, E-mail: jason-crann@carleton.ca, E-mail: mkalanta@math.carleton.ca [School of Mathematics and Statistics, Carleton University, Ottawa, Ontario K1S 5B6 (Canada)
2014-08-15
We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
Generalized Uncertainty Principle and Dark Matter
Pisin Chen
2003-05-01
There have been proposals that primordial black hole remnants (BHRs) are the dark matter, but the idea is somewhat vague. Recently we argued that the generalized uncertainty principle (GUP) may prevent black holes from evaporating completely, in a similar way that the standard uncertainty principle prevents the hydrogen atom from collapsing. We further noted that the hybrid inflation model provides a plausible mechanism for production of large numbers of small black holes. Combining these we suggested that the dark matter might be composed of Planck-size BHRs. In this paper we briefly review these arguments, and discuss the reheating temperature as a result of black hole evaporation.
Space tests of the generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Khodadi, M.
2015-08-01
A generalized uncertainty principle admitting a minimal measurable length contains a parameter whose numerical value needs to be fixed. In fact, the application of the Generalized Uncertainty Principle (GUP) to various quantum mechanical problems yields different values for the upper bound of the dimensionless GUP parameter. In this work, by applying a GUP which is linear and quadratic in momentum to the correction to Newton's law of gravity, and then using the stability condition of the circular orbits of the planets, we propose an upper bound for this parameter. By using the astronomical data of Solar System objects, a new and severe constraint on the upper bound of the parameter is derived. Also, using the modified Newtonian potential, inspired by a GUP which is linear and quadratic in momentum, we investigate the possibility of measuring the relevant parameter through observables provided by the Galileo Navigation Satellite System.
Generalized Uncertainty Principle: Approaches and Applications
Abdel Nasser Tawfik; Abdel Magied Diab
2014-11-23
We review highlights from string theory, black hole physics and doubly special relativity, and some "thought" experiments which were suggested to probe the shortest distance and/or the maximum momentum at the Planck scale. The models designed to implement a minimal length scale and/or a maximum momentum in different physical systems, which entered the literature as the Generalized Uncertainty Principle (GUP), are analysed and compared. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Furthermore, assuming a modified dispersion relation allows for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Saleker-Wigner inequalities, the entropic nature of the gravitational laws, the Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty; another predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of a composite system. The concern about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
Generalized uncertainty principle: Approaches and applications
NASA Astrophysics Data System (ADS)
Tawfik, A.; Diab, A.
2014-11-01
In this paper, we review some highlights from string theory, black hole physics and doubly special relativity, and some thought experiments which were suggested to probe the shortest distances and/or the maximum momentum at the Planck scale. Furthermore, all models developed to implement a minimal length scale and/or a maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Saleker-Wigner inequalities, the entropic nature of gravitational laws, the Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty; a second predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of a composite system. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. The concern about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
Dilaton Cosmology, Noncommutativity and Generalized Uncertainty Principle
Babak Vakili
2008-02-26
The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the Generalized Uncertainty Principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. We extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.
Gravitational tests of the Generalized Uncertainty Principle
Fabio Scardigli; Roberto Casadio
2014-07-01
We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.
Quantum randomness certified by the uncertainty principle
NASA Astrophysics Data System (ADS)
Vallone, Giuseppe; Marangon, Davide G.; Tomasin, Marco; Villoresi, Paolo
2014-11-01
We present an efficient method to extract the amount of true randomness that can be obtained by a quantum random number generator (QRNG). By repeating the measurements of a quantum system and by swapping between two mutually unbiased bases, a lower bound on the achievable true randomness can be evaluated. The bound is obtained thanks to the uncertainty principle for complementary measurements, applied to the min-entropy and max-entropy. We tested our method with two different QRNGs, using a train of qubits or ququarts, and demonstrated the scalability toward practical applications.
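The mechanism behind such certification can be illustrated with a toy calculation (ours, not the authors' protocol, which uses smooth min- and max-entropies): for two mutually unbiased bases, the Maassen-Uffink entropic uncertainty relation bounds the entropy sum by -log2 of the maximal basis overlap, which for a qubit is one bit.

```python
import numpy as np

# Two mutually unbiased bases for a qubit: the Z (computational)
# and X (Hadamard) eigenbases, stored as columns.
Z = np.eye(2)
X = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Maximum overlap c = max_{i,j} |<z_i|x_j>|^2 between basis vectors.
c = max(abs(np.vdot(Z[:, i], X[:, j])) ** 2
        for i in range(2) for j in range(2))

# Maassen-Uffink bound: H(Z) + H(X) >= -log2(c).
# For mutually unbiased bases c = 1/d, so the bound is log2(d) = 1 bit,
# which is what certifies nonzero extractable randomness.
bound = -np.log2(c)
print(bound)  # 1.0
```

Swapping between the bases at random means an adversary cannot make both measurement outcomes predictable at once, which is exactly what the entropy bound encodes.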
Robertson-Schrödinger formulation of Ozawa's uncertainty principle
NASA Astrophysics Data System (ADS)
Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Nuno Prata, João
2015-07-01
A more general measurement-disturbance uncertainty principle is presented in a Robertson-Schrödinger formulation. It is shown to be stronger and to have nicer properties than Ozawa's uncertainty relations. In particular, it is invariant under symplectic transformations. We also show that there are states of the probe (measuring device) that saturate the matrix formulation of the measurement-disturbance uncertainty principle.
Generalized uncertainty principle and black hole thermodynamics
NASA Astrophysics Data System (ADS)
Gangopadhyay, Sunandan; Dutta, Abhijit; Saha, Anirban
2014-02-01
We study the Schwarzschild and Reissner-Nordström black hole thermodynamics using the simplest form of the generalized uncertainty principle (GUP) proposed in the literature. The expressions for the mass-temperature relation, heat capacity and entropy are obtained in both cases from which the critical and remnant masses are computed. Our results are exact and reveal that these masses are identical and larger than the so called singular mass for which the thermodynamics quantities become ill-defined. The expression for the entropy reveals the well known area theorem in terms of the horizon area in both cases upto leading order corrections from GUP. The area theorem written in terms of a new variable which can be interpreted as the reduced horizon area arises only when the computation is carried out to the next higher order correction from GUP.
Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.
ERIC Educational Resources Information Center
Roth, Wolff-Michael
1993-01-01
Heisenberg's uncertainty principle and the derivative notions of interdeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research examined. Implications are drawn for research in science education. (PR)
Open Timelike Curves Violate Heisenberg's Uncertainty Principle
NASA Astrophysics Data System (ADS)
Pienaar, J. L.; Ralph, T. C.; Myers, C. R.
2013-02-01
Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg’s uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.
Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.
ERIC Educational Resources Information Center
Bartell, Lawrence S.
1985-01-01
Explicates an approach that not only makes the uncertainty seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)
String Theory, Scale Relativity and the Generalized Uncertainty Principle
Carlos Castro
1996-11-06
Extensions (modifications) of the Heisenberg Uncertainty Principle are derived within the framework of the theory of Special Scale-Relativity proposed by Nottale. In particular, generalizations of the Stringy Uncertainty Principle are obtained in which the size of the strings is bounded by the Planck scale and the size of the Universe. Based on the fractal structures inherent in two-dimensional Quantum Gravity, which has attracted considerable interest recently, we conjecture that the underlying fundamental principle behind String theory should be based on an extension of the Scale Relativity principle in which both dynamics and scales are incorporated on the same footing.
Uncertainty Principle, Shannon-Nyquist Sampling and Beyond
NASA Astrophysics Data System (ADS)
Fujikawa, Kazuo; Ge, Mo-Lin; Liu, Yu-Long; Zhao, Qing
2015-06-01
Donoho and Stark have shown that a precise deterministic recovery of missing information contained in a time interval shorter than the time-frequency uncertainty limit is possible. We analyze this signal recovery mechanism from a physics point of view and show that the well-known Shannon-Nyquist sampling theorem, which is fundamental in signal processing, also uses essentially the same mechanism. The uncertainty relation in the context of information theory, which is based on Fourier analysis, provides a criterion to distinguish Shannon-Nyquist sampling from compressed sensing. A new signal recovery formula, analogous to the Donoho-Stark formula, is given using the idea of Shannon-Nyquist sampling; in this formulation, both the smearing of information below the uncertainty limit and the recovery of information with specified bandwidth take place. We also discuss the recovery of states from the domain below the uncertainty limit of coordinate and momentum in quantum mechanics, and show that in principle the state recovery works by assuming ideal measurement procedures. The recovery of the lost information in the sub-uncertainty domain means that the loss of information in such a small domain is not fatal, in accord with our common understanding of the uncertainty principle, although its precise recovery is something we are not used to in quantum mechanics. The uncertainty principle provides a universal sampling criterion covering both the classical Shannon-Nyquist sampling theorem and the quantum mechanical measurement.
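The Shannon-Nyquist mechanism the abstract analyzes can be sketched numerically (a generic textbook illustration, not the authors' new recovery formula): sample a band-limited signal at the Nyquist rate and rebuild it at any off-grid instant by Whittaker-Shannon sinc interpolation.

```python
import numpy as np

# A signal band-limited to B = 4 Hz, sampled at the Nyquist rate 2B.
B = 4.0
fs = 2 * B
T = 1 / fs

def x(t):
    return np.cos(2 * np.pi * 1.5 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t)

n = np.arange(-200, 201)      # finite window of sample indices
samples = x(n * T)

def reconstruct(t):
    # Whittaker-Shannon interpolation: x(t) = sum_n x(nT) sinc((t - nT)/T),
    # using NumPy's normalized sinc(u) = sin(pi u)/(pi u).
    return float(np.sum(samples * np.sinc((t - n * T) / T)))

t0 = 0.123                             # an instant between sample points
err = abs(reconstruct(t0) - x(t0))     # small; limited only by window truncation
```

The residual error here comes purely from truncating the infinite sinc sum to a finite window; with infinitely many samples the recovery is exact, which is the deterministic mechanism the paper compares against compressed sensing.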
Demonstration of the angular uncertainty principle for single photons
NASA Astrophysics Data System (ADS)
Jack, B.; Aursand, P.; Franke-Arnold, S.; Ireland, D. G.; Leach, J.; Barnett, S. M.; Padgett, M. J.
2011-06-01
We present an experimental demonstration of a form of the angular uncertainty principle for single photons. Producing light from type I down-conversion, we use spatial light modulators to perform measurements on signal and idler photons. By measuring states in the angle and orbital angular momentum basis, we demonstrate the uncertainty relation of Franke-Arnold et al (2004 New J. Phys. 6 103). We consider two manifestations of the uncertainty relation. In the first we herald the presence of a photon by detection of its paired partner and demonstrate the uncertainty relation on this single photon. In the second, we perform orbital angular momentum measurements on one photon and angular measurements on its correlated partner exploring, in this way, the uncertainty relation through non-local measurements.
Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information
Ferguson, Thomas S.
This paper considers the model problem of reconstructing an object from incomplete frequency samples. Consider a discrete-time signal f ∈ C^N and a randomly chosen set of frequencies Ω of mean size τN.
The Uncertainty Principle, Virtual Particles and Real Forces
ERIC Educational Resources Information Center
Jones, Goronwy Tudor
2002-01-01
This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…
Single-Slit Diffraction and the Uncertainty Principle
ERIC Educational Resources Information Center
Rioux, Frank
2005-01-01
A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and uncertainty principle.
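The coordinate-momentum Fourier relationship invoked here can be illustrated with a quick toy calculation (ours, independent of the paper): the Fraunhofer pattern of a slit of width a is the squared magnitude of the Fourier transform of a rect aperture, whose first null sits at spatial frequency 1/a, so halving the slit doubles the momentum spread.

```python
import numpy as np

def first_null(a, L=8.0, N=1 << 14):
    """First diffraction null of a slit of width a, via the FFT of its aperture."""
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    aperture = (np.abs(x) <= a / 2).astype(float)    # rect of width a
    pattern = np.abs(np.fft.fft(aperture)) ** 2       # Fraunhofer intensity
    f = np.fft.fftfreq(N, d=L / N)
    pos = f > 0
    fpos, Ppos = f[pos], pattern[pos]
    # Intensity falls monotonically from the central peak; the first index
    # where it stops falling marks the first null of the sinc^2 pattern.
    k = np.argmax(np.diff(Ppos) > 0)
    return fpos[k]

wide = first_null(0.5)      # null near 1/0.5 = 2
narrow = first_null(0.25)   # null near 1/0.25 = 4: narrower slit, broader spread
```

The inverse scaling of the two widths is precisely the uncertainty-principle trade-off the abstract describes: confining the photon's position more tightly broadens its transverse-momentum distribution.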
Entropic Law of Force, Emergent Gravity and the Uncertainty Principle
M. A. Santos; I. V. Vancea
2011-10-26
The entropic formulation of inertia and gravity relies on quantum, geometrical and informational arguments. The fact that the results are completely classical is misleading. In this paper we argue that the entropic formulation provides new insights into the quantum nature of inertia and gravity. We use the entropic postulate to determine the quantum uncertainty in the law of inertia and in the law of gravity in Newtonian Mechanics, Special Relativity and General Relativity. These results are obtained by considering the most general quantum property of matter, represented by the Uncertainty Principle, and by postulating an expression for the uncertainty of the entropy such that: (i) it is the simplest quantum generalization of the postulate of the variation of the entropy, and (ii) it reduces to the variation of the entropy in the absence of the uncertainty.
Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle
NASA Astrophysics Data System (ADS)
Oppenheim, Jacob N.; Magnasco, Marcelo O.
2013-01-01
The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
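The Gabor limit quoted in this abstract can be checked numerically. The sketch below (all variable names illustrative; a Gaussian is the unique pulse shape that saturates the bound) computes the RMS time width and RMS frequency width of a test pulse and verifies that their product equals 1/(4π):

```python
import numpy as np

# Gaussian test pulse: saturates the time-frequency (Gabor) limit
# dt * df >= 1/(4*pi).
sigma = 0.37                       # pulse width parameter (s), illustrative
t = np.linspace(-40, 40, 2**16)
f_t = np.exp(-t**2 / (2 * sigma**2))

def rms_width(x, density):
    """RMS width of the normalized density |amplitude|^2 on a uniform grid x."""
    p = density / density.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean)**2 * p).sum())

dt = rms_width(t, np.abs(f_t)**2)

# Frequency-domain width from the FFT of the pulse (frequencies in Hz).
F = np.fft.fftshift(np.fft.fft(f_t))
freqs = np.fft.fftshift(np.fft.fftfreq(t.size, d=t[1] - t[0]))
df = rms_width(freqs, np.abs(F)**2)

product = dt * df
print(product, 1 / (4 * np.pi))    # both ~0.0796
```

The human subjects in the study beat this product, which is only possible because the auditory system is not a linear time-frequency analyzer.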
Effects of the Generalized Uncertainty Principle on Quantum Tunneling
Blado, Gardo; Jennings, James; Ceyanes, Joshuah; Sepulveda, Rafael
2015-01-01
In a previous paper [Blado G, Owens C, and Meyers V 2014 Quantum Wells and the Generalized Uncertainty Principle Eur. J. Phys. 35 065011], we showed that quantum gravity effects can be discussed with only a background in non-relativistic quantum mechanics at the undergraduate level by looking at the effect of the generalized uncertainty principle (GUP) on the finite and infinite square wells. In this paper, we derive the GUP corrections to the tunneling probability of simple quantum mechanical systems which are accessible to undergraduates (alpha decay, simple models of quantum cosmogenesis and gravitational tunneling radiation) and which employ the WKB approximation, a topic discussed in undergraduate quantum mechanics classes. It is shown that the GUP correction increases the tunneling probability in each of the examples discussed.
Relativistic orbits in a generalized uncertainty principle spacetime
NASA Astrophysics Data System (ADS)
Taus, Rhys; Mureika, Jonas
2015-04-01
Many theories of quantum gravity corroborate the notion of a minimal length scale. The generalized uncertainty principle (GUP), an extension of the Heisenberg uncertainty principle, also incorporates this feature. Recent work has yielded a modification to the Schwarzschild solution that incorporates the GUP, making the theory self-complete and modifying the associated black hole characteristics. In this project, corrections to the orbits of timelike and lightlike test particles are explored in the GUP spacetime through the modified effective potential. Corrections to the classical experimental tests, notably the perihelion advance of Mercury and the deflection of starlight, are found and compared to results from other studies of a non-commutative spacetime.
The Generalized Uncertainty Principle and Black Hole Remnants
Ronald J. Adler; Pisin Chen; David I. Santiago
2001-06-26
In the current standard viewpoint, small black holes are believed to emit radiation as black bodies at the Hawking temperature, at least until they reach Planck size, after which their fate is open to conjecture. A cogent argument against the existence of remnants is that, since no evident quantum number prevents it, black holes should radiate completely away to photons and other ordinary stable particles and vacuum, like any unstable quantum system. Here we argue the contrary, that the generalized uncertainty principle may prevent their total evaporation in exactly the same way that the uncertainty principle prevents the hydrogen atom from total collapse: the collapse is prevented, not by symmetry, but by dynamics, as a minimum size and mass are approached.
Sub-Planckian black holes and the Generalized Uncertainty Principle
NASA Astrophysics Data System (ADS)
Carr, Bernard; Mureika, Jonas; Nicolini, Piero
2015-07-01
The Black Hole Uncertainty Principle correspondence suggests that there could exist black holes with mass beneath the Planck scale but radius of order the Compton scale rather than the Schwarzschild scale. We present a modified, self-dual Schwarzschild-like metric that reproduces desirable aspects of a variety of disparate models in the sub-Planckian limit, while remaining Schwarzschild in the large-mass limit. The self-dual nature of this solution under M ↔ M⁻¹ naturally implies a Generalized Uncertainty Principle of linear form. We also demonstrate a natural dimensional-reduction feature, in that the gravitational radius and thermodynamics of sub-Planckian objects resemble those of (1+1)-D gravity. The temperature of sub-Planckian black holes scales as M rather than M⁻¹, but the evaporation of those smaller than 10⁻³⁶ g is suppressed by the cosmic background radiation. This suggests that relics of this mass could provide the dark matter.
Uncertainty principle for Gabor systems and the Zak transform
Czaja, Wojciech; Zienkiewicz, Jacek [Institute of Mathematics, University of Wrocław, Plac Grunwaldzki 2/4, 50-384 Wrocław (Poland)]
2006-12-15
We show that if g ∈ L²(ℝ) is a generator of a Gabor orthonormal basis with the lattice ℤ×ℤ, then its Zak transform Z(g) satisfies ∇Z(g) ∉ L²([0,1)²). This is a generalization and extension of the Balian-Low uncertainty principle.
Generalized uncertainty principle, quantum gravity and Horava-Lifshitz gravity
Yun Soo Myung
2009-01-01
We investigate a close connection between the generalized uncertainty principle (GUP) and deformed Horava-Lifshitz (HL) gravity. The GUP commutation relations correspond to the UV quantum theory, while the canonical commutation relations represent the IR quantum theory. Inspired by this UV/IR quantum mechanics, we obtain the GUP-corrected graviton propagator by introducing the UV momentum p_i = p_{0i}(1 + βp₀²) and compare this with tensor propagators in HL gravity. Two…
Factorization in the quantum mechanics with the generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Chung, Won Sang
2015-07-01
In this paper, we discuss quantum mechanics with the generalized uncertainty principle (GUP), where the commutation relation is given by [x̂, p̂] = iħ(1 + αp̂ + βp̂²). For this algebra, we obtain the eigenfunction of the momentum operator. We also study the GUP-corrected quantum particle in a box. Finally, we apply the factorization method to the harmonic oscillator in the presence of a minimal observable length and obtain the energy eigenvalues by applying the perturbation method.
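The minimal observable length mentioned in this abstract follows directly from the quadratic part of such a deformed algebra. A minimal numerical sketch (dropping the linear α term for simplicity; units and the value of β are illustrative): the deformed relation [x̂, p̂] = iħ(1 + βp̂²) implies Δx ≥ (ħ/2)(1/Δp + βΔp), which, unlike the Heisenberg bound, has a nonzero minimum:

```python
import numpy as np

# Minimal-length bound from the quadratic GUP  [x, p] = i*hbar*(1 + beta*p^2)
# (the alpha term of the full algebra is dropped for simplicity).
# The uncertainty relation gives  dx >= (hbar/2) * (1/dp + beta*dp),
# so no state can have dx below the minimum hbar*sqrt(beta).
hbar = 1.0
beta = 0.04                         # deformation parameter, illustrative units

dp = np.linspace(0.1, 20.0, 100_000)
dx_bound = 0.5 * hbar * (1.0 / dp + beta * dp)

i = int(dx_bound.argmin())
print(dp[i])                        # minimizing dp ~ 1/sqrt(beta) = 5.0
print(dx_bound[i])                  # minimal length ~ hbar*sqrt(beta) = 0.2
```

The analytic minimum is at Δp = 1/√β with Δx_min = ħ√β, which the grid search reproduces.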
Generalized uncertainty principle: implications for black hole complementarity
NASA Astrophysics Data System (ADS)
Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han
2014-12-01
At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg Uncertainty Principle is one profound feature of quantum mechanics, which nevertheless may receive correction when gravitational effects become important. Such generalized uncertainty principle [GUP] has been motivated from not only quite general considerations of quantum mechanics and gravity, but also string theoretic arguments. We examine the role of GUP in the context of black hole complementarity. We find that while complementarity can be violated by large N rescaling if one assumes only the Heisenberg's Uncertainty Principle, the application of GUP may save complementarity, but only if certain N -dependence is also assumed. This raises two important questions beyond the scope of this work, i.e., whether GUP really has the proposed form of N -dependence, and whether black hole complementarity is indeed correct.
Uncertainty Principle Is Untenable
Groppi, Christopher
By reanalysing the experiment on the Heisenberg gamma-ray microscope and related ideal experiments, it is claimed that the uncertainty principle is untenable. Key words: uncertainty principle; experiment on Heisenberg gamma-ray microscope; ideal experiment…
Uncertainty principle certifies genuine source of intrinsic randomness
Trina Chakraborty; Manik Banik; Pinaki Patra
2013-11-15
Born's rule introduces intrinsic randomness to the outcomes of a measurement performed on a quantum mechanical system. But if the system is prepared in an eigenstate of an observable, then the measurement outcome of that observable is completely predictable and hence there is no intrinsic randomness. On the other hand, if two incompatible observables are measured (either sequentially on a particle or simultaneously on two identical copies of the particle), then the uncertainty principle guarantees intrinsic randomness in the subsequent outcomes, independent of the preparation state of the system. In this article we show that this is true not only in quantum mechanics but in any no-signaling probabilistic theory. Moreover, the minimum amount of intrinsic randomness that can be guaranteed for an arbitrarily prepared state of the system is quantified by the amount of (un)certainty.
Nonequilibrium fluctuation-dissipation inequality and nonequilibrium uncertainty principle.
Fleming, C H; Hu, B L; Roura, Albert
2013-07-01
The fluctuation-dissipation relation is usually formulated for a system interacting with a heat bath at finite temperature, and often in the context of linear response theory, where only small deviations from the mean are considered. We show that for an open quantum system interacting with a nonequilibrium environment, where temperature is no longer a valid notion, a fluctuation-dissipation inequality exists. Instead of being proportional, quantum fluctuations are bounded below by quantum dissipation, whereas classically the fluctuations vanish at zero temperature. The lower bound of this inequality is exactly satisfied by (zero-temperature) quantum noise and is in accord with the Heisenberg uncertainty principle, in both its microscopic origins and its influence upon systems. Moreover, it is shown that there is a coupling-dependent nonequilibrium fluctuation-dissipation relation that determines the nonequilibrium uncertainty relation of linear systems in the weak-damping limit. PMID:23944409
Classical Dynamics Based on the Minimal Length Uncertainty Principle
NASA Astrophysics Data System (ADS)
Chung, Won Sang
2015-06-01
In this paper we consider the quadratic modification of the Heisenberg algebra and its classical-limit version, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in the β-deformed classical dynamics. Finally, we consider the (α,β)-deformed classical dynamics, in which the minimal length uncertainty principle is given by [x̂, p̂] = iħ(1 + αx̂² + βp̂²). For two small parameters α, β, we discuss the free fall of a particle and a composite system in a uniform gravitational field.
Universal uncertainty principle and quantum state control under conservation laws
Ozawa, M
2004-01-01
Heisenberg's uncertainty principle, exemplified by the gamma-ray thought experiment, suggests that any finite-precision measurement disturbs all observables noncommuting with the measured observable. Here, it is shown that this statement contradicts the limit on the accuracy of measurements under conservation laws originally found by Wigner in the 1950s, and should be modified to correctly derive the unavoidable noise caused by conservation-law-induced decoherence. The obtained accuracy limit leads to the interesting conclusion that a widely accepted, but rather naive, physical encoding of qubits for quantum computing suffers significantly from the decoherence induced by the angular-momentum conservation law.
Hawking temperature for various kinds of black holes from Heisenberg uncertainty principle
Fabio Scardigli
2006-07-04
Hawking temperature is computed for a large class of black holes (with spherical, toroidal and hyperboloidal topologies) using only laws of classical physics plus the "classical" Heisenberg Uncertainty Principle. This principle is shown to be fully sufficient to obtain the result; there is no need, for this purpose, of a Generalized Uncertainty Principle.
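The heuristic behind results of this kind can be sketched numerically (this is the standard textbook argument, not the paper's full derivation; constants and the calibration factor are the conventional ones): a photon localized within the Schwarzschild radius has ΔE ~ ħc/(2r_s), and a 1/(2π) calibration reproduces T_H = ħc³/(8πGMk_B).

```python
import math

# Heuristic Hawking temperature from the "classical" Heisenberg uncertainty
# principle: a photon localized within r_s = 2GM/c^2 carries an energy
# uncertainty Delta E ~ hbar*c/(2*r_s); the conventional 1/(2*pi)
# calibration factor then gives T_H = hbar*c^3/(8*pi*G*M*k_B).
G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8     # speed of light, m/s
hbar  = 1.055e-34   # reduced Planck constant, J s
k_B   = 1.381e-23   # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg

def hawking_temperature(M):
    r_s = 2 * G * M / c**2                 # Schwarzschild radius
    delta_E = hbar * c / (2 * r_s)         # uncertainty-principle energy scale
    return delta_E / (2 * math.pi * k_B)   # heuristic calibration

print(hawking_temperature(M_sun))          # ~6.2e-8 K for a solar-mass hole
```

The result matches the exact Hawking temperature of a solar-mass Schwarzschild black hole, illustrating why no generalized principle is needed at this level.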
Scalar field cosmology modified by the Generalized Uncertainty Principle
Paliathanasis, Andronikos; Pramanik, Souvik
2015-01-01
We consider quintessence scalar field cosmology in which the Lagrangian of the scalar field is modified by the Generalized Uncertainty Principle. We show that the perturbation terms which arise from the deformed algebra are equivalent to the existence of a second scalar field, where the two fields interact in the kinetic part. Moreover, we consider a spatially flat Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime and derive the gravitational field equations. We show that the modified equation-of-state parameter $w_{GUP}$ can cross the phantom divide line, that is, $w_{GUP}<-1$. Furthermore, rewriting the field equations in dimensionless parameters, the dynamical system which arises is a singular perturbation system, in which we study the existence of fixed points on the slow manifold. Finally, we perform numerical simulations for some well-known models and show that for these models, with specific initial conditions, the parameter $w_{GUP}$ crosses the phantom barrier.
Long-range mutual information and topological uncertainty principle
Chao-Ming Jian; Isaac H. Kim; Xiao-Liang Qi
2015-09-10
Ordered phases in the Landau paradigm can be diagnosed by a local order parameter, whereas topologically ordered phases cannot be detected in such a way. In this paper, we propose long-range mutual information (LRMI) as a unified diagnostic for both conventional long-range order and topological order. Using the LRMI, we characterize orders in $n+1$D gapped systems as $m$-membrane condensates with $0 \leq m \leq n-1$. The familiar conventional orders and 2+1D topological orders are respectively identified as $0$-membrane and $1$-membrane condensates. We propose and study the topological uncertainty principle, which describes the non-commuting nature of non-local order parameters in topological orders.
Entropy Bound with Generalized Uncertainty Principle in General Dimensions
Weijian Wang; Da Huang
2012-03-26
In this letter, the entropy bound for local quantum field theories (LQFT) is studied in a class of models of the generalized uncertainty principle (GUP), which predicts a minimal length as a reflection of quantum gravity effects. Both bosonic and fermionic fields confined in a ball ${\cal B}^{d}$ of arbitrary spatial dimension $d\geq4$ are investigated. It is found that the GUP leads to the same scaling $A_{d-2}^{(d-3)/(d-2)}$ correction to the entropy bound for bosons and fermions, although the coefficients of this correction are different for each case. Based on our calculation, we conclude that the GUP effects can become manifest at short distance scales. Some further implications and speculations of our results are also discussed.
Albert Roura
2015-09-27
Atom interferometry tests of universality of free fall based on the differential measurement of two different atomic species provide a useful complement to those based on macroscopic masses. However, when striving for the highest possible sensitivities, gravity gradients pose a serious challenge. Indeed, the relative initial position and velocity for the two species need to be controlled with extremely high accuracy, which can be rather demanding in practice and whose verification may require rather long integration times. Furthermore, in highly sensitive configurations gravity gradients lead to a drastic loss of contrast. These difficulties can be mitigated by employing wave packets with narrower position and momentum widths, but this is ultimately limited by Heisenberg's uncertainty principle. We present a novel scheme that simultaneously overcomes the loss of contrast and the initial co-location problem. In doing so, it circumvents the fundamental limitations due to Heisenberg's uncertainty principle and eases the experimental realization by relaxing the requirements on initial co-location by several orders of magnitude.
Remarks on the Fact that the Uncertainty Principle Does Not Determine the Quantum State
Maurice de Gosson; Franz Luef
2007-03-07
We discuss the relation between density matrices and the uncertainty principle; this allows us to justify and explain a recent statement by Man'ko et al. We thereafter use Hardy's uncertainty principle to prove a new result for Wigner distributions dominated by a Gaussian and we relate this result to the coarse-graining of phase-space by "quantum blobs".
Molecular Response Theory in Terms of the Uncertainty Principle.
Harde, Hermann; Grischkowsky, Daniel
2015-08-27
We investigate the time response of molecular transitions by observing the pulse reshaping of femtosecond THz pulses propagating through polar vapors. By precisely modeling the pulse interaction with the molecular vapors, we derive detailed insight into this time response after an excitation. The measurements, which were performed by applying the powerful technique of THz time-domain spectroscopy, are analyzed directly in the time domain or, in parallel, in the frequency domain by Fourier transforming the pulses and comparing them with the molecular response theory. New analyses of the molecular response allow a generalized unification of the basic collision and line-shape theories of Lorentz, van Vleck-Weisskopf, and Debye described by molecular response theory. In addition, they show that the applied THz experimental setup allows the direct observation of the ultimate time response of molecules to an externally applied electric field in the presence of molecular collisions. This response is limited by the uncertainty principle and is determined by the inverse splitting frequency between adjacent levels. At the same time, this response reflects the transition time of a rotational transition to switch from one molecular state to another or to form a coherent superposition of states oscillating with the splitting frequency. The presented investigations are also of fundamental importance for the description of the far-wing absorption of greenhouse gases like water vapor, carbon dioxide, or methane, which have a dominant influence on the radiative exchange in the far-infrared. PMID:26280761
Weak values, 'negative probability' and the uncertainty principle
D. Sokolovski
2009-05-23
A quantum transition can be seen as a result of interference between various pathways (e.g. Feynman paths), which can be labelled by a variable $f$. An attempt to determine the value of $f$ without destroying the coherence between the pathways produces a weak value $\bar{f}$. We show $\bar{f}$ to be an average obtained with an amplitude distribution which can, in general, take negative values and which, in accordance with the uncertainty principle, need not contain information about the actual range of the values of $f$ which contribute to the transition. It is also demonstrated that the moments of such alternating distributions have a number of unusual properties which may lead to misinterpretation of weak measurement results. We provide a detailed analysis of weak measurements with and without post-selection. Examples include the double-slit diffraction experiment, weak von Neumann and von Neumann-like measurements, the traversal time for an elastic collision, the phase time, the local angular momentum (LAM), and the 'three-box case' of Aharonov et al.
Verification of the Uncertainty Principle by Using Diffraction of Light Waves
ERIC Educational Resources Information Center
Nikolic, D.; Nesic, Lj
2011-01-01
We describe a simple idea for experimental verification of the uncertainty principle for light waves. We use single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtain the corresponding wave-number uncertainty. The uncertainty in position is taken to be the slit width. For the…
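The classroom estimate described in this abstract is easy to reproduce. The sketch below assumes illustrative numbers (a He-Ne laser and a 50 µm slit, not the paper's data): with the slit width a as Δx and the wave-number spread Δk inferred from the first diffraction minimum at sin θ₁ = λ/a, the product Δx·Δk comes out as 2π, comfortably above the 1/2 lower bound.

```python
import math

# Single-slit check of the uncertainty principle for light waves
# (illustrative numbers, not the paper's measurements).
lam = 632.8e-9          # He-Ne laser wavelength (m)
a   = 50e-6             # slit width, taken as the position uncertainty dx (m)
L   = 2.0               # slit-to-screen distance (m)

# First diffraction minimum: sin(theta1) = lam / a.
theta1 = math.asin(lam / a)
y1 = L * math.tan(theta1)           # half-width of the central maximum (m)

# Transverse wave-number spread inferred from the central maximum.
k = 2 * math.pi / lam
dk = k * math.sin(theta1)           # equals 2*pi/a

print(y1 * 1e3)                     # central-max half-width ~25.3 mm
print(a * dk)                       # dx*dk = 2*pi >> 1/2
```

Measuring y1 on the screen and inverting the geometry is the experimental route to Δk; the code just runs the geometry forward.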
Effect of uncertainty principle on the Wigner function-based simulation of quantum transport
NASA Astrophysics Data System (ADS)
Kim, Kyoung-Youm; Kim, Saehwa
2015-09-01
We investigate the effect of the uncertainty principle on the simulation of quantum transport based on the Wigner function. We show that, due to the positional uncertainty of electrons within the device, which bounds the region for nonlocal potential correlation, a constraint is imposed via the uncertainty principle on the possible momentum resolution of the Wigner function. It is numerically demonstrated that violating this constraint deteriorates the simulation results significantly in configurations where quantum effects are crucial.
Robertson-Schrödinger-type formulation of Ozawa's noise-disturbance uncertainty principle
NASA Astrophysics Data System (ADS)
Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno
2014-04-01
In this work we derive a matrix formulation of a noise-disturbance uncertainty relation, which is akin to the Robertson-Schrödinger uncertainty principle. Our inequality is stronger than Ozawa's uncertainty principle and takes noise-disturbance correlations into account. Moreover, we show that for certain types of measurement interactions it is covariant with respect to linear symplectic transformations of the noise and disturbance operators. Finally, we also study the tightness of our matrix inequality.
Generalized Uncertainty Principle and Recent Cosmic Inflation Observations
Abdel Nasser Tawfik; Abdel Magied Diab
2014-10-29
The recent Background Imaging of Cosmic Extragalactic Polarization (BICEP2) observations are regarded as evidence for cosmic inflation. BICEP2 provided a first direct evidence for inflation, determined its energy scale, and gave hints of quantum gravitational processes. The ratio of scalar-to-tensor fluctuations $r$, the canonical measurement of the gravitational waves, was estimated as $r=0.2_{-0.05}^{+0.07}$. Apparently, this value agrees with the upper bound from PLANCK, $r\leq 0.012$, and with the WMAP9 experiment, $r=0.2$. It is believed that the existence of a minimal length is one of the greatest predictions leading to modifications of the Heisenberg uncertainty principle, i.e. a GUP, at the Planck scale. In the present work, we investigate the possibility of interpreting the recent BICEP2 observations through quantum gravity or the GUP. We estimate the slow-roll parameters and the tensorial and scalar density fluctuations, which are characterized by the scalar field $\phi$. Taking into account the background (matter and radiation) energy density, $\phi$ is assumed to interact with gravity and with itself. We first review the Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe and then suggest a modification of the Friedmann equation due to the GUP. Using a single potential for a chaotic inflation model, various inflationary parameters are estimated and compared with the PLANCK and BICEP2 observations. While the GUP is conjectured to break down the expansion of the early Universe (Hubble parameter and scale factor), two inflation potentials based on a certain minimal supersymmetric extension of the standard model result in $r$ and a spectral index matching well with the observations. Corresponding to the BICEP2 observations, our estimate for $r$ depends on the inflation potential and the scalar field. A power-law inflation potential does not.
Violation of the Robertson-Schrödinger uncertainty principle and non-commutative quantum mechanics
Catarina Bastos; Orfeu Bertolami; Nuno Costa Dias; João Nuno Prata
2012-11-26
We show that a possible violation of the Robertson-Schrödinger uncertainty principle may signal the existence of a deformation of the Heisenberg-Weyl algebra. More precisely, we prove that any Gaussian in phase space (even if it violates the Robertson-Schrödinger uncertainty principle) is always a quantum state of an appropriate non-commutative extension of quantum mechanics. Conversely, all canonical non-commutative extensions of quantum mechanics display states that violate the Robertson-Schrödinger uncertainty principle.
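For a one-mode Gaussian state, the Robertson-Schrödinger principle referenced here reduces to a condition on the phase-space covariance matrix σ: the state is a valid quantum state iff σ + (iħ/2)J ⪰ 0, equivalently det σ ≥ ħ²/4. A minimal numeric check (the example matrices are illustrative, not from the paper):

```python
import numpy as np

# Robertson-Schrodinger condition for a one-mode Gaussian state:
# sigma + (i*hbar/2)*J >= 0, with J the symplectic form.
hbar = 1.0
J = np.array([[0.0, 1.0], [-1.0, 0.0]])

def satisfies_rs(sigma):
    """True iff the covariance matrix sigma describes a valid quantum state."""
    eigs = np.linalg.eigvalsh(sigma + 0.5j * hbar * J)  # Hermitian matrix
    return bool(eigs.min() >= -1e-12)

ok  = np.array([[0.6, 0.2], [0.2, 0.5]])  # det = 0.26 >= hbar^2/4 = 0.25
bad = np.array([[0.3, 0.0], [0.0, 0.3]])  # det = 0.09 <  0.25: RS violated

print(satisfies_rs(ok), satisfies_rs(bad))   # True False
```

A Gaussian with covariance like `bad` is exactly the kind of object the paper reinterprets as a legitimate state of a deformed (non-commutative) algebra.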
World-Crystal Uncertainty Principle and Micro Black Holes
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen; Scardigli, Fabio
We formulate generalized uncertainty relations in a crystal-like universe whose lattice spacing is of the order of the Planck length, a "world crystal". For energies near the border of the Brillouin zone, i.e. for Planckian energies, the uncertainty relation for position and momentum does not pose any lower bound. We apply these results to micro black hole physics, where we derive a new mass-temperature relation for Schwarzschild micro black holes. In contrast to standard results based on Heisenberg and stringy uncertainty relations, our mass-temperature formula predicts both a finite Hawking temperature and a zero rest-mass remnant at the end of the black hole evaporation.
Generalized uncertainty principle and entropy of three-dimensional rotating acoustic black hole
NASA Astrophysics Data System (ADS)
Zhao, HuiHua; Li, GuangLiang; Zhang, LiChun
2012-07-01
Using the new equation-of-state density from the generalized uncertainty principle, we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole. When the deformation parameter introduced in the generalized uncertainty principle takes a specific value, we obtain an area entropy and a correction term associated with the acoustic black hole. In this method there are no divergences, and one does not need the small-mass approximation of the original brick-wall model.
Entropy bound of local quantum field theory with generalized uncertainty principle
Yong-Wan Kim; Hyung Won Lee; Yun Soo Myung
2009-02-25
We study the entropy bound for local quantum field theory (LQFT) with the generalized uncertainty principle, which naturally provides a UV cutoff to the LQFT as a gravity effect. Imposing the non-gravitational collapse condition as the UV-IR relation, we find that the maximal entropy of a bosonic field is limited by the entropy bound $A^{3/4}$ rather than $A$, with $A$ the boundary area.
A Discussion on Heisenberg Uncertainty Principle in the Picture of Special Relativity
Luca Nanni
2015-01-09
In this note the formulation of the Heisenberg uncertainty principle (HUP) in the picture of special relativity is given. The inequality shows that the product of the uncertainties of quantum conjugate variables is greater than an amount that is no longer a constant but depends on the speed of the system on which the measurement is taken.
Micro Black Holes Physics from World-Crystal Uncertainty Principle
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen; Scardigli, Fabio
2011-11-01
We formulate generalized uncertainty relations in a crystal-like universe whose lattice spacing is of the order of the Planck length, a "world crystal". For energies near the border of the Brillouin zone, i.e., for Planckian energies, the uncertainty relation for position and momentum does not pose any lower bound. We apply these results to micro black hole physics, where we derive a new mass-temperature relation for Schwarzschild micro black holes. In contrast to standard results based on Heisenberg and stringy uncertainty relations, our mass-temperature formula predicts both a finite Hawking temperature and a zero rest-mass remnant at the end of the black hole evaporation. We also briefly mention some connections of the world-crystal paradigm with 't Hooft's quantization and doubly special relativity.
Quantum covariance, quantum Fisher information and the uncertainty principle
Paolo Gibilisco; Fumio Hiai; Denes Petz
2007-12-07
In this paper the relation between quantum covariances and quantum Fisher informations is studied. This study is applied to generalize a recently proved uncertainty relation based on quantum Fisher information. The proof given here considerably simplifies the previously proposed proofs and leads to more general inequalities.
Path Integral for Dirac oscillator with generalized uncertainty principle
Benzair, H. [Laboratoire LRPPS, Universite de Kasdi Merbah-Ouargla, BP 511, Route Ghardaia, 30000 Ouargla (Algeria); Laboratoire de Physique Theorique, Universite de Jijel, BP 98 Ouled Aissa, 18000 Jijel (Algeria)]; Boudjedaa, T. [Laboratoire de Physique Theorique, Universite de Jijel, BP 98 Ouled Aissa, 18000 Jijel (Algeria)]; Merad, M. [Laboratoire (L.S.D.C), Universite de Oum El Bouaghi, 04000 Oum El Bouaghi (Algeria)]
2012-12-15
The propagator for the Dirac oscillator in (1+1) dimensions, with a deformed commutation relation of the Heisenberg principle, is calculated using a path integral in the quadri-momentum representation. Since the mass is related to the momentum, we adapt the space-time transformation method to evaluate the quantum corrections; the latter depend on the point-discretization interval.
NASA Astrophysics Data System (ADS)
Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie
2011-12-01
Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize students' descriptions of these key quantum concepts. Data for this study were obtained from semistructured in-depth interviews conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students' depictions of the concept of wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students' depictions of the concept of uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum-mechanical uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, a few students in the quasiquantum category invoked typical wave phenomena such as interference and diffraction, which cannot be explained within the framework of classical physics, for depicting the wavelike properties of quantum entities. Despite the incomplete conceptions of the uncertainty principle and the wave- and particlelike properties of quantum entities revealed by our investigation, the findings presented in this paper are highly consistent with those reported in previous studies.
New findings and some implications for instruction and the curricula are discussed.
Experimental verification of the Heisenberg uncertainty principle for hot fullerene molecules
Olaf Nairz; Markus Arndt; Anton Zeilinger
2001-05-14
The Heisenberg uncertainty principle for material objects is an essential cornerstone of quantum mechanics and clearly visualizes the wave nature of matter. Here we report a demonstration of the Heisenberg uncertainty principle for the most massive, complex and hottest single object so far, the fullerene molecule C70 at a temperature of 900 K. We find good quantitative agreement with the theoretical expectation dx * dp = h, where dx is the width of the restricting slit, dp is the momentum transfer required to deflect the fullerene to the first interference minimum and h is Planck's quantum of action.
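The quoted relation dx * dp = h is easy to check numerically. The sketch below uses illustrative values (a 100 nm slit and standard constants; these are assumptions for illustration, not the parameters reported in the experiment) to estimate the momentum kick and the transverse velocity spread for C70:

```python
# Single-slit diffraction estimate for C70, using dx * dp = h.
# The slit width below is an assumed illustrative value, not the paper's.
h = 6.626e-34          # Planck's constant (J s)
u = 1.661e-27          # atomic mass unit (kg)
m = 70 * 12.011 * u    # mass of C70 (~840 u)

dx = 100e-9            # assumed slit width: 100 nm
dp = h / dx            # momentum transfer to the first interference minimum
dv = dp / m            # corresponding transverse velocity spread

print(f"dp = {dp:.3e} kg m/s")   # ~6.6e-27 kg m/s
print(f"dv = {dv:.3e} m/s")      # ~4.7e-3 m/s
```

Even for this very massive molecule, the required transverse velocity spread is only millimeters per second, which is why interferometric detection is needed to resolve the wave behavior.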
The entropy of the noncommutative acoustic black hole based on generalized uncertainty principle
Anacleto, M A; Passos, E; Santos, W P
2014-01-01
In this paper we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole based on the generalized uncertainty principle. We obtain an area entropy and a correction term associated with the noncommutative acoustic black hole when the parameter $\lambda$ introduced in the generalized uncertainty principle takes a specific value. In this method there is no need to introduce an ultraviolet cut-off, and divergences are eliminated. Moreover, the small-mass approximation required in the original brick-wall model is not necessary.
The entropy of the noncommutative acoustic black hole based on generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Anacleto, M. A.; Brito, F. A.; Passos, E.; Santos, W. P.
2014-10-01
In this paper we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole based on the generalized uncertainty principle. We obtain an area entropy and a correction term associated with the noncommutative acoustic black hole when the parameter λ introduced in the generalized uncertainty principle takes a specific value. In this method there is no need to introduce an ultraviolet cut-off, and divergences are eliminated. Moreover, the small-mass approximation required in the original brick-wall model is not necessary.
The uncertainty principle enables non-classical dynamics in an interferometer
NASA Astrophysics Data System (ADS)
Dahlsten, Oscar C. O.; Garner, Andrew J. P.; Vedral, Vlatko
2014-08-01
The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics.
NASA Astrophysics Data System (ADS)
Mejjaoli, Hatem; Trimèche, Khalifa
2015-09-01
In this paper, we prove various mathematical aspects of the qualitative uncertainty principle, including the theorems of Hardy, Cowling-Price, Morgan, Beurling, Gelfand-Shilov, and Miyachi.
Principles and applications of measurement and uncertainty analysis in research and calibration
Wells, C.V.
1992-11-01
Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
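The random/systematic decomposition described above is typically combined into a single total-uncertainty interval. A minimal sketch in the spirit of ANSI/ASME PTC 19.1 (the function names and example numbers are illustrative, not taken from the standard):

```python
import math

def uncertainty_rss(bias_limit, precision_index, t=2.0):
    """Root-sum-square model: U_RSS = sqrt(B^2 + (t*S)^2),
    with B the bias (systematic) limit, S the precision (random) index,
    and t the Student's-t coverage factor (~2 for 95% confidence)."""
    return math.sqrt(bias_limit**2 + (t * precision_index)**2)

def uncertainty_add(bias_limit, precision_index, t=2.0):
    """Additive model: U_ADD = B + t*S, a more conservative interval."""
    return bias_limit + t * precision_index

B, S = 0.3, 0.2   # illustrative values, in the same units as the measurand
print(uncertainty_rss(B, S))  # ~0.5
print(uncertainty_add(B, S))  # ~0.7
```

The measured value is then reported as x ± U, the "error bars" that characterize the final result.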
The uncertainty principle does not entirely determine the non-locality of quantum theory
Ravishankar Ramanathan; Dardo Goyeneche; Piotr Mironowicz; Paweł Horodecki
2015-06-16
One of the most intriguing discoveries regarding quantum non-local correlations in recent years was the establishment of a direct correspondence between the quantum value of non-local games and the strength of fine-grained uncertainty relations in Science 330, 1072 (2010). It was shown that while the degree of non-locality in any theory is generally determined by a combination of two factors - the strength of the uncertainty principle and the degree of steering allowed in the theory - the most paradigmatic games in quantum theory have a degree of non-locality determined purely by the uncertainty principle alone. In this context, a fundamental question arises: is this a universal property of optimal quantum strategies for all non-local games? Indeed, the above-mentioned feature occurs in surprising situations, even when the optimal strategy for the game involves non-maximally entangled states. However, here we definitively prove that the answer to the question is negative, by presenting explicit counter-examples of non-local games, together with fully analytical optimal quantum strategies for them, where a definite trade-off between steering and uncertainty is absolutely necessary. We provide an intuitive explanation, in terms of the Hughston-Jozsa-Wootters theorem, for when the relationship between the uncertainty principle and the quantum game value breaks down.
ERIC Educational Resources Information Center
Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie
2011-01-01
Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…
Quantum States and Hardy's Formulation of the Uncertainty Principle : a Symplectic Approach
Maurice de Gosson; Franz Luef
2007-03-07
We express the condition for a phase space Gaussian to be the Wigner distribution of a mixed quantum state in terms of the symplectic capacity of the associated Wigner ellipsoid. Our results are motivated by Hardy's formulation of the uncertainty principle for a function and its Fourier transform. As a consequence we are able to state a more general form of Hardy's theorem.
Rioux, Frank
Hydrogen Atom and Helium Ion Spatial and Momentum Distribution Functions, illustrated for one-electron species such as the hydrogen atom and the helium ion. The coordinate 1s wave functions for the hydrogen atom (Z=1) and the helium ion (Z=2) clearly illustrate the uncertainty principle. [Radial distribution plots omitted.]
G. M. Bosyk; M. Portesi; F. Holik; A. Plastino
2012-06-14
We revisit, in the framework of Mach-Zehnder interferometry, the connection between the complementarity and uncertainty principles of quantum mechanics. Specifically, we show that, for a pair of suitably chosen observables, the trade-off relation between the complementary path information and fringe visibility is equivalent to the uncertainty relation given by Schrödinger and Robertson, and to the one provided by Landau and Pollak as well. We also employ entropic uncertainty relations (based on Rényi entropic measures) and study their meaning for different values of the entropic parameter. We show that these different values define regimes which yield qualitatively different information concerning the system, in agreement with findings of [A. Luis, Phys. Rev. A 84, 034101 (2011)]. We find that there exists a regime for which the entropic uncertainty relations can be used as criteria to pinpoint nontrivial states of minimum uncertainty.
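The Rényi-entropy relations mentioned above are of the Maassen-Uffink type. For observables with eigenbases $\{|a_i\rangle\}$, $\{|b_j\rangle\}$ and maximal overlap $c = \max_{i,j}|\langle a_i|b_j\rangle|$, the relation reads (a standard reference form, added here for context; it is not quoted from this paper):

```latex
H_\alpha(A) + H_\beta(B) \;\ge\; -2\log c,
\qquad \frac{1}{\alpha} + \frac{1}{\beta} = 2,\quad \alpha,\beta \ge \tfrac{1}{2},
```

which reduces to the Shannon-entropy relation $H(A)+H(B)\ge -2\log c$ at $\alpha=\beta=1$.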
Tawfik, A., E-mail: a.tawfik@eng.mti.edu.eg [Egyptian Center for Theoretical Physics (ECTP), MTI University, 11571 Cairo (Egypt)
2013-07-01
We investigate the impact of the Generalized Uncertainty Principle (GUP), proposed by some approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in ensuring the stability of the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without GUP is not negligible.
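The baseline (non-GUP) quantities that such papers correct can be reproduced from the standard textbook formulas. A minimal sketch (these are the uncorrected Heisenberg-based expressions, not the paper's GUP-modified results):

```python
import math

# Standard Hawking temperature and Bekenstein-Hawking entropy for a
# Schwarzschild black hole of mass M. Note T_H diverges as M -> 0,
# which is the "entire evaporation" behavior that GUP corrections remove.
hbar, c, G, kB = 1.0546e-34, 2.998e8, 6.674e-11, 1.381e-23

def hawking_temperature(M):
    """T_H = hbar c^3 / (8 pi G M k_B), in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * M * kB)

def bekenstein_entropy(M):
    """S = k_B A c^3 / (4 G hbar), with horizon area A = 16 pi G^2 M^2 / c^4."""
    A = 16 * math.pi * G**2 * M**2 / c**4
    return kB * A * c**3 / (4 * G * hbar)

M_sun = 1.989e30  # kg
print(hawking_temperature(M_sun))   # ~6e-8 K
print(bekenstein_entropy(M_sun))    # ~1.5e54 J/K
```

The divergence of T_H at small M (and the vanishing remnant mass) is exactly what the GUP term regularizes.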
The Quark-Gluon Plasma Equation of State and The Generalized Uncertainty Principle
L. I. AbouSalem; N. M. El Naggar; I. A. Elmashad
2015-09-30
The quark-gluon plasma (QGP) equation of state within a minimal length scenario, or Generalized Uncertainty Principle (GUP), is studied. The GUP is implemented in deriving the thermodynamics of an ideal QGP at vanishing chemical potential. We find a significant effect from the GUP term. The main features of QCD lattice results were quantitatively reproduced in the cases of $n_{f}=0$, $n_{f}=2$ and $n_{f}=2+1$ flavors for the energy density, the pressure and the interaction measure. A notable point is the large value of the bag pressure, especially in the case of $n_{f}=2+1$ flavors, which reflects the expected strong correlation between quarks in the bag. One can notice that the asymptotic behavior, characterized by the Stefan-Boltzmann limit, is satisfied.
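The Stefan-Boltzmann limit mentioned at the end can be evaluated explicitly from the standard degree-of-freedom counting. A sketch (treating $n_f = 2+1$ as three light flavors is an approximation made here for illustration only):

```python
import math

# Stefan-Boltzmann (ideal massless gas) limit for the QGP energy density:
# eps/T^4 = (pi^2/30) * g_eff, in natural units (hbar = c = k_B = 1).
def g_eff(n_f, Nc=3):
    gluons = 2 * (Nc**2 - 1)            # 2 polarizations x 8 colors = 16
    quarks = (7/8) * 2 * 2 * Nc * n_f   # fermion factor x spin x (q, qbar) x color x flavor
    return gluons + quarks

def eps_over_T4(n_f):
    return math.pi**2 / 30 * g_eff(n_f)

for nf in (0, 2, 3):                    # nf = 3 stands in for the 2+1 case here
    print(nf, g_eff(nf), round(eps_over_T4(nf), 2))
# g_eff = 16, 37, 47.5
```

These are the asymptotic plateaus that the lattice (and GUP-modified) energy density should approach at high temperature.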
Energy-time uncertainty principle and lower bounds on sojourn time
Joachim Asch; Olivier Bourget; Victor Cortes; Claudio Fernandez
2015-07-23
One manifestation of quantum resonances is a large sojourn time, or autocorrelation, of states which are initially localized. We elaborate on Lavine's time-energy uncertainty principle and give an estimate on the sojourn time. The bound involves Fermi's Golden Rule for the case of perturbed embedded eigenstates. Only very mild regularity is required. We illustrate the theory with applications to resonances for time-dependent and multistate systems.
A Computational Model of Limb Impedance Control Based on Principles of Internal Model Uncertainty
Mitrovic, Djordje; Klanke, Stefan; Osu, Rieko; Kawato, Mitsuo; Vijayakumar, Sethu
2010-01-01
Efficient human motor control is characterized by an extensive use of joint impedance modulation, which is achieved by co-contracting antagonistic muscles in a way that is beneficial to the specific task. While there is much experimental evidence that the nervous system employs such strategies, no generally valid computational model of impedance control derived from first principles has been proposed so far. Here we develop a new impedance control model for antagonistic limb systems which is based on a minimization of uncertainties in the internal model predictions. In contrast to previously proposed models, our framework predicts a wide range of impedance control patterns, during stationary and adaptive tasks. This indicates that many well-known impedance control phenomena naturally emerge from the first principles of a stochastic optimization process that minimizes internal model prediction uncertainties, along with energy and accuracy demands. The insights from this computational model could be used to interpret existing experimental impedance control data from the viewpoint of optimality, or could even govern the design of future experiments based on principles of internal model uncertainty. PMID:21049061
Generalized Uncertainty Principle and Correction Value to the Kerr Black Hole Entropy
NASA Astrophysics Data System (ADS)
Ya, Zhang; Shuang-Qi, Hu; Ren, Zhao; Huai-Fan, Li
2008-02-01
Recently, there has been much attention devoted to resolving the quantum corrections to the Bekenstein-Hawking black hole entropy. In particular, many researchers have expressed a vested interest in the coefficient of the logarithmic term of the black hole entropy correction. In this paper, we calculate the correction to the black hole entropy by utilizing the generalized uncertainty principle and obtain the correction term it induces. Because our calculation assumes that the Bekenstein-Hawking area theorem remains valid after considering the generalized uncertainty principle, we derive a positive coefficient for the logarithmic term of the entropy correction. This differs from previously known results. Our method is valid not only for single-horizon spacetimes but also for spin axial-symmetric spacetimes with double horizons. Throughout, the physical idea is clear and the calculation is simple, offering a new way to study the entropy correction of complicated spacetimes.
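The logarithmically corrected entropy discussed in such works is commonly written in the schematic form below (conventions for the coefficient and higher-order terms vary between papers; this is an illustrative sketch, not the paper's exact expression):

```latex
S \;=\; \frac{A}{4\,l_p^2} \;+\; \alpha\,\ln\!\frac{A}{4\,l_p^2} \;+\; \mathcal{O}\!\left(\frac{l_p^2}{A}\right),
```

where $A$ is the horizon area and $l_p$ the Planck length. The abstract's claim is that the GUP-based calculation yields $\alpha > 0$, in contrast to earlier results that give a negative coefficient.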
Generalized Uncertainty Principle and Black Hole Entropy of Higher-Dimensional de Sitter Spacetime
NASA Astrophysics Data System (ADS)
Zhao, Hai-Xia; Li, Huai-Fan; Hu, Shuang-Qi; Zhao, Ren
2007-09-01
Recently, there has been much attention devoted to resolving the quantum corrections to the Bekenstein-Hawking black hole entropy. In particular, many researchers have expressed a vested interest in the coefficient of the logarithmic term of the black hole entropy correction. In this paper, we calculate the correction to the black hole entropy by utilizing the generalized uncertainty principle and obtain the correction term it induces. Because our calculation assumes that the Bekenstein-Hawking area theorem remains valid after considering the generalized uncertainty principle, we derive a positive coefficient for the logarithmic term of the entropy correction. This differs from previously known results. Our method is valid not only for four-dimensional spacetimes but also for higher-dimensional spacetimes. Throughout, the physical idea is clear and the calculation is simple, offering a new way to study the entropy correction of complicated spacetimes.
Geodesics, Mass and the Uncertainty Principle in a Warped de Sitter Space-time
Jose A. Magpantay
2011-08-03
We present the explicit solution to the geodesic equations in the warped de Sitter space-time proposed by Randall and Sundrum. We find that a test particle moves in the bulk and is not restricted to a 3-brane (to be taken as our universe). On the 3-brane, the test particle moves with uniform velocity, giving the appearance that it is not subject to a force. But computing the particle's energy using the energy-momentum tensor yields a time-dependent energy that suggests a time-dependent mass. Thus, the extra force, which is the effect of the warped extra dimension on the particle's motion on the 3-brane, does not change the velocity but the mass of the particle. The particle's motion in the bulk also results in a time-dependent modification of the Heisenberg uncertainty principle as viewed on the 3-brane. These two results show that classical physics along the extra dimension results in the time-dependence of particle masses and of the uncertainty principle. If particle masses are to be time-independent and the Heisenberg uncertainty principle is to remain unchanged, then there must be a non-gravitational force that restricts all particles to the 3-brane. Finally, we note that although these time-dependent corrections on the 3-brane can be removed classically, quantum-mechanical corrections along the extra dimension will restore the problem.
Burr, Tom; Croft, Stephen; Jarman, Kenneth D.
2015-09-05
The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
Remnant mass and entropy of black holes and modified uncertainty principle
NASA Astrophysics Data System (ADS)
Dutta, Abhijit; Gangopadhyay, Sunandan
2014-06-01
In this paper, we study the thermodynamics of black holes using a generalized uncertainty principle (GUP) with a correction term of linear order in the momentum uncertainty. The mass-temperature relation and heat capacity are calculated, from which the critical and remnant masses are obtained. The results are exact and are found to be identical. The entropy expression gives the famous area theorem up to leading-order corrections from the GUP. In particular, the linear-order term in the GUP leads to a correction to the area theorem. Finally, the area theorem can be expressed in terms of a new variable, termed the reduced horizon area, only when the calculation is carried to the next-higher-order correction from the GUP.
Modified uncertainty principle from the free expansion of a Bose-Einstein Condensate
Elías Castellanos; Celia Escamilla-Rivera
2015-09-21
We develop a theoretical and numerical analysis of the free expansion of a Bose-Einstein condensate, in which we assume that the single particle energy spectrum is deformed due to a possible quantum structure of space time. Also we consider the presence of inter particle interactions in order to study more realistic and specific scenarios. The modified free velocity expansion of the condensate leads in a natural way to a modification of the uncertainty principle, which allows us to investigate some possible features of the Planck scale regime in low-energy earth-based experiments.
Generalized uncertainty principle, quantum gravity and Hořava-Lifshitz gravity
NASA Astrophysics Data System (ADS)
Myung, Yun Soo
2009-10-01
We investigate a close connection between the generalized uncertainty principle (GUP) and deformed Hořava-Lifshitz (HL) gravity. The GUP commutation relations correspond to the UV quantum theory, while the canonical commutation relations represent the IR quantum theory. Inspired by this UV/IR quantum mechanics, we obtain the GUP-corrected graviton propagator by introducing the UV momentum $p_i = p_{0i}(1 + \beta p_0^2)$ and compare it with tensor propagators in HL gravity. The two agree up to $p_0^4$ order.
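The deformed momentum quoted above corresponds, at lowest order, to the standard GUP commutator and its minimal-length consequence. A textbook sketch, added here for context (β denotes the deformation parameter; this is the generic GUP form, not an expression quoted from the paper):

```latex
[x, p] = i\hbar\left(1 + \beta p^2\right)
\;\Longrightarrow\;
\Delta x\,\Delta p \ge \frac{\hbar}{2}\left(1 + \beta(\Delta p)^2 + \beta\langle p\rangle^2\right)
\;\Longrightarrow\;
(\Delta x)_{\min} = \hbar\sqrt{\beta}.
```

The minimal position uncertainty $(\Delta x)_{\min}$ is what implements the "minimal length" of these quantum-gravity scenarios.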
Before and beyond the precautionary principle: Epistemology of uncertainty in science and law
Tallacchini, Mariachiara [Bioethics, Faculty of Biotechnology, University of Milan, Via Celoria 10, 20100 Milan (Italy) and Science Technology and Law, Law Faculty, University of Piacenza, Via Emilia Parmense 84, 29100 Piacenza (Italy)]. E-mail: mariachiara.tallacchini@unimi.it
2005-09-01
The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.
Masanao Ozawa
2015-07-23
Heisenberg's uncertainty principle was originally posed as a limit on the accuracy of simultaneous measurement of non-commuting observables, stating that canonically conjugate observables can be measured simultaneously only with the constraint that the product of their mean errors should be no less than a limit set by Planck's constant. However, Heisenberg, with the subsequent completion by Kennard, has long been credited only with a constraint for state preparation represented by the product of the standard deviations. Here, we show that Heisenberg actually proved the constraint for the accuracy of simultaneous measurement, but assuming an obsolete postulate of quantum mechanics. This assumption, known as the repeatability hypothesis, formulated explicitly by von Neumann and Schrödinger, was broadly accepted until the 1970s, but abandoned in the 1980s, when completely general quantum measurement theory was established. We also survey the author's recent proposal for a universally valid reformulation of Heisenberg's uncertainty principle under the most general assumption on quantum measurement.
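For reference, the universally valid relation alluded to at the end is usually written as follows (in Ozawa's notation, with $\varepsilon(A)$ the measurement error of $A$, $\eta(B)$ the disturbance on $B$, and $\sigma$ the standard deviation in the pre-measurement state; this standard form is added for context, not quoted from the abstract):

```latex
\varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
\;\ge\; \frac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|,
```

which supplements the naive Heisenberg-type product bound $\varepsilon(A)\,\eta(B) \ge \tfrac{1}{2}|\langle[A,B]\rangle|$ and remains valid for arbitrary quantum measurements.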
NASA Astrophysics Data System (ADS)
Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Dias, Nuno Costa; Prata, João Nuno
2015-03-01
We revisit Ozawa's uncertainty principle (OUP) in the framework of noncommutative (NC) quantum mechanics. We derive a matrix version of the OUP accommodating any NC structure in the phase space, and compute NC corrections to lowest order for two measurement interactions, namely the backaction-evading quadrature amplifier and noiseless quadrature transducers. These NC corrections alter the nature of the measurement interaction, as a noiseless interaction may acquire noise, and an interaction of independent intervention may become dependent on the object system. However, the most striking result is that noncommutativity may lead to a violation of the OUP itself. The NC corrections for the backaction-evading quadrature amplifier reveal a new term which may potentially be amplified in such a way that the violation of the OUP becomes experimentally testable. On the other hand, the NC corrections to the noiseless quadrature transducer show an incompatibility of this model with NC quantum mechanics. We discuss the implications of this incompatibility for NC quantum mechanics and for Ozawa's uncertainty principle.
Covariant energy–momentum and an uncertainty principle for general relativity
Cooperstock, F.I., E-mail: cooperst@uvic.ca [Department of Physics and Astronomy, University of Victoria, P.O. Box 3055, Victoria, B.C. V8W 3P6 (Canada); Dupre, M.J., E-mail: mdupre@tulane.edu [Department of Mathematics, Tulane University, New Orleans, LA 70118 (United States)
2013-12-15
We introduce a naturally defined, totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduce to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum, in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum.
Nasseri, M.; Ansari, A.; Zahraie, B.
2014-02-01
The most important drawback of standard fuzzy arithmetic is the unrealistic accumulation of input uncertainties, which results in divergence of the fuzzy outputs. Some of the currently available methods of simulating fuzzy systems yield results that drift toward implausibly large values after several time steps of the system simulation. In this paper, a new fuzzy arithmetic operator based on the fuzzy extension principle has been proposed for simulation and assessment of uncertainty in hydrological systems. Implementing the concept of fuzzy approximate reasoning, the proposed approach shows acceptable behavior in propagating uncertainty from the parameters and structures of the models to the outputs. To show the efficiency of the proposed fuzzy arithmetic operator in the context of hydrologic modeling, two nonlinear monthly water balance models have been examined and their outputs compared with the results obtained by standard fuzzy arithmetic and the Vertex method. One small humid basin in France and a middle-sized basin in a semiarid region of Iran have been the case studies of this research. The lower and upper bounds and the most frequent values of the model parameters, inferred from a sampling-simulation procedure, have been used to define triangular fuzzy membership functions. Three statistical indicators have been used to evaluate the efficiency of the methods, based on bracketing the observations and the coverage of the uncertainty bounds. The estimated values of these indicators show that both the Vertex method and the proposed method outperform standard fuzzy arithmetic. Also, the proposed method provides better or roughly equal efficiency compared with the Vertex method over both basins.
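As an illustration of the alpha-cut machinery behind such extension-principle operators, the following sketch propagates a triangular fuzzy parameter through a toy monotone model, Vertex-style. The model function and all numbers are hypothetical, not the paper's water-balance models:

```python
import math

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def propagate(tri, model, levels=5):
    """Alpha-cut intervals of model(x): evaluate at the interval endpoints
    (the Vertex idea), which is exact when the model is monotone."""
    cuts = []
    for i in range(levels + 1):
        alpha = i / levels
        lo, hi = alpha_cut(tri, alpha)
        ylo, yhi = model(lo), model(hi)
        cuts.append((alpha, min(ylo, yhi), max(ylo, yhi)))
    return cuts

# Hypothetical nonlinear response to a fuzzy storage parameter k ~ (0.5, 1, 2):
result = propagate((0.5, 1.0, 2.0), lambda k: 10.0 * (1.0 - math.exp(-k)))
```

At alpha = 1 the output interval collapses to the crisp model value, while lower alpha levels give progressively wider uncertainty bounds.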
Medved, A.J.M. [School of Mathematics, Statistics and Computer Science, Victoria University of Wellington, P.O. Box 600, Wellington (New Zealand); Vagenas, Elias C. [Departament d'Estructura i Constituents de la Materia and CER for Astrophysics, Particle Physics and Cosmology, Universitat de Barcelona, Avinguda Diagonal 647 E-08028, Barcelona (Spain)
2004-12-15
Recently, there has been much attention devoted to resolving the quantum corrections to the Bekenstein-Hawking (black hole) entropy. In particular, many researchers have expressed a vested interest in fixing the coefficient of the subleading logarithmic term. In the current paper, we are able to make some degree of progress in this direction by utilizing the generalized uncertainty principle (GUP). Notably, the GUP reduces to the conventional Heisenberg relation in situations of weak gravity but transcends it when gravitational effects can no longer be ignored. Ultimately, we formulate the quantum-corrected entropy in terms of an expansion that is consistent with all previous findings. Moreover, we demonstrate that the logarithmic prefactor (indeed, any coefficient of the expansion) can be expressed in terms of a single parameter that should be determinable via the fundamental theory.
Tawfik, Abdel Nasser
2015-01-01
Recently, there has been much attention devoted to resolving the quantum corrections to the Bekenstein-Hawking (black hole) entropy, which relates the entropy to the cross-sectional area of the black hole horizon. Using the generalized uncertainty principle (GUP), corrections to the geometric entropy and thermodynamics of black holes are introduced. The impact of the GUP on the entropy near the horizon of three types of black holes (Schwarzschild, Garfinkle-Horowitz-Strominger and Reissner-Nordström) is determined. It is found that the logarithmic divergence in the entropy-area relation turns out to be positive. The entropy $S$, which is assumed to be related to the horizon's two-dimensional area, acquires additional terms, for instance $2\,\sqrt{\pi}\,\alpha\,\sqrt{S}$, where $\alpha$ is the GUP parameter.
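To get a feel for the size of the quoted correction term, here is a small numeric sketch. The GUP parameter alpha and the sample entropy are made-up illustrative values, not numbers from the paper:

```python
import math

def gup_corrected_entropy(S, alpha=1.0):
    """Entropy with the additional GUP term quoted in the abstract:
    S -> S + 2*sqrt(pi)*alpha*sqrt(S) (Planck units; alpha is assumed O(1))."""
    return S + 2.0 * math.sqrt(math.pi) * alpha * math.sqrt(S)

S = 1.0e6  # sample uncorrected horizon entropy in Planck units (illustrative)
relative_shift = (gup_corrected_entropy(S) - S) / S
# The relative correction scales like 1/sqrt(S), so it fades for large holes.
```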
An uncertainty principle underlying the pinwheel structure in the primary visual cortex
Davide Barbieri; Giovanna Citti; Gonzalo Sanguinetti; Alessandro Sarti
2010-07-08
The visual information in V1 is processed by an array of modules called orientation preference columns. In some species, including humans, orientation columns are radially arranged around singular points, like the spokes of a wheel; these points are called pinwheels. The pinwheel structure was first observed with optical imaging techniques and more recently by in vivo two-photon imaging, confirming its organization with single-cell precision. In this research we provide evidence that pinwheels are de facto optimal distributions for coding angular position and angular momentum simultaneously. In recent years many authors have recognized that the functional architecture of V1 is locally invariant with respect to SE(2), the symmetry group of rotations and translations. In the present study we show that the orientation cortical maps used to construct pinwheels can be modeled as coherent states, i.e. the configurations best localized in both angular position and angular momentum. The theory we adopt is based on the well-known uncertainty principle, introduced by Heisenberg in quantum mechanics and later extended to many other groups of invariance. Here we state a corresponding principle in the cortical geometry with SE(2) symmetry and, by computing its minimizers, obtain a model of orientation activity maps in the cortex. As is well known, the pinwheel configuration is directly constructed from these activity maps, and we are able to formally reproduce their structure starting from the group symmetries of the functional architecture of the visual cortex. The primary visual cortex is then modeled as an integrated system in which the set of simple cells implements the SE(2) group, the horizontal connectivity implements its Lie algebra, and the pinwheels implement its minimal-uncertainty states.
G. Youinou; G. Palmiotti; M. Salvatorre; G. Imel; R. Pardo; F. Kondev; M. Paul
2010-01-01
An integral reactor physics experiment devoted to inferring higher-actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties of the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amounts of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross sections to be inferred, since the relations between the two are the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectroscopy (AMS) facility located at ANL. While AMS facilities have traditionally been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes, even to A > 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectroscopy techniques, more transmutation products could be measured and, potentially, more cross sections could be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
Principles for Robust On-orbit Uncertainties Traceable to the SI (Invited)
Shirley, E. L.; Dykema, J. A.; Fraser, G. T.; Anderson, J.
2009-12-01
Climate-change research requires space-based measurements of the Earth's spectral radiance, reflectance, and atmospheric properties with unprecedented accuracy. Increases in measurement accuracy would improve and accelerate the quantitative determination of decadal climate change. The increases would also permit attribution of climate change to anthropogenic causes and foster understanding of climate evolution on an accelerated time scale. Beyond merely answering key questions about global climate change, accurate measurements would also be of benefit by testing and refining climate models to enhance and quantify their predictive value. Accurate measurements imply traceability to the SI system of units. In this regard, traceability is a property of the result of a measurement, or the value of a standard, whereby it can be related to international standards through an unbroken chain of comparisons, all having stated (and realistic) uncertainties. SI-traceability allows one to compare measurements independent of locale, time, or sensor. In this way, SI-traceability alleviates the urgency to maintain a false assurance of measurement accuracy by having an unbroken time series of observations continually adjusted so that measurement results obtained with a given instrument match the measurement results of its recent predecessors. Moreover, to make quantitative inferences from measurement results obtained in various contexts, which might range, for instance, from radiometry to atmospheric chemistry, having SI-traceability throughout all work is essential. One can derive principles for robust claims of SI-traceability from lessons learned by the scientific community. In particular, National Measurement Institutes (NMIs), such as NIST, use several strategies in their realization of practical SI-traceable measurements of the highest accuracy: (1) basing ultimate standards on fundamental physical phenomena, such as the Quantum Hall resistance, instead of measurement artifacts; (2) developing a variety of approaches to measure a given physical quantity; (3) conducting intercomparisons of measurements performed by different institutions; (4) perpetually seeking complete understanding of all sources of measurement bias and uncertainty; (5) rigorously analyzing measurement uncertainties; and (6) maintaining a high level of transparency that permits peer review of measurement practices. It is imperative to establish SI-traceability at the beginning of an environmental satellite program. This includes planning for system-level pre-launch and, in particular, on-orbit instrument calibration. On-orbit calibration strategies should be insensitive to reasonably expected perturbations that arise during launch or on orbit, and one should employ strategies to validate on-orbit traceability. As a rule, optical systems with simple designs tend to be more amenable to robust calibration schemes.
f(R) in Holographic and Agegraphic Dark Energy Models and the Generalized Uncertainty Principle
Barun Majumder
2013-07-16
We study a unified approach combining the holographic, new agegraphic and $f(R)$ dark energy models to construct the form of $f(R)$ which is in general responsible for the curvature-driven explanation of very early inflation along with the presently observed late-time acceleration. We consider the generalized uncertainty principle in our approach, which incorporates corrections to the entropy-area relation and thereby modifies the energy densities of the cosmological dark energy models considered. We find that holographic and new agegraphic $f(R)$ gravity models can behave like phantom or quintessence models in the spatially flat FRW universe. We also find a distinct term in the form of $f(R)$, which goes as $R^{3/2}$, due to the GUP-modified energy densities. Although the presence of this term in the action can be important for explaining the early inflationary scenario, Capozziello et al. recently showed that $f(R) \sim R^{3/2}$ leads to an accelerated expansion, i.e., a negative value of the deceleration parameter $q$, which fits well with SNeIa and WMAP data.
Kratzer's molecular potential in quantum mechanics with a generalized uncertainty principle
Djamil Bouaziz
2015-03-07
The Kratzer potential $V(r)=g_{1}/r^{2}-g_{2}/r$ is studied in quantum mechanics with a generalized uncertainty principle, which includes a minimal length $\left(\Delta X\right)_{\min}=\hbar\sqrt{5\beta}$. In momentum representation, the Schrödinger equation is a generalized Heun differential equation, which reduces to hypergeometric and Heun equations in special cases. We explicitly show that the presence of this finite length regularizes the potential in the range of the coupling constant $g_{1}$ where the corresponding Hamiltonian is not self-adjoint. In coordinate space, we perturbatively derive an analytical expression for the bound-state spectrum to first order in the deformation parameter $\beta$. We qualitatively discuss the effect of the minimal length on the vibration-rotation energy levels of diatomic molecules through the Kratzer interaction. By comparison with an experimental result for the hydrogen molecule, an upper bound for the minimal length is found to be about $0.01$ Å. We argue that the minimal length would have some physical importance in studying the spectra of such systems.
f(R)-modified gravity, Wald entropy, and the generalized uncertainty principle
Hammad, Fayçal
2015-08-01
Wald's entropy formula allows one to find the entropy of black holes' event horizons within any diffeomorphism-invariant theory of gravity. When applied to general relativity, the formula yields the Bekenstein-Hawking result but, for any other gravitational action that departs from the Hilbert action, the resulting entropy acquires an additional multiplicative factor that depends on the global geometry of the background spacetime. On the other hand, the generalized uncertainty principle (GUP) has recently been used extensively to investigate corrections to the Bekenstein-Hawking entropy formula, with the conclusion that the latter always comes multiplied by a factor that depends on the area of the event horizon. We show, by considering the case of an $f(R)$-modified gravity, that the usual black hole entropy derivation based on the GUP might be modified in such a way that the two methods yield the same corrections to the Bekenstein-Hawking formula. The procedure turns out to be an interesting method for seeking modified gravity theories. Two different versions of the GUP are used, and it is found that only one of them yields a viable modified gravity model. Conversely, it is possible to find a general formulation of the GUP that would reproduce the Wald entropy formula for any $f(R)$ theory of gravity.
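For reference, the multiplicative factor mentioned in this abstract can be made explicit: Wald's formula, specialized to $f(R)$ gravity, gives the standard result

```latex
S_{\mathrm{Wald}} = -2\pi \oint_{\mathcal{H}} \frac{\partial \mathcal{L}}{\partial R_{abcd}}\,\epsilon_{ab}\,\epsilon_{cd}\; dA
\qquad\xrightarrow{\;\mathcal{L}\, =\, f(R)/16\pi G\;}\qquad
S_{f(R)} = \frac{A}{4G}\, f'(R)\Big|_{\mathcal{H}},
```

so the Bekenstein-Hawking value $A/4G$ is recovered when $f(R)=R$, and $f'(R)$ evaluated on the horizon plays the role of the extra prefactor.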
Ozawa, Masanao
2003-04-01
The Heisenberg uncertainty principle states that the product of the noise in a position measurement and the momentum disturbance caused by that measurement should be no less than the limit set by Planck's constant, $\hbar/2$, as demonstrated by Heisenberg's thought experiment using a $\gamma$-ray microscope. Here it is shown that this common assumption is not universally true: a universally valid trade-off relation between the noise and the disturbance has an additional correlation term, which is redundant when the intervention brought about by the measurement is independent of the measured object, but which allows the noise-disturbance product to fall far below $\hbar/2$ when the intervention is dependent. A model of measuring interaction with dependent intervention shows that Heisenberg's lower bound for the noise-disturbance product is violated even by a nearly nondisturbing, precise position measurement. An experimental implementation is also proposed to realize the above model in the context of optical quadrature measurement with currently available linear optical devices.
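The "universally valid trade-off relation" referred to here is what is now known as Ozawa's inequality; for observables $A$ and $B$, with noise $\epsilon$, disturbance $\eta$, and standard deviations $\sigma$, it reads

```latex
\epsilon(A)\,\eta(B) + \epsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \;\geq\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle \bigr|,
```

in which the two extra terms are exactly the correlation contribution that lets the plain product $\epsilon(A)\,\eta(B)$ fall below the Heisenberg bound.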
Övgün, A
2015-01-01
Quantum gravity exhibits exciting peculiarities at the Planck scale. The effect of the generalized uncertainty principle on the tunneling of entangled scalar/fermion particles from a Schwarzschild black hole embedded in an electromagnetic Universe is investigated with the help of the semi-classical tunneling method. The quantum-corrected Hawking temperature of this black hole is computed, with an external parameter $a$ that can be used to increase the Hawking temperature for the entangled particles.
Mahanta, Chandra Rekha; Misra, Rajesh
2015-08-01
According to the Generalized Uncertainty Principle (GUP), there should be a minimal black hole whose size is comparable to the minimal length, so that it cannot evaporate completely through thermal radiation. Moreover, the black hole is not allowed to have a mass less than a scale of order the Planck mass, which suggests a black hole remnant. We study the warped AdS$_3$ rotating black hole and calculate the entropy, heat capacity and critical mass with the help of the GUP. We also compute the GUP correction to the area theorem.
Uncertainty principle on a world crystal: Absence of black hole remnants?
Jizba, Petr; Kleinert, Hagen; Scardigli, Fabio
2012-06-01
We study uncertainty relations as formulated in a crystal-like universe whose lattice spacing is of the order of the Planck length. At Planck energies, the uncertainty relation for position and momentum has a lower bound equal to zero. Connections of this result with doubly special relativity, and with 't Hooft's deterministic quantization proposal, are briefly pointed out. We then apply our formulae to micro black holes and derive a new mass-temperature relation for Schwarzschild (micro) black holes. In contrast to standard results based on Heisenberg and stringy uncertainty relations, we obtain both a finite Hawking temperature and a zero rest-mass remnant at the end of the micro black hole evaporation.
The effect of generalized uncertainty principle on square well, a case study
Ma, Meng-Sen; Zhao, Ren
2014-08-15
Starting from a special case of the generalized uncertainty relation (with one of the two deformation parameters set to zero), we derive the energy eigenvalues of the infinite potential well. It is shown that the obtained energy levels differ from the usual result by correction terms, and that these correction terms depend on no parameter other than the remaining deformation parameter. The eigenstates, however, depend on two additional parameters besides it.
Furrer, Fabian
2014-10-01
A big challenge in continuous-variable quantum key distribution is to prove security against arbitrary coherent attacks including realistic assumptions such as finite-size effects. Recently, such a proof has been presented in [Phys. Rev. Lett. 109, 100502 (2012), 10.1103/PhysRevLett.109.100502] for a two-mode squeezed state protocol based on a novel uncertainty relation with quantum memories. But the transmission distances were fairly limited due to a direct reconciliation protocol. We prove here security against coherent attacks of a reverse-reconciliation protocol under similar assumptions but allowing distances of over 16 km for experimentally feasible parameters. We further clarify the limitations when using the uncertainty relation with quantum memories in security proofs of continuous-variable quantum key distribution.
Oye, K A
2005-01-01
Disputes over invocation of precaution in the presence of uncertainty are building. This essay finds: (1) analysis of past WTO panel decisions and current EU-US regulatory conflicts suggests that appeals to scientific risk assessment will not resolve emerging conflicts; (2) Bayesian updating strategies, with commitments to modify policies as information emerges, may ameliorate conflicts over precaution in environmental and security affairs. PMID:16304935
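The Bayesian-updating strategy the essay advocates can be sketched in a few lines; the prior and likelihoods below are entirely hypothetical numbers, chosen only to show how a precautionary estimate is revised as evidence arrives:

```python
# Toy illustration of Bayesian updating for precaution: revise the
# probability that something is hazardous as test results arrive.
# All numbers here are hypothetical.

def update(prior, p_evidence_if_hazard, p_evidence_if_safe):
    """One Bayes-rule update of P(hazard) given a new piece of evidence."""
    joint_hazard = prior * p_evidence_if_hazard
    joint_safe = (1.0 - prior) * p_evidence_if_safe
    return joint_hazard / (joint_hazard + joint_safe)

p = 0.10  # initial precautionary prior
for _ in range(3):  # three independent positive test results
    p = update(p, 0.8, 0.2)
# p rises with each confirming result; a disconfirming result would lower it.
```

The committed policy is the loop itself: the estimate moves with the evidence instead of being fixed at the outset.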
Donald J. Kouri
2015-02-06
At the recent QSCP XIX, the author claimed that a procedure using a scaled Fourier transform (with the scaling determined by the detailed interaction and the particle mass for a harmonic oscillator) achieves simultaneous resolution of position and momentum better than the standard Heisenberg value of 1/2. The procedure is, in fact, invalid in quantum mechanics. The purpose of this paper is simply to give the correct analysis of the uncertainty product, thereby clarifying the error made.
Hamilton-Jacobi Many-Worlds Theory and the Heisenberg Uncertainty Principle
Tipler, Frank J
2010-01-01
I show that the classical Hamilton-Jacobi (H-J) equation can be used as a technique to study quantum mechanical problems. I first show that the Schrödinger equation is just the classical H-J equation, constrained by a condition that forces the solutions of the H-J equation to be everywhere $C^2$. That is, quantum mechanics is just classical mechanics constrained to ensure that "God does not play dice with the universe." I show that this condition, which imposes global determinism, strongly suggests that $\psi^*\psi$ measures the density of universes in a multiverse. I show that this interpretation implies the Born Interpretation, and that the function space for $\psi$ is larger than a Hilbert space, with plane waves automatically included. Finally, I use H-J theory to derive the momentum-position uncertainty relation, thus proving that in quantum mechanics, uncertainty arises from the interference of the other universes of the multiverse, not from some intrinsic indeterminism in nature.
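The asserted link between the Schrödinger and Hamilton-Jacobi equations is conventionally exhibited via the Madelung substitution $\psi=\sqrt{\rho}\,e^{iS/\hbar}$ (a standard decomposition, stated here for context rather than as this paper's own derivation), under which the Schrödinger equation splits into a continuity equation and the classical H-J equation with one extra quantum term:

```latex
\frac{\partial S}{\partial t} + \frac{(\nabla S)^{2}}{2m} + V
\;=\; \frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}},
```

the right-hand side (the quantum potential, with its sign moved across) vanishing in the classical limit $\hbar\to 0$.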
David R Geelan
2013-06-14
Recent empirical work in the field of 'weak measurements' has yielded novel ways of more directly accessing and exploring the quantum wavefunction. Measuring either position or momentum for a photon in a 'weak' manner yields a wide range of possible values for the measurement, and can be done in such a way as to only minimally affect the wavefunction rather than collapse it to a specific precise value. Measuring the other complementary variable (position or momentum) precisely at a later time ('post-selection') and averaging the weak measurements can yield information about the wavefunction that is not directly obtainable experimentally using other methods. This paper discusses two recent papers on weak measurement in the context of the uncertainty principle more broadly, and considers some possibilities for further research.
McNabb, Sean; Balam Matagamon, Kaan; Pawa Matagamon, Sagamo
2007-10-01
Heisenberg succeeded in inducing Schrödinger to recant his standing-wave model for the electron. They subsequently received Nobel awards in succession. On August 30, 2007, David McLeod stated: "Dad, while I was learning high school chemistry, and was being taught the Heisenberg Uncertainty Principle, I said to myself, 'This is BS!'" Son and "Dad" had been discussing their traveling wave/standing wave, TW/SW, model for the electron. Dave had discussed, in the context of beta decay, how an electron and an antineutrino are emitted together. Dave then said, "An electron is an antiparticle," because a "string-like" electron had to have been a segment "cut out of" one of our neutron-string models. It had to have antinodes at either free end. One end, modeled as a transversely vibrating entity, had to "eject" the occupant for the loop to close. The TW/SW model cannot be a point or particle. De Broglie is correct, but the Principle should be recast: it is philosophically unsound. Quantum mechanics, and string theory, could be foundational beneficiaries. Schrödinger seems incomplete.
NASA Astrophysics Data System (ADS)
McLeod, David; McLeod, Roger
2008-04-01
The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations (SFTs) of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics (QM). The SFT of human vision is merely the scaled SFT of QM. Reciprocal-space results of wavelength and momentum mimic the reciprocal relationship between the space variable x and the spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy (physiology of hearing, 1961) performed pressure-input Rect x experiments that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves the sign sense of EMF vectors, and is hard-wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.
NASA Technical Reports Server (NTRS)
Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.
1992-01-01
The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that delta E times delta t is approximately equal to Planck's constant divided by 2 pi, where this duration delta t is an inner time in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
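The scale of the effect is easy to check numerically. The sketch below evaluates delta t = hbar / delta E for an assumed filter bandwidth; the 0.01 eV value is purely illustrative and is not a figure reported in the abstract.

```python
# Order-of-magnitude check of delta_E * delta_t ~ hbar for the two-photon
# collapse experiment. The filter bandwidth here is an assumed value.
hbar = 1.054571817e-34    # reduced Planck constant, J*s
eV = 1.602176634e-19      # one electron-volt in joules
delta_E = 0.01 * eV       # assumed filter bandwidth
delta_t = hbar / delta_E  # implied duration of the collapsed wave packet
print(delta_t)            # on the order of 1e-14 to 1e-13 s (tens of femtoseconds)
```

A 0.01 eV filter thus implies a collapsed wave-packet duration of roughly 66 fs, comfortably within the reach of a Michelson-interferometer delay scan.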
NASA Astrophysics Data System (ADS)
Mazurova, Elena; Lapshin, Aleksey
2013-04-01
The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT, owing to the action of Heisenberg's uncertainty principle, exhibits weak spatial localization, which manifests in the following: firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (the window), is used for this purpose. If a narrow window is chosen to localize the signal in time then, according to Heisenberg's uncertainty principle, the result is a correspondingly significant uncertainty in frequency. If one chooses a wide window then, according to the same principle, the uncertainty in time increases. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to determine the presence of any frequency in the signal and the interval of its presence.
However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at the current moment of time: it is possible to speak only about a range of frequencies. Besides, it is impossible to specify precisely the moment in time at which this or that frequency is present: it is possible to speak only about a time frame. It is this feature that imposes major constraints on the applicability of the STFT. In spite of the fact that the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, there is a possibility to analyze any signal using an alternative approach: multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis. Thanks to it, low frequencies can be shown in more detail with respect to time, and high ones with respect to frequency. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3-D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the application of the wavelet transform to calculate the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
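The window trade-off described above can be verified numerically. This is a minimal sketch in plain NumPy, using an illustrative Gaussian analysis window rather than any particular geodetic signal: however the window width is chosen, the product of the effective time width and the effective (angular) frequency width stays pinned near the Gabor/Heisenberg limit of 1/2.

```python
import numpy as np

def effective_widths(sigma, n=8192, dt=0.01):
    t = (np.arange(n) - n / 2) * dt
    w = np.exp(-t**2 / (2 * sigma**2))             # Gaussian window, time scale sigma
    W = np.abs(np.fft.fftshift(np.fft.fft(w)))**2  # power spectrum of the window
    omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, dt))
    dt_eff = np.sqrt(np.sum(t**2 * w**2) / np.sum(w**2))  # RMS time width
    dw_eff = np.sqrt(np.sum(omega**2 * W) / np.sum(W))    # RMS frequency width
    return dt_eff, dw_eff

for sigma in (0.1, 0.5, 2.0):
    dt_eff, dw_eff = effective_widths(sigma)
    # narrow window -> broad spectrum and vice versa; the product stays ~0.5
    print(f"sigma={sigma}: dt*domega = {dt_eff * dw_eff:.3f}")
```

Shrinking sigma sharpens time localization at the cost of frequency localization, exactly the STFT limitation the abstract describes; only a multiresolution scheme such as the wavelet transform varies this balance across scales.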
Grote, Gudela
2014-01-01
It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation human factors/ergonomics based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore. PMID:23622735
Mezzasalma, Stefano A
2007-03-15
The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected. PMID:17223124
Heisenberg's Uncertainty: An Ill-Defined Notion?
Rosinger, Elemér E (Department of Mathematics); Université de Paris-Sud XI
The paper examines the use of the Heisenberg Uncertainty Principle, a principle it calls an "ill-defined notion". (hal-00684501, version 1, 2 Apr 2012)
Dimensions of the Precautionary Principle
Per Sandin
1999-01-01
This essay attempts to provide an analytical apparatus which may be used for finding an authoritative formulation of the Precautionary Principle. Several formulations of the Precautionary Principle are examined. Four dimensions of the principle are identified: (1) the threat dimension, (2) the uncertainty dimension, (3) the action dimension, and (4) the command dimension. It is argued that the Precautionary Principle
Two new kinds of uncertainty relations
NASA Technical Reports Server (NTRS)
Uffink, Jos
1994-01-01
We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.
NASA Astrophysics Data System (ADS)
Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann
2010-05-01
To achieve public awareness and thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public in a form that allows climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems to be a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics and too narrow uncertainty ranges will judge little or no area as having similar climate, while too few indicators and too wide uncertainty ranges will mark overly large regions as climatically similar, which may not be correct. Similarity cannot be explored simply by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, such as maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved.
For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague to conduct a useful Climate Twins region search. The Climate Twins tool currently works by comparing future climate conditions of a certain source area in the Greater Alpine Region with current climate conditions of entire Europe and the neighbouring southern and south-eastern areas as target regions. A next version will integrate web-crawling features for searching information about climate-related local adaptations observed today in the target region, which may turn out to be appropriate solutions for the source region under future climate conditions. The contribution will present the tool's current functionality and will discuss which indicator sets, similarity conditions and uncertainty ranges work best to deliver scientifically sound climate comparisons and distinct mapping results.
Entropic uncertainty relations for multiple measurements
NASA Astrophysics Data System (ADS)
Liu, Shang; Mu, Liang-Zhu; Fan, Heng
2015-04-01
We present entropic uncertainty relations for multiple measurement settings, which demonstrate the uncertainty principle of quantum mechanics. These uncertainty relations are obtained for cases both with and without the presence of quantum memory, and can be proven by a unified method. Our results recover some well-known entropic uncertainty relations for two observables, which show the uncertainties about the outcomes of two incompatible measurements. The bounds of these relations, which quantify the extent of the uncertainty, take concise forms and are easy to calculate. These uncertainty relations might play important roles in the foundations of quantum theory. Potential experimental demonstration of these entropic uncertainty relations is discussed.
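As a concrete instance of such an easily calculated bound, the sketch below checks the classic two-measurement, no-memory Maassen-Uffink relation H(X) + H(Z) >= -log2 max |<x|z>|^2 for a qubit measured in the computational and Hadamard bases; the particular state is an arbitrary illustrative choice.

```python
import numpy as np

def shannon(p):
    # Shannon entropy in bits, ignoring zero-probability outcomes
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Two measurement bases on a qubit: columns are the basis vectors
Zb = np.eye(2, dtype=complex)                                 # computational basis
Xb = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard basis

psi = np.array([np.cos(0.3), np.sin(0.3)], dtype=complex)     # illustrative pure state

pZ = np.abs(Zb.conj().T @ psi) ** 2   # outcome distribution for the Z measurement
pX = np.abs(Xb.conj().T @ psi) ** 2   # outcome distribution for the X measurement

# Maassen-Uffink bound: H(Z) + H(X) >= -log2 max_{i,j} |<z_i|x_j>|^2
c = np.max(np.abs(Zb.conj().T @ Xb) ** 2)
bound = -np.log2(c)
print(shannon(pZ) + shannon(pX) >= bound - 1e-9)  # True
```

For these two mutually unbiased bases the overlap c is 1/2, so the bound is exactly one bit of joint uncertainty, regardless of the state.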
Uncertainty in Computational Aerodynamics
NASA Technical Reports Server (NTRS)
Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.
2003-01-01
An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.
Uncertainty in QSAR predictions.
Sahlin, Ullrika
2013-03-01
It is relevant to consider uncertainty in individual predictions when quantitative structure-activity (or property) relationships (QSARs) are used to support decisions of high societal concern. Successful communication of uncertainty in the integration of QSARs in chemical safety assessment under the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system can be facilitated by a common understanding of how to define, characterise, assess and evaluate uncertainty in QSAR predictions. A QSAR prediction is, compared to experimental estimates, subject to added uncertainty that comes from the use of a model instead of empirically-based estimates. A framework is provided to aid the distinction between different types of uncertainty in a QSAR prediction: quantitative, i.e. for regressions related to the error in a prediction and characterised by a predictive distribution; and qualitative, by expressing our confidence in the model for predicting a particular compound based on a quantitative measure of predictive reliability. It is possible to assess a quantitative (i.e. probabilistic) predictive distribution, given the supervised learning algorithm, the underlying QSAR data, a probability model for uncertainty and a statistical principle for inference. The integration of QSARs into risk assessment may be facilitated by the inclusion of the assessment of predictive error and predictive reliability into the "unambiguous algorithm", as outlined in the second OECD principle. PMID:23614548
NASA Astrophysics Data System (ADS)
Mc Leod, Roger David; Mc Leod, David M.
2007-10-01
Vision, via transform space: "Nature behaves in a reciprocal way;" also, Rect x pressure-input sense-reports as Sinc p, indicating the brain interprets reciprocal "p" space as object space. Use Mott's and Sneddon's Wave Mechanics and Its Applications. Wave transformation functions are strings of positron, electron, proton, and neutron; uncertainty is a semantic artifact. Neutrino-string de Broglie-Schrödinger wave-function models for the electron and positron suggest three-quark models for protons and neutrons. Variably vibrating neutrino-quills of this model, with appropriate mass-energy, can be a vertical proton string, quills leftward; thread the string circumferentially, forming three interlinked circles with "overpasses". Diameters are 2:1:2; the center circle has quills radially outward; call it a down quark, charge -1/3, with charge 2/3 for outward quills, the up quarks of the outer circles. String overlap summations are nodes; nodes also occur far left and right. Strong nuclear forces may be -px. "Dislodging" a positron with a neutrino switches the quark-circle configuration to 1:2:1, 'downers' outside. Unstable neutron charge is 0. Atoms build. With scale factors, retinal/vision's and quantum mechanics' spatial Fourier transforms/inverses are equivalent.
Bartley, David; Lidén, Göran
2008-08-01
The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined componentwise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can be often easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning to this uncertainty if needed can sometimes be developed in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. Without a measureless and perpetual uncertainty, the drama of human life would be destroyed. Winston Churchill. PMID:18573808
Schapira, Lidia
2014-03-01
Uncertainty is triggered by many events during the experience of illness - from hearing bad news to meeting a new doctor. Oncology professionals need to recognize the intense feelings associated with uncertainty and respond empathically to patients. This article describes opportunities to strengthen the therapeutic connection and minimize uncertainty. PMID:24337763
NASA Astrophysics Data System (ADS)
Koch, Michael
Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
Nab: Measurement Principles, Apparatus and Uncertainties
Pocanic, Dinko [University of Virginia; Bowman, James D [ORNL; Cianciolo, Vince [ORNL; Greene, Geoffrey [University of Tennessee, Knoxville (UTK); Grzywacz, Robert [University of Tennessee, Knoxville (UTK); Penttila, Seppo [Oak Ridge National Laboratory (ORNL); Rykaczewski, Krzysztof Piotr [ORNL; Young, Glenn R [ORNL; The, Nab [Collaboration affiliations
2009-01-01
The Nab collaboration will perform a precise measurement of a, the electron-neutrino correlation parameter, and b, the Fierz interference term in neutron beta decay, at the Fundamental Neutron Physics Beamline at the SNS, using a novel electric/magnetic field spectrometer and detector design. The experiment is aiming at the 10^-3 accuracy level in Δa/a, and will provide an independent measurement of λ = G_A/G_V, the ratio of axial-vector to vector coupling constants of the nucleon. Nab also plans to perform the first ever measurement of b in neutron decay, which will provide an independent limit on the tensor weak coupling.
Uncertainty principles and ideal atomic decomposition
David L. Donoho; Xiaoming Huo
2001-01-01
Suppose a discrete-time signal S(t), 0⩽t
Uncertainty principle with quantum Fisher information
Attila Andai
2007-10-11
In this paper we prove a nontrivial lower bound for the determinant of the covariance matrix of quantum mechanical observables, which was conjectured by Gibilisco, Isola and Imparato. The lower bound is given in terms of the commutator of the state and the observables and their scalar product, which is generated by an arbitrary symmetric operator monotone function.
An Uncertainty Principle for Fluxes Gregory Moore
Gustafsson, Torgny
Flux strengths are differential forms F ∈ Ω(M). More technically, the result applies to the class of "generalized differential characters", whose group has the form H = T × … × V, with T a connected torus of topologically trivial flat fields. Based on hep-th/0605200 with D. Freed and G. Segal.
Rényi entropy uncertainty relation for successive projective measurements
NASA Astrophysics Data System (ADS)
Zhang, Jun; Zhang, Yang; Yu, Chang-shui
2015-06-01
We investigate the uncertainty principle for two successive projective measurements in terms of the Rényi entropy, based on a single quantum system. Our results cover a large family of entropic uncertainty relations (including the Shannon entropy) with an optimal lower bound. We compare our relation with other formulations of the uncertainty principle for two spin observables measured on a pure qubit state. It is shown that the lower bound of our uncertainty relation is tighter.
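For reference, the Rényi entropy family used in such relations is straightforward to compute. Below is a minimal sketch (the distribution and the orders α are illustrative) showing that the α → 1 limit recovers the Shannon entropy.

```python
import numpy as np

def renyi(p, alpha):
    # Renyi entropy of order alpha, in bits; alpha -> 1 recovers Shannon
    p = np.asarray(p, dtype=float)
    if abs(alpha - 1.0) < 1e-9:
        q = p[p > 0]
        return float(-np.sum(q * np.log2(q)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

p = [0.5, 0.25, 0.25]
print(renyi(p, 2))      # collision entropy, ~1.415 bits
print(renyi(p, 0.999))  # approaches the Shannon entropy, 1.5 bits
```

Because the Rényi entropy is non-increasing in α, uncertainty relations stated for a whole range of orders, as in the abstract above, interpolate between different operational notions of unpredictability.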
Reformulating the Quantum Uncertainty Relation
Li, Jun-Li; Qiao, Cong-Feng
2015-01-01
The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropic form, related to entropic quantities. Both forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may yield complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197
NSDL National Science Digital Library
Moore, P.G.
This article, authored by P.G. Moore for the Royal Statistical Society's website, provides well-defined exercises to assess the probabilities of decision-making and the degree of uncertainty. The author states the focus of the article as: "When analyzing situations which involve decisions to be made as between alternative courses of action under conditions of uncertainty, decision makers and their advisers are often called upon to assess judgmental probability distributions of quantities whose true values are unknown to them. How can this judgment be taught?" Moore provides five different exercises and even external reference for those interested in further study of the topic.
Generalized Entropic Uncertainty Relations with Tsallis' Entropy
NASA Technical Reports Server (NTRS)
Portesi, M.; Plastino, A.
1996-01-01
A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
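The q-entropies underlying this generalization are simple to evaluate. A minimal sketch follows (the distribution and q values are illustrative), showing the Tsallis entropy and its q → 1 limit, which recovers the Shannon entropy in nats.

```python
import numpy as np

def tsallis(p, q):
    # Tsallis q-entropy; the q -> 1 limit is the Shannon entropy in nats
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-9:
        pp = p[p > 0]
        return float(-np.sum(pp * np.log(pp)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.5, 0.5]
print(tsallis(p, 2.0))  # 0.5
print(tsallis(p, 1.0))  # ln 2, about 0.693
```

Varying q reweights rare versus common outcomes, which is what makes the q-entropies a flexible basis for generalized uncertainty measures such as those applied to the phase and number operators above.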
ERIC Educational Resources Information Center
Duerdoth, Ian
2009-01-01
The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…
Uncertainty in Visual Processes Predicts Geometrical Optical Illusions
Daume III, Hal
Uncertainty in Visual Processes Predicts Geometrical Optical Illusions (Cornelia Fermüller et al.) The central claim is that there is a general uncertainty principle which governs the workings of vision systems, and that optical illusions are an artifact of this principle. Key words: optical illusions, motion perception, bias, estimation processes.
Holographic position uncertainty and the quantum-classical transition
Herzenberg, C L
2010-01-01
Arguments based on general principles of quantum mechanics have suggested that a minimum length associated with Planck-scale unification may in the context of the holographic principle entail a new kind of observable uncertainty in the transverse position of macroscopically separated objects. Here, we address potential implications of such a position uncertainty for establishing an additional threshold between quantum and classical behavior.
The link between entropic uncertainty and nonlocality
NASA Astrophysics Data System (ADS)
Tomamichel, Marco; Hänggi, Esther
2013-02-01
Two of the most intriguing features of quantum physics are the uncertainty principle and the occurrence of nonlocal correlations. The uncertainty principle states that there exist pairs of incompatible measurements on quantum systems such that their outcomes cannot both be predicted. On the other hand, nonlocal correlations of measurement outcomes at different locations cannot be explained by classical physics, but appear in the presence of entanglement. Here, we show that these two fundamental quantum effects are quantitatively related. Namely, we provide an entropic uncertainty relation for the outcomes of two binary measurements, where the lower bound on the uncertainty is quantified in terms of the maximum Clauser-Horne-Shimony-Holt value that can be achieved with these measurements. We discuss applications of this uncertainty relation in quantum cryptography, in particular, to certify quantum sources using untrusted devices.
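The maximum CHSH value that anchors this uncertainty bound is easy to reproduce numerically. The sketch below (with the standard optimal measurement angles, chosen here for illustration) evaluates the CHSH combination on a two-qubit singlet state and recovers the Tsirelson bound 2√2.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def obs(theta):
    # Spin observable along angle theta in the x-z plane of the Bloch sphere
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state |psi-> = (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def E(ta, tb):
    # Correlator <A(ta) x B(tb)> in the singlet state
    return np.real(np.trace(rho @ np.kron(obs(ta), obs(tb))))

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(abs(S))  # 2*sqrt(2), about 2.828: the Tsirelson bound
```

Any local-hidden-variable model caps |S| at 2, so a measured value near 2√2 certifies nonlocality and, via the relation in the abstract, lower-bounds the entropic uncertainty of the corresponding binary measurements.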
Principles of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Landé, Alfred
2013-10-01
Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ψ(x) and φ(p); 11. Complementarity; 12. Mathematical relation between ψ(x) and φ(p) for free particles; 13. General relation between ψ(q) and φ(p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ψ(t) and φ(ε); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between Δx and Δp; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for ψp(q) and χq(p); 39. Differential equation for ψE(q); 40. The general probability amplitude ψQ′(Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schrödinger's equation for non-conservative systems; 46. Perturbation theory; 47. Orthogonality, normalization and Hermitian conjugacy; 48. General matrix elements; Part IV. The Principle of Correspondence: 49. Contact transformations in classical mechanics; 50. Point transformations; 51. Contact transformations in quantum mechanics; 52. Constants of motion and angular co-ordinates; 53. Periodic orbits; 54. De Broglie and Schrödinger function; correspondence to classical mechanics; 55. Packets of probability; 56. Correspondence to hydrodynamics; 57. Motion and scattering of wave packets; 58. Formal correspondence between classical and quantum mechanics; Part V. Mathematical Appendix: Principle of Invariance: 59. The general theorem of transformation; 60. Operator calculus; 61. Exchange relations; three criteria for conjugacy; 62. First method of canonical transformation; 63. Second method of canonical transformation; 64. Proof of the transformation theorem; 65. Invariance of the matrix elements against unitary transformations; 66. Matrix mechanics; Index of literature; Index of names and subjects.
Abolishing the maximum tension principle
NASA Astrophysics Data System (ADS)
Dąbrowski, Mariusz P.; Gohar, H.
2015-09-01
We find a series of example theories for which the relativistic limit of maximum tension Fmax = c^4/(4G), represented by the entropic force, can be abolished. Among them are varying-constants theories, some generalized entropy models applied to both cosmological and black-hole horizons, and some generalized uncertainty principle models.
Calculating Measurement Uncertainties for Mass Spectrometry Data
NASA Astrophysics Data System (ADS)
Essex, R. M.; Goldberg, S. A.
2006-12-01
A complete and transparent characterization of measurement uncertainty is fundamentally important to the interpretation of analytical results. We have observed that the calculation and reporting of uncertainty estimates for isotopic measurement from a variety of analytical facilities are inconsistent, making it difficult to compare and evaluate data. Therefore, we recommend an approach to uncertainty estimation that has been adopted by both US national metrology facilities and is becoming widely accepted within the analytical community. This approach is outlined in the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM). The GUM approach to uncertainty estimation includes four major steps: 1) Specify the measurand; 2) Identify uncertainty sources; 3) Quantify components by determining the standard uncertainty (u) for each component; and 4) Calculate combined standard uncertainty (u_c) by using established propagation laws to combine the various components. To obtain a desired confidence level, the combined standard uncertainty is multiplied by a coverage factor (k) to yield an expanded uncertainty (U). To be consistent with the GUM principles, it is also necessary to create an uncertainty budget, which is a listing of all the components comprising the uncertainty and their relative contribution to the combined standard uncertainty. In mass spectrometry, Step 1 is normally the determination of an isotopic ratio for a particular element. Step 2 requires the identification of the many potential sources of measurement variability and bias including: gain, baseline, cup efficiency, Schottky noise, counting statistics, CRM uncertainties, yield calibrations, linearity calibrations, run conditions, and filament geometry. Then an equation expressing the relationship of all of the components to the measurement value must be written. To complete Step 3, these potential sources of uncertainty must be characterized (Type A or Type B) and quantified.
This information is often readily available (e.g., CRM Certificate Values) but for some variables it may not be possible or practical to quantify uncertainty (e.g., filament geometry effects). Therefore, to complete step 4 and calculate a combined standard uncertainty, an approach that confounds many of the potential sources of measurement uncertainty may be required. The uncertainty calculated using the GUM approach should be reported in an uncertainty budget, which is generated using the equation created in Step 2 and evaluating each uncertainty component by either an analytical method or a numerical approach.
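The four GUM steps summarized in this abstract lend themselves to a short numerical sketch. The uncertainty budget below is purely illustrative (hypothetical component values and unit sensitivity coefficients), not data from the cited work:

```python
import math

# Hypothetical uncertainty budget for an isotope-ratio measurement.
# Each entry: (component name, standard uncertainty u_i as a relative value,
# sensitivity coefficient c_i). All values are illustrative assumptions.
budget = [
    ("counting statistics", 2.0e-4, 1.0),   # Type A
    ("baseline",            5.0e-5, 1.0),   # Type B
    ("gain calibration",    1.5e-4, 1.0),   # Type B
    ("CRM certificate",     1.0e-4, 1.0),   # Type B
]

# Step 4: combine components in quadrature (uncorrelated inputs assumed).
u_c = math.sqrt(sum((c * u) ** 2 for _, u, c in budget))

# Expanded uncertainty at ~95 % confidence with coverage factor k = 2.
k = 2
U = k * u_c

# The per-component percentage contributions form the uncertainty budget.
for name, u, c in budget:
    print(f"{name:20s} u = {u:.1e}  contribution = {100 * (c * u) ** 2 / u_c ** 2:.1f} %")
print(f"combined standard uncertainty u_c = {u_c:.2e}")
print(f"expanded uncertainty U (k={k})    = {U:.2e}")
```

The printed budget directly identifies the dominant component (here, counting statistics), which is the practical point of listing relative contributions.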
Buridan's principle
NASA Astrophysics Data System (ADS)
Lamport, Leslie
2012-08-01
Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so that they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.
The physical origins of the uncertainty theorem
NASA Astrophysics Data System (ADS)
Giese, Albrecht
2013-10-01
The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.
Time Crystals from Minimum Time Uncertainty
Mir Faizal; Mohammed M. Khalil; Saurya Das
2014-12-29
Motivated by the Generalized Uncertainty Principle, covariance, and a minimum measurable time, we propose a deformation of the Heisenberg algebra, and show that this leads to corrections to all quantum mechanical systems. We also demonstrate that such a deformation implies a discrete spectrum for time. In other words, time behaves like a crystal.
Uncertainty relation for mutual information
NASA Astrophysics Data System (ADS)
Schneeloch, James; Broadbent, Curtis J.; Howell, John C.
2014-12-01
We postulate the existence of a universal uncertainty relation between the quantum and classical mutual informations between pairs of quantum systems. Specifically, we propose that the sum of the classical mutual information, determined by two mutually unbiased pairs of observables, never exceeds the quantum mutual information. We call this the complementary-quantum correlation (CQC) relation and prove its validity for pure states, for states with one maximally mixed subsystem, and for all states when one measurement is minimally disturbing. We provide results of a Monte Carlo simulation suggesting that the CQC relation is generally valid. Importantly, we also show that the CQC relation represents an improvement to an entropic uncertainty principle in the presence of a quantum memory, and that it can be used to verify an achievable secret key rate in the quantum one-time pad cryptographic protocol.
Role of information theoretic uncertainty relations in quantum theory
NASA Astrophysics Data System (ADS)
Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo
2015-04-01
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson-Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson-Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
Environmental Modeling: Coping with Uncertainty
Politècnica de Catalunya, Universitat
Division of Mathematical Modeling and Analysis Group, Los Alamos National Laboratory; C. Larry Winter (NCAR). Uncertainty in environmental modeling: uncertainty in physical processes, computational uncertainty, model and parameter uncertainty. 1. Quantitative modeling of environmental processes; 2. Parametric uncertainty; 3. Current approaches to uncertainty.
Cleghorn, R. A.
1965-01-01
There are four lines of development that might be called psychosomatic principles. The first represents the work initiated by Claude Bernard, Cannon, and others, in neurophysiology and endocrinology in relationship to stress. The second is the application of psychoanalytic formulations to the understanding of illness. The third is in the development of the social sciences, particularly anthropology, social psychology and sociology with respect to the emotional life of man, and, fourth, there is an increased application of epidemiological techniques to the understanding and incidence of disease and its causes. These principles can be applied to the concepts of comprehensive medicine and they bid fair to be unifying and helpful in its study. This means that future practitioners, as well as those working in the field of psychosomatic medicine, are going to have to have a much more precise knowledge of the influence of emotions on bodily processes. PMID:14259334
Uncertainty in hydrological signatures
NASA Astrophysics Data System (ADS)
McMillan, Hilary; Westerberg, Ida
2015-04-01
Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as an index value from observed data, is known as a hydrological signature; signatures can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve.
The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
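The Monte Carlo approach described in this abstract can be sketched as follows. The flow record, the error magnitudes (a 5% systematic rating-curve error and a 2% per-day measurement error) and the chosen signature (mean discharge) are all illustrative assumptions, not the study's data:

```python
import random
import statistics

random.seed(1)

# Hypothetical daily discharge record (m^3/s); values are illustrative.
flow = [2.0 + 1.5 * abs(random.gauss(0, 1)) for _ in range(365)]

def signature(q):
    """Example hydrological signature: mean discharge."""
    return statistics.mean(q)

# Monte Carlo sampling of observational uncertainty: each realization
# perturbs the record with a multiplicative rating-curve error (assumed
# ~N(1, 0.05), systematic within a realization) and an independent
# per-day measurement error (assumed ~N(1, 0.02)).
n_realizations = 2000
samples = []
for _ in range(n_realizations):
    rating_err = random.gauss(1.0, 0.05)
    perturbed = [q * rating_err * random.gauss(1.0, 0.02) for q in flow]
    samples.append(signature(perturbed))

# Summarize the signature uncertainty as a 95 % interval.
samples.sort()
lo = samples[int(0.025 * n_realizations)]
hi = samples[int(0.975 * n_realizations)]
mid = signature(flow)
print(f"mean discharge: {mid:.2f}, 95% interval: [{lo:.2f}, {hi:.2f}] "
      f"(~±{100 * (hi - lo) / (2 * mid):.0f}%)")
```

The same loop works for any signature by swapping the `signature` function, which is what makes the Monte Carlo method generally applicable.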
Operational definitions of uncertainty
Edelgard Hund; D. Luc Massart; Johanna Smeyers-Verbeke
2001-01-01
Very different approaches for the estimation of the uncertainty related to measurement results are found in the literature and in published guidelines. This article analyses and compares them. It is clear that `uncertainty' is not, and should not be, the same in all situations. As a consequence, operational definitions of uncertainty are proposed that take into account the differences in
Direct Aerosol Forcing Uncertainty
Mccomiskey, Allison
2008-01-15
Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
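The abstract computes each uncertainty contribution as the product of a sensitivity and a measurement uncertainty. A minimal sketch of that calculation, using a deliberately crude toy forcing function and assumed measurement uncertainties rather than the study's radiative transfer models:

```python
def drf(aod, ssa, g, albedo):
    """Toy direct radiative forcing (W m^-2).

    A crude illustrative parameterization, NOT a radiative transfer model:
    scattering cools over dark surfaces, absorption warms.
    """
    return -25.0 * aod * ssa * (1 - albedo) + 8.0 * aod * (1 - ssa)

# Baseline aerosol/surface properties and assumed 1-sigma measurement
# uncertainties (all values are hypothetical).
base = dict(aod=0.15, ssa=0.92, g=0.65, albedo=0.15)
meas_u = dict(aod=0.01, ssa=0.03, g=0.02, albedo=0.02)

def sensitivity(param, h=1e-4):
    """Change in DRF per unit change in one property (central difference)."""
    up = dict(base); up[param] += h
    dn = dict(base); dn[param] -= h
    return (drf(**up) - drf(**dn)) / (2 * h)

# Uncertainty contribution of each property = (sensitivity * u)^2;
# total uncertainty combines the contributions in quadrature.
contrib = {p: (sensitivity(p) * u) ** 2 for p, u in meas_u.items()}
total_u = sum(contrib.values()) ** 0.5

for p, c in sorted(contrib.items(), key=lambda kv: -kv[1]):
    print(f"{p:7s} contribution = {c:.4f} (W m^-2)^2")
print(f"total DRF uncertainty = {total_u:.2f} W m^-2")
```

Ranking the contributions identifies which property most limits accuracy, mirroring the paper's use of sensitivities to target measurement improvements.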
Uncertainty and Methods for Dealing with Uncertainty.
Wiederhold, Gio
"…infallible, it cannot also be intelligent." (Alan Turing, 1946; A. Hodges: Alan Turing: The Enigma, p. 361) …to use uncertainty methods in their work, and test and assess alternative formulations. 1. Generalities
Quantum Uncertainty and Error-Disturbance Tradeoff
Yu-Xiang Zhang; Shengjun Wu; Zeng-Bing Chen
2014-11-03
The uncertainty principle is often interpreted as a tradeoff between the error of a measurement and the consequent disturbance to subsequent ones, an idea that originated with Heisenberg himself but has recently come under reexamination and even heated debate. Here we show that the tradeoff is switched on or off by the quantum uncertainties of the two non-commuting observables involved: if one is more certain than the other, there is no tradeoff; otherwise, there is a tradeoff, and the Jensen-Shannon divergence characterizes it well.
Uncertainty in audiometer calibration
NASA Astrophysics Data System (ADS)
Aurélio Pedroso, Marcos; Gerges, Samir N. Y.; Gonçalves, Armando A., Jr.
2004-02-01
The objective of this work is to present a metrology study necessary for the accreditation of audiometer calibration procedures at the National Brazilian Institute of Metrology Standardization and Industrial Quality—INMETRO. A model for the calculation of measurement uncertainty was developed. Metrological aspects relating to audiometer calibration, traceability and measurement uncertainty were quantified through comparison between results obtained at the Industrial Noise Laboratory—LARI of the Federal University of Santa Catarina—UFSC and the Laboratory of Electric/acoustics—LAETA of INMETRO. Similar metrological performance of the measurement system used in both laboratories was obtained, indicating that the interlaboratory results are compatible with the expected values. The uncertainty calculation was based on the documents: EA-4/02 Expression of the Uncertainty of Measurement in Calibration (European Co-operation for Accreditation 1999 EA-4/02 p 79) and Guide to the Expression of Uncertainty in Measurement (International Organization for Standardization 1993 1st edn, corrected and reprinted in 1995, Geneva, Switzerland). Some sources of uncertainty were calculated theoretically (uncertainty type B) and other sources were measured experimentally (uncertainty type A). The global value of uncertainty calculated for the sound pressure levels (SPLs) is similar to that given by other calibration institutions. The results of uncertainty related to measurements of SPL were compared with the maximum uncertainties Umax given in the standard IEC 60645-1: 2001 (International Electrotechnical Commission 2001 IEC 60645-1 Electroacoustics—Audiological Equipment—Part 1:—Pure-Tone Audiometers).
Uncertainties in Gapped Graphene
Eylee Jung; Kwang S. Kim; DaeKil Park
2012-03-20
Motivated by graphene-based quantum computers, we examine the time dependence of the position-momentum and position-velocity uncertainties in monolayer gapped graphene. The effect of the energy gap on the uncertainties is shown to appear via the Compton-like wavelength $\lambda_c$. The uncertainties in graphene are mainly governed by two phenomena, spreading and zitterbewegung. While the former determines the uncertainties at long times, the latter imposes rapid oscillations on them at short times. The uncertainties in graphene are compared with the corresponding values for the usual free Hamiltonian $\hat{H}_{free} = (p_1^2 + p_2^2) / 2M$. It is shown that the uncertainties can be kept under control within quantum-mechanical law if one can choose the gap parameter $\lambda_c$ freely.
Physics and Operational Research: measure of uncertainty via Nonlinear Programming
NASA Astrophysics Data System (ADS)
Davizon-Castillo, Yasser A.
2008-03-01
Physics and Operational Research present an interdisciplinary interaction in problems arising in Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming) subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case of study as a novel way to quantify uncertainty.
Eldar, Yonina
…and frequency. This basic principle was then extended by Landau, Pollak, Slepian, and later Donoho and Stark [4]. The uncertainty principle has deep philosophical interpretations. For example, in the context of quantum mechanics…
Uncertainty Analysis of Thermal Comfort Parameters
NASA Astrophysics Data System (ADS)
Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages
2015-07-01
International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and tabulated values given to limited numerical accuracy. Knowledge of these uncertainties are invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
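Because PPD depends only on PMV, measurement uncertainty in PMV propagates directly to PPD. A minimal sketch using the PPD formula of ISO 7730 and an assumed PMV value and uncertainty (both illustrative):

```python
import math

def ppd(pmv):
    """Predicted percentage dissatisfied as a function of PMV (ISO 7730)."""
    return 100 - 95 * math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)

# Hypothetical measured PMV and its standard uncertainty (illustrative values).
pmv, u_pmv = 0.5, 0.1

# First-order propagation: u_PPD ≈ |dPPD/dPMV| * u_PMV,
# with the derivative taken by central finite difference.
h = 1e-6
dppd = (ppd(pmv + h) - ppd(pmv - h)) / (2 * h)
u_ppd = abs(dppd) * u_pmv

print(f"PPD = {ppd(pmv):.1f} %, standard uncertainty ≈ {u_ppd:.1f} %")
```

The propagation of uncertainty in the PMV inputs themselves (air temperature, humidity, air speed, clothing, metabolic rate) follows the same first-order pattern, one sensitivity per input.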
Beier, Meghan L.
2015-01-01
Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700
Uncertainty in hydrological signatures
NASA Astrophysics Data System (ADS)
Westerberg, I. K.; McMillan, H. K.
2015-04-01
Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, including for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
Information-Disturbance theorem and Uncertainty Relation
Takayuki Miyadera; Hideki Imai
2007-07-31
It has been shown that the Information-Disturbance theorem can play an important role in security proofs of quantum cryptography. The theorem is interesting in itself, since it can be regarded as an information-theoretic version of the uncertainty principle. Until now, however, it has treated only restricted situations. In this paper, the restriction on the source is lifted, and a general information-disturbance theorem is obtained. The theorem relates information gain by Eve to information gain by Bob.
Uncertainty of decibel levels.
Taraldsen, Gunnar; Berge, Truls; Haukland, Frode; Lindqvist, Bo Henry; Jonasson, Hans
2015-09-01
The mean sound exposure level from a source is routinely estimated by the mean of the observed sound exposures from repeated measurements. A formula for the standard uncertainty based on the Guide to the Expression of Uncertainty in Measurement (GUM) is derived. An alternative formula is derived for the case where the GUM method fails. The formulas are applied to several examples and compared with a Monte Carlo calculation of the standard uncertainty. The recommended formula can be seen simply as a convenient translation of the uncertainty on an energy scale into the decibel level scale, but with a theoretical foundation. PMID:26428824
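The "translation of uncertainty on an energy scale into the decibel scale" mentioned in this abstract can be sketched as follows; the measured levels are hypothetical, and the final step is the standard first-order (delta-method) conversion:

```python
import math
import statistics

# Hypothetical repeated sound exposure level measurements (dB); illustrative.
levels_db = [78.2, 77.9, 78.5, 78.1, 77.8, 78.4, 78.0, 78.3]

# Convert to the energy scale, average, and estimate uncertainty there.
energies = [10 ** (L / 10) for L in levels_db]
mean_e = statistics.mean(energies)
u_e = statistics.stdev(energies) / math.sqrt(len(energies))  # std of the mean

# Translate the mean level and its uncertainty back to decibels.
# Delta method: u_dB ≈ (10 / ln 10) * u_E / E ≈ 4.34 * (relative energy uncertainty).
mean_db = 10 * math.log10(mean_e)
u_db = 10 / math.log(10) * u_e / mean_e

print(f"mean level = {mean_db:.2f} dB, standard uncertainty ≈ {u_db:.3f} dB")
```

Averaging on the energy scale (rather than averaging the dB values directly) matches how sound exposures combine physically, which is why the uncertainty is naturally formed there and only then mapped to decibels.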
Adaptive framework for uncertainty analysis in electromagnetic field measurements.
Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano
2015-04-01
Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has internationally been adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28% in measurement uncertainty. PMID:25143178
Controlling entropic uncertainty bound through memory effects
Göktuğ Karpat; Jyrki Piilo; Sabrina Maniscalco
2015-04-09
One of the defining traits of quantum mechanics is the uncertainty principle which was originally expressed in terms of the standard deviation of two observables. Alternatively, it can be formulated using entropic measures, and can also be generalized by including a memory particle that is entangled with the particle to be measured. Here we consider a realistic scenario where the memory particle is an open system interacting with an external environment. Through the relation of conditional entropy to mutual information, we provide a link between memory effects and the rate of change of conditional entropy controlling the lower bound of the entropic uncertainty relation. Our treatment reveals that the memory effects stemming from the non-Markovian nature of quantum dynamical maps directly control the lower bound of the entropic uncertainty relation in a general way, independently of the specific type of interaction between the memory particle and its environment.
Uncertainty in quantum mechanics: faith or fantasy?
Penrose, Roger
2011-12-13
The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications. PMID:22042902
Controlling entropic uncertainty bound through memory effects
NASA Astrophysics Data System (ADS)
Karpat, Göktu?; Piilo, Jyrki; Maniscalco, Sabrina
2015-09-01
One of the defining traits of quantum mechanics is the uncertainty principle which was originally expressed in terms of the standard deviation of two observables. Alternatively, it can be formulated using entropic measures, and can also be generalized by including a memory particle that is entangled with the particle to be measured. Here we consider a realistic scenario where the memory particle is an open system interacting with an external environment. Through the relation of conditional entropy to mutual information, we provide a link between memory effects and the rate of change of conditional entropy controlling the lower bound of the entropic uncertainty relation. Our treatment reveals that the memory effects stemming from the non-Markovian nature of quantum dynamical maps directly control the lower bound of the entropic uncertainty relation in a general way, independently of the specific type of interaction between the memory particle and its environment.
MOUSE UNCERTAINTY ANALYSIS SYSTEM
The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...
Uncertainty Analysis Economic Evaluations
Bhulai, Sandjai
uncertainties in typical oil and gas projects: 1. the oil price, 2. the investments (capex) and operating ... Large companies like Shell often deal with big projects ... the influence of these uncertainties on the economic indicators. Economic evaluations in the oil industry ...
Electoral Knowledge and Uncertainty.
ERIC Educational Resources Information Center
Blood, R. Warwick; And Others
Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…
NASA Astrophysics Data System (ADS)
Warner, Scott; Smith, Barton
2013-11-01
The random uncertainty of 2-component (2C) Particle Image Velocimetry (PIV) has recently been addressed by three distinct methods: the Uncertainty Surface Method (USM) from Utah State University, the Image Matching (IM) method from LaVision and Delft, and the correlation Signal-to-Noise Ratio (SNR) methods from Virginia Tech. Since 3C (stereo) PIV velocity fields are derived from two 2C fields, random uncertainties from the 2C fields clearly propagate into the 3C field. In this work, we will demonstrate such a propagation using commercial PIV software and the USM method, although the propagation works similarly for any 2C random uncertainty method. Stereo calibration information is needed to perform this propagation. As a starting point, a pair of 2C uncertainty fields will be combined in exactly the same manner as velocity fields to form a 3C uncertainty field using commercial software. Correlated uncertainties between the components in the two 2C fields will be addressed. These results will then be compared to a more rigorous propagation, which requires access to the calibration information.
Thanks to the Nuclear Science & Technology Directorate at Idaho National Laboratory. The work was supported through the U.S. Department of Energy, Laboratory Directed Research & Development grant under DOE Contract 122440 (Project Number: 12-045).
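The propagation step described in this abstract can be sketched with first-order (GUM-style) variance propagation through a linear stereo reconstruction; the coefficients a1, a2 below stand in for the stereo-calibration mapping and are hypothetical, and this is a generic sketch, not the USM method itself:

```python
import math

def propagate_3c_uncertainty(a1, a2, sigma1, sigma2, rho=0.0):
    """Propagate the random uncertainties of two 2C PIV measurements into a
    reconstructed 3C component that is a linear combination
        u3 = a1*u1 + a2*u2,
    where a1, a2 come from the stereo calibration (hypothetical values here).
    First-order variance propagation, including a correlation term rho
    between the two 2C uncertainties."""
    var = (a1 * sigma1) ** 2 + (a2 * sigma2) ** 2 \
        + 2.0 * a1 * a2 * rho * sigma1 * sigma2
    return math.sqrt(var)

# Uncorrelated cameras: quadrature sum of the weighted 2C uncertainties.
print(propagate_3c_uncertainty(0.6, 0.8, 0.1, 0.1))
# Fully correlated 2C errors inflate the combined 3C uncertainty.
print(propagate_3c_uncertainty(0.6, 0.8, 0.1, 0.1, rho=1.0))
```

The correlation term is the point the abstract flags: combining the two 2C uncertainty fields "in exactly the same manner as velocity fields" implicitly assumes rho = 0.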
Phase-space noncommutative formulation of Ozawa's uncertainty principle
NASA Astrophysics Data System (ADS)
Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno
2014-08-01
Ozawa's measurement-disturbance relation is generalized to a phase-space noncommutative extension of quantum mechanics. It is shown that the measurement-disturbance relations have additional terms for backaction evading quadrature amplifiers and for noiseless quadrature transducers. Several distinctive features appear as a consequence of the noncommutative extension: measurement interactions which are noiseless, and observables which are undisturbed by a measurement, or of independent intervention in ordinary quantum mechanics, may acquire noise, become disturbed by the measurement, or no longer be an independent intervention in noncommutative quantum mechanics. It is also found that there can be states which violate Ozawa's universal noise-disturbance trade-off relation, but verify its noncommutative deformation.
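For context, Ozawa's universal noise-disturbance trade-off that the paper deforms reads, in its standard form, with epsilon(A) the measurement noise on A, eta(B) the disturbance on B, and sigma the premeasurement standard deviations:

```latex
\epsilon(A)\,\eta(B) + \epsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
\;\ge\; \tfrac{1}{2}\,\bigl| \langle [A,B] \rangle \bigr| .
```

The first product alone is the naive Heisenberg-type bound; the two extra cross terms are what make the relation universally valid.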
A no-pure-boost uncertainty principle from spacetime noncommutativity
Giovanni Amelino-Camelia; Giulia Gubitosi; Antonino Marcianò; Pierre Martinetti; Flavio Mercati
2007-07-12
We study boost and space-rotation transformations in kappa-Minkowski noncommutative spacetime, using the techniques that some of us had previously developed (hep-th/0607221) for a description of translations in kappa-Minkowski, which in particular led to the introduction of translation transformation parameters that do not commute with the spacetime coordinates. We find a similar description of boosts and space rotations, which allows us to identify some associated conserved charges, but the form of the commutators between transformation parameters and spacetime coordinates is incompatible with the possibility of a pure boost.
Heisenberg uncertainty principle and economic analogues of basic physical quantities
Vladimir Soloviev; Vladimir Saptsin
2011-11-10
From the positions attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates and economic mass are offered, based on the analysis of time series, and the concept of an economic Planck's constant is proposed. The theory has been tested on real economic time series, including stock indices, Forex and spot prices; the achieved results are open for discussion.
The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment
ERIC Educational Resources Information Center
Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea
2010-01-01
An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…
Raymond C. Parks; David P. Duggan
2011-01-01
This paper proposes some principles of cyber-warfare. The principles of warfare are well documented, but are not always applicable to cyber-warfare. Differences between cyberspace and the real world suggest some additional principles. This is not intended to be a comprehensive listing of such principles but suggestions leading toward discussion and dialogue. The current candidate list of principles of cyber-warfare
Living With Radical Uncertainty. The Exemplary case of Folding Protein
Ignazio Licata
2010-04-21
Laplace's demon still makes strong impact on contemporary science, in spite of the fact that Logical Mathematics outcomes, Quantum Physics advent and more recently Complexity Science have pointed out the crucial role of uncertainty in the World's descriptions. We focus here on the typical problem of folding protein as an example of uncertainty, radical emergence and a guide to the "simple" principles for studying complex systems.
Uncertainty vs. Interindividual variability
Bogen, K.T.
1993-04-01
Distinct treatment of uncertainty and interindividual variability in variates used to model risk ensures that quantitative assessments of these attributes in modeled risk are maximally relevant to potential regulatory concerns. For example, such a distinction is required for quantitative characterization of uncertainty in population risk or in individual risk. Yet, most quantitative uncertainty analyses undertaken as part of environmental health risk assessments have failed to systematically maintain this distinction among modeled distributed input variates, and so have had limited relevance to reasonable concerns that regulators may have about how uncertainty and variability ought to relate to risk acceptability. The distinction is of course impossible if quantitative treatment of distributed input variates is rejected in favor of using single-point estimates due to the perceived impracticality of complex Monte Carlo analyses that might erroneously be thought of as being necessarily involved. Here, some practical methods are presented that facilitate implementation of the analytic framework for uncertainty and variability proposed by Bogen and Spear. Two types of methodology are discussed: one that facilitates the distinction between uncertainty and variability per se, and another that may be used to simplify quantitative analysis of distributed inputs representing either uncertainty or variability. A simple and a complex form for modeled increased risk are presented and then used to illustrate methods facilitating the distinction between uncertainty and variability in reference to characterization of both population and individual risk. Finally, a simple form of discrete probability calculus is proposed as an easily implemented, practical alternative to Monte-Carlo based procedures to quantitative integration of uncertainty and variability in risk assessment.
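The uncertainty/variability distinction the abstract insists on is commonly implemented as a nested ("two-dimensional") Monte Carlo; a minimal sketch, with made-up lognormal distributions for the uncertain potency parameter and the variable individual exposure:

```python
import random

def two_dimensional_mc(n_unc=200, n_var=500, seed=1):
    """Nested ("2-D") Monte Carlo keeping uncertainty and interindividual
    variability distinct: the outer loop samples an uncertain potency
    parameter; the inner loop samples individual exposures (variability)
    and computes each individual's risk.  Returns the sorted uncertainty
    distribution of the population-average risk.  All distributions here
    are illustrative assumptions."""
    rng = random.Random(seed)
    pop_risks = []
    for _ in range(n_unc):                            # uncertainty loop
        potency = rng.lognormvariate(0.0, 0.5)        # uncertain parameter
        mean_risk = 0.0
        for _ in range(n_var):                        # variability loop
            exposure = rng.lognormvariate(-1.0, 0.3)  # person-to-person
            mean_risk += min(1.0, potency * exposure)
        pop_risks.append(mean_risk / n_var)
    return sorted(pop_risks)

dist = two_dimensional_mc()
median, p95 = dist[len(dist) // 2], dist[int(0.95 * len(dist))]
print(median, p95)
```

Collapsing the two loops into one would reproduce exactly the conflation of uncertainty and variability that the paper criticizes.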
Complementarity and Uncertainty in Mach-Zehnder Interferometry and beyond
Paul Busch; Christopher R. Shilladay
2006-09-06
A coherent account of the connections and contrasts between the principles of complementarity and uncertainty is developed starting from a survey of the various formalizations of these principles. The conceptual analysis is illustrated by means of a set of experimental schemes based on Mach-Zehnder interferometry. In particular, path detection via entanglement with a probe system and (quantitative) quantum erasure are exhibited to constitute instances of joint unsharp measurements of complementary pairs of physical quantities, path and interference observables. The analysis uses the representation of observables as positive-operator-valued measures (POVMs). The reconciliation of complementary experimental options in the sense of simultaneous unsharp preparations and measurements is expressed in terms of uncertainty relations of different kinds. The feature of complementarity, manifest in the present examples in the mutual exclusivity of path detection and interference observation, is recovered as a limit case from the appropriate uncertainty relation. It is noted that the complementarity and uncertainty principles are neither completely logically independent nor logical consequences of one another. Since entanglement is an instance of the uncertainty of quantum properties (of compound systems), it is moot to play out uncertainty and entanglement against each other as possible mechanisms enforcing complementarity.
NASA Astrophysics Data System (ADS)
Meng, Xiaofeng; Chen, Jidong
One of the key research issues with moving objects databases is the uncertainty management. The uncertainty management for moving objects has been well studied recently, with many models and algorithms proposed. In this chapter, we analyze the uncertainty of moving objects in spatial networks and introduce an uncertain trajectory model and an index framework, the uncertain trajectory based Rtree (UTR-tree), for indexing the fully uncertain trajectories of network-constrained moving objects. Then, we introduce how to process queries on this framework. The content of this chapter is mainly from the work of Ding in [14].
Spectral Methods for Uncertainty Quantification
Emil Brandt Kærgaard (Kongens Lyngby, 2013, IMM). This thesis has investigated the field of Uncertainty Quantification with regard to differential equations: the mathematical background for working with Uncertainty Quantification, the theoretical background for generalized...
Communicating scientific uncertainty
Fischhoff, Baruch; Davis, Alex L.
2014-01-01
All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390
Evaluating prediction uncertainty
McKay, M.D. [Los Alamos National Lab., NM (United States)
1995-03-01
The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
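The variance-ratio importance indicator described here can be illustrated with a simple binning estimator of Var(E[Y|X])/Var(Y); this is a plain Monte Carlo stand-in for the paper's replicated Latin hypercube procedure, with a made-up linear test model:

```python
import random

def variance_ratio(xs, ys, bins=10):
    """Estimate eta^2 = Var(E[Y|X]) / Var(Y) by binning X: near 1 means X
    drives most of the prediction uncertainty; near 0 means it matters
    little.  Simple binning stand-in for a replicated-LHS estimator; any
    remainder points after even division into bins are ignored."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])
    size = n // bins
    mean_y = sum(ys) / n
    var_y = sum((y - mean_y) ** 2 for y in ys) / n
    between = 0.0
    for b in range(bins):
        idx = order[b * size:(b + 1) * size]
        m = sum(ys[i] for i in idx) / len(idx)
        between += len(idx) * (m - mean_y) ** 2
    return (between / n) / var_y

rng = random.Random(0)
x1 = [rng.uniform(-1, 1) for _ in range(5000)]
x2 = [rng.uniform(-1, 1) for _ in range(5000)]
y = [3.0 * a + 0.3 * b for a, b in zip(x1, x2)]   # x1 dominates by design
print(variance_ratio(x1, y), variance_ratio(x2, y))
```

As the abstract notes, this kind of indicator does not require the model to be linear; linearity is used here only to make the expected ranking obvious.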
Analysis of Infiltration Uncertainty
MCCURLEY,RONALD D.; HO,CLIFFORD K.; WILSON,MICHAEL L.; HEVESI,JOSEPH A.
2000-10-30
In a total-system performance assessment (TSPA), uncertainty in the performance measure (e.g., radiation dose) is estimated by first estimating the uncertainty in the input variables and then propagating that uncertainty through the model system by means of Monte Carlo simulation. This paper discusses uncertainty in surface infiltration, which is one of the input variables needed for performance assessments of the Yucca Mountain site. Infiltration has been represented in recent TSPA simulations by using three discrete infiltration maps (i.e., spatial distributions of infiltration) for each climate state in the calculation of unsaturated-zone flow and transport. A detailed uncertainty analysis of infiltration was carried out for two purposes: to better quantify the possible range of infiltration, and to determine what probability weights should be assigned to the three infiltration cases in a TSPA simulation. The remainder of this paper presents the approach and methodology for the uncertainty analysis, along with a discussion of the results.
A Generalized Uncertainty Relation
NASA Astrophysics Data System (ADS)
Chen, Zhengli; Liang, Lili; Li, Haojing; Wang, Wenhua
2015-01-01
By using a generalization of the Wigner-Yanase-Dyson skew information, a quantity |U_ρ^α|(A) is introduced in this paper for every Hilbert-Schmidt operator A on a Hilbert space H, and a related uncertainty relation is established. The obtained inequality generalizes a known uncertainty relation. Moreover, a negative answer to a conjecture posed in Dou and Du (Int. J. Theor. Phys. 53, 952-958, 2014) is given by a counterexample.
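The Wigner-Yanase-Dyson skew information underlying the quantity in this abstract is, in standard notation,

```latex
% Wigner-Yanase-Dyson skew information of observable A in state rho:
I_\rho^{\alpha}(A)
  = -\tfrac{1}{2}\,\mathrm{Tr}\!\left( [\rho^{\alpha},A]\,[\rho^{1-\alpha},A] \right),
\qquad 0 < \alpha < 1,
```

with alpha = 1/2 recovering the original Wigner-Yanase skew information, -(1/2) Tr([sqrt(rho), A]^2).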
Uncertainty prediction for PUB
NASA Astrophysics Data System (ADS)
Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.
2003-04-01
IAHS' initiative of Prediction in Ungauged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of uncertainty prediction which could be linked with new blueprints for PUB, thereby showing how equifinality-based models should be grasped using practical strategies of gauging like the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of the Potiribu Project, which is an NCE layout at representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables with changing likelihood surfaces of experiments using the Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) processes. In this way, STS addresses uncertainty bounds of model parameters in an upscaling process at the hillslope. On the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidence from the Potiribu NCE layout shows novel pathways of uncertainty prediction under a PUB perspective in representative basins of world biomes.
Conundrums with uncertainty factors.
Cooke, Roger
2010-03-01
The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767
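The probabilistic reinterpretation of uncertainty factors that the abstract discusses can be sketched as a Monte Carlo over lognormal factors; the two-factor reference-value formula and the distributions below are illustrative assumptions, not IRIS methodology:

```python
import math
import random

def reference_value_distribution(noael=100.0, n=20000, seed=7):
    """Monte Carlo reinterpretation of a reference value of the form
        RfD = NOAEL / (UF_animal * UF_human),
    with each uncertainty factor treated as an independent lognormal
    random variable of median 10 instead of a fixed 10x.  The spreads
    (0.4 on the log scale) are invented for illustration."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        uf_animal = rng.lognormvariate(math.log(10), 0.4)  # interspecies
        uf_human = rng.lognormvariate(math.log(10), 0.4)   # intraspecies
        samples.append(noael / (uf_animal * uf_human))
    samples.sort()
    return samples

dist = reference_value_distribution()
median = dist[len(dist) // 2]
print(median)  # near the deterministic NOAEL/100 = 1.0
```

The independence assumption baked into the two `lognormvariate` draws is exactly the kind of hidden assumption the abstract argues makes such analyses ill-conditioned.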
Collision Avoidance Under Uncertainty
NASA Astrophysics Data System (ADS)
Mobasseri, Bijan G.
1989-03-01
Reasoning with uncertainty has increasingly become an important issue in robot path planning. The injection of doubt into the knowledge of obstacle location calls for capabilities not available in the conventional path planners. This work redefines the concept of optimality within the context of an uncertainty grid and proposes a class of cost functions capable of incorporating such human-like attitudes of conservative and aggressive behavior.
Measurement Uncertainty Estimation in Amperometric Sensors: A Tutorial Review
Helm, Irja; Jalukse, Lauri; Leito, Ivo
2010-01-01
This tutorial focuses on measurement uncertainty estimation in amperometric sensors (both for liquid and gas-phase measurements). The main uncertainty sources are reviewed and their contributions are discussed with relation to the principles of operation of the sensors, measurement conditions and properties of the measured samples. The discussion is illustrated by case studies based on the two major approaches for uncertainty evaluation: the ISO GUM modeling approach and the Nordtest approach. This tutorial is expected to be of interest to workers in different fields of science who use measurements with amperometric sensors and need to evaluate the uncertainty of the obtained results but are new to the concept of measurement uncertainty. The tutorial is also expected to be educational, helping to make measurement results more accurate. PMID:22399887
Dasymetric Modeling and Uncertainty
Nagle, Nicholas N.; Buttenfield, Barbara P.; Leyk, Stefan; Speilman, Seth
2014-01-01
Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not yet been fully addressed. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology, the Penalized Maximum Entropy Dasymetric Model (P-MEDM), is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that includes household-level survey data combined with higher spatial resolution data such as from census tracts, block groups, and land cover classifications. PMID:25067846
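For contrast with the P-MEDM, the simplest form of dasymetric reallocation (spreading a zone's population total over its cells in proportion to an ancillary weight, while preserving the zone total) looks like this; a deliberately minimal sketch, not the P-MEDM estimator:

```python
def dasymetric_downscale(zone_pop, cell_weights):
    """Minimal dasymetric reallocation: distribute a zone's population over
    its cells in proportion to an ancillary weight (e.g. a developed-land
    score per cell), preserving the zone total.  With no ancillary signal
    the allocation falls back to uniform."""
    total_w = sum(cell_weights)
    if total_w == 0:
        return [zone_pop / len(cell_weights)] * len(cell_weights)
    return [zone_pop * w / total_w for w in cell_weights]

# Four cells; the first has no developed land, so it receives no people.
est = dasymetric_downscale(1000, [0, 5, 3, 2])
print(est)        # [0.0, 500.0, 300.0, 200.0]
print(sum(est))   # 1000.0: the zone total is preserved
```

What the P-MEDM adds over this baseline is a principled (penalized maximum entropy) way to combine many such weights, each with its own resolution and uncertainty, and to carry that uncertainty into the estimates.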
Picture independent quantum action principle
Mantke, W.J.
1992-01-01
The Schwinger action principle for quantum mechanics is extended into a picture independent form. This displays the quantum connection. Time variations are formulated as variations of a time variable and included into the kinematical variations. Kets and bras represent experimental operations. Experimental operations at different times cannot be identified. The ket and the bra spaces are fiber bundles over time. The same applies to the classical configuration space. For the classical action principle the action can be varied by changing the path or the classical variables. The latter variation of classical functions corresponds to kinematical variations of quantum variables. The picture independent formulation represents time evolution by a connection. A standard experiment is represented by a ket, a connection and a bra. For particular start and end times of experiments, the action and the contraction into a transition amplitude are elements of a new tensor space of quantum correspondents of path functionals. The classical correspondent of the transition amplitude is the probability for a specified state to evolve along a particular path segment. The elements of the dual tensor space represent standard experiments or superpositions thereof. The kinematical variations of the quantum variables are commuting numbers. Variations that include the effect of Poincare or gauge transformations have different commutator properties. The Schwinger action principle is derived from the Feynman path integral formulation. The limitations from the time-energy uncertainty relation might be accommodated by superposing experiments that differ in their start- and end-times. In its picture independent form the action principle can be applied to all superpositions of standard experiments. This may involve superpositions of different connections. The extension of the superposition principle to connections allows representation of a quantum field by a part of the connection.
Classification images with uncertainty
Tjan, Bosco S.; Nandy, Anirvan S.
2009-01-01
Classification image and other similar noise-driven linear methods have found increasingly wider applications in revealing psychophysical receptive field structures or perceptual templates. These techniques are relatively easy to deploy, and the results are simple to interpret. However, being a linear technique, the utility of the classification-image method is believed to be limited. Uncertainty about the target stimuli on the part of an observer will result in a classification image that is the superposition of all possible templates for all the possible signals. In the context of a well-established uncertainty model, which pools the outputs of a large set of linear frontends with a max operator, we show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not be combined, as is conventionally done. We show that when the signal contrast is high, the subimages from the error trials contain a clear high-contrast image that is negatively correlated with the perceptual template associated with the presented signal, relatively unaffected by uncertainty. The subimages also contain a “haze” that is of a much lower contrast and is positively correlated with the superposition of all the templates associated with the erroneous response. In the case of spatial uncertainty, we show that the spatial extent of the uncertainty can be estimated from the classification subimages. We link intrinsic uncertainty to invariance and suggest that this signal-clamped classification-image method will find general applications in uncovering the underlying representations of high-level neural and psychophysical mechanisms. PMID:16889477
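The core classification-image computation (mean noise on one response class minus the other) can be sketched with a toy linear observer; the 1-D "template" and the observer model below are invented for illustration:

```python
import random

def classification_image(n_trials=4000, size=8, seed=3):
    """Toy classification-image experiment: on each trial a noisy 1-D
    'stimulus' (pure noise here) is judged by a linear observer with an
    internal template; the classification image is
        mean(noise | 'yes') - mean(noise | 'no')
    and should recover the template up to scale.  Hypothetical observer,
    no intrinsic uncertainty."""
    rng = random.Random(seed)
    template = [1.0 if i < size // 2 else -1.0 for i in range(size)]
    yes_sum, no_sum = [0.0] * size, [0.0] * size
    n_yes = n_no = 0
    for _ in range(n_trials):
        noise = [rng.gauss(0.0, 1.0) for _ in range(size)]
        if sum(t * x for t, x in zip(template, noise)) > 0:   # 'yes'
            n_yes += 1
            yes_sum = [a + x for a, x in zip(yes_sum, noise)]
        else:                                                 # 'no'
            n_no += 1
            no_sum = [a + x for a, x in zip(no_sum, noise)]
    return [y / n_yes - z / n_no for y, z in zip(yes_sum, no_sum)]

ci = classification_image()
print(ci)  # positive in the first half, negative in the second, like the template
```

Under the uncertainty model of the abstract, the observer would instead max-pool over many templates, and the recovered image would be a superposition of all of them; the high-contrast-signal manipulation is what restores a clean single template.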
Uncertainty Analysis in Space Radiation Protection
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.
2011-01-01
Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits, therefore the uncertainties in risk projections become a major safety concern and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age and gender specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because of the limited role expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.
The Equivalence Principle in a Quantum World
N. E. J. Bjerrum-Bohr; John F. Donoghue; Basem Kamal El-Menoufi; Barry R. Holstein; Ludovic Planté; Pierre Vanhove
2015-05-19
We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the Equivalence Principle, for instance through introduction of non-locality from quantum physics, embodied in the Uncertainty Principle. When the energy is small we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry - general coordinate invariance - that is used to organize the effective field theory.
Network planning under uncertainties
NASA Astrophysics Data System (ADS)
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements varying over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact with up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework for more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem.
This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we also propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainties. Robust optimization was first introduced in the operations research literature as a framework that incorporates information about the uncertainty sets for the parameters in the optimization model. Even though robust optimization originated in tackling uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving examples of how the robust optimization framework can be applied to current common network planning problems under uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to solve other network planning problems.
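The robust counterpart construction at the heart of this approach can be sketched in a few lines. The example below is illustrative only (the paper does not provide this code, and the numbers are hypothetical): for a linear constraint a·x ≤ b whose coefficients are only known to lie in a box [ā_i − δ_i, ā_i + δ_i], the worst case is attained coordinate-wise, so robust feasibility reduces to the deterministic test Σ_i (ā_i x_i + δ_i |x_i|) ≤ b.

```python
def robust_feasible(x, a_nom, a_dev, b):
    """Check a linear constraint a.x <= b for every 'a' in a box
    uncertainty set a_i in [a_nom_i - a_dev_i, a_nom_i + a_dev_i].
    The worst case is attained coordinate-wise, so the robust
    counterpart is sum(a_nom_i*x_i + a_dev_i*|x_i|) <= b."""
    worst = sum(an * xi + ad * abs(xi)
                for an, ad, xi in zip(a_nom, a_dev, x))
    return worst <= b

# Hypothetical link capacities consumed per unit demand, with uncertainty.
a_nom = [1.0, 2.0]   # nominal coefficients
a_dev = [0.2, 0.4]   # coefficient deviations (20% here)
b = 10.0             # available capacity

print(robust_feasible([2.0, 3.0], a_nom, a_dev, b))   # worst case 9.6  <= 10
print(robust_feasible([2.0, 3.5], a_nom, a_dev, b))   # worst case 10.8 >  10
```

A plan accepted by this test stays feasible for every realization of the coefficients in the box, which is exactly the insensitivity to uncertain conditions that the planning framework asks for.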
NASA Astrophysics Data System (ADS)
Jones, P. W.; Strelitz, R. A.
2012-12-01
The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity, and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization, one that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with the more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation into a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by this density field. The problem thus devolves into a minimization. Computation of such a spatial decomposition is O(N*N), and it can be computed iteratively, making it possible to update easily over time as well as faster. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartograms familiar to the information visualization community in the depiction of, for example, voting results per state.
Furthermore, one can dispense with the mesh or edges entirely, replacing them with symbols or glyphs at the generating points (effectively the centers of the polygons). This methodology readily admits of rigorous statistical analysis using standard components found in R, and it is thus entirely compatible with the visualization packages we use (VisIt and/or ParaView), the language we use (Python), and the UVCDAT environment that provides the programmer and analyst workbench. We will demonstrate the power and effectiveness of this methodology in climate studies. We will further argue that our method of defining (or predicting) values in a region has many advantages over the traditional visualization notion of value at a point.
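The content-equalizing idea can be illustrated in one dimension (this is our own simplified analogue, not the paper's 2-D/3-D weighted Voronoi method, and the sample density is hypothetical): split a sampled uncertainty-density field into cells that each hold the same share of the total uncertainty mass, so that narrow cells mark concentrations of uncertainty.

```python
def equal_content_breaks(density, xs, n_cells):
    """1-D analogue of the content-equalizing tessellation: split
    [xs[0], xs[-1]] into n_cells intervals that each hold an equal share
    of the total uncertainty 'mass' (trapezoidal integration of the
    density samples).  Assumes the density is positive everywhere."""
    # Cumulative mass at each sample point (trapezoid rule).
    cum = [0.0]
    for i in range(1, len(xs)):
        cum.append(cum[-1] + 0.5 * (density[i] + density[i - 1]) * (xs[i] - xs[i - 1]))
    total = cum[-1]

    breaks = []
    target_idx = 1
    for i in range(1, len(xs)):
        # Emit a break wherever the cumulative mass crosses k/n of the total.
        while target_idx < n_cells and cum[i] >= total * target_idx / n_cells:
            t = (total * target_idx / n_cells - cum[i - 1]) / (cum[i] - cum[i - 1])
            breaks.append(xs[i - 1] + t * (xs[i] - xs[i - 1]))
            target_idx += 1
    return breaks

# Uniform density -> equal-width cells.
xs = [i / 100 for i in range(101)]
print(equal_content_breaks([1.0] * 101, xs, 4))
```

With a uniform density the breaks fall at 0.25, 0.5, and 0.75; with a peaked density they crowd around the peak, which is the 1-D counterpart of small Voronoi cells signalling concentrated uncertainty.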
Robust Capacity Planning Under Uncertainty
Dimitris Paraskevopoulos; Elias Karakitsos; Berc Rustem
1991-01-01
The existence of uncertainty influences the investment, production, and pricing decisions of firms. Therefore, capacity expansion models need to take uncertainty into account. This uncertainty may arise because of errors in model specification, in the statistical estimation of relationships, and in the assumptions about exogenous variables. One such example is demand uncertainty. In this paper, a cautious capacity planning approach is described.
Asymmetric Uncertainty Expression for High Gradient Aerodynamics
NASA Technical Reports Server (NTRS)
Pinier, Jeremy T
2012-01-01
When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression, containing an extra term to account for a phase uncertainty whose magnitude is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
The propagation of uncertainty for humidity calculations
NASA Astrophysics Data System (ADS)
Lovell-Smith, J.
2009-12-01
This paper addresses the international humidity community's need for standardization of methods for propagation of uncertainty associated with humidity generators and for handling uncertainty associated with the reference water vapour-pressure and enhancement-factor equations. The paper outlines uncertainty calculations for the mixing ratio, dew-point temperature and relative humidity output from humidity generators, and in particular considers controlling equations for a theoretical hybrid humidity generator combining single-pressure (1-P), two-pressure (2-P) and two-flow (2-F) principles. Also considered is the case where the humidity generator is used as a stable source with traceability derived from a reference hygrometer, i.e. a dew-point meter, a relative humidity meter or a wet-bulb psychrometer. Most humidity generators in use at national metrology institutes can be considered to be special cases of those considered here and sensitivity coefficients for particular types may be extracted. The ability to account for correlations between input variables and between different instances of the evaluation of the reference equations is discussed. The uncertainty calculation examples presented here are representative of most humidity calculations.
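Uncertainty budgets of this kind rest on the standard (uncorrelated) law of propagation of uncertainty from the GUM, u_c²(y) = Σ_i (∂f/∂x_i)² u²(x_i). The sketch below is not from the paper; the measurement function and the input values and uncertainties are hypothetical, with sensitivity coefficients estimated by central differences.

```python
def propagate_uncertainty(f, x, u):
    """Uncorrelated law of propagation of uncertainty:
    u_c^2 = sum_i (df/dx_i)^2 * u_i^2, with the sensitivity coefficients
    df/dx_i estimated by central finite differences."""
    uc2 = 0.0
    for i, (xi, ui) in enumerate(zip(x, u)):
        h = 1e-6 * max(1.0, abs(xi))   # step scaled to the input magnitude
        xp, xm = list(x), list(x)
        xp[i] = xi + h
        xm[i] = xi - h
        ci = (f(xp) - f(xm)) / (2.0 * h)   # sensitivity coefficient
        uc2 += (ci * ui) ** 2
    return uc2 ** 0.5

# Hypothetical example: relative humidity as a ratio of a partial vapour
# pressure to a saturation vapour pressure, each with a standard
# uncertainty in Pa.
rh = lambda v: v[0] / v[1]
u_rh = propagate_uncertainty(rh, [1500.0, 2300.0], [5.0, 8.0])
print(round(u_rh, 6))
```

Correlations between inputs, which the paper emphasizes for shared reference equations, would add cross terms 2 (∂f/∂x_i)(∂f/∂x_j) u(x_i, x_j) to the sum; the sketch handles only the uncorrelated case.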
Separability conditions from the Landau-Pollak uncertainty relation
Vicente, Julio I. de; Sanchez-Ruiz, Jorge
2005-05-15
We obtain a collection of necessary (sufficient) conditions for a bipartite system of qubits to be separable (entangled), which are based on the Landau-Pollak formulation of the uncertainty principle. These conditions are tested and compared with previously stated criteria by applying them to states whose separability limits are already known. Our results are also extended to multipartite and higher-dimensional systems.
Multipartite minimum uncertainty products
E. Shchukin
2012-06-18
In our previous work we found a lower bound for the multipartite uncertainty product of the position and momentum observables over all separable states. In this work we try to minimize this uncertainty product over a broader class of states to find the fundamental limits imposed by nature on the observable quantities. We show that it is necessary to consider pure states only, and we find the infimum of the uncertainty product over a special class of pure states (states with spherically symmetric wave functions). It is shown that this infimum is not attained. We also explicitly construct a parametrized family of states that approaches the infimum as the parameter is varied. Since the constructed states beat the lower bound for separable states, they are entangled. We thus show that there is a gap separating the values of a simple measurable quantity for separable states from those for entangled ones, and we also try to find the size of this gap.
Measurement uncertainty relations
Busch, Paul, E-mail: paul.busch@york.ac.uk [Department of Mathematics, University of York, York (United Kingdom)]; Lahti, Pekka, E-mail: pekka.lahti@utu.fi [Turku Centre for Quantum Physics, Department of Physics and Astronomy, University of Turku, FI-20014 Turku (Finland)]; Werner, Reinhard F., E-mail: reinhard.werner@itp.uni-hannover.de [Institut für Theoretische Physik, Leibniz Universität, Hannover (Germany)]
2014-04-15
Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
Uncertainty and calibration analysis
Coutts, D.A.
1991-03-01
All measurements contain some deviation from the true value which is being measured. In the common vernacular this deviation between the true value and the measured value is called an inaccuracy, an error, or a mistake. Since all measurements contain errors, it is necessary to accept that there is a limit to how accurate a measurement can be. The uncertainty interval, combined with the confidence level, is one measure of the accuracy of a measurement or value. Without a statement of uncertainty (or a similar parameter) it is not possible to evaluate whether the accuracy of the measurement, or data, is appropriate. The preparation of technical reports, calibration evaluations, and design calculations should consider the accuracy of the measurements and data being used. There are many methods to accomplish this. This report provides a consistent method for the handling of measurement tolerances, calibration evaluations, and uncertainty calculations. The SRS Quality Assurance (QA) Program requires that the uncertainty of technical data and instrument calibrations be acknowledged and estimated. The QA Program makes some specific technical requirements related to the subject but does not provide a philosophy or method on how uncertainty should be estimated. This report was prepared to provide a technical basis to support the calculation of uncertainties and the calibration of measurement and test equipment for any activity within the Experimental Thermal-Hydraulics (ETH) Group. The methods proposed in this report provide a graded approach for estimating the uncertainty of measurements, data, and calibrations. The method is based on the national consensus standard, ANSI/ASME PTC 19.1.
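For concreteness, the ANSI/ASME PTC 19.1 methodology of that era combines a systematic (bias) limit B with a precision index S into a total uncertainty in one of two conventional forms, additive or root-sum-square. This is a minimal sketch, not code from the report, and the coverage factor and inputs are assumed values:

```python
import math

def total_uncertainty(bias, precision_index, t95=2.0, method="RSS"):
    """Combine a systematic (bias) limit B with a precision index S into a
    total uncertainty, in the two conventional PTC 19.1 forms:
      additive:        U = B + t*S
      root-sum-square: U = sqrt(B^2 + (t*S)^2)
    t95 is the Student-t coverage factor (assumed ~2 for large samples)."""
    if method == "ADD":
        return bias + t95 * precision_index
    return math.sqrt(bias ** 2 + (t95 * precision_index) ** 2)

# Hypothetical instrument: bias limit 0.5 units, precision index 0.2 units.
print(total_uncertainty(0.5, 0.2))                 # RSS form
print(total_uncertainty(0.5, 0.2, method="ADD"))   # additive form
```

The additive form gives wider (more conservative) bounds; the RSS form is the one most later revisions of the standard converge on.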
Serenity in political uncertainty.
Doumit, Rita; Afifi, Rema A; Devon, Holli A
2015-01-01
College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being were resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930
NASA Astrophysics Data System (ADS)
Silverman, Mark P.
2014-07-01
1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.
Treatment of Data Uncertainties
Larson, N.M. [Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6171 (United States)
2005-05-24
The generation and use of data covariance matrices are discussed within the context of the analysis of neutron-induced cross-section data via the R-matrix code SAMMY. Two complementary approaches are described, the first involving mathematical manipulation of Bayes' equations and the second utilizing computer simulations. A new procedure for propagating uncertainties on unvaried parameters will allow the effect of all relevant experimental uncertainties to be reflected in the analysis results, without placing excessive additional burden on the analyst. Implementation of this procedure within SAMMY is described and illustrated through the simulations.
NASA Technical Reports Server (NTRS)
Brown, Laurie M.
1993-01-01
An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.
Equivalence of wave-particle duality to entropic uncertainty.
Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie
2014-01-01
Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter. PMID:25524138
An uncertainty inventory demonstration - a primary step in uncertainty quantification
Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J
2009-01-01
Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.
Zhang, Jun; Zhang, Yang; Yu, Chang-Shui
2015-01-01
The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail. PMID:26118488
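For context, the two-measurement relation that such multi-measurement bounds tighten is the entropic uncertainty relation with quantum memory proved by Berta et al. (Ref. 1 of this collection's lead abstract); in standard notation, for measurements X and Z on system A with memory B, and c the maximal overlap of the two measurement bases:

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2\frac{1}{c} + S(A|B),
\qquad c = \max_{j,k}\,\bigl|\langle x_j | z_k \rangle\bigr|^2
```

When A and B are entangled, S(A|B) can be negative, weakening the bound; this is the memory effect that conditional uncertainty frameworks must account for.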
Position-momentum uncertainty relations based on moments of arbitrary order
Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.
2011-05-15
The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
Optimization Under Generalized Uncertainty
Lodwick, Weldon
Lecture notes for Optimization Modeling (Math 4794/5794, Spring 2013), treating generalized uncertainty in the context of optimization problems. The theoretical framework for these notes is interval analysis.
Oblow, E.M.
1985-05-13
A hybrid uncertainty theory for artificial intelligence problems combining the strengths of fuzzy-set theory and Dempster/Shafer theory is presented. The basic operations for combining uncertain information are given with an indication of their applicability in expert systems and robot planning problems.
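Of the two ingredients named above, the Dempster-Shafer half has a standard combination rule that is easy to sketch (the paper's hybrid with fuzzy-set theory is not reproduced here, and the mass assignments below are hypothetical): two bodies of evidence are combined by intersecting focal elements and renormalizing away the conflicting mass.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for basic mass assignments whose
    focal elements are frozensets.  Mass landing on the empty
    intersection is 'conflict' and is renormalized away."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict          # normalization constant
    return {s: v / k for s, v in combined.items()}

# Hypothetical evidence from two sources over hypotheses {A, B}.
m1 = {frozenset({'A'}): 0.6, frozenset({'A', 'B'}): 0.4}
m2 = {frozenset({'B'}): 0.5, frozenset({'A', 'B'}): 0.5}
print(dempster_combine(m1, m2))   # masses 0.3/0.7, 0.2/0.7, 0.2/0.7
```

The fuzzy-set side of the hybrid would replace the crisp focal elements with membership functions; the combination machinery above is the Dempster-Shafer component only.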
Bereby-Meyer, Yoella
2012-02-01
Guala points to a discrepancy between the strong negative reciprocity observed in the lab and the way cooperation is sustained "in the wild." This commentary suggests that in lab experiments, strong negative reciprocity is limited when uncertainty exists regarding the players' actions and intentions. Thus, costly punishment is indeed a limited mechanism for sustaining cooperation in an uncertain environment. PMID:22289307
Identity Uncertainty Stuart Russell
Russell, Stuart
Identity Uncertainty. Stuart Russell, Computer Science Division, University of California, Berkeley, CA 94720, USA. russell@cs.berkeley.edu. Abstract: We are often uncertain about the identity of objects … a probabilistic approach to reasoning about identity under uncertainty in the framework of first- …
Uncertainties in successive measurements
NASA Astrophysics Data System (ADS)
Distler, Jacques; Paban, Sonia
2013-06-01
When you measure an observable, A, in quantum mechanics, the state of the system changes. This, in turn, affects the quantum-mechanical uncertainty in some noncommuting observable, B. The standard uncertainty relation puts a lower bound on the uncertainty of B in the initial state. What is relevant for a subsequent measurement of B, however, is the uncertainty of B in the postmeasurement state. We re-examine this problem, both in the case where A has a pure point spectrum and in the case where A has a continuous spectrum. In the latter case, the need to include a finite detector resolution, as part of what it means to measure such an observable, has dramatic implications for the result of successive measurements. Ozawa [Phys. Rev. A 67, 042105 (2003)] proposed an inequality satisfied in the case of successive measurements. Among our results, we show that his inequality is ineffective (can never come close to being saturated). For the cases of interest, we compute a sharper lower bound.
Uncertainties in repository modeling
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given uncertainties in dating, calibration, and modeling.
Principles of project management
NASA Technical Reports Server (NTRS)
1982-01-01
The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.
Chemical Principles Exemplified
ERIC Educational Resources Information Center
Plumb, Robert C.
1973-01-01
Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)
Deterministic Elaboration of Heisenberg's Uncertainty Relation and the Nowhere Differentiability
NASA Astrophysics Data System (ADS)
Adda, Fayçal Ben; Porchon, Hélène
2013-10-01
In this paper the uncertainty principle is found via characteristics of continuous and nowhere differentiable functions. We prove that any physical system that has a continuous and nowhere differentiable position function is subject to an uncertainty in the simultaneous determination of values of its physical properties. The uncertainty in the simultaneous knowledge of the position deviation and the average rate of change of this deviation is found to be governed by a relation equivalent to the one discovered by Heisenberg in 1925. Conversely, we prove that any physical system with a continuous position function that is subject to an uncertainty relation must have a nowhere differentiable position function, which makes the set of continuous and nowhere differentiable functions a candidate for the quantum world.
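For reference, the relation said to be recovered is, in its standard (Kennard) form, with σ denoting standard deviations of position and momentum:

```latex
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}
```

The paper's claim is that an inequality equivalent to this one governs the position deviation and its average rate of change whenever the position function is continuous and nowhere differentiable.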
The precautionary principle and ecological hazards of genetically modified organisms.
Giampietro, Mario
2002-09-01
This paper makes three points relevant to the application of the precautionary principle to the regulation of GMOs. i) The unavoidable arbitrariness in the application of the precautionary principle reflects a deeper epistemological problem affecting scientific analyses of sustainability. This requires understanding the difference between the concepts of "risk", "uncertainty" and "ignorance". ii) When dealing with evolutionary processes it is impossible to ban uncertainty and ignorance from scientific models. Hence, traditional risk analysis (probability distributions and exact numerical models) becomes powerless. Other forms of scientific knowledge (general principles or metaphors) may be useful alternatives. iii) The existence of ecological hazards per se should not be used as a reason to stop innovations altogether. However, the precautionary principle entails that scientists move away from the concept of "substantive rationality" (trying to indicate to society optimal solutions) to that of "procedural rationality" (trying to help society to find "satisficing" solutions). PMID:12436844
ERIC Educational Resources Information Center
Beim, George
This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…
Essays on uncertainty in economics
Simsek, Alp
2010-01-01
This thesis consists of four essays about "uncertainty" and how markets deal with it. Uncertainty is about subjective beliefs, and thus it often comes with heterogeneous beliefs that may be present temporarily or even ...
Predicting System Performance with Uncertainty
Yan, B.; Malkawi, A.
2012-01-01
The main purpose of this research is to include uncertainty that lies in modeling process and that arises from input values when predicting system performance, and to incorporate uncertainty related to system controls in a computationally...
Uncertainties in climate stabilization
Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.
2009-11-01
We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2 °C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.
Nonequivalence of equivalence principles
Eolo Di Casola; Stefano Liberati; Sebastiano Sonego
2015-03-18
Equivalence principles played a central role in the development of general relativity. Furthermore, they have provided operative procedures for testing the validity of general relativity, or constraining competing theories of gravitation. This has led to a flourishing of different, and inequivalent, formulations of these principles, with the undesired consequence that often the same name, "equivalence principle", is associated with statements having a quite different physical meaning. In this paper we provide a precise formulation of the several incarnations of the equivalence principle, clarifying their uses and reciprocal relations. We also discuss their possible role as selecting principles in the design and classification of viable theories of gravitation.
Calibration Under Uncertainty.
Swiler, Laura Painton; Trucano, Timothy Guy
2005-03-01
This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
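The classical formulation described above, minimizing the squared difference between predicted and experimental data, can be sketched as follows (the linear model, synthetic data, and noise level are illustrative assumptions, not taken from the report):

```python
import numpy as np

# Classical calibration sketch: find parameters of a linear "computer model"
# y = a + b*t by minimizing the squared model-data difference. The model
# form, true parameters, and noise level are illustrative assumptions.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 25)
y_obs = 2.0 + 0.7 * t + rng.normal(scale=0.1, size=t.size)  # "experimental" data

# Ordinary least squares fit of the model parameters.
A = np.column_stack([np.ones_like(t), t])
theta, *_ = np.linalg.lstsq(A, y_obs, rcond=None)
a_hat, b_hat = theta
```

Calibration under Uncertainty would go further than this point estimate, treating errors in both the data and the model itself, e.g. by placing a Bayesian posterior over (a, b).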
Driving Toward Guiding Principles
Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.
1999-01-01
As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065
CASSEM Chapter 6: Risk and Uncertainty
… In this way we can quantify our uncertainty in the likely fate of the CO2 in the system and make probabilistic predictions, with uncertainty strongly influencing the risk and risk informing important decisions regarding future data acquisition aimed at better understanding uncertainty. Both are iterative processes that take place throughout …
Predictive Uncertainty in Hydrological Forecasting
NASA Astrophysics Data System (ADS)
Todini, E.
2009-04-01
This work aims to discuss the role and relevance of "predictive uncertainty" in flood forecasting and water resources management. Predictive uncertainty is here defined as the probability of occurrence of a future value of a predictand (such as water level, discharge or water volume) conditional on prior observations and knowledge, as well as on all the information we can obtain on that specific future value, which is typically embodied in one or more hydrological/hydraulic model forecasts. The aim of this work is also to clarify questions such as: What is the conceptual difference between "total model uncertainty" (commonly used when dealing with model verification) and predictive uncertainty (which is used when forecasting into the future)? What is the difference between uncertainty in models, parameters, input and output measurement errors, initial and boundary conditions, etc., and predictive uncertainty? How can one incorporate all these uncertainties into the predictive uncertainty and, most of all, is it really necessary? The presently available uncertainty processors are then introduced and compared on the basis of their relative performances using operational flood forecasting systems. The uncertainty processors can be continuous (Hydrologic Uncertainty Processor, Bayesian Model Averaging, Model Conditional Processor, etc.) or binary (Logistic Regression, Binary Multivariate Bayesian Processor, etc.) depending on the scope for which they are developed and the type of decision one must take. Finally, the benefits of incorporating predictive uncertainty into the decision-making process are compared, on real-world examples, to those obtainable when using deterministic forecasts, as currently done in practice.
Calculating efficiencies and their uncertainties
Paterno, Marc; /Fermilab
2004-12-01
The commonly used methods for the calculation of the statistical uncertainties in cut efficiencies ("Poisson" and "binomial" errors) are both defective, as is seen in extreme cases. A method for the calculation of uncertainties based upon Bayes' Theorem is presented; this method has no problem with extreme cases. A program for the calculation of such uncertainties is also available.
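A minimal sketch of the Bayesian approach the note describes, assuming a flat prior on the efficiency (so that the posterior for k passes out of n is Beta(k+1, n-k+1); the helper name is illustrative):

```python
import math

def efficiency_posterior(k, n):
    """Flat-prior Bayesian estimate of a cut efficiency from k passes out of n.

    The posterior is Beta(k + 1, n - k + 1); its mean and standard deviation
    stay well behaved even in the extreme cases k = 0 or k = n, where the
    naive binomial error sqrt(eps*(1 - eps)/n) collapses to zero.
    """
    mean = (k + 1) / (n + 2)
    second_moment = (k + 1) * (k + 2) / ((n + 2) * (n + 3))
    return mean, math.sqrt(second_moment - mean ** 2)

# Extreme case: all 10 events pass the cut; the binomial error would be 0,
# while the Bayesian width remains finite.
m, s = efficiency_posterior(10, 10)
```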
Mama Software Features: Uncertainty Testing
Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2014-05-30
This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number than can be used to estimate the ‘back end’ uncertainty in digital image quantification in images. Statisticians are refining these numbers as part of a UQ effort.
Universally valid Heisenberg uncertainty relation
NASA Astrophysics Data System (ADS)
Fujikawa, Kazuo
2012-06-01
A universally valid Heisenberg uncertainty relation is proposed by combining the universally valid error-disturbance uncertainty relation of Ozawa with the relation of Robertson. This form of the uncertainty relation, which is defined with the same mathematical rigor as the relations of Kennard and Robertson, incorporates both the intrinsic quantum fluctuations and measurement effects.
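As a sketch of the combination described above (standard symbols assumed: ε for measurement error, η for disturbance, σ for standard deviation; the precise statement should be checked against the paper), adding Ozawa's relation to Robertson's yields the combined form:

```latex
% Robertson's relation for standard deviations:
\sigma(A)\,\sigma(B) \;\ge\; \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr|
% Ozawa's universally valid error-disturbance relation:
\epsilon(A)\,\eta(B) + \epsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
  \;\ge\; \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr|
% Adding the two inequalities gives a universally valid combined form:
\bigl(\epsilon(A)+\sigma(A)\bigr)\bigl(\eta(B)+\sigma(B)\bigr)
  \;\ge\; \bigl|\langle[A,B]\rangle\bigr|
```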
Ashford, Nicholas
2005-01-01
The precautionary principle is in sharp political focus today because (1) the nature of scientific uncertainty is changing and (2) there is increasing pressure to base governmental action on allegedly more "rational" ...
Uncertainty relations as Hilbert space geometry
NASA Technical Reports Server (NTRS)
Braunstein, Samuel L.
1994-01-01
Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states, not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable, not merely what is allowed.
Direct tests of measurement uncertainty relations: what it takes
Paul Busch; Neil Stevens
2015-01-17
The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that in nearly 90 years there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance), and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables.
Uncertainty Quantification in Lattice QCD Calculations for Nuclear Physics
Silas R. Beane; William Detmold; Kostas Orginos; Martin J. Savage
2014-10-11
The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.
Debating how to assess hydrological model uncertainty and weaknesses
NASA Astrophysics Data System (ADS)
Schultz, Colin
2013-01-01
The projections of hydrological models, as numerical abstractions of the complex systems they seek to represent, suffer from epistemic uncertainty due to approximation errors in the model, incomplete knowledge of the system, and, in more extreme cases, flawed underlying theories or faulty data. These errors can result in complex nonstationary biases in model predictions. Improving hydrological models—whether they be dynamic models attempting to represent the physical system from first principles or statistical, data-driven models—depends on having a way to determine the amount of uncertainty associated with the model's projections and a reliable way to deduce which model components or forcing data are responsible for it.
Picturing Data With Uncertainty
NASA Technical Reports Server (NTRS)
Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex
2004-01-01
NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary for multimodal distributions.
The uncertainty of the multi-valued data can also be interpreted by the number of peaks and the widths of the peaks as shown by the PDF walls.
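The per-cell density estimate underlying such PDF walls can be sketched as follows (a minimal Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth; the synthetic data and function name are illustrative, not the authors' implementation):

```python
import numpy as np

def gaussian_kde_1d(samples, xs, bandwidth=None):
    """Smooth density estimate for the multi-valued data at one grid cell.

    A minimal stand-in for a per-cell PDF estimate, using a Gaussian
    kernel with Silverman's rule-of-thumb bandwidth.
    """
    samples = np.asarray(samples, dtype=float)
    if bandwidth is None:
        bandwidth = 1.06 * samples.std() * samples.size ** (-1 / 5)
    z = (xs[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return kernels.sum(axis=1) / (samples.size * bandwidth)

# One "wall": PDFs for three cells along a slice of the 2D domain
# (synthetic samples; in the application these come from the uncertainty model).
rng = np.random.default_rng(1)
xs = np.linspace(-4.0, 4.0, 200)
wall = [gaussian_kde_1d(rng.normal(loc=mu, scale=1.0, size=500), xs)
        for mu in (-1.0, 0.0, 1.0)]
```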
Group environmental preference aggregation: the principle of environmental justice
Davos, C.A.
1986-01-01
The aggregation of group environmental preference presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as environmental justice, is established based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be so decided that their supporters will least mind being anyone at random in the new environment. The application of the principle is also discussed. Its only information requirement is a ranking of alternative choices by each interested party. 25 references.
Lean Production Principles in Remanufacturing A Case Study at a Toner Cartridge Remanufacturer
J. Ostlin; H. Ekholm
2007-01-01
Scandi-Toner AB works with remanufacturing of toner cartridges, both color cartridges and black cartridges. The company Scandi-Toner, and the remanufacturing industry in general, have some specific characteristics compared to ordinary manufacturing that might limit the possibilities to apply lean production principles, due to the high degree of uncertainty in the production process. These uncertainties are mainly caused by two …
Uncertainty Estimates for Millennial Scale Geomagnetic Field Models
NASA Astrophysics Data System (ADS)
Korte, M.; Constable, C. G.; Donadini, F.
2008-12-01
Continuous geomagnetic field models spanning several millennia have recently been developed using various selections of archeo- and paleomagnetic data and their inferred ages. In each case the geographic and temporal distribution of available data is far from uniform, and both the magnetic data and ages have large uncertainties. We estimate error bars for both the models and their predictions using two statistical resampling techniques and a combination thereof. First, we used what we call the spatial and temporal (ST) bootstrap, yielding different spatial and temporal distributions taken randomly from the original dataset. Second, we kept the original (temporal and spatial) distribution of data, but varied each datum randomly within the expected distributions of uncertainty in both the magnetic observation and assigned ages. We call this the magnetic/age (MA) bootstrap. We produced a large number of models based on resampled data using each of the ST and MA bootstrap methods and then obtained standard deviations for both global model coefficients and predictions of field components. The ST and MA methods yield model uncertainties of the same order of magnitude. A sequential combination of MA and ST resampling takes into account the influence of uncertainties in both magnetic elements and ages as well as the unsatisfactory data distribution. We present global and regional results from this analysis and compare the uncertainties obtained from model predictions to the assigned data errors. The uncertainties obtained for magnetic field elements vary depending on whether they are obtained by error propagation from uncertainties in the model coefficients or by computing the standard error in the individual element predictions for all resampled models. The propagated uncertainties do not currently allow for covariance among the coefficients. Hence, they can be too large in some geographic regions and time intervals with good data coverage.
Individual element uncertainty predictions incorporate any such covariance automatically, and can in principle better accommodate regional variations in model accuracy.
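The MA-bootstrap idea, perturbing each datum within its assigned uncertainty and refitting, can be sketched for a toy linear model (the model, synthetic data, and error level are illustrative assumptions, not the geomagnetic models themselves):

```python
import numpy as np

# MA-bootstrap sketch: keep the original data distribution, perturb every
# datum within its assigned error, refit, and take the spread of refitted
# coefficients as the model uncertainty. The linear model, synthetic data,
# and per-datum error are illustrative assumptions.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 40)             # stand-in for observation ages
sigma = 0.3                                 # assigned per-datum uncertainty
y = 1.5 + 0.4 * t + rng.normal(scale=sigma, size=t.size)
A = np.column_stack([np.ones_like(t), t])

coefs = []
for _ in range(500):
    y_pert = y + rng.normal(scale=sigma, size=y.size)   # one MA resample
    coefs.append(np.linalg.lstsq(A, y_pert, rcond=None)[0])
coef_std = np.asarray(coefs).std(axis=0)    # per-coefficient error bars
```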
Managing Uncertainty in Data and Models: UncertWeb
NASA Astrophysics Data System (ADS)
Nativi, S.; Cornford, D.; Pebesma, E. J.
2010-12-01
There is an increasing recognition that issues of quality, error and uncertainty are central concepts to both scientific progress and practical decision making. Recent moves towards evidence-driven policy and complex, uncertain scientific investigations into climate change and its likely impacts have heightened the awareness that uncertainty is critical in linking our observations and models to reality. The most natural, principled framework is provided by Bayesian approaches, which recognise a variety of sources of uncertainty such as aleatory (variability), epistemic (lack of knowledge) and possibly ontological (lack of agreed definitions). Most current information models used in the geosciences do not fully support the communication of uncertain results, although some do provide limited support for quality information in metadata. With the UncertWeb project (http://www.uncertweb.org), involving statisticians, geospatial and application scientists and informaticians, we are developing a framework for representing and communicating uncertainty in observational data and models which builds on existing standards such as the Observations and Measurements conceptual model, and related Open Geospatial Consortium and ISO standards, to allow the communication and propagation of uncertainty in chains of model services. A key component is the description of uncertainties in observational data, based on a revised version of UncertML, a conceptual model and encoding for representing uncertain quantities. In this talk we will describe how we envisage using UncertML with existing standards to describe the uncertainty in observational data and how this uncertainty information can then be propagated through subsequent analysis. We will highlight some of the tools which we are developing within UncertWeb to support the management of uncertainty in web based geoscientific applications.
Instructional Software Design Principles.
ERIC Educational Resources Information Center
Hazen, Margret
1985-01-01
Discusses learner/computer interaction, learner control, sequencing of instructional events, and graphic screen design as effective principles for the design of instructional software, including tutorials. (MBR)
Principles and Methods Chromatography
Lebendiker, Mario
Handbook front-matter excerpt (catalog listing): Protein Purification Handbook; Ion Exchange Chromatography Principles and Methods; Custom Designed Media and Columns.
Physical principles of hearing
NASA Astrophysics Data System (ADS)
Martin, Pascal
2015-10-01
The following sections are included: * Psychophysical properties of hearing * The cochlear amplifier * Mechanosensory hair cells * The "critical" oscillator as a general principle of auditory detection * Bibliography
Antarctic Photochemistry: Uncertainty Analysis
NASA Technical Reports Server (NTRS)
Stewart, Richard W.; McConnell, Joseph R.
1999-01-01
Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298 K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
Probabilistic Mass Growth Uncertainties
NASA Technical Reports Server (NTRS)
Plumer, Eric; Elliott, Darren
2013-01-01
Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
NASA Astrophysics Data System (ADS)
Hobson, Art
2011-10-01
An earlier paper (Ref. 2) introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields (Ref. 2). Both the Schroedinger field, or "matter field," and the EM field are made of "quanta": spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.
The maintenance of uncertainty
NASA Astrophysics Data System (ADS)
Smith, L. A.
Contents: Introduction; Preliminaries; State-space dynamics; Linearized dynamics of infinitesimal uncertainties; Instantaneous infinitesimal dynamics; Finite-time evolution of infinitesimal uncertainties; Lyapunov exponents and predictability; The Baker's apprentice map; Infinitesimals and predictability; Dimensions; The Grassberger-Procaccia algorithm; Towards a better estimate from Takens' estimators; Space-time-separation diagrams; Intrinsic limits to the analysis of geometry; Takens' theorem; The method of delays; Noise; Prediction, prophecy, and pontification; Introduction; Simulations, models and physics; Ground rules; Data-based models: dynamic reconstructions; Analogue prediction; Local prediction; Global prediction; Accountable forecasts of chaotic systems; Evaluating ensemble forecasts; The annulus; Prophecies; Aids for more reliable nonlinear analysis; Significant results: surrogate data, synthetic data and self-deception; Surrogate data and the bootstrap; Surrogate predictors: Is my model any good?; Hints for the evaluation of new techniques; Avoiding simple straw men; Feasibility tests for the identification of chaos; On detecting "tiny" data sets; Building models consistent with the observations; Cost functions; ?-shadowing: Is my model any good? (reprise); Casting infinitely long shadows (out-of-sample); Distinguishing model error and system sensitivity; Forecast error and model sensitivity; Accountability; Residual predictability; Deterministic or stochastic dynamics?; Using ensembles to distinguish the expectation from the expected; Numerical Weather Prediction; Probabilistic prediction with a deterministic model; The analysis; Constructing and interpreting ensembles; The outlook(s) for today; Conclusion; Summary
Uncertainty in adaptive capacity
NASA Astrophysics Data System (ADS)
Adger, W. Neil; Vincent, Katharine
2005-03-01
The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).
Uncertainty relation in Schwarzschild spacetime
Jun Feng; Yao-Zhong Zhang; Mark D. Gould; Heng Fan
2015-02-27
We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit $-\log_2 c$. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
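The intrinsic limit $-\log_2 c$ mentioned above is set by the maximal overlap c between the eigenbases of the two measured observables; a minimal sketch for a qubit (the function name is illustrative):

```python
import numpy as np

def uncertainty_bound(basis_a, basis_b):
    """Incompatibility term -log2(c) of the entropic uncertainty relation,
    where c is the maximal squared overlap |<a_i|b_j>|^2 between the two
    measurement bases (columns of the input matrices)."""
    overlaps = np.abs(basis_a.conj().T @ basis_b) ** 2
    return -np.log2(overlaps.max())

Z = np.eye(2)                                            # computational (sigma_z) basis
X = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # sigma_x basis

bound = uncertainty_bound(Z, X)   # mutually unbiased qubit bases: c = 1/2
```

For mutually unbiased bases the bound is one full bit of joint uncertainty; with an entangled quantum memory, the memory-assisted bound can drop below this, which is the effect exploited in the paper.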
Disturbance trade-off principle for quantum measurements
NASA Astrophysics Data System (ADS)
Mandayam, Prabha; Srinivas, M. D.
2014-12-01
We demonstrate a fundamental principle of disturbance tradeoff for quantum measurements, along the lines of the celebrated uncertainty principle: The disturbances associated with measurements performed on distinct yet identically prepared ensembles of systems in a pure state cannot all be made arbitrarily small. Indeed, we show that the average of the disturbances associated with a set of projective measurements is strictly greater than zero whenever the associated observables do not have a common eigenvector. For such measurements, we show an equivalence between disturbance tradeoff measured in terms of fidelity and the entropic uncertainty tradeoff formulated in terms of the Tsallis entropy (T2). We also investigate the disturbances associated with the class of nonprojective measurements, where the difference between the disturbance tradeoff and the uncertainty tradeoff manifests quite clearly.
Uncertainty As Knowledge: Harnessing Ambiguity and Uncertainty into Policy Constraints
NASA Astrophysics Data System (ADS)
Lewandowsky, S.; Risbey, J.
2014-12-01
There are numerous sources of uncertainty that impact policy decisions relating to climate change: There is scientific uncertainty, as for example encapsulated in estimates of climate sensitivity. There is policy uncertainty, which arises when mitigation efforts are erratic or are reversed (as recently happened in Australia). There is also technological uncertainty, which affects the mitigation pathway. How can policy decisions be informed in light of these multiple sources of uncertainty? We propose an "ordinal" approach that relies on comparisons such as "greater than" or "less than", which can help sidestep disagreement about specific parameter estimates (e.g., climate sensitivity). To illustrate, recent analyses (Lewandowsky et al., 2014, Climatic Change) have shown that the magnitude of uncertainty about future temperature increases is directly linked with the magnitude of future risk: the greater the uncertainty, the greater the risk of mitigation failure (defined as exceeding a carbon budget for a predetermined threshold). Here we extend this approach to other sources of uncertainty, with a particular focus on "ambiguity" or "second-order" uncertainty, which arises when there is dissent among experts.
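The ordinal link between spread and risk, greater uncertainty implying a greater chance of exceeding a fixed threshold when the best estimate lies below it, can be illustrated with a toy Monte Carlo (the normal distribution and all numbers are assumptions for illustration, not the paper's calculation):

```python
import numpy as np

# Hold the best estimate fixed below a threshold, widen the spread,
# and watch the exceedance probability rise.
rng = np.random.default_rng(3)
threshold = 2.0      # e.g., a warming threshold
best_estimate = 1.5

def exceedance_prob(spread, n=200_000):
    draws = rng.normal(loc=best_estimate, scale=spread, size=n)
    return float((draws > threshold).mean())

p_narrow = exceedance_prob(0.3)
p_wide = exceedance_prob(0.9)    # same best estimate, larger uncertainty
```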
[Main principles of radiobiology].
Kudriashov, Iu B
2001-01-01
Throughout the history of radiation biology, the problem of the "energy paradox" (the low energy deposited by ionizing and non-ionizing radiation relative to the magnitude of the irradiation effect) has been a central focus. The first principle, which contributed much to the quantitative concepts of radiation biology, is the hit principle. The hit principle, as is well known, is based on the physical properties of ionizing radiation: its discontinuity, quantization, and probabilistic distribution in space. Hits, i.e. acts of energy interaction with elements of matter, are independent of each other and obey a Poisson distribution. The other well-known principle, the target principle, is based on the understanding that a living system has particular features: the structure of its elements, as well as their functions, are heterogeneous, unequal, and differ in their response to the same hits. Along with the unique DNA macromolecule, a critical target structure, biological membranes (BM), with their barrier-matrix, energy, and regulatory functions that underlie living processes, can also be considered sensitive target structures. A further principle, the amplification of primary radiation lesions in critical target structures, is based on the radiation post-effect, a well-known phenomenon in radiation biology. The fourth principle is that of target damage recovery (regulation of cell homeostasis), meaning a system response to irradiation involving mechanisms of protection and repair of lesions in DNA and BM. Progress in molecular biology and radiation biophysics over the last two decades has given an especially powerful impetus to the development of those principles that are based on the analysis of radiobiological effects developing in time. The main principles of radiation biology thus reflect the peculiarities of the physical and biological action of ionizing radiation. PMID:11721348
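The hit principle's Poisson statistics lead directly to the classic single-hit survival curve: if hits on a target follow a Poisson law with mean proportional to dose, the surviving fraction is the probability of zero hits. A minimal sketch (D0 is a hypothetical characteristic dose):

```python
import math

def surviving_fraction(dose, d0):
    """Single-hit target theory: survival = P(0 hits) = exp(-dose/D0),
    since hit counts are Poisson-distributed with mean dose/D0."""
    return math.exp(-dose / d0)

# At dose D = D0 the mean hit count is 1, so survival is e**-1 ~ 0.368.
print(surviving_fraction(1.0, 1.0))
```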
Fundamental Principles Informing Sponsored Programs
McConnell, Terry
Activities inconsistent with these fundamental principles require the approval of the Vice President for Research. Normally, a "reasonable delay" is defined as 60 days. Student participation is an important part; see Faculty Manual Section 3.23. Data and property rights: the ownership of these items may be retained
Uncertainties in the Astronomical Ephemeris as Constraints on New Physics
NASA Astrophysics Data System (ADS)
Warecki, Zoey; Overduin, J.
2014-01-01
Most extensions of the standard model of particle physics predict composition-dependent violations of the universality of free fall (equivalence principle). We test this idea using observational uncertainties in mass, range and mean motion for the Moon and planets, as well as orbit uncertainties for Trojan asteroids and Saturnian satellites. For suitable pairs of solar-system bodies, we derive linearly independent constraints on relative difference in gravitational and inertial mass from modifications to Kepler's third law, the migration of stable Lagrange points, and orbital polarization (the Nordtvedt effect). These constraints can be combined with data on bulk composition to extract limits on violations of the equivalence principle for individual elements relative to one another. These limits are weaker than those from laboratory experiments, but span a much larger volume in composition space.
Hamilton's Principle for Beginners
ERIC Educational Resources Information Center
Brun, J. L.
2007-01-01
I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…
Principles of Software Testing
Meyer, Bertrand
Seven Principles of Software Testing. Bertrand Meyer, ETH Zürich and Eiffel Software. While everyone knows the theoretical limitations of software testing, in practice we devote considerable effort to testing and to performing it right. The principles that follow emerged from experience studying software
NSDL National Science Digital Library
Dr. Rod Nave
This tutorial provides instruction on Pauli's exclusion principle, formulated by physicist Wolfgang Pauli in 1925, which states that no two electrons in an atom can have identical quantum numbers. Topics include a mathematical statement of the principle, descriptions of some of its applications, and its role in ionic and covalent bonding, nuclear shell structure, and nuclear binding energy.
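The principle's counting consequences are easy to verify: enumerating the allowed (n, l, m_l, m_s) tuples, each of which Pauli exclusion assigns to at most one electron, reproduces the familiar shell capacities 2n². A short sketch:

```python
def shell_states(n):
    """Enumerate the distinct (n, l, m_l, m_s) quantum-number tuples for
    principal quantum number n; Pauli exclusion allows one electron per tuple."""
    return [(n, l, ml, ms)
            for l in range(n)               # l = 0 .. n-1
            for ml in range(-l, l + 1)      # m_l = -l .. +l
            for ms in (-0.5, +0.5)]         # two spin projections

for n in (1, 2, 3):
    print(n, len(shell_states(n)))          # 2, 8, 18, i.e. 2*n**2
```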
ERIC Educational Resources Information Center
Batstone, Rob; Ellis, Rod
2009-01-01
A key aspect of the acquisition of grammar for second language learners involves learning how to make appropriate connections between grammatical forms and the meanings which they typically signal. We argue that learning form/function mappings involves three interrelated principles. The first is the Given-to-New Principle, where existing world…
Principles of engineering geology
P. B. Attewell; I. W. Farmer
1976-01-01
This book discusses basic principles as well as the practical applications of geological survey and analysis. Topics covered include the mechanical and physical response of rocks, rock masses and soils to changes in environmental conditions, and the principles of groundwater flow. The core of the book deals with the collection of geological and technical data, its subsequent analysis, and application
Alternative Approaches to Uncertainty Calculations for TIMS Isotopic Measurements
NASA Astrophysics Data System (ADS)
Thomas, R. B.; Essex, R. M.; Goldberg, S. A.
2006-12-01
Two methods of estimating uncertainty for TIMS U isotopic ratio measurements were evaluated. Although these methods represent fundamentally different approaches, both are consistent with the principles outlined in the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM). In the "Discrete Component" approach, all of the identifiable sources of random variability associated with the mass spectrometer (gain variability, baseline variability, cup efficiency variability, Schottky noise, counting statistics) are individually assessed to estimate measurement reproducibility. The second approach is an "Integrated" method, which uses the observed reproducibility of repeated identical sample measurements to confound the various components of random variability. Evaluation of the uncertainty budgets for the two methods shows that the relative importance of an uncertainty component in a measurement is highly dependent on the measurement technique and the isotopic ratio of the measured value. For example, the uncertainty of the ^{235}U/^{238}U ratio of the material analyzed in this study will generally be dominated by the uncertainty of the CRM used to determine the mass fractionation factor. The more extreme ^{234}U/^{238}U and ^{236}U/^{238}U ratios are often dominated by other factors such as internal and external reproducibility. Although both methods are consistent with the GUM principles, there are many instrumental factors that can produce measurement variability but are not readily quantifiable (e.g., small differences in run conditions, filament geometry, sample loading). Accordingly, the Discrete Component determination can accurately estimate the internal reproducibility of an isotopic measurement but will not sufficiently characterize the analysis-to-analysis variability that is inherent in all measurements.
The Integrated approach to uncertainty evaluation has the advantage of not requiring the quantification of an extensive set of variables and also greatly simplifies the calculation of a combined standard uncertainty. This method, however, has the distinct disadvantage of requiring a statistically significant number of replicate analyses and does not allow for the determination of primary contributors to internal variability. Replicate measurements are not practical or possible for many analytical situations but it is still necessary to assess the uncertainty associated with external reproducibility. A straightforward method for estimating an external reproducibility factor for isotopic measurements is to incorporate the standard uncertainty of repeated measurements of a matrix-matched reference material or even an isotopic CRM if a matrix-matched material is unavailable.
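The two approaches can be contrasted in a few lines (all numbers hypothetical): under GUM, independent components combine in quadrature, while the Integrated method takes the observed spread of replicate measurements directly.

```python
import math
import statistics

def combined_uncertainty(components):
    """Discrete Component approach: root-sum-square of independent
    standard-uncertainty components (GUM, uncorrelated inputs)."""
    return math.sqrt(sum(u * u for u in components))

def integrated_uncertainty(replicates):
    """Integrated approach: observed reproducibility of repeated
    identical measurements (sample standard deviation)."""
    return statistics.stdev(replicates)

# Hypothetical per-component relative uncertainties for one run
# (gain, baseline, cup efficiency, counting statistics):
print(combined_uncertainty([0.02, 0.01, 0.015, 0.03]))
# Hypothetical replicate ratio measurements of the same sample:
print(integrated_uncertainty([1.002, 0.998, 1.001, 0.999, 1.000]))
```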
ERIC Educational Resources Information Center
Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.
2002-01-01
Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…
Uncertainty and Surprise: An Introduction
NASA Astrophysics Data System (ADS)
McDaniel, Reuben R.; Driebe, Dean J.
Much of the traditional scientific and applied scientific work in the social and natural sciences has been built on the supposition that the unknowability of situations is the result of a lack of information. This has led to an emphasis on uncertainty reduction through ever-increasing information seeking and processing, including better measurement and observational instrumentation. Pending uncertainty reduction through better information, efforts are devoted to uncertainty management and hierarchies of controls. A central goal has been the avoidance of surprise.
Khoury, Justin [Perimeter Institute for Theoretical Physics, 31 Caroline St. N., Waterloo, Ontario, Canada N2L 2Y5 (Canada); Center for Particle Cosmology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States); Parikh, Maulik [Institute for Strings, Cosmology, and Astroparticle Physics, Columbia University, New York, New York 10027 (United States); Inter-University Centre for Astronomy and Astrophysics, Post Bag 4, Pune 411007 (India)
2009-10-15
Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.
NASA Technical Reports Server (NTRS)
Zuk, J.
1976-01-01
The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation, non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.
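As a hedged illustration of the leakage calculations mentioned above (not the report's own equations), laminar liquid flow through a flat parallel film follows the plane Poiseuille result, with leakage scaling as the cube of the film thickness:

```python
def film_leakage(width, gap, dp, mu, length):
    """Laminar leakage through a flat parallel thin film (plane Poiseuille
    flow): Q = width * gap**3 * dp / (12 * mu * length), SI units.
    width: film breadth [m], gap: film thickness [m], dp: pressure drop [Pa],
    mu: dynamic viscosity [Pa.s], length: flow-path length [m]."""
    return width * gap**3 * dp / (12.0 * mu * length)

# Hypothetical face seal: 0.1 m circumference, 5 micron film, 1 bar drop,
# water-like viscosity 1e-3 Pa.s, 5 mm radial flow path.
print(film_leakage(width=0.1, gap=5e-6, dp=1e5, mu=1e-3, length=5e-3))
```

The cubic dependence on the gap is why small changes in film thickness dominate seal leakage estimates.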
Cancelling out systematic uncertainties
Noreña, Jorge; Jimenez, Raul; Pena-Garay, Carlos; Gomez, Cesar
2011-01-01
We present a method to minimize, or even cancel out, the nuisance parameters affecting a measurement. Our approach is general and can be applied to any experiment or observation. We compare it with marginalization, the Bayesian technique used to deal with nuisance parameters, and show how our method improves on it by avoiding biases. We illustrate the method with several examples taken from the astrophysics and cosmology world: baryonic acoustic oscillations, cosmic clocks, Supernova Type Ia luminosity distance, neutrino oscillations, and dark matter detection. By applying the method we recover some known results but also find some interesting new ones. For baryonic acoustic oscillation (BAO) experiments we show how to combine radial and angular BAO measurements in order to completely eliminate the dependence on the sound horizon at radiation drag. In the case of exploiting SN1a as standard candles we show how the uncertainty in the luminosity distance by a second parameter modeled as a metallicity depe...
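For contrast with the proposed method, the Bayesian baseline, marginalization, integrates the likelihood over the nuisance parameter's prior; a toy discrete sketch (all values hypothetical):

```python
import math

def marginal_likelihood(data_x, theta, nuisances, prior_weights, sigma=1.0):
    """Marginalize a Gaussian likelihood L(x | theta + nu) over a discrete
    prior on the nuisance parameter nu (weights should sum to 1)."""
    total = 0.0
    for nu, w in zip(nuisances, prior_weights):
        mean = theta + nu
        total += w * math.exp(-0.5 * ((data_x - mean) / sigma) ** 2)
    return total

# Hypothetical: observed x = 1.0, nuisance offset equally likely 0.0 or 0.5.
like = marginal_likelihood(1.0, theta=1.0, nuisances=[0.0, 0.5],
                           prior_weights=[0.5, 0.5])
print(like)
```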
NASA Astrophysics Data System (ADS)
Petzinger, Tom
I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia, spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.
Satellite altitude determination uncertainties
NASA Technical Reports Server (NTRS)
Siry, J. W.
1971-01-01
Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite. GEOS-C will be tracked by a number of the conventional satellite tracking systems, as well as by two advanced systems; a satellite-to-satellite tracking system and lasers capable of decimeter accuracies which are being developed in connection with the Goddard Earth and Ocean Dynamics Applications program. The discussion is organized in terms of a specific type of GEOS-C orbit which would satisfy a number of scientific objectives including the study of the gravitational field by means of both the altimeter and the satellite-to-satellite tracking system, studies of tides, and the Gulf Stream meanders.
Integrating out astrophysical uncertainties
Fox, Patrick J. [Theoretical Physics Department, Fermilab, Batavia, Illinois 60510 (United States); School of Natural Sciences, Institute for Advanced Study, Einstein Drive, Princeton, New Jersey 08540 (United States); Liu Jia [Center for Cosmology and Particle Physics, Department of Physics, New York University, New York, New York 10003 (United States); Weiner, Neal [Center for Cosmology and Particle Physics, Department of Physics, New York University, New York, New York 10003 (United States); School of Natural Sciences, Institute for Advanced Study, Einstein Drive, Princeton, New Jersey 08540 (United States)
2011-05-15
Underground searches for dark matter involve a complicated interplay of particle physics, nuclear physics, atomic physics, and astrophysics. We attempt to remove the uncertainties associated with astrophysics by developing the means to map the observed signal in one experiment directly into a predicted rate at another. We argue that it is possible to make experimental comparisons that are completely free of astrophysical uncertainties by focusing on integral quantities, such as g(v_min) = ∫_{v_min} dv f(v)/v and ∫_{v_thresh} dv v g(v). Direct comparisons are possible when the v_min space probed by different experiments overlaps. As examples, we consider the possible dark matter signals at CoGeNT, DAMA, and CRESST-Oxygen. We find that the expected rate from CoGeNT in the XENON10 experiment is higher than observed, unless scintillation light output is low. Moreover, we determine that S2-only analyses are constraining, unless the charge yield Q_y < 2.4 electrons/keV. For DAMA to be consistent with XENON10, we find for q_Na = 0.3 that the modulation rate must be extremely high (≳ 70% for m_χ = 7 GeV), while for higher quenching factors, it makes an explicit prediction (0.8-0.9 cpd/kg) for the modulation to be observed at CoGeNT. Finally, we find that CDMS-Si, even with a 10 keV threshold, as well as XENON10, even with low scintillation, would have seen significant rates if the excess events at CRESST arise from elastic WIMP scattering, making it very unlikely to be the explanation of this anomaly.
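The integral quantity g(v_min) is easy to evaluate numerically for any assumed halo model; a sketch with an illustrative (not the paper's) truncated Maxwellian speed distribution and hypothetical parameters:

```python
import math

def g_vmin(v_min, v0=220.0, v_esc=550.0, dv=1.0):
    """Halo integral g(v_min) = ∫_{v_min}^{v_esc} dv f(v)/v, evaluated by a
    simple Riemann sum with an (unnormalized) Maxwellian-like speed
    distribution f(v) ∝ v**2 * exp(-(v/v0)**2) as an illustrative choice.
    Speeds in km/s; v0 and v_esc are hypothetical halo parameters."""
    total, v = 0.0, v_min
    while v < v_esc:
        f = v * v * math.exp(-((v / v0) ** 2))
        total += (f / v) * dv if v > 0 else 0.0
        v += dv
    return total

# g is monotonically decreasing in v_min: higher thresholds see less halo.
print(g_vmin(100.0) > g_vmin(300.0))   # True
```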
Uncertainty and Anticipation in Anxiety
Grupe, Dan W.; Nitschke, Jack B.
2014-01-01
Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact, and thus results in anxiety. Here, we focus the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations to the neural instantiation of these processes results in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199
Evaluating uncertainty in simulation models
McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.
1998-12-01
The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not yet exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
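A variance-based importance measure of the kind mentioned can be sketched as Var(E[y | x_i]) / Var(y), here estimated by exhaustive enumeration on a toy model (coefficients hypothetical):

```python
from itertools import product
from statistics import mean, pvariance

def first_order_importance(model, grids, index):
    """Variance-based first-order importance of input `index`:
    Var(E[y | x_index]) / Var(y), by exhaustive enumeration over
    independent, uniformly weighted discrete input grids."""
    ys = [model(pt) for pt in product(*grids)]
    total_var = pvariance(ys)
    cond_means = []
    for xi in grids[index]:
        # Fix input `index` at xi, average the model over the other inputs.
        fixed = [g if j != index else [xi] for j, g in enumerate(grids)]
        cond_means.append(mean(model(pt) for pt in product(*fixed)))
    return pvariance(cond_means) / total_var

# Toy model: y = 3*x1 + x2 with x1, x2 independent on {-1, +1}.
model = lambda p: 3 * p[0] + p[1]
print(first_order_importance(model, [[-1, 1], [-1, 1]], 0))  # 0.9
print(first_order_importance(model, [[-1, 1], [-1, 1]], 1))  # 0.1
```

The dominant input (x1) explains 90% of the output variance, matching the 9:1 ratio of squared coefficients.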
Uncertainty of testing methods--what do we (want to) know?
Paparella, Martin; Daneshian, Mardas; Hornek-Gausterer, Romana; Kinzl, Maximilian; Mauritz, Ilse; Mühlegger, Simone
2013-01-01
It is important to stimulate innovation for regulatory testing methods. Scrutinizing the knowledge of the (un)certainty of data from current standard in vivo methods could foster interest in new testing approaches. Since standard in vivo data often are used as reference data for model development, improved accountability for uncertainty also would support the validation of new in vitro and in silico methods, as well as the definition of acceptance criteria for the new methods. Hazard and risk estimates that are transparent about their uncertainty could further support the 3Rs, since they may help focus additional information requirements on the aspects of highest uncertainty. Here we provide an overview of the various types of uncertainty in quantitative and qualitative terms and suggest improving this knowledge base. We also reference principal concepts on how to use uncertainty information for improved hazard characterization and the development of new testing methods. PMID:23665803
Design Principles
Almulhem, Ahmad
Outline: 1. Overview; 2. Design Principles (Least Privilege, Fail-Safe Defaults, Economy of Mechanism); 3. Key Points.
How to handle calibration uncertainties in high-energy astrophysics
NASA Astrophysics Data System (ADS)
Kashyap, Vinay L.; Lee, Hyunsook; Siemiginowska, Aneta; McDowell, Jonathan; Rots, Arnold; Drake, Jeremy; Ratzlaff, Pete; Zezas, Andreas; Izem, Rima; Connors, Alanna; van Dyk, David; Park, Taeyoung
2008-07-01
Unlike statistical errors, whose importance has been well established in astronomical applications, uncertainties in instrument calibration are generally ignored. Despite wide recognition that uncertainties in calibration can cause large systematic errors, robust and principled methods to account for them have not been developed, and consequently there is no mechanism by which they can be incorporated into standard astronomical data analysis. Here we present a framework in which calibration uncertainties can be encoded and brought within the scope of analysis. We describe this framework, which is based on a modified MCMC algorithm, and propose a format standard, derived from experience with effective area measurements of the ACIS-S detector on Chandra, that can be applied to any instrument or method of codifying systematic errors. Calibration uncertainties can then be propagated into model parameter estimates to produce error bars that include systematic error information.
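The propagation step can be caricatured without the full MCMC machinery: draw plausible calibration values and observe the induced spread in the inferred parameter. A hedged sketch with hypothetical numbers (not the ACIS-S format standard):

```python
import random

def flux_with_calib_uncertainty(counts, exposure, area_nominal,
                                area_frac_err, n_draws=20_000, seed=1):
    """Propagate a fractional calibration (effective-area) uncertainty into
    a flux estimate by Monte Carlo: flux = counts / (exposure * area),
    with the area drawn around its nominal value each iteration."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        area = area_nominal * (1.0 + rng.gauss(0.0, area_frac_err))
        draws.append(counts / (exposure * area))
    m = sum(draws) / len(draws)
    sd = (sum((d - m) ** 2 for d in draws) / (len(draws) - 1)) ** 0.5
    return m, sd

# Hypothetical: 10_000 counts, 50 ks exposure, 400 cm^2 area, 5% calib error.
mean_flux, flux_err = flux_with_calib_uncertainty(10_000, 5e4, 400.0, 0.05)
print(mean_flux, flux_err)   # systematic error bar induced by calibration
```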
Maximum predictive power and the superposition principle
NASA Technical Reports Server (NTRS)
Summhammer, Johann
1994-01-01
In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.
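The "associated variable" idea resembles a variance-stabilizing transform; assuming the arcsine transform as an illustration (our assumption, not necessarily the paper's construction), the uncertainty of the transformed quantity depends only on the number of runs N:

```python
import math

def associated_variable(p):
    """Variance-stabilizing (arcsine) transform of a probability estimate:
    chi = arcsin(sqrt(p)); its standard error is ~ 1/(2*sqrt(N))
    independently of p, so accuracy depends only on the amount of data."""
    return math.asin(math.sqrt(p))

def back_to_probability(chi):
    """Invert the transform: p = sin(chi)**2."""
    return math.sin(chi) ** 2

# Round trip, plus the data-size-only error bar for N = 400 runs:
chi = associated_variable(0.3)
print(back_to_probability(chi))         # ~0.3
print(1.0 / (2.0 * math.sqrt(400)))     # 0.025, for any p
```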
Pérez-Soba Díez del Corral, Juan José
2008-01-01
Bioethics emerged around the technological problems of intervening in human life, and with it the problem of determining moral limits, which appear external to this practice. Principlist bioethics draws its rationality from teleological thinking and from autonomism. This divergence reveals an epistemological fragility and the great difficulty of "moral" thinking. This is evident in the formulation of the principle of autonomy, which lacks the ethical content of Kant's proposal. We need a new ethical rationality, with fresh reflection on new principles that emerge from basic ethical experiences. PMID:18402229
Environmental assessments: Uncertainties in implementation
Hunsaker; D. B. Jr
1987-01-01
A review of the regulations, guidance, statutes, and case law affecting Environmental Assessment (EA) preparation has identified a number of uncertainties that, if clarified, would facilitate EA preparation and National Environmental Policy Act (NEPA) implementation. Recommendations are made for clarifying the uncertainties regarding EA preparation to help EAs fulfill their intended role in the NEPA process and to thereby facilitate
Quantification of Emission Factor Uncertainty
Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...
Hydrology, society, change and uncertainty
NASA Astrophysics Data System (ADS)
Koutsoyiannis, Demetris
2014-05-01
Heraclitus, who predicated that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial. Also, decision making under certainty is mostly trivial. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure, and credulous toward a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is compounded by a misconception in the scientific community that confuses science with the elimination of uncertainty. However, recognizing that uncertainty is inevitable and tightly connected with change will help us appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.
Choice under Uncertainty Jonathan Levin
Zalta, Edward N.
Choices are made in the face of uncertainty. While we often rely on models of certain information, how should individuals evaluate different investments given uncertainty about returns, their health, and their future preferences? How should firms choose what products to develop? A goal of these classes is to develop a model of choice behavior under uncertainty. We start with the von Neumann
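The von Neumann-Morgenstern framework these notes build toward ranks risky prospects by expected utility; a minimal sketch with a hypothetical concave (risk-averse) utility:

```python
import math

def expected_utility(lottery, utility):
    """Von Neumann-Morgenstern criterion: rank lotteries, given as lists of
    (probability, outcome) pairs, by probability-weighted utility."""
    return sum(p * utility(x) for p, x in lottery)

u = math.log   # hypothetical risk-averse (concave) utility of wealth

safe  = [(1.0, 100.0)]                  # 100 for sure
risky = [(0.5, 50.0), (0.5, 150.0)]     # same expected value, more spread

# A risk-averse agent prefers the sure thing despite equal expected value.
print(expected_utility(safe, u) > expected_utility(risky, u))   # True
```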
Planning ATES systems under uncertainty
NASA Astrophysics Data System (ADS)
Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin
2015-04-01
Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters.
From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions form a complex adaptive system, for which agent-based modelling provides a useful analysis framework. This study therefore explores the interactions between endogenous ATES adoption processes and the relative performance of different planning schemes, using an agent-based adoption model coupled with a hydrologic model of the subsurface. The models are parameterized to simulate typical operating conditions for ATES systems in a dense urban area. Furthermore, uncertainties relating to planning parameters, adoption processes, and climatic conditions are explicitly considered using exploratory modelling techniques. Results are therefore presented for the performance of different planning policies over a broad range of plausible scenarios.
Uncertainty in Integrated Assessment Scenarios
Mort Webster
2005-10-17
The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. 
Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean trends from a model for uncertainty projections. The probability distributions of these critical model drivers, and the resulting uncertainty in projections from a range of models, can provide the basis of future emission scenario set designs.
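The procedure described, normalizing historical variability and applying it around a model's median path, can be sketched as follows (all figures hypothetical):

```python
import random
import statistics

def project_paths(historical_growth, median_path, n_paths=1000, seed=42):
    """Build an uncertainty envelope around a model's median projection by
    sampling annual growth shocks with the variability observed in history."""
    sigma = statistics.stdev(historical_growth)   # observed variability
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        level, path = 1.0, []
        for g in median_path:
            level *= 1.0 + g + rng.gauss(0.0, sigma)  # median + sampled shock
            path.append(level)
        paths.append(path)
    return paths

# Hypothetical historical productivity growth rates and a 5-year median path:
hist = [0.021, 0.015, 0.028, 0.010, 0.019, 0.024]
paths = project_paths(hist, median_path=[0.02] * 5)
final = sorted(p[-1] for p in paths)
print(final[50], final[-50])   # roughly the 5th and 95th percentile levels
```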
Uncertainty relation in Schwarzschild spacetime
Feng, Jun; Gould, Mark D; Fan, Heng
2015-01-01
We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed by a proper uncertainty game experimentally. We first investigate an uncertainty game between a freely falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole, which triggers an effectively reduced uncert...
Scattering strength uncertainty
NASA Astrophysics Data System (ADS)
Harrison, Chris H.
2002-11-01
A serious weakness in modeling shallow water reverberation is the uncertainty in bottom scattering strength and its angle-dependence. If the bottom scattering law is assumed to be a separable function of an incoming and outgoing angle it follows that the reverberation contains separable incoming and outgoing propagation terms. Thus the returning multipaths from a scattering patch are weighted directly by (the outgoing part of) the scattering law. This means that comparisons of reverberation and propagation angle-dependence on a vertical receiving array have the potential to reveal the scattering law directly. In this paper we discuss a reverberation experiment with complementary propagation measurements using a VLA and a broadband source to deduce scattering law angle-dependence and absolute scattering strength. The approach is justified by some analysis, and findings are compared with the numerical results of a new multistatic sonar model, SUPREMO. The experiment was conducted in a fairly flat bottomed part of the Mediterranean south of Sicily during BOUNDARY2002.
Pandemic influenza: certain uncertainties
Morens, David M.; Taubenberger, Jeffery K.
2011-01-01
SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672
Uncertainties in Supernova Yields
NASA Astrophysics Data System (ADS)
Young, Patrick A.; Fryer, C. L.
2006-12-01
Theoretical nucleosynthetic yields from supernovae are sensitive to both the details of the progenitor star and the explosion calculation. We attempt to comprehensively identify the sources of uncertainties in these yields. In this poster we concentrate on the variations in yields from a single progenitor arising from common 1-dimensional methods of approximating a supernova explosion. 3-dimensional effects in the explosion and the progenitor and improved physics in the progenitor evolution are also given preliminary consideration. For the 1-dimensional explosions we find that both elemental and isotopic yields for Si and heavier elements are a sensitive function of explosion energy. Also, piston-driven and thermal bomb type explosions have different yields for the same explosion energy. Yields derived from 1-dimensional explosions are non-unique. Bulk yields of common elements can vary by factors of several depending upon the assumptions of the calculation. This work was carried out in part under the auspices of the National Nuclear Security Administration of the U.S. Department of Energy at Los Alamos National Laboratory and supported by Contract No. DE-AC52-06NA25396, by a DOE SciDAC grant DE-FC02-01ER41176, an NNSA ASC grant, and a subcontract to the ASCI FLASH Center at the University of Chicago.
ERIC Educational Resources Information Center
Porter, J. D.
This paper provides a brief history of computers. It explains basic computer principles and compares computer capabilities. Subjects such as input/output, binary logic, storage, and cost are also discussed. (Author)
Buoyancy and Archimedes Principle
NSDL National Science Digital Library
Summary Buoyancy is based on Archimedes' Principle which states that the buoyant force acting upward on an object completely or partially immersed in a fluid equals the weight of the fluid displaced by the ...
Global ethics and principlism.
Gordon, John-Stewart
2011-09-01
This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together. PMID:22073817
Chemical Principles Exemplified
ERIC Educational Resources Information Center
Plumb, Robert C.
1972-01-01
Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)
Archimedes' Principle in Action
ERIC Educational Resources Information Center
Kires, Marian
2007-01-01
The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)
The Bayesian brain: phantom percepts resolve sensory uncertainty.
De Ridder, Dirk; Vanneste, Sven; Freeman, Walter
2014-07-01
Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises why the brain creates these false percepts in the absence of an external stimulus. The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they are not worthy of conscious processing. Unpredictable things, on the other hand, are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience and stress and essential for stimulus detection. Associated with sensory cortex hyperactivity and decreased inhibition or map plasticity, this will result in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation.
In conclusion, the Bayesian updating of knowledge via active sensory exploration of the environment, driven by the Shannonian free-energy principle, provides an explanation for the generation of phantom percepts, as a way to reduce uncertainty, to make sense of the world. PMID:22516669
A Higher Order GUP with Minimal Length Uncertainty and Maximal Momentum
Pouria Pedram
2012-10-19
We present a higher order generalized (gravitational) uncertainty principle (GUP) in the form $[X,P]=i\hbar/(1-\beta P^2)$. This form of GUP is consistent with various proposals of quantum gravity such as string theory, loop quantum gravity, doubly special relativity, and predicts both a minimal length uncertainty and a maximal observable momentum. We show that the presence of the maximal momentum results in an upper bound on the energy spectrum of the momentum eigenstates and the harmonic oscillator.
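To lowest order in beta, this commutator yields the familiar minimal-length bound: taking ⟨P⟩ = 0 so that ⟨P²⟩ = ΔP², one gets ΔX ≥ (ħ/2)(1/ΔP + βΔP), which is minimized at ΔP = 1/√β with ΔX_min = ħ√β. A minimal sympy sketch of this lowest-order minimization (the series truncation and the ⟨P⟩ = 0 assumption are choices of the sketch, not claims from the abstract):

```python
import sympy as sp

hbar, beta, dp = sp.symbols("hbar beta Delta_p", positive=True)

# Lowest-order GUP bound on Delta_x, assuming <P> = 0 so <P^2> = Delta_p^2:
#   Delta_x >= (hbar/2) * (1/Delta_p + beta*Delta_p)
bound = sp.Rational(1, 2) * hbar * (1 / dp + beta * dp)

# Minimize over Delta_p to locate the minimal length uncertainty.
critical = sp.solve(sp.diff(bound, dp), dp)[0]   # Delta_p = 1/sqrt(beta)
dx_min = sp.simplify(bound.subs(dp, critical))   # hbar*sqrt(beta)

print(critical, dx_min)
```

The minimum at ΔP = 1/√β is consistent with the abstract's maximal-momentum scale: beyond it, larger ΔP no longer buys smaller ΔX.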
PIV uncertainty quantification by image matching
NASA Astrophysics Data System (ADS)
Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio
2013-04-01
A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. 
The spatial distribution compares very well with that of the actual error with maxima in the highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.
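The core statistical step (mean of the disparity ensemble as the systematic error estimate, dispersion as the random error estimate) can be sketched as follows; the function name and the √N reduction for the window-averaged random error are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def disparity_statistics(disparity):
    """Split an ensemble of particle disparity vectors (N x 2, in pixels)
    into a systematic part (the mean disparity) and a random part
    (the dispersion, reduced by sqrt(N) for the window-averaged vector)."""
    disparity = np.asarray(disparity, dtype=float)
    n = len(disparity)
    bias = disparity.mean(axis=0)                            # systematic error estimate
    random_err = disparity.std(axis=0, ddof=1) / np.sqrt(n)  # random error estimate
    return bias, random_err

# Synthetic interrogation window: 200 disparity vectors with a known
# 0.1 px systematic offset in x and 0.5 px scatter in both components.
rng = np.random.default_rng(0)
vectors = rng.normal(loc=[0.1, 0.0], scale=0.5, size=(200, 2))
bias, random_err = disparity_statistics(vectors)
```

On this synthetic window the recovered bias is close to the imposed 0.1 px offset, mirroring how the mean disparity flags a systematic error while the dispersion quantifies the random one.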
Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation
Smith, D.L.; Otuka, N.
2012-12-15
This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.
Uncertainties of Mayak urine data
Miller, Guthrie [Los Alamos National Laboratory; Vostrotin, Vadim [SUBI; Vvdensky, Vladimir [SUBI
2008-01-01
For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
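If the reported ln(GSD) components are taken to add in quadrature (an assumption of this sketch, consistent with independent lognormal factors), the non-measurement part of the normalization uncertainty can be backed out from the quoted figures:

```python
import math

# ln(GSD) of the total normalization uncertainty (reported range: 0.31-0.35)
# and of the counting-measurement component (about 0.2).  Assuming the
# components are independent lognormal factors, their ln(GSD) values add
# in quadrature, so the residual (non-measurement) part is:
def residual_ln_gsd(total, measurement):
    return math.sqrt(total**2 - measurement**2)

low = residual_ln_gsd(0.31, 0.2)   # about 0.24
high = residual_ln_gsd(0.35, 0.2)  # about 0.29
print(low, high)
```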
Belda, Petra M; Mielck, Jobst B
2003-03-01
The uncertainty of some characteristic parameters describing the course of the tabletting process, namely the area quotient according to Emschermann and Müller, the apparent net work, the slope of the Heckel plot, and the parameters of the modified Weibull function, was calculated according to the German standard DIN 1319-4 (1999). The method makes it possible to consider random and systematic uncertainties in a consistent way, as variances of normal and rectangular probability distributions, respectively, or of other suitable probability distributions based on Bayesian statistics and the principle of maximum entropy. Thus, random and systematic uncertainties known from a calibration and validation study of the measurement of force and displacement, and the uncertainty of the true density, were incorporated meaningfully into the uncertainty of the resulting tabletting parameters using the propagation of uncertainties according to the Gauss method. The standard uncertainty for the results calculated this way seems suitable for a critical evaluation of the tabletting data. PMID:12637100
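A minimal sketch of the propagation scheme described here: random uncertainties enter as standard deviations, a systematic bound ±a enters as a rectangular distribution with standard uncertainty a/√3, and both propagate through the Gauss (first-order Taylor) formula. The model function and the numbers are illustrative, not taken from the study:

```python
import sympy as sp

# Illustrative model: a tabletting parameter y = F * s (work = force x displacement).
F, s = sp.symbols("F s", positive=True)
y = F * s

# Standard uncertainties: u_F is a random component (calibration scatter);
# the displacement carries a systematic bound +/- a_s, modeled as a
# rectangular distribution with standard uncertainty a_s / sqrt(3).
u_F = 0.5                 # N
a_s = 0.003               # mm, half-width of the rectangular distribution
u_s = a_s / sp.sqrt(3)

# Gauss propagation: u_y^2 = (dy/dF)^2 u_F^2 + (dy/ds)^2 u_s^2
values = {F: 100.0, s: 2.0}
u_y = sp.sqrt((sp.diff(y, F) * u_F)**2 + (sp.diff(y, s) * u_s)**2).subs(values)
print(float(u_y))
```

Treating both contribution types as variances of suitable distributions is exactly what lets random and systematic components be combined in one standard uncertainty.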
Measurement control: Principles and practice as applied to nondestructive assay
Sampson, T.E.
1991-01-01
This paper discusses the principles and practice of measurement control for nondestructive assay (NDA) instruments. NDA instruments are not always blessed with the highly controlled samples that are assumed in the analytical laboratory. This adversely affects the use and applicability of historical error information from instrument stability checks to estimate measurement uncertainties for the broad range of sample characteristics presented to most NDA instruments. This paper emphasizes the methods used to perform instrument stability checks and discusses the resulting uncertainty information that can be derived from these measurements. 4 refs., 2 figs., 2 tabs.
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 2010-01-01 false Uncertainty analyses. 436.24 Section...Cost Analyses § 436.24 Uncertainty analyses. If particular...agencies may examine the impact of uncertainty on the calculation of life cycle...
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
Uncertainty in emissions projections for climate models
Webster, Mort David.; Babiker, Mustafa H.M.; Mayer, Monika.; Reilly, John M.; Harnisch, Jochen.; Hyman, Robert C.; Sarofim, Marcus C.; Wang, Chien.
Future global climate projections are subject to large uncertainties. Major sources of this uncertainty are projections of anthropogenic emissions. We evaluate the uncertainty in future anthropogenic emissions using a ...
Principles of multisensory behavior.
Otto, Thomas U; Dassy, Brice; Mamassian, Pascal
2013-04-24
The combined use of multisensory signals is often beneficial. Based on neuronal recordings in the superior colliculus of cats, three basic rules were formulated to describe the effectiveness of multisensory signals: the enhancement of neuronal responses to multisensory compared with unisensory signals is largest when signals occur at the same location ("spatial rule"), when signals are presented at the same time ("temporal rule"), and when signals are rather weak ("principle of inverse effectiveness"). These rules are also considered with respect to multisensory benefits as observed with behavioral measures, but do they capture these benefits best? To uncover the principles that rule benefits in multisensory behavior, we here investigated the classical redundant signal effect (RSE; i.e., the speedup of response times in multisensory compared with unisensory conditions) in humans. Based on theoretical considerations using probability summation, we derived two alternative principles to explain the effect. First, the "principle of congruent effectiveness" states that the benefit in multisensory behavior (here the speedup of response times) is largest when behavioral performance in corresponding unisensory conditions is similar. Second, the "variability rule" states that the benefit is largest when performance in corresponding unisensory conditions is unreliable. We then tested these predictions in two experiments, in which we manipulated the relative onset and the physical strength of distinct audiovisual signals. Our results, which are based on a systematic analysis of response time distributions, show that the RSE follows these principles very well, thereby providing compelling evidence in favor of probability summation as the underlying combination rule. PMID:23616552
The traveltime holographic principle
NASA Astrophysics Data System (ADS)
Huang, Yunsong; Schuster, Gerard T.
2015-01-01
Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the `traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
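One way to read the construction: by the triangle inequality τsq ≤ τsp + τpq, with equality when s, p, and q lie on a common ray, so τpq = max over boundary sources s of (τsq − τsp). The following numpy check of this reading in a homogeneous medium is a sketch under that assumption, not the authors' algorithm:

```python
import numpy as np

# Homogeneous medium (velocity v): traveltimes are straight-ray distances / v.
v = 2.0
p = np.array([0.2, 0.1])
q = np.array([-0.3, 0.4])

# Dense set of boundary sources s on a circle enclosing both interior points.
theta = np.linspace(0.0, 2 * np.pi, 4000, endpoint=False)
sources = 5.0 * np.stack([np.cos(theta), np.sin(theta)], axis=1)

tau_sp = np.linalg.norm(sources - p, axis=1) / v   # exterior times s -> p
tau_sq = np.linalg.norm(sources - q, axis=1) / v   # exterior times s -> q

# Interior time recovered from exterior times alone, no interior ray tracing:
tau_pq = np.max(tau_sq - tau_sp)
exact = np.linalg.norm(p - q) / v
print(tau_pq, exact)
```

With sources densely sampling the boundary, the maximum is attained near the source collinear with p and q, and tau_pq matches the direct traveltime to high accuracy.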
Structural model uncertainty in stochastic simulation
McKay, M.D.; Morrison, J.D. [Los Alamos National Lab., NM (United States). Technology and Safety Assessment Div.
1997-09-01
Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.
Photometric Uncertainties within Hinode XRT
NASA Astrophysics Data System (ADS)
Kobelski, Adam; Saar, S. H.; Weber, M. A.; McKenzie, D. E.; Reeves, K. K.
2012-05-01
We have developed estimates of the systematic uncertainties for the X-Ray Telescope (XRT) on Hinode. These estimates are included as optional returns from the standard XRT data reduction software, xrt_prep.pro. Included in these software estimates are uncertainties from instrument vignetting, dark current subtraction, split bias leveling, Fourier filtering and JPEG compression. Sources of uncertainty that rely heavily on models of plasma radiation or assumptions of elemental abundances, such as photon noise, are discussed, but not included in the software. It will be shown that the photon noise is much larger than the systematic uncertainty. This work is supported by NASA under contract NNM07AB07C with the Harvard-Smithsonian Astrophysical Observatory.
Computational Methods in Uncertainty Quantification
Computational Methods in Uncertainty Quantification. Robert Scheichl. Methods in UQ, HGS Course, June 2015. Lecture 4: Bayesian Inverse Problems (Conditioning on Data; Inverse Problems; Least Squares Minimisation and Regularisation; Bayes' Rule and Bayesian Interpretation).
Scientific Uncertainty: An Industry Perspective.
ERIC Educational Resources Information Center
Perhac, Ralph
1986-01-01
Discusses the uncertainties inherent in assessing the nature and extent of any damage that might be attributed to acidic deposition. Probes associated dilemmas related to decisions involving control strategies, and indicates societal and legislative roles for solving this problem. (ML)
Estimations of uncertainties of frequencies
NASA Astrophysics Data System (ADS)
Eyer, Laurent; Nicoletti, Jean-Marc; Morgenthaler, Stephan
2015-08-01
Diverse variable phenomena in the Universe are periodic. Astonishingly, many of the periodic signals present in stars have timescales coinciding with human ones (from minutes to years). The periods of signals often have to be deduced from time series which are irregularly sampled and sparse; furthermore, correlations between the brightness measurements and their estimated uncertainties are common. The uncertainty on the frequency estimation is reviewed. We explore the astronomical and statistical literature, in both cases of regular and irregular sampling. The frequency uncertainty depends on the signal-to-noise ratio, the frequency, and the observational timespan. The shape of the light curve should also intervene, since sharp features such as exoplanet transits, stellar eclipses, and the rising branches of pulsating stars give stringent constraints. We propose several procedures (parametric and nonparametric) to estimate the uncertainty on the frequency, which are subsequently tested against simulated data to assess their performance.
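One standard parametric estimate of this dependence, for a sinusoid of amplitude $A$ in white noise of standard deviation $\sigma_m$, sampled at $N$ roughly even epochs over a timespan $T$, is the least-squares result (quoted here as context; the abstract does not commit to this particular formula):

```latex
% Least-squares frequency uncertainty for a sinusoidal signal:
% grows with the noise-to-amplitude ratio, shrinks with the number
% of points N and, most strongly, with the timespan T.
\[
  \sigma(f) \;\approx\; \sqrt{\frac{6}{N}}\,\frac{1}{\pi T}\,\frac{\sigma_m}{A}
\]
```

The $1/T$ factor is why sparse but long-baseline surveys can still pin down frequencies tightly, provided aliasing is controlled.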
Non-scalar uncertainty: Uncertainty in dynamic systems
NASA Technical Reports Server (NTRS)
Martinez, Salvador Gutierrez
1992-01-01
The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, whether it is the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology. Applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful for describing the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions.
It was found to be wiser to obtain an approximate solution to an accurate model than a precise solution to a model constrained by simplifying assumptions. Precision has a very heavy cost in present physical models, but this formalism allows a trade between uncertainty and simplicity. It was found that modeling reality sometimes requires that state transition probabilities be manipulated as non-scalar quantities, finding in the end that there is always a transformation to get back to scalar probability.
Challenging Proteins Principles and Methods
Jacobsen, Steve
Purifying Challenging Proteins: Principles and Methods. GE Healthcare handbook 28-9095-31. One of a series of GE Healthcare handbooks, which also includes Chromatofocusing: Principles and Methods (11-0004-21).
Fine-grained lower limit of entropic uncertainty in the presence of quantum memory.
Pramanik, T; Chowdhury, P; Majumdar, A S
2013-01-11
The limitation on obtaining precise outcomes of measurements performed on two noncommuting observables of a particle as set by the uncertainty principle in its entropic form can be reduced in the presence of quantum memory. We derive a new entropic uncertainty relation based on fine graining, which leads to an ultimate limit on the precision achievable in measurements performed on two incompatible observables in the presence of quantum memory. We show that our derived uncertainty relation tightens the lower bound set by entropic uncertainty for members of the class of two-qubit states with maximally mixed marginals, while accounting for the recent experimental results using maximally entangled pure states and mixed Bell-diagonal states. An implication of our uncertainty relation on the security of quantum key generation protocols is pointed out. PMID:23383877
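For context, the memory-assisted entropic uncertainty relation of Berta et al. that this work tightens can be stated as follows ($Q$ and $R$ are the two observables measured on system $A$, $B$ is the quantum memory, and $c$ is the maximal overlap of their eigenbases):

```latex
% Entropic uncertainty in the presence of quantum memory (Berta et al.):
% conditioning on an entangled memory B can reduce the joint uncertainty,
% since the conditional entropy S(A|B) can be negative for entangled states.
\[
  S(Q\,|\,B) + S(R\,|\,B) \;\ge\; \log_2 \frac{1}{c} + S(A\,|\,B),
  \qquad
  c = \max_{i,j} \bigl|\langle \psi_i \,|\, \phi_j \rangle\bigr|^2 ,
\]
% where {|psi_i>} and {|phi_j>} are the eigenbases of Q and R on system A.
```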
A Typology for Visualizing Uncertainty
Thomson, Judi R.; Hetzler, Elizabeth G.; MacEachren, Alan; Gahegan, Mark N.; Pavel, Misha
2005-01-05
Information analysts must rapidly assess information to determine its usefulness in supporting and informing decision makers. In addition to assessing the content, the analyst must also be confident about the quality and veracity of the information. Visualizations can concisely represent vast quantities of information thus aiding the analyst to examine larger quantities of material; however visualization programs are challenged to incorporate a notion of confidence or certainty because the factors that influence the certainty or uncertainty of information vary with the type of information and the type of decisions being made. For example, the assessment of potentially subjective human-reported data leads to a large set of uncertainty concerns in fields such as national security, law enforcement (witness reports), and even scientific analysis where data is collected from a variety of individual observers. What's needed is a formal model or framework for describing uncertainty as it relates to information analysis, to provide a consistent basis for constructing visualizations of uncertainty. This paper proposes an expanded typology for uncertainty, drawing from past frameworks targeted at scientific computing. The typology provides general categories for analytic uncertainty, a framework for creating task-specific refinements to those categories, and examples drawn from the national security field.
Wildfire Decision Making Under Uncertainty
NASA Astrophysics Data System (ADS)
Thompson, M.
2013-12-01
Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.
Differential Landauer's principle
NASA Astrophysics Data System (ADS)
Granger, Léo; Kantz, Holger
2013-03-01
Landauer's principle states that the erasure of information must be a dissipative process. In this paper, we carefully analyze the recording and erasure of information on a physical memory. On the one hand, we show that, in order to record some information, the memory has to be driven out of equilibrium. On the other hand, we derive a differential version of Landauer's principle: We link the rate at which entropy is produced at every time of the erasure process to the rate at which information is erased.
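A schematic way to write such a differential statement, in standard notation (a hedged paraphrase; the precise form in the paper may differ):

```latex
% Differential Landauer bound (schematic): the entropy production rate
% at each instant of the erasure is bounded below by the rate at which
% information I(t) (in bits) is erased; I'(t) <= 0 during erasure.
\[
  \dot{S}_{\mathrm{prod}}(t) \;\ge\; -\,k_B \ln 2 \;\dot{I}(t)
\]
% Integrated over a full one-bit erasure, this recovers the usual
% Landauer cost of at least k_B T ln 2 of dissipated work.
```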
NASA Technical Reports Server (NTRS)
Hankins, D. B.; Wake, W. H.
1981-01-01
The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.
Principles of Optics
NASA Astrophysics Data System (ADS)
Born, Max; Wolf, Emil
1999-10-01
Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.
A weak equivalence principle test on a suborbital rocket
Robert D. Reasenberg; James D. Phillips
2010-01-26
We describe a Galilean test of the weak equivalence principle, to be conducted during the free fall portion of a sounding rocket flight. The test of a single pair of substances is aimed at a measurement uncertainty of sigma(eta) < 10^-16 after averaging the results of eight separate drops. The weak equivalence principle measurement is made with a set of four laser gauges that are expected to achieve 0.1 pm Hz^-1/2. The discovery of a violation (eta not equal to 0) would have profound implications for physics, astrophysics, and cosmology.
The precautionary principle in environmental science.
Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M
2001-01-01
Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy. PMID:11673114
The scope of modelling the behavior of pollutants in the aquatic environment is now immense. In many practical applications, there are effectively no computational constraints on what is possible. There is accordingly an increasing need for a set of principles of modelling that in ...
Pattern recognition principles
NASA Technical Reports Server (NTRS)
Tou, J. T.; Gonzalez, R. C.
1974-01-01
The present work gives an account of basic principles and available techniques for the analysis and design of pattern processing and recognition systems. Areas covered include decision functions, pattern classification by distance functions, pattern classification by likelihood functions, the perceptron and the potential function approaches to trainable pattern classifiers, statistical approach to trainable classifiers, pattern preprocessing and feature selection, and syntactic pattern recognition.
Principles of Cancer Screening.
Pinsky, Paul F
2015-10-01
Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike. PMID:26315516
PRINCIPLES OF WATER FILTRATION
This paper reviews principles involved in the processes commonly used to filter drinking water for public water systems. The most common approach is to chemically pretreat water and filter it through a deep (2-1/2 to 3 ft) bed of granular media (coal or sand or combinations of th...
Functional Principles of Learning.
ERIC Educational Resources Information Center
Humphreys, Lloyd G.
In order of importance, curriculum, motivation, academic ability, and teaching methods are described in this paper as principles affecting classroom learning that can lead to more effective instruction. Curriculum simply exposes students to appropriate content and subject matter. Educational research should concentrate on the evaluation of…
Giulio Chiribella; Giacomo Mauro D'Ariano; Paolo Perinotti
2015-09-11
Quantum theory was discovered in an adventurous way, under the urge to solve puzzles, like the spectrum of blackbody radiation, that haunted the physics community at the beginning of the 20th century. It soon became clear, though, that quantum theory was not just a theory of specific physical systems, but rather a new language of universal applicability. Can this language be reconstructed from first principles? Can we arrive at it from logical reasoning, instead of ad hoc guesswork? A positive answer was provided in Refs. [1, 2], where we put forward six principles that identify quantum theory uniquely in a broad class of theories. We first defined a class of "theories of information", constructed as extensions of probability theory in which events can be connected into networks. In this framework, we formulated the six principles as rules governing the control and the accessibility of information. Directly from these rules, we reconstructed a number of quantum information features, and eventually, the whole Hilbert space framework. In short, our principles characterize quantum theory as the theory of information that allows for maximal control of randomness.
Principles of Teaching. Module.
ERIC Educational Resources Information Center
Rhoades, Joseph W.
This module on principles of teaching is 1 in a series of 10 modules written for vocational education teacher education programs. It is designed to enable the teacher to do the following: (1) identify subject matter and integrate that subject matter with thought-provoking questions; (2) organize and demonstrate good questioning techniques; and (3)…
ERIC Educational Resources Information Center
Kamat, R. V.
1991-01-01
A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)
First Principles of Instruction.
ERIC Educational Resources Information Center
Merrill, M. David
2002-01-01
Examines instructional design theories and elaborates principles about when learning is promoted, i.e., when learners are engaged in solving real-world problems, when existing knowledge is activated as a foundation for new knowledge, and when new knowledge is demonstrated to the learner, applied by the learner, and integrated into the learner's…
Putting Principles into Practice
ERIC Educational Resources Information Center
Jamieson, Joan; Chapelle, Carol A.; Preiss, Sherry
2004-01-01
CALL evaluation might ideally draw on principles from fields such as second language acquisition, language pedagogy, instructional design, and testing and measurement in order to make judgments about criteria such as elaborated input, feedback, collaborative learning, authentic tasks, navigation, screen design, reliability, validity, impact, and…
Identifying Product Scaling Principles
Perez, Angel 1986-
2011-06-02
List of figures (excerpt): Figure 19: Manual push cart Black Box model; Figure 20: Electric push cart Black Box model; Figure 21: A. Manual and B. Electric push cart activity diagrams with highlighted boxes that represent possible activities that can be simplified or combined through the application of the "simplify system" and "combine functions" principles.
Principles of Wildlife Conservation
NSDL National Science Digital Library
Dan Edge
Principles of Wildlife Conservation is a course that was developed to fulfill requirements in the curriculum of Forest Resource Technology students at Chemeketa Community College in Salem, Oregon. It is an introductory course that presents a diversity of issues relating to wildlife conservation and management and is open to the general student population.
Basic Comfort Heating Principles.
ERIC Educational Resources Information Center
Dempster, Chalmer T.
The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…
The Proposal Design Principles
Flynn, E. Victor
The proposal's design principles treat Exeter College's Walton Street Quad as an urban design project, seeking satisfactory connections with Worcester Place, Walton Street and the broader community; an existing ground floor plan shows the extent of retained ...
Weak Equivalence Principle Test on a Sounding Rocket
NASA Astrophysics Data System (ADS)
Phillips, J. D.; Patla, B. R.; Popescu, E. M.; Rocco, E.; Thapa, R.; Reasenberg, R. D.; Lorenzini, E. C.
2011-12-01
SR-POEM, our principle-of-equivalence measurement on a sounding rocket, will compare the free fall rates of two substances, yielding an uncertainty of 10^-16 in the estimate of η. During the past two years, the design concept has matured and we have been working on the required technology, including a laser gauge that is self-aligning and able to reach 0.1 pm/√Hz for periods up to 40 s. We describe the status and plans for this project.
Uncertainty quantification in reacting flow.
Marzouk, Youssef M. (MIT, Cambridge, MA); Debusschere, Bert J.; Najm, Habib N.; Berry, Robert Bruce
2010-06-01
Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains nominally an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.
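The forward propagation step described in this abstract can be sketched with a one-dimensional non-intrusive polynomial chaos expansion. Everything below is an illustrative assumption, not taken from the talk: the toy model f(k) = exp(-k), the Gaussian input, and the truncation order.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Toy model output with one uncertain input k ~ N(mu, sigma^2)
mu, sigma = 1.0, 0.2
f = lambda k: np.exp(-k)

order = 6
nodes, weights = hermegauss(20)          # weight exp(-x^2/2) (probabilists' Hermite)
weights = weights / np.sqrt(2 * np.pi)   # normalize to the standard normal pdf

# Non-intrusive spectral projection: c_n = E[f(mu + sigma*xi) He_n(xi)] / n!
fvals = f(mu + sigma * nodes)
coeffs = np.array([
    np.sum(weights * fvals * hermeval(nodes, [0] * n + [1])) / math.factorial(n)
    for n in range(order + 1)
])

pce_mean = coeffs[0]                     # E[f] from the expansion
pce_var = sum(coeffs[n] ** 2 * math.factorial(n) for n in range(1, order + 1))

# Analytical (lognormal) moments of exp(-k) for comparison
exact_mean = np.exp(-mu + sigma ** 2 / 2)
exact_var = (np.exp(sigma ** 2) - 1) * np.exp(-2 * mu + sigma ** 2)
```

For this smooth toy model the order-6 expansion already reproduces the exact mean and variance to better than 10^-6, which is the kind of efficiency that motivates PC methods over plain Monte Carlo sampling.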
Uncertainty Quantification in Solidification Modelling
NASA Astrophysics Data System (ADS)
Fezi, K.; Krane, M. J. M.
2015-06-01
Numerical models have been used to simulate solidification processes, to gain insight into physical phenomena that cannot be observed experimentally. Often, validation of such models has been done through comparison to a few or single experiments, in which agreement depends on both model and experimental uncertainty. As a first step to quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady-state 1D solidification model of continuous casting of weld filler rod. This model, which includes conduction, advection, and release of latent heat, was developed for use in uncertainty quantification in the calculation of the positions of the liquidus and solidus and the solidification time. Using this model, a Smolyak sparse grid algorithm constructed a response surface that fit model outputs based on the range of uncertainty in the inputs to the model. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and the sensitivities to the inputs. This process was done for a linear fraction solid-temperature relationship, for which there is an analytical solution, and for a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
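The surrogate-based workflow in this abstract (evaluate the expensive model on a small grid, fit a response surface, then sample it cheaply) can be sketched in one dimension; the toy model and input range below are assumptions standing in for the solidification code, and a dense polynomial fit stands in for the Smolyak sparse grid.

```python
import numpy as np

# Toy stand-in for the solidification model: output (e.g. solidification time)
# as a function of one uncertain input h (e.g. a heat-transfer coefficient).
model = lambda h: 1.0 / (0.5 + h) + 0.1 * h ** 2

# 1. Evaluate the "expensive" model on a small grid spanning the input range
grid = np.linspace(0.8, 1.2, 9)
coeff = np.polyfit(grid, model(grid), deg=4)     # polynomial response surface

# 2. Push many samples of the input PDF through the cheap surface
rng = np.random.default_rng(0)
h = rng.uniform(0.8, 1.2, 100_000)
out = np.polyval(coeff, h)                       # approximate output distribution

# Surrogate quality at the training points
fit_err = np.max(np.abs(np.polyval(coeff, grid) - model(grid)))
```

The histogram of `out` approximates the output PDF without any further calls to the model, which is the point of building the response surface first.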
Communicating Uncertainties on Climate Change
NASA Astrophysics Data System (ADS)
Planton, S.
2009-09-01
The term uncertainty is confusing in common language, since one of its most usual senses refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion through the media to a wide public. From its scientific basis to impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Given this diversity of approaches, communicating uncertainties on climate change is a great challenge. It is further complicated by the diversity of the targets of communication on climate change, from stakeholders and policy makers to the wide public. We will present the process chosen by the IPCC to communicate uncertainties in its assessment reports, taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples illustrating how to avoid the above-mentioned ambiguity in this kind of communication.
The Principle of Maximum Conformality
Brodsky, Stanley J; /SLAC; Giustino, Di; /SLAC
2011-04-05
A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q of the order of a typical momentum transfer Q in the process, and then vary the scale over a range from Q/2 to 2Q. This procedure is clearly problematic, since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and is thus a completely separate issue from renormalization scale setting. The principle of maximum conformality (PMC) provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. The elimination of the renormalization-scheme ambiguity using the PMC will not only increase the precision of QCD tests, but will also increase the sensitivity of colliders to new physics beyond the Standard Model.
The Top Mass: Interpretation and Theoretical Uncertainties
André H. Hoang
2014-12-11
Currently the most precise LHC measurements of the top quark mass are determinations of the top quark mass parameter of Monte-Carlo (MC) event generators reaching uncertainties of well below $1$ GeV. However, there is an additional theoretical problem when using the MC top mass $m_t^{\\rm MC}$ as an input for theoretical predictions, because a rigorous relation of $m_t^{\\rm MC}$ to a renormalized field theory mass is, at the very strict level, absent. In this talk I show how - nevertheless - some concrete statements on $m_t^{\\rm MC}$ can be deduced assuming that the MC generator behaves like a rigorous first principles QCD calculator for the observables that are used for the analyses. I give simple conceptual arguments showing that in this context $m_t^{\\rm MC}$ can be interpreted like the mass of a heavy-light top meson, and that there is a conversion relation to field theory top quark masses that requires a non-perturbative input. The situation is in analogy to B physics where a similar relation exists between experimental B meson masses and field theory bottom masses. The relation gives a prescription how to use $m_t^{\\rm MC}$ as an input for theoretical predictions in perturbative QCD. The outcome is that at this time an additional uncertainty of about $1$ GeV has to be accounted for. I discuss limitations of the arguments I give and possible ways to test them, or even to improve the current situation.
Quantifying Uncertainty in Forest ...
Battles, John
... uncertainty, but rarely propagate the uncertainty in the allometric equations used to estimate tree biomass, much less the uncertainty in the selection of which allometric equations to use. Change over time may be an artifact of inaccurate allometric equations or soil sampling techniques. Quantifying uncertainty is not as difficult ...
Linear Programming Problems for Generalized Uncertainty
ERIC Educational Resources Information Center
Thipwiwatpotjana, Phantipa
2010-01-01
Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…
Standardized Assessment of Reasoning in Contexts of UncertaintyThe Script Concordance Approach
Bernard Charlin; Cees van der Vleuten
2004-01-01
Current written tools of assessment mostly measure the capacity to solve well-defined problems by the application of rules and principles, while the essence of expertise in the professions lies in the capacity to solve ill-defined problems, that is, reasoning in contexts of uncertainty. The purpose of this study is to describe an approach that allows assessing ill-defined problems and
Daume III, Hal
Uncertainty in visual processes predicts geometrical optical illusions
Fermüller, Cornelia
2003
It is proposed in this paper that many geometrical optical illusions, as well as illusory ... principle which governs the workings of vision systems, and optical illusions are an artifact ...
Robust optimization of nonlinear impulsive rendezvous with uncertainty
NASA Astrophysics Data System (ADS)
Luo, YaZhong; Yang, Zhen; Li, HengNian
2014-04-01
The optimal rendezvous trajectory designs in many current research efforts do not incorporate the practical uncertainties into the closed loop of the design. A robust optimization design method for a nonlinear rendezvous trajectory with uncertainty is proposed in this paper. One performance index related to the variances of the terminal state error is termed the robustness performance index, and a two-objective optimization model (including the minimum characteristic velocity and the minimum robustness performance index) is formulated on the basis of the Lambert algorithm. A multi-objective, non-dominated sorting genetic algorithm is employed to obtain the Pareto optimal solution set. It is shown that the proposed approach can be used to quickly obtain several inherent principles of the rendezvous trajectory by taking practical errors into account. Furthermore, this approach can identify the most preferable design space in which a specific solution for the actual application of the rendezvous control should be chosen.
Measuring uncertainty by extracting fuzzy rules using rough sets
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.
1991-01-01
Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules a corresponding measure of how much these rules are believed is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
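The rough-set split into certain and possible rules can be illustrated with lower and upper approximations of a target set; the decision table below is invented for the sketch and is not from the report.

```python
from collections import defaultdict

# Invented decision table: (object id, symptom level, diagnosis positive?)
table = [(1, 'high', True), (2, 'high', True), (3, 'low', False),
         (4, 'mid', True), (5, 'mid', False)]
target = {obj for obj, _, positive in table if positive}   # objects with the diagnosis

# Indiscernibility classes: objects sharing the same attribute value
classes = defaultdict(set)
for obj, level, _ in table:
    classes[level].add(obj)

# Lower approximation -> certain rules; upper approximation -> possible rules
lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))
boundary = upper - lower    # region where only "possible" rules apply

certain_rules = {level for level, c in classes.items() if c <= target}
possible_rules = {level for level, c in classes.items() if c & target}
```

Here 'high' yields a certain rule (every 'high' object has the diagnosis), while 'mid' yields only a possible rule, which is exactly the certain/possible distinction the abstract describes.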
Uncertainty and Sensitivity Analyses Plan
Simpson, J.C.; Ramsdell, J.V. Jr.
1993-04-01
Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
Uncertainty in 3D gel dosimetry
NASA Astrophysics Data System (ADS)
De Deene, Yves; Jirasek, Andrew
2015-01-01
Three-dimensional (3D) gel dosimetry has a unique role to play in safeguarding conformal radiotherapy treatments as the technique can cover the full treatment chain and provides the radiation oncologist with the integrated dose distribution in 3D. It can also be applied to benchmark new treatment strategies such as image guided and tracking radiotherapy techniques. A major obstacle that has hindered the wider dissemination of gel dosimetry in radiotherapy centres is a lack of confidence in the reliability of the measured dose distribution. Uncertainties in 3D dosimeters are attributed to both dosimeter properties and scanning performance. In polymer gel dosimetry with MRI readout, discrepancies in dose response of large polymer gel dosimeters versus small calibration phantoms have been reported which can lead to significant inaccuracies in the dose maps. The sources of error in polymer gel dosimetry with MRI readout are well understood and it has been demonstrated that with a carefully designed scanning protocol, the overall uncertainty in absolute dose that can currently be obtained falls within 5% on an individual voxel basis, for a minimum voxel size of 5 mm3. However, several research groups have chosen to use polymer gel dosimetry in a relative manner by normalizing the dose distribution towards an internal reference dose within the gel dosimeter phantom. 3D dosimetry with optical scanning has also been mostly applied in a relative way, although in principle absolute calibration is possible. As the optical absorption in 3D dosimeters is less dependent on temperature it can be expected that the achievable accuracy is higher with optical CT. The precision in optical scanning of 3D dosimeters depends to a large extent on the performance of the detector. 3D dosimetry with X-ray CT readout is a low-contrast imaging modality for polymer gel dosimetry.
Sources of error in x-ray CT polymer gel dosimetry (XCT) are currently under investigation and include inherent limitations in dosimeter homogeneity, imaging performance, and errors induced through post-acquisition processing. This overview highlights a number of aspects relating to uncertainties in polymer gel dosimetry.
Common Principles and Multiculturalism
Zahedi, Farzaneh; Larijani, Bagher
2009-01-01
Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed the suitable tools to determine which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We will discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720
Applied optical principles: keratometry.
Sampson, W G
1979-03-01
The optical correction of the aphakic eye requires the addition of a significant amount of vergence power to the corneal interface by one of three modalities: spectacle, contact lens, or intraocular lens. No matter which modality is chosen, based upon Gullstrand's parameters, the corneal interface provides approximately 74% of the effective vergence power of the eye; therefore, it is essential to understand the principles underlying the clinical assessment of the refracting power of the cornea. This presentation briefly reviews selected principles underlying the determination of corneal curvature and toricity by keratometry, as well as their relationship to the correcting spectacle lenses, and emphasizes the necessity of surveying the topography of the cornea for the greatest accuracy in these measurements. PMID:530586
Basic Principles of Ultrasound
NSDL National Science Digital Library
2004-01-01
Created by a team of medical professionals and health-care specialists, the main Echo Web site contains a wide range of resources dealing primarily with diagnostic ultrasounds, sonography, and the field of echocardiography. One of the most helpful of these resources is the Basic Principles of Ultrasound online course, which is available here at no cost. The course itself is divided into six different sections, along with a bibliography and FAQ area. Visitors can use the online course to learn about the basic principles of ultrasound, the basic science behind related devices and instruments, and the ways to use these devices safely. Instructors might also do well to use this website in conjunction with lectures on the subject, or as a way to give students an additional resource to consult at their leisure.
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
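The notion of transmitted variation can be illustrated with first-order variance propagation through a response function; the toy function and numbers below are assumptions for the sketch, not the author's method.

```python
import numpy as np

# Input variation is "transmitted" through the response function y = f(x).
# First-order (delta-method) approximation: sd(y) ≈ |f'(mu)| * sd(x).
rng = np.random.default_rng(0)
f = lambda x: x ** 2                 # assumed toy response function
mu, sd = 3.0, 0.1

x = rng.normal(mu, sd, 200_000)      # variation present in the input variable
y = f(x)                             # transmitted to the output variable

predicted_sd = abs(2 * mu) * sd      # f'(x) = 2x evaluated at mu
empirical_sd = y.std()
```

The empirical output spread closely matches the first-order prediction, showing how input variation scales by the local slope of the response function.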
Uncertainties in parton related quantities.
Thorne, Robert S
arXiv:hep-ph/0205235v1, 21 May 2002. Uncertainties in Parton Related Quantities. R.S. Thorne, Cavendish Laboratory, University of Cambridge, UK. Abstract: I discuss the issue of uncertainties in parton distributions and in the physical... for inviting me to present this talk. References: [1] M. Glück, E. Reya and A. Vogt, Eur. Phys. J. C5 (1998) 461. [2] A.D. Martin, R.G. Roberts, W.J. Stirling and R.S. Thorne, Eur. Phys. J. C23 (2002) 73. [3] CTEQ Collaboration: J. Pumplin et al., hep-ph/0201195...
Principles of gravitational biology
NASA Technical Reports Server (NTRS)
Smith, A. H.
1975-01-01
Physical principles of gravitation are enumerated, including gravitational and inertial forces, weight and mass, weightlessness, size and scale effects, scale limits of gravitational effects, and gravity as a biogenic factor. Statocysts, otolithic organs of vertebrates, gravity reception in plants, and clinostat studies of gravitational orientation are reviewed. Chronic acceleration is also studied, as well as the physiology of hyper- and hypodynamic fields. Responses of animals to a decreased acceleration field are examined, considering postural changes, work capacity, growth, and physiologic deadaptation.
NSDL National Science Digital Library
The Principles of Flight Web site is offered by the Pilot's Web Aviation Journal and contains an excellent introduction to the physics of flight. Topics include Newton's laws of motion and force, airfoils, lift and drag, forces acting on an airplane, speed, flight maneuvers, the effects of roll, and more. Each topic contains good illustrations, descriptions, and equations. Overall, the site is an interesting and informative look at the science behind flight.
Principles of Public Engagement
Paul Tabbush; Bianca Ambrose-Oji
This document was produced in association with Paul Tabbush, Forestry and Social Research Services, Farnham.
Accounting for Calibration Uncertainty in Detectors for High-Energy Astrophysics
NASA Astrophysics Data System (ADS)
Xu, Jin
Systematic instrumental uncertainties in astronomical analyses have generally been ignored in data analysis due to the lack of robust principled methods, though the importance of incorporating instrumental calibration uncertainty is widely recognized by both users and instrument builders. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. Lee et al. (2011) introduced a so-called pragmatic Bayesian method to address this problem. The method is "pragmatic" in that it introduces an ad hoc technique that simplifies computation by assuming that the current data is not useful in narrowing the uncertainty for the calibration product, i.e., that the prior and posterior distributions for the calibration products are the same. In the thesis, we focus on incorporating calibration uncertainty into a principled Bayesian X-ray spectral analysis; specifically, we account for uncertainty in the so-called effective area curve and the photon redistribution matrix. X-ray spectral analysis models the distribution of the energies of X-ray photons emitted from an astronomical source. The effective area curve of an X-ray detector describes its sensitivity as a function of the energy of incoming photons, and the photon redistribution matrix describes the probability distribution of the recorded (discrete) energy of a photon as a function of the true (discretized) energy. Starting with the effective area curve, we follow Lee et al. (2011) and use a principal component analysis (PCA) to efficiently represent the uncertainty. Here, however, we leverage this representation to enable a principled, fully Bayesian method to account for calibration uncertainty in high-energy spectral analysis. For the photon redistribution matrix, we first model each conditional distribution as a normal distribution and then apply PCA to the parameters describing the normal models.
This results in an efficient low-dimensional summary of the uncertainty in the redistribution matrix. Our methods for both calibration products are compared with standard analysis techniques and the pragmatic Bayesian method of Lee et al. (2011). The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product; we demonstrate this for the effective area curve. In this way, our fully Bayesian approach can yield more accurate and efficient estimates of the source parameters, and valid estimates of their uncertainty. Moreover, the fully Bayesian approach is the only method that allows us to make a valid inference about the effective area curve itself, quantifying which possible curves are most consistent with the data.
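The PCA compression of calibration uncertainty described above can be sketched as follows. This is a minimal illustration, not the thesis's pipeline: the synthetic "effective area" ensemble, energy grid, and component count are all invented for the example; only the technique (SVD of a mean-centred ensemble, then sampling curves from the leading components) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble of effective-area curves (rows = replicates,
# columns = energy bins); stands in for a real calibration sample.
n_curves, n_bins = 200, 50
energy = np.linspace(0.3, 7.0, n_bins)
base = 100.0 * np.exp(-0.5 * ((energy - 2.0) / 1.5) ** 2)
ensemble = base + rng.normal(0.0, 3.0, (n_curves, n_bins)) * np.sin(energy)

# PCA via SVD of the mean-centred ensemble.
mean_curve = ensemble.mean(axis=0)
centred = ensemble - mean_curve
U, s, Vt = np.linalg.svd(centred, full_matrices=False)

# Keep the leading k components; a simulated curve is the mean plus a
# random combination of the k principal directions.
k = 5
scale = s[:k] / np.sqrt(n_curves - 1)   # std dev of each component score
draw = mean_curve + (rng.normal(size=k) * scale) @ Vt[:k]

# Fraction of ensemble variance captured by the first k components.
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"variance explained by {k} components: {explained:.3f}")
```

In a fully Bayesian treatment the component scores would then be sampled jointly with the source parameters rather than fixed in advance.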
Uncertainty Management in Expert Systems
Ng Keung-chi; Bruce Abramson
1990-01-01
Basic expert system terminology is reviewed, and several uncertainty management paradigms are surveyed. The focus is on subjective probability theory, Dempster-Shafer theory, and possibility theory, although a number of other methods are mentioned. The benefits and limitations of the various schemes are examined, examples of expert systems within each school are presented, and some relevant open problems are discussed
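Of the paradigms the survey covers, Dempster-Shafer theory is the least familiar to most readers; a minimal sketch of its combination rule follows. The frame, focal sets, and mass values are invented for illustration and are not taken from the survey.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) by
    Dempster's rule of combination, returning the combined mass
    function and the conflict mass K."""
    combined = {}
    conflict = 0.0
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2        # mass falling on the empty set
    # Normalise away the conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}, conflict

A, B = frozenset("a"), frozenset("b")
AB = frozenset("ab")
m1 = {A: 0.6, AB: 0.4}   # evidence mostly supporting 'a'
m2 = {B: 0.3, AB: 0.7}   # weaker evidence for 'b'
m12, k = dempster_combine(m1, m2)
print({''.join(sorted(s)): round(w, 3) for s, w in m12.items()},
      "conflict:", round(k, 3))
```

The normalisation by 1 - K is the step that distinguishes Dempster's rule from a plain product of beliefs, and it is also the source of the rule's well-known sensitivity to highly conflicting evidence.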
Exploring Uncertainty with Projectile Launchers
ERIC Educational Resources Information Center
Orzel, Chad; Reich, Gary; Marr, Jonathan
2012-01-01
The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…
Identity Uncertainty and Citation Matching
Hanna Pasula; Bhaskara Marthi; Brian Milch; Stuart J. Russell; Ilya Shpitser
2002-01-01
Identity uncertainty is a pervasive problem in real-world data analysis. It arises whenever objects are not labeled with unique identifiers or when those identifiers may not be perceived perfectly. In such cases, two observations may or may not correspond to the same object. In this paper, we consider the problem in the context of citation matching—the problem of
Uncertainty quantification and error analysis
Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip
2010-01-01
UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.
Getting to grips with uncertainty
T Kippenberger
1998-01-01
Challenges the belief shown by many McKinsey managers that accurate predictions of the future follow rigorous analysis using the many tools available. Describes four levels of uncertainty that require differing approaches: clear-enough future; alternate futures; range of futures; and true ambiguity. Emphasizes that real options create learning through the thought processes that the method demands and by virtue of the
Uncertainties in stellar abundance analyses
Martin Asplund
2003-10-16
Over the last half-century quantitative stellar spectroscopy has made great progress. However, most stellar abundance analyses today still employ rather simplified models, which can introduce severe systematic errors swamping the observational errors. Some of these uncertainties for late-type stars are briefly reviewed here: atomic and molecular data, stellar parameters, model atmospheres and spectral line formation.
Quantification of entanglement via uncertainties
Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S.
2007-03-15
We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.
Spatial uncertainty and ecological models
Jager, Yetta [ORNL; King, Anthony Wayne [ORNL
2004-07-01
Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.
Structure and Uncertainty Graphical modelling
Green, Peter
[Slide deck; text extraction fragmentary.] Structure and Uncertainty: graphical modelling and complex stochastic systems. Recoverable topics include complex stochastic systems, networks, and functional categories of genes in the human genome (Venter et al., Science, 16 February 2001).
Academic Principles: A Brief Introduction
ERIC Educational Resources Information Center
Association of American Universities, 2013
2013-01-01
For many decades certain core principles have guided the conduct of teaching, research, and scholarship at American universities, as well as the ways in which these institutions are governed. There is ample evidence that these principles have strongly contributed to the quality of American universities. The principles have also made these…
Cosmology with minimal length uncertainty relations
Babak Vakili
2008-11-21
We study the effects of the existence of a minimal observable length in the phase space of classical and quantum de Sitter (dS) and anti-de Sitter (AdS) cosmology. Since this length has been suggested in quantum gravity and string theory, its effects in the early universe might be expected. Adopting the existence of such a minimum length results in the Generalized Uncertainty Principle (GUP), which is a deformed Heisenberg algebra between minisuperspace variables and their momentum operators. We extend these deformed commutation relations to the corresponding deformed Poisson algebra in the classical limit. Using the resulting Poisson and Heisenberg relations, we then construct the classical and quantum cosmology of dS and AdS models in a canonical framework. We show that in classical dS cosmology this effect yields an inflationary universe in which the rate of expansion is larger than in the usual dS universe. Also, for the AdS model it is shown that GUP might change the oscillatory nature of the corresponding cosmology. We also study the effects of GUP in quantized models through approximate analytical solutions of the Wheeler-DeWitt (WD) equation, in the limit of small scale factor for the universe, and compare the results with the ordinary quantum cosmology in each case.
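For reference, the deformed Heisenberg algebra that the GUP literature most commonly uses is the one-parameter Kempf form below; whether this abstract's model uses exactly this deformation is an assumption, but it illustrates how a minimal length arises:

```latex
[\hat{x}, \hat{p}] = i\hbar\left(1 + \beta \hat{p}^{2}\right),
\qquad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right),
\qquad
\Delta x_{\min} = \hbar\sqrt{\beta}.
```

Minimising the right-hand side of the deformed uncertainty relation over \(\Delta p\) gives the nonzero minimal position uncertainty \(\Delta x_{\min}\), the "minimal observable length" of the abstract.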
Attention, Uncertainty, and Free-Energy
Feldman, Harriet; Friston, Karl J.
2010-01-01
We suggested recently that attention can be understood as inferring the level of uncertainty or precision during hierarchical perception. In this paper, we try to substantiate this claim using neuronal simulations of directed spatial attention and biased competition. These simulations assume that neuronal activity encodes a probabilistic representation of the world that optimizes free-energy in a Bayesian fashion. Because free-energy bounds surprise or the (negative) log-evidence for internal models of the world, this optimization can be regarded as evidence accumulation or (generalized) predictive coding. Crucially, both predictions about the state of the world generating sensory data and the precision of those data have to be optimized. Here, we show that if the precision depends on the states, one can explain many aspects of attention. We illustrate this in the context of the Posner paradigm, using the simulations to generate both psychophysical and electrophysiological responses. These simulated responses are consistent with attentional bias or gating, competition for attentional resources, attentional capture and associated speed-accuracy trade-offs. Furthermore, if we present both attended and non-attended stimuli simultaneously, biased competition for neuronal representation emerges as a principled and straightforward property of Bayes-optimal perception. PMID:21160551
Adjoint-Based Uncertainty Quantification with MCNP
NASA Astrophysics Data System (ADS)
Seifried, Jeffrey Edwin
This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
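The sensitivity-based uncertainty quantification described above is conventionally summarised by the first-order "sandwich" propagation formula; the thesis's exact expressions are not reproduced here, so this standard form is given only as orientation:

```latex
\left(\frac{\Delta R}{R}\right)^{2} \;\approx\; \mathbf{S}\,\mathbf{C}_{\mathrm{rel}}\,\mathbf{S}^{\mathsf{T}},
\qquad
S_{i} \;=\; \frac{\sigma_{i}}{R}\,\frac{\partial R}{\partial \sigma_{i}},
```

where \(R\) is the figure of merit, \(S_{i}\) the relative sensitivity to nuclear-data parameter \(\sigma_{i}\) (computed here via perturbation and adjoint theory), and \(\mathbf{C}_{\mathrm{rel}}\) the relative covariance matrix of the nuclear data.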
Stronger Schrödinger-like uncertainty relations
Qiu-Cheng Song; Cong-Feng Qiao
2015-04-11
In a recent work [Phys. Rev. Lett. 113, 260401 (2014)], L. Maccone and A. K. Pati presented two stronger uncertainty relations and an amended Heisenberg-Robertson uncertainty relation for incompatible observables. In this work we derive a pair of Schrödinger-like uncertainty relations for the product and sum of two variances. We also obtain an uncertainty relation for three observables and investigate its properties for a spin-1 particle state, which indicates that the new uncertainty relation may provide a stronger lower bound than the trivial extension of the Schrödinger uncertainty relation.
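For context, the Schrödinger uncertainty relation that these results strengthen reads, for observables \(A\) and \(B\) with variances \(\sigma_A^2, \sigma_B^2\) and \(\langle\cdot\rangle\) the expectation in the given state:

```latex
\sigma_A^{2}\,\sigma_B^{2} \;\ge\;
\left| \tfrac{1}{2}\langle \{A,B\} \rangle - \langle A\rangle\langle B\rangle \right|^{2}
+ \left| \tfrac{1}{2i}\langle [A,B] \rangle \right|^{2}.
```

Dropping the first (anticommutator) term recovers the weaker Robertson relation, which is the sense in which "Schrödinger-like" bounds are tighter.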
On the evaluation of uncertainties in climate models
NASA Astrophysics Data System (ADS)
Tredger, Edward
The prediction of the Earth's climate system is of immediate importance to many decision-makers. Anthropogenic climate change is a key area of public policy and will likely have widespread impacts across the world over the 21st Century. Understanding potential climate changes, and their magnitudes, is important for effective decision making. The principal tools used to provide such climate predictions are physical models, some of the largest and most complex models ever built. Evaluation of state-of-the-art climate models is vital to understanding our ability to make statements about future climate. This Thesis presents a framework for the analysis of climate models in light of their inherent uncertainties and principles of statistical good practice. The assessment of uncertainties in model predictions to date is incomplete and warrants more attention than it has previously received. This Thesis aims to motivate a more thorough investigation of climate models as fit for use in decision support. The behaviour of climate models is explored using data from the largest ever climate modelling experiment, the climateprediction.net project. The availability of a large set of simulations allows novel methods of analysis for the exploration of the uncertainties present in climate simulations. It is shown that climate models are capable of producing very different behaviour and that the associated uncertainties can be large. Whilst no results are found that cast doubt on the hypothesis that greenhouse gases are a significant driver of climate change, the range of behaviour shown in the climateprediction.net data set has implications for our ability to predict future climate and for the interpretation of climate model output. It is argued that uncertainties should be explored and communicated to users of climate predictions in such a way that decision-makers are aware of the relative robustness of climate model output.
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
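The descriptive statistics for interval data that the report surveys can be illustrated with a minimal sketch. The data set below is invented; the report's actual algorithms (notably for tight variance bounds on large samples) are more sophisticated than this brute-force version.

```python
from itertools import product

# Hypothetical interval data set: each measurement is known only to lie
# within [lo, hi].
data = [(1.0, 1.4), (2.1, 2.3), (0.8, 1.5), (3.0, 3.2)]
n = len(data)

# Interval mean: the mean is monotone in every coordinate, so its exact
# bounds come from the endpoints directly.
mean_lo = sum(lo for lo, _ in data) / n
mean_hi = sum(hi for _, hi in data) / n

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Upper bound on the (population) variance: variance is convex in each
# coordinate, so its maximum over the box of intervals is attained at a
# vertex.  Brute-force enumeration of the 2**n vertices is fine for
# small n; the lower bound needs proper optimisation and is omitted.
var_hi = max(variance(v) for v in product(*data))

print(f"mean bounds: [{mean_lo:.3f}, {mean_hi:.3f}]")
print(f"variance upper bound: {var_hi:.4f}")
```

The interesting computability results in the report concern exactly the cases this sketch dodges: which statistics stay tractable as the intervals overlap and the sample grows.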
Complex Correspondence Principle
Bender, Carl M.; Meisinger, Peter N. [Department of Physics, Washington University, St. Louis, Missouri 63130 (United States); Hook, Daniel W. [Theoretical Physics, Imperial College London, London SW7 2AZ (United Kingdom); Wang Qinghai [Department of Physics, National University of Singapore, Singapore 117542 (Singapore)
2010-02-12
Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.
The quantification of uncertainties in internal doses assessed from monitoring measurements.
Etherington, G
2007-01-01
A novel method is described for the assessment of total uncertainties in intakes and internal doses assessed from in vivo and bioassay monitoring measurements. Using the information on uncertainties in intake patterns, measurements and biokinetic model parameters, the probability distribution functions for assessed intake and dose were generated using Monte-Carlo simulations. The method was implemented using software written in MS Visual Basic 6.0. Preliminary results are presented for the example of routine tritium-in-urine monitoring. It was shown that the uncertainty in the assessed dose first decreases with increasing monitoring interval, reaching a minimum at approximately 14 d, and then increases as the monitoring interval is increased beyond 14 d. The distribution describing the ratio of assessed-dose-to-true-dose becomes very asymmetric at longer monitoring intervals. In principle, this method should allow realistic uncertainties to be placed on assessed internal doses. PMID:17237184
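The Monte Carlo propagation described above (implemented by the author in Visual Basic) can be sketched in outline. Everything numerical here is a placeholder: the toy measurement model, excretion fraction, and dose coefficient are invented for illustration and do not reproduce the paper's tritium biokinetics.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical toy model: an intake inferred from a urine measurement
# with lognormal measurement error, an uncertain excretion fraction,
# and a fixed dose coefficient.  Illustrative values only.
true_measurement = 50.0                                  # measured activity concentration
meas = true_measurement * rng.lognormal(0.0, 0.10, n)    # measurement uncertainty
excretion_frac = rng.normal(0.5, 0.05, n).clip(0.2, 0.8) # biokinetic uncertainty
dose_coeff = 1.8e-11                                     # placeholder dose per unit intake

intake = meas / excretion_frac          # inferred intake, one value per draw
dose = intake * dose_coeff              # propagated dose distribution

lo, med, hi = np.percentile(dose, [2.5, 50, 97.5])
print(f"dose median {med:.3e}, 95% interval [{lo:.3e}, {hi:.3e}]")
```

The paper's finding that uncertainty first falls and then rises with monitoring interval would appear here as a change in the spread of `dose` as the measurement and biokinetic error models are made interval-dependent.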
Farkas, Zsuzsa; Slate, Andrew; Whitaker, Thomas B; Suszter, Gabriella; Ambrus, Árpád
2015-05-13
The uncertainty of pesticide residue levels in crops due to sampling, estimated for 106 individual crops and 24 crop groups from residue data obtained from supervised trials, was adjusted with a factor of 1.3 to accommodate the larger variability of residues under normal field conditions. Further adjustment may be necessary in the case of mixed lots. The combined uncertainty of residue data, including the contribution of sampling, is used for calculation of an action limit, which should not be exceeded when compliance with maximum residue limits is certified as part of premarketing self-control programs. In contrast, for testing compliance of marketed commodities the residues measured in composite samples should be greater than or equal to the decision limit calculated only from the combined uncertainty of the laboratory phase of the residue determination. The options for minimizing the combined uncertainty of measured residues are discussed. The principles described are also applicable to other chemical contaminants. PMID:25658668
Fracture mechanics principles.
Mecholsky, J J
1995-03-01
The principles of linear elastic fracture mechanics (LEFM) were developed in the 1950s by George Irwin (1957). This work was based on previous investigations by Griffith (1920) and Orowan (1944). Irwin (1957) demonstrated that a crack of a given shape in a particular location with respect to the loading geometry had a stress intensity associated with it. He also demonstrated the equivalence between the stress intensity concept and the familiar Griffith criterion of failure. More importantly, he described the systematic and controlled evaluation of the toughness of a material. Toughness is defined as the resistance of a material to rapid crack propagation and can be characterized by one parameter, the critical stress intensity factor KIc. In contrast, the strength of a material is dependent on the size of the initiating crack present in that particular sample or component. The fracture toughness of a material is generally independent of the size of the initiating crack. The strength of any product is limited by the size of the cracks or defects introduced during processing, production and handling. Thus, the application of fracture mechanics principles to dental biomaterials is invaluable in new material development, production control and failure analysis. This paper describes the most useful equations of fracture mechanics to be used in the failure analysis of dental biomaterials. PMID:8621030
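The relationship between toughness, strength, and crack size sketched in the abstract is captured by the standard LEFM stress-intensity equation (the specific "most useful equations" of the paper are not reproduced here):

```latex
K_{I} = Y\,\sigma\sqrt{\pi a},
\qquad
\text{fracture when } K_{I} \ge K_{Ic}
\;\Rightarrow\;
\sigma_{f} = \frac{K_{Ic}}{Y\sqrt{\pi a}},
```

where \(\sigma\) is the applied stress, \(a\) the crack size, and \(Y\) a dimensionless geometry factor. This makes explicit why toughness \(K_{Ic}\) is a material constant while strength \(\sigma_{f}\) falls off with the square root of the initiating flaw size.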
Revisiting Tversky's diagnosticity principle
Evers, Ellen R. K.; Lakens, Daniël
2013-01-01
Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638
Principles of Safety Pharmacology
Pugsley, M K; Authier, S; Curtis, M J
2008-01-01
Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice and the burden of proof, and to highlight areas of intensive activity, such as testing for drug-induced rare event liability and the challenge of testing the safety of so-called biologics (antibodies, gene therapy, and so on). PMID:18604233
Principle of relative locality
Amelino-Camelia, Giovanni [Dipartimento di Fisica, Universita 'La Sapienza', and Sez. Roma1 INFN, P. le A. Moro 2, 00185 Roma (Italy); Freidel, Laurent; Smolin, Lee [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, Ontario N2J 2Y5 (Canada); Kowalski-Glikman, Jerzy [Institute for Theoretical Physics, University of Wroclaw, Pl. Maxa Borna 9, 50-204 Wroclaw (Poland)
2011-10-15
We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.
Principles of conservative prescribing.
Schiff, Gordon D; Galanter, William L; Duhig, Jay; Lodolce, Amy E; Koronkowski, Michael J; Lambert, Bruce L
2011-09-12
Judicious prescribing is a prerequisite for safe and appropriate medication use. Based on evidence and lessons from recent studies demonstrating problems with widely prescribed medications, we offer a series of principles as a prescription for more cautious and conservative prescribing. These principles urge clinicians to (1) think beyond drugs (consider nondrug therapy, treatable underlying causes, and prevention); (2) practice more strategic prescribing (defer nonurgent drug treatment; avoid unwarranted drug switching; be circumspect about unproven drug uses; and start treatment with only 1 new drug at a time); (3) maintain heightened vigilance regarding adverse effects (suspect drug reactions; be aware of withdrawal syndromes; and educate patients to anticipate reactions); (4) exercise caution and skepticism regarding new drugs (seek out unbiased information; wait until drugs have sufficient time on the market; be skeptical about surrogate rather than true clinical outcomes; avoid stretching indications; avoid seduction by elegant molecular pharmacology; beware of selective drug trial reporting); (5) work with patients for a shared agenda (do not automatically accede to drug requests; consider nonadherence before adding drugs to regimen; avoid restarting previously unsuccessful drug treatment; discontinue treatment with unneeded medications; and respect patients' reservations about drugs); and (6) consider long-term, broader impacts (weigh long-term outcomes, and recognize that improved systems may outweigh marginal benefits of new drugs). PMID:21670331
Great Lakes Literacy Principles
NASA Astrophysics Data System (ADS)
Fortner, Rosanne W.; Manzo, Lyndsey
2011-03-01
Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.
Uncertainty Estimation for Extreme Isotopic Ratios Measured in Multi-Collector Mass Spectrometers
NASA Astrophysics Data System (ADS)
Essex, R. M.; Thomas, R. B.
2007-12-01
Isotopic abundances for elements of interest to the earth science community can vary by many orders of magnitude. Commonly used detector systems (Faraday, Daly, SEM) cannot encompass the entire dynamic range of interest, and making measurements at the limits of the useful range for a single type of detector can also present special difficulties. Therefore, estimating a measurement uncertainty that accurately represents the level of confidence in the results obtained for an extreme ratio (<1:10,000) requires the careful consideration of several components that are not applicable to more conventional isotopic ratio measurements. To be consistent with widely accepted principles for determination of measurement uncertainty, as expressed in the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM), it is necessary to consistently evaluate the contributions of these additional components and provide an uncertainty budget for measurement results. This is of particular importance for measurements of extreme isotopic ratios because many of the uncertainty components are associated with measurement biases. For mass spectrometric measurements of elements with relatively evenly distributed isotopic abundances (e.g., Sr and Nd), multicollector dynamic analysis techniques can reduce the significant uncertainty contributors to measurement variability and uncertainty of the internal normalization ratio. For more extreme ratios such as 234U/238U and 230Th/232Th, not only do uncertainties associated with measurement variability and mass bias corrections have to be considered, but several other components can be significant or even dominant contributors to measurement uncertainty. These components include uncertainties associated with detector inter-calibration, detector linearity, electronic baseline, background corrections, dark noise, dead time, and mass interferences.
Although one or several of these parameters may not be significant contributors to the uncertainty for a particular measurement, the relative significance of any individual component can change dramatically depending upon signal intensity. Therefore, a proper uncertainty evaluation will, at least initially, include all of the identifiable components associated with that particular measurement system. The relationship of these various components to the total uncertainty should be presented in an uncertainty budget which provides transparency for the uncertainty estimate and is a useful tool for determining where best to seek improvements in accuracy or precision.
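A GUM-style uncertainty budget of the kind described above can be sketched as follows. The component names echo the abstract, but the numbers are invented for illustration and assume independent components combined as relative standard uncertainties.

```python
import math

# Hypothetical uncertainty budget (relative standard uncertainties, %).
budget = {
    "counting statistics":        0.050,
    "detector inter-calibration": 0.030,
    "detector linearity":         0.020,
    "baseline/background":        0.015,
    "dead time":                  0.010,
}

# GUM combination for independent components: root sum of squares.
u_combined = math.sqrt(sum(u ** 2 for u in budget.values()))

# Report each component's share of the combined variance, largest first,
# which shows where improvement effort pays off most.
for name, u in sorted(budget.items(), key=lambda kv: -kv[1]):
    share = (u ** 2) / (u_combined ** 2) * 100.0
    print(f"{name:28s} u = {u:.3f}%  ({share:4.1f}% of variance)")
print(f"combined standard uncertainty: {u_combined:.3f}%")
```

Because components add in quadrature, the variance-share column makes the paper's point concrete: a component whose magnitude changes with signal intensity can swing from negligible to dominant.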
Uncertainty in flood risk mapping
NASA Astrophysics Data System (ADS)
Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo
2014-05-01
A flood is a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms is studied. In this context, flooding occurs when the water rises above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic, and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the resulting flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can translate into erroneous risk predictions. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, which depends on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated peak flow and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results of this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic.
With this methodology a fuzzy number is obtained for the peak flow, which indicates all possible peak flow values and the possibility of their occurrence. To produce the LCM, a supervised soft classifier is used to classify a satellite image, and a possibility distribution is assigned to the pixels. These extra data provide additional land cover information at the pixel level and allow the assessment of the classification uncertainty, which is then considered in the identification of the parameter uncertainty used to compute the peak flow. The proposed approach was applied to produce vulnerability and risk maps that integrate uncertainty in the urban area of Leiria, Portugal. A SPOT-4 satellite image and DEMs of the region were used, and the peak flow was computed using the Soil Conservation Service method. The HEC-HMS, HEC-RAS, Matlab and ArcGIS software programs were used. The analysis of the results for the presented case study establishes the order of magnitude of the uncertainty in the watershed peak flow value and identifies the areas most susceptible to flood risk.
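The fuzzy-arithmetic step can be illustrated with alpha-cut interval computations. The triangular fuzzy numbers, their parameter values, and the toy peak-flow model below are assumptions for illustration; the actual study computes peak flow with the Soil Conservation Service method:

```python
import numpy as np

# Alpha-cut fuzzy arithmetic sketch: a fuzzy number is represented by its
# interval [lo(alpha), hi(alpha)] at each membership level alpha.
alphas = np.linspace(0.0, 1.0, 11)

def triangular(a, b, c):
    """Alpha-cuts of a triangular fuzzy number with support [a, c], peak b."""
    lo = a + alphas * (b - a)
    hi = c - alphas * (c - b)
    return lo, hi

# Hypothetical fuzzy inputs: catchment area (km^2) and a runoff coefficient.
area_lo, area_hi = triangular(95.0, 100.0, 108.0)
coef_lo, coef_hi = triangular(0.8, 1.0, 1.1)

# Interval multiplication at each alpha level (all quantities positive here,
# so endpoints multiply directly).
q_lo, q_hi = area_lo * coef_lo, area_hi * coef_hi

print("peak flow support:", (q_lo[0], q_hi[0]))   # widest interval, alpha = 0
print("most possible value:", q_lo[-1])           # alpha = 1: q_lo == q_hi
```

The resulting pair of arrays is exactly the "fuzzy number for the peak flow" the abstract describes: every possible value together with its possibility level.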
Quantifying the uncertainty in heritability.
Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph
2014-05-01
The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large. PMID:24670270
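A minimal sketch of the Bayesian approach described above, under simplifying assumptions: a variance-components model with total variance fixed at one, a synthetic kinship matrix, and a flat prior on heritability (the full method also handles covariates and the scale parameter). A single eigendecomposition of the kinship matrix makes each likelihood evaluation cheap, and a grid gives the posterior:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical kinship matrix: a random positive-definite matrix.
A = rng.standard_normal((n, 3 * n))
K = A @ A.T / (3 * n)

# Simulate phenotypes with true h2 = 0.5 and total variance 1:
# y ~ N(0, h2*K + (1-h2)*I).
h2_true = 0.5
cov = h2_true * K + (1 - h2_true) * np.eye(n)
y = rng.multivariate_normal(np.zeros(n), cov)

# Spectral trick: one eigendecomposition makes every likelihood O(n).
s, U = np.linalg.eigh(K)
yt = U.T @ y

def loglik(h2):
    v = h2 * s + (1 - h2)          # eigenvalues of h2*K + (1-h2)*I
    return -0.5 * np.sum(np.log(v) + yt**2 / v)

# Posterior over a grid with a flat prior on h2.
grid = np.linspace(0.01, 0.99, 99)
ll = np.array([loglik(h) for h in grid])
post = np.exp(ll - ll.max())
post /= post.sum()

mean = float(np.sum(grid * post))
sd = float(np.sqrt(np.sum(grid**2 * post) - mean**2))
print(f"posterior mean h2 = {mean:.2f}, posterior sd = {sd:.2f}")
```

The posterior standard deviation is the direct analogue of the "uncertainty in heritability" the paper quantifies, with no asymptotic normal approximation needed.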
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
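A hedged sketch of how such a residual-risk contribution might enter a Monte Carlo uncertainty evaluation; the error probabilities and bias magnitudes below are hypothetical stand-ins for expert judgments, not values from the cited studies:

```python
import numpy as np

rng = np.random.default_rng(5)
M = 100_000

# Baseline measurement uncertainty without human error (standard, in pH units).
u_meas = 0.02
result = rng.normal(0.0, u_meas, M)

# Residual human-error scenarios remaining after quality-system reduction:
# (probability per measurement, bias magnitude if the error occurs).
scenarios = [(0.01, 0.05), (0.002, 0.20)]
for p, bias in scenarios:
    occurs = rng.random(M) < p
    result += occurs * rng.choice([-bias, bias], M)

u_total = result.std(ddof=1)
print(f"u without human error = {u_meas:.4f}, with = {u_total:.4f}")
```

Comparing the two numbers shows the kind of conclusion the paper reaches: the human-error contribution is not negligible, but with low residual probabilities it need not dominate the budget either.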
Probabilistic Robotics Overview of probability, Representing uncertainty
Kosecka, Jana
Probabilistic Robotics: overview of probability and representing uncertainty; propagation of uncertainty and Bayes rule; application to localization and mapping. Slides from Autonomous Robots (Siegwart and Nourbakhsh), Chapter 5, and Probabilistic Robotics (S. Thrun et al.).
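The "propagation of uncertainty plus Bayes rule" idea from these slides condenses into a one-dimensional discrete Bayes filter; the motion model and sensor likelihood below are generic illustrations, not taken from the course material:

```python
import numpy as np

belief = np.full(5, 0.2)               # uniform prior over 5 grid cells

def predict(belief):
    # Motion model: move right one cell with p = 0.8, stay put with p = 0.2.
    # This step propagates (and typically spreads) the uncertainty.
    return 0.8 * np.roll(belief, 1) + 0.2 * belief

def update(belief, likelihood):
    # Bayes rule: posterior proportional to prior times sensor likelihood.
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = predict(belief)
# Sensor says "probably cell 2": hypothetical likelihood vector.
belief = update(belief, np.array([0.1, 0.1, 0.6, 0.1, 0.1]))
print(belief)
```

Iterating predict/update over a map is exactly the localization loop the slides build toward.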
Greenhouse Gas Inventory Uncertainties Need Characterization
Post, Wilfred M.
Contact: Gregg Marland. Topics: carbon cycles, regional and national ramifications of international agreements, and the characterization of greenhouse gas inventory uncertainties (U.S. Department of Energy).
Optimization under uncertainty in radiation therapy
Chan, Timothy Ching-Yee
2007-01-01
In the context of patient care for life-threatening illnesses, the presence of uncertainty may compromise the quality of a treatment. In this thesis, we investigate robust approaches to managing uncertainty in radiation ...
Generalized approach to minimal uncertainty products
Mendoza, Douglas M
2013-01-01
A general technique to construct quantum states that saturate uncertainty products using variational methods is developed. Such a method allows one to numerically compute uncertainties in cases where the Robertson-Schrodinger ...
Trading indicators with information-gap uncertainty
Guttman, Tony
Colin J. Thompson, ARC Centre of Excellence. Practical implications: an additional technical trading tool for applying information-gap theory. Keywords: trading indicators, information gaps, uncertainty, robustness, financial modelling. Paper type: research paper.
Quantifying uncertainty from material inhomogeneity.
Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee
2009-09-01
Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments.
In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.
Communicating uncertainty to policy makers
Anthony Patt
As the types of problems that policy-makers attempt to solve grow more complex, they increasingly turn to scientists for specific advice. A critical challenge in communicating the results of scientific research arises when those results contain a great deal of uncertainty. Different academic disciplines offer diverging advice on how scientists should proceed, based in large part on differences in
Quantification of uncertainties in composites
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Singhal, S. N.; Murthy, P. L. N.; Chamis, Christos C.
1993-01-01
An integrated methodology is developed for computationally simulating the probabilistic composite material properties at all composite scales. The simulation requires minimum input consisting of the description of uncertainties at the lowest scale (fiber and matrix constituents) of the composite and in the fabrication process variables. The methodology allows the determination of the sensitivity of the composite material behavior to all the relevant primitive variables. This information is crucial for reducing the undesirable scatter in composite behavior at its macro scale by reducing the uncertainties in the most influential primitive variables at the micro scale. The methodology is computationally efficient. The computational time required by the methodology described herein is an order of magnitude less than that for Monte Carlo Simulation. The methodology has been implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of the methodology/code are demonstrated by simulating the uncertainties in the heat-transfer, thermal, and mechanical properties of a typical laminate and comparing the results with the Monte Carlo simulation method and experimental data. The important observation is that the computational simulation for probabilistic composite mechanics has sufficient flexibility to capture the observed scatter in composite properties.
Uncertainty propagation in nuclear forensics.
Pommé, S; Jerome, S M; Venchiarutti, C
2014-07-01
Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined. PMID:24607529
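A simplified instance of the age-dating formulae: if the daughter is treated as effectively stable over the age of interest (the paper also gives exact solutions for unstable daughters), the age follows from the parent/daughter atom ratio and first-order propagation gives its uncertainty. The half-life is the commonly quoted value for 234U; the ratio and the uncertainty levels are hypothetical:

```python
import math

# Decay constant of the parent (half-life of 234U ~ 245.5 ka).
t_half = 245_500.0                    # years
lam = math.log(2) / t_half
u_lam = 0.002 * lam                   # 0.2 % relative half-life uncertainty

R = 1.0e-4                            # measured daughter/parent atom ratio
u_R = 0.01 * R                        # 1 % relative ratio uncertainty

# Age from the atom ratio, stable-daughter approximation:
# R = exp(lam*t) - 1  =>  t = ln(1 + R) / lam
t = math.log1p(R) / lam

# First-order (GUM) uncertainty propagation through both inputs.
dt_dR = 1.0 / (lam * (1.0 + R))
dt_dlam = -t / lam
u_t = math.hypot(dt_dR * u_R, dt_dlam * u_lam)

print(f"age = {t:.2f} y  +/- {u_t:.2f} y")
```

Note how the ratio uncertainty dominates here; the paper's point about needing more precise half-life data applies when the relative half-life uncertainty rivals that of the ratio.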
Uncertainty compliant design flood estimation
NASA Astrophysics Data System (ADS)
Botto, A.; Ganora, D.; Laio, F.; Claps, P.
2014-05-01
Hydraulic infrastructures are commonly designed with reference to target values of flood peak, estimated using probabilistic techniques such as flood frequency analysis. The application of these techniques entails levels of uncertainty which are sometimes quantified but normally not accounted for explicitly in decisions regarding design discharges. The present approach aims at defining a procedure which enables the definition of Uncertainty Compliant Design (UNCODE) values of flood peaks. To pursue this goal, we first demonstrate the equivalence of the Standard design based on the return period and the cost-benefit procedure when linear cost and damage functions are used. We then use this result to assign an expected cost to estimation errors, thus setting a framework to obtain a design flood estimator which minimizes the total expected cost. This procedure properly accounts for the uncertainty which is inherent in the frequency curve estimation. Application of the UNCODE procedure to real cases leads to remarkable displacements of the design flood from the Standard values. UNCODE estimates are systematically larger than the Standard ones, with substantial differences (up to 55%) when large return periods or short data samples are considered.
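The cost-benefit reasoning behind UNCODE can be sketched with a toy expected-cost minimization; the linear cost coefficients and the distribution standing in for frequency-curve estimation error are assumptions, not the paper's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Uncertain "true" design flood, e.g. from frequency-curve estimation error
# (hypothetical normal distribution, in m^3/s).
true_q = rng.normal(1000.0, 150.0, size=20_000)

c_build = 1.0      # construction cost per unit of design discharge (linear)
c_damage = 4.0     # damage cost per unit of under-design (linear)

# Choose the design discharge minimizing construction cost plus expected damage.
candidates = np.linspace(800.0, 1600.0, 161)
expected_cost = [
    c_build * q + c_damage * np.mean(np.maximum(true_q - q, 0.0))
    for q in candidates
]
q_star = candidates[int(np.argmin(expected_cost))]
print("uncertainty-compliant design discharge:", q_star)
```

With linear costs the optimum sits above the central estimate of 1000 m³/s, mirroring the paper's finding that UNCODE estimates are systematically larger than Standard ones.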
Quantifying Uncertainty in Epidemiological Models
Ramanathan, Arvind; Jha, Sumit Kumar
2012-01-01
Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
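A minimal statistical-model-checking sketch in the spirit of this approach: estimate by Monte Carlo the probability that a stochastic SIR model satisfies a property on its trajectory. The chain-binomial model, the parameter distribution standing in for a calibration posterior, and the property threshold are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_peak(beta, gamma, n=1000, i0=5, steps=300):
    """Simulate a chain-binomial SIR epidemic; return the peak infected fraction."""
    s, i = n - i0, i0
    peak = i
    for _ in range(steps):
        new_inf = rng.binomial(s, 1 - np.exp(-beta * i / n))
        new_rec = rng.binomial(i, 1 - np.exp(-gamma))
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
        if i == 0:
            break
    return peak / n

# Property to check: "the peak infected fraction stays below 0.15".
runs = 500
# Parameter uncertainty: beta drawn from a (hypothetical) calibration posterior.
betas = rng.uniform(0.25, 0.45, runs)
satisfied = sum(sir_peak(b, gamma=0.2) < 0.15 for b in betas)
print(f"P(peak < 0.15) ~= {satisfied / runs:.2f}")
```

A full model checker would phrase the property in a spatio-temporal logic and add confidence bounds on the estimate, but the core computation is this satisfaction frequency.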
NASA Astrophysics Data System (ADS)
Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Coquet, Richard; François Fontaine, Jean
2014-12-01
Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method developed in Supplement 1 of the Guide to Expression of Uncertainty in Measurement [1] but also identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach principle has been developed which allows for characterizing the possible evolution of the AACMM during the measurement and quantifying in a second level the uncertainty on the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization on the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the ‘localization point’. The global method is thus presented and results of the first sub-level are particularly developed. The main sources of uncertainty, including AACMM deformations, are exposed.
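The GUM Supplement 1 idea used here, propagating distributions by Monte Carlo through the measurement model, can be sketched with a toy stand-in for the AACMM model (one joint angle, one segment length; all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
M = 200_000

# Sample the input quantities from their assigned distributions.
theta = rng.normal(np.deg2rad(30.0), np.deg2rad(0.05), M)  # joint angle, rad
L = rng.normal(500.0, 0.02, M)                             # segment length, mm

# Push every draw through the measurement model (here a single coordinate).
x = L * np.cos(theta)

# The output distribution directly yields the estimate and its uncertainty.
print(f"x = {x.mean():.3f} mm, u(x) = {x.std(ddof=1):.3f} mm")
```

The paper's multi-level scheme nests several such propagations, with the lower levels characterizing point-positioning, calibration, and localization errors that feed the top-level model.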
Application of fuzzy system theory in addressing the presence of uncertainties
NASA Astrophysics Data System (ADS)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.
2015-02-01
In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Uncertainties need to be addressed in order to prevent failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory involves a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping, i.e., the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented; defuzzification allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results compared with the conventional finite element method.
48 CFR 49.113 - Cost principles.
Code of Federal Regulations, 2010 CFR
2010-10-01
Cost principles. Section 49.113, Federal Acquisition... MANAGEMENT, TERMINATION OF CONTRACTS, General Principles, 49.113 Cost principles: The cost principles and procedures...
Risk Analysis and Uncertainty: Implications for Counselling
ERIC Educational Resources Information Center
Hassenzahl, David
2004-01-01
Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…
Experimentation and uncertainty analysis for engineers
H. W. Coleman; W. G. Jr. Steele
1989-01-01
The application of uncertainty analysis (UA) methods to experimental programs is discussed in an introduction for advanced undergraduate and graduate students of engineering and the physical sciences. Chapters are devoted to experimental errors and uncertainty; statistical considerations in measurement uncertainties; general UA methods for experiment planning; detailed UA methods for experiment design; problems due to variable but deterministic bias errors,
Assessment of Uncertainty-Infused Scientific Argumentation
ERIC Educational Resources Information Center
Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.
2014-01-01
Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…
Shannon Revisited: Information in Terms of Uncertainty.
ERIC Educational Resources Information Center
Cole, Charles
1993-01-01
Discusses the meaning of information in terms of Shannon's mathematical theory of communication and the concept of uncertainty. The uncertainty associated with the transmission of the signal is argued to have more significance for information science than the uncertainty associated with the selection of a message from a set of possible messages.…
The Stock Market: Risk vs. Uncertainty.
ERIC Educational Resources Information Center
Griffitts, Dawn
2002-01-01
This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…
UK coastal flood risk; understanding the uncertainty
Matt Lewis; Paul Bates; Kevin Horsburgh; Ros Smith
2010-01-01
The sensitivity of flood risk mapping to the major sources of future climate uncertainty was investigated by propagating these uncertainties through a LISFLOOD inundation model of a significant flood event on the North Somerset coast, in the west of the UK. The largest source of uncertainty was found to be the effect of the global Mean Sea Level rise range
Regarding Uncertainty in Teachers and Teaching
ERIC Educational Resources Information Center
Helsing, Deborah
2007-01-01
The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective…
Principles of Induction Accelerators
NASA Astrophysics Data System (ADS)
Briggs, Richard J.
The basic concepts involved in induction accelerators are introduced in this chapter. The objective is to provide a foundation for the more detailed coverage of key technology elements and specific applications in the following chapters. A wide variety of induction accelerators are discussed in the following chapters, from the high current linear electron accelerator configurations that have been the main focus of the original developments, to circular configurations like the ion synchrotrons that are the subject of more recent research. The main focus in the present chapter is on the induction module containing the magnetic core that plays the role of a transformer in coupling the pulsed power from the modulator to the charged particle beam. This is the essential common element in all these induction accelerators, and an understanding of the basic processes involved in its operation is the main objective of this chapter. (See [1] for a useful and complementary presentation of the basic principles in induction linacs.)
NASA Astrophysics Data System (ADS)
Vogl, Thomas J.
The evolving field of interventional oncology can only be considered as a small integrative part in the complex area of oncology. The new field of interventional oncology needs a standardization of the procedures, the terminology, and criteria to facilitate the effective communication of ideas and appropriate comparison between treatments and new integrative technology. In principle, ablative therapy is a part of locoregional oncological therapy and is defined either as chemical ablation using ethanol or acetic acid, or thermotherapies such as radiofrequency, laser, microwave, and cryoablation. All these new evolving therapies have to be exactly evaluated and an adequate terminology has to be used to define imaging findings and pathology. All the different technologies and evaluated therapies have to be compared, and the results have to be analyzed in order to improve the patient outcome.
NASA Astrophysics Data System (ADS)
Barbour, Julian
The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.
Dynamical principles in neuroscience
Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I. (Institute for Nonlinear Science, University of California, San Diego, 9500 Gilman Drive 0402, La Jolla, California 92093-0402, United States; GNB, Departamento de Ingenieria Informatica, Universidad Autonoma de Madrid, 28049 Madrid, Spain; Department of Physics and Marine Physical Laboratory, Scripps Institution of Oceanography)
2006-10-15
Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?.
Uncertainty Quantification in Climate Modeling
NASA Astrophysics Data System (ADS)
Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.
2011-12-01
We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. 
While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well-known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
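The Polynomial Chaos idea in the abstract above can be sketched in miniature: project a toy model onto a Legendre basis and read the output moments directly off the spectral coefficients. This is an illustrative one-parameter stand-in (y = x², x uniform on [-1, 1]), not the CLM workflow itself:

```python
import math

# Hypothetical one-parameter "model"; the real CLM has hundreds of inputs.
def model(x):
    return x ** 2

# Legendre basis P0, P1, P2 and a 3-point Gauss-Legendre rule on [-1, 1].
P = [lambda x: 1.0, lambda x: x, lambda x: 0.5 * (3 * x * x - 1)]
nodes = [-math.sqrt(3 / 5), 0.0, math.sqrt(3 / 5)]
weights = [5 / 9, 8 / 9, 5 / 9]

# Spectral coefficients by projection: c_k = (2k+1)/2 * integral of f * P_k.
c = [(2 * k + 1) / 2 * sum(w * model(x) * P[k](x)
                           for w, x in zip(weights, nodes))
     for k in range(3)]

# Uncertainty propagation for free: output moments follow from coefficients.
mean = c[0]                                         # exact value: 1/3
var = sum(c[k] ** 2 / (2 * k + 1) for k in (1, 2))  # exact value: 4/45
```

Three model runs suffice here because the rule integrates the quadratic exactly; the abstract's point is that sparse, carefully placed runs can still yield quantified output uncertainty.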
Optimal uncertainty quantification with model uncertainty and legacy data
NASA Astrophysics Data System (ADS)
Kamga, P.-H. T.; Li, B.; McKerns, M.; Nguyen, L. H.; Ortiz, M.; Owhadi, H.; Sullivan, T. J.
2014-12-01
We present an optimal uncertainty quantification (OUQ) protocol for systems that are characterized by an existing physics-based model and for which only legacy data is available, i.e., no additional experimental testing of the system is possible. Specifically, the OUQ strategy developed in this work consists of using the legacy data to establish, in a probabilistic sense, the level of error of the model, or modeling error, and to subsequently use the validated model as a basis for the determination of probabilities of outcomes. The quantification of modeling uncertainty specifically establishes, to a specified confidence, the probability that the actual response of the system lies within a certain distance of the model. Once the extent of model uncertainty has been established in this manner, the model can be conveniently used to stand in for the actual or empirical response of the system in order to compute probabilities of outcomes. To this end, we resort to the OUQ reduction theorem of Owhadi et al. (2013) in order to reduce the computation of optimal upper and lower bounds on probabilities of outcomes to a finite-dimensional optimization problem. We illustrate the resulting UQ protocol by means of an application concerned with the response to hypervelocity impact of 6061-T6 Aluminum plates by Nylon 6/6 impactors at impact velocities in the range of 5-7 km/s. The hypervelocity impact application demonstrates the remarkable ability of the legacy OUQ protocol to process diverse information on the system and to supply rigorous bounds on system performance under realistic, and less than ideal, scenarios.
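The flavor of the reduction to finite-dimensional optimization can be seen in a textbook-sized instance (an illustrative example of mine, not taken from the paper): among all probability distributions on [0, 1] with a given mean, the largest possible exceedance probability is attained by a two-point, i.e. finite-support, measure:

```python
# Optimal upper bound on P(X >= a) over ALL distributions on [0, 1] with
# E[X] = m, for a > m. The optimum is attained by a two-point measure that
# places mass p at a and mass 1 - p at 0, with p * a = m, so p = m / a
# (Markov's inequality, and it is tight). This mirrors, in miniature, the
# reduction of an infinite-dimensional OUQ problem to a finite one.
def max_prob_exceed(m, a):
    assert 0 < m < a <= 1
    return m / a

p_star = max_prob_exceed(0.2, 0.5)  # worst case over all admissible measures
```

Generic distributions (uniform, truncated Gaussian, etc.) with mean 0.2 give strictly smaller exceedance probabilities; only the extremal discrete measure achieves the bound.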
Towards first-principles electrochemistry
Dabo, Ismaila
2008-01-01
This doctoral dissertation presents a comprehensive computational approach to describe quantum mechanical systems embedded in complex ionic media, primarily focusing on the first-principles representation of catalytic ...
Validation of an Experimentally Derived Uncertainty Model
NASA Technical Reports Server (NTRS)
Lim, K. B.; Cox, D. E.; Balas, G. J.; Juang, J.-N.
1996-01-01
The results show that uncertainty models can be obtained directly from system identification data by using a minimum norm model validation approach. The error between the test data and an analytical nominal model is modeled as a combination of unstructured additive and structured input multiplicative uncertainty. Robust controllers which use the experimentally derived uncertainty model show significant stability and performance improvements over controllers designed with assumed ad hoc uncertainty levels. Use of the identified uncertainty model also allowed a strong correlation between design predictions and experimental results.
Entropic uncertainty relations for multiple measurements
Shang Liu; Liang-Zhu Mu; Heng Fan
2014-11-23
We present entropic uncertainty relations for multiple measurement settings in quantum mechanics. These uncertainty relations are obtained both with and without the presence of quantum memory. They take concise forms that can be proven by a unified method and are easy to calculate. Our results recover the well-known entropic uncertainty relations for two observables, which quantify the uncertainties about the outcomes of two incompatible measurements. These uncertainty relations are applicable both in the foundations of quantum theory and in the security of many quantum cryptographic protocols.
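The memoryless two-observable relation recovered above can be checked numerically. A minimal sketch for a qubit measured in two mutually unbiased bases, where the Maassen-Uffink bound is 1 bit (the state and angle are illustrative):

```python
import math

def shannon(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical real qubit state |psi> = cos(t)|0> + sin(t)|1>.
t = math.pi / 8
a, b = math.cos(t), math.sin(t)

# Outcome probabilities in the computational basis {|0>, |1>} ...
p_z = (a ** 2, b ** 2)
# ... and in the Hadamard basis {|+>, |->}.
p_x = ((a + b) ** 2 / 2, (a - b) ** 2 / 2)

# Maassen-Uffink: H(Z) + H(X) >= -log2(c), with overlap c = 1/2 for these
# mutually unbiased bases, so the joint uncertainty is at least 1 bit.
total = shannon(p_z) + shannon(p_x)
bound = -math.log2(1 / 2)
```

No single-qubit state can push `total` below `bound` for this measurement pair; that is exactly the incompatibility the abstract refers to.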
Uncertainty assessment in probabilistic risk assessment
Spencer, F.W.; Diegert, K.V.; Easterling, R.G.
1985-01-01
This paper focuses on our proposal for the different roles that data and expert opinion play in uncertainty analysis. Parameters for which reliable data exist are estimated by classical statistical techniques. Their uncertainty bounds are statistical confidence limits. Uncertainty about data-free parameters is expressed as a range, or set, of plausible values, with no probabilistic connotations. For parameters with both data and opinion sources, conditional confidence limits can be used to assess both total uncertainty, and the separate contributions of data-based and data-free uncertainties.
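The data-based side of the proposal, classical statistical confidence limits, can be sketched with a stdlib-only example (the failure-time data and the t-quantile are illustrative, not from the paper):

```python
import math
import statistics

# Hypothetical data-based parameter: measured times to failure (arb. units).
data = [4.2, 5.1, 3.8, 4.9, 5.4, 4.4, 4.7, 5.0]

# Classical two-sided ~95% confidence limits on the mean, using the
# t-quantile for n - 1 = 7 degrees of freedom (2.365).
n = len(data)
mean = statistics.fmean(data)
s = statistics.stdev(data)
half = 2.365 * s / math.sqrt(n)
lower, upper = mean - half, mean + half
```

For a data-free parameter, the paper's proposal would instead report a plain range of plausible values with no probabilistic interpretation attached.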
Quantum principles and free particles. [evaluation of partitions
NASA Technical Reports Server (NTRS)
1976-01-01
The quantum principles that establish the energy levels and degeneracies needed to evaluate the partition functions are explored. The uncertainty principle is associated with the dual wave-particle nature of the model used to describe quantized gas particles. The Schroedinger wave equation is presented as a generalization of Maxwell's wave equation; the former applies to all particles while the Maxwell equation applies to the special case of photon particles. The size of the quantum cell in phase space and the representation of momentum as a space derivative operator follow from the uncertainty principle. A consequence of this is that steady-state problems that are space-time dependent for the classical model become only space dependent for the quantum model and are often easier to solve. The partition function is derived for quantized free particles and, at normal conditions, the result is the same as that given by the classical phase integral. The quantum corrections that occur at very low temperatures or high densities are derived. These corrections for the Einstein-Bose gas qualitatively describe the condensation effects that occur in liquid helium, but are unimportant for most practical purposes otherwise. However, the corrections for the Fermi-Dirac gas are important because they quantitatively describe the behavior of high-density conduction electron gases in metals and explain the zero point energy and low specific heat exhibited in this case.
Aspects of universally valid Heisenberg uncertainty relation
NASA Astrophysics Data System (ADS)
Fujikawa, Kazuo; Umetsu, Koichiro
2013-01-01
A numerical illustration of a universally valid Heisenberg uncertainty relation, which was proposed recently, is presented by using the experimental data on spin-measurements by J. Erhart et al. [Nat. Phys. 8, 185 (2012)]. This uncertainty relation is closely related to a modified form of the Arthurs-Kelly uncertainty relation, which is also tested by the spin-measurements. The universally valid Heisenberg uncertainty relation always holds, but both the modified Arthurs-Kelly uncertainty relation and the Heisenberg error-disturbance relation proposed by Ozawa, which was analyzed in the original experiment, fail in the present context of spin-measurements, and the cause of their failure is identified with the assumptions of unbiased measurement and disturbance. It is also shown that all the universally valid uncertainty relations are derived from Robertson's relation and thus the essence of the uncertainty relation is exhausted by Robertson's relation, as is widely accepted.
Essays on pricing under uncertainty
Escobari Urday, Diego Alfonso
2008-10-10
. Jansen for his trust and support. I thank Dr. James Dana for his careful revision of the current version of Chapter II. I also want to show my appreciation to Dr. Volodymyr Bilotkach, Dr. Hae-Shin Hwang, Dr. Paan Jindapon, Dr. Eugenio Miravete, Dr. Carlos... leaves the gate. Dana [21] explains that yield management in airlines is used to (1) deal with costly capacity and demand uncertainty, (2) implement price discrimination, and (3) implement peak-load pricing. Because of the lack of appropriate data...
Error models for uncertainty quantification
NASA Astrophysics Data System (ADS)
Josset, L.; Scheidt, C.; Lunati, I.
2012-12-01
In groundwater modeling, uncertainty on the permeability field leads to a stochastic description of the aquifer system, in which the quantities of interests (e.g., groundwater fluxes or contaminant concentrations) are considered as stochastic variables and described by their probability density functions (PDF) or by a finite number of quantiles. Uncertainty quantification is often evaluated using Monte-Carlo simulations, which employ a large number of realizations. As this leads to prohibitive computational costs, techniques have to be developed to keep the problem computationally tractable. The Distance-based Kernel Method (DKM) [1] limits the computational cost of the uncertainty quantification by reducing the stochastic space: first, the realizations are clustered based on the response of a proxy; then, the full model is solved only for a subset of realizations defined by the clustering and the quantiles are estimated from this limited number of realizations. Here, we present a slightly different strategy that employs an approximate model rather than a proxy: we use the Multiscale Finite Volume method (MsFV) [2,3] to compute an approximate solution for each realization, and to obtain a first assessment of the PDF. In this context, DKM is then used to identify a subset of realizations for which the exact model is solved and compared with the solution of the approximate model. This allows highlighting and correcting possible errors introduced by the approximate model, while keeping full statistical information on the ensemble of realizations. Here, we test several strategies to compute the model error, correct the approximate model and achieve an optimal PDF estimation. We present a case study in which we predict the breakthrough curve of an ideal tracer for an ensemble of realizations generated via Multiple Point Direct Sampling [4] with a training image obtained from a 2D section of the Herten permeability field [5].
[1] C. Scheidt and J. Caers, "Representing spatial uncertainty using distances and kernels", Math. Geosci. (2009). [2] P. Jenny et al., "Multi-scale finite-volume method for elliptic problems in subsurface flow simulation", J. Comp. Phys. 187(1) (2003). [3] I. Lunati and S. H. Lee, "An operator formulation of the multiscale finite-volume method with correction function", Multiscale Model. Simul. 8(1) (2009). [4] G. Mariethoz, P. Renard, and J. Straubhaar, "The Direct Sampling method to perform multiple-point geostatistical simulations", Water Resour. Res. 46 (2010). [5] P. Bayer et al., "Three-dimensional high resolution fluvio-glacial aquifer analog", J. Hydrol. 405 (2011) 19.
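The DKM selection step described above (cluster realizations on a cheap proxy response, then run the full model only on cluster representatives) can be sketched with a one-dimensional toy proxy. The proxy values and the plain k-means stand-in for the kernel clustering are illustrative:

```python
import statistics

# Hypothetical proxy responses (e.g., approximate breakthrough times) for
# ten realizations; the expensive full model will run only for the reps.
proxy = [1.1, 1.3, 0.9, 5.2, 5.0, 5.4, 9.8, 10.1, 1.0, 9.9]

def kmeans_1d(values, centers, iters=20):
    """Plain 1-D k-means: assign to nearest center, recompute means."""
    for _ in range(iters):
        clusters = {i: [] for i in range(len(centers))}
        for v in values:
            j = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            clusters[j].append(v)
        centers = [statistics.fmean(c) if c else centers[i]
                   for i, c in clusters.items()]
    return centers, clusters

centers, clusters = kmeans_1d(proxy, centers=[1.0, 5.0, 10.0])

# One representative realization per cluster: the member nearest its centroid.
reps = [min(c, key=lambda v: abs(v - centers[i]))
        for i, c in clusters.items()]
```

Quantiles estimated from the three representatives (weighted by cluster size) then approximate the full ten-member ensemble at a fraction of the cost.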
Congress probes climate change uncertainties
NASA Astrophysics Data System (ADS)
Simarski, Lynn Teo
Policymakers are demanding information about climate change faster than it can be turned out by scientists. This conflict between politics and science was debated at a recent congressional hearing on priorities in global change research. On October 8 and 10, panels of scientists that included AGU president-elect Ralph J. Cicerone of the University of California attempted to identify scientific uncertainties in global warming research before the House Science Committee's Subcommittee on Science. "Decisionmakers provided with incomplete information are left with the problem of choosing among options where the consequences of a wrong choice could be disastrous," said subcommittee chair Rick Boucher (D-Va.).
BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance
NASA Astrophysics Data System (ADS)
Lira, Ignacio
2003-08-01
Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. 
The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker
A Bayesian Foundation for Individual Learning Under Uncertainty
Mathys, Christoph; Daunizeau, Jean; Friston, Karl J.; Stephan, Klaas E.
2011-01-01
Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory. 
PMID:21629826
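The trial-by-trial, precision-weighted update the abstract describes can be caricatured in a few lines. This is a single-level Kalman-style learner, not the full hierarchical model; the observations and noise level are illustrative:

```python
# Precision-weighted prediction-error learning: the learning rate k is the
# ratio of belief uncertainty to total (belief + observation) uncertainty.
def update(mu, sigma2, obs, noise2):
    k = sigma2 / (sigma2 + noise2)   # precision weighting of the error
    mu = mu + k * (obs - mu)         # prediction-error-driven belief update
    sigma2 = (1 - k) * sigma2        # belief uncertainty shrinks with data
    return mu, sigma2

mu, sigma2 = 0.0, 1.0                # vague initial belief
for obs in [1.0, 1.2, 0.8, 1.1, 0.9]:
    mu, sigma2 = update(mu, sigma2, obs, noise2=0.5)
```

Early trials move the belief a lot (high uncertainty, high learning rate); later trials move it little, which is the analytical, RL-interpretable behavior the framework generalizes hierarchically.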
Solving Navigational Uncertainty Using Grid Cells on Robots
Milford, Michael J.; Wiles, Janet; Wyeth, Gordon F.
2010-01-01
To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. 
We anticipate our study to be a starting point for animal experiments that test navigation in perceptually ambiguous environments. PMID:21085643
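The core idea, maintaining multiple probabilistic pose estimates until a unique cue resolves them, can be sketched with a toy discrete filter. The corridor, landmarks and weights are illustrative, not RatSLAM's actual representation:

```python
# Toy 1-D corridor: two landmarks look identical ("door"), one is unique.
landmarks = {2: "door", 6: "door", 9: "window"}

# Multiple hypotheses: candidate poses with weights.
poses = {x: 1.0 for x in range(10)}

def observe(poses, seen):
    """Keep only poses consistent with the observed landmark; renormalize."""
    poses = {x: w for x, w in poses.items() if landmarks.get(x) == seen}
    total = sum(poses.values())
    return {x: w / total for x, w in poses.items()}

def move(poses, step):
    """Path integration: shift every hypothesis by the odometry step."""
    return {x + step: w for x, w in poses.items()}

poses = observe(poses, "door")    # ambiguous: two hypotheses survive (2, 6)
poses = move(poses, 3)            # both propagated by path integration (5, 9)
poses = observe(poses, "window")  # unique cue resolves the ambiguity (9)
```

A filter that committed to a single estimate after the first "door" would have been wrong half the time; carrying both hypotheses forward lets the later unique observation pick the correct one, which is the role proposed for conjunctive grid cells.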
Cryogenic Equivalence Principle Experiment
NASA Technical Reports Server (NTRS)
Everitt, C. W. F.; Worden, P. W.
1985-01-01
The purpose of this project is to test the equivalence of inertial and passive gravitational mass in an Earth-orbiting satellite. A ground-based experiment is now well developed. It consists of comparing the motions of two cylindrical test masses suspended in precision superconducting magnetic bearings and free to move along the horizontal (axis) direction. The masses are made of niobium and lead-plated aluminum. A position detector based on a SQUID magnetometer measures the differential motion between the masses. The periods of the masses are matched by adjustment of the position detector until the system is insensitive to common mode signals, and so that the experiment is less sensitive to seismic vibration. The apparatus is contained in a twelve inch helium dewar suspended in a vibration isolation stand. The stand achieves 30 db isolation from horizontal motions between 0.1 and 60 Hz, by simulating the motion of a 200 meter long pendulum with an air bearing. With this attenuation of seismic noise and a common mode rejection ratio of 10 to the 5th power in the differential mode, the ground based apparatus should have a sensitivity to equivalence principle violations of one part in 10 to the 13th power; the satellite version might have a sensitivity of one part in 10 to the 17th power.
Magnetic Core Memory Principles
NSDL National Science Digital Library
Doherty, Frederico A.
A researcher from the Department of Physics and Astronomy at the University of Glasgow provides this website on Magnetic RAM (MRAM) -- a non-volatile memory storage system similar to Flash memory except that it uses less power and switches faster. Predicting that "2005 could see mass production of MRAM parts" to be used in powering instant-on computers and computers that are in stand-by power-savings mode (as is currently done with PDAs and laptops), the author reviews some of the physical challenges yet to be overcome. The website provides some basic information on magnetic memory and binary notation, as well as sections on: the Principle of the Magnetic Memory, The Rectangular Hysteresis Loop, A Magnetic Memory Element, Arrangement of Magnetic Core Memories, Relation between the Decimal and Binary Codes, How Numbers Are Stored in a Memory, How a Binary-Coded Decimal Digit is 'written in,' How a Digit is 'read out,' and a Complete Wiring Diagram of a Matrix Plane.
[Principles of wound treatment].
Bruhin, A; Metzger, J
2007-09-01
New techniques and devices have revolutionized the treatment of wounds in recent years. For the treatment of wounds we nowadays have a great variety of new devices, tools and methods. Complex wounds require specific skills, given that a large number of promising methods are on the market to enable optimal wound management. Well-educated "wound experts" are required to overcome very complicated and chronic wound problems. The importance of an interdisciplinary team increases when facing special wound disorders such as the diabetic foot, foot ulcers or the open abdomen in cases of severe peritonitis. In this overview the main principles of modern wound treatment are outlined. The aim of this article is to present a good summary of wound assessment and treatment for the practitioner. It is increasingly important to point out that complex wounds should be assessed and treated with the help of a "wound expert". PMID:18075140
Collaborative framework for PIV uncertainty quantification: the experimental database
NASA Astrophysics Data System (ADS)
Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio
2015-07-01
The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). 
Furthermore, this database offers the potential to be used for comparison of the measurement accuracy of existing or newly developed PIV interrogation algorithms. The database is publicly available on the website www.piv.de/uncertainty.
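The quantity the database is built around, the instantaneous discrepancy between the measurement system and the reference, reduces to simple error statistics. A sketch with hypothetical simultaneous velocity samples (the numbers are illustrative, not from the experiment):

```python
import math
import statistics

# Hypothetical simultaneous velocity samples (m/s): the system under test
# ("PIV-MS") vs. the high-dynamic-range reference ("PIV-HDR").
u_ms  = [4.98, 5.10, 4.85, 5.22, 5.05, 4.91]
u_ref = [5.00, 5.04, 4.90, 5.15, 5.02, 4.95]

# Instantaneous error, its bias (mean) and RMS: the quantities an
# uncertainty-quantification method should be able to predict.
err = [m - r for m, r in zip(u_ms, u_ref)]
bias = statistics.fmean(err)
rms = math.sqrt(statistics.fmean([e * e for e in err]))
```

An uncertainty method is then judged by how well its predicted error bars cover this empirically measured error distribution, point by point.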
Meaty Principles for Environmental Educators.
ERIC Educational Resources Information Center
Rockcastle, V. N.
1985-01-01
Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)
Ideario Educativo (Principles of Education).
ERIC Educational Resources Information Center
Consejo Nacional Tecnico de la Educacion (Mexico).
This document is an English-language abstract (approximately 1,500 words) which discusses an overall educational policy for Mexico based on Constitutional principles and those of humanism. The basic principles that should guide Mexican education as seen by the National Technical Council for Education are the following: (1) love of country; (2)…
ERIC Educational Resources Information Center
Ouellette, John
2004-01-01
Soccer coaches must understand the principles of play if they want to succeed. The principles of play are the rules of action that support the basic objectives of soccer and the foundation of a soccer coaching strategy. They serve as a set of permanent criteria that coaches can use to evaluate the efforts of their team. In this article, the author…
Multimedia Principle in Teaching Lessons
ERIC Educational Resources Information Center
Kari Jabbour, Khayrazad
2012-01-01
The multimedia learning principle applies when we create mental representations by combining text and relevant graphics in lessons. This article discusses the learning advantages that result from applying the multimedia learning principle to instruction, and how to select graphics that support learning. There is a balance that instructional designers…
Computational principles of movement neuroscience
Zoubin Ghahramani; Daniel M. Wolpert
2000-01-01
Unifying principles of movement have emerged from the computational study of motor control. We review several of these principles and show how they apply to processes such as motor planning, control, estimation, prediction and learning. Our goal is to demonstrate how specific models emerging from the computational approach provide a theoretical framework for movement neuroscience.
Biology 2250 Principles of Genetics
Innes, David J.
Biology 2250 Principles of Genetics. Instructors: Dr. Steven M. Carr - Molecular Genetics; Dr. David J. Innes - Mendelian Genetics. Biology 2250 Principles of Genetics Lab Instructor: Valerie Power. Genetics Laboratory: SN-4110 (lab organization meeting week of Sept. 13: Groups A & B). Lab. Demonstrators…
Uncertainty Quantification and Transdimensional Inversion
NASA Astrophysics Data System (ADS)
Sambridge, M.; Hawkins, R.
2014-12-01
Over recent years transdimensional inference methods have grown in popularity and found applications in fields ranging from solid Earth geophysics to geochemistry. In all applications of inversion, assumptions are made about the nature of the model parametrisation, complexity and data noise characteristics, and results can be significantly dependent on those assumptions. Often these are in the form of fixed choices imposed a priori, e.g. in the grid size of the model or the noise level in the data. A transdimensional approach allows these assumptions to be relaxed by incorporating relevant parameters as unknowns in the inference problem: e.g. the number of model parameters becomes a variable, as does the form of the basis functions and the variance of the data noise. In this way uncertainty due to parametrisation effects or data noise choices may be incorporated into the inference process. Probabilistic sampling techniques such as birth-death Markov chain Monte Carlo and the reversible jump algorithm allow sampling over complex posterior probability density functions, providing information on constraint, trade-offs and uncertainty in the unknowns. This talk will present a review of transdimensional inference and its application in geophysical inversion, and highlight some emerging trends such as multi-scale McMC, parallel tempering and sequential McMC, which hold the promise of further extending the range of problems where these methods are practical.
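One ingredient of this approach, treating the data-noise level itself as an unknown of the inference, can be illustrated in a stripped-down form (a fixed-dimension Metropolis sampler on synthetic data, not full reversible-jump):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic linear data with a noise level that the inference must recover.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(0.0, 0.3, x.size)

def log_post(m, log_sigma):
    # Gaussian likelihood; flat priors on m and log_sigma.
    sigma = np.exp(log_sigma)
    r = y - m * x
    return -0.5 * np.sum(r**2) / sigma**2 - x.size * log_sigma

# Metropolis sampling over (slope, log noise): the data-noise variance is
# an unknown of the problem, as in hierarchical/transdimensional schemes.
m, ls = 1.0, 0.0
lp = log_post(m, ls)
samples = []
for _ in range(20000):
    m_p, ls_p = m + rng.normal(0.0, 0.1), ls + rng.normal(0.0, 0.1)
    lp_p = log_post(m_p, ls_p)
    if np.log(rng.uniform()) < lp_p - lp:
        m, ls, lp = m_p, ls_p, lp_p
    samples.append((m, np.exp(ls)))

samples = np.array(samples[5000:])   # discard burn-in
m_mean, sigma_mean = samples.mean(axis=0)
print(f"slope ~ {m_mean:.2f}, inferred noise sigma ~ {sigma_mean:.2f}")
```

The posterior spread in the slope then reflects the uncertainty in the noise level rather than a fixed a priori choice.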
The legal status of uncertainty
NASA Astrophysics Data System (ADS)
Ferraris, L.; Miozzo, D.
2009-09-01
Authorities of civil protection attach extreme importance to scientific assessment through the widespread use of mathematical models implemented in order to prevent and mitigate the effects of natural hazards. These models, however, are far from deterministic; moreover, the uncertainty that characterizes them plays an important role in the scheme of prevention of natural hazards. We are, in fact, presently experiencing a detrimental increase in legal actions taken against the authorities of civil protection who, relying on the forecasts of mathematical models, fail in protecting the population. It is our profound concern that civilians have been granted the right to be protected, by any means and to the same extent, both from natural hazards and from the fallacious behaviour of those who should guarantee individual safety. But, at the same time, a dangerous overcriminalization could have a negative impact on the civil protection system, inducing a dangerous defensive behaviour which is costly and ineffective. A few case studies are presented in which the role of uncertainty in numerical predictions is made evident and discussed. Scientists, thus, need to help policymakers agree on sound procedures that recognize the real level of unpredictability. Hence, we suggest the creation of an international and interdisciplinary committee, with the scope of having politics, jurisprudence and science communicate, to find common solutions to a common problem.
Precautionary principle in international law.
Saladin, C
2000-01-01
The deregulatory nature of trade rules frequently brings them into conflict with the precautionary principle. These rules dominate debate over the content and legal status of the precautionary principle at the international level. The World Trade Organization (WTO), because of its power in settling disputes, is a key player. Many States are concerned to define the precautionary principle consistent with WTO rules, which generally means defining it as simply a component of risk analysis. At the same time, many States, especially environmental and public health policymakers, see the principle as the legal basis for preserving domestic and public health measures in the face of deregulatory pressures from the WTO. The precautionary principle has begun to acquire greater content and to move into the operative articles of legally binding international agreements. It is important to continue this trend. PMID:11114120
Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P. [Department of Medical Physics in Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg (Germany); Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg, Germany and Department of Radiation Oncology, University Clinic Heidelberg, 69120 Heidelberg (Germany); Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Medical Physics in Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg (Germany)
2012-04-15
Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors in the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: The method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution that is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of the dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric to the same variations. The approach was evaluated on patient data deformed according to a breathing model, for which the ground truth of the deformation, and hence the actual dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. The method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
Harper, F.T.; Young, M.L.; Miller, L.A.
1995-01-01
Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.
James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz
2009-01-01
The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
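The linear part of such an analysis reduces to propagating a calibrated parameter covariance through prediction sensitivities; a minimal sketch with made-up numbers:

```python
import numpy as np

# Hypothetical linearized model: prediction sensitivity vector j and
# calibrated parameter covariance C (assumed diagonal, illustrative values).
C = np.diag([0.04, 0.25, 0.01])   # parameter variances
j = np.array([2.0, 0.5, 3.0])     # d(prediction)/d(parameter)

var_pred = j @ C @ j              # total predictive variance
contrib = j**2 * np.diag(C)       # per-parameter contribution (diagonal C)

print("predictive std:", np.sqrt(var_pred))
print("fractional contributions:", contrib / var_pred)
```

The per-parameter contributions identify which properties, and hence which observations informing them, would most effectively reduce predictive uncertainty.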
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
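The propagation of individual measurement uncertainties through a defining functional expression can be sketched as follows (a first-order Taylor-series propagation with illustrative quantities, not the paper's calibration data):

```python
import numpy as np

# Example functional expression: dynamic pressure q = 0.5 * rho * V**2
# (all values illustrative).
rho, u_rho = 1.225, 0.005      # density [kg/m^3] and its standard uncertainty
V, u_V = 50.0, 0.2             # velocity [m/s] and its standard uncertainty

q = 0.5 * rho * V**2

# Sensitivity coefficients (partial derivatives):
dq_drho = 0.5 * V**2
dq_dV = rho * V

# Root-sum-square combination for independent error sources:
u_q = np.sqrt((dq_drho * u_rho)**2 + (dq_dV * u_V)**2)
U95 = 1.96 * u_q               # expanded uncertainty at ~95% confidence

print(f"q = {q:.1f} Pa, u_q = {u_q:.2f} Pa, U95 = {U95:.2f} Pa")
```

Bias and precision components would each be propagated this way and then combined, per the paper's framework.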
NASA Astrophysics Data System (ADS)
Beeler, N. M.
Imagine for a moment that you are a field structural geologist, and you have just realized that your star graduate student does not know how to estimate the failure strength of intact rock at 10 km depth in a normal faulting environment. Or perhaps you are a geophysicist with graduate students modeling mantle convection who, as you come to find out, do not know what a dislocation is. You might decide that your students need to take a course in basic rock mechanics, but, and this may be easiest to imagine, you are the only staff member in your department available to teach such a course. If you are developing an introductory course in rock mechanics or you have been teaching such a course without a suitable text, this new book by Ruud Weijermars was written specifically for you and your students. Principles of Rock Mechanics is a textbook for a one-semester course for graduate students and advanced undergraduates. There are 13 chapters, a math review section, and the obligatory introduction and final overview chapters. Each chapter is designed to be covered in two 50-minute lectures and one laboratory session. Following a formal introduction to the topic, the subsequent seven chapters serve as an introduction to the physical concepts and processes: physical quantities in rock mechanics, force and pressure, stress, elasticity, brittle failure, and ductile creep, taking the students to midterm. An unusual and welcome feature appears at the midsemester point: a math review of notation and associated concepts, covering differentiation of vectors and scalars, differential equations, tensors, matrices and determinants, and complex variables. This review provides an indication of the rigor to follow.
Principles of animal extrapolation
Calabrese, E.J.
1991-01-01
Animal Extrapolation presents a comprehensive examination of the scientific issues involved in extrapolating results of animal experiments to human response. This text attempts to present a comprehensive synthesis and analysis of the host of biomedical and toxicological studies of interspecies extrapolation. Calabrese's work presents not only the conceptual basis of interspecies extrapolation, but also illustrates how these principles may be better used in the selection of animal experimentation models and in the interpretation of animal experimental results. The book's theme centers on four types of extrapolation: (1) from the average animal model to the average human; (2) from small animals to large ones; (3) from the high-risk animal to the high-risk human; and (4) from high doses of exposure to lower, more realistic, doses. Calabrese attacks the issues of interspecies extrapolation by dealing individually with the factors which contribute to interspecies variability: differences in absorption, intestinal flora, tissue distribution, metabolism, repair mechanisms, and excretion. From this foundation, Calabrese then discusses the heterogeneity of these same factors in the human population in an attempt to evaluate the representativeness of various animal models in light of interindividual variations. In addition to discussing the question of suitable animal models for specific high-risk groups and specific toxicological endpoints, the author also examines extrapolation questions related to the use of short-term tests to predict long-term human carcinogenicity and birth defects. The book is comprehensive in scope and specific in detail; for those environmental health professionals seeking to understand the toxicological models which underlie health risk assessments, Animal Extrapolation is a valuable information source.
Physical Principles of Mammography
NASA Astrophysics Data System (ADS)
Dance, David R.
An outline is given of the underlying physical principles that govern the selection and use of systems for X-ray mammography. Particular attention is paid to screen-film mammography, as some aspects of digital mammography are considered in another lecture. The size and composition of the compressed female breast and of calcifications are described and the magnitude of photon interaction processes in breast tissues discussed. The physical performance measures contrast, unsharpness, dose, noise and dynamic range are outlined and used in a treatment of the various components of the mammographic system. The selection of photon energy is a compromise between contrast and/or signal-to-noise ratio on the one hand, and breast dose on the other. For screen-film imaging the contrast achieved is considered to be the most important image measure, and the performances of different mammographic target/filter combinations (including Mo/Mo, Mo/Rh, Rh/Rh and W/Rh) are compared on this basis. For digital imaging, the signal-to-noise ratio is the most important image measure, and the optimal X-ray spectra are then different from those for screen-film mammography. The relationship between image unsharpness and focal spot size and image magnification is explored. The importance of breast compression is stressed and the advantages of compression listed. The contrast in the image is degraded by scattered photons recorded by the image receptor; the magnitude of this effect and the reduction achievable using mammographic anti-scatter grids are considered. The performance of mammographic screen-film receptors is described and analyzed, paying attention to unsharpness, noise and receptor DQE.
Hydrotectonics; principles and relevance
Kopf, R.W.
1982-01-01
Hydrotectonics combines the principles of hydraulics and rock mechanics. The hypothesis assumes that: (1) no faults are truly planar, (2) opposing noncongruent wavy wallrock surfaces form chambers and bottlenecks along the fault, and (3) most thrusting occurs beneath the water table. These physical constraints permit the following dynamics. Shear displacement accompanying faulting must constantly change the volume of each chamber. Addition of ground water liquefies dry fault breccia to a heavy incompressible viscous muddy breccia I call fault slurry. When the volume of a chamber along a thrust fault decreases faster than its fault slurry can escape laterally, overpressurized slurry is hydraulically injected into the base of near-vertical fractures in the otherwise impervious overriding plate. Breccia pipes commonly form where such fissures intersect. Alternating decrease and increase in volume of the chamber subjects this injection slurry to reversible surges that not only raft and abrade huge clasts sporadically spalled from the walls of the conduit but also act as a forceful hydraulic ram which periodically widens the conduit and extends its top. If the pipe perforates a petroleum reservoir, leaking hydrocarbons float to its top. Sudden faulting may generate a powerful water hammer that can be amplified at some distal narrow ends of the anastomosing plumbing system, where the shock may produce shatter cones. If vented on the Earth's surface, the muddy breccia, now called extrusion slurry, forms a mud volcano. This hypothesis suggests that many highly disturbed features presently attributed to such catastrophic processes as subsurface explosions or meteorite impacts are due to the rheology of tectonic slurry in an intermittently reactivated pressure-relief tube rooted in a powerful reciprocating hydrotectonic pump activated by a long-lived deep-seated thrust fault.
Principles of ecosystem sustainability
Chapin, F.S. III; Torn, M.S.; Tateno, Masaki
1996-12-01
Many natural ecosystems are self-sustaining, maintaining a characteristic mosaic of vegetation types for hundreds to thousands of years. In this article we present a new framework for defining the conditions that sustain natural ecosystems and apply these principles to the sustainability of managed ecosystems. A sustainable ecosystem is one that, over the normal cycle of disturbance events, maintains its characteristic diversity of major functional groups, productivity, and rates of biogeochemical cycling. These traits are determined by a set of four “interactive controls” (climate, soil resource supply, major functional groups of organisms, and disturbance regime) that both govern and respond to ecosystem processes. Ecosystems cannot be sustained unless the interactive controls oscillate within stable bounds. This occurs when negative feedbacks constrain changes in these controls. For example, negative feedbacks associated with food availability and predation often constrain changes in the population size of a species. Linkages among ecosystems in a landscape can contribute to sustainability by creating or extending the feedback network beyond a single patch. The sustainability of managed systems can be increased by maintaining interactive controls so that they form negative feedbacks within ecosystems and by using laws and regulations to create negative feedbacks between ecosystems and human activities, such as between ocean ecosystems and marine fisheries. Degraded ecosystems can be restored through practices that enhance positive feedbacks to bring the ecosystem to a state where the interactive controls are commensurate with desired ecosystem characteristics. The possible combinations of interactive controls that govern ecosystem traits are limited by the environment, constraining the extent to which ecosystems can be managed sustainably for human purposes. 111 refs., 3 figs., 2 tabs.
Weak Equivalence Principle Test on a Sounding Rocket
James D. Phillips; Bijunath R. Patla; Eugeniu M. Popescu; Emanuele Rocco; Rajesh Thapa; Robert D. Reasenberg; Enrico C. Lorenzini
2010-08-04
SR-POEM, our principle-of-equivalence measurement on a sounding rocket, will compare the free-fall rates of two substances, yielding an uncertainty of 10⁻¹⁶ in the estimate of the Eötvös parameter η. During the past two years, the design concept has matured and we have been working on the required technology, including a laser gauge that is self-aligning and able to reach 0.1 pm per root hertz for periods up to 40 s. We describe the status of and plans for this project.
The Travelling Wave Group - 5 Departures from Dirac's Principles
NASA Astrophysics Data System (ADS)
Bourdillon, Antony J.
2014-03-01
The Traveling Wave Group (TWG) for a free particle is written ψ = A(X²/2σ² + X). Here, X = i(kx − ωt), σ is an experimental initial value, with A a normalizing constant dependent on it, while ω is the mean angular frequency, and…
Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods
Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.
2013-12-01
The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15 percent in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20 percent. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations, CILTS should be considered as having something like a ‘factor of two’ uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
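A first-principles error budget for a steady-state tracer balance can be sketched as follows (the A = E/(C·V) balance and all numbers are illustrative assumptions, not the paper's field data):

```python
import numpy as np

# Steady-state tracer mass balance: air exchange rate A = E / (C * V), with
# tracer emission rate E, measured concentration C, and house volume V
# (illustrative values and relative uncertainties).
E, rel_E = 5.0e-6, 0.05      # emission rate [m^3/h], 5% relative uncertainty
C, rel_C = 4.0e-8, 0.10      # concentration [volume fraction], 10%
V, rel_V = 250.0, 0.10       # house volume [m^3], 10%

A = E / (C * V)              # air changes per hour

# For a pure product/quotient, relative uncertainties add in quadrature:
rel_A = np.sqrt(rel_E**2 + rel_C**2 + rel_V**2)

print(f"A = {A:.2f} 1/h, relative uncertainty = {rel_A:.1%}")
```

With these assumed inputs the combined relative uncertainty comes out at 15 percent, in the range the abstract quotes for favorable conditions; bias from assumption violations (e.g. imperfect mixing) would add on top of this.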
Position-Momentum Uncertainty Relations in the Presence of Quantum Memory
Fabian Furrer; Mario Berta; Marco Tomamichel; Volkher B. Scholz; Matthias Christandl
2015-01-05
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
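The reduction of uncertainty by an entangled memory can be checked directly in the finite-dimensional qubit case (a small numerical sketch: for complementary X/Z measurements on one half of a maximally entangled state, Berta et al.'s bound log₂(1/c) + H(A|B) equals 1 + (−1) = 0, and both conditional entropies indeed vanish):

```python
import numpy as np

def vn_entropy(rho):
    # von Neumann entropy in bits, ignoring near-zero eigenvalues.
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Maximally entangled two-qubit state |Phi+> = (|00> + |11>)/sqrt(2).
phi = np.zeros(4)
phi[0] = phi[3] = 2 ** -0.5
rho_AB = np.outer(phi, phi)

I2 = np.eye(2)
Zproj = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]        # Z-basis projectors
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)
Xproj = [np.outer(plus, plus), np.outer(minus, minus)]    # X-basis projectors

def cond_entropy(projs):
    # H(outcome | B) after measuring A: entropy of the classical-quantum
    # post-measurement state, H(XB) - H(B).
    blocks = []
    for P in projs:
        M = np.kron(P, I2)
        b = (M @ rho_AB @ M).reshape(2, 2, 2, 2)
        blocks.append(b.trace(axis1=0, axis2=2))   # partial trace over A
    H_XB = sum(vn_entropy(b) for b in blocks)      # cq state is block-diagonal
    H_B = vn_entropy(sum(blocks))
    return H_XB - H_B

H_ZB = cond_entropy(Zproj)
H_XB = cond_entropy(Xproj)
# Without memory the Z/X entropy sum is bounded below by 1 bit; with a
# maximally entangled memory the bound, and the sum itself, drops to 0:
print(f"H(Z|B) + H(X|B) = {H_ZB + H_XB:.3f}")
```

The infinite-dimensional position-momentum case of the paper replaces these finite sums with entropies of discretized or continuous outcome distributions, but the mechanism is the same.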
Spectral optimization and uncertainty quantification in combustion modeling
NASA Astrophysics Data System (ADS)
Sheen, David Allan
Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. 
Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates was recently published wherein the experimentally obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that reproduces the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered. Lastly, we show that while the capability of multispecies measurement presents a step change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data on global combustion properties are still necessary to predict fuel combustion.
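The spectral machinery behind a polynomial chaos expansion can be illustrated on a toy problem (a generic PCE sketch, not the authors' combustion implementation): expand f(x) = exp(x) for a standard normal uncertain parameter in probabilists' Hermite polynomials and read the mean and variance off the coefficients.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

order = 8
nodes, weights = hermegauss(40)          # Gauss-Hermite_e quadrature
weights = weights / np.sqrt(2 * np.pi)   # normalize to the N(0,1) measure

def coeff(k):
    # Project f onto He_k: c_k = E[f(x) He_k(x)] / k!
    basis = np.zeros(k + 1)
    basis[k] = 1.0
    Hk = hermeval(nodes, basis)
    return np.sum(weights * np.exp(nodes) * Hk) / factorial(k)

c = np.array([coeff(k) for k in range(order + 1)])
mean_pce = c[0]                                      # mean is the 0th coefficient
var_pce = sum(c[k]**2 * factorial(k) for k in range(1, order + 1))

print(f"PCE mean {mean_pce:.4f} (exact {np.exp(0.5):.4f}), "
      f"PCE var {var_pce:.4f} (exact {np.e * (np.e - 1):.4f})")
```

In the kinetics application the uncertain parameters are the reaction rate coefficients, and the same coefficient algebra propagates their (optimized) uncertainty into predicted ignition delays and flame speeds.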
ERIC Educational Resources Information Center
Cole, Charles; Cantero, Pablo; Sauve, Diane
1998-01-01
Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…
Capturing the uncertainty in adversary attack simulations.
Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce
2008-09-01
This work provides a comprehensive technique to evaluate uncertainty in PI (the probability of adversary interruption), resulting in a more realistic evaluation of PI, thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state-of-knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated-variable dependence in the equation for PI.
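A minimal version of such a sampling scheme might look like this (a hypothetical three-layer path with assumed distributions, not the report's actual model):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 3-layer adversary path (all numbers illustrative): each layer
# has an aleatory detection probability (Beta) and an aleatory delay
# downstream of it (lognormal, seconds). The adversary is interrupted if the
# first detection leaves more remaining delay than the response-force time.
n = 100_000
p_det = rng.beta([8.0, 5.0, 9.0], [2.0, 5.0, 1.0], size=(n, 3))
delays = rng.lognormal(mean=[3.0, 3.5, 2.5], sigma=0.4, size=(n, 3))
response_time = 60.0          # assumed response-force time, seconds

detected = rng.uniform(size=(n, 3)) < p_det
interrupted = np.zeros(n, dtype=bool)
already_detected = np.zeros(n, dtype=bool)
for k in range(3):
    first_here = detected[:, k] & ~already_detected
    remaining_delay = delays[:, k:].sum(axis=1)
    interrupted |= first_here & (remaining_delay > response_time)
    already_detected |= detected[:, k]

P_I = interrupted.mean()
print(f"P_I ~ {P_I:.3f}")
```

Sampling the detection probabilities and delays per trial, rather than fixing them at point values, is what carries the aleatory uncertainty into the distribution of PI.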
Incorporating Forecast Uncertainty in Utility Control Center
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2014-07-09
Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch, and market operation. There are other sources of uncertainty, such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities, and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near-real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).
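A minimal version of an uncertainty model for net load (load minus wind) can be sketched by propagating forecast errors through Monte Carlo. The forecast values and error standard deviations below are hypothetical, and independent Gaussian errors are an assumption made only for illustration:

```python
import random

random.seed(1)

# Hypothetical hour-ahead forecasts and forecast-error std devs, in MW.
load_fc, load_err = 1000.0, 30.0
wind_fc, wind_err = 200.0, 50.0

# Monte Carlo over forecast errors to get the net-load distribution.
samples = sorted(
    (load_fc + random.gauss(0, load_err)) - (wind_fc + random.gauss(0, wind_err))
    for _ in range(50000)
)

# Balancing requirement: central 95% interval of net load.
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(round(lo), round(hi))
```

The width of the resulting interval (rather than a single deterministic net-load number) is the kind of quantity a probabilistic operations tool would hand to the operator for reserve sizing.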
Notes on the effect of dose uncertainty
Morris, M.D.
1987-01-01
The apparent dose-response relationship between amount of exposure to acute radiation and level of mortality in humans is affected by uncertainties in the dose values. It is apparent that one of the greatest concerns regarding the human data from Hiroshima and Nagasaki is the unexpectedly shallow slope of the dose response curve. This may be partially explained by uncertainty in the dose estimates. Some potential effects of dose uncertainty on the apparent dose-response relationship are demonstrated.
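The shallow-slope effect has a standard statistical mechanism: classical measurement error in the doses attenuates a fitted linear slope toward zero. The simulation below is a hypothetical illustration of that mechanism, not a reconstruction of the Hiroshima/Nagasaki analysis; all numbers are invented.

```python
import random

random.seed(2)

# Hypothetical true linear dose-response: response = 0.5 * dose + noise.
true_slope = 0.5
doses = [random.uniform(0, 10) for _ in range(5000)]
responses = [true_slope * d + random.gauss(0, 0.2) for d in doses]

# Observed doses carry classical measurement error (e.g. dosimetry uncertainty).
observed = [d + random.gauss(0, 3.0) for d in doses]

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Regressing on noisy doses flattens the slope (attenuation bias).
print(round(ols_slope(doses, responses), 2), round(ols_slope(observed, responses), 2))
```

With these numbers the attenuation factor is var(dose)/(var(dose) + var(error)) ≈ 0.48, so the apparent slope is roughly half the true one, which is exactly the kind of flattening the note describes.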
Measuring the uncertainty of coupling
NASA Astrophysics Data System (ADS)
Zhao, Xiaojun; Shang, Pengjian
2015-06-01
A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.
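The ordinal-pattern machinery that such permutation-based measures build on can be sketched with plain permutation entropy (Bandt-Pompe). This is only the common building block, not the coupling-entropy construction itself, which additionally encodes the alignment between two series:

```python
import math
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized permutation entropy in [0, 1].

    Counts ordinal patterns of length `order` over the series: a fully
    predictable series scores near 0, an irregular one near 1.
    """
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: the ranking of positions by value (stable on ties).
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))

# A monotone series has a single pattern; an alternating one has exactly two.
print(round(permutation_entropy([0, 1] * 50), 3))  # ln 2 / ln 6 ≈ 0.387
```

An asymmetric association measure is then obtained by comparing how the ordinal structure of one series is echoed in the other, which is where the directionality (e.g. CPI change driving unemployment) comes from.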
Spreadsheets generate reservoir uncertainty distributions
Murtha, J.A. (Murtha (James A.), Houston, TX (United States)); Janusz, G.J. (Univ. of Illinois, Urbana (United States))
1995-03-13
Spreadsheets can quickly generate and graph normal, lognormal, and triangular distributions for analyzing prospects or problems involving uncertainty in the oil and gas industry. Monte Carlo simulation input distributions are specified in different ways, often by assigning the 10th and 90th percentiles, P10 and P90. Selecting the appropriate distribution (such as a normal, lognormal, or triangular distribution) and justifying that choice is critical to the believability of the model. The authors will show procedures for calculating parameters for normal, lognormal, and triangular distributions with a spreadsheet, and outline the proof that you always get a unique answer for a triangular distribution. The significance of the procedure is to avoid P10 and P90 inputs that yield a triangular distribution whose end points are clearly inappropriate. One such case would be if negative values are sampled to estimate net pay.
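The normal and lognormal cases of this percentile-to-parameter procedure are closed-form and easy to sketch (the triangular case requires a numerical solve and is not shown). The example percentiles below are hypothetical:

```python
import math

Z90 = 1.2815515655  # standard-normal quantile of the 90th percentile

def normal_from_p10_p90(p10, p90):
    """Mean and standard deviation of the normal with these P10/P90."""
    mean = (p10 + p90) / 2
    sd = (p90 - p10) / (2 * Z90)
    return mean, sd

def lognormal_from_p10_p90(p10, p90):
    """Parameters (mu, sigma) of ln X for the lognormal with these P10/P90."""
    return normal_from_p10_p90(math.log(p10), math.log(p90))

# Hypothetical net-pay percentiles, in feet:
mean, sd = normal_from_p10_p90(20.0, 80.0)
print(round(mean, 1), round(sd, 1))  # → 50.0 23.4
```

Note the pitfall the authors flag: a normal fit with these percentiles assigns nonzero probability to negative net pay, which is one reason the choice of distribution needs explicit justification.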
Induction of models under uncertainty
NASA Technical Reports Server (NTRS)
Cheeseman, Peter
1986-01-01
This paper outlines a procedure for performing induction under uncertainty. The procedure uses a probabilistic representation and applies Bayes' theorem to decide between alternative hypotheses (theories). It is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions, for both the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; in particular, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one, aimed at keeping the theory structure as simple as possible while still reflecting any structure discovered in the data.
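The Bayes-theorem selection step can be sketched with a minimal two-hypothesis example. The hypotheses and numbers are hypothetical, chosen only to show the mechanics of updating a prior over theories with data:

```python
# Minimal Bayes-rule choice between two hypotheses (all numbers hypothetical).
# H_fair: a coin is fair; H_biased: it lands heads with probability 0.8.
priors = {"fair": 0.5, "biased": 0.5}
p_heads = {"fair": 0.5, "biased": 0.8}

def posterior(observations):
    """Posterior over hypotheses after a sequence of 'H'/'T' observations."""
    post = dict(priors)
    for obs in observations:
        for h in post:
            # Multiply in the likelihood of this observation under hypothesis h.
            post[h] *= p_heads[h] if obs == "H" else 1 - p_heads[h]
    total = sum(post.values())
    return {h: v / total for h, v in post.items()}

post = posterior("HHHHTHHH")  # 7 heads, 1 tail
print(max(post, key=post.get), round(post["biased"], 3))  # → biased 0.915
```

As in the paper's setting, the output is not a hard classification but a probability over theories, so class membership remains inherently probabilistic.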
NASA Astrophysics Data System (ADS)
Bellac, Michel Le
2014-11-01
At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle submitted to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces in mechanics a quantity characteristic of optics, the velocity of light. 
However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".
Least Action Principle in Gait
Fan, Yifang; Fan, Yubo; Xu, Zongxiang; Li, Zhiyu; Luo, Donglin
2009-01-01
We analyze the vertical ground reaction force of human gait and find evidence of a least action principle at work in walking. Using a capacitive-mat transducer system, we record the variations of the vertical ground reaction force and establish a structure equation for the resultant of that force. Defining the deviation of the vertical force as an action function, our gait optimization analysis reveals the least action principle at half of the stride time. We develop an evaluation index of mechanical energy consumption based upon the least action principle in gait, and conclude that these observations can be employed to enhance the accountability of gait evaluation.
OECD Principles of Corporate Governance
NSDL National Science Digital Library
The "Organisation for Economic Co-operation and Development Principles of Corporate Governance" sets out a structure for directing and controlling corporate businesses. This document (html or .pdf) consists of five sections detailing the principles: "The rights of shareholders," "The equitable treatment of shareholders," "The role of stakeholders in corporate governance," "Disclosure and transparency," and "The responsibilities of the board," as well as annotations for each of the sections. Be sure to visit the OECD Principles of Corporate Governance Q&A page, linked at the top of the page.
Principles of Pharmacotherapy: I. Pharmacodynamics
Pallasch, Thomas J.
1988-01-01
This paper and the ensuing series present the principles guiding and affecting the ability of drugs to produce therapeutic benefit or untoward harm. The principles of pharmacodynamics and pharmacokinetics, the physiologic basis of adverse drug reactions and suitable antidotal therapy, and the biologic basis of drug allergy, drug-drug interactions, pharmacogenetics, teratology and hematologic reactions to chemicals are explored. These principles serve to guide those administering and using drugs to attain the maximum benefit and least attendant harm from their use. Such is the goal of rational therapeutics. PMID:3046440
Heisenberg Uncertainty Relation for Three Canonical Observables
Spiros Kechrimparis; Stefan Weigert
2014-12-23
Uncertainty relations provide fundamental limits on what can be said about the properties of quantum systems. For a quantum particle, the commutation relation of position and momentum observables entails Heisenberg's uncertainty relation. A third observable is presented which satisfies canonical commutation relations with both position and momentum. The resulting triple of pairwise canonical observables gives rise to a Heisenberg-type uncertainty relation for the product of three standard deviations. We derive the smallest possible value of this bound and determine the specific squeezed state which saturates the triple uncertainty relation. Quantum optical experiments are proposed to verify our findings.
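The triple of pairwise canonical observables can be made concrete. One choice consistent with the abstract (our own sketch and normalization, not necessarily the paper's exact convention) completes position and momentum with their negated sum:

```latex
\hat r \;\equiv\; -\,\hat q - \hat p, \qquad
[\hat q,\hat p] \;=\; [\hat p,\hat r] \;=\; [\hat r,\hat q] \;=\; i\hbar ,
```

since $[\hat p,\,-\hat q-\hat p] = -[\hat p,\hat q] = i\hbar$ and $[-\hat q-\hat p,\,\hat q] = -[\hat p,\hat q] = i\hbar$. Multiplying the three pairwise Robertson bounds only gives $\Delta q\,\Delta p\,\Delta r \ge (\hbar/2)^{3/2}$; the paper's contribution is the smallest constant actually attainable, which is strictly larger than this naive value, together with the squeezed state that saturates it.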
Modeling uncertainty: quicksand for water temperature modeling
Bartholow, John M.
2003-01-01
Uncertainty has been a hot topic relative to science generally, and to modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, the modelers themselves, and the users of the results. This paper addresses important components of uncertainty in modeling water temperatures, and discusses several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.