Sample records for uncertainty principle

  1. The Conditional Uncertainty Principle

    E-print Network

    Gilad Gour; Varun Narasimhachar; Andrzej Grudka; Michał Horodecki; Waldemar Kłobus; Justyna Łodyga

    2015-06-23

    The uncertainty principle, which states that certain sets of quantum-mechanical measurements have a minimal joint uncertainty, has many applications in quantum cryptography. But in such applications, it is important to consider the effect of a (sometimes adversarially controlled) memory that can be correlated with the system being measured: The information retained by such a memory can in fact diminish the uncertainty of measurements. Uncertainty conditioned on a memory was considered in the past by Berta et al. (Ref. 1), who found a specific uncertainty relation in terms of the von Neumann conditional entropy. But this entropy is not the only measure that can be used to quantify conditional uncertainty. In the spirit of recent work by several groups (Refs. 2--6), here we develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent form. Our formalism is built around a mathematical relation that we call conditional majorization. We define and characterize conditional majorization, and use it to develop tools for the construction of measures of the conditional uncertainty of individual measurements, and also of the joint conditional uncertainty of sets of measurements. We demonstrate the use of this framework by deriving measure-independent conditional uncertainty relations of two types: (1) A lower bound on the minimal joint uncertainty that two remote parties (Bob and Eve) have about the outcome of a given pair of measurements performed by a third remote party (Alice), conditioned on arbitrary measurements that Bob and Eve make on their own systems. This lower bound is independent of the initial state shared by the three parties; (2) An initial state--dependent lower bound on the minimal joint uncertainty that Bob has about Alice's pair of measurements in a bipartite setting, conditioned on Bob's quantum system.
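
    For reference, the memory-assisted uncertainty relation of Berta et al. cited in this abstract (Ref. 1) can be written, in standard notation, as

      $H(X|B) + H(Z|B) \ge \log_2(1/c) + H(A|B), \qquad c = \max_{j,k} |\langle x_j | z_k \rangle|^2,$

    where $H(\cdot|B)$ is the conditional von Neumann entropy, $X$ and $Z$ are the outcomes of two measurements on Alice's system $A$, $B$ is the (possibly adversarial) memory, and $c$ is the maximal overlap between the two measurement bases. The framework above generalizes such statements to a measure-independent form.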

  2. Uncertainty Principle Respects Locality

    E-print Network

    Dongsheng Wang

    2015-04-19

    The notion of nonlocality implicitly implies that there might be some kind of spooky action at a distance in nature; however, the validity of quantum mechanics has been well tested up to now. In this work it is argued that the notion of nonlocality is physically improper, and that the basic principle of locality in nature is well respected by quantum mechanics, namely through the uncertainty principle. We show that the quantum bound on the Clauser, Horne, Shimony, and Holt (CHSH) inequality can be recovered from the uncertainty relation in a multipartite setting. We further argue that the super-quantum correlation demonstrated by the nonlocal box is not physically comparable with the quantum one. The origin of the quantum structure of nature still remains to be explained; some post-quantum theory that is more complete in some sense than quantum mechanics is possible and need not be a hidden-variable theory.
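
    For orientation, the CHSH quantity referred to here is

      $S = \langle A_1 B_1 \rangle + \langle A_1 B_2 \rangle + \langle A_2 B_1 \rangle - \langle A_2 B_2 \rangle,$

    where $A_i$ and $B_j$ are dichotomic ($\pm 1$) observables: local hidden-variable models obey $|S| \le 2$, quantum mechanics obeys Tsirelson's bound $|S| \le 2\sqrt{2}$ (the "quantum bound" recovered here from the uncertainty relation), and the nonlocal (PR) box reaches $|S| = 4$.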

  3. Greedy Signal Recovery and Uniform Uncertainty Principles

    E-print Network

    Needell, Deanna

    Presentation slides (SIAM, July 2008): "Greedy Signal Recovery and Uniform Uncertainty Principles," Deanna Needell, joint work with Roman Vershynin, UC Davis; the outline covers empirical results and improvements.

  4. Greedy Signal Recovery and Uniform Uncertainty Principles

    E-print Network

    Needell, Deanna

    Presentation slides (SPIE - IE 2008, January 2008): "Greedy Signal Recovery and Uniform Uncertainty Principles," Deanna Needell, joint work with Roman Vershynin, UC Davis; the outline covers the main theorem, empirical results, and future work.

  5. Quantum mechanics and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bang, Jang Young; Berger, Micheal S.

    2006-12-01

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
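
    For background, the generalized uncertainty principle discussed here is usually written as the deformed relation

      $\Delta x \, \Delta p \ge \frac{\hbar}{2}\left(1 + \beta (\Delta p)^2\right),$

    which, minimized over $\Delta p$, gives a smallest attainable position uncertainty $\Delta x_{\min} = \hbar \sqrt{\beta}$. This is the standard one-parameter form; the specific discrete-eigenvalue model of the paper may differ in detail.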

  6. Quantum Mechanics and the Generalized Uncertainty Principle

    E-print Network

    Jang Young Bang; Micheal S. Berger

    2006-11-30

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.

  7. Gamma-Ray Telescope and Uncertainty Principle

    ERIC Educational Resources Information Center

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  8. An uncertainty principle for unimodular quantum groups

    E-print Network

    Jason Crann; Mehrdad Kalantar

    2014-11-02

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to normal central states of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the central state.

  9. An uncertainty principle for unimodular quantum groups

    SciTech Connect

    Crann, Jason [School of Mathematics and Statistics, Carleton University, Ottawa, Ontario K1S 5B6 (Canada); Université Lille 1 - Sciences et Technologies, UFR de Mathématiques, Laboratoire de Mathématiques Paul Painlevé - UMR CNRS 8524, 59655 Villeneuve d'Ascq Cédex (France); Kalantar, Mehrdad, E-mail: jason-crann@carleton.ca, E-mail: mkalanta@math.carleton.ca [School of Mathematics and Statistics, Carleton University, Ottawa, Ontario K1S 5B6 (Canada)

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.

  10. The Uncertainty Principle for dummies Rahul Siddharthan

    E-print Network

    Siddharthan, Rahul

    Fragmentary excerpt: the uncertainty principle is confusing for the beginning student, so the article clarifies it with the example of spins, whose intrinsic angular momentum ("spin") is treated while their spatial coordinates are ignored, in contrast with the usual statement of the uncertainty principle in terms of position and momentum and one's confusion with the classical versions.

  11. Quantum measurement theory and the uncertainty principle

    E-print Network

    Masanao Ozawa

    2015-07-08

    Heisenberg's uncertainty principle states that canonically conjugate observables can only be simultaneously measured under the constraint that the product of their mean errors should be no less than a limit set by Planck's constant. Heisenberg claimed that this is a straightforward mathematical consequence of basic postulates for quantum mechanics. However, Heisenberg with the subsequent completion by Kennard has long been credited only with a well-known constraint for the product of the standard deviations. Here we examine Heisenberg's original derivation of the uncertainty principle and show that Heisenberg actually derived the above mentioned constraint for simultaneous measurements but using an obsolete postulate for quantum mechanics. This assumption, known as the repeatability hypothesis, or its approximate measurement version, formulated explicitly by von Neumann and Schrödinger, was broadly accepted until the 1970s, whereas it was abandoned in the 1980s, when completely general quantum measurement theory was established. We also survey recent investigations to establish the universally valid reformulation of Heisenberg's uncertainty principle under this general theory of quantum measurement.
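
    For reference, the universally valid error-disturbance relation at the center of this reformulation (Ozawa's inequality) reads

      $\varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \ge \tfrac{1}{2}\,|\langle [A,B] \rangle|,$

    where $\varepsilon(A)$ is the root-mean-square error of the $A$ measurement, $\eta(B)$ the disturbance it causes on $B$, and $\sigma$ the standard deviation in the pre-measurement state.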

  12. Generalized Uncertainty Principle: Approaches and Applications

    E-print Network

    Abdel Nasser Tawfik; Abdel Magied Diab

    2014-11-23

    We review highlights from string theory, black hole physics and doubly special relativity, and some "thought" experiments which were suggested to probe the shortest distance and/or the maximum momentum at the Planck scale. The models designed to implement the minimal length scale and/or the maximum momentum in different physical systems have entered the literature as the Generalized Uncertainty Principle (GUP); we compare between them. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Furthermore, assuming a modified dispersion relation allows for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, the Salecker-Wigner inequalities, the entropic nature of the gravitational laws, the Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. Another one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of composite systems; concerns about compatibility with these principles should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

  13. Quantum Randomness Certified by the Uncertainty Principle

    E-print Network

    G. Vallone; D. Marangon; M. Tomasin; P. Villoresi

    2014-12-22

    We present an efficient method to extract the amount of true randomness that can be obtained by a Quantum Random Number Generator (QRNG). By repeating the measurements of a quantum system and by swapping between two mutually unbiased bases, a lower bound on the achievable true randomness can be evaluated. The bound is obtained thanks to the uncertainty principle for complementary measurements applied to min- and max-entropies. We tested our method with two different QRNGs, using a train of qubits or ququarts, demonstrating the scalability toward practical applications.
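
    The entropic bound used in this kind of protocol is plausibly of the form of the min-/max-entropy uncertainty relation of Tomamichel and Renner (stated here for orientation; the paper's exact inequality may differ):

      $H_{\min}(X|E) + H_{\max}(Z|B) \ge \log_2(1/c),$

    where $X$ and $Z$ denote outcomes in the two mutually unbiased bases and $c$ is their maximal overlap ($c = 1/d$ in dimension $d$), so an estimate of the max-entropy in one basis lower-bounds the randomness extractable from the other.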

  14. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and different time scales. The ultimate origin of such universal quantum stability is the fundamental uncertainty principle, which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. A reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  15. Gravitational tests of the Generalized Uncertainty Principle

    E-print Network

    Fabio Scardigli; Roberto Casadio

    2014-07-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.

  16. Incorporation of generalized uncertainty principle into Lifshitz field theories

    NASA Astrophysics Data System (ADS)

    Faizal, Mir; Majumder, Barun

    2015-06-01

    In this paper, we will incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We will first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. After that we will incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We will observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We will also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.

  17. Open timelike curves violate Heisenberg's uncertainty principle.

    PubMed

    Pienaar, J L; Ralph, T C; Myers, C R

    2013-02-01

    Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity. PMID:23432226

  18. Optimal Functions for a Periodic Uncertainty Principle and Multiresolution Analysis

    E-print Network

    Prestin, Jürgen

    Fragmentary excerpt: the paper constructs optimal functions for a periodic uncertainty principle and a periodic multiresolution analysis in which the corresponding wavelets show similar time-frequency localization; further related topics in time-frequency localization in connection with multiresolution analysis are referenced.

  19. The Entropic Uncertainty Principle for Decaying Systems and CP violation

    E-print Network

    Hiesmayr, Beatrix C

    2011-01-01

    Employing an effective formalism for decaying systems, we are able to investigate Heisenberg's uncertainty relation for observables measured at accelerator facilities. In particular, we investigate the neutral K-meson system and show that, firstly, due to the time evolution, an uncertainty between strangeness measurements at different times is introduced and, secondly, due to the imbalance of matter and antimatter (CP violation), an uncertainty arises in the evolution of the eigenstates of the effective Hamiltonian of the system. Consequently, the existence of CP violation is linked to uncertainties of observables, i.e. the outcomes cannot be predicted, even in principle, to arbitrary precision.

  20. The Entropic Uncertainty Principle for Decaying Systems and CP violation

    E-print Network

    Beatrix C. Hiesmayr

    2011-03-17

    Employing an effective formalism for decaying systems, we are able to investigate Heisenberg's uncertainty relation for observables measured at accelerator facilities. In particular, we investigate the neutral K-meson system and show that, firstly, due to the time evolution, an uncertainty between strangeness measurements at different times is introduced and, secondly, due to the imbalance of matter and antimatter (CP violation), an uncertainty arises in the evolution of the eigenstates of the effective Hamiltonian of the system. Consequently, the existence of CP violation is linked to uncertainties of observables, i.e. the outcomes cannot be predicted, even in principle, to arbitrary precision.

  1. Uncertainty Principle, Shannon-Nyquist Sampling and Beyond

    NASA Astrophysics Data System (ADS)

    Fujikawa, Kazuo; Ge, Mo-Lin; Liu, Yu-Long; Zhao, Qing

    2015-06-01

    Donoho and Stark have shown that a precise deterministic recovery of missing information contained in a time interval shorter than the time-frequency uncertainty limit is possible. We analyze this signal recovery mechanism from a physics point of view and show that the well-known Shannon-Nyquist sampling theorem, which is fundamental in signal processing, also uses essentially the same mechanism. The uncertainty relation in the context of information theory, which is based on Fourier analysis, provides a criterion to distinguish Shannon-Nyquist sampling from compressed sensing. A new signal recovery formula, which is analogous to the Donoho-Stark formula, is given using the idea of Shannon-Nyquist sampling; in this formulation, the smearing of information below the uncertainty limit as well as the recovery of information with specified bandwidth take place. We also discuss the recovery of states from the domain below the uncertainty limit of coordinate and momentum in quantum mechanics and show that in principle the state recovery works by assuming ideal measurement procedures. The recovery of the lost information in the sub-uncertainty domain means that the loss of information in such a small domain is not fatal, which is in accord with our common understanding of the uncertainty principle, although its precise recovery is something we are not used to in quantum mechanics. The uncertainty principle provides a universal sampling criterion covering both the classical Shannon-Nyquist sampling theorem and the quantum mechanical measurement.
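
    The Donoho-Stark result referred to above can be stated, for a discrete signal $x \in \mathbb{C}^N$ with discrete Fourier transform $\hat{x}$, as

      $|\operatorname{supp}(x)| \cdot |\operatorname{supp}(\hat{x})| \ge N,$

    so a signal and its spectrum cannot both be sharply concentrated; it is this bound that governs when information missing from a short time interval can be deterministically recovered.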

  2. Quantum Limits of Measurements and Uncertainty Principle

    E-print Network

    Masanao Ozawa

    2015-05-19

    In this paper, we show how the Robertson uncertainty relation gives certain intrinsic quantum limits of measurements in the most general and rigorous mathematical treatment. A general lower bound for the product of the root-mean-square measurement errors arising in joint measurements of noncommuting observables is established. We give a rigorous condition for the standard quantum limit (SQL) to hold for repeated measurements, and prove that if a measuring instrument has no larger root-mean-square preparational error than its root-mean-square measurement errors, then it obeys the SQL. As shown previously, we can even construct many linear models of position measurement which circumvent this condition for the SQL.

  3. Gauge theories under incorporation of a generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Kober, Martin

    2010-10-01

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations like the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears as a very important consideration.

  4. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin [Frankfurt Institute for Advanced Studies (FIAS), Institut fuer Theoretische Physik, Johann Wolfgang Goethe-Universitaet, Ruth-Moufang-Strasse 1, 60438 Frankfurt am Main (Germany)

    2010-10-15

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations like the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears as a very important consideration.

  5. Remarks on Generalized Uncertainty Principle Induced from Constraint System

    NASA Astrophysics Data System (ADS)

    Eune, Myungseok; Kim, Wontae

    2014-12-01

    The extended commutation relations for the generalized uncertainty principle (GUP) have been based on the assumption of a minimal length in position. Instead of this assumption, we start with a constrained Hamiltonian system described by the conventional Poisson algebra and then impose appropriate second-class constraints on this system. Consequently, we can show that the consistent Dirac brackets for this system are nothing but the extended commutation relations describing the GUP.
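
    For reference, the Dirac bracket invoked here is built from second-class constraints $\chi_a$ with invertible matrix $C_{ab} = \{\chi_a, \chi_b\}$ as

      $\{A, B\}_D = \{A, B\} - \{A, \chi_a\}\,(C^{-1})^{ab}\,\{\chi_b, B\},$

    and quantization promotes the Dirac bracket, rather than the Poisson bracket, to a commutator, which is how the extended GUP-type commutation relations can arise without postulating a minimal length.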

  6. On classes of non-Gaussian asymptotic minimizers in entropic uncertainty principles

    Microsoft Academic Search

    S. Zozor; C. Vignat

    2007-01-01

    In this paper we revisit the Bialynicki-Birula and Mycielski uncertainty principle and its cases of equality. This Shannon entropic version of the well-known Heisenberg uncertainty principle can be used when dealing with variables that admit no variance. In this paper, we extend this uncertainty principle to Rényi entropies. We recall that in both Shannon and Rényi cases, and for a

  7. MaxEnt Principle for Handling Uncertainty with Qualitative Values

    NASA Astrophysics Data System (ADS)

    Pappalardo, Michele

    2006-11-01

    The Bayesian mathematical model is the oldest method for modelling subjective degrees of belief. If we have probabilistic measures with unknown values, then we must choose a different, appropriate model. Belief functions are a bridge between various models handling different forms of uncertainty. The conjunctive rule of Bayes builds a new set of a posteriori probabilities when two independent and accepted sets of random variables make inference. When two pieces of evidence are accepted with unknown values, the Dempster-Shafer rule suggests a model for the fusion of different degrees of belief. In this paper we propose the use of the MaxEnt principle for modelling belief. Dealing with non-Bayesian sets, in which a piece of evidence represents belief instead of knowledge, the MaxEnt principle gives a tool to reduce the number of subsets representing the frame of discernment. The fusion of a focal set with a set of maximum entropy causes a Bayesian approximation, reducing the mass function to a probability distribution.
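
    As a minimal illustration of the MaxEnt machinery described above, the following Python sketch computes the maximum-entropy distribution consistent with a single mean-value constraint (the outcome values f and target mean F are illustrative assumptions, not data from the paper):

      import numpy as np
      from scipy.optimize import brentq

      # Maximum-entropy distribution over outcome values f subject to a mean
      # constraint <f> = F: p_i is proportional to exp(-lam * f_i), with lam
      # chosen so that the constraint holds.
      f = np.array([1.0, 2.0, 3.0, 4.0])  # hypothetical outcome values
      F = 3.0                             # hypothetical target mean

      def mean_at(lam):
          w = np.exp(-lam * (f - f.mean()))  # shift for numerical stability
          p = w / w.sum()
          return p @ f

      lam = brentq(lambda l: mean_at(l) - F, -50.0, 50.0)
      w = np.exp(-lam * (f - f.mean()))
      p = w / w.sum()
      print(p, p @ f)  # MaxEnt probabilities and their mean (approx. 3.0)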

  8. Supersymmetric IIB matrix models from space-time uncertainty principle and topological symmetry

    Microsoft Academic Search

    Ichiro Oda

    1998-01-01

    Starting with topological field theory, we derive a space-time uncertainty relation proposed by Yoneya through breakdown of topological symmetry in the large N matrix model. Next, on the basis of only two basic principles, namely a generalized space-time uncertainty principle containing spinor field and topological symmetry, we construct a new matrix model. If we furthermore impose the requirement of N

  9. Generalized Uncertainty Principle and Recent Cosmic Inflation Observations

    E-print Network

    Abdel Nasser Tawfik; Abdel Magied Diab

    2014-10-29

    The recent background imaging of cosmic extragalactic polarization (BICEP2) observations are believed to be evidence for cosmic inflation. BICEP2 provided a first direct evidence for the inflation, determined its energy scale, and gave witness to quantum gravitational processes. The ratio of scalar-to-tensor fluctuations $r$, which is the canonical measurement of the gravitational waves, was estimated as $r=0.2_{-0.05}^{+0.07}$. Apparently, this value agrees well with the upper bound value corresponding to PLANCK, $r\leq 0.012$, and to the WMAP9 experiment, $r=0.2$. It is believed that the existence of a minimal length is one of the greatest predictions leading to modifications of the Heisenberg uncertainty principle, i.e. a GUP, at the Planck scale. In the present work, we investigate the possibility of interpreting the recent BICEP2 observations through quantum gravity or GUP. We estimate the slow-roll parameters and the tensorial and scalar density fluctuations, which are characterized by the scalar field $\phi$. Taking into account the background (matter and radiation) energy density, $\phi$ is assumed to interact with the gravity and with itself. We first review the Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe and then suggest a modification of the Friedmann equation due to GUP. Using a single potential for a chaotic inflation model, various inflationary parameters are estimated and compared with the PLANCK and BICEP2 observations. While GUP is conjectured to break down the expansion of the early Universe (Hubble parameter and scale factor), two inflation potentials based on a certain minimal supersymmetric extension of the standard model result in $r$ and a spectral index matching well with the observations. Corresponding to the BICEP2 observations, our estimate for $r$ depends on the inflation potential and the scalar field. A power-law inflation potential does not.

  10. Violation of the Robertson-Schrödinger uncertainty principle and non-commutative quantum mechanics

    E-print Network

    Catarina Bastos; Orfeu Bertolami; Nuno Costa Dias; João Nuno Prata

    2012-11-26

    We show that a possible violation of the Robertson-Schrödinger uncertainty principle may signal the existence of a deformation of the Heisenberg-Weyl algebra. More precisely, we prove that any Gaussian in phase-space (even if it violates the Robertson-Schrödinger uncertainty principle) is always a quantum state of an appropriate non-commutative extension of quantum mechanics. Conversely, all canonical non-commutative extensions of quantum mechanics display states that violate the Robertson-Schrödinger uncertainty principle.
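
    The Robertson-Schrödinger inequality at issue reads, for observables $A$ and $B$,

      $\sigma_A^2 \, \sigma_B^2 \ge \left( \tfrac{1}{2} \langle \{A,B\} \rangle - \langle A \rangle \langle B \rangle \right)^2 + \left( \tfrac{1}{2i} \langle [A,B] \rangle \right)^2,$

    which strengthens the Heisenberg-Robertson bound by the covariance (anticommutator) term; the paper characterizes which phase-space Gaussians violating this bound remain bona fide states of a deformed, non-commutative quantum mechanics.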

  11. The Entropic Uncertainty Principle for Decaying Systems and CP violation

    Microsoft Academic Search

    Beatrix C. Hiesmayr

    2011-01-01

    Employing an effective formalism for decaying systems, we are able to investigate Heisenberg's uncertainty relation for observables measured at accelerator facilities. In particular, we investigate the neutral K-meson system and show that, firstly, due to the time evolution, an uncertainty between strangeness measurements at different times is introduced and, secondly, due to the imbalance of matter and antimatter (CP violation)

  12. Entropy of the Randall-Sundrum brane world with the generalized uncertainty principle

    E-print Network

    Wontae Kim; Yong-Wan Kim; Young-Jai Park

    2006-11-02

    By introducing the generalized uncertainty principle, we calculate the entropy of the bulk scalar field on the Randall-Sundrum brane background without any cutoff. We obtain an entropy of the massive scalar field proportional to the horizon area. Here, we observe that a mass contribution to the entropy exists, in contrast to all previous results for the usual black hole cases with the generalized uncertainty principle, which are independent of the mass of the scalar field.

  13. Semiclassical corrections to black hole entropy and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bargueño, Pedro; Vagenas, Elias C.

    2015-03-01

    In this paper, employing the path integral method in the framework of a canonical description of a Schwarzschild black hole, we obtain the corrected inverse temperature and entropy of the black hole. The corrections are those coming from the quantum effects as well as from the Generalized Uncertainty Principle effects. Furthermore, an equivalence between the polymer quantization and the Generalized Uncertainty Principle description is shown provided the parameters characterizing these two descriptions are proportional.

  14. The Uncertainty Principle in Software Engineering Hadar Ziv Debra J. Richardson

    E-print Network

    Ziv, Hadar

    Fragmentary excerpt: the complexity of software systems and their development processes is known to be intrinsic, and attempts to manage that complexity are often impeded by the uncertainty permeating virtually every aspect of software development.

  15. Uncertainty Principle and the Zero-Point Energy of the Harmonic Oscillator

    Microsoft Academic Search

    R. A. Newing

    1935-01-01

    According to quantum mechanics, an oscillator possesses a definite zero-point energy of vibration, and an attempt has been made to express this result directly in terms of some general principle. It has been found that the result may be deduced from the uncertainty principle, in view of the particular relation between position, momentum and energy in a simple harmonic field.
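
    The deduction described in this note is the classic constrained minimization: imposing $\Delta x \, \Delta p = \hbar/2$ on the oscillator energy gives

      $E = \frac{(\Delta p)^2}{2m} + \frac{1}{2} m \omega^2 (\Delta x)^2 = \frac{(\Delta p)^2}{2m} + \frac{m \omega^2 \hbar^2}{8 (\Delta p)^2},$

    whose minimum over $\Delta p$ occurs at $(\Delta p)^2 = m \omega \hbar / 2$ and yields the zero-point energy $E_{\min} = \hbar \omega / 2$.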

  16. Satellite Test of the Equivalence Principle Uncertainty Analysis

    Microsoft Academic Search

    Paul Worden; John Mester

    2009-01-01

    STEP, the Satellite Test of the Equivalence Principle, is intended to test the apparent equivalence of gravitational and inertial mass to 1 part in 10^18 (Worden et al. in Adv. Space Res. 25(6):1205–1208, 2000). This will be an increase of more than five orders of magnitude over ground-based experiments and lunar laser ranging observations (Su et al. in Phys. Rev. D 50:3614–3636, 1994;

  17. Impacts of Generalized Uncertainty Principle on Black Hole Thermodynamics and Salecker-Wigner Inequalities

    E-print Network

    Tawfik, A

    2013-01-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP) proposed by some approaches to quantum gravity, such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely: it implies the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole relative mass, and the difference between the black hole mass with and without GUP is not negligible.

  18. Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

    NASA Technical Reports Server (NTRS)

    Terazawa, Hidezumi

    1996-01-01

    The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

  19. Zero-point energies, the uncertainty principle and positivity of the quantum Brownian density operator

    E-print Network

    Allan Tameshtit

    2012-04-09

    High temperature and white noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white noise approximation is avoided, it is shown that if the zero point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle.

  20. Zero-point energies, the uncertainty principle, and positivity of the quantum Brownian density operator.

    PubMed

    Tameshtit, Allan

    2012-04-01

    High-temperature and white-noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white-noise approximation is avoided, it is shown that if the zero-point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle. PMID:22680520

  1. Hawking Temperature in Taub-NUT (A)dS spaces via the Generalized Uncertainty Principle

    E-print Network

    Seyen Kouwn; Chong Oh Lee; Phillial Oh

    2010-10-15

    Using the extended forms of the Heisenberg uncertainty principle from string theory and quantum gravity, we derive the Hawking temperature of a Taub-NUT-(A)dS black hole. In spite of the distinctive features of these black holes, such as being only asymptotically locally flat and the breakdown of the area theorem for the horizon, we show that the corrections to the Hawking temperature from the generalized versions of the Heisenberg uncertainty principle increase as for the Schwarzschild-(A)dS black hole, and we give a reason why the Taub-NUT-(A)dS metric may have an AdS/CFT dual picture.

  2. The uncertainty principle enables non-classical dynamics in an interferometer

    NASA Astrophysics Data System (ADS)

    Dahlsten, Oscar C. O.; Garner, Andrew J. P.; Vedral, Vlatko

    2014-08-01

    The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics.

  3. Symplectic quantization, inequivalent quantum theories, and Heisenberg's principle of uncertainty

    SciTech Connect

    Montesinos, Merced; Torres del Castillo, G.F. [Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN, Avenida IPN No. 2508, 07000 Ciudad de Mexico (Mexico); Departamento de Fisica Matematica, Instituto de Ciencias, Universidad Autonoma de Puebla, 72570 Puebla, Pue. (Mexico)

    2004-09-01

    We analyze the quantum dynamics of the nonrelativistic two-dimensional isotropic harmonic oscillator in Heisenberg's picture. Such a system is taken as a toy model to analyze some of the various quantum theories that can be built from the application of Dirac's quantization rule to the various symplectic structures recently reported for this classical system. It is pointed out that these quantum theories are inequivalent in the sense that the mean values for the operators (observables) associated with the same physical classical observable do not agree with each other. The inequivalence does not arise from ambiguities in the ordering of operators but from the fact of having several symplectic structures defined with respect to the same set of coordinates. It is also shown that the uncertainty relations between the fundamental observables depend on the particular quantum theory chosen. It is important to emphasize that these (somehow paradoxical) results emerge from the combination of two paradigms: Dirac's quantization rule and the usual Copenhagen interpretation of quantum mechanics.

  4. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  5. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  6. Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions

    ERIC Educational Resources Information Center

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-01-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

  7. Entropy of the FRW Universe Based on the Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Kim, Wontae; Park, Young-Jai; Yoon, Myungseok

    The statistical entropy of the FRW universe described by a time-dependent metric is calculated using the brick-wall method based on the generalized uncertainty principle with a minimal length. We determine the minimal length at the Planck scale to obtain an entropy proportional to the area of the cosmological apparent horizon.

  8. PostDoc position available: Uncertainty Principles in Signal Processing, and Applications to

    E-print Network

    Feichtinger, Hans Georg

    Position announcement: a 1-year PostDoc position is available in the Signal and Image Processing group at LATP, on uncertainty principles in signal processing and applications to audio signals and more general data processing, with a possibility of extending the position.

  9. Average power reduction for MSM optical signals via sparsity and uncertainty principle

    Microsoft Academic Search

    Jovana Ilic; Thomas Strohmer

    2010-01-01

    Multiple subcarrier modulation is an appealing scheme for high-data-rate optical communication. However, a major drawback is its low average power efficiency. While subcarrier reservation is a common approach to combat this problem, little is known about the performance of algorithms that utilize subcarrier reservation. By combining properties of sparse signals with an abstract form of the Uncertainty Principle related

  10. Using uncertainty principle to find the ground-state energy of the helium and a helium-like Hookean atom

    Microsoft Academic Search

    Varun Harbola

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron cloud. Our calculation also
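
    As a one-electron analogue of the method described here (a sketch under the same uncertainty-plus-minimization idea, not the paper's helium calculation), the hydrogen ground state can be estimated numerically in Python:

      import numpy as np
      from scipy.optimize import minimize_scalar

      # Uncertainty-principle estimate for a hydrogen-like atom: with
      # Delta x ~ r and Delta p ~ hbar/r, the energy functional is
      #   E(r) = hbar^2 / (2 m r^2) - e^2 / (4 pi eps0 r),
      # whose minimum reproduces the Bohr radius and ground-state energy.
      hbar = 1.054571817e-34   # J s
      m_e  = 9.1093837015e-31  # kg
      e    = 1.602176634e-19   # C
      eps0 = 8.8541878128e-12  # F/m

      def E(r):
          return hbar**2 / (2 * m_e * r**2) - e**2 / (4 * np.pi * eps0 * r)

      res = minimize_scalar(E, bounds=(1e-12, 1e-9), method="bounded")
      print(res.x)         # ~5.29e-11 m  (Bohr radius)
      print(E(res.x) / e)  # ~-13.6 eV    (hydrogen ground-state energy)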

  11. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    SciTech Connect

    Tawfik, A., E-mail: a.tawfik@eng.mti.edu.eg [Egyptian Center for Theoretical Physics (ECTP), MTI University, 11571 Cairo (Egypt)

    2013-07-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP) proposed by some approaches to quantum gravity, such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely: it implies the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole relative mass, and the difference between the black hole mass with and without GUP is not negligible.

  12. Doubly Special Relativity with a minimum speed and the Uncertainty Principle

    E-print Network

    Cláudio Nassif

    2012-03-08

    The present work aims to implement a new symmetry in space-time by introducing the idea of an invariant minimum speed scale ($V$). Such a lowest limit $V$, being unattainable by particles, represents a fundamental and preferred reference frame connected to a universal background field (a vacuum energy) that breaks Lorentz symmetry. There thus emerges a new principle of symmetry in space-time at the subatomic level, for very low energies close to the background frame ($v \approx V$), providing a fundamental understanding of the uncertainty principle: the uncertainty relations should emerge from a space-time with an invariant minimum speed.

  13. The Quark-Gluon Plasma Equation of State and The Generalized Uncertainty Principle

    E-print Network

    AbouSalem, L I; Elmashad, I

    2015-01-01

    The quark-gluon plasma (QGP) equation of state within a minimal length scenario or Generalized Uncertainty Principle (GUP) is studied. The GUP is implemented in deriving the thermodynamics of the ideal QGP at vanishing chemical potential. We find a significant effect of the GUP term. The main features of QCD lattice results were quantitatively reproduced in the cases of $n_{f}=0$, $n_{f}=2$ and $n_{f}=2+1$ flavors for the energy density, the pressure and the interaction measure. A notable point is the large value of the bag pressure, especially in the case of $n_{f}=2+1$ flavors, which reflects the expected strong correlation between quarks in the bag. One can notice that the asymptotic behavior, characterized by the Stefan-Boltzmann limit, is satisfied.

  14. Thermodynamics of (2+1)-dimensional acoustic black hole based on the generalized uncertainty principle

    E-print Network

    Wontae Kim; Edwin J. Son; Myungseok Yoon

    2008-01-09

    We study thermodynamic quantities of an acoustic black hole and its thermodynamic stability in a cavity based on the generalized uncertainty principle. It can be shown that there is a minimal black hole which can be a stable remnant after black hole evaporation. Moreover, the behavior of the free energy shows that the large black hole is stable too. Therefore, the acoustic black hole can decay into the remnant or the large black hole.

  15. The uncertainty principle enables non-classical dynamics in an interferometer.

    PubMed

    Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko

    2014-01-01

    The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics. PMID:25105741

  16. On the connection between complementarity and uncertainty principles in the Mach-Zehnder interferometric setting

    NASA Astrophysics Data System (ADS)

    Bosyk, G. M.; Portesi, M.; Holik, F.; Plastino, A.

    2013-06-01

    We revisit the connection between the complementarity and uncertainty principles of quantum mechanics within the framework of Mach-Zehnder interferometry. We focus our attention on the trade-off relation between complementary path information and fringe visibility. This relation is equivalent to the uncertainty relation of Schrödinger and Robertson for a suitably chosen pair of observables. We show that it is equivalent as well to the uncertainty inequality provided by Landau and Pollak. We also study the relationship of this trade-off relation with a family of entropic uncertainty relations based on Rényi entropies. There is no equivalence in this case, but the different values of the entropic parameter do define regimes that provide us with a tool to discriminate between non-trivial states of minimum uncertainty. The existence of such regimes agrees with previous results of Luis (2011 Phys. Rev. A 84 034101), although their meaning was not sufficiently clear. We discuss the origin of these regimes with the intention of gaining a deeper understanding of entropic measures.
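
    For context, the Rényi-based family referred to here is typically of the Maassen-Uffink type: for conjugate entropic indices $1/\alpha + 1/\beta = 2$,

      $H_\alpha(X) + H_\beta(Z) \ge -2 \log_2 c,$

    with $c$ the maximal overlap between the eigenbases of the two observables; sweeping the entropic parameter is what traces out the regimes discussed in the abstract.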

  17. The Hardy Uncertainty Principle Revisited

    E-print Network

    Bigelow, Stephen

    Excerpt (M. Cowling, L. Escauriaza, C. E. Kenig, G. Ponce, and L. Vega): "We give a real-variable proof of the Hardy uncertainty principle."

  18. Generalized uncertainty principle in f(R) gravity for a charged black hole

    SciTech Connect

    Said, Jackson Levi [Physics Department, University of Malta, Msida (Malta); Adami, Kristian Zarb [Physics Department, University of Malta, Msida (Malta); Physics Department, University of Oxford, Oxford (United Kingdom)

    2011-02-15

    Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

  19. Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

    SciTech Connect

    Tallacchini, Mariachiara [Bioethics, Faculty of Biotechnology, University of Milan, Via Celoria 10, 20100 Milan (Italy) and Science Technology and Law, Law Faculty, University of Piacenza, Via Emilia Parmense 84, 29100 Piacenza (Italy)]. E-mail: mariachiara.tallacchini@unimi.it

    2005-09-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

  20. Corrections to entropy and thermodynamics of charged black hole using generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; El Dahab, Eiman Abou

    2015-03-01

    Recently, much attention has been devoted to resolving the quantum corrections to the Bekenstein-Hawking (black hole) entropy, which relates the entropy to the cross-sectional area of the black hole horizon. Using the generalized uncertainty principle (GUP), corrections to the geometric entropy and thermodynamics of the black hole are introduced. The impact of GUP on the entropy near the horizon of three types of black holes, Schwarzschild, Garfinkle-Horowitz-Strominger and Reissner-Nordström, is determined. It is found that the logarithmic divergence in the entropy-area relation turns out to be positive. The entropy $S$, which is assumed to be related to the horizon's two-dimensional area, gets additional terms, for instance $2\sqrt{\pi}\,\alpha\,\sqrt{S}$, where $\alpha$ is the GUP parameter.

  1. Trans-Planckian effects in inflationary cosmology and the modified uncertainty principle

    NASA Astrophysics Data System (ADS)

    Hassan, S. F.; Sloth, Martin S.

    2003-12-01

    There are good indications that fundamental physics gives rise to a modified space-momentum uncertainty relation that implies the existence of a minimum length scale. We implement this idea in the scalar field theory that describes density perturbations in flat Robertson-Walker space-time. This leads to a non-linear time-dependent dispersion relation that encodes the effects of Planck scale physics in the inflationary epoch. Unruh type dispersion relations naturally emerge in this approach, while unbounded ones are excluded by the minimum length principle. We also find red-shift induced modifications of the field theory, due to the reduction of degrees of freedom at high energies, that tend to dampen the fluctuations at trans-Planckian momenta. In the specific example considered, this feature helps determine the initial state of the fluctuations, leading to a flat power spectrum.

  2. Principle and Uncertainty Quantification of an Experiment Designed to Infer Actinide Neutron Capture Cross-Sections

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatorre; G. Imel; R. Pardo; F. Kondev; M. Paul

    2010-01-01

    An integral reactor physics experiment devoted to inferring higher actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties in the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amount of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross sections to be inferred, since the relation between the two is given by the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectrometry (AMS) facility located at ANL. While AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes – even to A > 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectrometry techniques, more transmutation products can be measured and, potentially, more cross sections can be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
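    As a back-of-the-envelope illustration of how an energy-integrated cross section can be inferred from atom densities measured before and after irradiation (a minimal one-group, thin-sample sketch with purely illustrative numbers, not the MANTRA analysis itself):

```python
# Minimal sketch: one-group, thin-sample activation, burnup and decay neglected.
# Under those assumptions the transmutation equations reduce to
#   N_prod ~= N_parent * sigma_eff * fluence
# so the measured densities give the energy-integrated cross section directly.
N_parent = 1.0e20   # parent atoms before irradiation (illustrative)
N_prod   = 3.2e15   # product atoms measured after irradiation, e.g. by AMS (illustrative)
fluence  = 1.6e21   # neutrons/cm^2 from reactor dosimetry (illustrative)

sigma_eff = N_prod / (N_parent * fluence)           # cm^2
print(f"sigma_eff = {sigma_eff / 1e-24:.3f} barn")  # 1 barn = 1e-24 cm^2
```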

  3. f(R) in Holographic and Agegraphic Dark Energy Models and the Generalized Uncertainty Principle

    E-print Network

    Majumder, Barun

    2013-01-01

    We studied a unified approach with the holographic, new agegraphic and the $f(R)$ dark energy models to construct the form of $f(R)$ which is, in general, responsible for the curvature-driven explanation of very early inflation along with the presently observed late-time acceleration. We considered the generalized uncertainty principle in our approach, which incorporated corrections in the entropy-area relation and thereby modified the energy densities for the cosmological dark energy models considered. We found that holographic and new agegraphic $f(R)$ gravity models can behave like phantom or quintessence models in the spatially flat FRW universe. We also found a distinct term in the form of $f(R)$ which goes as $R^{\frac{3}{2}}$ due to the consideration of the GUP-modified energy densities. Although the presence of this term in the action can be important for explaining the early inflationary scenario, Capozziello et al. recently showed that $f(R) \sim R^{\frac{3}{2}}$ leads to an accelerated ...

  4. Universally valid reformulation of the Heisenberg uncertainty principle on noise and disturbance in measurement

    SciTech Connect

    Ozawa, Masanao [Graduate School of Information Sciences, Tohoku University, Aoba-ku, Sendai, 980-8579 (Japan)

    2003-04-01

    The Heisenberg uncertainty principle states that the product of the noise in a position measurement and the momentum disturbance caused by that measurement should be no less than the limit set by Planck's constant, ħ/2, as demonstrated by Heisenberg's thought experiment using a γ-ray microscope. Here it is shown that this common assumption is not universally true: a universally valid trade-off relation between the noise and the disturbance has an additional correlation term, which is redundant when the intervention brought by the measurement is independent of the measured object, but which allows the noise-disturbance product to be much below Planck's constant when the intervention is dependent. A model of measuring interaction with dependent intervention shows that Heisenberg's lower bound for the noise-disturbance product is violated even by a nearly nondisturbing precise position measurement. An experimental implementation is also proposed to realize the above model in the context of optical quadrature measurement with currently available linear optical devices.

  5. A Dark Energy Model with Generalized Uncertainty Principle in the Emergent, Intermediate and Logamediate Scenarios of the Universe

    E-print Network

    Rahul Ghosh; Surajit Chattopadhyay; Ujjal Debnath

    2011-10-22

    This work is motivated by the work of Kim et al (2008), which considered the equation of state parameter for the new agegraphic dark energy based on the generalized uncertainty principle coexisting with dark matter without interaction. In this work, we have considered the same dark energy interacting with dark matter in the emergent, intermediate and logamediate scenarios of the universe. Also, we have investigated the statefinder, kerk and lerk parameters in all three scenarios under this interaction. The energy density and pressure for the new agegraphic dark energy based on the generalized uncertainty principle have been calculated and their behaviors have been investigated. The evolution of the equation of state parameter has been analyzed in the interacting and non-interacting situations in all three scenarios. The graphical analysis shows that the dark energy behaves like quintessence for logamediate expansion and like phantom for emergent and intermediate expansions of the universe.

  6. A violation of the uncertainty principle implies a violation of the second law of thermodynamics.

    PubMed

    Hänggi, Esther; Wehner, Stephanie

    2013-01-01

    Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature. PMID:23575674

  7. A violation of the uncertainty principle implies a violation of the second law of thermodynamics

    E-print Network

    Esther Hänggi; Stephanie Wehner

    2012-05-31

    Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? We give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature.

  8. Hydrogen Atom and Helium Ion Spatial and Momentum Distribution Functions Illustrate the Uncertainty Principle

    E-print Network

    Rioux, Frank

    Plots of the spatial and momentum radial distribution functions for the hydrogen atom and the helium ion illustrate the uncertainty principle: the helium ion's coordinate distribution function is localized closer to the nucleus, meaning less uncertainty in position and, by the uncertainty principle, more in momentum.

  9. The effect of generalized uncertainty principle on square well, a case study

    SciTech Connect

    Ma, Meng-Sen, E-mail: mengsenma@gmail.com [Department of Physics, Shanxi Datong University, 037009 Datong (China); Institute of Theoretical Physics, Shanxi Datong University, 037009 Datong (China); Zhao, Ren [Institute of Theoretical Physics, Shanxi Datong University, 037009 Datong (China)

    2014-08-15

    Starting from a special case of the generalized uncertainty relation in which one of the deformation parameters is set to zero, we derive the energy eigenvalues of the infinite potential well. It is shown that the obtained energy levels differ from the usual result by correction terms, and that these correction terms depend only on the remaining GUP parameter. The eigenstates, however, depend on two additional parameters besides it.

  10. Reverse-reconciliation continuous-variable quantum key distribution based on the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Furrer, Fabian

    2014-10-01

    A big challenge in continuous-variable quantum key distribution is to prove security against arbitrary coherent attacks including realistic assumptions such as finite-size effects. Recently, such a proof has been presented in [Phys. Rev. Lett. 109, 100502 (2012), 10.1103/PhysRevLett.109.100502] for a two-mode squeezed state protocol based on a novel uncertainty relation with quantum memories. But the transmission distances were fairly limited due to a direct reconciliation protocol. We prove here security against coherent attacks of a reverse-reconciliation protocol under similar assumptions but allowing distances of over 16 km for experimentally feasible parameters. We further clarify the limitations when using the uncertainty relation with quantum memories in security proofs of continuous-variable quantum key distribution.

  11. Maximally localized states and quantum corrections of black hole thermodynamics in the framework of a new generalized uncertainty principle

    E-print Network

    Yan-Gang Miao; Ying-Jie Zhao; Shao-Jun Zhang

    2015-05-24

    As a generalized uncertainty principle (GUP) leads to the effects of the minimal length of the order of the Planck scale and UV/IR mixing, some significant physical concepts and quantities are modified or corrected correspondingly. On the one hand, we derive the maximally localized states, i.e., the physical states displaying the minimal length uncertainty associated with a new GUP proposed in our previous work. On the other hand, in the framework of this new GUP we calculate quantum corrections to the thermodynamic quantities of the Schwarzschild black hole, such as the Hawking temperature, the entropy, and the heat capacity, and give a remnant mass of the black hole at the end of the evaporation process. Moreover, we compare our results with those obtained in the frameworks of several other GUPs. In particular, we observe a significant difference between the situations with and without the consideration of the UV/IR mixing effect in the quantum corrections to the evaporation rate and the decay time: the decay time can be greatly prolonged in the former case, which implies that the quantum correction from the UV/IR mixing effect may exert a radical rather than a tiny influence on the Hawking radiation.

  12. Our Electron Model vindicates Schrödinger's Incomplete Results and Requires Restatement of Heisenberg's Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    McLeod, David; McLeod, Roger

    2008-04-01

    The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal-space results of wavelength and momentum mimic the reciprocal relationship between the space variable x and the spatial frequency variable p, by the experiment mentioned. Nobel laureate von Békésy (physiology of hearing, 1961) performed pressure-input Rect x experiments that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves the sign sense of EMF vectors, and is hard-wired as an inverse SFT. These results require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

  13. Superparticles from the Initial Universe and deduction of the Fine Structure Constant and Uncertainty Principle directly from the Gravitation Theory

    E-print Network

    Fran De Aquino

    2001-03-29

    In a previous work it was shown that the gravitational and inertial masses are correlated by a dimensionless factor, which depends on the radiation incident upon the particle. It was also shown that there is a direct correlation between the radiation absorbed by the particle and its gravitational mass, independently of the inertial mass. This finding has fundamental consequences for Unified Field Theory and Quantum Cosmology. Only in the absence of electromagnetic radiation does the mentioned factor become equal to one. On the other hand, in specific electromagnetic conditions, it can be reduced, nullified or made negative. This means that gravitational masses can be reduced, nullified or made negative by means of electromagnetic radiation. This unexpected theoretical result was recently confirmed by an experiment (gr-qc/0005107). A fundamental consequence of the mentioned correlation is that, in specific ultra-high energy conditions, the gravitational and electromagnetic fields can be described by the same Hamiltonian, i.e., in these circumstances they are unified. Such conditions may have occurred in the initial Universe, before the first spontaneous breaking of symmetry. Building on this discovery, and starting from the gravitational mass of superparticles from the initial Universe, we show here that it is possible to deduce the reciprocal fine structure constant and the uncertainty principle directly from the Gravitation Theory (Unified Theory).

  14. Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators

    E-print Network

    Marcelo A. Marchiolli; Paulo E. M. F. Mendonca

    2013-03-31

    We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar-Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener-Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelets bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar-Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory.

  15. Femtoscopic scales in p+p and p+Pb collisions in view of the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Braun-Munzinger, P.; Karpenko, Iu. A.; Sinyukov, Yu. M.

    2013-08-01

    A method for quantum corrections of Hanbury-Brown/Twiss (HBT) interferometric radii produced by semi-classical event generators is proposed. These corrections account for the basic indistinguishability and mutual coherence of closely located emitters caused by the uncertainty principle. A detailed analysis is presented for pion interferometry in p+p collisions at LHC energy (√s = 7 TeV). A prediction is also presented of pion interferometric radii for p+Pb collisions at √s = 5.02 TeV. The hydrodynamic/hydrokinetic model with UrQMD cascade as ‘afterburner’ is utilized for this aim. It is found that quantum corrections to the interferometry radii improve significantly the event generator results which typically overestimate the experimental radii of small systems. A successful description of the interferometry structure of p+p collisions within the corrected hydrodynamic model requires the study of the problem of thermalization mechanism, still a fundamental issue for ultrarelativistic A+A collisions, also for high multiplicity p+p and p+Pb events.

  16. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    NASA Astrophysics Data System (ADS)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-02-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that the product (delta E)(delta t) is approximately equal to Planck's constant divided by 2π, i.e., ħ, where this duration delta t is an inner time in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
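    The reported collapse relation is simple to evaluate numerically. A rough arithmetic sketch, assuming a hypothetical 1 meV filter bandwidth (the value is illustrative, not the experiment's):

```python
hbar = 1.054571817e-34   # J*s
eV   = 1.602176634e-19   # J

delta_E = 1e-3 * eV          # assumed filter bandwidth of 1 meV (illustrative)
delta_t = hbar / delta_E     # wave-packet duration implied by delta_E * delta_t ~ hbar
print(f"delta_t ~ {delta_t * 1e12:.2f} ps")   # ~0.66 ps
```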

  17. On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Lapshin, Aleksey

    2013-04-01

    The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), the Short-Time Fourier Transform (STFT) or the Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. Because of Heisenberg's uncertainty principle, the SFT shows weak spatial localization, which manifests itself as follows: firstly, to compute the SFT it is necessary to know the initial digital signal on the complete number line (for a one-dimensional transform) or in the whole two-dimensional space (for a two-dimensional transform). Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies present in the digital signal throughout the whole time period. To overcome this limitation it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (a window), is used for this purpose. A window narrow enough to localize the signal in time results, according to Heisenberg's uncertainty principle, in significant uncertainty in frequency. If one chooses a wide enough window, the same principle dictates increased time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to detect the presence of a frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely which frequency is present in the signal at the current moment of time (one can speak only about a range of frequencies), and it is impossible to specify precisely the time moment at which a given frequency is present (one can speak only about a time frame). It is this feature that imposes major constraints on the applicability of the STFT. Although the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, any signal can be analyzed using an alternative approach: multiresolution analysis (MRA). The wavelet transform is one method for performing an MRA-type analysis. Thanks to it, low frequencies can be shown in more detail with respect to frequency, and high frequencies in more detail with respect to time. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3-d models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the wavelet transform for calculating the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
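    The window trade-off described above can be made quantitative. The sketch below (illustrative grid and window widths, not the paper's geodetic computation) measures the duration-bandwidth product of Gaussian STFT windows: however the width is chosen, the product stays pinned at the Heisenberg-Gabor limit 1/(4π) ≈ 0.0796.

```python
import numpy as np

n, dt = 8192, 0.01
t = (np.arange(n) - n / 2) * dt                      # centered time grid (s)
f = np.fft.fftshift(np.fft.fftfreq(n, dt))           # frequency grid (Hz)

for s in (0.1, 0.5, 2.0):                            # Gaussian window widths (s)
    g = np.exp(-t**2 / (2 * s**2))
    w = np.abs(g)**2 / np.sum(np.abs(g)**2)
    sigma_t = np.sqrt(np.sum(w * t**2))              # RMS duration

    G = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(g)))
    W = np.abs(G)**2 / np.sum(np.abs(G)**2)
    sigma_f = np.sqrt(np.sum(W * f**2))              # RMS bandwidth

    print(f"s={s:4.1f}: sigma_t*sigma_f = {sigma_t * sigma_f:.4f}"
          f"  (limit 1/4pi = {1 / (4 * np.pi):.4f})")
```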

  18. Moments of non-Gaussian Wigner distributions and a generalized uncertainty principle: I. The single-mode case

    NASA Astrophysics Data System (ADS)

    Ivan, J. Solomon; Mukunda, N.; Simon, R.

    2012-05-01

    The non-negativity of the density operator of a state is faithfully coded in its Wigner distribution, and this coding places on the moments of the Wigner distribution constraints arising from the non-negativity of the density operator. Working in a monomial basis for the algebra $\hat{A}$ of operators on the Hilbert space of a bosonic mode, we formulate these constraints in a canonically covariant form which is both concise and explicit. Since the conventional uncertainty relation is such a constraint on the first and second moments, our result constitutes a generalization of the same to all orders. The structure constants of $\hat{A}$, in the monomial basis, are shown to be essentially the SU(2) Clebsch-Gordan coefficients. Our results have applications in quantum state reconstruction using optical homodyne tomography and, when generalized to the n-mode case, which will be done in the second part of this work, will have applications also for continuous variable quantum information systems involving non-Gaussian states.

  19. The special theory of Brownian relativity: equivalence principle for dynamic and static random paths and uncertainty relation for diffusion.

    PubMed

    Mezzasalma, Stefano A

    2007-03-15

    The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected. PMID:17223124

  20. Uncertainty principles and vector quantization

    E-print Network

    Vershynin, Roman

    A transform with great power for the reduction of errors in its coefficients, including coefficient losses and distortions, is the discrete Fourier transform. At the next step, one quantizes the coefficients a_i using a convenient quantizer. If the first coefficient a_1 is lost (for example due to transmission failure) then we cannot reconstruct ...

  1. The certainty principle (review)

    E-print Network

    D. A. Arbatsky

    2006-08-17

    The certainty principle (2005) made it possible to conceptualize, on more fundamental grounds, both the Heisenberg uncertainty principle (1927) and the Mandelshtam-Tamm relation (1945). In this review I give a detailed explanation and discussion of the certainty principle, oriented to all physicists, both theorists and experimenters.

  2. Universal Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Gour, Gilad

    2014-03-01

    Uncertainty relations are a distinctive characteristic of quantum theory that imposes intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring non-commuting observables. However, I will show here that there is no fundamental reason for using entropies as quantifiers; in fact, any functional relation that characterizes the uncertainty of the measurement outcomes can be used to define an uncertainty relation. Starting from a simple assumption that any measure of uncertainty is non-decreasing under mere relabeling of the measurement outcomes, I will show that Schur-concave functions are the most general uncertainty quantifiers. I will then introduce a novel fine-grained uncertainty relation written in terms of a majorization relation, which generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary measures of uncertainty. This infinite family of uncertainty relations includes all the known entropic uncertainty relations, but is not limited to them. In this sense, the relation is universally valid and captures the essence of the uncertainty principle in quantum theory. This talk is based on a joint work with Shmuel Friedland and Vlad Gheorghiu. This research is supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada and by the Pacific Institute for Mathematical Sciences (PIMS).
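    The majorization relation at the heart of this talk is easy to test numerically. Below is a sketch of the standard majorization check between two outcome distributions (an illustration of the mathematical relation only, not the talk's fine-grained construction):

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """True if distribution p majorizes q, i.e. p is the 'less uncertain' one."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]   # descending order
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

# Any Schur-concave uncertainty measure (e.g. Shannon entropy) is monotone
# under this relation: if p majorizes q, then H(p) <= H(q).
p, q = [0.7, 0.2, 0.1], [0.5, 0.3, 0.2]
print(majorizes(p, q))   # True: q is the more uncertain distribution
```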

  3. Entropic uncertainty relations for multiple measurements

    NASA Astrophysics Data System (ADS)

    Liu, Shang; Mu, Liang-Zhu; Fan, Heng

    2015-04-01

    We present the entropic uncertainty relations for multiple measurement settings which demonstrate the uncertainty principle of quantum mechanics. Those uncertainty relations are obtained for both cases with and without the presence of quantum memory, and can be proven by a unified method. Our results recover some well known entropic uncertainty relations for two observables, which show the uncertainties about the outcomes of two incompatible measurements. The bounds of those relations which quantify the extent of the uncertainty take concise forms and are easy to calculate. Those uncertainty relations might play important roles in the foundations of quantum theory. Potential experimental demonstration of those entropic uncertainty relations is discussed.

  4. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  5. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
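    For a concrete flavor of the estimation strategies mentioned, a minimal sketch of the common GUM-style combination of independent standard uncertainties by quadrature, followed by an expanded uncertainty with coverage factor k = 2 (all component values are illustrative assumptions):

```python
import math

# Illustrative, assumed-independent standard uncertainty components (mg/L).
components = {
    "repeatability": 0.8,
    "calibration":   0.5,
    "volume":        0.3,
}

u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined standard uncertainty
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 % coverage)
print(f"u_c = {u_c:.2f} mg/L, U = {U:.2f} mg/L")
```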

  6. Large-uncertainty intelligent states for angular momentum and angle

    Microsoft Academic Search

    Jörg B. Götte; Roberta Zambrini; Sonja Franke-Arnold; Stephen M. Barnett

    2005-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In ...

  7. Reformulating the Quantum Uncertainty Relation

    E-print Network

    Jun-Li Li; Cong-Feng Qiao

    2015-02-23

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropic form, related to entropic quantities. Both forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the "triviality" problem, i.e., they are the optimal ones. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in $N$-dimensional Hilbert space.

  8. Conservation law for Uncertainty relations and quantum correlations

    E-print Network

    Zhihao Ma; Shengjun Wu; Zhihua Chen

    2014-09-01

    The uncertainty principle, a fundamental principle in quantum physics, has been studied intensively via various uncertainty inequalities. Here we derive an uncertainty equality in terms of linear entropy, and show that the sum of the uncertainties in complementary local bases is equal to a fixed quantity. We also introduce a measure of correlation in a bipartite state, and show that the sum of the correlations revealed in a full set of complementary bases is equal to the total correlation in the bipartite state. The surprisingly simple equality relations we obtain imply that the study of the uncertainty principle and correlations can rely on the use of linear entropy, a simple quantity that is very convenient for calculation.
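    The flavor of such an equality is easy to check in the simplest case. For a single qubit, the linear entropies of the measurement outcomes summed over the three mutually unbiased (complementary) bases equal 3/2 - |r|^2/2, where r is the Bloch vector, hence exactly 1 for every pure state. A small numerical check follows (a toy single-system illustration, not the paper's bipartite result):

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_entropy(probs):
    return 1.0 - np.sum(probs ** 2)

# Eigenbases of the Pauli X, Y, Z operators (columns are basis vectors).
bases = [
    np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2),    # X
    np.array([[1, 1], [1j, -1j]], dtype=complex) / np.sqrt(2),  # Y
    np.eye(2, dtype=complex),                                   # Z
]

for _ in range(3):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)                       # random pure qubit state
    total = sum(linear_entropy(np.abs(B.conj().T @ psi) ** 2) for B in bases)
    print(f"sum of linear entropies = {total:.6f}")  # 1.000000 every time
```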

  9. Measuring Uncertainty

    NSDL National Science Digital Library

    Moore, P.G.

    This article, authored by P.G. Moore for the Royal Statistical Society's website, provides well-defined exercises to assess the probabilities of decision-making and the degree of uncertainty. The author states the focus of the article as: "When analyzing situations which involve decisions to be made as between alternative courses of action under conditions of uncertainty, decision makers and their advisers are often called upon to assess judgmental probability distributions of quantities whose true values are unknown to them. How can this judgment be taught?" Moore provides five different exercises and even external reference for those interested in further study of the topic.

  10. Generalized Entropic Uncertainty Relations with Tsallis' Entropy

    NASA Technical Reports Server (NTRS)

    Portesi, M.; Plastino, A.

    1996-01-01

    A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.

  11. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  12. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  13. Bernoulli's Principle

    NSDL National Science Digital Library

    Paul G. Hewitt

    2004-09-01

    Many physics teachers have an unclear understanding of Bernoulli's principle, particularly when the principle is applied to aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it altogether. The following simplified treatment of the principle ignores most of the complexities of aerodynamics and hopefully will encourage teachers to bring Bernoulli back into the classroom.

  14. Entropic uncertainty relation in de Sitter space

    NASA Astrophysics Data System (ADS)

    Jia, Lijuan; Tian, Zehua; Jing, Jiliang

    2015-02-01

    The uncertainty principle restricts our ability to simultaneously predict the measurement outcomes of two incompatible observables of a quantum particle. However, this uncertainty can be reduced and quantified by a new Entropic Uncertainty Relation (EUR). Using the open quantum system approach, we explore how the nature of de Sitter space affects the EUR. When the quantum memory A freely falls in de Sitter space, we demonstrate that the entropic uncertainty acquires an increase resulting from a thermal bath with the Gibbons-Hawking temperature. For the static case, we find that the temperature, coming both from the intrinsic thermal nature of de Sitter space and from the Unruh effect associated with the proper acceleration of A, also affects the entropic uncertainty: the higher the temperature, the greater the uncertainty and the quicker the uncertainty reaches its maximal value. Finally, the possible mechanism behind this phenomenon is explored.

  15. Minimum uncertainty states of angular momentum and angular position

    Microsoft Academic Search

    David T. Pegg; Stephen M. Barnett; Roberta Zambrini; Sonja Franke-Arnold; Miles Padgett

    2005-01-01

    The states of linear momentum that satisfy the equality in the Heisenberg uncertainty principle for position and momentum, that is the intelligent states, are also the states that minimize the uncertainty product for position and momentum. The corresponding uncertainty relation for angular momentum and angular position, however, is more complicated and the intelligent states need not be the constrained minimum ...

  16. Bellagio Principles

    NSDL National Science Digital Library

    The Bellagio Principles (available in text and as a RealAudio multimedia presentation) provide "guidelines for the practical assessment of progress towards sustainable development." These principles were developed by an international group of researchers in 1996.

  17. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
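    For readers unfamiliar with the Latin hypercube method discussed above, a minimal sketch of its construction (one stratified sample per equal-probability bin in each input dimension; illustrative, not the report's implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Points in [0, 1)^n_dims with exactly one sample per 1/n stratum per dimension."""
    strata = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(strata[:, j])   # decouple the strata pairings across dimensions
    return strata

rng = np.random.default_rng(1)
x = latin_hypercube(10, 3, rng)     # 10 samples of 3 input parameters
print(x.round(2))
```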

  18. Angular minimum uncertainty states with large uncertainties

    Microsoft Academic Search

    Jörg B Götte; Paul M. Radmore; Roberta Zambrini; Stephen M. Barnett

    2006-01-01

    The uncertainty relation for angle and angular momentum has a lower bound which depends on the form of the state. Surprisingly, this lower bound can be very large. We derive the states which have the lowest possible uncertainty product for a given uncertainty in the angle or in the angular momentum. We show that, if the given angle uncertainty is ...

  19. Angular minimum uncertainty states with large uncertainties

    E-print Network

    Goette, J B; Radmore, P M; Zambrini, R; Barnett, Stephen M.; Goette, Joerg B.; Radmore, Paul M.; Zambrini, Roberta

    2005-01-01

    The uncertainty relation for angle and angular momentum has a lower bound which depends on the form of the state. Surprisingly, this lower bound can be very large. We derive the states which have the lowest possible uncertainty product for a given uncertainty in the angle or in the angular momentum. We show that, if the given angle uncertainty is close to its maximum value, the lowest possible uncertainty product tends to infinity.

  20. Angular minimum uncertainty states with large uncertainties

    E-print Network

    Joerg B. Goette; Paul M. Radmore; Roberta Zambrini; Stephen M. Barnett

    2006-04-27

    The uncertainty relation for angle and angular momentum has a lower bound which depends on the form of the state. Surprisingly, this lower bound can be very large. We derive the states which have the lowest possible uncertainty product for a given uncertainty in the angle or in the angular momentum. We show that, if the given angle uncertainty is close to its maximum value, the lowest possible uncertainty product tends to infinity.

  1. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  2. Large-uncertainty intelligent states for angular momentum and angle

    E-print Network

    Goette, J B; Franke-Arnold, S; Barnett, S M; Goette, Joerg B.; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M.

    2005-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding for the uncertainties of angle and angular momentum for the large-uncertainty intelligent states we compare exact solutions with analytical approximations in two limiting cases.

  3. Large-uncertainty intelligent states for angular momentum and angle

    E-print Network

    Joerg B. Goette; Roberta Zambrini; Sonja Franke-Arnold; Stephen M. Barnett

    2005-10-20

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding for the uncertainties of angle and angular momentum for the large-uncertainty intelligent states we compare exact solutions with analytical approximations in two limiting cases.

  4. Error propagation and uncertainty in process modeling

    SciTech Connect

    Gardner, R.H.; Dale, V.H.; O'Neill, R.V.

    1988-01-01

    The principles and procedures for estimating uncertainties associated with model predictions have been extensively studied and applied to a broad spectrum of models. These studies have quantified relationships between data input, parameter estimation, and model results. Uncertainty analyses have also determined how additional data can improve the precision and reliability of predictions. The current challenge in applying error propagation techniques to forest process models lies with the development of a conceptual framework that can distinguish between: (1) effects due to stochastic variables such as weather; (2) the uncertainties associated with the measurement of detailed physiological processes; and (3) the variability in results caused by local stand-level phenomena.
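    A minimal sketch of the kind of error propagation described above: sampling both a stochastic driver and uncertain parameters, pushing them through a hypothetical process model, and reading the predictive spread off the output ensemble. The model form and all distributions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def growth_model(a, b, light):
    """Hypothetical process model: growth = a * light**b (illustrative only)."""
    return a * light ** b

n = 100_000
a = rng.normal(1.2, 0.10, n)          # parameter uncertainty (measurement error)
b = rng.normal(0.8, 0.05, n)
light = rng.lognormal(0.0, 0.3, n)    # stochastic driver, e.g. weather

y = growth_model(a, b, light)
print(f"prediction: {y.mean():.3f} +/- {y.std():.3f}")
```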

  5. Aspects of Complementarity and Uncertainty

    E-print Network

    Radhika Vathsan; Tabish Qureshi

    2014-07-23

    The two-slit experiment with quantum particles provides many insights into the behaviour of quantum mechanics, including Bohr's complementarity principle. Here we analyze Einstein's recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert-Greenberger-Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.

  6. Anti Heisenberg—Refutation Of Heisenberg's Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Barukčić, Ilija

    2011-03-01

    The quantum mechanical uncertainty principle for position and momentum plays an important role in many treatments of the (philosophical, physical and other) implications of quantum mechanics. Roughly speaking, the more precisely the momentum (position) of a (quantum mechanical) object is given, the less precisely one can say what its position (momentum) is. This quantum mechanical measurement problem is not just an interpretational difficulty; it raises broader issues as well. The measurement (of a property) of a (quantum mechanical) object determines the existence of the measured. In brief, the quantum mechanical uncertainty principle challenges some fundamental principles of science, and especially the principle of causality. In particular, an independently existing (external) objective reality is denied. As we shall see, the quantum mechanical uncertainty principle for position and momentum is based on the assumption that 1 = 0, which is a logical contradiction.

  7. Meaning of delayed choice experiment and quantum uncertainty

    E-print Network

    Zinkoo Yun

    2014-04-22

    By slightly modifying the delayed-choice experiment, it is argued that the quantum wave function must be interpreted as a real physical entity. With this interpretation in mind, multiple least-action paths due to uncertainty lead us to a new perspective on the Compton wavelength and the uncertainty principle itself.

  8. Approaches to evaluate the virtual instrumentation measurement uncertainties

    Microsoft Academic Search

    Salvatore Nuccio; Ciro Spataro

    2002-01-01

    This paper deals with the metrological characterization of virtual instruments. After a brief description of the features, the components and the working principle of the virtual instruments, the various uncertainty sources are analyzed. Then, two methods to evaluate the uncertainty of the measurement results are presented: a numerical method simulating the physical process of the A/D conversion, and an ...

  9. Approaches to evaluate the virtual instrumentation measurement uncertainties

    Microsoft Academic Search

    S. Nuccio; C. Spataro

    2001-01-01

    The paper deals with the metrological characterization of virtual instruments. After a brief description of the features, the constitution and the working principle of the virtual instruments, the various uncertainty sources are analyzed. Then two methods to evaluate the uncertainty of the measurement results are presented: a numerical method simulating the physical process of the A/D conversion, and an approximated ...

  10. Large-uncertainty intelligent states for angular momentum and angle

    Microsoft Academic Search

    Roberta Zambrini; Sonja Franke-Arnold; Stephen M. Barnett

    2006-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In ...

  11. Rocket Principles

    NSDL National Science Digital Library

    On this site from the NASA Glenn Research Center Learning Technologies Project, the science and history of rocketry is explained. Visitors will find out how rocket principles illustrate Newton's Laws of Motion. There is a second page of this site, Practical Rocketry, which discusses the workings of rockets, including propellants, engine thrust control, stability and control systems, and mass.

  12. Bernoulli's Principle

    NSDL National Science Digital Library

    Michael Horton

    2009-05-30

    In this lab, students will use a little background information about Bernoulli's principle to figure out how the spinning of a moving ball affects its trajectory. The activity is inquiry-based in that students discover this relationship on their own.

  13. Mechanics of Uncertainty: Managing Uncertainty in Mechanics

    Microsoft Academic Search

    Roger G. Ghanem

    Uncertainty is ubiquitous in the natural, engineered, and social environments. Devising rationales for explaining it, strategies for its integration into scientific determinism, and means of mitigating its consequences has been an active arena of rational endeavor where many scientific concepts have taken turns at fame and infamy. Far from being a static concept, uncertainty is the complement of knowledge, and as ...

  14. Reproducibility and uncertainty of wastewater turbidity measurements.

    PubMed

    Joannis, C; Ruban, G; Gromaire, M-C; Chebbo, G; Bertrand-Krajewski, J-L; Joannis, C; Ruban, G

    2008-01-01

    Turbidity monitoring is a valuable tool for operating sewer systems, but turbidity is often considered a somewhat tricky parameter for assessing water quality, because measured values depend on the model of sensor and even on the operator. This paper details the main components of the uncertainty in turbidity measurements, with a special focus on reproducibility, and provides guidelines for improving the reproducibility of measurements in wastewater through proper calibration procedures. Calibration appears to be the main source of uncertainty, and proper procedures must account for uncertainties in standard solutions as well as the non-linearity of the calibration curve. With such procedures, the uncertainty and reproducibility of field measurements can be kept below 5% or 25 FAU. On the other hand, reproducibility has no meaning if different measuring principles (attenuation vs. nephelometry) or very different wavelengths are used. PMID:18520026

  15. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
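    The Monte Carlo approach proposed above can be sketched compactly: perturb the flow record with an assumed discharge (rating) uncertainty, recompute the signature for each realization, and summarize the spread. Everything below (synthetic record, 10% multiplicative error treated as independent in time) is an illustrative assumption, not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(7)
q_obs = rng.lognormal(mean=1.0, sigma=0.8, size=3650)   # synthetic daily flows (m^3/s)

def q95(q):
    """Low-flow signature: flow exceeded 95% of the time (5th percentile)."""
    return np.percentile(q, 5)

samples = []
for _ in range(2000):
    err = rng.normal(1.0, 0.1, size=q_obs.size)  # assumed 10% discharge uncertainty;
    samples.append(q95(q_obs * err))             # time-correlation ignored for brevity

lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"Q95 with 95% uncertainty interval: [{lo:.3f}, {hi:.3f}] m^3/s")
```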

  16. Radar principles

    NASA Technical Reports Server (NTRS)

    Sato, Toru

    1989-01-01

    Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.

  17. Language Use Uncertainty

    E-print Network

    Löwe, Benedikt

    [Slide excerpt; recoverable headings only: Uncertainty and Probabilistic Models; Questions; Premature Optimism; Reactions (reaction of hardcore "computer scientists").]

  18. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although it is comparable to the uncertainty arising from some individual properties.

  19. Direct Aerosol Forcing Uncertainty

    SciTech Connect

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although it is comparable to the uncertainty arising from some individual properties.
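
    The uncertainty bookkeeping described in these two records, per-property contribution = sensitivity × typical measurement uncertainty, then a total, can be sketched as follows. The root-sum-square combination assumes independent error sources, and every numerical value here is a placeholder, not a result from the study.

```python
import math

# Illustrative sensitivities dDRF/dx (W m^-2 per unit of each property) and
# typical measurement uncertainties; values are placeholders, not ARM data.
terms = {
    "aerosol optical depth":    (-25.0, 0.01),
    "single scattering albedo": (-60.0, 0.015),
    "asymmetry parameter":      ( 15.0, 0.02),
    "surface albedo":           ( 10.0, 0.02),
}

contributions = {k: s * u for k, (s, u) in terms.items()}
total = math.sqrt(sum(c**2 for c in contributions.values()))  # quadrature (assumed independent)

for k, c in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"{k:26s} {abs(c):.2f} W m^-2")
print(f"{'total (root-sum-square)':26s} {total:.2f} W m^-2")
```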

  20. Universal uncertainty relations.

    PubMed

    Friedland, Shmuel; Gheorghiu, Vlad; Gour, Gilad

    2013-12-01

    Uncertainty relations are a distinctive characteristic of quantum theory that impose intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring noncommuting observables. However, there is no fundamental reason for using entropies as quantifiers; any functional relation that characterizes the uncertainty of the measurement outcomes defines an uncertainty relation. Starting from a very reasonable assumption of invariance under mere relabeling of the measurement outcomes, we show that Schur-concave functions are the most general uncertainty quantifiers. We then discover a fine-grained uncertainty relation that is given in terms of the majorization order between two probability vectors, significantly extending a majorization-based uncertainty relation first introduced in M. H. Partovi, Phys. Rev. A 84, 052117 (2011). Such a vector-type uncertainty relation generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary uncertainty quantifiers. Our relation is therefore universal and captures the essence of uncertainty in quantum theory. PMID:24476234
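
    The majorization order at the heart of this record is easy to state operationally: p majorizes q when every partial sum of p's sorted-descending entries dominates the corresponding partial sum for q. A small self-contained check (function name and examples are ours):

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """Return True if probability vector p majorizes q (p is 'less uncertain')."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    k = max(p.size, q.size)
    p = np.pad(p, (0, k - p.size))   # pad with zeros to a common length
    q = np.pad(q, (0, k - q.size))
    return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

# The uniform distribution is majorized by every distribution of the same size.
print(majorizes([0.7, 0.2, 0.1], [1/3, 1/3, 1/3]))   # True
print(majorizes([1/3, 1/3, 1/3], [0.7, 0.2, 0.1]))   # False
```

    Because Schur-concave functions are by definition monotone non-increasing under this order, applying any such quantifier to a majorization bound immediately yields a scalar uncertainty relation, which is the mechanism the abstract describes.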

  1. Uncertainties in Gapped Graphene

    E-print Network

    Eylee Jung; Kwang S. Kim; DaeKil Park

    2012-03-20

    Motivated by graphene-based quantum computing, we examine the time dependence of the position-momentum and position-velocity uncertainties in monolayer gapped graphene. The effect of the energy gap on the uncertainties is shown to appear via the Compton-like wavelength $\\lambda_c$. The uncertainties in graphene are mainly governed by two phenomena, spreading and zitterbewegung. While the former determines the uncertainties at long times, the latter imposes rapid oscillations on the uncertainties at short times. The uncertainties in graphene are compared with the corresponding values for the usual free Hamiltonian $\\hat{H}_{free} = (p_1^2 + p_2^2) / 2 M$. It is shown that the uncertainties can be kept under control within the quantum-mechanical laws if one can choose the gap parameter $\\lambda_c$ freely.

  2. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing body of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  3. The Precautionary Principle in Environmental Science

    Microsoft Academic Search

    David Kriebel; Joel Tickner; Paul Epstein; John Lemons; Richard Levins; Edward L. Loechler; Margaret Quinn; Ruthann Rudel; Ted Schettler; Michael Stoto

    Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of

  4. Information-Disturbance theorem and Uncertainty Relation

    E-print Network

    Miyadera, Takayuki

    2007-01-01

    It has been shown that the Information-Disturbance theorem can play an important role in the security proofs of quantum cryptography. The theorem is interesting in itself, since it can be regarded as an information-theoretic version of the uncertainty principle. Until now, however, it has been applicable only to restricted situations. In this paper, the restriction on the source is abandoned, and a general information-disturbance theorem is obtained. The theorem relates the information gained by Eve to the information gained by Bob.

  5. Information-Disturbance theorem and Uncertainty Relation

    E-print Network

    Takayuki Miyadera; Hideki Imai

    2007-07-31

    It has been shown that the Information-Disturbance theorem can play an important role in the security proofs of quantum cryptography. The theorem is interesting in itself, since it can be regarded as an information-theoretic version of the uncertainty principle. Until now, however, it has been applicable only to restricted situations. In this paper, the restriction on the source is abandoned, and a general information-disturbance theorem is obtained. The theorem relates the information gained by Eve to the information gained by Bob.

  6. Role of the precautionary principle in water recycling

    Microsoft Academic Search

    A. I. Schäfer; S. Beder

    2006-01-01

    In an engineering context the precautionary principle is often perceived as an excuse to do nothing or a substantial barrier to technical progress. The precautionary principle requires that remedial measures be taken in situations of scientific uncertainty where evidence of harm cannot be proven but potential damage to human or environmental health is significant. In this paper the scope of

  7. Relevance of the precautionary principle in water recycling

    Microsoft Academic Search

    A. I. Schäfer; S. Beder

    2006-01-01

    In an engineering context the precautionary principle is often perceived as an excuse to do nothing or a substantial barrier to technical progress. The precautionary principle requires that remedial measures be taken in situations of scientific uncertainty where evidence of harm cannot be proven but potential damage to human or environmental health is significant. In this paper the scope of

  8. Uncertainty in bulk-liquid hydrodynamics and biofilm dynamics creates uncertainties in biofilm reactor design.

    PubMed

    Boltz, J P; Daigger, G T

    2010-01-01

    While biofilm reactors may be classified as one of seven different types, the design of each is unified by fundamental biofilm principles. It follows that state-of-the-art design of each biofilm reactor type is subject to the same uncertainties (although the degree of uncertainty may vary). This paper describes unifying biofilm principles and uncertainties of importance in biofilm reactor design. This approach to biofilm reactor design represents a shift from the historical approach, which was based on empirical criteria and design formulations. The use of such design criteria was largely due to inherent uncertainty over reactor-scale hydrodynamics and biofilm dynamics, which correlate with biofilm thickness, structure and function. An understanding of two fundamental concepts is required to rationally design biofilm reactors: bioreactor hydrodynamics and biofilm dynamics (with particular emphasis on mass transfer resistances). Bulk-liquid hydrodynamics influences biofilm thickness control, surface area, and development. Biofilm dynamics influences biofilm thickness, structure and function. While the complex hydrodynamics of some biofilm reactors such as trickling filters and biological filters have prevented the widespread use of fundamental biofilm principles and mechanistic models in practice, reactors utilizing integrated fixed-film activated sludge or moving bed technology provide a bulk-liquid hydrodynamic environment allowing for their application. From a substrate transformation perspective, mass transfer in biofilm reactors defines the primary difference between suspended growth and biofilm systems: suspended growth systems are kinetically (i.e., biomass) limited and biofilm reactors are primarily diffusion (i.e., biofilm growth surface area) limited. PMID:20107256

  9. UNCERTAINTY IN DATA INTEGRATION

    Microsoft Academic Search

    Alon Halevy

    Data integration has been an important area of research for several years. In this chapter, we argue that supporting modern data integration applications requires systems to handle uncertainty at every step of integration. We provide a formal framework for data integration systems with uncertainty. We define probabilistic schema mappings and probabilistic mediated schemas, show how they can be constructed automatically

  10. 3 Component PIV Uncertainty

    NASA Astrophysics Data System (ADS)

    Warner, Scott; Smith, Barton

    2013-11-01

    The random uncertainty of 2-component (2C) Particle Image Velocimetry (PIV) has recently been addressed by three distinct methods: the Uncertainty Surface Method (USM) from Utah State University, the Image Matching (IM) method from LaVision and Delft, and the correlation Signal-to-Noise Ratio (SNR) method from Virginia Tech. Since 3C (stereo) PIV velocity fields are derived from two 2C fields, random uncertainties from the 2C fields clearly propagate into the 3C field. In this work, we demonstrate such a propagation using commercial PIV software and the USM method, although the propagation works similarly for any 2C random-uncertainty method. Stereo calibration information is needed to perform this propagation. As a starting point, a pair of 2C uncertainty fields is combined in exactly the same manner as velocity fields to form a 3C uncertainty field using commercial software. Correlated uncertainties between the components of the two 2C fields are addressed. These results are then compared to a more rigorous propagation, which requires access to the calibration information. Thanks to the Nuclear Science & Technology Directorate at Idaho National Laboratory. The work was supported through the U.S. Department of Energy, Laboratory Directed Research & Development grant under DOE Contract 122440 (Project Number: 12-045).
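
    As a sketch of the propagation step this record describes: if a reconstructed stereo component is, to first order, a linear combination of the two cameras' displacement components with coefficients supplied by the stereo calibration, the 2C random uncertainties combine by standard variance propagation. The coefficients, correlation, and field values below are hypothetical.

```python
import numpy as np

def propagate_stereo(a, b, u1, u2, rho=0.0):
    """
    Variance propagation for w = a*d1 + b*d2, where a and b come from the
    stereo calibration and u1, u2 are the cameras' 2C random-uncertainty
    fields. rho is the assumed correlation between the two cameras' errors.
    """
    var = (a * u1)**2 + (b * u2)**2 + 2.0 * rho * a * b * u1 * u2
    return np.sqrt(var)

# Hypothetical per-vector uncertainty fields (pixels) and calibration coefficients.
u1 = np.full((64, 64), 0.08)
u2 = np.full((64, 64), 0.10)
u_w = propagate_stereo(a=0.9, b=-0.9, u1=u1, u2=u2, rho=0.2)
print(u_w[0, 0])
```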

  11. Quantitative Robust Uncertainty Principles and Optimally Sparse Decompositions

    E-print Network

    Soatto, Stefano

    [Garbled excerpt: the recoverable fragment concerns a function f supported on a set T whose discrete Fourier transform f̂ is supported on a set Ω; the remainder is acknowledgment text (National Science Foundation grants DMS 01-40698 (FRG) and ACI-0204932 (ITR); an Alfred P. Sloan Fellowship).]

  12. Fuzzy dimensions and Planck's uncertainty principle for p-branes

    NASA Astrophysics Data System (ADS)

    Aurilia, Antonio; Ansoldi, Stefano; Spallucci, Euro

    2002-06-01

    The explicit form of the quantum propagator of a bosonic p-brane, previously obtained by the authors in the quenched-minisuperspace approximation, suggests the possibility of a novel, unified description of p-branes with different dimensionality. The background metric that emerges in this framework is a quadratic form on a Clifford manifold. Substitution of the Lorentzian metric with the Clifford line element has two far-reaching consequences. On the one hand, it changes the very structure of the spacetime fabric since the new metric is built out of a minimum length below which it is impossible to resolve the distance between two points; on the other hand, the introduction of the Clifford line element extends the usual relativity of motion to the case of relative dimensionalism of all p-branes that make up the spacetime manifold near the Planck scale.

  13. Heisenberg uncertainty principle and economic analogues of basic physical quantities

    E-print Network

    Soloviev, Vladimir

    2011-01-01

    From the standpoint attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates, and economic mass are proposed on the basis of time-series analysis, and the concept of an economic Planck constant is introduced. The theory is tested on real economic time series, including stock indices, Forex and spot prices; the results achieved are open for discussion.

  14. Heisenberg uncertainty principle and economic analogues of basic physical quantities

    E-print Network

    Vladimir Soloviev; Vladimir Saptsin

    2011-11-10

    From the standpoint attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates, and economic mass are proposed on the basis of time-series analysis, and the concept of an economic Planck constant is introduced. The theory is tested on real economic time series, including stock indices, Forex and spot prices; the results achieved are open for discussion.

  15. Heisenberg uncertainty principle and economic analogues of basic physical quantities

    Microsoft Academic Search

    Vladimir Soloviev; Vladimir Saptsin

    2011-01-01

    From the standpoint attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates, and economic mass are proposed on the basis of time-series analysis, together with the concept of

  16. Phase-space noncommutative formulation of Ozawa's uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

    2014-08-01

    Ozawa's measurement-disturbance relation is generalized to a phase-space noncommutative extension of quantum mechanics. It is shown that the measurement-disturbance relations have additional terms for backaction evading quadrature amplifiers and for noiseless quadrature transducers. Several distinctive features appear as a consequence of the noncommutative extension: measurement interactions which are noiseless, and observables which are undisturbed by a measurement, or of independent intervention in ordinary quantum mechanics, may acquire noise, become disturbed by the measurement, or no longer be an independent intervention in noncommutative quantum mechanics. It is also found that there can be states which violate Ozawa's universal noise-disturbance trade-off relation, but verify its noncommutative deformation.

  17. Annals of Mathematics, 165 (2007), 143 An uncertainty principle

    E-print Network

    Granville, Andrew

    2007-01-01

    the limitations to the equidistribution of interesting "arithmetic sequences" in arithmetic progressions: there exist $N \le x$ and an arithmetic progression $a \pmod{q}$ with $q \le x$ such that $\sum_{n \in A,\; n \le N,\; n \equiv a \,(q)} 1 \;-\; \frac{1}{q} \sum_{n \in A,\; n \le N} 1$ is [...] an arithmetic progression in which the number of elements of A is a little different from the average. Following

  18. UNCERTAINTY PRINCIPLE, NON-SQUEEZING THEOREM AND THE SYMPLECTIC RIGIDITY

    E-print Network

    Oh, Yong-Geun

    the energy $E(x,\dot{x}) := \tfrac{1}{2} m \dot{x}^2 - F(x) = \tfrac{1}{2} p^2 - F(q)$ is conserved along each trajectory of (1.4). With the function H = E [...] the trajectory of (1.5). There could be other types of conserved quantities depending on the type of symmetry; for a rotationally symmetric system, the angular momentum will be conserved. In general, each symmetry of the mechanical system gives rise

  19. Principles of Cyberwarfare

    Microsoft Academic Search

    Raymond C. Parks; David P. Duggan

    2011-01-01

    This paper proposes some principles of cyberwarfare. The principles of warfare are well documented, but they are not always applicable to cyberwarfare. Differences between cyberspace and the real world suggest some additional principles. This is not intended to be a comprehensive listing of such principles, but rather suggestions leading toward discussion and dialogue. The current candidate list of principles of cyberwarfare

  20. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  1. Computation of LFT Uncertainty Bounds with Repeated Parametric Uncertainties

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

    A new methodology is proposed in which linear fractional transformation uncertainty bounds are directly constructed for use in robust control design and analysis. Existence conditions for model-validating solutions with or without repeated scalar uncertainty are given. The approach is based on a minimax formulation to deal with multiple non-repeated structured uncertainty components subject to fixed levels of repeated scalar uncertainties. Input directional dependence and variation across experiments are addressed by maximizing uncertainty levels over multiple experimental data sets. Preliminary results show that reasonable uncertainty bounds on structured non-repeated uncertainties can be identified directly from measurement data by assuming reasonable levels of repeated scalar uncertainties.

  2. Analysis of Infiltration Uncertainty

    SciTech Connect

    MCCURLEY,RONALD D.; HO,CLIFFORD K.; WILSON,MICHAEL L.; HEVESI,JOSEPH A.

    2000-10-30

    In a total-system performance assessment (TSPA), uncertainty in the performance measure (e.g., radiation dose) is estimated by first estimating the uncertainty in the input variables and then propagating that uncertainty through the model system by means of Monte Carlo simulation. This paper discusses uncertainty in surface infiltration, which is one of the input variables needed for performance assessments of the Yucca Mountain site. Infiltration has been represented in recent TSPA simulations by using three discrete infiltration maps (i.e., spatial distributions of infiltration) for each climate state in the calculation of unsaturated-zone flow and transport. A detailed uncertainty analysis of infiltration was carried out for two purposes: to better quantify the possible range of infiltration, and to determine what probability weights should be assigned to the three infiltration cases in a TSPA simulation. The remainder of this paper presents the approach and methodology for the uncertainty analysis, along with a discussion of the results.
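
    A minimal sketch of how discrete infiltration cases enter a TSPA-style Monte Carlo simulation once probability weights have been assigned: each realization draws one of the three maps by its weight. The case values and weights below are illustrative, not the analysis results.

```python
import numpy as np

rng = np.random.default_rng(42)

# Three discrete infiltration cases (mm/yr, illustrative) and the probability
# weights an uncertainty analysis might assign them for use in a TSPA run.
cases   = np.array([4.6, 12.2, 27.1])     # low / middle / high infiltration maps
weights = np.array([0.17, 0.48, 0.35])    # assumed probability weights

# Each Monte Carlo realization samples one infiltration case by its weight;
# downstream flow-and-transport would then be run with the sampled map.
n = 10_000
sampled = rng.choice(cases, size=n, p=weights)
print("mean infiltration over realizations:", sampled.mean())
```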

  3. Agricultural Impacts: Robust uncertainty

    NASA Astrophysics Data System (ADS)

    Rötter, Reimund P.

    2014-04-01

    An up-to-date synthesis of climate change impacts on crop yields shows that the bounds of uncertainty are increasing. So why do estimates of the effect of climate change on crop productivity differ so much?

  4. Evaluating prediction uncertainty

    SciTech Connect

    McKay, M.D. [Los Alamos National Lab., NM (United States)

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
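
    A compact sketch of the two ingredients named above: a Latin hypercube sample of the inputs, and a variance-ratio importance indicator Var(E[Y|X_i])/Var(Y) estimated by binning, which does not assume a linear input-output relation. The toy model and all settings are ours, not the report's.

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n, d, rng):
    """One stratified (Latin hypercube) sample of n points in [0,1]^d."""
    strata = np.tile(np.arange(n), (d, 1))
    return (rng.permuted(strata, axis=1).T + rng.random((n, d))) / n

def model(x):
    # Toy model: output dominated by the first input.
    return 4.0 * x[:, 0] + 0.5 * x[:, 1] + 0.1 * x[:, 2]

n, d = 2000, 3
x = latin_hypercube(n, d, rng)
y = model(x)

def importance(xi, y, bins=20):
    """Estimate Var(E[Y|X_i]) / Var(Y) by binning X_i on its quantiles."""
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, xi, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    return np.average((cond_means - y.mean())**2, weights=counts) / y.var()

for i in range(d):
    print(f"input {i}: variance ratio ~ {importance(x[:, i], y):.2f}")
```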

  5. Dasymetric Modeling and Uncertainty

    PubMed Central

    Nagle, Nicholas N.; Buttenfield, Barbara P.; Leyk, Stefan; Speilman, Seth

    2014-01-01

    Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not yet been fully addressed. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology - the Penalized Maximum Entropy Dasymetric Model (P-MEDM) - is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that includes household-level survey data combined with higher spatial resolution data such as from census tracts, block groups, and land cover classifications. PMID:25067846

  6. Dasymetric Modeling and Uncertainty.

    PubMed

    Nagle, Nicholas N; Buttenfield, Barbara P; Leyk, Stefan; Speilman, Seth

    2014-01-01

    Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not yet been fully addressed. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology - the Penalized Maximum Entropy Dasymetric Model (P-MEDM) - is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that includes household-level survey data combined with higher spatial resolution data such as from census tracts, block groups, and land cover classifications. PMID:25067846
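
    The P-MEDM itself involves a penalized maximum-entropy fit; the sketch below shows only the underlying dasymetric step both records build on: reallocating a zone's population across pixels in proportion to an ancillary layer. The dwelling-density weights are an assumed relationship, not data from the study.

```python
import numpy as np

# One source zone with a known population, covered by 6 target pixels.
zone_pop = 1200.0

# Ancillary layer: e.g., dwelling density per pixel (assumed relationship).
ancillary = np.array([0.0, 3.0, 5.0, 2.0, 0.0, 10.0])

# Plain dasymetric reallocation: population proportional to ancillary weight.
weights = ancillary / ancillary.sum()
pixel_pop = zone_pop * weights
print(pixel_pop)          # pixels with zero dwellings receive zero population
print(pixel_pop.sum())    # mass is preserved: 1200.0
```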

  7. Picture independent quantum action principle

    SciTech Connect

    Mantke, W.J.

    1992-01-01

    The Schwinger action principle for quantum mechanics is extended into a picture independent form. This displays the quantum connection. Time variations are formulated as variations of a time variable and included into the kinematical variations. Kets and bras represent experimental operations. Experimental operations at different times cannot be identified. The ket and the bra spaces are fiber bundles over time. The same applies to the classical configuration space. For the classical action principle the action can be varied by changing the path or the classical variables. The latter variation of classical functions corresponds to kinematical variations of quantum variables. The picture independent formulation represents time evolution by a connection. A standard experiment is represented by a ket, a connection and a bra. For particular start and end times of experiments, the action and the contraction into a transition amplitude are elements of a new tensor space of quantum correspondents of path functionals. The classical correspondent of the transition amplitude is the probability for a specified state to evolve along a particular path segment. The elements of the dual tensor space represent standard experiments or superpositions thereof. The kinematical variations of the quantum variables are commuting numbers. Variations that include the effect of Poincare or gauge transformations have different commutator properties. The Schwinger action principle is derived from the Feynman path integral formulation. The limitations from the time-energy uncertainty relation might be accommodated by superposing experiments that differ in their start- and end-times. In its picture independent form the action principle can be applied to all superpositions of standard experiments. This may involve superpositions of different connections. The extension of the superposition principle to connections allows representation of a quantum field by a part of the connection.

  8. Mobility Reduces Uncertainty in MANETs

    Microsoft Academic Search

    Feng Li; Jie Wu

    2007-01-01

    Evaluating and quantifying trust stimulates collaboration in mobile ad hoc networks (MANETs). Many existing reputation systems sharply divide the trust value into right or wrong, thus ignoring another core dimension of trust: uncertainty. As uncertainty deeply impacts a node's anticipation of others' behavior and decisions during interaction, we include uncertainty in the reputation system. Specifically, we use an uncertainty

  9. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary-layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty-analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, whose magnitude is emphasized in the high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
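
    One simple way to realize "dispersing aerodynamic data within asymmetric uncertainty bounds", offered as a hedged sketch rather than the paper's method, is a split-normal draw: two Gaussian half-widths, one per side of the nominal value. All names and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_asymmetric(center, u_minus, u_plus, size):
    """
    Draw from a split-normal distribution: the left and right halves are
    Gaussian with different scales, so dispersed values respect asymmetric
    uncertainty bounds (u_minus below the center, u_plus above it).
    """
    side = rng.random(size) < u_minus / (u_minus + u_plus)  # pick a half
    z = np.abs(rng.standard_normal(size))
    return np.where(side, center - u_minus * z, center + u_plus * z)

# E.g., a pitching-moment coefficient near a shock: tighter below, looser above.
cm = sample_asymmetric(center=-0.012, u_minus=0.002, u_plus=0.008, size=5)
print(cm)
```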

  10. New approach to nonperturbative quantum mechanics with minimal length uncertainty

    NASA Astrophysics Data System (ADS)

    Pedram, Pouria

    2012-01-01

    The existence of a minimal measurable length is a common feature of various approaches to quantum gravity such as string theory, loop quantum gravity, and black-hole physics. In this scenario, all commutation relations are modified and the Heisenberg uncertainty principle is changed to the so-called Generalized (Gravitational) Uncertainty Principle (GUP). Here, we present a one-dimensional nonperturbative approach to quantum mechanics with a minimal-length uncertainty relation which implies $X = x$ to all orders and $P = p + \frac{1}{3}\beta p^3$ to first order in the GUP parameter $\beta$, where $X$ and $P$ are the generalized position and momentum operators and $[x,p] = i\hbar$. We show that this formalism is an equivalent representation of the seminal proposal by Kempf, Mangano, and Mann and predicts the same physics. However, this proposal reveals many significant aspects of the generalized uncertainty principle in a simple and comprehensive form, and the existence of a maximal canonical momentum is manifest through this representation. The problems of the free particle and the harmonic oscillator are exactly solved in this GUP framework and the effects of GUP on the thermodynamics of these systems are also presented. Although $X$, $P$, and the Hamiltonian of the harmonic oscillator all are formally self-adjoint, a careful study of the domains of these operators shows that only the momentum operator remains self-adjoint in the presence of the minimal length uncertainty. We finally discuss the difficulties with the definition of potentials with infinitely sharp boundaries.
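
    As a quick consistency check (ours, not the paper's derivation): using only $[x,p]=i\hbar$ and the identity $[x,p^n]=i\hbar\,n\,p^{n-1}$, the representation above reproduces the familiar minimal-length commutator to first order in $\beta$.

```latex
% Check that X = x, P = p + (beta/3) p^3 gives a GUP-type commutator.
\begin{aligned}
[X,P] &= [x,\,p] + \tfrac{\beta}{3}\,[x,\,p^{3}] \\
      &= i\hbar + \tfrac{\beta}{3}\,(3\,i\hbar\,p^{2})
         && \text{since } [x,p^{n}] = i\hbar\,n\,p^{\,n-1} \\
      &= i\hbar\,(1 + \beta p^{2}),
\end{aligned}
% i.e. the Kempf-Mangano-Mann minimal-length commutator, to first order in beta.
```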

  11. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul, E-mail: paul.busch@york.ac.uk [Department of Mathematics, University of York, York (United Kingdom)]; Lahti, Pekka, E-mail: pekka.lahti@utu.fi [Turku Centre for Quantum Physics, Department of Physics and Astronomy, University of Turku, FI-20014 Turku (Finland)]; Werner, Reinhard F., E-mail: reinhard.werner@itp.uni-hannover.de [Institut für Theoretische Physik, Leibniz Universität, Hannover (Germany)]

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  12. Serenity in political uncertainty.

    PubMed

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly equal percentages of males and females (53.2% vs 46.8%), most of whom lived with their family (92.5%) and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being were resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, the Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  13. Equivalence of wave-particle duality to entropic uncertainty.

    PubMed

    Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie

    2014-01-01

    Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter. PMID:25524138

  14. Uncertainty relation for photons.

    PubMed

    Bialynicki-Birula, Iwo; Bialynicka-Birula, Zofia

    2012-04-01

    The uncertainty relation for the photons in three dimensions that overcomes the difficulties caused by the nonexistence of the photon position operator is derived in quantum electrodynamics. The photon energy density plays the role of the probability density in configuration space. It is shown that the measure of the spatial extension based on the energy distribution in space leads to an inequality that is a natural counterpart of the standard Heisenberg relation. The equation satisfied by the photon wave function in momentum space which saturates the uncertainty relations has the form of the Schrödinger equation in coordinate space in the presence of electric and magnetic charges. PMID:22540772

  15. Treatment of Data Uncertainties

    SciTech Connect

    Larson, N.M. [Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6171 (United States)

    2005-05-24

    The generation and use of data covariance matrices are discussed within the context of the analysis of neutron-induced cross-section data via the R-matrix code SAMMY. Two complementary approaches are described, the first involving mathematical manipulation of Bayes' equations and the second utilizing computer simulations. A new procedure for propagating uncertainties on unvaried parameters will allow the effect of all relevant experimental uncertainties to be reflected in the analysis results, without placing excessive additional burden on the analyst. Implementation of this procedure within SAMMY is described and illustrated through the simulations.

  16. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement of many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter*, with the former able to maintain a proper characterization of the uncertainty for up to *ten times as long* as the latter. The filter correction step also furnishes a statistically rigorous *prediction error* which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
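
    For context on the distance metric mentioned above: the classical (Gaussian) form it generalizes is the squared Mahalanobis distance gated against a chi-square quantile. A minimal sketch with made-up numbers, not the paper's generalized metric:

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_gate(x, mean, cov, p=0.999):
    """
    Classical association test: accept the report if the squared Mahalanobis
    distance of the residual (x - mean) falls inside the chi^2 gate. The
    record's new PDF class generalizes this metric beyond the Gaussian case.
    """
    r = x - mean
    d2 = float(r @ np.linalg.solve(cov, r))
    return d2, d2 <= chi2.ppf(p, df=mean.size)

# Hypothetical 3D position residual (km) against a predicted state.
mean = np.array([7000.0, 0.0, 0.0])
cov  = np.diag([4.0, 1.0, 1.0])
d2, ok = mahalanobis_gate(np.array([7003.0, 0.5, -0.4]), mean, cov)
print(f"d^2 = {d2:.2f}, associate = {ok}")
```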

  17. Uncertainties in the Measurement of the Momentum and Position of an Electron

    E-print Network

    Kirk T. McDonald

    2003-12-03

    It has been suggested that the uncertainty in the measurement of a particle's momentum could be made arbitrarily small by observing the particle at two ends of an arbitrarily long flight path. However, consideration of the nature of the detection process shows that the usual limits of the uncertainty principle hold independent of the length of the flight path.

  18. Multiresolutional models of uncertainty generation and reduction

    NASA Technical Reports Server (NTRS)

    Meystel, A.

    1989-01-01

    Kolmogorov's axiomatic principles of probability theory are reconsidered with respect to their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified in order to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies algorithms of learning which are organized in three groups reflecting the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by, and consistent with, the multiresolutional model of knowledge representation which is reflected in the structure of models and the algorithms of learning.

  19. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is an NTE (not-to-exceed), MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE (e.g., nominal), then MGA values are reduced by the Actual MGA (A5) values, and A5 represents the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
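
    The mass identities in this record reduce to a few lines of arithmetic; the sketch below simply restates them executably. The 18% growth allowance and subsystem masses are illustrative, not values from any approved MGA schedule.

```python
# Minimal sketch of the mass bookkeeping described above; all numbers assumed.
basic_mass = 412.0          # kg, engineering estimate with no embedded margin
mga_fraction = 0.18         # assessed % from the approved MGA depletion schedule

mga_mass = basic_mass * mga_fraction          # MGA Mass = Basic Mass * assessed %
predicted_mass = basic_mass + mga_mass        # Predicted Mass = Basic + MGA

# Aggregate MGA % over several subsystems follows the same identity.
basics = [412.0, 96.5, 230.0]
predicted = [b * (1.0 + mga_fraction) for b in basics]
aggregate_mga = (sum(predicted) - sum(basics)) / sum(basics)

print(f"predicted mass = {predicted_mass:.1f} kg, aggregate MGA = {aggregate_mga:.0%}")
```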

  20. Comment on "Uncertainty in measurements of distance"

    E-print Network

    Y. Jack Ng; H. van Dam

    2002-09-06

    We have argued that quantum mechanics and general relativity give a lower bound $\\delta l \\gtrsim l^{1/3} l_P^{2/3}$ on the measurement uncertainty of any distance $l$ much greater than the Planck length $l_P$. Recently Baez and Olson have claimed that one can go below this bound by attaching the measuring device to a massive elastic rod. Here we refute their claim. We also reiterate (and invite our critics to ponder on) the intimate relationship and consistency between black hole physics (including the holographic principle) and our bound on distance measurements.

  1. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    PubMed Central

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail. PMID:26118488

  2. Identity Uncertainty Stuart Russell

    E-print Network

    Russell, Stuart

    We are often uncertain about the identity of objects [...] probabilistic approach to reasoning about identity under uncertainty in the framework of first

  3. Generalized uncertainty relations

    Microsoft Academic Search

    Burcu Elif Akten

    1999-01-01

    The Heisenberg uncertainty relation has been put into a stronger form by Schrödinger and Robertson. This inequality is also canonically invariant. We ask if there are other independent inequalities for higher orders. The aim is to find a systematic way for writing these inequalities. After an overview of the Heisenberg and Schrödinger-Robertson inequalities and their minimal states in Chapter 1,

  4. MRST partons and uncertainties

    Microsoft Academic Search

    A. D. Martin; R. G. Roberts; W. J. Stirling; R. S. Thorne

    2003-01-01

    We discuss uncertainties in the extraction of parton distributions from global analyses of DIS and related data. We present conservative sets of partons, at both NLO and NNLO, which are stable to x,Q^2,W^2 cuts on the data. We give the corresponding values of alpha(M_Z^2) and the cross sections for W production at the Tevatron.

  5. Classification images with uncertainty

    Microsoft Academic Search

    Bosco S. Tjan; Anirvan S. Nandy

    [...] pools the outputs of a large set of linear frontends with a max operator. We show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not

  6. Inflation Uncertainty, Output Growth Uncertainty and Macroeconomic Performance: Comparing Alternative

    E-print Network

    Paris-Sud XI, Université de

    [...] severe financial and economic crisis, accompanied by inflation and exchange rate instability [...] Eastern [...] and Inflation targeting). The task of our study is to compare econometrically the performance of these two

  7. Principles of administration revisited

    Microsoft Academic Search

    John Donaldson; Irene Fafaliou

    2007-01-01

    Purpose – To explore the assumptions underlying the traditional "principles of administration" in the light of the rise of interest in corporate social responsibility, business ethics and corporate governance, and to link revised principles to practical stakeholder models, using, for example, modern communications media. Design/methodology/approach – Using concepts of "fit" between traditional administrative principles and common problems of business administration,

  8. Principles of Modern Soccer.

    ERIC Educational Resources Information Center

    Beim, George

    This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…

  9. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  10. Principles of plasma diagnostics

    Microsoft Academic Search

    Ian H. Hutchinson

    1987-01-01

    Principles of Plasma Diagnostics provides a detailed derivation and discussion of the plasma physics principles on which diagnostics are based, including magnetic measurements, electric probes, refractive index, radiation emission and scattering, and ionic processes. The text is based on a first-principles development of the required concepts and includes examples of diagnostics in action taken from fusion research.

  11. Predicting System Performance with Uncertainty

    E-print Network

    Yan, B.; Malkawi, A.

    2012-01-01

    The main purpose of this research is to include the uncertainty that lies in the modeling process and that arises from input values when predicting system performance, and to incorporate uncertainty related to system controls in a computationally...

  12. Uncertainties in parton distribution functions

    Microsoft Academic Search

    A. D. Martin; R. G. Roberts; W. J. Stirling; R. S. Thorne

    2000-01-01

    We discuss the uncertainty in the predictions for hard scattering cross sections at hadron colliders due to uncertainties in the input parton distributions, using W production at the LHC as an example.

  13. Visualizing Java uncertainty

    NASA Astrophysics Data System (ADS)

    Knight, Claire; Munro, Malcolm

    2001-07-01

    Distributed component-based systems seem to be the immediate future for software development. The use of such techniques and object-oriented languages, combined with ever more powerful higher-level frameworks, has led to the rapid creation and deployment of such systems to cater for the demand of internet- and service-driven business systems. This diversity of solution, through both the components utilised and the physical/virtual locations of those components, can provide powerful solutions to the new demand. The problem lies in the comprehension and maintenance of such systems, because they then have inherent uncertainty. The components combined at any given time for a solution may differ, the messages generated, sent, and/or received may differ, and the physical/virtual locations cannot be guaranteed. Trying to account for this uncertainty and to build it into analysis and comprehension tools is important for both development and maintenance activities.

  14. MRST partons and uncertainties

    Microsoft Academic Search

    A. D. Martin; R. G. Roberts; W. J. Stirling; R. S. Thorne

    2003-01-01

    We discuss uncertainties in the extraction of parton distributions from global analyses of DIS and related data. We present conservative sets of partons, at both NLO and NNLO, which are stable to x,Q^2,W^2 cuts on the data. We give the corresponding values of alpha(M_Z^2) and the cross sections for W production at the Tevatron.

  15. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2oC to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  16. Implementing the Precautionary Principle: Incorporating Science, Technology, Fairness, and Accountability in Environmental, Health and Safety Decisions

    E-print Network

    Ashford, Nicholas

    2005-01-01

    The precautionary principle is in sharp political focus today because (1) the nature of scientific uncertainty is changing and (2) there is increasing pressure to base governmental action on allegedly more "rational" ...

  17. Participatory Development Principles and Practice: Reflections of a Western Development Worker.

    ERIC Educational Resources Information Center

    Keough, Noel

    1998-01-01

    Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

  18. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
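
    To make the gap-filling idea concrete, the sketch below fills missing entries of a toy trait matrix by low-rank factorization and uses the spread over an ensemble of random-restart fits as a stand-in for predictive uncertainty. This is a simplification, not the authors' BHPMF (no taxonomic hierarchy, no Gibbs sampling); all names and parameters are hypothetical.

        import numpy as np

        def factorize(X, mask, k=3, iters=400, lam=0.1, lr=0.02, seed=0):
            """Gradient-descent matrix factorization fit on observed entries only."""
            rng = np.random.default_rng(seed)
            n, m = X.shape
            U = 0.1 * rng.standard_normal((n, k))
            V = 0.1 * rng.standard_normal((m, k))
            for _ in range(iters):
                R = (U @ V.T - X) * mask              # residuals on observed cells
                U, V = U - lr * (R @ V + lam * U), V - lr * (R.T @ U + lam * V)
            return U @ V.T

        rng = np.random.default_rng(42)
        X = rng.standard_normal((30, 5)) @ rng.standard_normal((5, 8))  # toy traits
        mask = rng.random(X.shape) > 0.4                # observe ~60% of entries
        Xobs = np.where(mask, X, 0.0)

        # Ensemble of fits: the per-entry spread is a crude proxy for the
        # posterior uncertainty that BHPMF obtains from MCMC sampling.
        fits = np.stack([factorize(Xobs, mask, seed=s) for s in range(20)])
        pred_mean, pred_std = fits.mean(axis=0), fits.std(axis=0)
        print("mean abs error on missing entries:", np.abs(pred_mean - X)[~mask].mean())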

  19. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  20. Calculating efficiencies and their uncertainties

    SciTech Connect

    Paterno, Marc; /Fermilab

    2004-12-01

    The commonly used methods for the calculation of the statistical uncertainties in cut efficiencies (“Poisson” and “binomial” errors) are both defective, as is seen in extreme cases. A method for the calculation of uncertainties based upon Bayes' Theorem is presented; this method has no problem with extreme cases. A program for the calculation of such uncertainties is also available.
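
    A minimal sketch of the Bayesian approach: with a flat prior, k passing events out of n give a Beta(k+1, n-k+1) posterior for the efficiency, from which a shortest credible interval is well defined even in the extreme cases (k = 0 or k = n) where the binomial formula returns zero uncertainty. The helper below is illustrative, not the Fermilab program mentioned in the record.

        import numpy as np
        from scipy import stats

        def efficiency_interval(k, n, cl=0.683):
            """Mode and shortest credible interval for a cut efficiency,
            from the flat-prior posterior Beta(k+1, n-k+1)."""
            post = stats.beta(k + 1, n - k + 1)
            starts = np.linspace(0.0, 1.0 - cl, 2001)
            lows, highs = post.ppf(starts), post.ppf(starts + cl)
            i = np.argmin(highs - lows)             # pick the shortest interval
            return k / n, lows[i], highs[i]

        # Extreme case where the naive binomial error is zero:
        print(efficiency_interval(10, 10))          # mode 1.0, interval ~(0.90, 1.00)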

  1. Entropic uncertainty relations - A survey

    E-print Network

    Stephanie Wehner; Andreas Winter

    2009-07-21

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, rather little is known about entropic uncertainty relations with more than two measurement settings. In this note we review known results and open questions.
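
    The canonical example of such a relation, standard in this literature though not quoted in the abstract, is the Maassen-Uffink bound for two observables with eigenbases {|x>} and {|z>}:

        H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c^2},
        \qquad c = \max_{x,z} \bigl|\langle x \mid z \rangle\bigr|,

    where H denotes the Shannon entropy of the outcome distributions; for two mutually unbiased bases in dimension d the bound becomes H(X) + H(Z) >= log2(d).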

  2. Tevatron Measurements and PDF Uncertainties

    SciTech Connect

    Chlebana, Frank [Fermi National Accelerator Lab, P.O. Box 500, Batavia IL, 60150 (United States)

    2005-10-06

    The impact of PDF uncertainties on recent Tevatron measurements is explored. One of the most poorly constrained PDFs is the gluon distribution which is seen to be the dominant source of uncertainty for many interesting calculations. Tevatron measurements that can be used to better constrain PDFs are highlighted. Recent techniques to quantify the error on measured distributions resulting from PDF uncertainties are discussed.

  3. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms “risk assessment” and “risk management” are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of “... the most significant data and uncertainties ...” in an assessment. Significant data and uncertainties are “... those that define and explain the main risk conclusions”. Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  4. Uncertainty relations as Hilbert space geometry

    NASA Technical Reports Server (NTRS)

    Braunstein, Samuel L.

    1994-01-01

    Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position, and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable as well as merely what is allowed.
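
    The 'direct bounds' from classical multiple-sampling statistics that the abstract alludes to are usually written, in their standard Cramér-Rao form (not quoted from the paper), as a limit on the precision of any estimator of a parameter theta after N repetitions:

        \delta\theta \;\ge\; \frac{1}{\sqrt{N\,F(\theta)}},
        \qquad
        F(\theta) = \int dx\; p(x \mid \theta)
        \left( \frac{\partial \ln p(x \mid \theta)}{\partial \theta} \right)^{2},

    so the achievable uncertainty is set by the Fisher information F of the actual outcome distribution, which can be strictly more restrictive than an HUP bound built from a 'natural' observable.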

  5. Direct tests of measurement uncertainty relations: what it takes.

    PubMed

    Busch, Paul; Stevens, Neil

    2015-02-20

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation, are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables. PMID:25763941

  6. Direct tests of measurement uncertainty relations: what it takes

    E-print Network

    Paul Busch; Neil Stevens

    2015-01-17

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that in nearly 90 years there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance), and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables.

  7. Uncertainty of data obtained in SRF cavity vertical test

    E-print Network

    He, Feisi

    2013-01-01

    Vertical testing is a commonly used experimental method to qualify Superconducting Radio Frequency (SRF) cavities. Taking the experience at Jefferson Lab (JLab) in the US as an example, over a thousand vertical tests have been performed on over 500 different cavities to date [1]. Most of the tests at JLab followed the method described in [1], but the uncertainties of the calculated quality factors as well as the gradients were inaccurate due to the wrong algorithm being used. In this paper, a first-principle method was applied to analyze the uncertainty of the data, and the results were compared with those in [1] under typical experimental conditions.

  8. Uncertainty Quantification in Lattice QCD Calculations for Nuclear Physics

    E-print Network

    Silas R. Beane; William Detmold; Kostas Orginos; Martin J. Savage

    2014-10-11

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  9. Model uncertainty--parameter uncertainty versus conceptual models.

    PubMed

    Højberg, A L; Refsgaard, J C

    2005-01-01

    Uncertainties in model structures have often been recognised to be the main source of uncertainty in predictive model simulations. Despite this knowledge, uncertainty studies are traditionally limited to a single deterministic model, with uncertainty addressed by a parameter uncertainty study. The extent to which a parameter uncertainty study may encompass model structure errors in a groundwater model is studied in a case study. Three groundwater models were constructed on the basis of three different hydrogeological interpretations. Each of the models was calibrated inversely against groundwater heads and streamflows. A parameter uncertainty analysis was carried out for each of the three conceptual models by Monte Carlo simulations. A comparison of the predictive uncertainties for the three conceptual models showed large differences between the uncertainty intervals. Most discrepancies were observed for data types not used in the model calibration. Thus uncertainties in the conceptual models become increasingly important when predictive simulations consider data types that are extrapolated from the data types used for calibration. PMID:16304950
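
    A schematic of the comparison, with three hypothetical structural forms standing in for the hydrogeological interpretations: each model gets its own parameter Monte Carlo, and the predictive intervals are compared at a point outside the calibration range. All functional forms and numbers below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        # Three hypothetical "conceptual models" of the same system, each a
        # different structural form with its own calibrated parameter priors.
        models = {
            "layered":   lambda k, x: k[0] * x + k[1],
            "lens":      lambda k, x: k[0] * np.sqrt(x) + k[1],
            "fractured": lambda k, x: k[0] * np.log1p(x) + k[1],
        }
        priors = {name: (np.array([1.0, 0.5]), np.array([0.2, 0.1]))
                  for name in models}          # (mean, sd) per parameter

        x_pred = 8.0                           # prediction point beyond calibration
        for name, f in models.items():
            mu, sd = priors[name]
            k = rng.normal(mu, sd, size=(5000, 2))      # parameter Monte Carlo
            y = f(k.T, x_pred)
            lo, hi = np.percentile(y, [2.5, 97.5])
            print(f"{name:10s} 95% predictive interval: [{lo:.2f}, {hi:.2f}]")

    Within each structure the interval reflects only parameter uncertainty; the offsets between the three intervals are the structural component that a single-model study never sees.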

  10. Uncertainty Estimates for Millennial Scale Geomagnetic Field Models

    NASA Astrophysics Data System (ADS)

    Korte, M.; Constable, C. G.; Donadini, F.

    2008-12-01

    Continuous geomagnetic field models spanning several millennia have recently been developed using various selections of archeo- and paleomagnetic data and their inferred ages. In each case the geographic and temporal distribution of available data is far from uniform, and both the magnetic data and ages have large uncertainties. We estimate error bars for both the models and their predictions using two statistical resampling techniques and a combination thereof. First, we used what we call the spatial and temporal (ST) bootstrap, yielding different spatial and temporal distributions taken randomly from the original dataset. Second, we kept the original (temporal and spatial) distribution of data, but varied each datum randomly within the expected distributions of uncertainty in both the magnetic observation and assigned ages. We call this the magnetic/age (MA) bootstrap. We produced a large number of models based on resampled data using each of the ST and MA bootstrap methods and then obtained standard deviations for both global model coefficients and predictions of field components. The ST and MA methods yield model uncertainties of the same order of magnitude. A sequential combination of MA and ST resampling takes into account the influence of uncertainties in both magnetic elements and ages as well as the unsatisfactory data distribution. We present global and regional results from this analysis and compare the uncertainties obtained from model predictions to the assigned data errors. The uncertainties obtained for magnetic field elements vary depending on whether they are obtained by error propagation from uncertainties in the model coefficients or by computing the standard error in the individual element predictions for all resampled models. The propagated uncertainties do not currently allow for covariance among the coefficients. Hence, they can be too large in some geographic regions and time intervals with good data coverage. Individual element uncertainty predictions incorporate any such covariance automatically, and can in principle better accommodate regional variations in model accuracy.
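
    The two resampling schemes can be sketched on a toy time series: the ST bootstrap resamples which data exist, the MA bootstrap perturbs each datum within its magnetic and age error bars, and the standard deviation of the refit model coefficients estimates the model uncertainty. The data, the model (a polynomial standing in for spherical harmonics), and the error sizes here are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy archeo/paleomagnetic series: ages, field values, uncertainties.
        t = np.sort(rng.uniform(-3000.0, 1900.0, 400))            # years
        y = 40.0 + 5.0 * np.sin(2 * np.pi * t / 2500.0) + rng.normal(0, 2, t.size)
        sig_y, sig_t = 2.0, 50.0

        def fit(t, y, deg=6):
            tt = (t + 550.0) / 2450.0         # rescale ages to roughly [-1, 1]
            return np.polynomial.polynomial.polyfit(tt, y, deg)

        def st_bootstrap():                   # resample the data distribution
            i = rng.integers(0, t.size, t.size)
            return fit(t[i], y[i])

        def ma_bootstrap():                   # perturb data within error bars
            return fit(t + rng.normal(0, sig_t, t.size),
                       y + rng.normal(0, sig_y, t.size))

        coef_std_st = np.std([st_bootstrap() for _ in range(200)], axis=0)
        coef_std_ma = np.std([ma_bootstrap() for _ in range(200)], axis=0)
        print("ST vs MA coefficient std:", coef_std_st[:3], coef_std_ma[:3])

    Running MA on each ST-resampled dataset, rather than separately, gives the sequential combination described in the record.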

  11. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, a multi-valued data set represents both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice is presented vertically above the slice and forms a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary for multimodal distributions. The uncertainty of the multi-valued data can also be interpreted by the number of peaks and the widths of the peaks as shown by the PDF walls.
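
    The core computation behind a PDF wall, sketched for one slice: a smooth kernel density estimate at each grid cell, with roughness taken as the number of peaks. The bimodal toy data and all parameters are hypothetical.

        import numpy as np
        from scipy.stats import gaussian_kde
        from scipy.signal import find_peaks

        rng = np.random.default_rng(3)

        # Multi-valued data: 200 possible outcomes at each of 20 cells in a slice.
        cells = [np.concatenate([rng.normal(m, 1.0, 100),
                                 rng.normal(m + 4.0, 1.0, 100)])   # bimodal
                 for m in np.linspace(0.0, 5.0, 20)]

        grid = np.linspace(-5.0, 15.0, 400)
        wall, roughness = [], []
        for sample in cells:
            pdf = gaussian_kde(sample)(grid)   # smooth density estimate
            wall.append(pdf)                   # one wall profile per cell
            roughness.append(len(find_peaks(pdf)[0]))   # number of modes

        print("roughness per cell:", roughness)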

  12. Schwarzschild mass uncertainty

    NASA Astrophysics Data System (ADS)

    Davidson, Aharon; Yellin, Ben

    2014-02-01

    Applying Dirac's procedure to r-dependent constrained systems, we derive a reduced total Hamiltonian, resembling an upside down harmonic oscillator, which generates the Schwarzschild solution in the mini super-spacetime. Associated with the now r-dependent Schrödinger equation is a tower of localized Guth-Pi-Barton wave packets, orthonormal and non-singular, admitting equally spaced average-`energy' levels. Our approach is characterized by a universal quantum mechanical uncertainty structure which enters the game already at the flat spacetime level, and accompanies the massive Schwarzschild sector for any arbitrary mean mass. The average black hole horizon surface area is linearly quantized.

  13. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper [2] introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields [2]. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta": spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  14. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high-latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid- to low-latitude clean air. One reason is the lower temperatures, which result in increased imprecision in kinetic data, assumed to be best characterized at 298 K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.

  15. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  16. Disturbance trade-off principle for quantum measurements

    NASA Astrophysics Data System (ADS)

    Mandayam, Prabha; Srinivas, M. D.

    2014-12-01

    We demonstrate a fundamental principle of disturbance tradeoff for quantum measurements, along the lines of the celebrated uncertainty principle: The disturbances associated with measurements performed on distinct yet identically prepared ensembles of systems in a pure state cannot all be made arbitrarily small. Indeed, we show that the average of the disturbances associated with a set of projective measurements is strictly greater than zero whenever the associated observables do not have a common eigenvector. For such measurements, we show an equivalence between disturbance tradeoff measured in terms of fidelity and the entropic uncertainty tradeoff formulated in terms of the Tsallis entropy (T2). We also investigate the disturbances associated with the class of nonprojective measurements, where the difference between the disturbance tradeoff and the uncertainty tradeoff manifests quite clearly.

  17. Uncertainty in Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2006-12-01

    Uncertainty is a part of our life, and society has to deal with it, even though it is sometimes difficult to estimate. This is particularly true in seismic hazard assessment for large events, such as the mega-tsunami in Southeast Asia and the great New Madrid earthquakes in the central United States. There are two types of uncertainty in seismic hazard assessment: temporal and spatial. Temporal uncertainty describes the distribution of events in time and is estimated from the historical records, while spatial uncertainty describes the distribution of physical measurements generated at a specific point by the events and is estimated from the measurements at the point. These uncertainties are of different characteristics and are generally considered separately in hazard assessment. For example, temporal uncertainty (i.e., the probability of exceedance in a period) is considered separately from spatial uncertainty (a confidence level of physical measurement) in flood hazard assessment. Although estimating spatial uncertainty in seismic hazard assessment is difficult because there are not enough physical measurements (i.e., ground motions), it can be supplemented by numerical modeling. For example, the ground motion uncertainty or tsunami uncertainty at a point of interest has been estimated from numerical modeling. Estimating temporal uncertainty is particularly difficult, especially for large earthquakes, because there are not enough instrumental, historical, and geological records. Therefore, the temporal and spatial uncertainties in seismic hazard assessment are of different characteristics and should be determined separately. However, probabilistic seismic hazard analysis (PSHA), the most widely used method to assess seismic hazard for various aspects of public and financial policy, uses spatial uncertainty (ground motion uncertainty) to extrapolate temporal uncertainty (ground motion occurrence). This extrapolation, or so-called ergodic assumption, is caused by a mathematical error in the hazard calculation of PSHA: incorrectly equating the conditional exceedance probability of the ground-motion attenuation relationship (a function) to the exceedance probability of the ground-motion uncertainty (a variable). An alternative approach has been developed to correct the error and to determine temporal and spatial uncertainties separately.
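
    The distinction can be made concrete with two separate one-line calculations, using invented numbers: a Poisson occurrence model for the temporal uncertainty and a lognormal ground-motion residual for the spatial one, kept apart rather than folded into a single exceedance rate (the conflation the abstract calls the ergodic assumption).

        import math

        # Temporal uncertainty: event occurrence in time, here a Poisson model
        # with mean recurrence interval tau (years); numbers are invented.
        tau, t = 500.0, 50.0
        p_event = 1.0 - math.exp(-t / tau)      # P(at least one event in t years)

        # Spatial uncertainty: ground motion at a site given the event, here a
        # lognormal residual about a median attenuation prediction.
        pga_med, sigma_ln = 0.20, 0.6           # median PGA (g), ln-sigma
        pga_84 = pga_med * math.exp(sigma_ln)   # 84th-percentile ground motion

        print(f"P(event in {t:.0f} yr) = {p_event:.3f}; "
              f"median PGA {pga_med:.2f} g, 84th percentile {pga_84:.2f} g")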

  18. Conceptual Framework of the Precautionary Principle Approach in the Decision-making process

    Microsoft Academic Search

    Clare D'Souza; Mehdi Taghian; Rajiv Khosla

    This article reviews the precautionary principle as an approach in addressing decisions with far-reaching environmental consequences under scientific uncertainty. The precautionary principle is intended to assist with structuring environmentally risky decisions toward sustainable development. It responds to the lack of scientific evidence of a potential environmental problem. There is currently no framework to assist with the process indicating the ...

  19. Minimal length in quantum gravity, equivalence principle and holographic entropy bound

    Microsoft Academic Search

    Ahmed Farag Ali

    2011-01-01

    A possible discrepancy has been found between the results of a neutron interferometry experiment and quantum mechanics. This experiment suggests that the weak equivalence principle is violated at small length scales, which quantum mechanics cannot explain. In this paper, we investigated whether the generalized uncertainty principle (GUP), proposed by some approaches to quantum gravity such as string theory and doubly special relativity, ...

  20. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment following strong earthquakes, with worldwide systems applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking; and so on. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not beyond the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage to those buildings is far from precisely known. The paper analyzes how the reliability of expected loss estimates at regional and global scales is influenced by uncertainties in the determination of strong-event parameters by alert seismological surveys; in the simulation models used at all stages, from estimating shaking intensity to assessing damage to different elements at risk; and in the databases on elements at risk, such as population and building stock distribution and the characteristics of critical facilities.

  1. Principles of control thermodynamics

    Microsoft Academic Search

    P. Salamon; J. D. Nulton; G. Siragusa; T. R. Andersen; A. Limon

    2001-01-01

    The article presents a partial synthesis of progress in control thermodynamics by laying out the main results as a sequence of principles. We state and discuss nine general principles (0–8) for finding bounds on the effectiveness of energy conversion in finite-time.

  2. Government Information Policy Principles.

    ERIC Educational Resources Information Center

    Hernon, Peter

    1991-01-01

    Analyzes the utility of policy principles advanced by professional associations for public access to government information. The National Commission on Libraries and Information Science (NCLIS), the Information Industry Association (IIA), and the Office of Technology Assessment (OTA) urge the adoption of principles for the dissemination of public…

  3. Hamilton's Principle for Beginners

    ERIC Educational Resources Information Center

    Brun, J. L.

    2007-01-01

    I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…

  4. Pauli Exclusion Principle

    NSDL National Science Digital Library

    Dr. Rod Nave

    This tutorial provides instruction on Pauli's exclusion principle, formulated by physicist Wolfgang Pauli in 1925, which states that no two electrons in an atom can have identical quantum numbers. Topics include a mathematical statement of the principle, descriptions of some of its applications, and its role in ionic and covalent bonding, nuclear shell structure, and nuclear binding energy.

  5. Joint measurements of spin, operational locality and uncertainty

    E-print Network

    Erika Andersson; Stephen M. Barnett; Alain Aspect

    2005-09-21

    Joint, or simultaneous, measurements of non-commuting observables are possible within quantum mechanics, if one accepts an increase in the variances of the jointly measured observables. In this paper, we discuss joint measurements of a spin 1/2 particle along any two directions. Starting from an operational locality principle, it is shown how to obtain a bound on how sharp the joint measurement can be. We give a direct interpretation of this bound in terms of an uncertainty relation.
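
    For two spin-1/2 directions the resulting bound is usually stated as follows (standard form of the joint-measurability condition; the abstract itself does not display it): unsharp versions of the two spin observables, with effective Bloch vectors a and b scaled down by their unsharpness, admit a joint measurement iff

        \bigl|\,\mathbf{a} + \mathbf{b}\,\bigr| + \bigl|\,\mathbf{a} - \mathbf{b}\,\bigr| \;\le\; 2,

    so sharp measurements (unit-length vectors) along non-parallel directions are never jointly measurable, and the required blurring grows as the directions approach orthogonality.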

  6. Parts Orienting with Shape Uncertainty

    Microsoft Academic Search

    Srinivas Akella; Matthew T. Mason

    1998-01-01

    Parts manufactured to tolerances have shape variations. Most work in robotic manipulation assumes that part shape does not vary. Orienting devices such as bowl feeders frequently fail due to variations in part shape. In this paper we develop techniques to orient parts with shape uncertainty. We present a shape uncertainty model and describe the nondeterminism in parts orienting that arises from shape uncertainty.

  7. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation and non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.
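
    The abstract does not reproduce the equations, but the standard starting point for thin-film leakage calculations of this kind is the Reynolds lubrication equation; for a steady, incompressible film of thickness h(x, y), viscosity mu and sliding speed U it reads

        \frac{\partial}{\partial x}\!\left( h^{3}\,\frac{\partial p}{\partial x} \right)
        + \frac{\partial}{\partial y}\!\left( h^{3}\,\frac{\partial p}{\partial y} \right)
        = 6\,\mu\,U\,\frac{\partial h}{\partial x},

    and integrating the resulting pressure field p across the sealing gap gives the leakage flow.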

  8. MRST partons and uncertainties.

    E-print Network

    Martin, A D; Roberts, R G; Stirling, W James; Thorne, Robert S

    arXiv:hep-ph/0307262v1, 21 Jul 2003. IPPP/03/43, DCPT/03/86, Cavendish-HEP-2003/13. MRST partons and uncertainties, A.D. Martin, R.G. Roberts, W.J. Stirling and R.S. Thorne (IPPP, Durham; Cavendish Laboratory). [Front-matter and reference-list fragment; the references cited include D. Stump et al., Phys. Rev. D65 (2002) 014012; CTEQ Collaboration, J. Pumplin et al., Phys. Rev. D65 (2002) 014013; and A.D. Martin, R.G. Roberts, W.J. Stirling and R.S. Thorne, Eur. Phys. J. C28 (2003) 455.]

  9. Integrating out astrophysical uncertainties

    NASA Astrophysics Data System (ADS)

    Fox, Patrick J.; Liu, Jia; Weiner, Neal

    2011-05-01

    Underground searches for dark matter involve a complicated interplay of particle physics, nuclear physics, atomic physics, and astrophysics. We attempt to remove the uncertainties associated with astrophysics by developing the means to map the observed signal in one experiment directly into a predicted rate at another. We argue that it is possible to make experimental comparisons that are completely free of astrophysical uncertainties by focusing on integral quantities, such as g(v_min) = ∫_{v_min} dv f(v)/v and ∫_{v_thresh} dv v g(v). Direct comparisons are possible when the v_min space probed by different experiments overlaps. As examples, we consider the possible dark matter signals at CoGeNT, DAMA, and CRESST-Oxygen. We find that the expected rate from CoGeNT in the XENON10 experiment is higher than observed, unless scintillation light output is low. Moreover, we determine that S2-only analyses are constraining, unless the charge yields Q_y < 2.4 electrons/keV. For DAMA to be consistent with XENON10, we find for q_Na = 0.3 that the modulation rate must be extremely high (≳70% for m_χ = 7 GeV), while for higher quenching factors, it makes an explicit prediction (0.8-0.9 cpd/kg) for the modulation to be observed at CoGeNT. Finally, we find CDMS-Si, even with a 10 keV threshold, as well as XENON10, even with low scintillation, would have seen significant rates if the excess events at CRESST arise from elastic WIMP scattering, making it very unlikely to be the explanation of this anomaly.
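
    A small sketch of the halo-integral idea: given any assumed speed distribution f(v), the quantity g(v_min) is a single quadrature, and two experiments probing overlapping v_min ranges can be compared through g alone. The truncated Maxwellian and its parameters below are placeholders, not the paper's choices.

        import numpy as np
        from scipy import integrate

        v0, vesc = 220.0, 544.0                   # km/s (illustrative values)
        _norm = integrate.quad(lambda u: u**2 * np.exp(-(u / v0) ** 2), 0, vesc)[0]

        def f(v):
            """Truncated Maxwellian speed distribution, normalised on [0, vesc]."""
            return v**2 * np.exp(-(v / v0) ** 2) / _norm

        def g(vmin):
            """Halo integral: g(vmin) = integral from vmin to vesc of f(v)/v dv."""
            if vmin >= vesc:
                return 0.0
            return integrate.quad(lambda v: f(v) / v, vmin, vesc)[0]

        for vmin in (100.0, 300.0, 500.0):
            print(f"g({vmin:.0f} km/s) = {g(vmin):.3e} (km/s)^-1")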

  10. Medical decisions under uncertainty.

    PubMed

    Carmi, A

    1993-01-01

    The court applies the criteria of the reasonable doctor and common practice in order to consider the behaviour of a defendant physician. The meaning of our demand that the doctor expects that his or her acts or omissions will bring about certain implications is that, according to the present circumstances and subject to the limited knowledge of the common practice, the course of certain events or situations in the future may be assumed in spite of the fog of uncertainty which surrounds us. The miracles and wonders of creation are concealed from us, and we are not aware of the way and the nature of our bodily functioning. Therefore, there seems to be no way to avoid mistakes, because in several cases the correct diagnosis cannot be determined even with the most advanced application of all information available. Doctors find it difficult to admit that they grope in the dark. They wish to form clear and accurate diagnoses for their patients. The fact that their profession is faced with innumerable and unavoidable risks and mistakes is hard to swallow, and many of them claim that in their everyday work this does not happen. They should not content themselves by changing their style. A radical metamorphosis is needed. They should not be tempted to formulate their diagnoses in 'neutral' statements in order to be on the safe side. Uncertainty should be accepted and acknowledged by the profession and by the public at large as a human phenomenon, as an integral part of any human decision, and as a clear characteristic of any legal or medical diagnosis.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:8231694

  11. Linking ethical principles with community practice.

    PubMed

    Ross, M E

    1993-01-01

    Nurses must frequently make arduous decisions when faced with ethical dilemmas that occur in clinical practice. Utilizing ethical principles for analyzing and reflecting on the issues may ease this difficult task. In addition, the nurse involved may experience less anxiety and uncertainty over whether or not the correct decision was made. The purpose of this article is to present a case study that involves the allocation of a scarce resource that posed an ethical dilemma for a nursing instructor and her students in a community health setting. Toulmin's framework is utilized for analyzing this dilemma. PMID:8126529

  12. Structural Uncertainty

    E-print Network

    Jones, Peter JS

    The conventional method of quantifying model uncertainty involves ... [remainder of extraction fragment garbled; recoverable content: a discussion asking whether global emulation can be a useful tool in ocean uncertainty quantification (Chris Brierley and Serge Guillas, UCL), citing work on whether a slab ocean model can accurately give equilibrium climate sensitivity (J. Climate, 22, 2494-2499) and a comparison of perturbed physics with multi-model ensembles implying ocean model uncertainty has little global effect.]

  13. Uncertainty analysis of satellite structures

    Microsoft Academic Search

    G. S. Szekely; H. J. Pradlwarte; G. I. Schueller; E. Marchante

    2002-01-01

    This paper discusses the analysis of uncertainties in the structural analysis of spacecraft. Various sources for the discrepancies between predicted and measured response are identified. It is shown that only the effect of scatter of the structural parameters can be studied further by appropriate analysis tools. It is suggested to model uncertainties mathematically within the framework of stochastic analysis.

  14. Measuring uncertainties in MRP environments

    Microsoft Academic Search

    S. C. Koh; M. H. Jones; S. M. Saad; S. Arunachalam; A. Gunasekaran

    2000-01-01

    Many uncertainties in Material Requirements Planning (MRP) systems are treated as “controllable” elements, with a variety of buffering, dampening and other approaches being used to cope with them. However, such approaches are often found wanting, forcing enterprises into emergency measures to ensure delivery performance. Based upon the results of a questionnaire survey, this paper analyses the uncertainties in the aggregate, ...

  15. Designing for Uncertainty: Three Approaches

    ERIC Educational Resources Information Center

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  16. Planning ATES systems under uncertainty

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions form a complex adaptive system, for which agent-based modelling provides a useful analysis framework. This study therefore explores the interactions between endogenous ATES adoption processes and the relative performance of different planning schemes, using an agent-based adoption model coupled with a hydrologic model of the subsurface. The models are parameterized to simulate typical operating conditions for ATES systems in a dense urban area. Furthermore, uncertainties relating to planning parameters, adoption processes, and climatic conditions are explicitly considered using exploratory modelling techniques. Results are therefore presented for the performance of different planning policies over a broad range of plausible scenarios.
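
    The coupled structure described here can be caricatured in a few lines: an adoption loop in which the probability of new installations depends on observed performance, which in turn degrades as the subsurface fills up, with permitting strictness as a planning parameter. Everything below (functional forms, rates, the crowding proxy) is invented to show the feedback, not taken from the study's models.

        import numpy as np

        rng = np.random.default_rng(7)

        n_plots, years = 100, 25
        adopted = np.zeros(n_plots, dtype=bool)
        interference = 0.7      # performance loss per unit crowding (invented)
        permit_cap = 0.8        # planning parameter: max fraction permitted

        for year in range(years):
            crowding = adopted.mean()                     # used-space proxy
            performance = 1.0 - interference * crowding   # interference loss
            if crowding < permit_cap:                     # permits still granted
                p_adopt = 0.05 * max(performance, 0.0)
                new = (~adopted) & (rng.random(n_plots) < p_adopt)
                adopted |= new

        print(f"adoption after {years} years: {adopted.mean():.0%}, "
              f"final mean performance {performance:.2f}")

    Sweeping permit_cap and interference over ranges, and rerunning under many random seeds, is the flavour of the exploratory modelling the record describes.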

  17. The Principles of Accreditation

    E-print Network

    Boyce, Richard L.

    The Principles of Accreditation: Foundations for Quality Enhancement (Commission on Colleges). [Table-of-contents fragment covering the organization of the Commission and the Association, the process of accreditation, policies, and the representation of accreditation status.]

  18. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  19. Cancelling out systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Noreña, Jorge; Verde, Licia; Jimenez, Raul; Peña-Garay, Carlos; Gomez, Cesar

    2012-01-01

    We present a method to minimize, or even cancel out, the effect of the nuisance parameters affecting a measurement. Our approach is general and can be applied to any experiment or observation where systematic errors are a concern, e.g. larger than statistical errors. We compare it with marginalization, the Bayesian technique used to deal with nuisance parameters, and show how our method improves on it by avoiding biases. We illustrate the method with several examples taken from the astrophysics and cosmology world: baryonic acoustic oscillations (BAOs), cosmic clocks, Type Ia supernova (SNIa) luminosity distance, neutrino oscillations and dark matter detection. By applying the method we not only recover some known results but also find some interesting new ones. For BAO experiments we show how to combine radial and angular BAO measurements in order to completely eliminate the dependence on the sound horizon at radiation drag. In the case of exploiting SNIa as standard candles we show how the uncertainty induced in the luminosity distance by a second parameter, modelled as a metallicity dependence, can be eliminated or greatly reduced. When using cosmic clocks to measure the expansion rate of the universe, we demonstrate how a particular combination of observables nearly removes the dependence on galaxy metallicity when determining differential ages, thus removing the age-metallicity degeneracy in stellar populations. We hope that these findings will be useful in future surveys to obtain robust constraints on the dark energy equation of state.
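
    For the BAO example the cancellation can be written out explicitly (standard relations, with c the speed of light, not quoted from the paper): the radial and angular BAO scales at redshift z are

        \Delta z_{\rm BAO} = \frac{r_s\,H(z)}{c},
        \qquad
        \Delta\theta_{\rm BAO} = \frac{r_s}{(1+z)\,D_A(z)},
        \qquad
        \frac{\Delta z_{\rm BAO}}{\Delta\theta_{\rm BAO}}
        = \frac{(1+z)\,D_A(z)\,H(z)}{c},

    so the ratio measures the product D_A(z) H(z) with no dependence on the sound horizon at radiation drag r_s.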

  20. Archimedes' Principle and Applications Objectives

    E-print Network

    Yu, Jaehoon

    Lab 9: Archimedes' Principle and Applications. Objectives: Upon successful completion of this exercise you will have (1) utilized Archimedes' principle to determine the density and specific gravity of a variety of substances; (2) utilized Archimedes' principle to determine the density ...
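
    The calculation behind such an exercise (standard physics, not part of the record): weighing a sample in air and then submerged in water gives its specific gravity directly, because the buoyant force equals the weight of the displaced water:

        \mathrm{SG} \;=\; \frac{\rho_{\rm sample}}{\rho_{\rm water}}
        \;=\; \frac{W_{\rm air}}{W_{\rm air} - W_{\rm submerged}} .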

  1. Principles of Information Assurance

    NSDL National Science Digital Library

    This course on the Principles of Information Assurance is provided by the Cyber Security Education Consortium (CSEC). The course includes introductory security principles and gives students "an understanding of the current threats and vulnerabilities of the cyber landscape, plus other topics relating to the information assurance field." Links are provided to learn more about the Major Topics Covered, Course Learning Objectives, and Course Outline. The Course Outline includes a list of careers that require the knowledge from this course and related textbooks.

  2. Principles of Forecasting Project

    NSDL National Science Digital Library

    Directed by J. Scott Armstrong at the Wharton School of the University of Pennsylvania, the Principles of Forecasting Project seeks to "develop a comprehensive and structured review of the state of knowledge in the field of forecasting" in order to aid future research. The project will lead to a book entitled Principles of Forecasting: A Handbook for Researchers and Practitioners, and sample chapters, contact information, updates, and links to forecasting resources add value to this expanding compilation.

  3. Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation

    SciTech Connect

    Smith, D.L. [Argonne National Laboratory, 1710 Avenida Del Mundo 1506, Coronado, CA 92118 (United States)]; Otuka, N. [Nuclear Data Section, International Atomic Energy Agency, Wagramerstrasse 5, A-1400 Wien (Austria)]

    2012-12-15

    This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.

  4. Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation

    NASA Astrophysics Data System (ADS)

    Smith, D. L.; Otuka, N.

    2012-12-01

    This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.
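
    One of the simple principles such guidance covers, in sketch form with invented numbers: partial uncertainties combine in quadrature, and a systematic component shared between data points (a common normalisation, say) shows up as off-diagonal covariance.

        import numpy as np

        # Hypothetical measurement: statistical and systematic components,
        # with the systematic fully correlated between two data points.
        stat = np.array([0.03, 0.04])        # fractional statistical uncertainties
        syst = np.array([0.02, 0.02])        # fractional systematic (common source)

        total = np.sqrt(stat**2 + syst**2)   # per-point total, in quadrature

        # Covariance: diagonal statistical part + rank-one systematic part.
        cov = np.diag(stat**2) + np.outer(syst, syst)
        corr = cov / np.outer(total, total)
        print("total:", total)
        print("correlation matrix:\n", corr)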

  5. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however, a high-accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error, with maxima in the highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.
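
    The statistical step is compact enough to sketch: within one interrogation window, the mean of the particle-image disparity vectors flags a systematic error, while their dispersion, scaled by the number of matched pairs, gives the random part. The disparity values below are synthetic, and the combination rule is a plausible form for this family of methods rather than the paper's exact estimator.

        import numpy as np

        rng = np.random.default_rng(4)

        # Disparity vectors: residual distances between matched particle images
        # inside one interrogation window (hypothetical values, in pixels).
        disp = rng.normal(loc=0.05, scale=0.12, size=(40, 2))

        N = disp.shape[0]
        mean_d = disp.mean(axis=0)            # bias -> systematic error estimate
        std_d = disp.std(axis=0, ddof=1)      # scatter of individual disparities

        # Combined uncertainty of the window's displacement estimate: the
        # random part shrinks with the number of matched pairs.
        uncertainty = np.sqrt(mean_d**2 + (std_d / np.sqrt(N))**2)
        print("systematic:", mean_d, "random:", std_d / np.sqrt(N),
              "total:", uncertainty)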

  6. Uncertainties in 4πβ–γ coincidence counting

    NASA Astrophysics Data System (ADS)

    Fitzgerald, R.; Bailat, C.; Bobin, C.; Keightley, J. D.

    2015-06-01

    The 4πβ–γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and specific implementation of these methods have been well-documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill in these gaps, the present article shares best practices from a few practitioners of this craft. We explain and demonstrate with examples how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
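
    As a concrete illustration of combining such components, here is a minimal GUM-style sketch; the component names follow the abstract, but the numerical values and the correlation term are hypothetical:

      import numpy as np

      # Hypothetical relative standard uncertainties (%) of a massic-activity
      # result; names follow the abstract above, values are illustrative only.
      u = np.array([0.10,  # measurement variability (counting statistics)
                    0.15,  # efficiency-extrapolation function
                    0.05,  # dead-time and resolving-time effects
                    0.03,  # gravimetric link
                    0.08]) # nuclear and atomic data

      r = np.eye(len(u))        # correlation matrix of the components
      r[1, 4] = r[4, 1] = 0.3   # assumed correlation, purely for illustration

      u_combined = np.sqrt(u @ r @ u)  # quadrature sum including correlations
      print(f"combined relative uncertainty: {u_combined:.3f} %")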

  7. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie [Los Alamos National Laboratory]; Vostrotin, Vadim [SUBI]; Vvdensky, Vladimir [SUBI]

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35 including a measurement component estimated to be 0.2.
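
    A minimal sketch of how such a parameterization might be applied to a single measurement, combining the counting uncertainty with the lognormal normalization uncertainty in quadrature (the combination rule is an assumption of this sketch; the GSD range is taken from the abstract):

      import math

      def urine_measurement_uncertainty(u_count_rel, sigma_ln=0.33):
          # u_count_rel: relative standard uncertainty from count quantities.
          # sigma_ln: ln(GSD) of the normalization uncertainty; the abstract
          # reports 0.31-0.35, so 0.33 is used here as a midpoint.
          return math.sqrt(u_count_rel**2 + sigma_ln**2)

      print(urine_measurement_uncertainty(0.2))  # ~0.39 total relative uncertainty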

  8. Uncertainty in perception and the Hierarchical Gaussian Filter

    PubMed Central

    Mathys, Christoph D.; Lomakina, Ekaterina I.; Daunizeau, Jean; Iglesias, Sandra; Brodersen, Kay H.; Friston, Karl J.; Stephan, Klaas E.

    2014-01-01

    In its full sense, perception rests on an agent's model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the Hierarchical Gaussian Filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF's hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder–Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient—but at the same time intuitive—framework for the resolution of perceptual uncertainty in behaving agents. PMID:25477800
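
    The HGF's full update equations are beyond the scope of a listing like this, but their characteristic ingredient, a prediction error weighted by a ratio of precisions, can be illustrated with a one-level Gaussian update (a sketch of the general idea, not the HGF itself):

      def precision_weighted_update(mu, pi_prior, y, pi_obs):
          # Conjugate Gaussian belief update: the posterior mean moves toward
          # the observation by a precision-weighted prediction error.
          pi_post = pi_prior + pi_obs
          mu_post = mu + (pi_obs / pi_post) * (y - mu)
          return mu_post, pi_post

      mu, pi = 0.0, 1.0                  # prior belief about a hidden state
      for y in (0.9, 1.1, 0.8):          # noisy observations with precision 4
          mu, pi = precision_weighted_update(mu, pi, y, pi_obs=4.0)
      print(mu, pi)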

  10. Natural Communication about Uncertainties in Situated Interaction

    E-print Network

    Horvitz, Eric

    Presents methods for estimating and communicating about different uncertainties in situated interaction, and for coordinating the production of nonverbal and verbal behaviors to communicate the system's uncertainties (Tomislav Pejsa et al.).

  11. Predicting System Performance with Uncertainty 

    E-print Network

    Yan, B.; Malkawi, A.

    2012-01-01

    inexpensive way. We propose using Gaussian Processes for system performance predictions and explain the types of uncertainties included. As an example, we use a Gaussian Process to predict chilled water use and compare the results with Neural Network...
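
    A minimal sketch of this kind of prediction with scikit-learn; the chilled-water application is the paper's, but the data below are synthetic and the kernel choice is an assumption:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Synthetic stand-in for, e.g., (weather, load) -> chilled water use.
      rng = np.random.default_rng(1)
      X = rng.uniform(0, 10, size=(50, 1))
      y = np.sin(X).ravel() + rng.normal(0, 0.1, 50)

      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                    normalize_y=True)
      gp.fit(X, y)

      X_new = np.linspace(0, 10, 5).reshape(-1, 1)
      mean, std = gp.predict(X_new, return_std=True)  # prediction + uncertainty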

  12. Uncertainty in Climate Change Modeling

    NSDL National Science Digital Library

    Wisconsin ECB

    2010-11-30

    Learn why trout are important indicators in Wisconsin’s changing climate, and why the uncertainty of global climate models complicates predictions about their survival, in this video produced by the Wisconsin Educational Communications Board.

  13. 3, 2675-2706, 2006 Uncertainty in

    E-print Network

    Boyer, Edmond

    in relation to the Water Framework Directive (WFD). One of the key sources of uncertainty of importance for evaluating the effect and cost of a measure in relation to preparing a WFD-compliant river basin management

  14. The principle of material frame indifference and the covarianee principle

    Microsoft Academic Search

    L. J. T. M. Kempers

    1989-01-01

    Summary  A comparison between the formulations of the principle of material frame indifference in continuum mechanics, which principle refers to stress-strain relations without inertial forces, and the covariance principle in the theory of general relativity indicates that a relationship between them can be established. It is shown that the principle of material frame indifference follows from the covariance principle in the

  15. Non-scalar uncertainty: Uncertainty in dynamic systems

    NASA Technical Reports Server (NTRS)

    Martinez, Salvador Gutierrez

    1992-01-01

    The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology. Applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful for describing the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found to be wiser to get an approximate solution to an accurate model than a precise solution to a model constrained by simplifying assumptions. Precision has a very heavy cost in present physical models, but this formalism allows the trade between uncertainty and simplicity. It was found that modeling reality sometimes requires that state transition probabilities be manipulated as non-scalar quantities, finding at the end that there is always a transformation to get back to scalar probability.

  16. Uncertainty Quantification in Fluid Flow

    Microsoft Academic Search

    Habib N. Najm

    This chapter addresses the topic of uncertainty quantification in fluid flow computations. The relevance and utility of this pursuit are discussed, outlining highlights of available methodologies. Particular attention is focused on spectral polynomial chaos methods for uncertainty quantification that have seen significant development over the past two decades. The fundamental structure of these methods is presented, along with associated challenges.
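
    A minimal one-dimensional polynomial chaos sketch: project a nonlinear function of a standard normal input onto probabilists' Hermite polynomials by Gauss quadrature, then read the mean and variance off the coefficients (an illustration of the spectral idea, not a fluid-flow solver):

      import numpy as np
      from numpy.polynomial.hermite_e import hermegauss, hermeval
      from math import factorial

      f = lambda x: np.exp(0.3 * x)       # model output vs. xi ~ N(0, 1)

      order = 6
      nodes, weights = hermegauss(32)               # weight exp(-x^2/2)
      weights = weights / np.sqrt(2.0 * np.pi)      # normalize to N(0,1) pdf

      # Projection: c_k = E[f(xi) He_k(xi)] / k!
      coeffs = [np.sum(weights * f(nodes) * hermeval(nodes, [0]*k + [1]))
                / factorial(k) for k in range(order + 1)]

      mean = coeffs[0]
      var = sum(factorial(k) * coeffs[k]**2 for k in range(1, order + 1))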

  17. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.

  18. Basic Principles of Chromatography

    NASA Astrophysics Data System (ADS)

    Ismail, Baraem; Nielsen, S. Suzanne

    Chromatography has a great impact on all areas of analysis and, therefore, on the progress of science in general. Chromatography differs from other methods of separation in that a wide variety of materials, equipment, and techniques can be used. [Readers are referred to references (1-19) for general and specific information on chromatography.]. This chapter will focus on the principles of chromatography, mainly liquid chromatography (LC). Detailed principles and applications of gas chromatography (GC) will be discussed in Chap. 29. In view of its widespread use and applications, high-performance liquid chromatography (HPLC) will be discussed in a separate chapter (Chap. 28). The general principles of extraction are first described as a basis for understanding chromatography.

  19. The Box Principle Dragos Hrimiuc

    E-print Network

    Bowman,John C.

    The Box Principle Dragos Hrimiuc There are different versions of the Box Principle (or Pigeonhole Principle). Essentially it says: If n+1 balls are distributed in n boxes, then at least one box has more than one ball. More generally, if mn+1 balls are distributed in n boxes, then at least one box has more than m balls. This elementary principle, used first

  20. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  1. Clocks, computers, black holes, spacetime foam, and holographic principle

    E-print Network

    Y. Jack Ng

    2000-10-25

    What do simple clocks, simple computers, black holes, space-time foam, and holographic principle have in common? I will show that the physics behind them is inter-related, linking together our concepts of information, gravity, and quantum uncertainty. Thus, the physics that sets the limits to computation and clock precision also yields Hawking radiation of black holes and the holographic principle. Moreover, the latter two strongly imply that space-time undergoes much larger quantum fluctuations than what the folklore suggests --- large enough to be detected with modern gravitational-wave interferometers through future refinements.

  2. Mirror Principle I

    Microsoft Academic Search

    Bong H. Lian; Kefeng Liu; S. T. Yau

    1997-01-01

    We propose and study the following Mirror Principle: certain sequences of multiplicative equivariant characteristic classes on Kontsevich's stable map moduli spaces can be computed in terms of certain hypergeometric type classes. As applications, we compute the equivariant Euler classes of obstruction bundles induced by any concavex bundles -- including any direct sum of line bundles -- on $\mathbb{P}^n$. This includes

  3. RADAR PRINCIPLES I Introduction

    E-print Network

    Sato, Toru

    Radar is a general technique which has a wide range … bands. Antenna size of weather radars is a few to about ten meters in diameter, but atmospheric radars have antennas with diameters of 10-300 m. Weather radars cover a wide horizontal area

  4. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  5. Laboratory Safety Principles

    NSDL National Science Digital Library

    Jerry Staiger, Keith Carlson, Jim Laver, Ray Arntson (University of Minnesota)

    2008-04-11

    This workshop covers major principles and regulations pertinent to working in laboratories with hazardous materials. It is divided into 45 minute segments dealing with: Radioactive Materials (Staiger); Toxic, Reactive, Carcinogenic, and Teratogenic Chemicals (Carlson); Infectious Agents (Laver); and Fire Safety Concepts and Physical Hazards (Arnston).

  6. Matters of Principle.

    ERIC Educational Resources Information Center

    Martz, Carlton

    1999-01-01

    This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

  7. Principles for Professional Ethics.

    ERIC Educational Resources Information Center

    School Psychology Review, 1997

    1997-01-01

    Reviews principles based on assumptions that school psychologists will act as advocates for their clients and will do no harm. Includes sections on professional competency, relationships and responsibilities, and practice in public and private settings. Presents extensive information on procedural guidelines for adjudication of ethical complaints.…

  8. Using Lean Principles to …

    Microsoft Academic Search

    Sandra Barkman; Bryon S. Marks

    With contacts throughout the supply chain, who better than the Supply Chain professional to implement Lean principles? This paper explores Supply Chain Management's (SCM) strategic use of Lean as a business philosophy. Defining the value stream from raw material to finished product delivery, SCM links with multiple functions throughout the product life cycle. Using these contacts, the SCM function ideally

  9. Principles of Software Testing

    E-print Network

    Meyer, Bertrand

    Seven Principles of Software Testing, Bertrand Meyer, ETH Zürich and Eiffel Software. While everyone knows the theoretical limitations of software testing, in practice we devote considerable effort … Software testing is the process used to assess the quality of computer software; software testing is an empirical

  10. Principles of Applied Mathematics

    NSDL National Science Digital Library

    Kasimov, Aslan

    This course, presented by MIT and taught by professor Aslan Kasimov, describes basic principles of applied mathematics. Specifically, the material looks at mathematical analysis of continuum models of various natural phenomena. The course materials include student assignments and exams. MIT presents OpenCourseWare as free educational material online. No registration or enrollment is required to use the materials.

  11. Identifying Product Scaling Principles 

    E-print Network

    Perez, Angel 1986-

    2011-06-02

    design-by-analogy and bioinspired design. Despite various scaling laws for specific systems, there are no global principles for scaling systems, for example from a biological nano scale to macro scale. This is likely one of the reasons that bioinspired design is difficult. Very...

  12. Quantum from principles

    E-print Network

    Giulio Chiribella; Giacomo Mauro D'Ariano; Paolo Perinotti

    2015-06-01

    Quantum theory was discovered in an adventurous way, under the urge to solve puzzles, like the spectrum of the blackbody radiation, that haunted the physics community at the beginning of the 20th century. It soon became clear, though, that quantum theory was not just a theory of specific physical systems, but rather a new language of universal applicability. Can this language be reconstructed from first principles? Can we arrive at it from logical reasoning, instead of ad hoc guesswork? A positive answer was provided in Refs. [1, 2], where we put forward six principles that identify quantum theory uniquely in a broad class of theories. We first defined a class of "theories of information", constructed as extensions of probability theory in which events can be connected into networks. In this framework, we formulated the six principles as rules governing the control and the accessibility of information. Directly from these rules, we reconstructed a number of quantum information features, and eventually, the whole Hilbert space framework. In short, our principles characterize quantum theory as the theory of information that allows for maximal control of randomness.

  13. Principles of Plasma Diagnostics

    Microsoft Academic Search

    I. H. Hutchinson

    2002-01-01

    This book provides a systematic introduction to the physics of plasma diagnostics measurements. It develops from first principles the concepts needed to plan, execute and interpret plasma measurements, making it a suitable book for graduate students and professionals with little plasma physics background. The book will also be a valuable reference for seasoned plasma physicists, both experimental and theoretical, as

  14. Principles of Plasma Diagnostics

    Microsoft Academic Search

    I. H. Hutchinson

    2005-01-01

    This book provides a systematic introduction to the physics of plasma diagnostics measurements. It develops from first principles the concepts needed to plan, execute and interpret plasma measurements, making it a suitable book for graduate students and professionals with little plasma physics background. The book will also be a valuable reference for seasoned plasma physicists, both experimental and theoretical, as

  15. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  16. The Physical Principles of Quantum Mechanics. A critical review

    E-print Network

    F. Strocchi

    2012-01-04

    The standard presentation of the principles of quantum mechanics is critically reviewed both from the experimental/operational point of view and with respect to the requirements of mathematical consistency and logical economy. A simpler and more physically motivated formulation is discussed. The existence of non-commuting observables, which characterizes quantum mechanics with respect to classical mechanics, is related to operationally testable complementarity relations, rather than to uncertainty relations. The drawbacks of Dirac's argument for canonical quantization are avoided by a more geometrical approach.

  17. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance of the physical measurements performed with nondestructive assay instruments throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.
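
    An itemized uncertainty budget of the kind such a procedure calls for reduces, in the uncorrelated case, to a root-sum-square; the component names and values below are hypothetical:

      import math

      # Hypothetical NDA uncertainty budget (relative standard uncertainties, %).
      budget = {
          "counting statistics":   0.8,
          "calibration standard":  1.5,
          "matrix/geometry model": 2.0,
          "nuclear data":          0.5,
      }

      u_c = math.sqrt(sum(u**2 for u in budget.values()))
      for name, u in budget.items():
          print(f"{name:24s} {u:4.2f}%  ({(u / u_c)**2:6.1%} of variance)")
      print(f"combined: {u_c:.2f}%   expanded (k=2): {2 * u_c:.2f}%")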

  18. Uncertainty Modeling and Reduction in MANETs

    Microsoft Academic Search

    Feng Li; Jie Wu

    2010-01-01

    Evaluating and quantifying trust stimulates collaboration in mobile ad hoc networks (MANETs). Many existing reputation systems sharply divide the trust value into right or wrong, thus ignoring another core dimension of trust: uncertainty. As uncertainty deeply impacts a node's anticipation of others' behavior and decisions during interaction, we include uncertainty in the reputation system. Specifically, we define a new uncertainty

  19. Uncertainty analysis of transient population dynamics

    Microsoft Academic Search

    Chonggang Xu; George Z. Gertner

    2009-01-01

    Two types of demographic analyses, perturbation analysis and uncertainty analysis, can be conducted to gain insights about matrix population models and guide population management. Perturbation analysis studies how the perturbation of demographic parameters (survival, growth, and reproduction parameters) may affect the population projection, while uncertainty analysis evaluates how much uncertainty there is in population dynamic predictions and where the uncertainty

  20. Techniques and methods for uncertainty management

    Microsoft Academic Search

    J. Kwakkel; S. Cunningham

    2008-01-01

    Uncertainty is inherent in modern day strategic planning. However, there is little consensus in how to define uncertainty, what its characteristics are, and how we should relate these characteristics to the appropriate treatment or management of uncertainty. One way of identifying the different meanings of uncertainty and the techniques used for treating it, is to survey the usage of the

  1. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    Microsoft Academic Search

    F. Smith; M. Phifer

    2011-01-01

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the

  2. Data uncertainty in face recognition.

    PubMed

    Xu, Yong; Fang, Xiaozhao; Li, Xuelong; Yang, Jiang; You, Jane; Liu, Hong; Teng, Shaohua

    2014-10-01

    The image of a face varies with the illumination, pose, and facial expression, so we say that a single face image is of high uncertainty for representing the face. In this sense, a face image is just an observation and it should not be considered the absolutely accurate representation of the face. Since more face images from the same person provide more observations of the face, they may be useful for reducing the uncertainty of the representation and improving the accuracy of face recognition. However, in a real-world face recognition system, a subject usually has only a limited number of available face images and thus there is high uncertainty. In this paper, we attempt to improve the face recognition accuracy by reducing the uncertainty. First, we reduce the uncertainty of the face representation by synthesizing virtual training samples. Then, we select useful training samples that are similar to the test sample from the set of all the original and synthesized virtual training samples. Moreover, we state a theorem that determines the upper bound of the number of useful training samples. Finally, we devise a representation approach based on the selected useful training samples to perform face recognition. Experimental results on five widely used face databases demonstrate that our proposed approach can not only obtain a high face recognition accuracy, but also has a lower computational complexity than the other state-of-the-art approaches. PMID:25222733
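
    The pipeline in the abstract (synthesize virtual samples, select the training samples nearest the test image, represent the test image on the selection) can be sketched as follows; mirroring as the synthesis step and k-nearest selection are assumptions of this sketch, not necessarily the paper's choices:

      import numpy as np

      def classify(test, train_X, train_y, k=10):
          # train_X: (n, h, w) face images; train_y: class labels.
          X = [x.ravel() for x in train_X]
          y = list(train_y)
          X += [x[:, ::-1].ravel() for x in train_X]   # mirrored virtual samples
          y += list(train_y)
          X, t = np.array(X, float), test.ravel().astype(float)

          # Keep the k training samples most similar to the test sample.
          idx = np.argsort(np.linalg.norm(X - t, axis=1))[:k]
          A, labels = X[idx].T, np.array(y)[idx]

          # Least-squares representation of the test sample on the selection.
          c, *_ = np.linalg.lstsq(A, t, rcond=None)

          # Assign the class whose samples reconstruct the test sample best.
          residual = {cls: np.linalg.norm(t - A[:, labels == cls] @ c[labels == cls])
                      for cls in np.unique(labels)}
          return min(residual, key=residual.get)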

  3. Heisenberg Uncertainty and the Allowable Masses of the Up Quark and Down Quark

    Microsoft Academic Search

    Brian Orr

    2004-01-01

    A possible explanation for the inability to attain deterministic measurements of an elementary particle's energy, as given by the Heisenberg Uncertainty Principle, manifests itself in an interesting anthropic consequence of Andrei Linde's Self-reproducing Inflationary Multiverse model. In Linde's model, the physical laws and constants that govern our universe adopt other values in other universes, due to variable Higgs fields. While

  4. Does quantum uncertainty have a place in everyday applied statistics?1 Andrew Gelman and Michael Betancourt

    E-print Network

    Gelman, Andrew

    Heisenberg's uncertainty principle seems naturally relevant in the social and behavioral sciences, where … We propose that the best way to use ideas of quantum … quantum probability models for statistics in the social and behavioral sciences is not by directly using … (commentary on Busemeyer, for Behavioral and Brain Sciences).

  5. Uncertainty of height information in coherence scanning interferometry

    NASA Astrophysics Data System (ADS)

    Seewig, J.; Böttner, T.; Broschart, D.

    2011-05-01

    Coherence scanning interferometry (CSI) with a broadband light source (known for short as white light interferometry) is, besides the confocal technique, one of the most popular optical principles used to measure surface topography. Compared to coherent interferometry, the broadband light source leads, theoretically, to unambiguous phase information. The paper describes the properties of the correlogram in the spatial and in the frequency domain. All deviations from the ideal correlogram are expressed by an additional phase term. The uncertainty of height information is discussed for both the frequency domain analysis (FDA) proposed by de Groot and the Hilbert transform. For the frequency domain analysis, the uncertainty is quantified by the Cramér-Rao bound. The second part of the paper deals with the phase evaluation of the correlogram, which is necessary to achieve a high vertical resolution. Because the envelope function is often distorted, phase jumps lead to ambiguous height information. In particular, this effect can be observed when measuring rough surfaces.

  6. Who plays dice? Subjective uncertainty in deterministic quantum world

    NASA Astrophysics Data System (ADS)

    Carter, Brandon

    2006-11-01

    Einstein's 1905 recognition that light consists of discrete ``quanta'' inaugurated the duality (wave versus particle) paradox that was resolved 20 years later by Born's introduction of the probability interpretation on which modern quantum theory is based. Einstein's refusal to abandon the classical notion of deterministic evolution - despite the unqualified success of the new paradigm on a local scale - foreshadowed the restoration of determinism in the attempt by Everett to develop a global treatment applicable to cosmology; Everett failed, however, to provide a logically coherent treatment of subjective uncertainty at a local level. This drawback has recently been overcome in an extended formulation allowing deterministic description of a physical universe in which the uncertainty concerns only our own particular local situations, whose probability is prescribed by an appropriate micro-anthropic principle.

  7. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  8. Estimating uncertainty in resolution tests

    NASA Astrophysics Data System (ADS)

    Goncalves, Duarte P.; Griffith, Derek J.

    2006-05-01

    Resolution testing of imaging optical equipment is still commonly performed using the USAF 1951 target. The limiting resolution is normally calculated from the group and element that can just be resolved by an observer. Although resolution testing has limitations, its appeal lies in the fact that it is a quick test with low complexity. Resolution uncertainty can serve as a diagnostic tool, aid in understanding observer variability, and assist in planning experiments. It may also be necessary to satisfy a customer requirement or international standard. This paper derives theoretical results for estimating resolution and calculating its uncertainty, based on observer measurements, while taking the target spatial-frequency quantization into account. We show that estimating the resolution by simply averaging the target spatial frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the results derived can be incorporated into a larger uncertainty analysis.
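
    The USAF 1951 target quantizes spatial frequency as f = 2**(group + (element - 1)/6) line pairs per mm, so observer readings live on a logarithmic grid; the sketch below shows the quantized frequencies and why a plain arithmetic average of them is a questionable estimator (log-domain averaging is one plausible alternative, not necessarily the paper's improved estimator):

      import numpy as np

      def usaf_frequency(group, element):
          # Spatial frequency (lp/mm) of a USAF 1951 group/element pair.
          return 2.0 ** (group + (element - 1) / 6.0)

      # (group, element) just resolved, one reading per observer (hypothetical).
      readings = [(2, 3), (2, 4), (2, 3), (2, 5), (2, 4)]
      freqs = np.array([usaf_frequency(g, e) for g, e in readings])

      naive_mean = freqs.mean()                 # averaging on a log-spaced grid
      log_mean = np.exp(np.log(freqs).mean())   # geometric (log-domain) mean
      print(naive_mean, log_mean)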

  9. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  10. Uncertainty Relations for Angular Momentum

    E-print Network

    Lars Dammeier; Rene Schwonnek; Reinhard F. Werner

    2015-04-30

    In this work we study various notions of uncertainty for angular momentum in the spin-s representation of SU(2). We characterize the "uncertainty regions" given by all vectors, whose components are the variances of the three angular momentum components. A basic feature of this set is a lower bound for the sum of the three variances. We give a method for obtaining optimal lower bounds for uncertainty regions for general operator triples, and evaluate these for small s. Further lower bounds are derived by generalizing the technique by which Robertson obtained his state-dependent lower bound. These are optimal for large s, since they are saturated by states taken from the Holstein-Primakoff approximation. We show that, for all s, all variances are consistent with the so-called vector model, i.e., they can also be realized by a classical probability measure on a sphere of radius sqrt(s(s+1)). Entropic uncertainty relations can be discussed similarly, but are minimized by quite different states from the variance minimizing ones for small s. For large s the Maassen-Uffink bound becomes sharp, again being saturated by Holstein-Primakoff states. Measurement uncertainty, as recently discussed by Busch, Lahti and Werner for position and momentum, is introduced and a generalized observable (POVM) which minimizes the worst case measurement uncertainty of all angular momentum components is explicitly determined. Its outputs are angular momentum vectors whose absolute value r(s) depends only on s. The function r is determined explicitly, and r(s)/s approaches 1 from below.
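
    The lower bound on the sum of the three variances mentioned above follows in one line from the Casimir identity (a standard argument, reproduced here for reference):

      \sum_{k=x,y,z} (\Delta J_k)^2
        = \langle \mathbf{J}^2 \rangle - |\langle \mathbf{J} \rangle|^2
        = s(s+1) - |\langle \mathbf{J} \rangle|^2
        \;\ge\; s(s+1) - s^2 = s,

    since |⟨J⟩| ≤ s; spin-coherent states saturate the bound.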

  11. Principles of Semiconductor Devices

    NSDL National Science Digital Library

    Van Zeghbroeck, Bart Jozef

    Home page of an online and interactive textbook, Principles of Semiconductor Devices, written by Bart J. Van Zeghbroeck, Ph.D., Professor in the Department of Electrical and Computer Engineering at the University of Colorado at Boulder. The goal of this text is to provide the basic principles of common semiconductor devices, with a special focus on Metal-Oxide-Semiconductor Field-Effect-Transistors (MOSFETs). A browser environment was chosen so that text, figures and equations can be linked for easy reference. A table of contents, a glossary, active figures and some study aids are integrated with the text with the intention to provide a more effective reference and learning environment. Chapter titles include: Semiconductor Fundamentals, Metal-Semiconductor Junctions, p-n Junctions, Bipolar Transistors, MOS Capacitors, and MOSFET.

  12. Principles of Chemical Science

    NSDL National Science Digital Library

    Drennan, Catherine

    2008-01-01

    The basic principles behind chemical science are the bedrock of a number of scientific endeavors, and this remarkable course from MIT's OpenCourseWare initiative is quite a find. Professor Catherine Drennan and Dr. Elizabeth Vogel Taylor created the materials for this course, and the site includes video lectures, lecture notes, and exams. Visitors will note that these materials can be found on the left-hand side of the page, and they can also be downloaded en masse via the "Download Course Materials" link. The topics covered here include the basic principles of atomic and molecular electronic structure, thermodynamics, acid-base and redox equilibria, and chemical kinetics. Also, visitors are encouraged to offer their own feedback on the course, or even provide a donation to help out with this initiative.

  13. The Equivalence Principle Revisited

    E-print Network

    R. Aldrovandi; P. B. Barros; J. G. Pereira

    2002-12-07

    A precise formulation of the strong Equivalence Principle is essential to the understanding of the relationship between gravitation and quantum mechanics. The relevant aspects are reviewed in a context including General Relativity, but allowing for the presence of torsion. For the sake of brevity, a concise statement is proposed for the Principle: "An ideal observer immersed in a gravitational field can choose a reference frame in which gravitation goes unnoticed". This statement is given a clear mathematical meaning through an accurate discussion of its terms. It holds for ideal observers (time-like smooth non-intersecting curves), but not for real, spatially extended observers. Analogous results hold for gauge fields. The difference between gravitation and the other fundamental interactions comes from their distinct roles in the equation of force.

  14. The Principles of Flight

    NSDL National Science Digital Library

    The Principles of Flight Web site is offered by the Pilot's Web Aviation Journal and contains an excellent introduction to the physics of flight. Topics include Newton's laws of motion and force, airfoils, lift and drag, forces acting on an airplane, speed, flight maneuvers, the effects of roll, and more. Each topic contains good illustrations, descriptions, and equations. Overall, the site is an interesting and informative look behind the science of flight.

  15. Principles of lake sedimentology

    SciTech Connect

    Janasson, L.

    1983-01-01

    This book presents a comprehensive outline of the basic sedimentological principles for lakes, and focuses on environmental aspects and matters related to lake management and control: on lake ecology rather than lake geology. This is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents abridged: Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.

  16. Principles of Optics

    Microsoft Academic Search

    Max Born; Emil Wolf

    1999-01-01

    Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous

  17. A biomechanical inactivation principle

    Microsoft Academic Search

    Jean-Paul Gauthier; Bastien Berret; Frédéric Jean

    2010-01-01

    This paper develops the mathematical side of a theory of inactivations in human biomechanics. This theory has been validated by practical experiments, including zero-gravity experiments. The theory mostly relies on Pontryagin's maximum principle on the one side and on transversality theory on the other side. It turns out that the periods of silence in the activation of muscles that are

  18. Probing Mach's principle

    NASA Astrophysics Data System (ADS)

    Annila, Arto

    2012-06-01

    The principle of least action in its original form à la Maupertuis is used to explain geodetic and frame-dragging precessions, which are customarily accounted for by a curved space-time in general relativity. The least-time equations of motion agree with observations and are also in concert with general relativity. Yet according to the least-time principle, gravitation does not relate to the mathematical metric of space-time, but to a tangible energy density embodied by photons. The density of free space is in balance with the total mass of the Universe in accord with the Planck law. Likewise, a local photon density and its phase distribution are in balance with the mass and charge distribution of a local body. Here gravitational force is understood as an energy density difference that will diminish when the oppositely polarized pairs of photons co-propagate from the energy-dense system of bodies to the energy-sparse system of the surrounding free space. Thus when the body changes its state of motion, the surrounding energy density must accommodate the change. The concurrent resistance in restructuring the surroundings, ultimately involving the entire Universe, is known as inertia. The all-around propagating energy density couples everything with everything else in accord with Mach's principle.

  19. 4, 507-532, 2004 Emission uncertainty

    E-print Network

    Boyer, Edmond

    Emission uncertainty and ozone modelling, J. Ma and J. A. van Aardenne (ACPD 4, 507-532, 2004). … has been analyzed using a regional chemical transport model. Three independent emissions inventories were used: (i) … for the year 1995, (ii) a regional emission inventory used in the Transport and Chemical Evolution over

  20. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M [Los Alamos National Laboratory]; Anderson, Mark C [Los Alamos National Laboratory]; Habib, Salman [Los Alamos National Laboratory]; Klein, Richard [Los Alamos National Laboratory]; Berliner, Mark [OHIO STATE UNIV.]; Covey, Curt [LLNL]; Ghattas, Omar [UNIV OF TEXAS]; Graziani, Carlo [UNIV OF CHICAGO]; Seager, Mark [LLNL]; Sefcik, Joseph [LLNL]; Stark, Philip [UC/BERKELEY]; Stewart, James [SNL]

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  1. Evaluating Uncertainty in Engineering Calculations

    Microsoft Academic Search

    J. E. WALSTROM; T. D. MUELLER; R. C. McFARLANE

    1967-01-01

    It is customary, in evaluating uncertainty, to perform repeated experiments and to draw conclusions from the distribution of the results of these experiments. With the advent of the high-speed electronic computer, it is possible to construct mathematical models which simulate complex experiments or operations and to perform the experiments repeatedly utilizing the models. Statistical methods are then applied to the
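
    The computer experiment described here is what is now called Monte Carlo uncertainty propagation; a minimal sketch with a hypothetical engineering model (an orifice flow equation with uncertain inputs) follows:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical model: q = C * A * sqrt(2 * dp / rho), with uncertain
      # discharge coefficient, area, pressure drop and fluid density.
      C   = rng.normal(0.62, 0.02, n)
      A   = rng.normal(1.0e-3, 1.0e-5, n)   # m^2
      dp  = rng.normal(5.0e4, 2.0e3, n)     # Pa
      rho = rng.normal(1000.0, 5.0, n)      # kg/m^3

      q = C * A * np.sqrt(2.0 * dp / rho)
      print(q.mean(), q.std(ddof=1))          # propagated mean and uncertainty
      print(np.percentile(q, [2.5, 97.5]))    # 95 % interval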

  2. Spectral Methods for Uncertainty Quantification

    E-print Network

    Spectral Methods for Uncertainty Quantification, Emil Brandt Kærgaard, Kongens Lyngby 2013, IMM-M.Sc.-2013-0503. M.Sc. thesis, Technical University of Denmark, Department of Applied Mathematics and Computer Science, Building 303B, DK-2800 Kongens Lyngby, Denmark, www.compute.dtu.dk.

  3. UNCERTAINTY ANALYSIS FOR WATERSHED MANAGEMENT

    Microsoft Academic Search

    Larry Olmsted

    ABSTRACT The Watershed Analysis Risk Management Framework is being applied to the Catawba River Basin of North and South Carolina. To reach consensus on a watershed management plan, stakeholders need to know the cost, effectiveness and chance of failure of various management alternatives. This information, supplied by hydrologic and water quality models and cost analyses, contains uncertainty. The sources of

  4. Uncertainty modelling for threat analysis

    Microsoft Academic Search

    Mikael Lundin

    Accurate modelling of information and knowledge is central to the modern command and control (C2) process. Without models and a language for describing them, it is impossible to collaborate on C2. All information which enters a C2 system will be uncertain, and hence it is important to be able to model the uncertainty in a way that makes it possible

  5. An Uncertainty Framework for Classification

    Microsoft Academic Search

    Loo-nin Teow; Kia-fock Loe

    2000-01-01

    We define a generalized likelihood function based on uncertainty measures and show that maximizing such a likelihood function for different measures induces different types of classifiers. In the probabilistic framework, we obtain classifiers that optimize the cross-entropy function. In the possibilistic framework, we obtain classifiers that maximize the interclass margin. Furthermore, we show that the support vector machine is a

  6. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta [ORNL]; King, Anthony Wayne [ORNL]

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  7. Encoding uncertainty in the hippocampus

    Microsoft Academic Search

    L. M. Harrison; Andrew Duggins; Karl J. Friston

    2006-01-01

    The medial temporal lobe may play a critical role in binding successive events into memory while encoding contextual information in implicit and explicit memory tasks. Information theory provides a quantitative basis to model contextual information engendered by conditional dependence between, or conditional uncertainty about, consecutive events in a sequence. We show that information theoretic indices characterizing contextual dependence within a

  8. ρ-uncertainty Anonymization by Partial Suppression

    E-print Network

    Zhu, Kenny Q.

    ρ-uncertainty Anonymization by Partial Suppression, Xiao Jia, Chao Pan, Xinhui Xu, Kenny Q. Zhu. … Results: (a) Original Dataset:

      TID  Transaction
      1    bread, milk, condom
      2    bread, milk
      3    milk, condom
      4    flour, fruits
      5    flour, condom
      6    bread, fruits
      7    fruits, condom

    (b) Global Suppression: TID Transaction 1 bread

  9. Quantification of entanglement via uncertainties

    SciTech Connect

    Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S. [Faculty of Science, Bilkent University, Bilkent, Ankara 06800, Turkey]

    2007-03-15

    We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.

  10. Squeezed states and uncertainty relations

    Microsoft Academic Search

    Octavio Castaños; Rocio Jauregue-Renaud; Young S Kim; Margarita A Manko; Hector Moya-Cessa

    2004-01-01

    This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9–13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México.

  11. Intelligent Interface Learning with Uncertainty

    Microsoft Academic Search

    Robert A. Harrington; Scott M. Brown

    1997-01-01

    This paper presents an intelligent user interface agent architecture based on Bayesian networks. Using a Bayesian network knowledge representation not only dynamically captures and models user behavior, but also dynamically captures and models uncertainty in the interface's reasoning process. Bayesian networks' sound semantics and mathematical basis enhance its ability to make correct, intelligent inferences as to

  12. Principles of climate service development

    NASA Astrophysics Data System (ADS)

    Buontempo, Carlo; Liggins, Felicity; Newton, Paula

    2015-04-01

    In November 2014, a group of 30 international experts in climate service development gathered in Honiton, UK, to discuss and identify the key principles that should be considered when developing new climate services by all the actors involved. Through an interactive and dynamic workshop the attendees identified seven principles. This contribution summarises these principles.

  13. Attention, Uncertainty, and Free-Energy

    PubMed Central

    Feldman, Harriet; Friston, Karl J.

    2010-01-01

    We suggested recently that attention can be understood as inferring the level of uncertainty or precision during hierarchical perception. In this paper, we try to substantiate this claim using neuronal simulations of directed spatial attention and biased competition. These simulations assume that neuronal activity encodes a probabilistic representation of the world that optimizes free-energy in a Bayesian fashion. Because free-energy bounds surprise or the (negative) log-evidence for internal models of the world, this optimization can be regarded as evidence accumulation or (generalized) predictive coding. Crucially, both predictions about the state of the world generating sensory data and the precision of those data have to be optimized. Here, we show that if the precision depends on the states, one can explain many aspects of attention. We illustrate this in the context of the Posner paradigm, using the simulations to generate both psychophysical and electrophysiological responses. These simulated responses are consistent with attentional bias or gating, competition for attentional resources, attentional capture and associated speed-accuracy trade-offs. Furthermore, if we present both attended and non-attended stimuli simultaneously, biased competition for neuronal representation emerges as a principled and straightforward property of Bayes-optimal perception. PMID:21160551

  14. Structural Damage Assessment under Uncertainty

    NASA Astrophysics Data System (ADS)

    Lopez Martinez, Israel

    Structural damage assessment has applications in the majority of engineering structures and mechanical systems ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring system are to ascertain the condition of a structure and to provide an evaluation of changes as a function of time, as well as providing an early warning of an unsafe condition. There are many structural health monitoring and assessment techniques developed for research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and the development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of a vibration-based monitoring system is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed to be used for continuous monitoring of structures. The robust damage detection approach developed is based on an ensemble of dimensional reduction algorithms for improved damage-sensitive feature extraction. For damage localization, the determination of an experimental structural model was performed based on output-only modal analysis. An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are isolated based on the integration of sensitivity analysis and statistical sampling, which minimizes the occurrence of false damage indication due to uncertainty. To perform diagnostic decision-making under uncertainty, an evidential reasoning approach for damage assessment is developed for addressing the possible imprecision in the damage localization results. The newly developed damage detection and localization techniques are applied and validated through both vibration test data from the literature and in-house laboratory experiments.

  15. The Precautionary Principle in EU and US Chemicals Policy: A Comparison of Industrial Chemicals Legislation

    Microsoft Academic Search

    Mikael Karlsson

    In this chapter, the precautionary principle will be considered as the starting point for decision-making on chemicals in cases of scientific uncertainty. The principle will serve as the reference point for an analysis and a comparison of chemicals policies and, in particular, of legislation for industrial chemicals in the European Union and the United States of America. In the second

  16. Genetics and psychiatry: a proposal for the application of the precautionary principle.

    PubMed

    Porteri, Corinna

    2013-08-01

    The paper suggests an application of the precautionary principle to the use of genetics in psychiatry focusing on scientific uncertainty. Different levels of uncertainty are taken into consideration--from the acknowledgement that the genetic paradigm is only one of the possible ways to explain psychiatric disorders, via the difficulties related to the diagnostic path and genetic methods, to the value of the results of studies carried out in this field. Considering those uncertainties, some measures for the use of genetics in psychiatry are suggested. Some of those measures are related to the conceptual limits of the genetic paradigm; others are related to present knowledge and should be re-evaluated. PMID:22460929

  17. Improvement of Uncertainty Relations for Mixed States

    E-print Network

    Yong Moon Park

    2004-09-03

    We study a possible improvement of uncertainty relations. The Heisenberg uncertainty relation employs the commutator of a pair of conjugate observables to set the limit of quantum measurement of the observables. The Schrödinger uncertainty relation improves the Heisenberg uncertainty relation by adding the correlation in terms of the anti-commutator. However, both relations are insensitive to whether the state used is pure or mixed. We improve the uncertainty relations by introducing additional terms which measure the mixedness of the state. For the momentum and position operators as conjugate observables, and for the thermal state of the quantum harmonic oscillator, it turns out that the equalities in the improved uncertainty relations hold.
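
    For reference, the two relations the abstract starts from can be stated compactly (standard forms, with $\Delta A$ the standard deviation of observable $A$ in the state):

      \[
      \Delta A \, \Delta B \;\ge\; \tfrac{1}{2}\bigl|\langle [A,B] \rangle\bigr|
      \qquad \text{(Heisenberg-Robertson)},
      \]
      \[
      (\Delta A)^2 (\Delta B)^2 \;\ge\;
      \Bigl|\tfrac{1}{2i}\langle [A,B] \rangle\Bigr|^2
      + \Bigl|\tfrac{1}{2}\bigl\langle \{A - \langle A\rangle,\, B - \langle B\rangle\} \bigr\rangle\Bigr|^2
      \qquad \text{(Schr\"odinger)}.
      \]

    Both right-hand sides depend on the state only through its first and second moments, which is why neither bound can distinguish a pure state from a mixed one.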

  18. Use of Combined Uncertainty of Pesticide Residue Results for Testing Compliance with Maximum Residue Limits (MRLs).

    PubMed

    Farkas, Zsuzsa; Slate, Andrew; Whitaker, Thomas B; Suszter, Gabriella; Ambrus, Árpád

    2015-05-13

    The uncertainty of pesticide residue levels in crops due to sampling, estimated for 106 individual crops and 24 crop groups from residue data obtained from supervised trials, was adjusted with a factor of 1.3 to accommodate the larger variability of residues under normal field conditions. Further adjustment may be necessary in the case of mixed lots. The combined uncertainty of residue data, including the contribution of sampling, is used for calculation of an action limit, which should not be exceeded when compliance with maximum residue limits is certified as part of premarketing self-control programs. By contrast, for testing compliance of marketed commodities, the residues measured in composite samples should be greater than or equal to the decision limit, calculated only from the combined uncertainty of the laboratory phase of the residue determination. Options for minimizing the combined uncertainty of measured residues are discussed. The principles described are also applicable to other chemical contaminants. PMID:25658668
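
    A minimal sketch of how such limits can be computed (hedged: plausible additive forms with a coverage factor k; the paper's exact expressions may differ, and all numbers are made up):

      # Hedged sketch: mrl, u_combined and u_lab share the same concentration
      # units (e.g. mg/kg); k is a coverage factor, assumed here to be 2.
      def action_limit(mrl, u_combined, k=2.0):
          # premarketing self-control: certify compliance only if the measured
          # residue stays at or below this value
          return mrl - k * u_combined

      def decision_limit(mrl, u_lab, k=2.0):
          # market enforcement: declare non-compliance only when the measured
          # residue reaches the MRL plus the expanded laboratory uncertainty
          return mrl + k * u_lab

      print(action_limit(0.5, 0.08), decision_limit(0.5, 0.05))  # ~0.34 and 0.6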

  19. Principles of smile design

    PubMed Central

    Bhuvaneswaran, Mohan

    2010-01-01

    An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result does not depend on looks alone. Our ultimate goal as clinicians is to achieve a pleasing composition in the smile by creating an arrangement of the various esthetic elements. This article reviews the various principles that govern the art of smile design. The literature search was done using PubMed and Medline. This article provides the reader with basic knowledge for developing a functional, stable smile. PMID:21217950

  20. Remote Sensing Principles

    NSDL National Science Digital Library

    This introduction to Earth observation includes definitions of several terms, examples taken from real situations, and questions, answers, and exercises. A simple example of traditional chorological mapping methods is used to show some fundamental principles of satellite images. Histogram, pixel and classification are introduced. There are discussions of remote sensing, the history of Earth observation, and geostationary and solar-synchronous orbits. In addition, the basic physical concepts underlying remote sensing are explained with the help of some relatively simple viewgraphs. This site is also available in German, French, Italian and Spanish.

  1. Nonequilibrium quantum Landauer principle.

    PubMed

    Goold, John; Paternostro, Mauro; Modi, Kavan

    2015-02-13

    Using the operational framework of completely positive, trace preserving operations and thermodynamic fluctuation relations, we derive a lower bound for the heat exchange in a Landauer erasure process on a quantum system. Our bound comes from a nonphenomenological derivation of the Landauer principle which holds for generic nonequilibrium dynamics. Furthermore, the bound depends on the nonunitality of dynamics, giving it a physical significance that differs from other derivations. We apply our framework to the model of a spin-1/2 system coupled to an interacting spin chain at finite temperature. PMID:25723198
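
    For orientation, the equilibrium statement that this work generalizes is the standard Landauer bound (stated here as background, with $\beta = 1/k_{B}T$ and $S$ the von Neumann entropy):

      \[
      \beta \langle Q \rangle \;\ge\; \Delta S \;=\; S(\rho_{\text{initial}}) - S(\rho_{\text{final}}),
      \]

    so that erasing one bit ($\Delta S = \ln 2$) dissipates at least $k_{B} T \ln 2$ of heat.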

  2. Principles of Safety Pharmacology

    PubMed Central

    Pugsley, M K; Authier, S; Curtis, M J

    2008-01-01

    Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacokinetic/pharmacodynamic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into a safety margin calculation and, finally, clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice and the burden of proof, and to highlight areas of intensive activity (such as testing for drug-induced rare event liability, and the challenge of testing the safety of so-called biologics (antibodies, gene therapy and so on)). PMID:18604233

  3. Great Lakes Literacy Principles

    NASA Astrophysics Data System (ADS)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  4. Principle of relative locality

    SciTech Connect

    Amelino-Camelia, Giovanni [Dipartimento di Fisica, Universita 'La Sapienza', and Sez. Roma1 INFN, P. le A. Moro 2, 00185 Roma (Italy); Freidel, Laurent; Smolin, Lee [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, Ontario N2J 2Y5 (Canada); Kowalski-Glikman, Jerzy [Institute for Theoretical Physics, University of Wroclaw, Pl. Maxa Borna 9, 50-204 Wroclaw (Poland)

    2011-10-15

    We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

  5. Trading indicators with information-gap uncertainty

    E-print Network

    Guttman, Tony

    Trading indicators with information-gap uncertainty. Colin J. Thompson, ARC Centre of Excellence. Practical implications: an additional technical trading tool for applying information-gap theory. Keywords: trading indicators, information gaps, uncertainty, robustness, financial modelling. Paper type: research paper.

  6. RESEARCH ARTICLE Application of uncertainty visualization methods

    E-print Network

    Laidlaw, David

    Application of uncertainty visualization methods to meteorological trajectories. We apply methods of uncertainty visualization to air parcel trajectories generated from a global meteorological model, and derive visualizations of these trajectories. Our work enables efficient visual pruning of unlikely results, especially

  7. Optimization under uncertainty in radiation therapy

    E-print Network

    Chan, Timothy Ching-Yee

    2007-01-01

    In the context of patient care for life-threatening illnesses, the presence of uncertainty may compromise the quality of a treatment. In this thesis, we investigate robust approaches to managing uncertainty in radiation ...

  8. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state of the art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion of technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation from the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the ongoing organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends and funding opportunities and to discuss the future of structural dynamics.

  9. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
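
    A toy Monte Carlo along these lines (all numbers hypothetical, chosen only for the example; the published studies use expert-elicited probabilities and magnitudes instead) shows how a residual human-error risk inflates a standard uncertainty:

      import numpy as np

      # A measurement has baseline noise u0, and with small residual probability
      # p_err an undetected human error adds a bias from a wider distribution.
      rng = np.random.default_rng(0)
      n, u0 = 100_000, 0.02            # e.g. pH units
      p_err, u_err = 0.01, 0.15        # residual risk and magnitude of a blunder

      base = rng.normal(0.0, u0, n)
      blunder = rng.normal(0.0, u_err, n) * (rng.random(n) < p_err)
      total = base + blunder

      # the inflation of the standard deviation is the human-error contribution
      print(np.std(base), np.std(total))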

  10. Uncertainty in Mineral Prospectivity Prediction

    Microsoft Academic Search

    Pawalai Kraipeerapun; Chun Che Fung; Warick Brown; Kok Wai Wong; Tamás D. Gedeon

    2006-01-01

    This paper presents an approach to the prediction of mineral prospectivity that provides an assessment of uncertainty. Two feed-forward backpropagation neural networks are used for the prediction. One network is used to predict degrees of favourability for deposit and another one is used to predict degrees of likelihood for barren, which is opposite to deposit. These two types of values
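
    One plausible reading of the truncated abstract (the combination rule below is an assumption, not necessarily the authors'):

      # A cell is described by a truth membership (favourability for deposit)
      # from one network and a falsity membership (likelihood of barren) from
      # the other; the shortfall from certainty is reported as uncertainty.
      def prospectivity(truth, falsity):
          uncertainty = abs(1.0 - truth - falsity)
          return truth, falsity, uncertainty

      print(prospectivity(0.7, 0.2))   # (0.7, 0.2, 0.1): fairly confident cell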

  11. Chance, uncertainty, and paraphysical beliefs 

    E-print Network

    Perlitz, Pamela Minnie

    1981-01-01

    , and excluding major religions (Christianity, Judaism, Islam, etc.), will refer to beliefs considered to be nonempirical, nonscientific, or highly controversial. For present purposes, therefore, the general term paraphysical beliefs will be used to include... and occult deities are inter-related are a number of theories suggesting that conversion to a fundamentalist supernatural perspective serves to resolve both uncertainty and pre-existing personal conflicts within members (Eister, 1972; Glock and Stark...

  12. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

  13. Uncertainty compliant design flood estimation

    NASA Astrophysics Data System (ADS)

    Botto, A.; Ganora, D.; Laio, F.; Claps, P.

    2014-05-01

    Hydraulic infrastructures are commonly designed with reference to target values of flood peak, estimated using probabilistic techniques such as flood frequency analysis. The application of these techniques involves levels of uncertainty which are sometimes quantified but normally not accounted for explicitly in the decision regarding design discharges. The present approach aims at defining a procedure which yields Uncertainty Compliant Design (UNCODE) values of flood peaks. To pursue this goal, we first demonstrate the equivalence of the standard design based on the return period and the cost-benefit procedure when linear cost and damage functions are used. We then use this result to assign an expected cost to estimation errors, thus setting a framework to obtain a design flood estimator which minimizes the total expected cost. This procedure properly accounts for the uncertainty which is inherent in the frequency curve estimation. Application of the UNCODE procedure to real cases leads to remarkable displacement of the design flood from the standard values. UNCODE estimates are systematically larger than the standard ones, with substantial differences (up to 55%) when large return periods or short data samples are considered.
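
    Schematically (notation assumed here rather than taken from the paper), the UNCODE design value replaces a fixed-return-period quantile with the minimizer of the total expected cost under frequency-curve uncertainty:

      \[
      q^{*} \;=\; \arg\min_{q} \; \mathbb{E}_{\hat\theta}\Bigl[\, C(q) + D\bigl(q, Q(\hat\theta)\bigr) \Bigr],
      \]

    where $C$ is the (linear) cost of building to design discharge $q$, $D$ the (linear) damage when the flood $Q$ exceeds it, and the expectation runs over the sampling distribution of the estimated frequency-curve parameters $\hat\theta$.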

  14. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind [ORNL; Jha, Sumit Kumar [University of Central Florida

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
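
    A toy version of the statistical model checking idea (the SIR model, the parameter range, and the specification below are hypothetical choices for the example, not the paper's case studies):

      import numpy as np

      # Estimate the probability that an SIR epidemic peak stays below 30% of
      # the population when the contact rate beta is uncertain.
      rng = np.random.default_rng(42)

      def sir_peak(beta, gamma=0.1, i0=1e-3, dt=0.1, t_max=300.0):
          s, i = 1.0 - i0, i0
          peak = i
          for _ in range(int(t_max / dt)):
              ds = -beta * s * i
              di = beta * s * i - gamma * i
              s += dt * ds
              i += dt * di
              peak = max(peak, i)
          return peak

      trials = 2000
      betas = rng.uniform(0.15, 0.35, trials)      # uncertain contact rate
      holds = sum(sir_peak(b) < 0.30 for b in betas)
      print(holds / trials)                        # estimate of P(spec holds)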

  15. The Stock Market: Risk vs. Uncertainty.

    ERIC Educational Resources Information Center

    Griffitts, Dawn

    2002-01-01

    This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…

  16. Efficient uncertainty quantification in computational fluid dynamics

    Microsoft Academic Search

    G. J. A. Loeven

    2010-01-01

    When modeling physical systems, several sources of uncertainty are present. For example, variability in boundary conditions such as free-stream velocity or ambient pressure is always present. Furthermore, uncertainties in geometry arise from production tolerances, wear or unknown deformations under loading. Uncertainties in computational fluid dynamics (CFD) simulations can have a significant impact on the computed aerodynamic performance. Since CFD simulations

  17. Industry information uncertainty and stock return comovement

    Microsoft Academic Search

    Ting Luo; Wenjuan Xie

    2012-01-01

    This study investigates the association between industry information uncertainty and stock return comovement within industries. We test two predictions on industry comovement given investors' correlated overweighting of past industry returns when there is greater industry-level uncertainty: (1) we find that stocks in high-uncertainty industries are more likely to move with other stocks in the same industry; (2) we find

  18. Uncertainty and information in classical mechanics formulation. Common ground for thermodynamics and quantum mechanics

    E-print Network

    Adrian Faigon

    2007-11-01

    Mechanics can be founded on a principle relating the uncertainty δq in the trajectory of an observable particle to its motion relative to the observer. From this principle, p·δq = const., with p the q-conjugated momentum, mechanical laws are derived, and the meanings of the Lagrangian and Hamiltonian functions are discussed. The connection between the presented principle and Hamilton's least-action principle is examined. Wave mechanics and the Schrödinger equation appear without additional assumptions by choosing the representation for δq in the case where the motion is not describable as a trajectory. The Cramér-Rao inequality serves that purpose. For a particle hidden from direct observation, the position uncertainty determined by the enclosing boundaries leads to thermodynamics in a straightforward extension of the presented formalism. The introduction of uncertainty into the formulation of classical mechanics enables the translation of mechanical laws into the wide-ranging conceptual framework of information theory. The boundaries between classical mechanics, thermodynamics and quantum mechanics are defined in terms of the informational changes associated with the system evolution. As a direct application of the proposed formulation, upper bounds for the rate of information transfer are derived.
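
    Restated in the abstract's own notation (no more than a transcription of the claims above), the founding relation and the estimation-theoretic bound used when motion is not trajectory-describable are

      \[
      p \, \delta q = \text{const.}, \qquad
      \operatorname{Var}(\hat q) \;\ge\; \frac{1}{I},
      \]

    the second being the Cramér-Rao inequality, with $I$ the Fisher information of the distribution from which the position estimate $\hat q$ is drawn.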

  19. Kepler and Mach's Principle

    NASA Astrophysics Data System (ADS)

    Barbour, Julian

    The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.

  20. Polydimensional Supersymmetric Principles

    E-print Network

    William M. Pezzaglia

    1999-09-22

    Systems of equations are invariant under "polydimensional transformations" which reshuffle the geometry such that what is a line or a plane is dependent upon the frame of reference. This leads us to propose an extension of Clifford calculus in which each geometric element (vector, bivector) has its own coordinate. A new classical action principle is proposed in which particles take paths which minimize the distance traveled plus area swept out by the spin. This leads to a solution of the 50 year old conundrum of `what is the correct Lagrangian' in which to derive the Papapetrou equations of motion for spinning particles in curved space (including torsion). Based on talk given at: 5th International Conference on Clifford Algebras and their Applications in Mathematical Physics, Ixtapa-Zihuatanejo, Mexico, June 27-July 4, 1999.

  1. Dynamical principles in neuroscience

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  2. Fault Management Guiding Principles

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

    Regardless of the mission type, deep space or low Earth orbit, robotic or human spaceflight, Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the immaturity of FM as an engineering discipline, which lags behind other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and SE organizational structure, through the relationship between FM designs and mission risk, to the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  3. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well-known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
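
    A one-dimensional toy version of the Polynomial Chaos surrogate idea (the model function and sample size are hypothetical stand-ins; the CLM application is high-dimensional and uses the Bayesian compressive-sensing machinery described above):

      import numpy as np
      from math import factorial

      # Fit probabilists' Hermite coefficients by regression on sparse samples
      # of a standard-normal input, then read off moments from orthogonality.
      rng = np.random.default_rng(3)
      xi = rng.standard_normal(40)                 # sparse input samples
      y = np.exp(0.3 * xi)                         # hypothetical model output
      deg = 5
      V = np.polynomial.hermite_e.hermevander(xi, deg)
      c, *_ = np.linalg.lstsq(V, y, rcond=None)

      mean = c[0]                                  # E[He_k] = 0 for k >= 1
      var = sum(c[k]**2 * factorial(k) for k in range(1, deg + 1))
      print(mean, var)   # compare exact exp(0.045) and exp(0.09)*(exp(0.09)-1)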

  4. Defining Uncertainty: A Conceptual Basis for Uncertainty Management in Model-Based Decision Support

    Microsoft Academic Search

    W. E. Walker; P. Harremoës; J. Rotmans; J. P. Van der Sluijs; M. B. A. Van Asselt; P. Janssen; M. P. Krayer von Krauss

    2003-01-01

    The aim of this paper is to provide a conceptual basis for the systematic treatment of uncertainty in model-based decision support activities such as policy analysis, integrated assessment and risk assessment. It focuses on the uncertainty perceived from the point of view of those providing information to support policy decisions (i.e., the modellers’ view on uncertainty) – uncertainty regarding the

  5. Theory Comparison: Uncertainty Reduction, Problematic Integration, Uncertainty Management, and Other Curious Constructs.

    ERIC Educational Resources Information Center

    Bradac, James J.

    2001-01-01

    Compares three theories examining the role of communication in producing and coping with subjective uncertainty. Notes that uncertainty reduction theory offers axioms and derived theorems that describe communicative and noncommunicative causes and consequences of uncertainty. Compares meanings of "uncertainty" in the three theories as well as the…

  6. Few group collapsing of covariance matrix data based on a conservation principle

    SciTech Connect

    Hiruta,H.; Palmiotti, G.; Salvatores, M.; Arcilla, Jr., R.; Oblozinsky, P.; McKnight, R.D.

    2008-06-24

    A new algorithm for rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that preserves, at a broad energy-group structure, the uncertainty calculated in a fine energy-group structure for a specific integral parameter, using the associated sensitivity coefficients as weights.
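
    A sketch of what such a sensitivity-weighted, conservation-preserving collapse can look like (implementation and notation assumed here, not copied from the paper):

      import numpy as np

      # Build a broad-group covariance that, paired with the summed
      # sensitivities, reproduces the fine-group uncertainty s^T C s of the
      # integral parameter.
      def collapse(C_fine, s_fine, groups):
          S = np.array([s_fine[g].sum() for g in groups])
          C_broad = np.empty((len(groups), len(groups)))
          for I, gI in enumerate(groups):
              for J, gJ in enumerate(groups):
                  C_broad[I, J] = (s_fine[gI] @ C_fine[np.ix_(gI, gJ)] @ s_fine[gJ]
                                   / (S[I] * S[J]))
          return C_broad, S

      C = 1e-4 * np.array([[4.0, 1.0, 0.0], [1.0, 9.0, 2.0], [0.0, 2.0, 16.0]])
      s = np.array([0.5, 0.3, 0.2])
      Cb, Sb = collapse(C, s, [np.array([0, 1]), np.array([2])])
      print(s @ C @ s, Sb @ Cb @ Sb)   # identical: the uncertainty is conserved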

  7. Uncertainty In Lagrangian Pollutant Transport Simulations Due to Meteorological Uncertainty at Mesoscale

    NASA Astrophysics Data System (ADS)

    Angevine, W. M.; Brioude, J. F.; McKeen, S. A.

    2014-12-01

    Lagrangian particle dispersion models, used to estimate emissions from observations, require meteorological fields as input. Uncertainty in the driving meteorology is one of the major uncertainties in the results. The propagation of uncertainty through the system is not simple, and has not been thoroughly explored. Here, we take an ensemble approach. Six different configurations of the Weather Research and Forecast (WRF) model drive otherwise identical simulations with FLEXPART for 49 days over eastern North America. The ensemble spreads of wind speed, mixing height, and tracer concentration are presented. Uncertainty of tracer concentrations due solely to meteorological uncertainty is 30-40%. Spatial and temporal averaging reduces the uncertainty marginally. Tracer age uncertainty due solely to meteorological uncertainty is 15-20%. These are lower bounds on the uncertainty, because a number of processes are not accounted for in the analysis. It is not yet known exactly how these uncertainties will propagate through inversions to affect emissions estimates.
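
    For concreteness, the spread statistic behind the quoted percentages can be illustrated as follows (the member values are synthetic, invented for the example):

      import numpy as np

      # Relative ensemble spread of a tracer concentration across six
      # WRF-driven FLEXPART members.
      members = np.array([30.0, 65.0, 40.0, 80.0, 55.0, 45.0])  # hypothetical ppb
      spread = members.std(ddof=1) / members.mean()
      print(f"relative spread: {spread:.0%}")   # ~34%, of the order quoted above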

  8. Towards first-principles electrochemistry

    E-print Network

    Dabo, Ismaila

    2008-01-01

    This doctoral dissertation presents a comprehensive computational approach to describe quantum mechanical systems embedded in complex ionic media, primarily focusing on the first-principles representation of catalytic ...

  9. Entropic uncertainty relations for the ground state of a coupled system

    SciTech Connect

    Santhanam, M.S. [Max Planck Institute for the Physics of Complex Systems, Noethnitzer Strasse 38, Dresden 01187 (Germany)

    2004-04-01

    There is a renewed interest in the uncertainty principle, reformulated from the information theoretic point of view, called the entropic uncertainty relations. They have been studied for various integrable systems as a function of their quantum numbers. In this work, focussing on the ground state of a nonlinear, coupled Hamiltonian system, we show that approximate eigenstates can be constructed within the framework of adiabatic theory. Using the adiabatic eigenstates, we estimate the information entropies and their sum as a function of the nonlinearity parameter. We also briefly look at the information entropies for the highly excited states in the system.
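
    For reference, a standard statement of the entropic uncertainty relations mentioned here is the Maassen-Uffink bound for observables with eigenbases $\{|a_i\rangle\}$ and $\{|b_j\rangle\}$:

      \[
      H(A) + H(B) \;\ge\; -2 \ln c, \qquad c = \max_{i,j}\bigl|\langle a_i | b_j \rangle\bigr|,
      \]

    where $H$ is the Shannon entropy of the measurement-outcome distribution; the entropy sums estimated in the paper are of this general type.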

  10. Uncertainty in geological and hydrogeological data

    NASA Astrophysics Data System (ADS)

    Nilsson, B.; Højberg, A. L.; Refsgaard, J. C.; Troldborg, L.

    2007-09-01

    Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples on uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.

  11. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. PMID:21175720
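
    A Monte Carlo sketch of the effect (illustrative only, using a lognormal risk factor and a plug-in quantile; the paper gives exact closed-form results instead):

      import numpy as np
      from math import erf, sqrt

      # The threshold is the 99% quantile of a lognormal fitted to a small
      # sample; averaged over samples, the realized failure frequency exceeds
      # the nominal 1%.
      rng = np.random.default_rng(7)
      n, reps, z99 = 20, 20_000, 2.3263478740408408  # standard-normal 99% quantile

      fail = 0.0
      for _ in range(reps):
          logx = rng.normal(0.0, 1.0, n)             # true parameters: mu=0, sigma=1
          threshold = logx.mean() + z99 * logx.std(ddof=1)  # fitted quantile (log scale)
          # true exceedance probability of that threshold under the true parameters
          fail += 1.0 - 0.5 * (1.0 + erf(threshold / sqrt(2.0)))
      print(fail / reps)                             # noticeably above 0.01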

  12. Sticking to its principles.

    PubMed

    1992-03-27

    Planned Parenthood says that rather than accept the Bush administration's gag rule it will give up federal funding of its operations. The gag rule forbids professionals at birth control clinics from even referring to abortion as an option to a pregnant woman, much less recommending one. President Bush has agreed to a policy which allows physicians, but no one else at clinics, to discuss abortion in at least some cases. In his view, according to White House officials, this was an admitted attempt to straddle the issue. Why he would want to straddle is understandable. The right wing of his party, which has always been suspicious of Mr. Bush, is pushing him to uphold what it regards as the Reagan legacy on this issue. The original gag rule, which prevented even physicians from discussing abortion as an option in almost all cases, was issued in the last president's second term and upheld last year by the Supreme Court. Give Planned Parenthood credit for sticking to its principles. A lot of recipients of all sorts of federal funds want it both ways: take the money but don't accept federal policy guidelines. When they find they can't, many "rise above principle," take the money and adjust policy accordingly. It is not going to be easy for Planned Parenthood now. Federal funds account for a significant portion of the organizations' budgets. Planned Parenthood of Maryland, for example, gets about $500,000 a year from the federal government, or about 12-13% of its total budget. It will either have to cut back on its services, increase its fundraising from other sources, or charge women more for services--or all of those things. This is not the end of the story. It is certainly not the end of the political story. Pat Buchanan said of the new regulations, "I like the old position, to be quite candid." Thank goodness he never won a primary. George Bush would not have moved even as far as he did on the gag rule. There will be a lot of agreement with the Buchanan view at the Republican national convention. We can only hope that by then the president will be looking to the general election campaign and a Democratic opponent who will be appealing to Republican women on this issue. Perhaps then he will relax the gag order a little more. PMID:12317218

  13. A Bayesian foundation for individual learning under uncertainty.

    PubMed

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J; Stephan, Klaas E

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory. PMID:21629826
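
    A minimal special case of such uncertainty-weighted updating (a scalar Kalman-style update, stated as an illustration rather than the paper's full variational hierarchy):

      # The learning rate applied to the prediction error is the ratio of prior
      # to total uncertainty, so more uncertain (volatile) beliefs update faster.
      def update(mu, var, y, obs_var):
          lr = var / (var + obs_var)       # precision-weighting of the error
          return mu + lr * (y - mu), (1.0 - lr) * var

      print(update(0.0, 1.0, 2.0, 0.5))    # uncertain prior: large update (lr ~ 0.67)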

  14. Essays on pricing under uncertainty 

    E-print Network

    Escobari Urday, Diego Alfonso

    2008-10-10

    Ph.D. dissertation, May 2008; major subject: Economics. Committee chair: Li Gan; committee members: Hae-Shin Hwang, Steven Puller, Ximing Wu; head of department: Lawrence J. Oliver. The record contains only front matter (approval page, dedication, acknowledgments); no abstract is available.

  15. Uncertainties in forecasting future climate

    SciTech Connect

    MacCracken, M.C.

    1990-11-01

    The increasing atmospheric concentrations of carbon dioxide, methane, chlorofluorocarbons, and other trace gases (collectively, greenhouse gases) pose a three-part challenge: (1) What the changes to atmospheric composition and the climate system will be; (2) What impacts (both detrimental and beneficial) these changes will induce on the biosphere and natural and societal resources; and (3) What the appropriate response, if any, might be when considering the changes themselves, the resulting impacts, and the benefits and other impacts of the activities generating the emissions. This brief summary will address only areas of agreement and areas of uncertainty related to the first challenge.

  16. Uncertainty of combined activity estimations

    NASA Astrophysics Data System (ADS)

    Ratel, G.; Michotte, C.; Bochud, François O.

    2015-06-01

    This paper discusses basic theoretical strategies used to deal with measurement uncertainties arising from different experimental situations. It attempts to indicate the most appropriate method of obtaining a reliable estimate of the quantity to be evaluated depending on the characteristics of the data available. The theoretical strategies discussed are supported by experimental detail, and the conditions and results have been taken from examples in the field of radionuclide metrology. Special care regarding the correct treatment of covariances is emphasized because of the unreliability of the results obtained if these are neglected.
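
    One standard estimator that such comparisons include (stated as background; it applies as written only to uncorrelated inputs, and the covariance treatment the abstract emphasizes modifies it) is the inverse-variance weighted mean of the individual activity estimates $x_i$ with standard uncertainties $u_i$:

      \[
      \bar{x} \;=\; \frac{\sum_i x_i / u_i^{2}}{\sum_i 1 / u_i^{2}},
      \qquad
      u(\bar{x}) \;=\; \Bigl(\sum_i 1 / u_i^{2}\Bigr)^{-1/2}.
      \]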

  17. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for comparison of the measurement accuracy of existing or newly developed PIV interrogation algorithms. The database is publicly available on the website www.piv.de/uncertainty.

  18. Mirror Principle I

    E-print Network

    B. Lian; K. Liu; S. T. Yau

    1997-12-11

    We propose and study the following Mirror Principle: certain sequences of multiplicative equivariant characteristic classes on Kontsevich's stable map moduli spaces can be computed in terms of certain hypergeometric type classes. As applications, we compute the equivariant Euler classes of obstruction bundles induced by any concavex bundles -- including any direct sum of line bundles -- on $\P^n$. This includes proving the formula of Candelas-de la Ossa-Green-Parkes, hence completing the program of Candelas et al., Kontsevich, Manin, and Givental to compute rigorously the instanton prepotential function for the quintic in $\P^4$. We derive, among many other examples, the multiple cover formula for Gromov-Witten invariants of $\P^1$, computed earlier by Morrison-Aspinwall and by Manin in different approaches. We also prove a formula for enumerating Euler classes which arise in the so-called local mirror symmetry for some noncompact Calabi-Yau manifolds. At the end we interpret an infinite dimensional transformation group, called the mirror group, acting on Euler data, as a certain duality group of the linear sigma model.

  19. Vedic principles of therapy.

    PubMed

    Boyer, R W

    2012-01-01

    This paper introduces Vedic principles of therapy as a holistic integration of healing and human development. The most integrative aspect is a "consciousness-based" approach in which the bottom line of the mind is consciousness itself, accessed by transcending mental activity to its simplest ground state. This directly contrasts with "unconscious-based" approaches, which hold that the basis of conscious mind is the unconscious, such as analytic, humanistic, and cognitive-behavioral approaches. Although not presented as a specific therapeutic approach, interventions associated with this Vedic approach have extensive support in the applied research literature. A brief review of experimental research toward a general model of mind, and of cutting-edge developments in quantum physics toward nonlocal mind, shows a convergence on the ancient Vedic model of mind. Comparisons with contemporary therapies further show that the simplicity, subtlety, and holistic nature of the Vedic approach represent a significant advance over approaches which have overlooked the fundamental ground state of the mind. PMID:22225931

  1. Bateman's principle and immunity.

    PubMed Central

    Rolff, Jens

    2002-01-01

    The immunocompetence handicap hypothesis (ICHH) of Folstad and Karter has inspired a large number of studies that have tried to understand the causal basis of parasite-mediated sexual selection. Even though this hypothesis is based on the double function of testosterone, a hormone restricted to vertebrates, studies of invertebrates have tended to provide central support for specific predictions of the ICHH. I propose an alternative hypothesis that explains many of the findings without relying on testosterone or other biochemical feedback loops. This alternative is based on Bateman's principle, that males gain fitness by increasing their mating success whilst females increase fitness through longevity because their reproductive effort is much higher. Consequently, I predict that females should invest more in immunity than males. The extent of this dimorphism is determined by the mating system and the genetic correlation between males and females in immune traits. In support of my arguments, I mainly use studies on insects that share innate immunity with vertebrates and have the advantage that they are easier to study. PMID:11958720

  2. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  3. [Principles of wound treatment].

    PubMed

    Bruhin, A; Metzger, J

    2007-09-01

    New techniques and devices have revolutionized the treatment of wounds during the last years. For the treatment of wounds we nowadays have a great variety of new gadgets, tools and methods. Complex wounds require specific skills, given the fact that a great number of different promising methods are on the market to enable optimal wound management. Well-educated "wound experts" are required to overcome the problems of very complicated and chronic wounds. The importance of an interdisciplinary team increases when facing special wound disorders such as the diabetic foot, foot ulcers or the open abdomen in severe peritonitis. In this overview the main principles of modern wound treatment are outlined. The aim of this article is to present a good summary of wound assessment and treatment for the practitioner. It is increasingly important to point out that complex wounds should be assessed and treated with the help of a "wound expert". PMID:18075140

  4. Meaty Principles for Environmental Educators.

    ERIC Educational Resources Information Center

    Rockcastle, V. N.

    1985-01-01

    Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)

  5. [Ethical principles in psychiatric action].

    PubMed

    Rüther, Eckart

    2014-07-01

    There is no specific psychiatric ethic. The ethical principles for practical action in psychiatry have to be adapted from the generally accepted ethical principles, which rest on a psychobiologically developed ethic of love: honesty, discretion, empathy, patience, distance, consistency, accountability, tolerance, and economic neutrality. PMID:24983582

  6. Children's Understanding of Conversational Principles.

    ERIC Educational Resources Information Center

    Conti, Daniel J.; Camras, Linda A.

    1984-01-01

    Investigates the development of awareness of conversational principles in preschool, first-, and third-grade children by presenting them with short stories ending with a verbal statement by a story character. Results suggest that children's understanding of conversational principles improves considerably between preschool and first grade.…

  7. A variational principle of hydromechanics

    Microsoft Academic Search

    S. Drobot; A. Rybarski

    1958-01-01

    1. The purpose of this paper is to reveal the part played in variational principles of hydromechanics by a certain group of infinitesimal transformations of the fields of density and velocity. This group, called here the hydromechanical variation and originating from some elementary requirement concerning variation of matter, seems to be essential in variational principles of hydromechanics, not only in

  8. The strong Ekeland variational principle

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomonari

    2006-08-01

    In this paper, we consider the strong Ekeland variational principle due to Georgiev [P.G. Georgiev, The strong Ekeland variational principle, the strong drop theorem and applications, J. Math. Anal. Appl. 131 (1988) 1-21]. We discuss it for functions defined on Banach spaces and on compact metric spaces. We also prove the [tau]-distance version of it.

  9. Principles of Public Engagement

    E-print Network

    [Slide extract from "Principles of Public Engagement" by Paul Tabbush and Bianca Ambrose-Oji] Managing public access in forests is very often about service delivery rather than environmental decision-making: providing access for recreation or health initiatives, and using forests and woodlands as the sites for public events.

  10. Principles of Instructed Language Learning

    ERIC Educational Resources Information Center

    Ellis, Rod

    2005-01-01

    This article represents an attempt to draw together findings from a range of second language acquisition studies in order to formulate a set of general principles for language pedagogy. These principles address such issues as the nature of second language (L2) competence (as formulaic and rule-based knowledge), the contributions of both focus on…

  11. Challenging Proteins Principles and Methods

    E-print Network

    Lebendiker, Mario

    [Handbook front-matter extract] Related titles: Gel Filtration: Principles and Methods (18-1022-18); Recombinant Protein Purification Handbook: Principles and Methods (18-1142-75); Protein Purification Handbook (18-1132-29); Hydrophobic Interaction and Reversed Phase Chromatography. Section: Purification of integral membrane proteins for structural and functional studies.

  12. Multimedia Principle in Teaching Lessons

    ERIC Educational Resources Information Center

    Kari Jabbour, Khayrazad

    2012-01-01

    Multimedia learning principle occurs when we create mental representations from combining text and relevant graphics into lessons. This article discusses the learning advantages that result from adding multimedia learning principle into instructions; and how to select graphics that support learning. There is a balance that instructional designers…

  13. Principles of Play for Soccer

    ERIC Educational Resources Information Center

    Ouellette, John

    2004-01-01

    Soccer coaches must understand the principles of play if they want to succeed. The principles of play are the rules of action that support the basic objectives of soccer and the foundation of a soccer coaching strategy. They serve as a set of permanent criteria that coaches can use to evaluate the efforts of their team. In this article, the author…

  14. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    Development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes, developed independently, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  15. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized; techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
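
    As a concrete illustration of the propagation step described above, here is a minimal sketch (hypothetical instrument and values, not from the paper) of first-order, root-sum-square propagation of independent measurement uncertainties through a defining functional expression:

        import numpy as np

        def propagate(f, x, u, eps=1e-6):
            """Root-sum-square (first-order Taylor) propagation of independent
            standard uncertainties u through y = f(x)."""
            x = np.asarray(x, dtype=float)
            y0 = f(x)
            # finite-difference sensitivity coefficients dy/dx_i
            sens = np.array([(f(x + eps * e) - y0) / eps for e in np.eye(x.size)])
            return y0, np.sqrt(np.sum((sens * np.asarray(u)) ** 2))

        # Hypothetical example: dynamic pressure q = rho * v^2 / 2
        q, u_q = propagate(lambda x: 0.5 * x[0] * x[1] ** 2,
                           x=[1.225, 30.0], u=[0.010, 0.25])
        print(f"q = {q:.1f} Pa, expanded uncertainty (k=2) = {2 * u_q:.1f} Pa")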

  16. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
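
    The linear ("data worth") analysis mentioned above can be sketched in a few lines. This is a generic first-order, Bayesian-linear version under assumed Gaussian priors and observation noise, not the actual Yucca Mountain workflow; all matrices below are hypothetical stand-ins:

        import numpy as np

        def predictive_variance(J, y, Cp, Cd):
            """Posterior (linear/FOSM) variance of a prediction s = y.p after
            calibrating parameters p against observations with Jacobian J."""
            # posterior parameter covariance (Bayesian linear regression form)
            Cpost = np.linalg.inv(np.linalg.inv(Cp) + J.T @ np.linalg.inv(Cd) @ J)
            return y @ Cpost @ y

        rng = np.random.default_rng(0)
        J = rng.normal(size=(8, 4))          # 8 existing observations, 4 parameters
        y = np.array([1.0, -0.5, 0.2, 0.0])  # prediction sensitivity vector
        Cp, Cd = np.eye(4), 0.1 * np.eye(8)  # prior and noise covariances

        base = predictive_variance(J, y, Cp, Cd)
        # worth of a candidate observation = reduction in predictive variance
        j_new = rng.normal(size=(1, 4))
        with_new = predictive_variance(np.vstack([J, j_new]), y, Cp, 0.1 * np.eye(9))
        print(f"variance {base:.3f} -> {with_new:.3f} with one added observation")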

  17. Uncertainty in Lagrangian pollutant transport simulations due to meteorological uncertainty at mesoscale

    NASA Astrophysics Data System (ADS)

    Angevine, W. M.; Brioude, J.; McKeen, S.; Holloway, J. S.

    2014-07-01

    Lagrangian particle dispersion models require meteorological fields as input. Uncertainty in the driving meteorology is one of the major uncertainties in the results. The propagation of uncertainty through the system is not simple, and has not been thoroughly explored. Here, we take an ensemble approach. Six different configurations of the Weather Research and Forecast (WRF) model drive otherwise identical simulations with FLEXPART for 49 days over eastern North America. The ensemble spreads of wind speed, mixing height, and tracer concentration are presented. Uncertainty of tracer concentrations due solely to meteorological uncertainty is 30-40%. Spatial and temporal averaging reduces the uncertainty marginally. Tracer age uncertainty due solely to meteorological uncertainty is 15-20%. These are lower bounds on the uncertainty, because a number of processes are not accounted for in the analysis.
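
    A sketch of the ensemble-spread calculation, with synthetic concentrations standing in for the FLEXPART output:

        import numpy as np

        rng = np.random.default_rng(1)
        # synthetic stand-in for tracer concentrations: 6 WRF-driven members
        # at 1000 receptor/time points
        conc = rng.lognormal(mean=0.0, sigma=0.35, size=(6, 1000))

        spread = conc.std(axis=0, ddof=1) / conc.mean(axis=0)
        print(f"median fractional spread: {np.median(spread):.0%}")

        # averaging over blocks of 10 points; with the correlated errors of a
        # real ensemble this reduces the spread only marginally (the synthetic
        # noise here is independent, so the reduction shown is optimistic)
        block = conc.reshape(6, 100, 10).mean(axis=2)
        spread_avg = block.std(axis=0, ddof=1) / block.mean(axis=0)
        print(f"after averaging: {np.median(spread_avg):.0%}")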

  18. Uncertainty in Lagrangian pollutant transport simulations due to meteorological uncertainty from a mesoscale WRF ensemble

    NASA Astrophysics Data System (ADS)

    Angevine, W. M.; Brioude, J.; McKeen, S.; Holloway, J. S.

    2014-12-01

    Lagrangian particle dispersion models require meteorological fields as input. Uncertainty in the driving meteorology is one of the major uncertainties in the results. The propagation of uncertainty through the system is not simple, and it has not been thoroughly explored. Here, we take an ensemble approach. Six different configurations of the Weather Research and Forecast (WRF) model drive otherwise identical simulations with FLEXPART-WRF for 49 days over eastern North America. The ensemble spreads of wind speed, mixing height, and tracer concentration are presented. Uncertainty of tracer concentrations due solely to meteorological uncertainty is 30-40%. Spatial and temporal averaging reduces the uncertainty marginally. Tracer age uncertainty due solely to meteorological uncertainty is 15-20%. These are lower bounds on the uncertainty, because a number of processes are not accounted for in the analysis.

  19. Position-Momentum Uncertainty Relations in the Presence of Quantum Memory

    E-print Network

    Fabian Furrer; Mario Berta; Marco Tomamichel; Volkher B. Scholz; Matthias Christandl

    2015-01-05

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  20. Position-momentum uncertainty relations in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-01

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  1. The Travelling Wave Group - 5 Departures from Dirac's Principles

    NASA Astrophysics Data System (ADS)

    Bourdillon, Antony J.

    2014-03-01

    The Traveling Wave Group (TWG) for a free particle is written ψ = A exp(X²/(2σ²) + X), where X = i(kx - ωt), σ is an experimental initial value, A is a normalizing constant dependent on it, ω is the mean angular frequency, and k the mean wave vector. Unlike Dirac's unstable wave packet, the TWG is stable. From it, the following are derived: the Uncertainty Principle; Planck's law; the de Broglie hypothesis; phase velocity; pseudo-mass M'; conservation of M'PT; 5-dimensional space; mass as a local excess of energy over momentum; an explanation for entanglement at a distance, etc.
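
    A short check of the formula as reconstructed above (the exponential form is an assumption recovered from context, since the extraction dropped it, but it is what makes the listed results follow): with X purely imaginary, the quadratic term supplies a Gaussian envelope of fixed width,

        % X = i(kx - \omega t)  =>  X^2 = -(kx - \omega t)^2
        \psi = A\,\exp\!\Bigl(\tfrac{X^{2}}{2\sigma^{2}} + X\Bigr)
             = A\,\exp\!\Bigl[-\tfrac{(kx-\omega t)^{2}}{2\sigma^{2}}\Bigr]\,
               e^{\,i(kx-\omega t)},

    so |ψ| depends only on the phase kx - ωt and the group propagates without spreading, consistent with the claimed stability.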

  2. Optimality principles for the visual code

    NASA Astrophysics Data System (ADS)

    Pitkow, Xaq

    One way to try to make sense of the complexities of our visual system is to hypothesize that evolution has developed nearly optimal solutions to the problems organisms face in the environment. In this thesis, we study two such principles of optimality for the visual code. In the first half of this dissertation, we consider the principle of decorrelation. Influential theories assert that the center-surround receptive fields of retinal neurons remove spatial correlations present in the visual world. It has been proposed that this decorrelation serves to maximize information transmission to the brain by avoiding transfer of redundant information through optic nerve fibers of limited capacity. While these theories successfully account for several aspects of visual perception, the notion that the outputs of the retina are less correlated than its inputs has never been directly tested at the site of the putative information bottleneck, the optic nerve. We presented visual stimuli with naturalistic image correlations to the salamander retina while recording responses of many retinal ganglion cells using a microelectrode array. The output signals of ganglion cells are indeed decorrelated compared to the visual input, but the receptive fields are only partly responsible. Much of the decorrelation is due to the nonlinear processing by neurons rather than the linear receptive fields. This form of decorrelation dramatically limits information transmission. Instead of improving coding efficiency we show that the nonlinearity is well suited to enable a combinatorial code or to signal robust stimulus features. In the second half of this dissertation, we develop an ideal observer model for the task of discriminating between two small stimuli which move along an unknown retinal trajectory induced by fixational eye movements. The ideal observer is provided with the responses of a model retina and guesses the stimulus identity based on the maximum likelihood rule, which involves sums over all random walk trajectories. These sums can be implemented in a biologically plausible way. The necessary ingredients are: neurons modeled as a cascade of a linear filter followed by a static nonlinearity, a recurrent network with additive and multiplicative interactions between neurons, and divisive global inhibition. This architecture implements Bayesian inference by representing likelihoods as neural activity which can then diffuse through the recurrent network and modulate the influence of later information. We also develop approximation methods for characterizing the performance of the ideal observer. We find that the effect of positional uncertainty is essentially to slow the acquisition of signal. The time scaling is related to the size of the uncertainty region, which is in turn related to both the signal strength and the statistics of the fixational eye movements. These results imply that localization cues should determine the slope of the performance curve in time.

  3. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates has recently been published, wherein the experimentally obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that will reproduce the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered.
    Lastly, we show that while the capability of the multispecies measurement presents a step-change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data on global combustion properties are still necessary to predict fuel combustion.
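
    The polynomial chaos machinery at the core of the method can be illustrated with a one-parameter toy problem; the response function below is illustrative only, whereas the thesis applies the expansion to full kinetic models:

        import numpy as np
        from math import factorial
        from numpy.polynomial import hermite_e as He

        # toy stand-in for a combustion observable as a function of one
        # normalized rate parameter x ~ N(0, 1)
        eta = lambda x: np.exp(0.3 * x) / (1.0 + 0.1 * x ** 2)

        # project onto probabilists' Hermite polynomials He_k via Gauss
        # quadrature: eta(x) ~ sum_k a_k He_k(x), with E[He_j He_k] = k! d_jk
        nodes, weights = He.hermegauss(20)
        weights = weights / np.sqrt(2.0 * np.pi)   # normalize to N(0,1) measure
        a = [np.sum(weights * eta(nodes) * He.hermeval(nodes, [0.0] * k + [1.0]))
             / factorial(k) for k in range(5)]

        # spectral mean and variance of the expansion
        mean = a[0]
        var = sum(a[k] ** 2 * factorial(k) for k in range(1, 5))
        print(f"PCE mean = {mean:.4f}, PCE std = {np.sqrt(var):.4f}")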

  4. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive uncertainty technique to evaluate uncertainty, resulting in a more realistic evaluation of the probability of interruption (PI), thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated variable dependence in the equation for PI.
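
    Reading PI as the probability of interruption (the record does not expand the acronym; this reading is standard in physical-security analysis), the aleatory sampling idea can be sketched as follows, with all distributions hypothetical:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        # hypothetical aleatory distributions: three protection layers, each
        # with an uncertain detection probability and defeat-time delay [s]
        p_detect = rng.beta(8.0, 2.0, size=(N, 3))
        delays = rng.lognormal(mean=3.0, sigma=0.3, size=(N, 3))
        response = rng.lognormal(mean=3.6, sigma=0.25, size=N)  # response time

        detected = rng.random((N, 3)) < p_detect
        any_det = detected.any(axis=1)
        first = detected.argmax(axis=1)                 # first detecting layer
        tail = delays[:, ::-1].cumsum(axis=1)[:, ::-1]  # delay remaining from layer k
        remaining = tail[np.arange(N), first]
        # interrupted if detected and the response arrives before task completion
        pi = np.mean(any_det & (response < remaining))
        print(f"PI ~= {pi:.3f}")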

  5. Assessment of AERONET-OC LWN uncertainties

    NASA Astrophysics Data System (ADS)

    Gergely, Mathias; Zibordi, Giuseppe

    2014-02-01

    This study presents a detailed analysis of the uncertainties affecting the normalized water-leaving radiance (LWN) from above-water measurements performed within the context of the Ocean Colour component of the Aerosol Robotic Network (AERONET-OC). The analysis, conducted in agreement with the 'Guide to the Expression of Uncertainty in Measurement' (GUM), indicates uncertainties of LWN markedly dependent on the optical properties of seawater for a number of AERONET-OC sites located in different marine regions. Results obtained for the Adriatic Sea site, characterized by a large variety of measurement conditions, confirm previous uncertainties from an independent study indicating median values of relative combined uncertainties of 5% in the blue-green part of the spectrum and of approximately 7% in the red. Additional investigations show that the former uncertainties can be reduced by ~1% when restricting the determination of AERONET-OC LWN to measurements performed at low sun zenith angle and low aerosol optical thickness.

  6. Environmental load uncertainties for offshore structures

    SciTech Connect

    Nessim, M.A.; Hong, H.P. [Centre for Engineering Research Inc., Edmonton, Alberta (Canada)]; Jordaan, I.J. [Memorial Univ. of Newfoundland, St. John's, Newfoundland (Canada), Faculty of Engineering and Applied Science]

    1995-11-01

    A methodology for assessing the effect of different sources of uncertainty on the calculation of load effect on offshore structures is presented. A consistent classification of uncertainties was adopted and used as a basis to develop models to estimate the effect of different uncertainties on specified design loads. It is shown that distribution parameter uncertainties arising from limitations on the quantity of statistical data are not likely to have a significant effect on design loads. By contrast, model uncertainties can greatly increase the design loads, and the increase is sensitive to the probabilistic models used to describe model error. The methodology and results can be used by design engineers to take model uncertainties into account in estimating specified loads. They also form the basis for developing and calibrating a new information-sensitive code format.

  7. The uncertainty of the half-life

    NASA Astrophysics Data System (ADS)

    Pommé, S.

    2015-06-01

    Half-life measurements of radionuclides are undeservedly perceived as ‘easy’ and the experimental uncertainties are commonly underestimated. Data evaluators, scanning the literature, are faced with bad documentation, lack of traceability, incomplete uncertainty budgets and discrepant results. Poor control of uncertainties has its implications for the end-user community, varying from limitations to the accuracy and reliability of nuclear-based analytical techniques to the fundamental question whether half-lives are invariable or not. This paper addresses some issues from the viewpoints of the user community and of the decay data provider. It addresses the propagation of the uncertainty of the half-life in activity measurements and discusses different types of half-life measurements, typical parameters influencing their uncertainty, a tool to propagate the uncertainties and suggestions for a more complete reporting style. Problems and solutions are illustrated with striking examples from literature.
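
    The propagation of u(T1/2) into an activity value, the first issue listed above, is compact enough to show directly; the numbers below are illustrative values of roughly Co-60 magnitude, not evaluated data:

        import numpy as np

        # decay correction A(t) = A0 * exp(-ln2 * t / T); first-order
        # propagation of the half-life uncertainty u_T into A
        ln2 = np.log(2.0)
        T, u_T = 1925.28, 0.14           # half-life and uncertainty [d]
        t = 3.0 * 365.25                 # decay time [d]
        rel_u_A = ln2 * t * u_T / T**2   # |d(ln A)/dT| * u_T
        print(f"relative uncertainty of A from u(T1/2): {rel_u_A:.2e}")

    The relative effect grows linearly with the decay time t, which is why half-life uncertainties matter most for long decay corrections.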

  8. Dealing with parameter uncertainty in the calculation of water surface profiles 

    E-print Network

    Vargas-Cruz, Ruben F.

    1998-01-01

    [Contents extract] Flow classification; state of flow; governing principles in open-channel hydraulics: conservation of mass, conservation of energy, conservation of momentum; uniform flow. [...] (iv) backwater profile; (v) flood velocities; (vi) local inflow; and (vii) the effect of an increase in regulatory-flood height on the outer fringe boundary location (Burges, 1979). Therefore, uncertainty must be considered at the moment of the estimation...

  9. Uncertainty relation for the optimization of optical-fiber transmission systems simulations.

    PubMed

    Rieznik, A; Tolisano, T; Callegari, F A; Grosz, D; Fragnito, H

    2005-05-16

    The mathematical inequality which in quantum mechanics gives rise to the uncertainty principle between two non-commuting operators is used to develop a spatial step-size selection algorithm for the Split-Step Fourier Method (SSFM) for solving Generalized Non-Linear Schrödinger Equations (G-NLSEs). Numerical experiments are performed to analyze the efficiency of the method in modeling optical-fiber communications systems, showing its advantages relative to other algorithms. PMID:19495289
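
    For context, here is a minimal fixed-step symmetrized SSFM for the basic scalar NLSE, with illustrative fiber parameters; the paper's contribution is an adaptive choice of the step dz based on an uncertainty-type inequality, which this baseline sketch does not implement:

        import numpy as np

        def ssfm(A, t, beta2=-21.0e-27, gamma=1.3e-3, L=1.0e3, nz=2000):
            """Symmetrized split-step Fourier solver for the lossless NLSE
            dA/dz = -i*(beta2/2)*d2A/dt2 + i*gamma*|A|^2*A  (SI units)."""
            A = A.astype(complex)
            w = 2.0 * np.pi * np.fft.fftfreq(t.size, t[1] - t[0])
            dz = L / nz
            half_disp = np.exp(0.5j * (beta2 / 2.0) * w**2 * dz)
            for _ in range(nz):
                A = np.fft.ifft(half_disp * np.fft.fft(A))    # D/2
                A *= np.exp(1j * gamma * np.abs(A)**2 * dz)   # N
                A = np.fft.ifft(half_disp * np.fft.fft(A))    # D/2
            return A

        # 10-ps Gaussian pulse over 1 km of fiber (illustrative parameters)
        t = np.linspace(-100e-12, 100e-12, 2**12)
        A_out = ssfm(np.sqrt(1e-3) * np.exp(-(t / 10e-12)**2), t)
        print(f"output peak power: {np.abs(A_out).max()**2:.2e} W")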

  10. Geostatistical modelling of uncertainty in soil science

    Microsoft Academic Search

    P. Goovaerts

    2001-01-01

    This paper addresses the issue of modelling the uncertainty about the value of continuous soil attributes, at any particular unsampled location (local uncertainty) as well as jointly over several locations (multiple-point or spatial uncertainty). Two approaches are presented: kriging-based and simulation-based techniques that can be implemented within a parametric (e.g. multi-Gaussian) or non-parametric (indicator) framework. As expected in theory and

  11. Uncertainties in nuclear decay data evaluations

    NASA Astrophysics Data System (ADS)

    Bé, M.-M.; Chechev, V. P.; Pearce, A.

    2015-06-01

    Over the years, scientists have compiled and evaluated experimental results and combined these with theoretical studies with the goal of obtaining the best values, and their uncertainties, for the quantities related to radioactive decay. To meet the demand, an international group, the Decay Data Evaluation Project, was organised in 1995. The methodology adopted by this group is detailed. Some examples are given explaining how the final uncertainties are assessed from the experimental results and their uncertainties.
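
    A sketch of the kind of combination such evaluations perform: the inverse-variance weighted mean with internal and external uncertainties and the Birge ratio. The inputs are hypothetical, and actual DDEP procedures include further checks:

        import numpy as np

        def weighted_mean(x, u):
            """Inverse-variance weighted mean with internal/external
            uncertainties and the Birge (reduced chi-square) ratio."""
            x, u = np.asarray(x, float), np.asarray(u, float)
            w = 1.0 / u**2
            m = np.sum(w * x) / np.sum(w)
            u_int = 1.0 / np.sqrt(np.sum(w))             # internal uncertainty
            chi2red = np.sum(w * (x - m)**2) / (x.size - 1)
            u_ext = u_int * np.sqrt(chi2red)             # external uncertainty
            return m, max(u_int, u_ext), np.sqrt(chi2red)

        # hypothetical discrepant half-life results from four experiments [a]
        m, u, birge = weighted_mean([5.271, 5.273, 5.268, 5.275],
                                    [0.002, 0.001, 0.003, 0.002])
        print(f"evaluated value {m:.4f} +/- {u:.4f} a, Birge ratio {birge:.2f}")

    A Birge ratio well above 1 signals discrepant data, in which case evaluators typically adopt the larger external uncertainty or re-examine the inputs.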

  12. Robust Preliminary Space Mission Design under Uncertainty

    Microsoft Academic Search

    Massimiliano Vasile; Nicolas Croisard

    \\u000a This chapter presents the design of a space mission at a preliminary stage, when uncertainties are high. At this particular\\u000a stage, an insufficient consideration for uncertainty could lead to a wrong decision on the feasibility of the mission. Contrary\\u000a to the traditional margin approach, the methodology presented here explicitly introduces uncertainties in the design process.\\u000a The overall system design is

  13. Uncertainty in integrated coastal zone management

    Microsoft Academic Search

    Henriëtte S. Otter

    2000-01-01

    Uncertainty plays a major role in Integrated Coastal Zone Management (ICZM). A large part of this uncertainty is connected to our lack of knowledge of the integrated functioning of the coastal system and to the increasing need to act in a pro-active way. Increasingly, coastal managers are forced to take decisions based on information which is surrounded by uncertainties. Different

  14. Progress in classical and quantum variational principles

    Microsoft Academic Search

    C. G. Gray; G. Karl; V. A. Novikov

    2004-01-01

    We review the development and practical uses of a generalized Maupertuis least action principle in classical mechanics in which the action is varied under the constraint of fixed mean energy for the trial trajectory. The original Maupertuis (Euler-Lagrange) principle constrains the energy at every point along the trajectory. The generalized Maupertuis principle is equivalent to Hamilton's principle. Reciprocal principles are

  15. An Inconvenient Principle

    NASA Astrophysics Data System (ADS)

    Bellac, Michel Le

    2014-11-01

    At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle submitted to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces in mechanics a quantity characteristic of optics, the velocity of light. However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".

  16. OECD Principles of Corporate Governance

    NSDL National Science Digital Library

    The "Organisation for Economic Co-operation and Development Principles of Corporate Governance" sets out a structure for directing and controlling corporate businesses. This document (html or .pdf) consists of five sections detailing the principles: "The rights of shareholders," "The equitable treatment of shareholders," "The role of stakeholders in corporate governance," "Disclosure and transparency," and "The responsibilities of the board," as well as annotations for each of the sections. Be sure to visit the OECD Principles of Corporate Governance Q&A page, linked at the top of the page.

  17. Ideas underlying quantification of margins and uncertainties (QMU): a white paper.

    SciTech Connect

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  18. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

    2014-11-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a 'valid' measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an 'outlier' measurement. Finally the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data as well as experimental measurements. In this work, U_{68.5} uncertainties are estimated at the 68.5% confidence level while U_{95} uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
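
    One of the simplest correlation SNR metrics in this family, the primary peak ratio, can be sketched directly from a correlation plane. The plane below is synthetic, and the mapping from such metrics to uncertainty estimates (the paper's models) is not reproduced here:

        import numpy as np
        from scipy.ndimage import maximum_filter

        def primary_peak_ratio(corr):
            """Ratio of the two tallest local maxima of a correlation plane,
            after subtracting the plane minimum (background correction)."""
            c = corr - corr.min()
            local_max = (c == maximum_filter(c, size=3))
            vals = np.sort(c[local_max])[::-1]
            return vals[0] / vals[1] if vals.size > 1 else np.inf

        # synthetic correlation plane: displacement peak plus a noise floor
        y, x = np.mgrid[0:64, 0:64]
        corr = np.exp(-((x - 40.3)**2 + (y - 22.7)**2) / 8.0)
        corr += 0.1 * np.random.default_rng(3).random((64, 64))
        print(f"PPR = {primary_peak_ratio(corr):.1f}")  # higher PPR, lower uncertainty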

  19. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  1. Comment on 'Symplectic quantization, inequivalent quantum theories, and Heisenberg's principle of uncertainty'

    SciTech Connect

    Latimer, D. C. [Department of Physics and Astronomy, Valparaiso University, Valparaiso, Indiana 46383 (United States)

    2007-06-15

    In Phys. Rev. A 70, 032104 (2004), M. Montesinos and G. F. Torres del Castillo consider various symplectic structures on the classical phase-space of the two-dimensional isotropic harmonic oscillator. Using Dirac's quantization condition, the authors investigate how these alternative symplectic forms affect this system's quantization. They claim that these symplectic structures result in mutually inequivalent quantum theories. In fact, we show here that there exists a unitary map between the two representation spaces so that the various quantizations are equivalent.

  2. A Computational Model of Limb Impedance Control Based on Principles of Internal Model Uncertainty 

    E-print Network

    Mitrovic, Djordje; Klanke, Stefan; Osu, Rieko; Kawato, Mitsuo; Vijayakumar, Sethu

    Efficient human motor control is characterized by an extensive use of joint impedance modulation, which is achieved by co-contracting antagonistic muscles in a way that is beneficial to the specific task. While there is ...

  3. Certainty in Heisenberg's uncertainty principle: Revisiting definitions for estimation errors and disturbance

    NASA Astrophysics Data System (ADS)

    Dressel, Justin; Nori, Franco

    2014-02-01

    We revisit the definitions of error and disturbance recently used in error-disturbance inequalities derived by Ozawa and others by expressing them in the reduced system space. The interpretation of the definitions as mean-squared deviations relies on an implicit assumption that is generally incompatible with the Bell-Kochen-Specker-Spekkens contextuality theorems, and which results in averaging the deviations over a non-positive-semidefinite joint quasiprobability distribution. For unbiased measurements, the error admits a concrete interpretation as the dispersion in the estimation of the mean induced by the measurement ambiguity. We demonstrate how to directly measure not only this dispersion but also every observable moment with the same experimental data, and thus demonstrate that perfect distributional estimations can have nonzero error according to this measure. We conclude that the inequalities using these definitions do not capture the spirit of Heisenberg's eponymous inequality, but do indicate a qualitatively different relationship between dispersion and disturbance that is appropriate for ensembles being probed by all outcomes of an apparatus. To reconnect with the discussion of Heisenberg, we suggest alternative definitions of error and disturbance that are intrinsic to a single apparatus outcome. These definitions naturally involve the retrodictive and interdictive states for that outcome, and produce complementarity and error-disturbance inequalities that have the same form as the traditional Heisenberg relation.

  4. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  5. Measurement of nuclear activity with Ge detectors and its uncertainty

    E-print Network

    Cortes, C A P

    1999-01-01

    The objective of this work is to analyse the influence quantities which affect the activity measurement of gamma-emitting isolated radioactive sources prepared by means of the gravimetric method, and to determine the uncertainty of such a measurement when it is carried out with a gamma spectrometer system with a germanium detector. The work is developed in five chapters. In the first, named Basic Principles, a brief description is given of the meaning of the word 'measurement' and its implications, and the concepts used in this work are presented. In the second chapter the gravimetric method used for the preparation of the gamma-emitting isolated radioactive sources is described, and the problem of determining the main influence ... The results are presented in the fifth chapter, where they are applied to establish the optimum conditions for the measurement of the activity of a gamma-emitting isolated radioactive source with a spectrometer with a germanium detector. (Author)

  6. Form of prior for constrained thermodynamic processes with uncertainty

    NASA Astrophysics Data System (ADS)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both the entropy-conserving and the energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation for the prior and the estimated value of temperature obtained therefrom. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.

  7. Exactly Solvable Dynamical Models with a Minimal Length Uncertainty

    NASA Astrophysics Data System (ADS)

    Bernardo, Reginald Christian S.; Esguerra, Jose Perico H.

    2015-05-01

    We present exact analytical solutions to the classical equations of motion and analyze the dynamical consequences of the existence of a minimal length for the free particle, particle in a linear potential, anti-symmetric constant force oscillator, harmonic oscillator, vertical harmonic oscillator, linear diatomic chain, and linear triatomic chain. It turns out that the speed of a free particle and the magnitude of the acceleration of a particle in a linear potential have larger values compared to the non-minimal length counterparts - the increase in magnitude comes from multiplicative factors proportional to what is known as the generalized uncertainty principle parameter. Our analysis of oscillator systems suggests that the characteristic frequencies of systems also have larger values than the non-minimal length counterparts. In connection with this, we discuss a kind of experimental test with which the existence of a minimal length may be detected at the classical level.

  8. Get Provoked: Applying Tilden's Principles.

    ERIC Educational Resources Information Center

    Shively, Carol A.

    1995-01-01

    This address given to the Division of Interpretation, Yellowstone National Park, Interpretive Training, June 1993, examines successes and failures in interpretive programs for adults and children in light of Tilden's principles. (LZ)

  9. Fundamental principles of particle detectors

    SciTech Connect

    Fernow, R.C.

    1988-01-01

    This paper goes through the fundamental physics of particle-matter interactions that is necessary for the detection of these particles with detectors. A listing of 41 concepts and detector principles is given. 14 refs., 11 figs.

  10. Forecast communication through the newspaper Part 2: perceptions of uncertainty

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.

    2015-04-01

    In the first part of this review, I defined the media filter and how it can operate to frame and blame the forecaster for losses incurred during an environmental disaster. In this second part, I explore the meaning and role of uncertainty when a forecast, and its basis, is communicated through the response and decision-making chain to the newspaper, especially during a rapidly evolving natural disaster which has far-reaching business, political, and societal impacts. Within the media-based communication system, there remains a fundamental disconnect in the definition of uncertainty and the interpretation of the delivered forecast between various stakeholders. The definition and use of uncertainty differs especially between scientific, media, business, and political stakeholders. This is a serious problem for the scientific community when delivering forecasts to the public through the press. As reviewed in Part 1, the media filter can result in a negative frame, which itself is a result of bias, slant, spin, and agenda setting introduced during passage of the forecast and its uncertainty through the media filter. The result is invariably one of anger and fury, which causes loss of credibility and blaming of the forecaster. Generation of a negative frame can be aided by opacity of the decision-making process that the forecast is used to support. The impact of the forecast will be determined during passage through the decision-making chain where the precautionary principle and cost-benefit analysis, for example, will likely be applied. Choice of forecast delivery format, vehicle of communication, syntax of delivery, and lack of follow-up measures can further contribute to causing the forecast and its role to be misrepresented. Follow-up measures to negative frames may include appropriately worded press releases and conferences that target forecast misrepresentation or misinterpretation in an attempt to swing the slant back in favor of the forecaster. Review of meteorological, public health, media studies, social science, and psychology literature opens up a vast and interesting library that is not obvious to the volcanologist at first glance. It shows that forecasts and their uncertainty can be phrased and delivered, and followed up on, in a manner that reduces the chance of message distortion. The mass-media delivery vehicle requires careful tracking because the potential for forecast distortion can result in a frame that the scientific response is "absurd", "confused", "shambolic", or "dysfunctional." This can help set up a "frightened", "frustrated", "angry", even "furious" reaction to the forecast and forecaster.

  11. Uncertainty Analysis of CROPGRO-Cotton Model

    NASA Astrophysics Data System (ADS)

    Pathak, T. B.; Jones, J. W.; Fraisse, C.; Wright, D.; Hoogenboom, G.; Judge, J.

    2009-12-01

    Applications of crop simulation models have become an inherent part of research and the decision-making process. As many decisions rely solely on the results obtained from simulation models, consideration of model uncertainties alongside model accuracy has also become increasingly important. The newly developed CROPGRO-Cotton model is a complex simulation model that has been heavily parameterized. The values of those parameters were obtained from the literature, which also carries uncertainties; the true uncertainty associated with important model parameters was not known. The objective of this study was to estimate the uncertainties associated with model parameters and the corresponding uncertainties in model outputs. The uncertainty assessment was carried out using the widely accepted Generalized Likelihood Uncertainty Estimation (GLUE) technique. Data for this analysis were collected from four different experiments at three geographic locations. Preliminary results show that the uncertainty in model input parameters was narrowed down significantly from the prior knowledge of the selected parameters. The expected means of parameters obtained from their posterior distributions were not considerably different from their prior means and default values in the model; importantly, however, the coefficients of variation of those parameters were reduced considerably. Maximum likelihood estimates of selected parameters improved the model performance. The fit of the model to measured LAI and biomass components was reasonably good, with R-squared values for total above-ground biomass for all four sites ranging between 0.86 and 0.98. The approximate reduction of uncertainties in input parameters ranged between 25% and 85%, and the corresponding reductions in model output uncertainties ranged between 62% and 76%. Most of the measurements were covered within the 95% confidence interval estimated from the 2.5% and 97.5% quantiles of the cumulative distributions of model outputs generated from the posterior distributions of model parameters. The study demonstrated an efficient prediction of uncertainties in model inputs and outputs using the widely accepted GLUE methodology.
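
    The GLUE procedure named above follows a simple recipe that a few lines make concrete; the model and observations below are toy stand-ins, whereas the study applies the method to the full CROPGRO-Cotton model:

        import numpy as np

        rng = np.random.default_rng(7)
        obs = np.array([1.1, 2.0, 2.9, 4.2])        # hypothetical LAI series

        def model(theta, t=np.arange(1, 5)):
            return theta[0] * t ** theta[1]          # toy growth model

        # 1. sample parameters from wide priors
        thetas = np.column_stack([rng.uniform(0.5, 2.0, 20_000),
                                  rng.uniform(0.5, 1.5, 20_000)])
        sims = np.array([model(th) for th in thetas])

        # 2. informal likelihood (inverse error variance), zero below a cutoff
        sse = ((sims - obs) ** 2).sum(axis=1)
        L = np.where(sse < np.quantile(sse, 0.1), 1.0 / sse, 0.0)
        L /= L.sum()

        # 3. posterior prediction bounds from the likelihood-weighted ensemble
        order = np.argsort(sims[:, -1])
        cdf = np.cumsum(L[order])
        lo, hi = sims[order][np.searchsorted(cdf, [0.025, 0.975]), -1]
        print(f"95% GLUE interval for final LAI: [{lo:.2f}, {hi:.2f}]")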

  12. Data Communication Principles Reliable Data Transfer

    E-print Network

    Ramkumar, Mahalingam

    [Lecture-slide extract] Data Communication Basics. Mahalingam Ramkumar, Mississippi State University, MS; September 8, 2014. Topics: data communication principles; switching; reliable data transfer; data rate of a communication channel.

  13. A new robust optimization approach for scheduling under uncertainty: II. Uncertainty with known probability distribution

    Microsoft Academic Search

    Stacy L. Janak; Xiaoxia Lin; Christodoulos A. Floudas

    2007-01-01

    In this work, we consider the problem of scheduling under uncertainty where the uncertain problem parameters can be described by a known probability distribution function. A novel robust optimization methodology, originally proposed by Lin, Janak, and Floudas [Lin, X., Janak, S. L., & Floudas, C. A. (2004). A new robust optimization approach for scheduling under uncertainty: I. Bounded uncertainty. Computers

  14. Map scale and the communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Lark, Murray

    2015-04-01

    Conventionally the scale at which mapped information is presented in earth sciences reflects the uncertainty in this information. This partly reflects the cartographic sources of error in printed maps, but also conventions on the amount of underpinning observation on which the map is based. In soil surveys a convention is that the number of soil profile observations per unit area of printed map is fixed over a range of scales. For example, for surveys in the Netherlands, Steur (1961) suggested that there should be 5 field observations per cm2 of map. Bie and Beckett (1970) showed that there is a consistent relationship between map scale and the field effort of the soil survey. It is now common practice to map variables by geostatistical methods. The output from kriging can be on the support of the original data (point kriging) or can be upscaled to 'blocks' by block kriging. The block kriging prediction is of the spatial mean of the target variable across a block of specified dimensions. In principle the size of the block on which data are presented can be varied arbitrarily. In some circumstances the block size may be determined by operational requirements. However, for general purposes, predictions can be presented for blocks of any size. The same variable, sampled at a fixed intensity, could be presented as estimates for blocks 10 × 10 m on one map and 100 × 100 m on another map. The data user might be tempted to assume that the predictions on smaller blocks provide more information than the larger blocks. However, the prediction variance of the block mean diminishes with block size, so improvement of the notional resolution of the information is accompanied by a reduction in its precision. This precision can be quantified by the block kriging variance; however, this on its own may not serve to indicate whether the block size represents a good compromise between resolution and precision in a particular circumstance, such that the resolution reasonably communicates the uncertainty of information to the data user. In this presentation I show how, in place of the block kriging variance, one can use the model-based correlation between the block kriged estimate and the true spatial mean of the block as a readily interpreted measure of the quality of block-kriging predictions. Graphs of this correlation as a function of block size, for a given sampling configuration, allow one to assess the suitability of different block sizes in circumstances where these are not fixed by operational requirements. For example, it would be possible to determine a new convention by which block kriged predictions are routinely presented only for block sizes such that the correlation exceeds some threshold value. Steur, G.G.L. 1961. Methods of soil survey in use in the Netherlands Soil Survey Institute. Boor Spade 11, 59-77. Bie, S.W., Beckett, P.H.T. 1970. The costs of soil survey. Soils and Fertilizers 34, 1-15.
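
    In symbols, the proposed quality measure can be sketched as follows (notation ours, not necessarily the author's):

        \rho(B) \;=\;
        \frac{\operatorname{Cov}\bigl[\hat Z(B),\,\bar Z(B)\bigr]}
             {\sqrt{\operatorname{Var}\bigl[\hat Z(B)\bigr]\,
                    \operatorname{Var}\bigl[\bar Z(B)\bigr]}},
        \qquad
        \bar Z(B) \;=\; \frac{1}{|B|}\int_{B} Z(\mathbf u)\,\mathrm d\mathbf u,

    where Ẑ(B) is the block-kriging prediction of the true block mean Z̄(B). Every term follows from the variogram model and the sampling configuration alone, so ρ(B) can be plotted against block size before any map is drawn, for example to retain only block sizes with ρ(B) above a conventional threshold.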

  15. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
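
    The ray-count trade-off follows the usual Monte Carlo scaling: the statistical uncertainty of a ray-traced mean shrinks like 1/sqrt(N) while cost grows like N. A hedged Python sketch (the thickness distribution below is an invented stand-in, not a real vehicle geometry):

      import numpy as np

      rng = np.random.default_rng(0)

      def ray_thickness(n_rays):
          """Areal shield thickness seen along random ray directions (illustrative
          surrogate for tracing rays through a real CAD geometry)."""
          return 5.0 + 2.0 * rng.standard_normal(n_rays)   # g/cm^2, hypothetical

      for n in (10**2, 10**3, 10**4, 10**5):
          t = ray_thickness(n)
          sem = t.std(ddof=1) / np.sqrt(n)                 # uncertainty ~ 1/sqrt(n)
          print(f"{n:>6} rays: mean = {t.mean():6.3f} +/- {sem:.3f} g/cm^2")

    Each tenfold increase in rays buys only a factor of about 3.2 in precision, which is exactly the cost-benefit curve such an analysis has to optimize.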

  16. DO MODEL UNCERTAINTY WITH CORRELATED INPUTS

    EPA Science Inventory

    The effect of correlation among the input parameters and variables on the output uncertainty of the Streeter-Phelps water quality model is examined. Three uncertainty analysis techniques are used: sensitivity analysis, first-order error analysis, and Monte Carlo simulation. Modifie...
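
    A minimal sketch of the Monte Carlo technique listed above, applied to the classical Streeter-Phelps deficit equation D(t) = kd*L0/(ka - kd) * (exp(-kd*t) - exp(-ka*t)) + D0*exp(-ka*t); the parameter values, coefficients of variation, and the kd-ka correlation below are hypothetical:

      import numpy as np

      rng = np.random.default_rng(1)

      def do_deficit(kd, ka, L0, D0=1.0, t=2.0):
          """Streeter-Phelps dissolved-oxygen deficit at travel time t (days)."""
          return kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

      mu = np.array([0.3, 0.7, 10.0])            # kd (1/d), ka (1/d), L0 (mg/L)
      sd = mu * 0.1                              # 10% coefficient of variation
      corr = np.array([[1.0, 0.6, 0.0],          # kd and ka positively correlated
                       [0.6, 1.0, 0.0],
                       [0.0, 0.0, 1.0]])
      L = np.linalg.cholesky(np.outer(sd, sd) * corr)

      n = 50_000
      x = mu + rng.standard_normal((n, 3)) @ L.T          # correlated inputs
      D = do_deficit(x[:, 0], x[:, 1], x[:, 2])
      x0 = mu + rng.standard_normal((n, 3)) * sd          # same marginals, independent
      Di = do_deficit(x0[:, 0], x0[:, 1], x0[:, 2])
      print(f"sd(deficit): {D.std(ddof=1):.3f} correlated vs {Di.std(ddof=1):.3f} independent")

    Comparing the two output standard deviations shows how ignoring input correlation misstates the output uncertainty, which is the point of the study.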

  17. Resolving Identity Uncertainty with Learned Random Walks

    Microsoft Academic Search

    Ted Sandler; Lyle H. Ungar; Koby Crammer

    2009-01-01

    A pervasive problem in large relational databases is identity uncertainty, which occurs when multiple entries in a database refer to the same underlying entity in the world. Relational databases exhibit rich graphical structure and are naturally modeled as graphs whose nodes represent entities and whose typed edges represent relations between them. We propose using random walk models for resolving identity uncertainty
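
    The underlying mechanism can be illustrated with a plain random walk with restart over a small record graph; in the paper the walk parameters are learned, whereas this Python sketch fixes them by hand and the adjacency matrix is invented:

      import numpy as np

      # Toy record graph: 5 database records, edges where records share attributes.
      A = np.array([[0, 1, 1, 0, 0],
                    [1, 0, 1, 0, 0],
                    [1, 1, 0, 1, 0],
                    [0, 0, 1, 0, 1],
                    [0, 0, 0, 1, 0]], dtype=float)
      W = A / A.sum(axis=0)                     # column-stochastic transition matrix

      def rwr(seed, c=0.15, iters=200):
          """Random walk with restart: proximity of every record to `seed`."""
          e = np.zeros(A.shape[0]); e[seed] = 1.0
          p = e.copy()
          for _ in range(iters):
              p = (1 - c) * W @ p + c * e
          return p

      # High proximity scores flag records likely to co-refer with record 0.
      print(np.round(rwr(0), 3))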

  18. UFLOW: visualizing uncertainty in fluid flow

    Microsoft Academic Search

    Suresh K. Lodha; Alex Pang; Robert E. Sheehan; Craig M. Wittenbrink

    1996-01-01

    Uncertainty or errors are introduced in fluid flow data as the data is acquired, transformed and rendered. Although researchers are aware of these uncertainties, little has been done to incorporate them in the existing visualization systems for fluid flow. In the absence of integrated presentation of data and its associated uncertainty, the analysis of the visualization is incomplete at

  19. Decision-making under great uncertainty: environmental

    E-print Network

    Weiblen, George D

    Decision-making under great uncertainty: environmental management in an era of global change and potential consequences in decision-making. Standard approaches to decision-making under uncertainty ... Decision-making in the context of global change: Humanity faces unprecedented challenges arising

  20. Efficient uncertainty quantification in unsteady aeroelastic simulations

    Microsoft Academic Search

    J. A. S. Witteveen; H. Bijl

    2009-01-01

    An efficient uncertainty quantification method for unsteady problems is presented in order to achieve a constant accuracy in time for a constant number of samples. The approach is applied to the aeroelastic problems of a transonic airfoil flutter system and the AGARD 445.6 wing benchmark with uncertainties in the flow and the structure.

  1. Programmatic methods for addressing contaminated volume uncertainties

    Microsoft Academic Search

    L. A. DURHAM; R. L. JOHNSON; C. R. RIEMAN; H. L. SPECTOR

    2007-01-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the preremedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly

  2. Programmatic methods for addressing contaminated volume uncertainties

    Microsoft Academic Search

    C. R. Rieman; H. L. Spector; L. A. Durham; R. L. Johnson

    2007-01-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly

  3. UNCERTAINTY OF ROAD TRAFFIC SAFETY MEASUREMENTS

    Microsoft Academic Search

    Edi Kulderknup; Jürgen Riim; Tuuli Levandi

    Road traffic plays an important part in everyday life. Traffic accidents are related to the technical condition of vehicles and to traffic surveillance by the authorities and police. The environment is damaged mainly through vehicle exhaust gases, noise and vibration. This study deals with uncertainties in road traffic safety measurements. Damage can be minimised if the uncertainties of the measurements are estimated

  4. Uncertainty quantification in ground vehicle dynamics through

    E-print Network

    Negrut, Dan

    Uncertainty quantification in ground vehicle dynamics through high-fidelity co-simulation. ADAMS/Car: a commercially available simulation tool for vehicle dynamics.

  5. Communicating Spatial Uncertainty using Geospatial Reasoning

    Microsoft Academic Search

    Blair C. Darragh; Chris J. Bellman

    Users of remotely sensed imagery need to know about the spatial uncertainty in their image data in order to make sound decisions using their imagery. As a minimum, the user should understand the presence and level of spatial uncertainty in their data. This paper proposes the use of intelligent software agents as a means of assisting users to understand the

  6. Identifying uncertainties in Arctic climate change projections

    NASA Astrophysics Data System (ADS)

    Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

    2013-06-01

    Wide ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

  7. Uncertainties in Adversarial Patrol (Extended Abstract)

    E-print Network

    Kraus, Sarit

    In this work, we study the problem of multi-robot perimeter patrol in adversarial environments, under uncertainty. In this problem, the robots patrol around a closed area, where their goal is to detect penetrations into the area. Uncertainties may arise in different aspects of this domain, and herein our focus is twofold

  8. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  9. Environmental Risks, Uncertainty and Intergenerational Ethics

    Microsoft Academic Search

    Kristian Skagen Ekeli

    2004-01-01

    The way our decisions and actions can affect future generations is surrounded by uncertainty. This is evident in current discussions of environmental risks related to global climate change, biotechnology and the use and storage of nuclear energy. The aim of this paper is to consider more closely how uncertainty affects our moral responsibility to future generations, and to what extent

  10. Estimating Uncertainty in Brain Region Delineations

    E-print Network

    Carmichael, Owen

    A method for estimating uncertainty in MRI-based brain region delineations provided by fully-automated segmentation; the approach can automatically detect when the delineating surface of the entire brain is unclear due to poor image quality

  11. Attribute uncertainty modelling in lunar spatial data

    Microsoft Academic Search

    P. Weiss; Wen-Zhong Shi; Kai-Leung Yung

    2010-01-01

    A novel methodology to evaluate uncertainties in lunar elemental abundances is presented. Contrary to most terrestrial applications, lunar geographic information system (GIS) data cannot be verified by in situ measurements because of the limited number of ground control points and their reduced spatial extent. This work reports on investigations to evaluate the uncertainty in lunar abundance measurements without the use of

  12. Uncertainty and Engagement with Learning Games

    ERIC Educational Resources Information Center

    Howard-Jones, Paul A.; Demetriou, Skevi

    2009-01-01

    Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.…

  13. COMMON KNOWLEDGE, COHERENT UNCERTAINTIES AND CONSENSUS

    E-print Network

    Rimon, Elon

    COMMON KNOWLEDGE, COHERENT UNCERTAINTIES AND CONSENSUS, by Yakov Ben-Haim. Technical Report ETR-2001, Faculty of Mechanical Engineering. Working paper on ... knowledge-functions, common knowledge and consensus. Our main results are that knowledge is constricted

  14. Uncertainty specification for data acquisition (DAQ) devices

    Microsoft Academic Search

    David W. Braudaway

    2006-01-01

    Specification of uncertainty has historically been done by a variety of methods with differences in results, especially for instruments and standards that achieve small uncertainty values. A relatively simple but practical approach is used in the newly published standard IEC 62008, 1st Edition, ...

  15. The Economic Implications of Carbon Cycle Uncertainty

    SciTech Connect

    Smith, Steven J.; Edmonds, James A.

    2006-10-17

    This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon-dioxide concentrations. We find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, equivalent to a change in concentration target of up to 100 ppmv.

  16. Reliable water supply system design under uncertainty

    Microsoft Academic Search

    G. Chung; K. Lansey; Güzin Bayraksan

    2009-01-01

    Given the natural variability and uncertainties in long-term predictions, reliability is a critical design factor for water supply systems. However, the large scale of the problem and the correlated nature of the involved uncertainties result in models that are often intractable. In this paper, we consider a municipal water supply system over a 15-year planning period with initial infrastructure and

  17. Investment, regulation, and uncertainty: managing new plant breeding techniques.

    PubMed

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745

  18. Generalized Quantization Principle in Canonical Quantum Gravity and Application to Quantum Cosmology

    NASA Astrophysics Data System (ADS)

    Kober, Martin

    2012-08-01

    In this paper, a generalized quantization principle for the gravitational field in canonical quantum gravity, especially with respect to quantum geometrodynamics, is considered. This assumption can be interpreted as a transfer from the generalized uncertainty principle in quantum mechanics, which is postulated as a generalization of the Heisenberg algebra to introduce a minimal length, to a corresponding quantization principle concerning the quantities of quantum gravity. According to this presupposition, generalized representations of the operators referring to the observables in the canonical approach to a quantum description of general relativity have to be given. This also leads to generalized constraints for the states and thus to a generalized Wheeler-DeWitt equation determining a new dynamical behavior. As a special manifestation of this modified canonical theory of quantum gravity, quantum cosmology is explored. The generalized cosmological Wheeler-DeWitt equation corresponding to the application of the generalized quantization principle to the cosmological degree of freedom is solved by using Sommerfeld's polynomial method.
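
    For reference, the minimal-length deformation of the Heisenberg algebra alluded to here is usually written in the Kempf-Mangano-Mann form (identifying the paper's convention with this standard one is an assumption):

      [\hat{x}, \hat{p}] = i\hbar \left(1 + \beta \hat{p}^2\right)
      \;\;\Longrightarrow\;\;
      \Delta x \, \Delta p \ge \frac{\hbar}{2}\left(1 + \beta (\Delta p)^2\right)
      \;\;\Longrightarrow\;\;
      \Delta x_{\min} = \hbar \sqrt{\beta},

    so no state can be localized below \hbar\sqrt{\beta}; transferring this deformation to the canonical variables of geometrodynamics is what yields the generalized Wheeler-DeWitt equation described above.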

  19. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.

  20. Clocking in the face of unpredictability beyond quantum uncertainty

    E-print Network

    F. Hadi Madjid; John M. Myers

    2015-04-16

    In earlier papers we showed unpredictability beyond quantum uncertainty in atomic clocks, ensuing from a proven gap between given evidence and explanations of that evidence. Here we reconceive a clock, not as an isolated entity, but as enmeshed in a self-adjusting communications network adapted to one or another particular investigation, in contact with an unpredictable environment. From the practical uses of clocks, we abstract a clock enlivened with the computational capacity of a Turing machine, modified to transmit and to receive numerical communications. Such "live clocks" phase the steps of their computations to mesh with the arrival of transmitted numbers. We lift this phasing, known in digital communications, to a principle of "logical synchronization", distinct from the synchronization defined by Einstein in special relativity. Logical synchronization elevates digital communication to a topic in physics, including applications to biology. One explores how feedback loops in clocking affect numerical signaling among entities functioning in the face of unpredictable influences, making the influences themselves into subjects of investigation. The formulation of communications networks in terms of live clocks extends information theory by expressing the need to actively maintain communications channels, and potentially, to create or drop them. We show how networks of live clocks are presupposed by the concept of coordinates in a spacetime. A network serves as an organizing principle, even when the concept of the rigid body that anchors a special-relativistic coordinate system is inapplicable, as is the case, for example, in a generic curved spacetime.

  1. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    SciTech Connect

    Porter, D.W.

    1996-04-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first-principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps and for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real-time computer monitoring of remediation. Working with the DOE Office of Technology Development (OTD), the author has developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenny, PA and commercial industrial sites.
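
    The Bayesian updating at the heart of Data Fusion can be caricatured in one dimension with conjugate Gaussians (the real system updates a Markov random field over space; the numbers below are invented):

      def gaussian_update(mu0, var0, x, var_x):
          """Update a Gaussian prior N(mu0, var0) with a measurement x ~ N(truth, var_x)."""
          w = var0 / (var0 + var_x)
          return mu0 + w * (x - mu0), var0 * var_x / (var0 + var_x)

      # log10 hydraulic conductivity: geological prior, then a pumping test,
      # then a resistivity-derived estimate, each shrinking the uncertainty.
      mu, var = -4.0, 1.0
      for x, vx in [(-3.5, 0.25), (-3.8, 0.5)]:
          mu, var = gaussian_update(mu, var, x, vx)
          print(f"posterior: mean {mu:.2f}, sd {var ** 0.5:.2f}")

    The shrinking posterior variance is exactly the 'quantified statistical uncertainty' used above to judge data worth and safety margins.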

  2. Constructing the Uncertainty of Due Dates

    PubMed Central

    Vos, Sarah C.; Anthony, Kathryn E.; O'Hair, H. Dan

    2015-01-01

    By its nature, the date that a baby is predicted to be born, or the due date, is uncertain. How women construct the uncertainty of their due dates may have implications for when and how women give birth. In the United States, as many as 15% of births occur before 39 weeks because of elective inductions or cesarean sections, putting these babies at risk for increased medical problems after birth and later in life. This qualitative study employs a grounded theory approach to understand the decisions women make of how and when to give birth. Thirty-three women who were pregnant or had given birth within the past two years participated in key informant or small group interviews. The results suggest that women interpret the uncertainty of their due dates as a reason to wait on birth and as a reason to start the process early; however, information about a baby's brain development in the final weeks of pregnancy may persuade women to remain pregnant longer. The uncertainties of due dates are analyzed using Babrow's problematic integration, which distinguishes between epistemological and ontological uncertainty. The results point to a third type of uncertainty, axiological uncertainty. Axiological uncertainty is rooted in the values and ethics of outcomes. PMID:24266788

  3. Numerical uncertainty in computational engineering and physics

    SciTech Connect

    Hemez, Francois M [Los Alamos National Laboratory

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication aims to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state-of-the-practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
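
    A tiny illustration of the point that discrete solutions carry numerical uncertainty: a forward-Euler solution of y' = -y converges to exp(-t) only at first order, so halving the step roughly halves the error (the example is mine, not from the paper):

      import math

      def euler(h, T=1.0):
          """Forward-Euler solution of y' = -y, y(0) = 1, evaluated at time T."""
          y = 1.0
          for _ in range(round(T / h)):
              y += h * (-y)
          return y

      exact = math.exp(-1.0)
      for h in (0.1, 0.05, 0.025):
          print(f"h = {h:5.3f}: error = {abs(euler(h) - exact):.5f}")   # roughly halves

    Comparing solutions on successively refined meshes in this way is the basis of the solution-uncertainty bounds proposed when no exact solution is available.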

  4. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
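
    One common route to the 95/95 criterion mentioned above is Wilks' nonparametric order-statistics formula (the report surveys statistical approaches generally; singling out Wilks here is my choice): the smallest N with 1 - 0.95^N >= 0.95 is 59, so 59 code runs suffice for a first-order, one-sided 95/95 bound.

      import math

      def wilks_n(coverage=0.95, confidence=0.95):
          """Smallest sample size N such that the sample maximum bounds `coverage`
          of the population with probability `confidence` (first-order, one-sided)."""
          return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

      print(wilks_n())            # -> 59 runs for 95/95
      print(wilks_n(0.95, 0.99))  # -> 90 runs for 95/99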

  5. Uncertainty Representation in Stochastic Reservoir Optimization

    NASA Astrophysics Data System (ADS)

    Lamontagne, J. R.; Stedinger, J. R.; Shoemaker, C. A.; Tan, S. N.

    2014-12-01

    Water resources managers attempt to operate reservoir and hydropower systems to maximize system objectives, subject to a host of physical and policy constraints, and in light of uncertainty about future conditions. Optimization models are widely used to advise the decision making process. An important aspect of such models is how uncertainties related to future hydrologic and economic conditions are represented, and the extent to which different uncertainty representations affect the quality of recommended decisions. This study explores the consequences of different uncertainty representations in stochastic optimization models of hydropower systems by comparing simulated system performance using different stochastic optimization models. An important question is whether the added computational burden from greater uncertainty resolution (which can be prohibitive for operational models in many cases) actually improves model recommendations. This is particularly relevant as more complex, ensemble forecasts are incorporated into short- and mid-term planning models. Another important consideration is how watershed hydrology (both seasonal and episodic characteristics), system size, economic context, and the temporal resolution of the model influence how uncertainty should be represented. These topics are explored through several US examples including a sampling stochastic dynamic programming (SSDP) model of a small single-reservoir system on the Kennebec River in Maine, and a stochastic programming model of the large multi-reservoir Federal Columbia River system in the Pacific Northwest. These studies highlight the importance of flexible model frameworks which allow exploration of different representations of a system and of uncertainties before locking operational decision support system development into a specific representation.

  6. A modular approach to linear uncertainty analysis.

    PubMed

    Weathers, J B; Luck, R; Weathers, J W

    2010-01-01

    This paper introduces a methodology to simplify the uncertainty analysis of large-scale problems where many outputs and/or inputs are of interest. The modular uncertainty technique presented here can be utilized to analyze the results spanning a wide range of engineering problems with constant sensitivities within parameter uncertainty bounds. The proposed modular approach provides the same results as the traditional propagation of errors methodology with fewer conceptual steps allowing for a relatively straightforward implementation of a comprehensive uncertainty analysis effort. The structure of the modular technique allows easy integration into most experimental/modeling programs or data acquisition systems. The proposed methodology also provides correlation information between all outputs, thus providing information not easily obtained using the traditional uncertainty process based on analyzing one data reduction equation (DRE)/model at a time. Finally, the paper presents a straightforward methodology to obtain the covariance matrix for the input variables using uncorrelated elemental sources of systematic uncertainties along with uncorrelated sources corresponding to random uncertainties. PMID:19942216
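
    The linear propagation this builds on is the sandwich rule U_y = J U_x J^T, which delivers the full output covariance, including the between-output correlations the authors highlight. A hedged Python sketch with two invented data reduction equations:

      import numpy as np

      # Two DREs sharing inputs x = (a, b, c): y1 = a*b, y2 = a/c (illustrative).
      x = np.array([2.0, 3.0, 4.0])
      J = np.array([[x[1], x[0], 0.0],                     # gradient of y1
                    [1.0 / x[2], 0.0, -x[0] / x[2] ** 2]]) # gradient of y2
      # Input covariance: random parts on the diagonal plus a shared systematic
      # source correlating a and b (values hypothetical).
      Ux = np.array([[0.010, 0.005, 0.000],
                     [0.005, 0.040, 0.000],
                     [0.000, 0.000, 0.010]])
      Uy = J @ Ux @ J.T
      print("output variances:", np.diag(Uy))
      print("output covariance y1-y2:", Uy[0, 1])   # correlation info 'for free'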

  7. Spatial uncertainty analysis of population models

    SciTech Connect

    Jager, Yetta [ORNL; King, Anthony Wayne [ORNL; Schumaker, Nathan [U.S. Environmental Protection Agency, Corvallis, OR; Ashwood, Tom L [ORNL; Jackson, Barbara L [ORNL

    2004-01-01

    This paper describes an approach for conducting spatial uncertainty analysis of spatial population models, and illustrates the ecological consequences of spatial uncertainty for landscapes with different properties. Spatial population models typically simulate birth, death, and migration on an input map that describes habitat. Typically, only a single reference map is available, but we can imagine that a collection of other, slightly different, maps could be drawn to represent a particular species' habitat. As a first approximation, our approach assumes that spatial uncertainty (i.e., the variation among values assigned to a location by such a collection of maps) is constrained by characteristics of the reference map, regardless of how the map was produced. Our approach produces lower levels of uncertainty than alternative methods used in landscape ecology because we condition our alternative landscapes on local properties of the reference map. Simulated spatial uncertainty was higher near the borders of patches. Consequently, average uncertainty was highest for reference maps with equal proportions of suitable and unsuitable habitat, and no spatial autocorrelation. We used two population viability models to evaluate the ecological consequences of spatial uncertainty for landscapes with different properties. Spatial uncertainty produced larger variation among predictions of a spatially explicit model than those of a spatially implicit model. Spatially explicit model predictions of final female population size varied most among landscapes with enough clustered habitat to allow persistence. In contrast, predictions of population growth rate varied most among landscapes with only enough clustered habitat to support a small population, i.e., near a spatially mediated extinction threshold. We conclude that spatial uncertainty has the greatest effect on persistence when the amount and arrangement of suitable habitat are such that habitat capacity is near the minimum required for persistence.

  8. Measuring uncertainty by extracting fuzzy rules using rough sets and extracting fuzzy rules under uncertainty and measuring definability using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.; Culas, Donald E.

    1991-01-01

    Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide the possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much we believe each rule is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
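
    The certain/possible rule split comes from the rough-set lower and upper approximations of a decision class; a minimal Python sketch on an invented eight-object universe:

      # Equivalence classes induced by the condition attributes (hypothetical).
      classes = [{0, 1}, {2, 3, 4}, {5}, {6, 7}]
      target = {0, 1, 2, 5}          # objects carrying the decision of interest

      lower = set().union(*[c for c in classes if c <= target])   # -> certain rules
      upper = set().union(*[c for c in classes if c & target])    # -> possible rules
      print("lower:", sorted(lower))                   # [0, 1, 5]
      print("upper:", sorted(upper))                   # [0, 1, 2, 3, 4, 5]
      print("definability:", len(lower) / len(upper))  # 0.5, cf. the definability measure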

  9. Bayes and the Simplicity Principle in Perception

    ERIC Educational Resources Information Center

    Feldman, Jacob

    2009-01-01

    Discussions of the foundations of perceptual inference have often centered on 2 governing principles, the likelihood principle and the simplicity principle. Historically, these principles have usually been seen as opposed, but contemporary statistical (e.g., Bayesian) theory tends to see them as consistent, because for a variety of reasons simpler…

  10. Uncertainty: the Curate's egg in financial economics.

    PubMed

    Pixley, Jocelyn

    2014-06-01

    Economic theories of uncertainty are unpopular with financial experts. As sociologists, we rightly refuse predictions, but the uncertainties of money are constantly sifted and turned into semi-denial by a financial economics set on somehow beating the future. Picking out 'bits' of the future as 'risk' and 'parts' as 'information' is attractive but socially dangerous, I argue, because money's promises are always uncertain. New studies of uncertainty are reversing sociology's neglect of the unavoidable inability to know the forces that will shape the financial future. PMID:24712756

  11. PDF Uncertainties in WH Production at Tevatron

    SciTech Connect

    P. M. Nadolsky and Z. Sullivan

    2001-11-02

    We apply a method proposed by members of the CTEQ Collaboration to estimate the uncertainty in associated W-Higgs boson production at Run II of the Tevatron due to our imprecise knowledge of parton distribution functions. We find that the PDF uncertainties for the signal and background rates are of the order of 3%. The PDF uncertainties for the important statistical quantities (significance of the Higgs boson discovery, accuracy of the measurement of the WH cross section) are smaller (1.5%) due to the strong correlation of the signal and background.

  12. Fractional revivals through Rényi uncertainty relations

    E-print Network

    Elvira Romera; Francisco de los Santos

    2014-09-19

    We show that the Rényi uncertainty relations give a good description of the dynamical behavior of wave packets and constitute a sound approach to revival phenomena by analyzing three model systems: the simple harmonic oscillator, the infinite square well, and the quantum bouncer. We prove the usefulness of entropic uncertainty relations as a tool for identifying fractional revivals by providing a comparison in different contexts with the usual Heisenberg uncertainty relation and with the common approach in terms of the autocorrelation function.
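
    The autocorrelation benchmark mentioned above, |A(t)|^2 = |<psi(0)|psi(t)>|^2, is easy to reproduce for the infinite square well, where E_n ~ n^2 makes the revival structure explicit (the Gaussian level populations below are illustrative):

      import numpy as np

      n = np.arange(1, 60)
      c = np.exp(-((n - 15.0) ** 2) / (2.0 * 4.0 ** 2))    # wave-packet amplitudes
      c /= np.linalg.norm(c)
      w = np.abs(c) ** 2

      def autocorr(t):
          """|<psi(0)|psi(t)>|^2 with phases exp(-i n^2 t), so T_rev = 2*pi."""
          return abs(np.sum(w * np.exp(-1j * n ** 2 * t))) ** 2

      for frac in (0.0, 0.25, 1.0 / 3.0, 0.5, 1.0):
          print(f"t/T_rev = {frac:.3f}: |A|^2 = {autocorr(2 * np.pi * frac):.3f}")

    Peaks at rational fractions of T_rev flag the fractional revivals; the paper's claim is that Rényi entropic relations locate the same structure.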

  13. Principle Paradigms: Revisiting the Dublin Core 1:1 Principle

    ERIC Educational Resources Information Center

    Urban, Richard J.

    2012-01-01

    The Dublin Core "1:1 Principle" asserts that "related but conceptually different entities, for example a painting and a digital image of the painting, are described by separate metadata records" (Woodley et al., 2005). While this seems to be a simple requirement, studies of metadata quality have found that cultural heritage…

  14. Beyond Bellman's principle of optimality; the principle of ...

    Microsoft Academic Search

    C. D. Johnson

    2005-01-01

    Bellman's principle of optimality and his dynamic programming technique for computing optimal sequential-decisions may not apply to problems involving uncertain, non-noisy exogenous-variables. In this paper, we show that if the uncertain behavior of non-noisy exogenous-variables can be modeled by a class of spline-expressions, with known basis-functions and unknown, ...

  15. RPL Dosimetry: Principles and Applications

    SciTech Connect

    Yamamoto, Takayoshi [Radioisotope Research Center, Osaka University, 2-4, Yamadaoka, Suita, Osaka 565-0871 (Japan) and Oarai Research Center, Chiyoda Technol Corporation, 3681, Narita-cho, Oarai-machi, Higashi-ibaraki-gun, Ibaraki-ken, 311-1313 (Japan)

    2011-05-05

    The principle of radio-photoluminescence (RPL) is applied to the glass dosimeter, one of the most capable solid-state dosimeters. The silver-activated phosphate glass irradiated with ionizing radiation emits luminescence when exposed to UV light. This phenomenon is called RPL. The most characteristic features of glass dosimeters are data accumulation and the absence of fading. The basic principle of RPL is described, and then how it is applied to the glass dosimeter is explained. Finally, some applications of RPL will be introduced.

  16. Doing without the Equivalence Principle

    E-print Network

    R. Aldrovandi; J. G. Pereira; K. H. Vu

    2004-10-08

    In Einstein's general relativity, geometry replaces the concept of force in the description of the gravitational interaction. Such an approach rests on the universality of free fall--the weak equivalence principle--and would break down without it. On the other hand, the teleparallel version of general relativity, a gauge theory for the translation group, describes the gravitational interaction by a force similar to the Lorentz force of electromagnetism, a non-universal interaction. It is shown that, similarly to Maxwell's description of electromagnetism, the teleparallel gauge approach provides a consistent theory for gravitation even in the absence of the weak equivalence principle.

  17. STOCHASTIC APPROACHES TO UNCERTAINTY QUANTIFICATION IN CFD SIMULATIONS

    E-print Network

    Mathelin, Lionel

    From time immemorial, uncertainty in computational fluid dynamics (CFD) has meant primarily ... types of uncertainties associated with CFD simulations, and one ought to account for all aspects

  18. Error Detection and Recovery for Robot Motion Planning with Uncertainty

    E-print Network

    Donald, Bruce Randall

    1987-07-01

    Robots must plan and execute tasks in the presence of uncertainty. Uncertainty arises from sensing errors, control errors, and uncertainty in the geometry of the environment. The last, which is called model error, has ...

  19. Heisenberg Uncertainty and the Allowable Masses of the Up Quark and Down Quark

    NASA Astrophysics Data System (ADS)

    Orr, Brian

    2004-05-01

    A possible explanation for the inability to attain deterministic measurements of an elementary particle's energy, as given by the Heisenberg Uncertainty Principle, manifests itself in an interesting anthropic consequence of Andrei Linde's Self-reproducing Inflationary Multiverse model. In Linde's model, the physical laws and constants that govern our universe adopt other values in other universes, due to variable Higgs fields. While the physics in our universe allow for the advent of life and consciousness, the physics necessary for life are not likely to exist in other universes -- Linde demonstrates this through a kind of Darwinism for universes. Our universe, then, is unique. But what are the physical laws and constants that make our universe what it is? Craig Hogan identifies five physical constants that are not bound by symmetry. Fine-tuning these constants gives rise to the basic behavior and structures of the universe. Three of the non-symmetric constants are fermion masses: the up quark mass, the down quark mass, and the electron mass. I will explore Linde's and Hogan's works by comparing the amount of uncertainty in quark masses, as calculated from the Heisenberg Uncertainty Principle, to the range of quark mass values consistent with our observed universe. Should the fine-tuning of the up quark and down quark masses be greater than the range of Heisenberg uncertainties in their respective masses (as I predict, due to quantum tunneling), then perhaps there is a correlation between the measured Heisenberg uncertainty in quark masses and the fine-tuning of masses required for our universe to be as it is. Hogan, "Why the Universe is Just So," Reviews of Modern Physics 72(4), 1149-1161 (Oct. 2000); Linde, "The Self-Reproducing Inflationary Universe," Scientific American 271(5), 48-55 (Nov. 1994).

  20. Realizing primary reference values in the nanoflow regime, a proof of principle

    NASA Astrophysics Data System (ADS)

    van der Beek, Mijndert P.; Lucas, Peter

    2010-07-01

    In this paper a proof of principle is demonstrated for a primary standard/nanoflow generator developed at the VSL (Van Swinden Laboratorium, The Netherlands). The ultimate goal of the research carried out at the VSL is to develop primary standards for liquid volume flows down to a few nanoliters per minute on the basis of sound traceability and low uncertainty. The nanoflow generator discussed in this paper is a first step in that direction and can generate flows in suction and discharge mode. The uncertainty of the generated flow is approximately 0.1% (2σ) relative at best. The flow generating process is based on the control of thermodynamic expansion of a driver liquid mass, whereas traceability is based upon volume transfer and conservation of mass (dynamic displacement principle).
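
    The generating principle can be made concrete with the expansion law Q = V * beta * dT/dt (my reading of 'control of thermodynamic expansion of a driver liquid mass'; all numbers below are illustrative):

      # Thermal-expansion flow generation, back-of-envelope sketch.
      V = 1.0e-6          # m^3, driver liquid volume (1 mL)
      beta = 2.1e-4       # 1/K, volumetric expansion coefficient (~water at 20 C)
      dTdt = 1.0 / 3600   # K/s, a 1 K per hour temperature ramp
      Q = V * beta * dTdt                    # generated volume flow, m^3/s
      print(f"{Q * 1e12 * 60:.1f} nL/min")   # -> about 3.5 nL/min

    which shows why a modest, well-controlled temperature ramp on a millilitre of liquid reaches the nanoliter-per-minute regime.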

  1. QUANTIFICATION OF UNCERTAINTY IN COMPUTATIONAL FLUID DYNAMICS

    Microsoft Academic Search

    P. J. Roache

    1997-01-01

    This review covers Verification, Validation, Confirmation and related subjects for computational fluid dynamics (CFD), including error taxonomies, error estimation and banding, convergence rates, surrogate estimators, nonlinear dynamics, and error estimation for grid adaptation vs Quantification of Uncertainty.

  2. Performance and robustness analysis for structured uncertainty

    Microsoft Academic Search

    John C. Doyle; Joseph E. Wall; Gunter Stein

    1982-01-01

    This paper introduces a nonconservative measure of performance for linear feedback systems in the face of structured uncertainty. This measure is based on a new matrix function, which we call the Structured Singular Value.

  3. Outage Probability Under Channel Distribution Uncertainty

    E-print Network

    Loyka, Sergey

    Outage probability defined as min (over the input distribution)-max (over the channel distribution class) ..., extended to the case of partial channel distribution information. Compound outage probability characterization via one

  4. CLIMATE AND CLIMATE CHANGE CERTAINTIES AND UNCERTAINTIES

    E-print Network

    Schwartz, Stephen E.

    Certainties and uncertainties of climate and climate change: concentrations of "greenhouse gases"; radiative forcing of climate change; climate system response (observations of temperature change on various time scales); climate system sensitivity (models and observations).

  5. Uncertainty in Mixtures and Cumulative Risk Assessment

    EPA Science Inventory

    Uncertainty in Mixtures and Cumulative Risk Assessment JC Lipscomb and GE Rice U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, Cincinnati, Ohio, USA Humans and environmental species are rarely exposed to sing...

  6. Applying Calibration to Improve Uncertainty Assessment 

    E-print Network

    Fondren, Mark Edward

    2013-08-02

    is nearly universal. The cost associated with underestimating uncertainty, or overconfidence, can be substantial. Studies have shown that moderate overconfidence and optimism can result in expected portfolio disappointment of more than 30%. It has been shown...

  7. Multifidelity approaches for optimization under uncertainty

    E-print Network

    Ng, Leo W. T.

    It is important to design robust and reliable systems by accounting for uncertainty and variability in the design process. However, performing optimization in this setting can be computationally expensive, requiring many ...

  8. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    SciTech Connect

    Rearden, Bradley T [ORNL] [ORNL; Mueller, Don [ORNL] [ORNL

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for a gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
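
    The covariance propagation described above is the usual 'sandwich rule', var(k) = S^T C S, with S the sensitivity profile and C the relative covariance of the nuclear data. A toy three-group Python sketch with invented numbers:

      import numpy as np

      S = np.array([0.30, -0.10, 0.05])        # sensitivities dk/k per dx/x (hypothetical)
      C = np.array([[0.0004, 0.0001, 0.0000],  # relative covariance of cross sections
                    [0.0001, 0.0009, 0.0002],
                    [0.0000, 0.0002, 0.0016]])
      var_k = S @ C @ S
      print(f"k_eff uncertainty: {np.sqrt(var_k) * 100:.2f} %")   # -> 0.64 %

    The same sensitivity vectors, correlated between a benchmark and an application, are what TSUNAMI turns into the correlation coefficients used for trending and data adjustment.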

  9. Improving the uncertainty of photomask linewidth measurements

    NASA Astrophysics Data System (ADS)

    Pedulla, J. M.; Potzick, James; Silver, Richard M.

    2004-05-01

    The National Institute of Standards and Technology (NIST) is currently developing a photomask linewidth standard (SRM 2059) with a lower expected uncertainty of calibration than the previous NIST standards (SRMs 473, 475, 476). In calibrating these standards, optical simulation modeling has been used to predict the microscope image intensity profiles, which are then compared to the experimental profiles to determine the certified linewidths. Consequently, the total uncertainty in the linewidth calibration is a result of uncertainty components from the optical simulation modeling and uncertainty due to experimental errors or approximations (e.g., tool imaging errors and material characterization errors). Errors of approximation in the simulation model and uncertainty in the parameters used in the model can contribute a large component to the total linewidth uncertainty. We have studied the effects of model parameter variation on measurement uncertainty using several different optical simulation programs that utilize different mathematical techniques. We have also evaluated the effects of chrome edge runout and varying indices of refraction on the linewidth images. There are several experimental parameters that are not ordinarily included in the modeling simulation. For example, the modeling programs assume a uniform illuminating field (e.g., Koehler illumination), ideal optics and perfect optical alignment. In practice, determining whether Koehler illumination has been achieved is difficult, and the optical components and their alignments are never ideal. We will present some techniques for evaluating Koehler illumination and methods to compensate for scattered (flare) light. Any such experimental elements, that are assumed accurate in the modeling, may actually present significant components to the uncertainty and need to be quantitatively estimated. The present state of metrology does not permit the absolute calibration of linewidth standards to the level of uncertainty called for in the semiconductor roadmap. However, there are applications for a linewidth standard and calibration strategies which do not require a NIST certified calibration (e.g., determining measurement precision). In this paper we present various critical elements of a systematic and thorough evaluation of the key components of linewidth uncertainty as well as our methods for evaluating and reducing modeling and experimental uncertainties.

  10. Geometric uncertainty relation for quantum ensembles

    NASA Astrophysics Data System (ADS)

    Heydari, Hoshang; Andersson, Ole

    2015-02-01

    Geometrical structures of quantum mechanics provide us with new insightful results about the nature of quantum theory. In this work we consider mixed quantum states represented by finite-rank density operators. We review our geometrical framework that provides the space of density operators with Riemannian and symplectic structures, and we derive a geometric uncertainty relation for observables acting on mixed quantum states. We also give an example that visualizes the geometric uncertainty relation for spin-1/2 particles.

  11. Interpolation Method Needed for Numerical Uncertainty

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Schallhorn, Paul A.

    2014-01-01

    Using Computational Fluid Dynamics (CFD) to predict a flow field is an approximation to the exact problem, and uncertainties exist. There is a method to approximate the errors in CFD via Richardson's Extrapolation, which is based on progressive grid refinement. To estimate the errors, the analyst must interpolate between at least three grids. This paper describes a study to find an appropriate interpolation scheme that can be used in Richardson's extrapolation or another uncertainty method to approximate errors.
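
    For completeness, Richardson extrapolation from three systematically refined grids recovers both the observed order of accuracy and an error estimate; a short sketch with made-up grid values:

      import math

      def richardson(f_coarse, f_medium, f_fine, r=2.0):
          """Observed order p and extrapolated value from three solutions obtained
          with a constant grid refinement ratio r."""
          p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
          f_star = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
          return p, f_star

      p, f_star = richardson(0.9700, 0.9850, 0.9925)
      print(f"observed order p = {p:.2f}, extrapolated value = {f_star:.4f}")

    The interpolation question studied here arises because, in practice, the three solutions must first be brought onto common points before these formulas can be applied.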

  12. Measurement uncertainty of silicone oil leakage testing

    SciTech Connect

    Holland, W.E.

    1991-11-01

    An evaluation has been performed to determine the uncertainty of silicone tracer fluid leakage measurements for an environmental sensing device. The units are tested with an instrument which can measure silicone tracer fluid vapor by the gas chromatography method. An analysis has shown that the measurement uncertainty can be maintained at ±20% when the volumes used in the procedure are held to a given set of tolerances. 1 fig.

  13. UNCERTAINTY QUANTIFICATION FOR MODULAR AND HIERARCHICAL MODELS

    Microsoft Academic Search

    L. J. LUCAS; M. ORTIZ; H. OWHADI; U. TOPCUy

    We propose a modular/hierarchical uncertainty quantification framework based on a recently developed methodology using concentration-of-measure inequalities for probability-of-failure upper bound calculations. In this framework, the relations between the variables of the underlying input-output model are represented by directed, acyclic graphs and the bounded uncertainty in the input variables is propagated to the output variable (performance measure) in an inductive manner

  14. Visualisation of Information Uncertainty: Progress and Challenges

    Microsoft Academic Search

    Binh Pham; Alex Streit; Ross Brown

    2009-01-01

    Information uncertainty, which is inherent in many real world applications, brings more complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2) ...

  15. Chemical Principles Revisited: Chemical Equilibrium.

    ERIC Educational Resources Information Center

    Mickey, Charles D.

    1980-01-01

    Describes: (1) Law of Mass Action; (2) equilibrium constant and ideal behavior; (3) general form of the equilibrium constant; (4) forward and reverse reactions; (5) factors influencing equilibrium; (6) Le Chatelier's principle; (7) effects of temperature, changing concentration, and pressure on equilibrium; and (8) catalysts and equilibrium. (JN)

  16. The scanline principle

    E-print Network

    Gordon, Dan

    The scanline principle: efficient conversion of display algorithms into scanline mode. The scanline principle is a general technique for efficiently converting any display algorithm that is based on polygon scan con...

  17. Demonstrating Fermat's Principle in Optics

    ERIC Educational Resources Information Center

    Paleiov, Orr; Pupko, Ofir; Lipson, S. G.

    2011-01-01

    We demonstrate Fermat's principle in optics by a simple experiment using reflection from an arbitrarily shaped one-dimensional reflector. We investigated a range of possible light paths from a lamp to a fixed slit by reflection in a curved reflector and showed by direct measurement that the paths along which light is concentrated have either…

  18. Determining Physical Principles of Subcellular Organization

    E-print Network

    Needleman, Daniel

    Forum article in Developmental Cell. ... transformed our understanding of cell biology, but we are still unable to predict the behaviors

  19. The Minimum Description Length Principle

    Microsoft Academic Search

    Peter D. Grünwald

    2007-01-01

    The minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well-suited for dealing with model selection, prediction, and estimation

  20. Principles of Nano-Optics

    Microsoft Academic Search

    Lukas Novotny; Bert Hecht

    2006-01-01

    Nano-optics is the study of optical phenomena and techniques on the nanometer scale, that is, near or beyond the diffraction limit of light. It is an emerging field of study, motivated by the rapid advance of nanoscience and nanotechnology which require adequate tools and strategies for fabrication, manipulation and characterization at this scale. In Principles of Nano-Optics the authors provide

  1. Interpersonal psychotherapy: principles and applications

    Microsoft Academic Search

    JOHN C. MARKOWITZ; MYRNA M. WEISSMAN

    2004-01-01

    This article briefly describes the fundamental principles and some of the clinical applications of interpersonal psychotherapy (IPT), a time-limited, empirically validated treatment for mood disorders. IPT has been tested with general success in a series of clinical trials for mood and, increasingly, non-mood disorders; as both an acute and maintenance treatment; and in differing treatment formats. As a result of

  2. Principles of programming languages: design

    SciTech Connect

    MacLennan, B.J.

    1983-01-01

    An excellent pedagogically oriented text on programming languages in which principles are emphasized more than details, methods more than results, and semantics more than syntax. Many chapter exercises, uniquely constructed and worded. Contents: Emphasis on efficiency: FORTRAN. Elegance and generality: ALGOL-60. Syntactic issues: ALGOL-60. Modularity and data abstraction: ADA. List processing: LISP. Object-oriented programming: SMALLTALK. Logic programming: PROLOG. Index.

  3. Principles and Guidelines for Transfer

    ERIC Educational Resources Information Center

    British Columbia Council on Admissions and Transfer, 2003

    2003-01-01

    Transfer relationships in British Columbia (BC) are governed by statements which were adopted by the Council in 1993 after consultation with the institutions of the BC Transfer System. Principles and guidelines in this document are based on those formulated by the British Columbia Post-Secondary Coordinating Committee and approved by university…

  4. Classroom Demonstrations of Polymer Principles.

    ERIC Educational Resources Information Center

    Rodriguez, F.; And Others

    1987-01-01

    Describes several techniques to help students visualize the principles of polymer chemistry. Outlines demonstrations using simple materials (such as pop-it beads, thread, and wire) to illustrate the size of macromolecules, the composition of copolymers, and the concept of molecular mass. (TW)

  5. Principles of Critical Discourse Analysis

    Microsoft Academic Search

    Teun A. van Dijk

    1993-01-01

    This paper discusses some principles of critical discourse analysis, such as the explicit sociopolitical stance of discourse analysts, and a focus on dominance relations by elite groups and institutions as they are being enacted, legitimated or otherwise reproduced by text and talk. One of the crucial elements of this analysis of the relations between power and discourse is the patterns

  6. Bioengineering 280A Principles of

    E-print Network

    Liu, Thomas T.

    Course slides covering the principles underlying the major imaging modalities, including X-ray, computed tomography, MRI, and ultrasound. History of medical imaging: 1895, Roentgen discovers X-rays; 1942, Dussik demonstrates transmission ultrasound. Slide fragments mention Doppler ultrasound, the Acuson Sequoia ultrasound system, and an image showing twin gestation sacs (s) and bladder (B).

  7. Bioengineering 280A Principles of

    E-print Network

    Liu, Thomas T.

    Course slides covering the principles underlying the major imaging modalities, including X-ray, computed tomography, MRI, and ultrasound. History of medical imaging: 1895, Roentgen discovers X-rays; 1942, Dussik demonstrates transmission ultrasound. Slide fragments mention Doppler ultrasound, the Acuson Sequoia and Sonosite 180 ultrasound systems, and an image showing twin gestation sacs (s) and bladder (B).

  8. Bioengineering 280A Principles of

    E-print Network

    Liu, Thomas T.

    Course slides covering the principles underlying the major imaging modalities, including X-ray, computed tomography, MRI, and ultrasound. History of medical imaging: 1895, Roentgen discovers X-rays; 1942, Dussik demonstrates transmission ultrasound; history of MRI: 1946, Felix Bloch (Stanford). Slide fragments mention Doppler ultrasound, 3D ultrasound, and the Sonosite 180 ultrasound system.

  9. Bioengineering 280A Principles of

    E-print Network

    Liu, Thomas T.

    Course slides covering the principles underlying the major imaging modalities, including X-ray, computed tomography, MRI, and ultrasound. History of medical imaging: 1895, Roentgen discovers X-rays; 1942, Dussik demonstrates transmission ultrasound. Slide fragments mention Doppler ultrasound, the Acuson Sequoia ultrasound system, and an ultrasound image showing twin gestation sacs (s) and bladder (B).

  10. The Zero Product Principle Error.

    ERIC Educational Resources Information Center

    Padula, Janice

    1996-01-01

    Argues that the challenge for teachers of algebra in Australia is to find ways of making the structural aspects of algebra accessible to a greater percentage of students. Uses the zero product principle to provide an example of a common student error grounded in the difficulty of understanding the structure of algebra. (DDR)
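
    A worked instance of the error the abstract refers to (our example, not the paper's): the zero product principle licenses

        \[
          (x-2)(x-3) = 0 \;\Rightarrow\; x = 2 \ \text{or}\ x = 3,
        \]

    but students often transfer the pattern to a nonzero product,

        \[
          (x-2)(x-3) = 6 \;\not\Rightarrow\; x-2 = 6 \ \text{or}\ x-3 = 6,
        \]

    which fails because only a zero product forces a zero factor; expanding instead gives \(x^2 - 5x + 6 = 6\), i.e. \(x(x-5) = 0\), so \(x = 0\) or \(x = 5\).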

  11. PRINCIPLES OF ARCHIVES & RECORDS MANAGEMENT

    E-print Network

    New Mexico, University of

    Guidance on the principles of archives and records management, covering the life cycle of university records (records are destroyed, reformatted, transferred to inactive storage, or transferred to the University Archives) and filing-system guidelines from the University Archives' Collections.

  12. [Technical principles of laparoscopic cholecystectomy].

    PubMed

    Kurdo, S A; Gaidukov, V N

    1995-01-01

    The technical principles of laparoscopic cholecystectomy are described from experience with 87 operations for acute and chronic cholecystitis. The authors discuss the stages of the operation and the peculiarities of the technical procedures at each stage, and give recommendations on the use of the instruments and indications for abdominal drainage. PMID:7474695

  13. Principles for Teaching Problem Solving

    NSDL National Science Digital Library

    Rob Foshay and Jamie Kirkley

    2003-01-01

    This 14-page monograph addresses the need to teach problem solving and other higher order thinking skills. After summarizing research and positions of various organizations, it defines several models and describes cognitive and attitudinal components of problem solving and the types of knowledge that are required. The authors provide a list of principles for teaching problem solving and include a list of references.

  14. On the Dirichlet's Box Principle

    ERIC Educational Resources Information Center

    Poon, Kin-Keung; Shiu, Wai-Chee

    2008-01-01

    In this note, we will focus on several applications of Dirichlet's box principle in Discrete Mathematics lessons and number theory lessons. In addition, the main result is an innovative game on a triangular board developed by the authors. The game has been used in teaching and learning mathematics in Discrete Mathematics and some high schools in…
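
    For readers unfamiliar with the principle, a standard statement and example (ours, not the note's):

        \[
          n \text{ objects placed in } m \text{ boxes with } n > m
          \;\Longrightarrow\;
          \text{some box holds at least } \lceil n/m \rceil \text{ objects.}
        \]

    For example, among any 13 people at least \(\lceil 13/12 \rceil = 2\) were born in the same month.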

  15. Sampling uncertainty in satellite rainfall estimates

    NASA Astrophysics Data System (ADS)

    Itkin, M.; Loew, A.

    2012-04-01

    Accurate estimates of global precipitation patterns are essential for a better understanding of the hydrological cycle. Satellite observations allow for large-scale estimates of rainfall intensities. Uncertainties in current satellite-based rainfall estimates are due to uncertainties in the retrieval process as well as the different temporal and spatial sampling patterns of the observation systems. The focus of this study is on analyzing sampling-associated uncertainty for thirteen low Earth orbiting satellites carrying microwave instruments suitable for rainfall measurement. Satellites were grouped by the types of microwave sensors, with NOAA satellites with cross-track sounders and DMSP satellites with conical scanners forming the core of the constellations. The effect of three-hourly geostationary measurements on the sampling uncertainty was evaluated as well. A precise orbital model, SGP4, was used to generate a realistic satellite overpass database in which orbital shifts are taken into account. Using the overpass database, we resampled rain gauge time series to simulate satellite rainfall estimates free of retrieval and calibration errors. We look at two regions, Germany and Benin, areas with different precipitation regimes. Our analysis shows that sampling uncertainty for all available satellites may differ by up to 100% across latitudes and precipitation regimes. However, the performance of the various satellite groups is similar, with greater differences at higher latitudes. Adding three-hourly geostationary observations reduces the sampling uncertainty, but only to a limited extent.
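
    A minimal sketch of the resampling idea described above: a continuous gauge record is subsampled at satellite overpass times, and the spread of the resulting means estimates the sampling uncertainty. The synthetic data, the overpass rate, and all names are illustrative assumptions, not the authors' SGP4-based pipeline.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic hourly "gauge" record for one year (illustrative only).
        hours = 24 * 365
        rain = rng.gamma(shape=0.1, scale=2.0, size=hours)  # mm/h: mostly dry, occasional rain

        true_mean = rain.mean()

        # Hypothetical constellation seeing the site ~4 times per day at random hours;
        # repeat many overpass realizations to estimate the sampling uncertainty.
        errors = []
        for _ in range(1000):
            overpass_idx = rng.choice(hours, size=4 * 365, replace=False)
            errors.append(rain[overpass_idx].mean() - true_mean)

        rel_uncertainty = np.std(errors) / true_mean
        print(f"relative sampling uncertainty ~ {100 * rel_uncertainty:.1f}%")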

  16. Estimating Uncertainty in Brain Region Delineations

    PubMed Central

    Beutner, Karl R.; Prasad, Gautam; Fletcher, Evan; DeCarli, Charles; Carmichael, Owen T.

    2010-01-01

    This paper presents a method for estimating uncertainty in MRI-based brain region delineations provided by fully-automated segmentation methods. In large data sets, the uncertainty estimates could be used to detect fully-automated method failures, identify low-quality imaging data, or endow downstream statistical analyses with per-subject uncertainty in derived morphometric measures. Region segmentation is formulated in a statistical inference framework; the probability that a given region-delineating surface accounts for observed image data is quantified by a distribution that takes into account a prior model of plausible region shape and a model of how the region appears in images. Region segmentation consists of finding the maximum a posteriori (MAP) parameters of the delineating surface under this distribution, and segmentation uncertainty is quantified in terms of how sharply peaked the distribution is in the vicinity of the maximum. Uncertainty measures are estimated through Markov Chain Monte Carlo (MCMC) sampling of the distribution in the vicinity of the MAP estimate. Experiments on real and synthetic data show that the uncertainty measures automatically detect when the delineating surface of the entire brain is unclear due to poor image quality or artifact; the experiments cover multiple appearance models to demonstrate the generality of the method. The approach is also general enough to accommodate a wide range of shape models and brain regions. PMID:19694287
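
    The core numerical idea, quantifying uncertainty by how sharply peaked the posterior is around the MAP estimate using MCMC samples in its vicinity, can be sketched in one dimension. The toy log-posterior, step size, and all names below are our illustrative assumptions, not the paper's shape and appearance models.

        import numpy as np

        rng = np.random.default_rng(1)

        def log_posterior(theta):
            # Toy stand-in for log p(delineating surface | image data);
            # a sharply peaked posterior means low segmentation uncertainty.
            return -0.5 * (theta - 3.0) ** 2 / 0.2 ** 2  # Gaussian, sd = 0.2

        # Start at the MAP estimate and run a random-walk Metropolis chain.
        theta, samples = 3.0, []
        for _ in range(20000):
            prop = theta + rng.normal(scale=0.1)
            if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
                theta = prop
            samples.append(theta)

        samples = np.array(samples[5000:])  # discard burn-in
        print(f"posterior sd near MAP ~ {samples.std():.3f} (true value 0.200)")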

  17. The interactive visualization of spatial uncertainty

    SciTech Connect

    Srivastava, R.M.

    1994-12-31

    Though stochastic and geostatistical methods can produce numerous equally likely models of a petroleum reservoir, most practical case studies that use such probabilistic methods do not explore the entire space of uncertainty but focus instead on a single outcome. When displayed using powerful visualization tools, this single outcome is often mistaken for reality. The traditional graphical displays that geostatisticians use to combat this false belief in the certainty of a single image do not convey uncertainty in an intuitive manner; these traditional displays therefore tend to be less compelling than the slick and polished graphical displays that modern visualization tools provide. The visualization of spatial uncertainty can be made more intuitive with dynamic displays in which each frame is a plausible outcome that differs only slightly from the previous one. The same approach can be enhanced to allow interaction with the display so that users have a more direct sense of exploring the space of uncertainty. Interactive visualization tools improve decision-making in the face of uncertainty by confronting engineers and geologists with the spatial uncertainty that is often critical in a comprehensive risk analysis of EOR projects.
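
    A minimal sketch of the dynamic-display idea: cycle through equally likely realizations so that consecutive frames differ only slightly and the motion itself conveys spatial uncertainty. The smoothed random fields and the frame-blending trick below are our simplifications, not the author's geostatistical simulation.

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(2)

        def realization():
            # A smoothed random field as a stand-in for one stochastic reservoir model.
            return gaussian_filter(rng.normal(size=(64, 64)), sigma=4)

        # Blend between successive realizations so each frame differs only slightly
        # from the previous one (a visual device, not a geostatistical simulation).
        a, b = realization(), realization()
        fig, ax = plt.subplots()
        img = ax.imshow(a, cmap="viridis")
        for frame in range(200):
            if frame % 20 == 0:
                a, b = b, realization()  # move on to the next equally likely model
            t = (frame % 20) / 20.0
            img.set_data((1 - t) * a + t * b)
            plt.pause(0.05)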

  18. A procedure for assessing uncertainty in models

    SciTech Connect

    McKay, M.D.; Beckman, R.J.

    1993-11-01

    This paper discusses uncertainty in the output calculation of a model due to uncertainty in input values. Uncertainty in input values, characterized by suitable probability distributions, propagates through the model to a probability distribution of an output. Our objective in studying uncertainty is to identify a subset of inputs as being important in the sense that fixing them greatly reduces the uncertainty, or variability, in the output. The procedures we propose are demonstrated with an application of the model called MELCOR Accident Consequence Code System (MACCS), described in Helton et al. (1992). The purpose of MACCS is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. In any particular application of MACCS there are likely to be many possible inputs and outputs of interest. In this paper, attention focuses on a single output and 36 inputs. Our objective is to determine a subset of the 36 model inputs that can be said to be dominant, or important, in the sense that they are the principal contributors to uncertainty in the output.
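
    The importance criterion described above, that an input is important if fixing it greatly reduces output variability, can be sketched generically. The toy model and all names are our assumptions; MACCS itself is far more complex.

        import numpy as np

        rng = np.random.default_rng(3)

        def model(x1, x2, x3):
            # Toy stand-in for a consequence code: x1 dominates the output.
            return 5.0 * x1 + 0.5 * x2 ** 2 + 0.1 * x3

        n = 100_000
        x1, x2, x3 = rng.normal(size=(3, n))
        base_var = model(x1, x2, x3).var()

        # "Fix" each input at its nominal value and see how much output variance drops.
        for name, fixed in [("x1", model(0.0, x2, x3)),
                            ("x2", model(x1, 0.0, x3)),
                            ("x3", model(x1, x2, 0.0))]:
            drop = 100 * (1 - fixed.var() / base_var)
            print(f"fixing {name}: output variance reduced by {drop:.1f}%")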

  19. Atmospheric feedback uncertainty dominates ocean heat uptake uncertainty for the transient climate response

    NASA Astrophysics Data System (ADS)

    MacDougall, Andrew H.; Swart, Neil C.; Knutti, Reto

    2015-04-01

    By absorbing heat and carbon, the world ocean acts to slow the transient rate of climate change and to a great extent determines the magnitude of warming given a fixed budget of carbon emissions. The projected magnitude of future ocean heat uptake (OHU) varies substantially between the climate model simulations stored in the CMIP5 archive. In this study, analytical and statistical methods, together with simulations using an intermediate-complexity climate model, are used to partition the uncertainty in future OHU in CMIP5 models into uncertainty in radiative forcing, the climate feedback parameter, ocean surface wind fields, and the structure of ocean models. We estimate that if only uncertainty in ocean model structure remained, then the uncertainty in OHU would be reduced by 61%, and if only uncertainty in the ocean surface wind field remained, then OHU uncertainty would be reduced by 87%. The regression method used to simultaneously estimate radiative forcing and the climate feedback parameter from climate model output leaves these parameters with anti-correlated uncertainty. If only uncertainty in radiative forcing and the climate feedback parameter remained, then the uncertainty in OHU would be reduced by 9%. These results suggest that most of the uncertainty in OHU seen in CMIP5 models originates in uncertainties in how the atmosphere will respond to anthropogenic increases in greenhouse gas concentrations. Therefore, efforts to improve the representation of the ocean in climate models will have only a limited effect on reducing the uncertainty in the rate of transient climate change unless concurrent improvements are made in constraining atmospheric feedbacks.

  20. Number-operator-annihilation-operator uncertainty as an alternative for the number-phase uncertainty relation

    NASA Astrophysics Data System (ADS)

    Urizar-Lanz, Iñigo; Tóth, Géza

    2010-05-01

    We consider a number-operator-annihilation-operator uncertainty as a well-behaved alternative to the number-phase uncertainty relation, and examine its properties. We find a formulation in which the bound on the product of uncertainties depends on the expectation value of the particle number. Thus, while the bound is not a constant, it is a quantity that can easily be controlled in many systems. The uncertainty relation is approximately saturated by number-phase intelligent states. This allows us to define amplitude squeezing, connecting coherent states to Fock states, without a reference to a phase operator. We propose several setups for an experimental verification.

  1. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how best to use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. PMID:25890086

  2. Comparison of the effect of hazard and response/fragility uncertainties on core melt probability uncertainty

    SciTech Connect

    Mensing, R.W.

    1985-01-01

    This report proposes a method for comparing the effects of the uncertainty in probabilistic risk analysis (PRA) input parameters on the uncertainty in the predicted risks. The proposed method is applied to compare the effect of uncertainties in the descriptions of (1) the seismic hazard at a nuclear power plant site and (2) random variations in plant subsystem responses and component fragility on the uncertainty in the predicted probability of core melt. The PRA used is that developed by the Seismic Safety Margins Research Program.

  3. Principles of silicon surface chemistry from first principles

    SciTech Connect

    Doren, D.J. [Univ. of Delaware Newark, DE (United States)

    1996-10-01

    First principles theoretical studies of dissociative adsorption of H{sub 2}, H{sub 2}O, SiH{sub 4} and other species on Si(100)-2x1 demonstrate some common principles that permit qualitative understanding of the mechanisms of reactive adsorption on Si. The structures of transition states and the interactions among surface sites can also be understood in terms of correlations between surface structure and local electron density. For example, the transition states for dissociative adsorption involve buckled surface dimers, which present both electrophilic and nucleophilic reaction sites and allow efficient addition across the dimer. A surface Diels-Alder reaction will also be described, in which symmetric addition to an unbuckled surface dimer is allowed by orbital symmetry. The Diels-Alder product establishes novel reactive surface sites that may be useful for subsequent surface modification. This work has been done in collaboration with Sharmila Pai, Robert Konecny and Anita Robinson Brown.

  4. Estimating uncertainty of inference for validation

    SciTech Connect

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10{sup 13}-10{sup 14} neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the first in a series of inference uncertainty estimations. While the methods demonstrated are primarily statistical, these do not preclude the use of nonprobabilistic methods for uncertainty characterization. The methods presented permit accurate determinations for validation and eventual prediction. It is a goal that these methods establish a standard against which best practice may evolve for determining degree of validation.

  5. The Loss Rank Principle for Model Selection

    E-print Network

    Hutter, Marcus

    Slides on the Loss Rank Principle for model selection, discussed alongside the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Minimum Description Length (MDL) principle; the contents include model complexity selection.

  6. Uncertainty of Fixed Depth Seismic Event Locations

    NASA Astrophysics Data System (ADS)

    Ballard, S.

    2006-05-01

    In seismic nuclear explosion monitoring, accurate determination of the depth of a seismic event is an extremely important but elusive goal. It is important because events with hypocenters deeper than a few km can be ruled out as potential nuclear explosions, and elusive because the inherent tradeoff between the depth of an event and its origin time makes tight constraint on the depth of an event difficult if depth phases are not observed. Given these considerations, proper formulation of the uncertainty of a computed seismic event location takes on increased significance. A routine task in seismic event location is to compute a location with depth fixed at some particular depth, typically the surface of the Earth. Generally, this is done because depth is acknowledged to be poorly constrained by the available observations but one would nonetheless like an answer to the question "Assuming that the event occurred at the surface, where did it occur?" This is certainly a valid question and calculation of the answer is straightforward. Care must be exercised, however, when formulating the uncertainty of the computed fixed depth location. A naïve approach, which is frequently reported in practice, assumes that the depth of the event is known with perfect certainty, yielding an uncertainty ellipse at the desired depth. This can lead to the absurd result that a well constrained event with a 3 dimensional uncertainty that indicates that the event occurred at great depth can have a surface, fixed depth solution with a perfectly reasonable looking uncertainty ellipse. An uncertainty estimate that better addresses the needs of the person requesting the fixed depth solution is the intersection of the 3-dimensional uncertainty boundary with the horizontal plane at the depth in question. It would be nice if the linear 3 dimensional ellipsoid could be used for this purpose, but vertical non-linearity of the Earth models generally used to compute seismic event locations makes this approach inaccurate. A far better approach would be to use the non-elliptical intersection of the surface with the 3D nonlinear uncertainty bounds determined with a grid search algorithm. Unfortunately, such a result is too computationally expensive to compute and too difficult to store in relational databases for routine work. There is an alternative approach that yields a convenient, computationally inexpensive uncertainty estimate that is comparable to the grid search results for earth models where the horizontal non-linearity is not severe. Essentially, the uncertainty ellipse is calculated as before but the dimensions of the ellipse are rescaled to reflect 3D rather than 2D uncertainty statistics, and the minimum sum squared weighted residual that is used as the reference point of the uncertainty ellipse is shifted from the minimum found at the depth of interest to the minimum found by the full 3D event location solution. For events with infinite uncertainty in the vertical direction and events whose 3D solutions occur at the depth of the fixed depth solution, the alternative uncertainty ellipse has linear dimensions about 15% larger than those calculated assuming depth is known perfectly. If the 3D solution indicates that the event could not have occurred at the depth in question, the alternative approach will return an answer of null, appropriately indicating that the event did not occur at the depth of the fixed depth solution, at the indicated confidence level.
This work was supported by the United States Department of Energy under Contract DE-AC04-94AL85000. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy.
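
    The "about 15% larger" figure quoted above follows from swapping 2-D for 3-D chi-square quantiles when scaling the confidence ellipse; a quick numerical check (our computation, using scipy, not code from the abstract):

        from scipy.stats import chi2

        conf = 0.95
        # Confidence-ellipse axis lengths scale with the square root of the
        # chi-square quantile for the number of fitted dimensions.
        scale_2d = chi2.ppf(conf, df=2) ** 0.5
        scale_3d = chi2.ppf(conf, df=3) ** 0.5
        print(f"3D/2D axis ratio at {conf:.0%}: {scale_3d / scale_2d:.3f}")
        # prints about 1.142, i.e. linear dimensions roughly 14-15% larger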

  7. Adverse outcome pathway (AOP) development I: strategies and principles.

    PubMed

    Villeneuve, Daniel L; Crump, Doug; Garcia-Reyero, Natàlia; Hecker, Markus; Hutchinson, Thomas H; LaLone, Carlie A; Landesmann, Brigitte; Lettieri, Teresa; Munn, Sharon; Nepelska, Malgorzata; Ottinger, Mary Ann; Vergauwen, Lucia; Whelan, Maurice

    2014-12-01

    An adverse outcome pathway (AOP) is a conceptual framework that organizes existing knowledge concerning biologically plausible, and empirically supported, links between molecular-level perturbation of a biological system and an adverse outcome at a level of biological organization of regulatory relevance. Systematic organization of information into AOP frameworks has potential to improve regulatory decision-making through greater integration and more meaningful use of mechanistic data. However, for the scientific community to collectively develop a useful AOP knowledgebase that encompasses toxicological contexts of concern to human health and ecological risk assessment, it is critical that AOPs be developed in accordance with a consistent set of core principles. Based on the experiences and scientific discourse among a group of AOP practitioners, we propose a set of five fundamental principles that guide AOP development: (1) AOPs are not chemical specific; (2) AOPs are modular and composed of reusable components-notably key events (KEs) and key event relationships (KERs); (3) an individual AOP, composed of a single sequence of KEs and KERs, is a pragmatic unit of AOP development and evaluation; (4) networks composed of multiple AOPs that share common KEs and KERs are likely to be the functional unit of prediction for most real-world scenarios; and (5) AOPs are living documents that will evolve over time as new knowledge is generated. The goal of the present article was to introduce some strategies for AOP development and detail the rationale behind these 5 key principles. Consideration of these principles addresses many of the current uncertainties regarding the AOP framework and its application and is intended to foster greater consistency in AOP development. PMID:25466378

  8. Principles for system level electrochemistry

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1986-01-01

    The higher power and higher voltage levels anticipated for future space missions have required a careful review of the techniques currently in use to preclude battery problems that are related to the dispersion characteristics of the individual cells. Not only are the out-of-balance problems accentuated in these larger systems, but the thermal management considerations also require a greater degree of accurate design. Newer concepts which employ active cooling techniques are being developed which permit higher rates of discharge and tighter packing densities for the electrochemical components. This paper will put forward six semi-independent principles relating to battery systems. These principles will progressively address cell, battery and finally system related aspects of large electrochemical storage systems.

  9. Artificial intelligence: Principles and applications

    SciTech Connect

    Yazdani, M.

    1985-01-01

    The book covers the principles of AI and the main areas of application, as well as some of the social implications. The applications chapters have a common format structured as follows: definition of the topic; approach with conventional computing techniques; why 'intelligence' would provide a better approach; and how AI techniques would be used and the limitations. The contents discussed are: Principles of artificial intelligence; AI programming environments; LISP, list processing and pattern-matching; AI programming with POP-11; Computer processing of natural language; Speech synthesis and recognition; Computer vision; Artificial intelligence and robotics; The anatomy of expert systems - Forsyth; Machine learning; Memory models of man and machine; Artificial intelligence and cognitive psychology; Breaking out of the Chinese room; Social implications of artificial intelligence; and Index.

  10. Principles of Charged Particle Acceleration

    NSDL National Science Digital Library

    These learning resources comprise a healthy introduction to charged particle acceleration. The site, by Stanley Humphries, a professor of electrical and computer engineering at University of New Mexico, amounts to an online textbook (.pdf) introducing the theory of charged particle acceleration. The book's fifteen chapters (with bibliography) summarize "the principles underlying all particle accelerators" and provide "a reference collection of equations and material essential to accelerator development and beam applications."

  11. Investigation on the holographic principle

    Microsoft Academic Search

    Li Jiang

    2003-01-01

    The holographic principle asserts that any given codimension two space-like surface limits the information content of adjacent regions. We first review various entropy bounds which lead to the formulation of this conjecture, putting great emphasis on the UV-IR connection. We propose to use non-commutative field theory as a toy model to study the holographic mapping mechanism. In particular, we investigate

  12. Applications of Bohr's correspondence principle

    Microsoft Academic Search

    Frank S. Crawford

    1989-01-01

    The Bohr correspondence-principle (cp) formula dE/dn = ℏω is presented (ω is the classical angular frequency) and its predicted energy levels E_n are compared to those given by the stationary-state solutions of the Schrödinger equation, first for several examples in one dimension (1D), including the "quantum bouncer," and then for several examples in three
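
    A worked check of the cp formula for a case where it is exact (our example; the article's own examples include the quantum bouncer): for the 1D harmonic oscillator,

        \[
          E_n = \hbar\omega\left(n + \tfrac{1}{2}\right)
          \quad\Longrightarrow\quad
          \frac{dE}{dn} = \hbar\omega,
        \]

    in exact agreement with the cp formula, because the classical angular frequency ω of this potential is independent of amplitude.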

  13. Risk management principles for physicians.

    PubMed

    Paterick, Timothy E

    2014-01-01

    The swift pace of medical practice today makes it imperative for physicians to carry a toolbox jam-packed with risk management principles. The toolbox must be overflowing with utensils that allow a complete execution of the physician's fiduciary responsibility to the patient: all-inclusive informed consent, comprehensive documentation, fulfilling the standard of care, the significance of second opinions, transparency, crisis-management skills, and how to discuss an unfortunate result/outcome. PMID:24696957

  14. Assessment of neutronic parameter's uncertainties obtained within the reactor dosimetry framework: Development and application of the stochastic methods of analysis

    SciTech Connect

    Destouches, C.; Beretz, D. [Service de Physique Experimentale, CEA-CAD/DEN/DER/SPEx, Departement d'Etudes des Reacteurs, 13108 St-Paul lez Durance Cedex (France); Devictor, N. [Service d'Etude des Systemes Innovant, CEA-CAD/DEN/DER/SESI, Departement d'Etudes des Reacteurs, 13108 St-Paul lez Durance Cedex (France); Gregoire, G. [Service de Physique Experimentale, CEA-CAD/DEN/DER/SPEx, Departement d'Etudes des Reacteurs, 13108 St-Paul lez Durance Cedex (France)

    2006-07-01

    One of the main objectives of reactor dosimetry is the determination of the physical parameters characterizing the neutronic field in which the studied sample is irradiated. Knowledge of the associated uncertainties represents a significant stake for the nuclear industry, as is shown by the high uncertainty value of 15% (k=1) commonly allowed for the calculated neutron flux (E > 1 MeV) on the vessel and internal structures. The study presented in this paper aims at determining, and then reducing, the uncertainties associated with the reactor dosimetry interpretation process. After a brief presentation of the interpretation process, the input data uncertainties are identified and quantified, in particular with regard to covariances. Uncertainty propagation is then carried out and analyzed by deterministic and stochastic methods on a representative case. Finally, a Monte Carlo sensitivity study based on Sobol indices is performed on a case, identifying the most penalizing input uncertainties. The paper concludes by raising avenues of improvement to be studied for the knowledge of the input data. It highlights, for example, the need for realistic variance-covariance matrices associated with the input data (cross-section libraries, neutron computation code outputs, ...). Lastly, the methodology presented in this paper is general enough to be easily transposed to other measurement data interpretation processes. (authors)
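
    A minimal sketch of a first-order Sobol index estimate by the "pick-freeze" Monte Carlo method, the kind of stochastic sensitivity analysis the abstract describes. The toy model, sample sizes, and names are our assumptions, not the dosimetry code's.

        import numpy as np

        rng = np.random.default_rng(4)

        def f(x):
            # Toy stand-in for the interpretation process (e.g. a computed flux).
            # Analytically S_1 = 0.4, S_2 = 0.2, S_3 = 0 for this model.
            return x[:, 0] + 0.5 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

        n, d = 200_000, 3
        A, B = rng.normal(size=(n, d)), rng.normal(size=(n, d))
        yA, yB = f(A), f(B)

        for i in range(d):
            ABi = B.copy()
            ABi[:, i] = A[:, i]  # "freeze" input i; resample all the others
            Si = np.mean(yA * (f(ABi) - yB)) / yA.var()
            print(f"first-order Sobol index S_{i + 1} ~ {Si:.2f}")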

  15. States of uncertainty: governing the empire of biotechnology.

    PubMed

    Forbes, Ian

    2006-04-01

    The biotechnological revolution presents states and governments with a set of challenges that they have difficulty meeting. Part of the problem is associated with common perceptions of the speed, volume and the radical uncertainty of the new developments. Globalisation is also implicated, especially in relation to the development of the knowledge economy and the role of multinational actors. This in turn contributes to the apparent decline in the confidence of the public that national governments will be effective in addressing mounting concern about the dangers inherent in new techniques and products. Under these circumstances, 'normal' governance begins to look more like 'failure' governance. This article asks whether the effects of the biotechnological revolution on governance can adequately be explained by the critique of imperialism proposed by Michael Hardt and Antonio Negri, and whether the state is in danger of becoming implicated in sponsorship of modernist schemes to improve the human condition of the kind analysed by James C. Scott. Biotechnology does appear to have imperial qualities, while there are strong reasons for states to see biotechnology as a feasible and desirable set of developments. For some critics of biotechnology, like Francis Fukuyama, this is a lethal combination, and the powers of the state should be used to stop biotechnological development. Others, by contrast and more pragmatically, propose a check on what the state will support by the application of precautionary principles. The article concludes that the association between the biotechnology empire and the state, combined with the inescapable duty of the state to be the risk manager of last resort, alerts us to the complexities of uncertainty at the same time as it renders a merely restrictive precautionary approach impracticable. PMID:17312633

  16. ERDC TR-10-12 Decision Making Under Uncertainty

    E-print Network

    US Army Corps of Engineers

    ERDC TR-10-12, Decision Making Under Uncertainty, Engineer Research and Development Center, November 2010 (Martin T. ...). Approved for public release; distribution is unlimited. Covers approaches for addressing uncertainty in decision making and the sources of uncertainty in decision making.

  17. Uncertainties in the Finnish greenhouse gas emission inventory

    Microsoft Academic Search

    Suvi Monni; Sanna Syri; Ilkka Savolainen

    2004-01-01

    Reliable uncertainty estimates are a tool for increasing the quality of national emission inventories which are essential for the implementation of the Kyoto Protocol. The first detailed uncertainty assessment was performed for the Finnish greenhouse gas emission inventory considering the years 1990 and 2001 using Monte Carlo simulation to combine uncertainties. In this work, uncertainty estimates were based on available
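
    A minimal sketch of combining source-category uncertainties by Monte Carlo simulation, the technique named in the abstract. The categories, magnitudes, and distributions are invented for illustration and are not the Finnish inventory's figures.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        # Hypothetical source categories: (mean emission in Mt CO2-eq., relative sd).
        categories = {"energy": (55.0, 0.03),
                      "agriculture": (7.5, 0.20),
                      "waste": (4.0, 0.35)}

        total = np.zeros(n)
        for mean, rel_sd in categories.values():
            total += rng.normal(mean, rel_sd * mean, size=n)

        lo, hi = np.percentile(total, [2.5, 97.5])
        print(f"total {total.mean():.1f} Mt, 95% interval [{lo:.1f}, {hi:.1f}] Mt")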

  18. Fuzzy logic: application for audit risk and uncertainty

    Microsoft Academic Search

    George Thomas Friedlob; Lydia L. F. Schleifer

    1999-01-01

    Auditors generally describe risk in terms of probabilities. Risk arises from lack of information which in turn leads to uncertainty. Since uncertainty exists when information is deficient and information can be deficient in different ways, it follows that auditors deal with different types of uncertainty. This article describes different types of uncertainty and a relatively new method of dealing with

  19. Comparison of Evidence Theory and Bayesian Theory for Uncertainty Modeling

    E-print Network

    Nikolaidis, Efstratios

    Compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, considering the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty

  20. The Truncated Tornado in TMBB: A Spatiotemporal Uncertainty Model for

    E-print Network

    Bae, Wan

    ... and system performance. In this paper, we propose an uncertainty model called the Truncated Tornado model as a significant advance in minimizing uncertainty region sizes. The Truncated Tornado model removes uncertainty