Science.gov

Sample records for uncertainty principle

  1. A new uncertainty principle

    E-print Network

    C. Y. Chen

    2008-12-23

    By examining two counterexamples to the existing theory, it is shown, with mathematical rigor, that as far as scattered particles are concerned the true distribution function is in principle not determinable (indeterminacy principle or uncertainty principle) while the average distribution function over each predetermined finite velocity solid-angle element can be calculated.

  2. Uncertainty Principle Respects Locality

    E-print Network

    Dongsheng Wang

    2015-04-19

    The notion of nonlocality implicitly suggests that there might be some kind of spooky action at a distance in nature; however, the validity of quantum mechanics has been well tested up to now. In this work it is argued that the notion of nonlocality is physically improper and that the basic principle of locality in nature is well respected by quantum mechanics, namely through the uncertainty principle. We show that the quantum bound on the Clauser, Horne, Shimony, and Holt (CHSH) inequality can be recovered from the uncertainty relation in a multipartite setting. We further argue that the super-quantum correlation demonstrated by the nonlocal box is not physically comparable with the quantum one. The origin of the quantum structure of nature still remains to be explained; some post-quantum theory which is more complete in some sense than quantum mechanics is possible, and it might not necessarily be a hidden variable theory.
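
    For orientation, the CHSH setting referred to above reads, in standard notation (a contextual note, not taken from the record itself),

    $\mathcal{B} = \langle A_1 B_1\rangle + \langle A_1 B_2\rangle + \langle A_2 B_1\rangle - \langle A_2 B_2\rangle, \qquad |\mathcal{B}|_{\mathrm{LHV}} \leq 2, \qquad |\mathcal{B}|_{\mathrm{quantum}} \leq 2\sqrt{2},$

    where $2\sqrt{2}$ is the Tsirelson bound that the author recovers from the uncertainty relation, while the nonlocal (PR) box saturates the algebraic maximum of 4.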

  3. Uncertainty principles and vector quantization

    E-print Network

    Vershynin, Roman

    Yurii Lyubarskii and Roman Vershynin. A survey of the state of the art of quantization prior to 1998, as well as an outline of its numerous applications, can…

  4. Review on Generalized Uncertainty Principle

    E-print Network

    Tawfik, Abdel Nasser

    2015-01-01

    Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, a minimal distance and/or a maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
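
    For reference, the simplest quadratic form of the minimal-length GUP reviewed in this and the neighboring records is (standard convention; the record itself does not display it)

    $\Delta x\,\Delta p \geq \frac{\hbar}{2}\left[1 + \beta\,(\Delta p)^2\right], \qquad \beta = \beta_0\,\ell_{\mathrm{Pl}}^2/\hbar^2,$

    which implies a minimal measurable length $\Delta x_{\min} = \hbar\sqrt{\beta} = \sqrt{\beta_0}\,\ell_{\mathrm{Pl}}$, with $\beta_0$ the dimensionless GUP parameter whose bounds are discussed above.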

  5. Review on Generalized Uncertainty Principle

    E-print Network

    Abdel Nasser Tawfik; Abdel Magied Diab

    2015-09-22

    Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, a minimal distance and/or a maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.

  6. Uncertainty Principles for Compact Groups

    E-print Network

    Gorjan Alagic; Alexander Russell

    2008-08-29

    We establish an operator-theoretic uncertainty principle over arbitrary compact groups, generalizing several previous results. As a consequence, we show that if $f \in L^2(G)$, then the product of the measures of the supports of $f$ and its Fourier transform $\hat f$ is at least 1; here, the dual measure is given by the sum, over all irreducible representations $V$, of $d_V \operatorname{rank} \hat f(V)$. For finite groups, our principle implies the following: if $P$ and $R$ are projection operators on the group algebra $\mathbb{C}[G]$ such that $P$ commutes with projection onto each group element, and $R$ commutes with left multiplication, then the squared operator norm of $PR$ is at most $\operatorname{rank}(P)\,\operatorname{rank}(R)/|G|$.
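
    In the finite abelian case every irreducible representation is one-dimensional, so the dual measure simply counts the support of $\hat f$, and the principle specializes to the classical Donoho-Stark inequality (a standard fact, stated here for orientation):

    $|\operatorname{supp} f| \cdot |\operatorname{supp} \hat f| \geq |G| \qquad \text{for all nonzero } f \in \mathbb{C}[G].$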

  7. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Micheal S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.

  8. Quantum Mechanics and the Generalized Uncertainty Principle

    E-print Network

    Jang Young Bang; Micheal S. Berger

    2006-11-30

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.

  9. Gamma-Ray Telescope and Uncertainty Principle

    ERIC Educational Resources Information Center

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  10. Disturbance, the uncertainty principle and quantum optics

    NASA Technical Reports Server (NTRS)

    Martens, Hans; Demuynck, Willem M.

    1993-01-01

    It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.
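
    The photon-number/phase example above is conventionally summarized by the heuristic relation (a schematic statement only; the record's joint-measurement formulation is the precise one)

    $\Delta N\,\Delta\phi \gtrsim \tfrac{1}{2},$

    valid for states with a well-defined mean phase, $\Delta\phi \ll 2\pi$.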

  11. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  12. Curriculum in Art Education: The Uncertainty Principle.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1989-01-01

    Identifies curriculum as the pivotal link between theory and practice, noting that all stages of curriculum research and development are characterized by elements of uncertainty. States that this uncertainty principle reflects the reality of practice as it mirrors the contradictory nature of art, the pluralism of schools and society, and the…

  13. Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.

    ERIC Educational Resources Information Center

    McKerrow, K. Kelly; McKerrow, Joan E.

    1991-01-01

    The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)

  14. An uncertainty principle for unimodular quantum groups

    SciTech Connect

    Crann, Jason; Kalantar, Mehrdad E-mail: mkalanta@math.carleton.ca

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
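
    For context, Hirschman's inequality on $\mathbb{R}$, the starting point of this and the following record, reads in its sharp form (the constant is due to Beckner; $2\pi$-normalized Fourier transform):

    $H(|f|^2) + H(|\hat f|^2) \geq \log\frac{e}{2}, \qquad \|f\|_2 = 1,$

    where $H(\rho) = -\int \rho \log \rho$ denotes the differential entropy.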

  15. An uncertainty principle for unimodular quantum groups

    E-print Network

    Jason Crann; Mehrdad Kalantar

    2014-11-02

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to normal central states of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the central state.

  16. A Principle of Uncertainty for Information Seeking.

    ERIC Educational Resources Information Center

    Kuhlthau, Carol C.

    1993-01-01

    Proposes an uncertainty principle for information seeking based on the results of a series of studies that investigated the user's perspective of the information search process. Constructivist theory is discussed as a conceptual framework for studying the user's perspective, and areas for further research are suggested. (Contains 44 references.)…

  17. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022

  18. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Nasser Tawfik, Abdel; Magied Diab, Abdel

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  19. Space tests of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Khodadi, M.

    2015-08-01

    A generalized uncertainty principle admitting a minimal measurable length contains a parameter whose numerical value needs to be fixed. In fact, the application of the Generalized Uncertainty Principle (GUP) to some quantum mechanical problems offers different values for the upper bound of the dimensionless GUP parameter. In this work, by applying linear and quadratic GUP corrections to Newton's law of gravity, and then using the stability condition of the circular orbits of the planets, we propose an upper bound for this parameter. By using the astronomical data of the Solar System objects, a new and severe constraint on the upper bound of the parameter is derived. Also, using the modified Newtonian potential inspired by the linear and quadratic GUP, we investigate the possibility of measuring the relevant parameter through observables provided by the Galileo Navigation Satellite System.

  20. Generalized uncertainty principle and quantum gravitational friction

    NASA Astrophysics Data System (ADS)

    Bargueño, Pedro

    2013-12-01

    The Generalized Uncertainty Principle gives rise to deformed commutation relations which are linear or quadratic in the particle momenta. In this article we show that, in the linear case, which corresponds to doubly special relativity theories, this deformation is equivalent to a gravitationally-induced damping process in an Ohmic environment at zero temperature. Therefore, both minimum length and maximum momentum give rise to quantum gravitational friction.

  1. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from string theory, black hole physics and doubly special relativity, and some thought experiments which were suggested to probe the shortest distance and/or the maximum momentum at the Planck scale. Furthermore, all models developed to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, the Salecker-Wigner inequalities, the entropic nature of gravitational laws, the Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches predicts a minimal length uncertainty; a second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of composite systems. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Concerns about compatibility with the equivalence principles, the universality of gravitational redshift, free fall and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

  2. Least uncertainty principle in deformation quantization

    SciTech Connect

    Gerstenhaber, Murray

    2007-02-15

    Deformation quantization generally produces families of cohomologically equivalent quantizations of a single physical system. We conjecture that the physically meaningful ones (i) allow enough observable energy distributions, i.e., ones for which no pure quantum state has negative probability, and (ii) reduce the uncertainty in the probability distribution of the resulting quantum states. For the simple harmonic oscillator this principle selects the classic Groenewold-Moyal (or Weyl) product on phase space while for the free particle, in which there is no real quantization, all cohomologically equivalent quantizations are equally good.
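
    The Groenewold-Moyal product selected by this criterion is the standard star product on phase space (textbook form, quoted here for reference):

    $(f \star g)(x,p) = f(x,p)\,\exp\!\Big[\tfrac{i\hbar}{2}\big(\overleftarrow{\partial}_{x}\overrightarrow{\partial}_{p} - \overleftarrow{\partial}_{p}\overrightarrow{\partial}_{x}\big)\Big]\,g(x,p),$

    whose commutator reproduces the Poisson bracket to first order in $\hbar$.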

  3. Dilaton cosmology, noncommutativity, and generalized uncertainty principle

    SciTech Connect

    Vakili, Babak

    2008-02-15

    The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. I extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.

  4. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and different time scales. The ultimate origin of such universal quantum stability lies in the fundamental uncertainty principle, which makes the phase space, and hence the spectrum of bounded quantum motion, discrete. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  5. Lorentz Invariance Violation and Generalized Uncertainty Principle

    E-print Network

    Abdel Nasser Tawfik; H. Magdy; A. Farag Ali

    2015-12-20

    There are several theoretical indications that quantum gravity approaches may have predictions for a minimal measurable length and a maximal observable momentum, and thereby for a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the redshift-dependent Hubble parameter in early-type galaxies. We find that LIV has two types of contributions to the time-of-flight delay $\Delta t$ comparable with those observations. Although the OPERA measurement of the faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate the delay $\Delta t$ and the relative change $\Delta v$ in the speed of the muon neutrino in dependence on the redshift $z$; accordingly, the results could not be interpreted as LIV. A third comparison is made with the ultra-high-energy cosmic rays (UHECR). An essential ingredient is found to be the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity, together with the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence are essential inputs for reliably confronting our calculations with the UHECR data. The sensitivity factor is related to the specific time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter $\alpha$ that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.

  6. Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael

    1993-01-01

    Heisenberg's uncertainty principle and the derivative notions of indeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research are examined. Implications are drawn for research in science education. (PR)

  7. Lorentz Invariance Violation and Generalized Uncertainty Principle

    E-print Network

    A. Tawfik; H. Magdy; A. Farag Ali

    2012-05-27

    Recent approaches to quantum gravity are conjectured to give predictions for a minimum measurable length, a maximum observable momentum and an essential generalization of the Heisenberg uncertainty principle (GUP). The latter is based on a momentum-dependent modification of the standard dispersion relation and leads to Lorentz invariance violation (LIV). The main features of the controversial OPERA measurements of the faster-than-light muon-neutrino anomaly are used to calculate the time-of-flight delays $\Delta t$ and the relative change $\Delta v$ in the speed of the neutrino in dependence on the redshift $z$. The results are compared with the OPERA measurements. We find that the measurements are too large to be interpreted as LIV. Depending on the rest mass, the propagation of high-energy muon neutrinos can be superluminal. The comparison with the ultra-high-energy cosmic rays seems to reveal an essential ingredient of the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity, and of the one assuming a perturbative departure from exact Lorentz invariance.

  8. Open timelike curves violate Heisenberg's uncertainty principle.

    PubMed

    Pienaar, J L; Ralph, T C; Myers, C R

    2013-02-01

    Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity. PMID:23432226

  9. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

    Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, the rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  10. String Theory, Scale Relativity and the Generalized Uncertainty Principle

    E-print Network

    Carlos Castro

    1996-11-06

    Extensions (modifications) of the Heisenberg Uncertainty Principle are derived within the framework of the theory of Special Scale-Relativity proposed by Nottale. In particular, generalizations of the Stringy Uncertainty Principle are obtained in which the size of the strings is bounded by the Planck scale and the size of the Universe. Based on the fractal structures inherent in two-dimensional Quantum Gravity, which has attracted considerable interest recently, we conjecture that the underlying fundamental principle behind String theory should be based on an extension of the Scale Relativity principle in which both dynamics and scales are incorporated on the same footing.

  11. Entropic Uncertainty Principle and Information Exclusion Principle for multiple measurements in the presence of quantum memory

    E-print Network

    Jun Zhang; Yang Zhang; Chang-shui Yu

    2015-09-20

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting, canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion principle for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail.
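
    For two measurements, the quantum-memory-assisted entropic uncertainty relation that such bounds tighten is the one of Berta et al. (standard form, quoted here for context):

    $S(X|B) + S(Z|B) \geq \log_{2}\frac{1}{c} + S(A|B), \qquad c = \max_{j,k}\big|\langle\psi_{j}|\phi_{k}\rangle\big|^{2},$

    where $\{|\psi_{j}\rangle\}$ and $\{|\phi_{k}\rangle\}$ are the eigenbases of the observables $X$ and $Z$, and $B$ is the quantum memory.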

  12. Risks, scientific uncertainty and the approach of applying precautionary principle.

    PubMed

    Lo, Chang-fa

    2009-03-01

    The paper intends to clarify the nature and aspects of risks and scientific uncertainty and to elaborate an approach for applying the precautionary principle to handle risks arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both where an international treaty has admitted the precautionary principle and where no international treaty admits the principle or enumerates the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of the measure to cope with the potential risk and to avoid excessive measures. PMID:19705643

  13. Microscopic black hole stabilization via the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Vayenas, Constantinos G.; Grigoriou, Dimitrios

    2015-01-01

    Due to the Heisenberg uncertainty principle, gravitational confinement of two- or three-rotating-particle systems can lead to microscopic Planckian or sub-Planckian black holes with a size of the order of their Compton wavelength. Some properties of such states are discussed in terms of the Schwarzschild geodesics of general relativity and compared with properties computed via the combination of special relativity, the equivalence principle, Newton's gravitational law and the Compton wavelength. It is shown that the generalized uncertainty principle (GUP) provides a satisfactory fit of the Schwarzschild radius and Compton wavelength of such microscopic, particle-like black holes.

  14. The Entropic Uncertainty Principle for Decaying Systems and CP violation

    E-print Network

    Beatrix C. Hiesmayr

    2011-03-17

    Employing an effective formalism for decaying systems, we are able to investigate Heisenberg's uncertainty relation for observables measured at accelerator facilities. In particular, we investigate the neutral K-meson system and show that, firstly, the time evolution introduces an uncertainty between strangeness measurements at different times and, secondly, the imbalance of matter and antimatter (CP violation) introduces an uncertainty in the evolution of the eigenstates of the effective Hamiltonian of the system. Consequently, the existence of CP violation is linked to uncertainties of observables, i.e. the outcomes cannot be predicted, even in principle, to arbitrary precision.

  15. Self-Completeness and the Generalized Uncertainty Principle

    E-print Network

    Maximiliano Isi; Jonas Mureika; Piero Nicolini

    2013-11-06

    The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.

  16. The Uncertainty Principle in Software Engineering

    E-print Network

    Ziv, Hadar

    We discuss the role of uncertainty in select software engineering domains, present three common sources of uncertainty, and address aspects of modeling and managing uncertainty in software engineering in general.

  17. Single-Slit Diffraction and the Uncertainty Principle

    ERIC Educational Resources Information Center

    Rioux, Frank

    2005-01-01

    A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
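
    The Fourier argument compresses into the usual textbook estimate (a sketch, not the article's full analysis): a slit of width $a$ localizes the photon to $\Delta y \approx a$, and the first diffraction minimum at $\sin\theta = \lambda/a$ implies a transverse-momentum spread

    $\Delta p_{y} \approx p\,\sin\theta = \frac{h}{\lambda}\cdot\frac{\lambda}{a} = \frac{h}{a}, \qquad\text{so}\qquad \Delta y\,\Delta p_{y} \approx h.$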

  18. The Uncertainty Principle, Virtual Particles and Real Forces

    ERIC Educational Resources Information Center

    Jones, Goronwy Tudor

    2002-01-01

    This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…

  19. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  20. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  1. String Theory and Space-Time Uncertainty Principle

    E-print Network

    Yoneya, T

    2000-01-01

    The notion of the space-time uncertainty principle in string theory is clarified and further developed. The motivation and the derivation of the principle are first reviewed in a reasonably self-contained way. It is then shown that the nonperturbative (Borel summed) high-energy and high-momentum transfer behaviors of string scattering are consistent with the space-time uncertainty principle. It is also shown that, in consequence of the principle, string theories in 10 dimensions generically exhibit a characteristic length scale which is equal to the well-known 11-dimensional Planck length $g_s^{1/3}\ell_s$ of M-theory as the scale at which stringy effects take over from the effects of classical supergravity, even without involving D-branes directly. The meanings of the space-time uncertainty relation in connection with D-branes and black holes are discussed and reinterpreted. Finally, we present a novel interpretation of the Schild-gauge action for strings from a viewpoint of noncommutative geometry, which conforms to...
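
    The space-time uncertainty relation in question is conventionally written (standard form, with $\ell_s$ the string length)

    $\Delta T\,\Delta X \gtrsim \ell_{s}^{2},$

    and the characteristic scale quoted in the abstract, $g_{s}^{1/3}\ell_{s}$, is the 11-dimensional Planck length of M-theory.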

  2. Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Oppenheim, Jacob N.; Magnasco, Marcelo O.

    2013-01-01

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
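
    Stated explicitly, the Fourier (Gabor) limit that the subjects beat is (standard normalization, with root-mean-square widths)

    $\Delta t\,\Delta f \geq \frac{1}{4\pi} \qquad\Longleftrightarrow\qquad \Delta t\,\Delta\omega \geq \frac{1}{2},$

    a bound on the signal itself, and hence on linear time-frequency analysis, but not on nonlinear estimators operating on the signal.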

  3. Uncertainty principle for Gabor systems and the Zak transform

    SciTech Connect

    Czaja, Wojciech; Zienkiewicz, Jacek

    2006-12-15

    We show that if $g \in L^2(\mathbb{R})$ is a generator of a Gabor orthonormal basis with the lattice $\mathbb{Z}\times\mathbb{Z}$, then its Zak transform $Z(g)$ satisfies $\nabla Z(g) \notin L^2([0,1)^2)$. This is a generalization and extension of the Balian-Low uncertainty principle.

  4. Quantum black hole in the generalized uncertainty principle framework

    SciTech Connect

    Bina, A.; Moslehi, A.; Jalalzadeh, S.

    2010-01-15

    In this paper we study the effects of the generalized uncertainty principle (GUP) on canonical quantum gravity of black holes. Through the use of modified partition function that involves the effects of the GUP, we obtain the thermodynamical properties of the Schwarzschild black hole. We also calculate the Hawking temperature and entropy for the modification of the Schwarzschild black hole in the presence of the GUP.

  5. Space-time uncertainty relation from quantum and gravitational principles

    E-print Network

    Yi-Xin Chen; Yong Xiao

    2008-09-24

    By combining quantum and gravitational principles, a space-time uncertainty relation $(\delta t)(\delta r)^{3} \geqslant \pi r^{2} l_{p}^{2}$ is derived. It can be used to facilitate the discussion of several profound questions, such as the computational capacity and thermodynamic properties of the universe and the origin of holographic dark energy. The universality and validity of the proposed relation are illustrated via these examples.

  6. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.

  7. The Perihelion Precession of Mercury and the Generalized Uncertainty Principle

    E-print Network

    Barun Majumder

    2011-05-12

    Very recently, the authors of [1] proposed a new Generalized Uncertainty Principle (GUP) with a term linear in the Planck length. In this Letter the effect of this linear term is studied perturbatively in the context of Keplerian orbits. The angle by which the perihelion of the orbit revolves over a complete orbital cycle is computed. The result is applied in the context of the precession of the perihelion of Mercury. As a consequence we obtain a lower bound on the new intermediate length scale offered by the GUP, which is approximately 40 orders of magnitude below the Planck length.
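
    Schematically, a GUP with a term linear in the Planck length has the form (a hedged sketch; signs and coefficients vary between papers, including [1])

    $\Delta x\,\Delta p \gtrsim \frac{\hbar}{2}\left[1 - \alpha\,\ell_{\mathrm{Pl}}\frac{\Delta p}{\hbar} + \beta\,\ell_{\mathrm{Pl}}^{2}\frac{(\Delta p)^{2}}{\hbar^{2}}\right],$

    where the dimensionless $\alpha$ sets the intermediate length scale that the Letter bounds from the perihelion data.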

  8. Minisuperspace dynamics in a generalized uncertainty principle framework

    SciTech Connect

    Battisti, Marco Valerio; Montani, Giovanni

    2008-01-03

    The minisuperspace dynamics of the Friedmann-Robertson-Walker (FRW) and Taub universes in the context of a Generalized Uncertainty Principle is analyzed in detail. In particular, the motion of the wave packets is investigated and, in both models, the classical singularity appears to be probabilistically suppressed. Moreover, the FRW wave packets approach the Planckian region in a stationary way, and no evidence for a Big Bounce, as predicted in Loop Quantum Cosmology, appears. On the other hand, the Taub wave packets provide the right behavior in predicting an isotropic Universe.

  9. Spectral gaps, additive energy, and a fractal uncertainty principle

    E-print Network

    Semyon Dyatlov; Joshua Zahl

    2015-08-14

    We obtain an essential spectral gap for $n$-dimensional convex co-compact hyperbolic manifolds with the dimension $\delta$ of the limit set close to $(n-1)/2$. The size of the gap is expressed using the additive energy of stereographic projections of the limit set. This additive energy can in turn be estimated in terms of the constants in Ahlfors-David regularity of the limit set. Our proofs use new microlocal methods, in particular a notion of a fractal uncertainty principle.

  10. Universal uncertainty principle and quantum state control under conservation laws

    E-print Network

    Masanao Ozawa

    2004-11-10

    Heisenberg's uncertainty principle, exemplified by the gamma-ray thought experiment, suggests that any finite-precision measurement disturbs any observables noncommuting with the measured observable. Here, it is shown that this statement contradicts the limit on the accuracy of measurements under conservation laws originally found by Wigner in the 1950s, and should be modified to correctly derive the unavoidable noise caused by conservation-law-induced decoherence. The obtained accuracy limit leads to the interesting conclusion that a widely accepted, but rather naive, physical encoding of qubits for quantum computing suffers significantly from the decoherence induced by the angular momentum conservation law.

  11. Generalized uncertainty principle in Bianchi type I quantum cosmology

    NASA Astrophysics Data System (ADS)

    Vakili, B.; Sepangi, H. R.

    2007-07-01

    We study a quantum Bianchi type I model in which the dynamical variables of the corresponding minisuperspace obey the generalized Heisenberg algebra. Such a generalized uncertainty principle has its origin in the existence of a minimal length suggested by quantum gravity and string theory. We present approximate analytical solutions to the corresponding Wheeler-DeWitt equation in the limit where the scale factor of the universe is small, and compare the results with the standard commutative and noncommutative quantum cosmology. Similarities and differences of these solutions are also discussed.

  12. Hawking temperature for various kinds of black holes from Heisenberg uncertainty principle

    E-print Network

    Fabio Scardigli

    2006-07-04

    Hawking temperature is computed for a large class of black holes (with spherical, toroidal and hyperboloidal topologies) using only laws of classical physics plus the "classical" Heisenberg Uncertainty Principle. This principle is shown to be fully sufficient to obtain the result; there is no need, for this purpose, of a Generalized Uncertainty Principle.
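
    The heuristic runs, in its simplest spherical version (a back-of-the-envelope sketch added for context): a photon emitted from the horizon is localized within $\Delta x \sim r_{s} = 2GM/c^{2}$, so

    $\Delta p \sim \frac{\hbar}{\Delta x} = \frac{\hbar c^{2}}{2GM}, \qquad k_{B}T_{H} \sim \frac{c\,\Delta p}{4\pi} \;\Longrightarrow\; T_{H} = \frac{\hbar c^{3}}{8\pi G M k_{B}},$

    where the factor $1/(4\pi)$ is a calibration constant fixed by matching the exact semiclassical result.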

  13. Scalar field cosmology modified by the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Paliathanasis, Andronikos; Pan, Supriya; Pramanik, Souvik

    2015-12-01

    We consider quintessence scalar field cosmology in which the Lagrangian of the scalar field is modified by the generalized uncertainty principle. We show that the perturbation terms that arise from the deformed algebra are equivalent to the existence of a second scalar field, where the two fields interact in the kinetic part. Moreover, we consider a spatially flat Friedmann–Lemaître–Robertson–Walker spacetime and derive the gravitational field equations. We show that the modified equation-of-state parameter w_GUP can cross the phantom divide line, that is, w_GUP < -1. Furthermore, we derive the field equations in terms of dimensionless parameters; the dynamical system that arises is a singular perturbation system, in which we study the existence of fixed points on the slow manifold. Finally, we perform numerical simulations for some well-known models and show that for these models, with the specific initial conditions, the parameter w_GUP crosses the phantom barrier.

  14. Generalized Uncertainty Principle and Thermostatistics: A Semiclassical Approach

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, Mohammad; Pedram, Pouria

    2015-10-01

    We present an exact treatment of the thermodynamics of physical systems in the framework of the generalized uncertainty principle (GUP). Our purpose is to study and compare the consequences of two GUPs, one of which implies a minimal length while the other predicts a minimal length and a maximal momentum. Using a semiclassical method, we exactly calculate the modified internal energies and heat capacities in the presence of generalized commutation relations. We show that the total shift in these quantities depends only on the deformed algebra, not on the system under study. Finally, the modified internal energy for a specific physical system such as the ideal gas is obtained in the framework of the two different GUPs.

  15. Long-range mutual information and topological uncertainty principle

    E-print Network

    Chao-Ming Jian; Isaac H. Kim; Xiao-Liang Qi

    2015-09-10

    Ordered phases in the Landau paradigm can be diagnosed by a local order parameter, whereas topologically ordered phases cannot be detected in such a way. In this paper, we propose the long-range mutual information (LRMI) as a unified diagnostic for both conventional long-range order and topological order. Using the LRMI, we characterize orders in $n+1$D gapped systems as $m$-membrane condensates with $0 \leq m \leq n-1$. The familiar conventional order and 2+1D topological orders are respectively identified as $0$-membrane and $1$-membrane condensates. We propose and study the topological uncertainty principle, which describes the non-commuting nature of non-local order parameters in topological orders.

  16. MaxEnt Principle for Handling Uncertainty with Qualitative Values

    NASA Astrophysics Data System (ADS)

    Pappalardo, Michele

    2006-11-01

    The Bayesian mathematical model is the oldest method for modelling subjective degrees of belief. If we have probabilistic measures with unknown values, then we must choose a different, appropriate model. Belief functions are a bridge between various models handling different forms of uncertainty. Bayes' conjunctive rule builds a new set of a posteriori probabilities when two independent, accepted sets of random variables are combined in inference. When two pieces of evidence are accepted with unknown values, the Dempster-Shafer rule provides a model for the fusion of different degrees of belief. In this paper we propose the use of the MaxEnt principle for modelling belief. Dealing with non-Bayesian sets, in which a piece of evidence represents belief instead of knowledge, the MaxEnt principle gives a tool to reduce the number of subsets representing the frame of discernment. Fusing a focal set with a set of maximum entropy causes a Bayesian approximation, reducing the mass function to a probability distribution.

  17. Effect of the Generalized Uncertainty Principle on post-inflation preheating

    SciTech Connect

    Chemissany, Wissam; Das, Saurya; Ali, Ahmed Farag; Vagenas, Elias C. E-mail: saurya.das@uleth.ca E-mail: evagenas@academyofathens.gr

    2011-12-01

    We examine effects of the Generalized Uncertainty Principle, predicted by various theories of quantum gravity to replace Heisenberg's uncertainty principle near the Planck scale, on post-inflation preheating in cosmology, and show that it can predict either an increase or a decrease in parametric resonance and a corresponding change in particle production. Possible implications are considered.

  18. Corrections to the Cardy-Verlinde formula from the generalized uncertainty principle

    SciTech Connect

    Setare, M.R.

    2004-10-15

    In this Letter, we compute the corrections to the Cardy-Verlinde formula of the d-dimensional Schwarzschild black hole. These corrections stem from the generalized uncertainty principle. Then we show one can take into account the generalized uncertainty principle corrections of the Cardy-Verlinde entropy formula by just redefining the Virasoro operator $L_0$ and the central charge $c$.

  19. Provable Virus Detection: Using the Uncertainty Principle to Protect Against Malware

    E-print Network

    International Association for Cryptologic Research (IACR)

    The attackers, no matter how clever, and no matter when or how they insert their malware, are detected by a defense based on an uncertainty principle. The scheme is presented in its simplified form here, where various optimizations have been avoided for the sake of clarity.

  20. Uncertainty principle for Wigner-Yanase-Dyson information in semifinite von Neumann algebras

    E-print Network

    Paolo Gibilisco; Tommaso Isola

    2008-04-16

    Recently Kosaki proved an uncertainty principle for matrices, related to Wigner-Yanase-Dyson information, and asked if a similar inequality could be proved in the von Neumann algebra setting. In this paper we prove such an uncertainty principle in the semifinite case.

  1. Molecular Response Theory in Terms of the Uncertainty Principle.

    PubMed

    Harde, Hermann; Grischkowsky, Daniel

    2015-08-27

    We investigate the time response of molecular transitions by observing the pulse reshaping of femtosecond THz-pulses propagating through polar vapors. By precisely modeling the pulse interaction with the molecular vapors, we derive detailed insight into this time response after an excitation. The measurements, which were performed by applying the powerful technique of THz time domain spectroscopy, are analyzed directly in the time domain or, in parallel, in the frequency domain by Fourier transforming the pulses and comparing them with the molecular response theory. New analyses of the molecular response allow a generalized unification of the basic collision and line-shape theories of Lorentz, van Vleck-Weisskopf, and Debye described by molecular response theory. In addition, they show that the applied THz experimental setup allows the direct observation of the ultimate time response of molecules to an externally applied electric field in the presence of molecular collisions. This response is limited by the uncertainty principle and is determined by the inverse splitting frequency between adjacent levels. At the same time, this response reflects the transition time of a rotational transition to switch from one molecular state to another or to form a coherent superposition of states oscillating with the splitting frequency. The presented investigations are also of fundamental importance for the description of the far-wing absorption of greenhouse gases like water vapor, carbon dioxide, or methane, which have a dominant influence on the radiative exchange in the far-infrared. PMID:26280761

  2. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We describe a simple idea for experimental verification of the uncertainty principle for light waves. We use single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtain the corresponding wave-number uncertainty. We assume that the uncertainty in position is the slit width. For the…

  3. Effect of uncertainty principle on the Wigner function-based simulation of quantum transport

    NASA Astrophysics Data System (ADS)

    Kim, Kyoung-Youm; Kim, Saehwa

    2015-09-01

    We investigate the effect of uncertainty principle on the simulation of quantum transport based on the Wigner function. We show that due to the positional uncertainty of electrons within the device, which bounds the region for nonlocal potential correlation, a constraint is imposed via the uncertainty principle on the possible momentum resolution of the Wigner function. It is numerically demonstrated that its violation deteriorates the simulation results significantly in configurations where the quantum effects are crucial.

  4. Violation of the Robertson-Schrödinger uncertainty principle and non-commutative quantum mechanics

    E-print Network

    Catarina Bastos; Orfeu Bertolami; Nuno Costa Dias; João Nuno Prata

    2012-11-26

    We show that a possible violation of the Robertson-Schrödinger uncertainty principle may signal the existence of a deformation of the Heisenberg-Weyl algebra. More precisely, we prove that any Gaussian in phase space (even if it violates the Robertson-Schrödinger uncertainty principle) is always a quantum state of an appropriate non-commutative extension of quantum mechanics. Conversely, all canonical non-commutative extensions of quantum mechanics display states that violate the Robertson-Schrödinger uncertainty principle.
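
    The inequality at issue is the Robertson-Schrödinger relation (standard form for observables $A$ and $B$ with finite second moments):

    $\sigma_{A}^{2}\,\sigma_{B}^{2} \geq \Big(\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\Big)^{2} + \Big(\tfrac{1}{2i}\langle[A,B]\rangle\Big)^{2},$

    which reduces to the Heisenberg relation when the anticommutator (covariance) term vanishes.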

  5. SOURCE ASSESSMENT: ANALYSIS OF UNCERTAINTY--PRINCIPLES AND APPLICATIONS

    EPA Science Inventory

    This report provides the results of a study that was conducted to analyze the uncertainties involved in the calculation of the decision parameters used in the Source Assessment Program and to determine the effect of these uncertainties on the decision-making procedure. A general ...

  6. A Computational Model of Limb Impedance Control Based on Principles of Internal Model Uncertainty 

    E-print Network

    Mitrovic, Djordje; Klanke, Stefan; Osu, Rieko; Kawato, Mitsuo; Vijayakumar, Sethu

    Well-known impedance control phenomena naturally emerge from the first principles of a stochastic optimization process that minimizes internal model prediction uncertainties, along with energy and accuracy demands. The insights from this computational model could...

  7. Entropy bound of local quantum field theory with generalized uncertainty principle

    E-print Network

    Yong-Wan Kim; Hyung Won Lee; Yun Soo Myung

    2009-02-25

    We study the entropy bound for local quantum field theory (LQFT) with a generalized uncertainty principle. The generalized uncertainty principle naturally provides a UV cutoff to the LQFT as a gravity effect. Imposing the non-gravitational collapse condition as the UV-IR relation, we find that the maximal entropy of a bosonic field is limited by the entropy bound $A^{3/4}$ rather than $A$, with $A$ the boundary area.
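
    The $A^{3/4}$ scaling follows from a Cohen-Kaplan-Nelson-style counting (a sketch in Planck units, added for context): a box of size $L$ with UV cutoff $\Lambda$ carries entropy $S \sim L^{3}\Lambda^{3}$, and the non-gravitational-collapse condition $E \sim L^{3}\Lambda^{4} \lesssim L$ forces $\Lambda \lesssim L^{-1/2}$, hence

    $S \lesssim L^{3}\,L^{-3/2} = L^{3/2} \sim A^{3/4}.$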

  8. Entropy of the Randall-Sundrum brane world with the generalized uncertainty principle

    SciTech Connect

    Kim, Wontae; Park, Young-Jai; Kim, Yong-Wan

    2006-11-15

    By introducing the generalized uncertainty principle, we calculate the entropy of the bulk scalar field on the Randall-Sundrum brane background without any cutoff. We obtain an entropy of the massive scalar field proportional to the horizon area. Here, we observe that a mass contribution to the entropy exists, in contrast to all previous results for the usual black hole cases with the generalized uncertainty principle.

  9. A Discussion on Heisenberg Uncertainty Principle in the Picture of Special Relativity

    E-print Network

    Luca Nanni

    2015-01-09

    In this note the formulation of the Heisenberg uncertainty principle (HUP) in the picture of special relativity is given. The inequality shows that the product of the uncertainties of quantum conjugate variables is greater than an amount that is no longer a constant but depends on the speed of the system on which the measurement is taken.

  10. Uncertainty Principle--Limited Experiments: Fact or Academic Pipe-Dream?

    ERIC Educational Resources Information Center

    Albergotti, J. Clifton

    1973-01-01

    The question of whether modern experiments are limited by the uncertainty principle or by the instruments used to perform the experiments is discussed. Several key experiments show that the instruments limit our knowledge and the principle remains of strictly academic concern. (DF)

  11. Gauss Linking Number and Electro-magnetic Uncertainty Principle

    E-print Network

    Abhay Ashtekar; Alejandro Corichi

    1997-01-24

    It is shown that there is a precise sense in which the Heisenberg uncertainty between fluxes of electric and magnetic fields through finite surfaces is given by (one-half $\hbar$ times) the Gauss linking number of the loops that bound these surfaces. To regularize the relevant operators, one is naturally led to assign a framing to each loop. The uncertainty between the fluxes of electric and magnetic fields through a single surface is then given by the self-linking number of the framed loop which bounds the surface.
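
    In display form, the statement of the abstract is (notation ours):

    $\Delta\Phi_{E}(S)\;\Delta\Phi_{B}(S') \geq \frac{\hbar}{2}\,\big|\,GL(\partial S,\partial S')\,\big|,$

    where $\Phi_{E}$ and $\Phi_{B}$ are the electric and magnetic fluxes through the surfaces $S$ and $S'$, and $GL$ is the Gauss linking number of their (framed) boundary loops.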

  12. Uncertainties.

    PubMed

    Dalla Chiara, Maria Luisa

    2010-09-01

    In contemporary science uncertainty is often represented as an intrinsic feature of natural and of human phenomena. As an example we need only think of two important conceptual revolutions that occurred in physics and logic during the first half of the twentieth century: (1) the discovery of Heisenberg's uncertainty principle in quantum mechanics; (2) the emergence of many-valued logical reasoning, which gave rise to so-called 'fuzzy thinking'. I discuss the possibility of applying the notions of uncertainty, developed in the framework of quantum mechanics, quantum information and fuzzy logics, to some problems of political and social sciences. PMID:19859828

  13. Path Integral for Dirac oscillator with generalized uncertainty principle

    SciTech Connect

    Benzair, H.; Boudjedaa, T.; Merad, M.

    2012-12-15

    The propagator for the Dirac oscillator in (1+1) dimensions, with a deformed commutation relation of the Heisenberg algebra, is calculated using the path integral in the quadri-momentum representation. As the mass is related to momentum, we adapt the space-time transformation method to evaluate quantum corrections; the latter depend on the point-discretization interval.

  14. Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics students' depictions

    NASA Astrophysics Data System (ADS)

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-12-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize students' descriptions of these key quantum concepts. Data for this study were obtained from semistructured in-depth interviews conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students' depictions of the concept of wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students' depictions of the concept of uncertainty can be described with four categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum-mechanical uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, a few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction, which cannot be explained within the framework of classical physics, to depict the wavelike properties of quantum entities. Despite the inhospitable conceptions of the uncertainty principle and the wave- and particlelike properties of quantum entities found in our investigation, the findings presented in this paper are highly consistent with those reported in previous studies. New findings and some implications for instruction and the curricula are discussed.

  15. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  16. Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

    NASA Technical Reports Server (NTRS)

    Terazawa, Hidezumi

    1996-01-01

    The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

  17. Experimental verification of the Heisenberg uncertainty principle for hot fullerene molecules

    E-print Network

    Olaf Nairz; Markus Arndt; Anton Zeilinger

    2001-05-14

    The Heisenberg uncertainty principle for material objects is an essential cornerstone of quantum mechanics and clearly visualizes the wave nature of matter. Here we report a demonstration of the Heisenberg uncertainty principle for the most massive, most complex and hottest single object so far, the fullerene molecule C70 at a temperature of 900 K. We find good quantitative agreement with the theoretical expectation: dx * dp = h, where dx is the width of the restricting slit, dp is the momentum transfer required to deflect the fullerene to the first interference minimum, and h is Planck's quantum of action.
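
    A minimal numeric illustration of the quoted relation dx * dp = h; the slit width and beam speed below are assumed round numbers for orientation, not the experiment's actual parameters:

      # Order-of-magnitude check of dx * dp = h for C70 far-field diffraction.
      # Slit width dx and beam speed v are illustrative assumptions.
      h = 6.626e-34            # Planck constant, J s
      amu = 1.661e-27          # atomic mass unit, kg

      m = 70 * 12 * amu        # C70 mass (70 carbon atoms)
      v = 200.0                # assumed longitudinal beam speed, m/s
      dx = 1.0e-6              # assumed slit width, m

      p = m * v                # forward momentum
      dp = h / dx              # transverse momentum transfer, from dx * dp = h
      theta = dp / p           # angle of the first interference minimum

      print(f"p = {p:.2e} kg m/s, dp = {dp:.2e} kg m/s, theta = {theta:.1e} rad")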

  18. D-Particles, D-Instantons, and A Space-Time Uncertainty Principle in String Theory

    E-print Network

    Yoneya, T

    1998-01-01

    The purpose of this talk is to review some considerations by the present author on the possible role of a simple space-time uncertainty relation toward nonperturbative string theory. We first motivate the space-time uncertainty relation as a simple space-time characterization of the fundamental string theory. We then argue that the relation captures some of the important aspects of the short-distance dynamics of D-particles described by the effective super Yang-Mills matrix quantum mechanics, and also that the recently proposed type IIB matrix model can be regarded as a possible realization of the space-time uncertainty principle.

  19. Spin Squeezing, Macrorealism and the Heisenberg uncertainty principle

    E-print Network

    Giuseppe Vitagliano

    2015-11-25

    The work is organized in two main topics. First we outline the relation between spin squeezing, quantum metrology and entanglement detection, with a particular focus on the last. We derive spin squeezing criteria for the detection of entanglement and its depth that outperform past approaches, especially for unpolarized states, recently produced in experiments and the object of increasing interest in the community. Furthermore, we extend the original definition of spin squeezed states by providing a new parameter intended to embrace different classes of states in a unified framework. Afterwards we consider a test of quantum principles in macroscopic objects originally designed by Leggett and Garg. In this case the scenario consists of a single party that is probed at different time instants, and the quantum effect detected is the violation of Macrorealism (MR), due to strong correlations in time, rather than in space, between non-compatible observables. We look at the problem of inconclusiveness of the LG tests arising from possible explanations of the results in terms of ``clumsy'' measurements, which has been termed the ``clumsiness loophole''. We first propose a scheme to test and possibly falsify MR in macroscopic ensembles of cold atoms based on alternating quantum non-demolition measurements and coherent, magnetically driven collective spin rotations. We then also propose a way to address the clumsiness loophole by introducing and computing an invasivity quantifier to add to the original LG expression. We provide numerical evidence that such a clumsiness-free test is feasible under state-of-the-art realistic experimental parameters and imperfections.

  20. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that ``The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value.`` Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
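
    A small sketch of the two combination models commonly associated with ANSI/ASME PTC 19.1-1985, where B is the systematic (bias) limit, S the random standard deviation of the mean, and t Student's t; the numerical values are placeholders:

      # ANSI/ASME PTC 19.1-style uncertainty combination (sketch).
      # B: systematic (bias) limit; S: random standard deviation of the mean;
      # t: Student's t for the sample degrees of freedom. Values are placeholders.
      import math

      B = 0.30                             # assumed bias limit, measurement units
      S = 0.12                             # assumed random std. dev. of the mean
      t = 2.0                              # ~95% coverage, large degrees of freedom

      U_add = B + t * S                    # additive model (U_ADD, ~99% coverage)
      U_rss = math.sqrt(B**2 + (t*S)**2)   # root-sum-square model (U_RSS, ~95%)

      print(f"U_ADD = {U_add:.3f}, U_RSS = {U_rss:.3f}")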

  2. Hydrogen Atom and Helium Ion Spatial and Momentum Distribution Functions Illustrate the Uncertainty Principle

    E-print Network

    Rioux, Frank

    Spatial and momentum distribution functions illustrate the uncertainty principle for one-electron species such as the hydrogen atom and the helium ion. The coordinate 1s wave functions for the hydrogen atom (Z=1) and the helium ion (Z=2) clearly illustrate the uncertainty principle. (Figure residue removed: radial distribution plot over r.)

  3. Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions

    ERIC Educational Resources Information Center

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-01-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

  4. The uncertainty principle does not entirely determine the non-locality of quantum theory

    E-print Network

    Ravishankar Ramanathan; Dardo Goyeneche; Piotr Mironowicz; Paweł Horodecki

    2015-06-16

    One of the most intriguing discoveries regarding quantum non-local correlations in recent years was the establishment of a direct correspondence between the quantum value of non-local games and the strength of fine-grained uncertainty relations in \textit{Science, vol. 330, no. 6007, 1072 (2010)}. It was shown that while the degree of non-locality in any theory is generally determined by a combination of two factors, the strength of the uncertainty principle and the degree of steering allowed in the theory, the most paradigmatic games in quantum theory have a degree of non-locality determined by the uncertainty principle alone. In this context, a fundamental question arises: is this a universal property of optimal quantum strategies for all non-local games? Indeed, the above-mentioned feature occurs in surprising situations, even when the optimal strategy for the game involves non-maximally entangled states. However, here we definitively prove that the answer to the question is negative, by presenting explicit counterexamples of non-local games, together with fully analytical optimal quantum strategies for them, where a definite trade-off between steering and uncertainty is absolutely necessary. We provide an intuitive explanation, in terms of the Hughston-Jozsa-Wootters theorem, for when the relationship between the uncertainty principle and the quantum game value breaks down.

  5. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    ERIC Educational Resources Information Center

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…

  6. Impacts of Generalized Uncertainty Principle on Black Hole Thermodynamics and Salecker-Wigner Inequalities

    E-print Network

    A. Tawfik

    2013-07-07

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely consumes the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole relative mass, and the difference between the black hole mass with and without GUP is not negligible.
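
    For orientation, a hedged sketch of the heuristic such GUP analyses rest on (conventions and prefactors vary between papers): the ordinary uncertainty relation with $\Delta x$ of order the Schwarzschild radius reproduces the Hawking temperature, and a GUP with linear and quadratic momentum terms cuts the evaporation off at a Planck-size remnant:

      \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}
      \Big(1 - 2\alpha\,\Delta p + 4\beta\,(\Delta p)^{2}\Big),
      \qquad
      \Delta x \simeq 2 r_s = \frac{4GM}{c^{2}}
      \;\Longrightarrow\;
      T_H \simeq \frac{\hbar c^{3}}{8\pi G M k_B}\big[1 + \mathcal{O}(M_P^{2}/M^{2})\big],

    with the higher-order terms driving the specific heat to zero at a finite remnant mass of order $M_P$.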

  7. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    SciTech Connect

    Tawfik, A.

    2013-07-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely consumes the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole relative mass, and the difference between the black hole mass with and without GUP is not negligible.

  8. Integrating Leonardo da Vinci's principles of demonstration, uncertainty, and cultivation in contemporary nursing education.

    PubMed

    Story, Lachel; Butts, Janie

    2014-03-01

    Nurses today are facing an ever-changing health care system. Stimulated by health care reform and limited resources, nursing education is being challenged to prepare nurses for this uncertain environment. Looking to the past can offer possible solutions to the issues nursing education is confronting. Seven principles of da Vincian thinking have been identified (Gelb, 2004). As a follow-up to an exploration of the curiosità principle (Butts & Story, 2013), this article explores the three principles of dimostrazione, sfumato, and corporalita. Nursing faculty can set the stage for a meaningful educational experience through these principles of demonstration (dimostrazione), uncertainty (sfumato), and cultivation (corporalita), preparing nurses not only to manage but to flourish in the current health care environment, which will enhance both the nurse's and the patient's experience. PMID:23830068

  9. Quantum dynamics of the Taub universe in a generalized uncertainty principle framework

    SciTech Connect

    Battisti, Marco Valerio; Montani, Giovanni

    2008-01-15

    The implications of a generalized uncertainty principle on the Taub cosmological model are investigated. The model is studied in the Arnowitt-Deser-Misner reduction of the dynamics, whereby a time variable is singled out. Such a variable is quantized in a canonical way, while the only physical degree of freedom of the system (related to the universe anisotropy) is quantized by means of a modified Heisenberg algebra. The analysis is performed at both the classical and quantum level. In particular, at the quantum level, the motion of wave packets is investigated. The two main results obtained are as follows: (i) The classical singularity is probabilistically suppressed. The universe exhibits a stationary behavior and the probability amplitude is peaked in a determinate region. (ii) The generalized uncertainty principle wave packets provide the right behavior in the establishment of a quasi-isotropic configuration for the universe.
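
    The "modified Heisenberg algebra" invoked in such minisuperspace studies is typically the Kempf-Mangano-Mann deformation (a hedged reminder; the paper's conventions may differ):

      [\hat q, \hat p] = i\hbar\,(1 + \beta \hat p^{2})
      \;\Longrightarrow\;
      \Delta q\,\Delta p \ge \frac{\hbar}{2}
      \Big(1 + \beta (\Delta p)^{2} + \beta \langle \hat p \rangle^{2}\Big),
      \qquad
      (\Delta q)_{\min} = \hbar\sqrt{\beta},

    which forbids arbitrarily sharp localization of the quantized degree of freedom.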

  10. The Quark-Gluon Plasma Equation of State and The Generalized Uncertainty Principle

    E-print Network

    L. I. AbouSalem; N. M. El Naggar; I. A. Elmashad

    2015-09-30

    The quark-gluon plasma (QGP) equation of state within a minimal length scenario, or Generalized Uncertainty Principle (GUP), is studied. The GUP is implemented in deriving the thermodynamics of an ideal QGP at vanishing chemical potential. We find a significant effect of the GUP term. The main features of lattice QCD results are quantitatively reproduced in the cases of $n_{f}=0$, $n_{f}=2$ and $n_{f}=2+1$ flavors for the energy density, the pressure and the interaction measure. A notable point is the large value of the bag pressure, especially in the case of $n_{f}=2+1$ flavors, which reflects the expected strong correlation between the quarks in the bag. One can notice that the asymptotic behavior, characterized by the Stefan-Boltzmann limit, is satisfied.

  11. An Effect of Quantum Gravity: Generalized Uncertainty Principle Removes The Chandrasekhar Limit

    E-print Network

    Rashidi, Reza

    2015-01-01

    The effect of a generalized uncertainty principle on the structure of an ideal white dwarf star is investigated. The equation describing the equilibrium configuration of the star is a generalized form of the Lane-Emden equation. It is proved that the star always has a finite size. It is then argued that the maximum mass of such an ideal white dwarf tends to infinity, as opposed to the conventional case where it has a finite value.

  12. Energy-time uncertainty principle and lower bounds on sojourn time

    E-print Network

    Joachim Asch; Olivier Bourget; Victor Cortes; Claudio Fernandez

    2015-07-23

    One manifestation of quantum resonances is a large sojourn time, or autocorrelation, of states which are initially localized. We elaborate on Lavine's time-energy uncertainty principle and give an estimate on the sojourn time. The bound involves Fermi's Golden Rule for the case of perturbed embedded eigenstates. Only very mild regularity is required. We illustrate the theory by applications to resonances for time-dependent and multistate systems.

  13. Energy distribution of massless particles on black hole backgrounds with generalized uncertainty principle

    SciTech Connect

    Li Zhongheng

    2009-10-15

    We derive new formulas for the spectral energy density and total energy density of massless particles in a general spherically symmetric static metric from a generalized uncertainty principle. Compared with blackbody radiation, the spectral energy density is strongly damped at high frequencies. For large values of r, the spectral energy density diminishes when r grows, but at the event horizon, the spectral energy density vanishes and therefore thermodynamic quantities near a black hole, calculated via the generalized uncertainty principle, do not require any cutoff parameter. We find that the total energy density can be expressed in terms of Hurwitz zeta functions. It should be noted that at large r (low local temperature), the difference between the total energy density and the Stefan-Boltzmann law is too small to be observed. However, as r approaches an event horizon, the effect of the generalized uncertainty principle becomes more and more important, which may be observable. As examples, the spectral energy densities in the background metric of a Schwarzschild black hole and of a Schwarzschild black hole plus quintessence are discussed. It is interesting to note that the maximum of the distribution shifts to higher frequencies when the quintessence equation of state parameter w decreases.

  14. The uncertainty principle enables non-classical dynamics in an interferometer.

    PubMed

    Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko

    2014-01-01

    The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics. PMID:25105741

  15. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE PAGESBeta

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.

  16. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    SciTech Connect

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
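
    A minimal sketch of the enrichment meter principle with a first-order GUM propagation; the linear model E = k*R (net 186 keV count rate proportional to U-235 enrichment for an "infinitely thick" item) is the textbook idealization the abstract seeks to improve on, and all numbers are placeholders:

      # Enrichment meter principle with first-order GUM propagation (sketch).
      # Model: enrichment E = k * R, with R the net 186 keV count rate from an
      # "infinitely thick" uranium item and k a calibration constant.
      # All numerical values are placeholders.
      import math

      k, u_k = 0.025, 0.0004    # calibration constant and standard uncertainty
      R, u_R = 180.0, 2.5       # net count rate (1/s) and standard uncertainty

      E = k * R                                     # estimated wt% U-235
      u_E = math.sqrt((R * u_k)**2 + (k * u_R)**2)  # first-order GUM combination

      print(f"E = {E:.2f} +/- {u_E:.2f} wt% U-235 (k=1)")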

  17. Remnant mass and entropy of black holes and modified uncertainty principle

    NASA Astrophysics Data System (ADS)

    Dutta, Abhijit; Gangopadhyay, Sunandan

    2014-06-01

    In this paper, we study the thermodynamics of black holes using a generalized uncertainty principle (GUP) with a correction term linear in the momentum uncertainty. The mass-temperature relation and heat capacity are calculated, from which the critical and remnant masses are obtained. The results are exact, and the two masses are found to be identical. The entropy expression gives the famous area theorem up to leading-order corrections from the GUP. In particular, the linear-order term in the GUP leads to a correction to the area theorem. Finally, the area theorem can be expressed in terms of a new variable, termed the reduced horizon area, only when the calculation is done to the next-higher-order correction from the GUP.

  18. Non-Equilibrium Fluctuation-Dissipation Inequality and Non-Equilibrium Uncertainty Principle

    E-print Network

    C. H. Fleming; B. L. Hu; Albert Roura

    2010-12-03

    The fluctuation-dissipation relation is usually formulated for a system interacting with a heat bath at finite temperature in the context of linear response theory, where only small deviations from the mean are considered. We show that for an open quantum system interacting with a non-equilibrium environment, where temperature is no longer a valid notion, a fluctuation-dissipation inequality exists. Clearly stated, quantum fluctuations are bounded below by quantum dissipation, whereas classically the fluctuations can be made to vanish. The lower bound of this inequality is exactly satisfied by (zero-temperature) quantum noise and is in accord with the Heisenberg uncertainty principle, both in its microscopic origins and its influence upon systems. Moreover, it is shown that the non-equilibrium fluctuation-dissipation relation determines the non-equilibrium uncertainty relation in the weak-damping limit.

  19. Modified uncertainty principle from the free expansion of a Bose-Einstein Condensate

    E-print Network

    Elías Castellanos; Celia Escamilla-Rivera

    2015-09-21

    We develop a theoretical and numerical analysis of the free expansion of a Bose-Einstein condensate, in which we assume that the single-particle energy spectrum is deformed due to a possible quantum structure of spacetime. We also consider the presence of interparticle interactions in order to study more realistic and specific scenarios. The modified free-expansion velocity of the condensate leads in a natural way to a modification of the uncertainty principle, which allows us to investigate possible features of the Planck-scale regime in low-energy earth-based experiments.

  20. The Generalized Uncertainty Principle in f(R) Gravity for a Charged Black Hole

    E-print Network

    Jackson Levi Said; Kristian Zarb Adami

    2011-03-20

    Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini black holes.

  1. Using the uncertainty principle to design simple interactions for targeted self-assembly

    E-print Network

    Erik Edlund; Oskar Lindgren; Martin Nilsson Jacobi

    2012-11-23

    We present a method that systematically simplifies isotropic interactions designed for targeted self-assembly. The uncertainty principle is used to show that an optimal simplification is achieved by a combination of heat kernel smoothing and Gaussian screening. We use this method to design isotropic interactions for self-assembly of complex lattices and of materials with functional properties. The interactions we derive are significantly simpler than those previously published, and it is realistic to discuss explicit experimental implementation of the designed self-assembling components.
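
    A hedged numpy sketch of the two operations named in the abstract, applied to a stand-in tabulated potential V(r): heat-kernel smoothing is a Gaussian multiplier on the Fourier side, and Gaussian screening is a Gaussian multiplier in real space (a 1D transform is used here for brevity in place of the proper 3D radial one; the widths are free design parameters):

      # Heat-kernel smoothing + Gaussian screening of a tabulated pair
      # potential (sketch). sigma_k and sigma_r are illustrative widths.
      import numpy as np

      r = np.linspace(0.0, 20.0, 4096)            # radial grid
      V = np.cos(2.5 * r) * np.exp(-0.1 * r)      # stand-in designed potential

      dr = r[1] - r[0]
      kk = 2 * np.pi * np.fft.rfftfreq(r.size, d=dr)   # angular wavenumbers

      sigma_k = 0.5         # heat-kernel smoothing width (k-space)
      sigma_r = 8.0         # Gaussian screening width (r-space)

      V_hat = np.fft.rfft(V)
      V_smooth = np.fft.irfft(V_hat * np.exp(-0.5 * (kk * sigma_k)**2), n=r.size)
      V_simple = V_smooth * np.exp(-0.5 * (r / sigma_r)**2)

      print("max |V - V_simple| =", float(np.abs(V - V_simple).max()))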

  2. Generalized uncertainty principle in f(R) gravity for a charged black hole

    SciTech Connect

    Said, Jackson Levi; Adami, Kristian Zarb

    2011-02-15

    Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

  3. Uncertainty Principle for Control of Ensembles of Oscillators Driven by Common Noise

    E-print Network

    Denis S. Goldobin

    2014-04-28

    We discuss control techniques for noisy self-sustained oscillators with a focus on reliability, stability of the response to noisy driving, and oscillation coherence understood in the sense of constancy of oscillation frequency. For any kind of linear feedback control--single and multiple delay feedback, linear frequency filter, etc.--the phase diffusion constant, quantifying coherence, and the Lyapunov exponent, quantifying reliability, can be efficiently controlled but their ratio remains constant. Thus, an "uncertainty principle" can be formulated: the loss of reliability occurs when coherence is enhanced and, vice versa, coherence is weakened when reliability is enhanced. Treatment of this principle for ensembles of oscillators synchronized by common noise or global coupling reveals a substantial difference between the cases of slightly non-identical oscillators and identical ones with intrinsic noise.

  4. Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

    SciTech Connect

    Tallacchini, Mariachiara . E-mail: mariachiara.tallacchini@unimi.it

    2005-09-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of a certain and objective science to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

  5. Covariant energy–momentum and an uncertainty principle for general relativity

    SciTech Connect

    Cooperstock, F.I.; Dupre, M.J.

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions, leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. -- Highlights: •We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. •Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum. •Localized energy via the Ricci integral is consistent with the energy localization hypothesis. •New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. •Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in strong gravity extreme.

  6. Space-Time Uncertainty Principle and Conformal Symmetry in D-Particle Dynamics

    E-print Network

    Jevicki, Antal; Yoneya, Tamiaki

    1998-01-01

    Motivated by the space-time uncertainty principle, we establish a conformal symmetry in the dynamics of D-particles. The conformal symmetry, combined with the supersymmetric non-renormalization theorem, uniquely determines the classical form of the effective action for a probe D-particle in the background of a heavy D-particle source, previously constructed by Becker-Becker-Polchinski-Tseytlin. Our results strengthen the conjecture proposed by Maldacena on the correspondence, in the case of D-particles, between the supergravity and the supersymmetric Yang-Mills matrix models in the large $N$-limit, the latter being the boundary conformal field theory of the former in the classical D-particle background in the near horizon limit.

  7. Linear and nonlinear response functions of the Morse oscillator: Classical divergence and the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Wu, Jianlan; Cao, Jianshu

    2001-09-01

    The algebraic structure of the quantum Morse oscillator is explored to formulate the coherent state, the phase-space representations of the annihilation and creation operators, and their classical limits. The formulation allows us to calculate the linear and nonlinear quantum response functions for microcanonical Morse systems and to demonstrate the linear divergence in the corresponding classical response function. On the basis of the uncertainty principle, the classical divergence is removed by phase-space averaging around the microcanonical energy surface. For the Morse oscillator, the classical response function averaged over quantized phase space agrees exactly with the quantum response function for a given eigenstate. Thus, phase-space averaging and quantization provide a useful way to establish the classical-quantum correspondence of anharmonic systems.
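
    For reference, the Morse spectrum underlying the discussion, with $D$ the dissociation energy and $\omega_0$ the harmonic frequency (a standard result quoted for orientation):

      E_n = \hbar\omega_0\Big(n+\tfrac{1}{2}\Big)
            - \frac{(\hbar\omega_0)^{2}}{4D}\Big(n+\tfrac{1}{2}\Big)^{2},
      \qquad n = 0, 1, \dots

    The anharmonicity makes the classical oscillation frequency energy-dependent, which is what produces the linear-in-time divergence of the classical response function that the phase-space averaging removes.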

  8. Corrections to entropy and thermodynamics of charged black hole using generalized uncertainty principle

    E-print Network

    Abdel Nasser Tawfik; Eiman Abou El Dahab

    2015-02-19

    Recently, much attention has been devoted to resolving the quantum corrections to the Bekenstein-Hawking (black hole) entropy, which relates the entropy to the cross-sectional area of the black hole horizon. Using the generalized uncertainty principle (GUP), corrections to the geometric entropy and thermodynamics of black holes are introduced. The impact of the GUP on the entropy near the horizon of three types of black holes, Schwarzschild, Garfinkle-Horowitz-Strominger and Reissner-Nordström, is determined. It is found that the logarithmic divergence in the entropy-area relation turns out to be positive. The entropy $S$, which is assumed to be related to the horizon's two-dimensional area, acquires additional terms, for instance $2\, \sqrt{\pi}\, \alpha\, \sqrt{S}$, where $\alpha$ is the GUP parameter.
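
    In the abstract's own notation, with $S = A/4$ the horizon entropy in Planck units, the quoted correction reads (higher orders hedged away):

      S_{GUP} \;=\; S + 2\sqrt{\pi}\,\alpha\,\sqrt{S} + \mathcal{O}(\alpha^{2})
      \;=\; \frac{A}{4} + \alpha\sqrt{\pi A} + \mathcal{O}(\alpha^{2}).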

  9. Constraints on the Generalized Uncertainty Principle from black-hole thermodynamics

    NASA Astrophysics Data System (ADS)

    Gangopadhyay, Sunandan; Dutta, Abhijit; Faizal, Mir

    2015-10-01

    In this paper, we calculate the modification to the thermodynamics of a Schwarzschild black hole in higher dimensions due to the Generalized Uncertainty Principle (GUP). We use the fact that the leading-order corrections to the entropy of a black hole have to be logarithmic in nature to restrict the form of the GUP. We observe that in six dimensions, the usual GUP produces the correct form for the leading-order corrections to the entropy of a black hole. However, in five and seven dimensions a linear GUP, which is obtained by a combination of DSR with the usual GUP, is needed to produce the correct form of the corrections to the entropy of a black hole. Finally, we demonstrate that in five dimensions, a new form of GUP containing quadratic and cubic powers of the momentum also produces the correct form for the leading-order corrections to the entropy of a black hole.

  10. Planck's uncertainty principle and the saturation of Lorentz boosts by Planckian black holes

    E-print Network

    A. Aurilia; E. Spallucci

    2013-09-27

    A basic inconsistency arises when the Theory of Special Relativity meets with quantum phenomena at the Planck scale. Specifically, the Planck length is Lorentz invariant and should not be affected by a Lorentz boost. We argue that Planckian relativity must necessarily involve the effect of black hole formation. Recent proposals for resolving the noted inconsistency seem unsatisfactory in that they ignore the crucial role of gravity in the saturation of Lorentz boosts. Furthermore, an invariant length at the Planck scale amounts to a universal quantum of resolution in the fabric of spacetime. We argue, therefore, that the universal Planck length requires an extension of the Uncertainty Principle as well. Thus, the noted inconsistency lies at the core of Quantum Gravity. In this essay we reflect on a possible resolution of these outstanding problems.

  11. Principle and Uncertainty Quantification of an Experiment Designed to Infer Actinide Neutron Capture Cross-Sections

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatorre; G. Imel; R. Pardo; F. Kondev; M. Paul

    2010-01-01

    An integral reactor physics experiment devoted to inferring higher actinide (Am, Cm, Bk, Cf) neutron cross-sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties related to the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amounts of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross-sections to be inferred, since the relation between the two is given by the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectrometry (AMS) facility located at ANL. While AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes, even to A > 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectrometry techniques, more transmutation products can be measured and, potentially, more cross-sections can be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
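
    A toy sketch of the inference step described here: for a single capture chain A -> B at constant flux, the transmutation equations have a closed-form solution, and the energy-integrated capture cross-section of A can be recovered from the measured post-irradiation atom ratio. Flux, time, cross-sections and the measured ratio are invented for illustration, not MANTRA values:

      # Toy inference of an energy-integrated capture cross-section from the
      # post-irradiation atom ratio N_B/N_A of a chain A -> B:
      #   dN_A/dt = -sig_A*phi*N_A,  dN_B/dt = sig_A*phi*N_A - sig_B*phi*N_B.
      # All numbers are illustrative placeholders.
      import math
      from scipy.optimize import brentq

      phi = 1.0e14 * 1.0e-24   # flux, n/(barn s)  (1e14 n/cm^2/s)
      t = 3.0e7                # irradiation time, s (~1 year)
      sig_B = 50.0             # assumed known cross-section of product B, barn

      def ratio(sig_A):
          """Analytic N_B/N_A at time t for the two-nuclide chain."""
          a, b = sig_A * phi, sig_B * phi
          return a / (b - a) * (1.0 - math.exp((a - b) * t))

      measured = 0.012         # assumed AMS-measured atom ratio N_B/N_A
      sig_A = brentq(lambda s: ratio(s) - measured, 1e-6, sig_B - 1e-6)
      print(f"inferred capture cross-section of A: {sig_A:.2f} barn")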

  12. f(R)-modified gravity, Wald entropy, and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Hammad, Fayçal

    2015-08-01

    Wald's entropy formula allows one to find the entropy of black holes' event horizons within any diffeomorphism-invariant theory of gravity. When applied to general relativity, the formula yields the Bekenstein-Hawking result but, for any other gravitational action that departs from the Hilbert action, the resulting entropy acquires an additional multiplicative factor that depends on the global geometry of the background spacetime. On the other hand, the generalized uncertainty principle (GUP) has recently been used extensively to investigate corrections to the Bekenstein-Hawking entropy formula, with the conclusion that the latter always comes multiplied by a factor that depends on the area of the event horizon. We show, by considering the case of an f(R)-modified gravity, that the usual black hole entropy derivation based on the GUP might be modified in such a way that the two methods yield the same corrections to the Bekenstein-Hawking formula. The procedure turns out to be an interesting method for seeking modified gravity theories. Two different versions of the GUP are used, and it is found that only one of them yields a viable modified gravity model. Conversely, it is possible to find a general formulation of the GUP that would reproduce the Wald entropy formula for any f(R) theory of gravity.

  13. Kratzer's molecular potential in quantum mechanics with a generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bouaziz, Djamil

    2015-04-01

    The Kratzer potential $V(r) = g_1/r^2 - g_2/r$ is studied in quantum mechanics with a generalized uncertainty principle, which includes a minimal length $(\Delta X)_{\min} = \hbar\sqrt{5\beta}$. In momentum representation, the Schrödinger equation is a generalized Heun's differential equation, which reduces to a hypergeometric and to a Heun's equation in special cases. We explicitly show that the presence of this finite length regularizes the potential in the range of the coupling constant $g_1$ where the corresponding Hamiltonian is not self-adjoint. In coordinate space, we perturbatively derive an analytical expression for the bound-state spectrum to first order in the deformation parameter $\beta$. We qualitatively discuss the effect of the minimal length on the vibration-rotation energy levels of diatomic molecules through the Kratzer interaction. By comparison with an experimental result for the hydrogen molecule, an upper bound for the minimal length is found to be about 0.01 Å. We argue that the minimal length would have some physical importance in studying the spectra of such systems.
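
    The deformed algebra such studies typically assume is the Kempf form with two deformation parameters; the particular identification reproducing the quoted minimal length is our reading, not stated in the record:

      [\hat X_i, \hat P_j] = i\hbar\big[(1+\beta \hat P^{2})\,\delta_{ij}
                             + \beta' \hat P_i \hat P_j\big],
      \qquad
      (\Delta X)_{\min} = \hbar\sqrt{5\beta}
      \quad (\text{e.g. for } n = 3,\ \beta' = 2\beta).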

  14. f(R) in Holographic and Agegraphic Dark Energy Models and the Generalized Uncertainty Principle

    E-print Network

    Barun Majumder

    2013-07-16

    We study a unified approach to the holographic, new agegraphic and $f(R)$ dark energy models to construct the form of $f(R)$ that is in general responsible for the curvature-driven explanation of very early inflation along with the presently observed late-time acceleration. We consider the generalized uncertainty principle in our approach, which incorporates corrections in the entropy-area relation and thereby modifies the energy densities for the cosmological dark energy models considered. We find that holographic and new agegraphic $f(R)$ gravity models can behave like phantom or quintessence models in the spatially flat FRW universe. We also find a distinct term in the form of $f(R)$ which goes as $R^{\frac{3}{2}}$ due to the GUP-modified energy densities. Although the presence of this term in the action can be important in explaining the early inflationary scenario, Capozziello {\it et al.} recently showed that $f(R) \sim R^{\frac{3}{2}}$ leads to an accelerated expansion, {\it i.e.}, a negative value for the deceleration parameter $q$, which fits well with SNeIa and WMAP data.

  15. Revisiting the Calculation of I/V Profiles in Molecular Junctions Using the Uncertainty Principle.

    PubMed

    Ramos-Berdullas, Nicolás; Mandado, Marcos

    2014-04-17

    Ortiz and Seminario (J. Chem. Phys. 2007, 127, 111106/1-3) proposed some years ago a simple and direct approach to obtain I/V profiles from the combination of ab initio equilibrium electronic-structure calculations and the uncertainty principle, as an alternative or complement to more sophisticated nonequilibrium Green's function methods. In this work, we revisit the fundamentals of this approach and reformulate the expression for the electric current accordingly. By analogy with the spontaneous decay process in electron transitions, in our revision the current is calculated from the relaxation process from the "polarized" state induced by the external electric field to the electronic ground state. The electric current is obtained from the total charge transferred through the molecule and the corresponding electronic energy relaxation. The proposed electric-current expression is more general than the previous expression employed by Ortiz and Seminario, where the charge variation must be tested among different slabs of atoms at the contact. This new approach has been tested on benzene-1,4-dithiolate attached to different gold clusters that represent the contact with the electrodes. Analysis of the total electron deformation density induced by the external electric voltage, and of properties associated with the electron deformation orbitals, supports the conclusions obtained from the I/V profiles. PMID:24689867
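
    A heavily hedged sketch of how we read the current estimate: the charge dQ transferred through the molecule divided by a relaxation time obtained from the time-energy uncertainty, dt = hbar/dE. Both inputs would come from ab initio calculations at each applied voltage and are placeholders here:

      # Uncertainty-principle estimate of a molecular-junction current
      # (sketch of our reading of the approach; inputs are placeholders that
      # would come from ab initio calculations at each bias voltage).
      hbar = 1.054571817e-34   # J s
      e = 1.602176634e-19      # elementary charge, C (also 1 eV in J)

      dQ = 0.05 * e            # assumed charge transferred through the molecule, C
      dE = 0.30 * e            # assumed electronic energy relaxation, J (0.30 eV)

      dt = hbar / dE           # relaxation time from time-energy uncertainty
      I = dQ / dt              # current estimate, A
      print(f"dt = {dt:.2e} s, I = {I*1e6:.2f} microA")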

  16. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle rather than the usual uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon, without any cutoff and without any constraint on the bulk's configuration. The system's density of states and free energy are convergent in the neighborhood of the horizon. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position uncertainty $\Delta x$, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  17. A Dark Energy Model with Generalized Uncertainty Principle in the Emergent, Intermediate and Logamediate Scenarios of the Universe

    E-print Network

    Rahul Ghosh; Surajit Chattopadhyay; Ujjal Debnath

    2011-10-22

    This work is motivated by the work of Kim et al (2008), which considered the equation of state parameter for the new agegraphic dark energy based on the generalized uncertainty principle, coexisting with dark matter without interaction. In this work, we have considered the same dark energy interacting with dark matter in the emergent, intermediate and logamediate scenarios of the universe. Also, we have investigated the statefinder, jerk and lerk parameters in all three scenarios under this interaction. The energy density and pressure for the new agegraphic dark energy based on the generalized uncertainty principle have been calculated and their behaviors investigated. The evolution of the equation of state parameter has been analyzed in the interacting and non-interacting situations in all three scenarios. The graphical analysis shows that the dark energy behaves like quintessence for logamediate expansion and like phantom for emergent and intermediate expansions of the universe.

  18. Corrected Bekenstein-Hawking entropy of warped AdS3 rotating black hole with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Mahanta, Chandra Rekha; Misra, Rajesh

    2015-08-01

    According to the Generalized Uncertainty Principle (GUP), there should be a minimal black hole whose size is comparable to the minimal length, so that it cannot evaporate completely through thermal radiation. Moreover, the black hole is not allowed to have a mass less than a scale of order the Planck mass, which suggests a black hole remnant. We study the warped AdS3 rotating black hole and calculate the entropy, heat capacity and critical mass with the help of the GUP. We compute the area theorem with the GUP correction.

  19. The effect of generalized uncertainty principle on square well, a case study

    SciTech Connect

    Ma, Meng-Sen; Zhao, Ren

    2014-08-15

    According to a special case of the generalized uncertainty relation (with one deformation parameter set to zero), we derive the energy eigenvalues of the infinite potential well. It is shown that the obtained energy levels differ from the usual result by correction terms, and these correction terms are independent of all parameters except the remaining deformation parameter. The eigenstates, however, will depend on another two parameters besides it.

  20. Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators

    SciTech Connect

    Marchiolli, M.A.; Mendonça, P.E.M.F.

    2013-09-15

    We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. -- Highlights: •Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators. •Determination of new restrictions upon the selective process of signals and wavelet bases. •Demonstration of looser bounds interpolating between the tightest bound and the Massar–Spindel inequality. •Construction of finite ground states properly describing the tightest bound. •Establishment of an important connection with the discrete Weyl function.

  1. The Multi-Dimensional Hardy Uncertainty Principle and its Interpretation in Terms of the

    E-print Network

    Feichtinger, Hans Georg

    …integrable function and its Fourier transform to the n-dimensional case using a symplectic diagonalization … is of compact support: in this case the Fourier transform F can be extended into an entire function … (Phragmén–Lindelöf principle), that if $\psi \in L^2(\mathbb{R})$ and its Fourier transform $F\psi(p) = \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty} e^{-\frac{i}{\hbar}px}\,\psi(x)\,dx$ satisfy, for $|x| \dots$

  2. Maximally localized states and quantum corrections of black hole thermodynamics in the framework of a new generalized uncertainty principle

    E-print Network

    Yan-Gang Miao; Ying-Jie Zhao; Shao-Jun Zhang

    2015-09-22

    As a generalized uncertainty principle (GUP) leads to the effects of the minimal length of the order of the Planck scale and UV/IR mixing, some significant physical concepts and quantities are modified or corrected correspondingly. On the one hand, we derive the maximally localized states, i.e. the physical states displaying the minimal length uncertainty associated with a new GUP proposed in our previous work. On the other hand, in the framework of this new GUP we calculate quantum corrections to the thermodynamic quantities of the Schwarzschild black hole, such as the Hawking temperature, the entropy, and the heat capacity, and give a remnant mass of the black hole at the end of the evaporation process. Moreover, we compare our results with those obtained in the frameworks of several other GUPs. In particular, we observe a significant difference between the situations with and without the consideration of the UV/IR mixing effect in the quantum corrections to the evaporation rate and the decay time. That is, the decay time can be greatly prolonged in the former case, which implies that the quantum correction from the UV/IR mixing effect may give rise to a radical rather than a tiny influence on the Hawking radiation.

  3. Chaos and the way of Zen: psychiatric nursing and the 'uncertainty principle'.

    PubMed

    Barker, P J

    1996-08-01

    The biological sciences have been dominated by 'classicist' science, predicated on the post-Enlightenment belief that a real world exists which behaves according to notions of causality and consistency. Although medicine, and by implication psychiatric nursing, derives its explanatory power from such a science, much of its focus (illness) is not amenable to causal explanation or prediction. The theoretical developments of the 'new physics' have been used to redefine science and, as a result, have challenged traditional constructions of reality. The new physics is usually framed in terms of the physical world, or used to construe consciousness. In this paper I consider the implications of chaos, a relative of the new physics, for psychiatric nursing practice. As nursing appears to crave a 'certainty principle' to govern the theoretical underpinnings of practice, this study considers how chaos might contribute to a metaparadigm of nursing. PMID:8997984

  4. Causal Wave Mechanics and the Advent of Complexity. II. Dynamic uncertainty in quantum systems and the correspondence principle

    E-print Network

    Andrei P. Kirilyuk

    1999-03-25

    The intrinsic multivaluedness of interaction process, revealed in Part I of this series of papers, is interpreted as the origin of the true dynamical (in particular, quantum) chaos. The latter is causally deduced as unceasing series of transitions, dynamically probabilistic by their origin, between the equally real, but incompatible 'realisations' (modes of interaction) of a system. The obtained set of realisations form the causally derived, intrinsically complete "space of events" providing the crucial extension of the notion of probability and the method of its first-principle calculation. The fundamental dynamic uncertainty thus revealed is specified for Hamiltonian quantum systems and applied to quantum chaos description in periodically perturbed systems. The ordinary semiclassical transition in our quantum-mechanical results leads to exact reproduction of the main features of chaotic behaviour of the system known from classical mechanics, which permits one to "re-establish" the correspondence principle for chaotic systems (inevitably lost in any their conventional, single-valued description). The causal dynamical randomness in the extended quantum mechanics is not restricted, however, to semiclassical conditions and generically occurs also in essentially quantum regimes, even though partial "quantum suppression of chaos" does exist and is specified in our description, as well as other particular types of the quantum (truly) chaotic behaviour.

  5. Our Electron Model vindicates Schrödinger's Incomplete Results and Requires Restatement of Heisenberg's Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    McLeod, David; McLeod, Roger

    2008-04-01

    The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations (SFTs) of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics (QM). The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between the space variable x and the spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy (physiology of hearing, 1961) performed pressure-input Rect x experiments that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves the sign sense of EMF vectors, and is hard-wired as an inverse SFT. These results require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

  6. Femtoscopic scales in $p+p$ and $p+$Pb collisions in view of the uncertainty principle

    E-print Network

    V. M. Shapoval; P. Braun-Munzinger; Iu. A. Karpenko; Yu. M. Sinyukov

    2013-07-26

    A method for quantum corrections of Hanbury Brown-Twiss (HBT) interferometric radii produced by semi-classical event generators is proposed. These corrections account for the basic indistinguishability and mutual coherence of closely located emitters caused by the uncertainty principle. A detailed analysis is presented for pion interferometry in $p+p$ collisions at LHC energy ($\sqrt{s}=7$ TeV). A prediction is also presented for pion interferometric radii in $p+$Pb collisions at $\sqrt{s}=5.02$ TeV. The hydrodynamic/hydrokinetic model with the UrQMD cascade as 'afterburner' is utilized for this aim. It is found that quantum corrections to the interferometry radii significantly improve the event generator results, which typically overestimate the experimental radii of small systems. A successful description of the interferometry structure of $p+p$ collisions within the corrected hydrodynamic model requires studying the problem of the thermalization mechanism, still a fundamental issue for ultrarelativistic $A+A$ collisions, and also for high-multiplicity $p+p$ and $p+$Pb events.

  7. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    NASA Technical Reports Server (NTRS)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, $\delta t$. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width $\delta E$, it is observed that the other member's wave packet collapses upon coincidence detection to a duration $\delta t$, such that $\delta E \, \delta t \approx \hbar$, where this duration $\delta t$ is an inner time, in the sense of Aharonov and Bohm. We have measured $\delta t$ by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
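
    A back-of-the-envelope companion to the reported relation: an energy filter of width $\delta E$ implies a collapsed wave-packet duration $\delta t \approx \hbar/\delta E$. The 1 meV bandwidth below is an invented value, not one taken from the experiment.

    ```python
    # Delta_E * Delta_t ~ hbar for a filter-induced collapse; 1 meV is assumed.
    HBAR = 1.054571817e-34      # J s
    EV = 1.602176634e-19        # J per eV

    delta_E = 1e-3 * EV         # hypothetical filter width: 1 meV, in joules
    delta_t = HBAR / delta_E    # implied wave-packet duration, in seconds
    print(f"Delta_t ~ {delta_t:.2e} s")   # ~ 6.6e-13 s, i.e. sub-picosecond
    ```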

  8. On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Lapshin, Aleksey

    2013-04-01

    The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT, due to the action of Heisenberg's uncertainty principle, exhibits weak spatial localization, which manifests in the following: firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (window), is used for this purpose. A narrow enough window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, this results in a significant uncertainty in frequency. If one chooses a wide enough window, the same principle implies an increased time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to detect the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at the current moment of time: it is possible to speak only about a range of frequencies. Likewise, it is impossible to specify precisely the time moment of the presence of this or that frequency: it is possible to speak only about a time frame. It is this feature that imposes major constraints on the applicability of the STFT. In spite of the fact that the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, there is a possibility to analyze any signal using an alternative approach: the multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis. Thanks to it, low frequencies can be resolved in more detail with respect to frequency, and high ones with respect to time. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3-D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the application of the wavelet transform to calculate the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
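
    The window tradeoff described above can be checked numerically. The sketch below (an illustration, not code from the paper) computes the temporal and spectral spreads of a Gaussian STFT window for several widths s; their product stays pinned at the Heisenberg limit of 1/2, so narrowing the window in time necessarily broadens it in frequency.

    ```python
    import numpy as np

    # Time-frequency tradeoff for a Gaussian STFT window: the product of the
    # temporal spread sigma_t and the spectral spread sigma_w stays at the
    # Heisenberg limit 1/2 no matter how the window width s is chosen.

    def spreads(s, n=2**14, T=200.0):
        t = np.linspace(-T / 2, T / 2, n)
        dt = t[1] - t[0]
        w = np.exp(-t**2 / (2 * s**2))               # Gaussian window of width s
        pt = w**2 / (np.sum(w**2) * dt)              # normalized energy density in time
        sigma_t = np.sqrt(np.sum(t**2 * pt) * dt)
        W = np.fft.fftshift(np.fft.fft(w))
        omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, dt))
        dw = omega[1] - omega[0]
        pw = np.abs(W)**2 / (np.sum(np.abs(W)**2) * dw)  # energy density in frequency
        sigma_w = np.sqrt(np.sum(omega**2 * pw) * dw)
        return sigma_t, sigma_w

    for s in (0.5, 1.0, 2.0):
        st, sw = spreads(s)
        print(f"s={s}: sigma_t={st:.3f}, sigma_w={sw:.3f}, product={st*sw:.3f}")
    # narrowing the window in time (small s) broadens it in frequency, and vice versa
    ```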

  9. The special theory of Brownian relativity: equivalence principle for dynamic and static random paths and uncertainty relation for diffusion.

    PubMed

    Mezzasalma, Stefano A

    2007-03-15

    The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected. PMID:17223124

  10. A device-independent test of an entropic uncertainty relation

    E-print Network

    Vallette, Bruno

    "…it is not the conclusion that is wrong but the premise." (Werner Heisenberg, on an implication of the uncertainty principle.) Traditional version of the uncertainty principle: the Robertson–Schrödinger relation, $\sigma_{O_X}\,\sigma_{O_Z} \ge \frac{1}{2}\,|\langle [O_Z, O_X] \rangle|$.

  11. Universal Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Gour, Gilad

    2014-03-01

    Uncertainty relations are a distinctive characteristic of quantum theory that imposes intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring non-commuting observables. However, I will show here that there is no fundamental reason for using entropies as quantifiers; in fact, any functional relation that characterizes the uncertainty of the measurement outcomes can be used to define an uncertainty relation. Starting from a simple assumption that any measure of uncertainty is non-decreasing under mere relabeling of the measurement outcomes, I will show that Schur-concave functions are the most general uncertainty quantifiers. I will then introduce a novel fine-grained uncertainty relation written in terms of a majorization relation, which generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary measures of uncertainty. This infinite family of uncertainty relations includes all the known entropic uncertainty relations, but is not limited to them. In this sense, the relation is universally valid and captures the essence of the uncertainty principle in quantum theory. This talk is based on a joint work with Shmuel Friedland and Vlad Gheorghiu. This research is supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada and by the Pacific Institute for Mathematical Sciences (PIMS).

  12. Pauli effects in uncertainty relations

    E-print Network

    Toranzo, I V; Esquivel, R O; Dehesa, J S

    2014-01-01

    In this letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

  13. Pauli effects in uncertainty relations

    NASA Astrophysics Data System (ADS)

    Toranzo, I. V.; Sánchez-Moreno, P.; Esquivel, R. O.; Dehesa, J. S.

    2014-10-01

    In this Letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

  14. Pauli effects in uncertainty relations

    E-print Network

    I. V. Toranzo; P. Sánchez-Moreno; R. O. Esquivel; J. S. Dehesa

    2014-09-03

    In this letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

  15. Two new kinds of uncertainty relations

    NASA Technical Reports Server (NTRS)

    Uffink, Jos

    1994-01-01

    We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.

  16. Variance-based uncertainty relations

    E-print Network

    Yichen Huang

    2010-12-14

    It is hard to overestimate the fundamental importance of uncertainty relations in quantum mechanics. In this work, I propose state-independent variance-based uncertainty relations for arbitrary observables in both finite and infinite dimensional spaces. We recover the Heisenberg uncertainty principle as a special case. By studying examples, we find that the lower bounds provided by our new uncertainty relations are optimal or near-optimal. I illustrate the uses of our new uncertainty relations by showing that they eliminate one common obstacle in a sequence of well-known works in entanglement detection, and thus make these works much easier to access in applications.
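
    For context, the textbook Robertson bound that such works strengthen, $\sigma_A \sigma_B \ge \frac{1}{2}|\langle [A,B] \rangle|$, can be verified numerically. The snippet below (an illustration, not the paper's new state-independent relations) checks it for the Pauli observables X and Y on random qubit states.

    ```python
    import numpy as np

    # Robertson bound sigma_A * sigma_B >= |<[A,B]>| / 2, checked for the Pauli
    # observables X and Y on random pure qubit states.

    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    comm = X @ Y - Y @ X                         # [X, Y] = 2i Z

    def expval(psi, op):
        return np.conj(psi) @ op @ psi           # <psi|op|psi>

    rng = np.random.default_rng(0)
    for _ in range(5):
        psi = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi /= np.linalg.norm(psi)
        var_x = expval(psi, X @ X).real - expval(psi, X).real**2
        var_y = expval(psi, Y @ Y).real - expval(psi, Y).real**2
        lhs = np.sqrt(var_x * var_y)
        rhs = 0.5 * abs(expval(psi, comm))
        print(f"sigma_X*sigma_Y = {lhs:.4f} >= {rhs:.4f}: {lhs >= rhs - 1e-12}")
    ```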

  17. Entropic uncertainty relations for multiple measurements

    NASA Astrophysics Data System (ADS)

    Liu, Shang; Mu, Liang-Zhu; Fan, Heng

    2015-04-01

    We present the entropic uncertainty relations for multiple measurement settings which demonstrate the uncertainty principle of quantum mechanics. Those uncertainty relations are obtained for both cases with and without the presence of quantum memory, and can be proven by a unified method. Our results recover some well known entropic uncertainty relations for two observables, which show the uncertainties about the outcomes of two incompatible measurements. The bounds of those relations which quantify the extent of the uncertainty take concise forms and are easy to calculate. Those uncertainty relations might play important roles in the foundations of quantum theory. Potential experimental demonstration of those entropic uncertainty relations is discussed.

  18. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  19. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  20. Uncertainties in Long-Term Geologic Offset Rates of Faults: General Principles Illustrated With Data From California and Other Western States

    NASA Astrophysics Data System (ADS)

    Bird, P.

    2006-12-01

    Because the slip rates of seismic faults are highly variable, a better target for statistical estimation is the long-term offset rate, which can be defined as the rate of one component of the slip which would be measured between any two times when fault-plane shear tractions are equal. The probability density function for the sum of elastic offset plus fault slip offset since a particular geologic event includes uncertainties associated with changes in elastic strain between that event and the present, which are estimated from the sizes of historic earthquake offsets on other faults of similar type. The probability density function for the age of a particular geologic event may be non-Gaussian, especially if it is determined from cross-cutting relations, or from radiocarbon or cosmogenic-nuclide ages containing inheritance. Two alternate convolution formulas relating the distributions for offset and age give the probability density function for long-term offset rate; these are computed for most published cases of dated offset features along active faults in California and other western states. After defining a probabilistic measure of disagreement between two long-term offset rate distributions measured on the same fault section, I investigate how disagreement varies with geologic time (difference in age of the offset features) and with publication type (primary, secondary, or tertiary). Patterns of disagreement suggest that at least 4.3% of offset rates in the primary literature are incorrect (due to failure to span the whole fault, undetected complex initial shapes of offset features, or faulty correlation in space or in geologic time) or unrepresentative (due to variations in offset rate along the trace). Tertiary (third-hand) literature sources have a higher error rate of 14.5%. In the western United States, it appears that rates from offset features as old as 3 Ma can be averaged without introducing age-dependent bias. Offsets of older features can and should be used as well, but it is necessary to make allowance for the increased risk, rising rapidly to 48%, that they are inapplicable to neotectonics. Based on these results, best-estimate combined probability density functions are computed for the long-term offset rates of all active faults in California and other conterminous western states, and described in tables using several scalar measures. Of 849 active and potentially-active faults in the conterminous western United States, only 48 are "well-constrained" (having combined probability density functions for long-term offset rate in which the width of the 95%-confidence range is smaller than the median). It appears to require about 4 offset features to give an even chance of achieving a well-constrained combined rate, and at least 7 offset features to guarantee it.
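
    The offset/age-to-rate propagation described above can be illustrated with a simple Monte Carlo stand-in for the paper's analytic convolution formulas; all distributions and numbers below are hypothetical, not values from the study.

    ```python
    import numpy as np

    # Monte Carlo stand-in for the offset/age -> rate convolution: draw offsets
    # and ages from assumed distributions and summarize the implied rate. A
    # lognormal age mimics skew from inheritance. All numbers are hypothetical.

    rng = np.random.default_rng(42)
    n = 100_000
    offset_km = rng.normal(12.0, 1.5, size=n)                     # offset, km
    age_ma = rng.lognormal(mean=np.log(1.2), sigma=0.15, size=n)  # age, Ma

    rate = offset_km / age_ma                  # km/Ma, numerically equal to mm/yr
    lo, med, hi = np.percentile(rate, [2.5, 50, 97.5])
    print(f"rate: median {med:.1f} mm/yr, 95% range [{lo:.1f}, {hi:.1f}]")
    # "well-constrained" in the sense above: 95% range narrower than the median
    print("well-constrained:", (hi - lo) < med)
    ```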

  1. Minimum uncertainty filters for pulses

    SciTech Connect

    Trantham, E.C.

    1993-06-01

    The objective of this paper is to calculate filters with a minimum uncertainty, the product of filter length and bandwidth. The method is applicable to producing minimum uncertainty filters with time or frequency domain constraints on the filter. The calculus of variations is used to derive the conditions that minimize a filter's uncertainty. The general solution is a linear combination of Hermite functions, where the Hermite functions are summed from low to high order until the filter's constraints are met. Filters constrained to have zero amplitude at zero hertz have an uncertainty at least three times greater than expected from the uncertainty principle, and the minimum uncertainty filter is a first derivative Gaussian. For filters of the previous type, the minimum uncertainty high-cut filter is a Gaussian function of frequency, but the minimum uncertainty low-cut filter is a linear function of frequency.
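
    The factor-of-three claim above is easy to check numerically. The sketch below (illustrative, not the paper's filter design code) computes the time-bandwidth product for a unit Gaussian and for its first derivative, giving 1/2 and 3/2 respectively.

    ```python
    import numpy as np

    # Time-bandwidth (uncertainty) products: a unit Gaussian reaches the
    # Heisenberg minimum of 0.5, while its first derivative (which has zero
    # amplitude at 0 Hz) comes in at 1.5, three times the minimum.

    t = np.linspace(-20, 20, 4001)
    dt = t[1] - t[0]

    def uncertainty_product(f):
        p = f**2 / (np.sum(f**2) * dt)                    # energy density in time
        sigma_t = np.sqrt(np.sum(t**2 * p) * dt)          # filters here are centered
        F = np.fft.fftshift(np.fft.fft(f))
        w = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(t.size, dt))
        dw = w[1] - w[0]
        q = np.abs(F)**2 / (np.sum(np.abs(F)**2) * dw)    # energy density in frequency
        sigma_w = np.sqrt(np.sum(w**2 * q) * dw)
        return sigma_t * sigma_w

    gauss = np.exp(-t**2 / 2)
    dgauss = -t * np.exp(-t**2 / 2)                       # first-derivative Gaussian
    print(f"Gaussian:            {uncertainty_product(gauss):.3f}")   # ~0.5
    print(f"derivative Gaussian: {uncertainty_product(dgauss):.3f}")  # ~1.5
    ```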

  2. Quantal localization and the uncertainty principle

    SciTech Connect

    Leopold, J.G.; Richards, D.

    1988-09-01

    We give a dynamical explanation for the localization of the wave function for the one-dimensional hydrogen atom, with the Coulomb singularity, in a high-frequency electric field, which leads to a necessary condition for classical dynamics to be valid. Numerical tests confirm the accuracy of the condition. Our analysis is relevant to the comparison between the classical and quantal dynamics of the kicked rotor and standard map.

  3. Nab: Measurement Principles, Apparatus and Uncertainties

    E-print Network

    D. Pocanic; R. Alarcon; L. P. Alonzi; S. Baessler; S. Balascuta; J. D. Bowman; M. A. Bychkov; J. Byrne; J. R. Calarco; V. Cianciolo; C. Crawford; E. Frlez; M. T. Gericke; G. L. Greene; R. K. Grzywacz; V. Gudkov; F. W. Hersman; A. Klein; J. Martin; S. A. Page; A. Palladino; S. I. Penttila; K. P. Rykaczewski; W. S. Wilburn; A. R. Young; G. R. Young

    2008-10-01

    The Nab collaboration will perform a precise measurement of 'a', the electron-neutrino correlation parameter, and 'b', the Fierz interference term in neutron beta decay, in the Fundamental Neutron Physics Beamline at the SNS, using a novel electric/magnetic field spectrometer and detector design. The experiment is aiming at the 10^{-3} accuracy level in (Delta a)/a, and will provide an independent measurement of lambda = G_A/G_V, the ratio of axial-vector to vector coupling constants of the nucleon. Nab also plans to perform the first ever measurement of 'b' in neutron decay, which will provide an independent limit on the tensor weak coupling.

  4. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  5. Entropic uncertainty relations in multidimensional position and momentum spaces

    E-print Network

    Yichen Huang

    2011-01-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. The lower bound in the new relation is optimal, and the new entropic uncertainty relation implies the famous variance-based uncertainty principle for multimode states. The article concludes with an open conjecture.

  6. Reformulating the Quantum Uncertainty Relation

    PubMed Central

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  7. Reformulating the Quantum Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.

  8. Reformulating the Quantum Uncertainty Relation

    E-print Network

    Jun-Li Li; Cong-Feng Qiao

    2015-09-17

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in $N$-dimensional Hilbert space.

  9. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and are found to be nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  10. Majorization formulation of uncertainty in quantum mechanics

    SciTech Connect

    Partovi, M. Hossein

    2011-11-15

    Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.
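
    A minimal sketch of the majorization order this formulation is built on; the helper below (a hypothetical name, for illustration only) tests whether one measurement-outcome distribution is majorized by, i.e. more uncertain than, another.

    ```python
    import numpy as np

    # Majorization check: p is majorized by q (p is "more uncertain") iff every
    # partial sum of p's entries, sorted in decreasing order, is bounded by q's.

    def majorized_by(p, q, tol=1e-12):
        p = np.sort(np.asarray(p))[::-1]
        q = np.sort(np.asarray(q))[::-1]
        return bool(np.all(np.cumsum(p) <= np.cumsum(q) + tol))

    uniform = [0.25, 0.25, 0.25, 0.25]    # maximally uncertain outcome distribution
    peaked = [0.70, 0.20, 0.05, 0.05]     # more certain outcome distribution
    print(majorized_by(uniform, peaked))  # True: uniform lies below every vector
    print(majorized_by(peaked, uniform))  # False
    ```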

  11. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  12. Generalized Entropic Uncertainty Relations with Tsallis' Entropy

    NASA Technical Reports Server (NTRS)

    Portesi, M.; Plastino, A.

    1996-01-01

    A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
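
    For reference, the Tsallis q-entropy underlying the generalized measures above is $S_q = (1 - \sum_i p_i^q)/(q-1)$, which reduces to the Shannon entropy as $q \to 1$; a small sketch:

    ```python
    import numpy as np

    # Tsallis q-entropy S_q = (1 - sum_i p_i**q) / (q - 1); the q -> 1 limit
    # recovers the Shannon entropy (in nats).

    def tsallis_entropy(p, q):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if abs(q - 1.0) < 1e-10:
            return float(-np.sum(p * np.log(p)))     # Shannon limit
        return float((1.0 - np.sum(p**q)) / (q - 1.0))

    p = [0.5, 0.3, 0.2]
    for q in (0.5, 0.999, 1.0, 2.0):
        print(f"q={q}: S_q = {tsallis_entropy(p, q):.4f}")
    # S_0.999 should sit right next to the exact Shannon value at q = 1
    ```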

  13. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  14. Uncertainty Probabilistic

    E-print Network

    Roweis, Sam

    … by hand a set of rules for examining inputs, updating internal states and generating outputs; learning … with strong statistical regularity; also useful in adaptive or dynamic situations when the task (or its …); … both utility and uncertainty optimally, e.g. in influence diagrams; adaptive software agents / auctions.

  15. Uncertainty Techniques Statistics, Least Squares, Regression,

    E-print Network

    Rostock, Universität

    Statistics, least squares, regression, ML estimation, stochastic processes. The Fourier series is an example of function approximation using the orthogonality principle. The least-squares principle does not require a statistical framework to make sense. Maximum likelihood …

  16. Uncertainty in the Classroom--Teaching Quantum Physics

    ERIC Educational Resources Information Center

    Johansson, K. E.; Milstead, D.

    2008-01-01

    The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how…

  17. Holographic position uncertainty and the quantum-classical transition

    E-print Network

    C. L. Herzenberg

    2010-04-16

    Arguments based on general principles of quantum mechanics have suggested that a minimum length associated with Planck-scale unification may in the context of the holographic principle entail a new kind of observable uncertainty in the transverse position of macroscopically separated objects. Here, we address potential implications of such a position uncertainty for establishing an additional threshold between quantum and classical behavior.

  18. Heisenberg, uncertainty, and the scanning tunneling microscope

    E-print Network

    Werner A Hofer

    2012-03-06

    We show by a statistical analysis of high-resolution scanning tunneling microscopy (STM) experiments, that the interpretation of the density of electron charge as a statistical quantity leads to a conflict with the Heisenberg uncertainty principle. Given the precision in these experiments we find that the uncertainty principle would be violated by close to two orders of magnitude, if this interpretation were correct. We are thus forced to conclude that the density of electron charge is a physically real, i.e., in principle precisely measurable quantity.

  19. Principles of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Landé, Alfred

    2013-10-01

    Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ? (x) and ? (p); 11. Complementarity; 12. Mathematical relation between ? (x) and ? (p) for free particles; 13. General relation between ? (q) and ? (p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ? (t) and ? (?); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ? and ?; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for ?p (q) and Xq (p); 39. Differential equation for ?? (q); 40. The general probability amplitude ??' (Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schrödinger's equation for non-conservative systems; 46. Pertubation theory; 47. Orthogonality, normalization and Hermitian conjugacy; 48. General matrix elements; Part IV. The Principle of Correspondence: 49. Contact transformations in classical mechanics; 50. Point transformations; 51. Contact transformations in quantum mechanics; 52. Constants of motion and angular co-ordinates; 53. Periodic orbits; 54. De Broglie and Schrödinger function; correspondence to classical mechanics; 55. Packets of probability; 56. Correspondence to hydrodynamics; 57. Motion and scattering of wave packets; 58. Formal correspondence between classical and quantum mechanics; Part V. Mathematical Appendix: Principle of Invariance: 59. The general theorem of transformation; 60. Operator calculus; 61. Exchange relations; three criteria for conjugacy; 62. First method of canonical transformation; 63. Second method of canonical transformation; 64. Proof of the transformation theorem; 65. Invariance of the matrix elements against unitary transformations; 66. Matrix mechanics; Index of literature; Index of names and subjects.

  20. Entropic uncertainty relation in de Sitter space

    E-print Network

    Lijuan Jia; Zehua Tian; Jiliang Jing

    2015-01-04

    The uncertainty principle restricts our ability to simultaneously predict the measurement outcomes of two incompatible observables of a quantum particle. However, this uncertainty can be reduced and quantified by a new entropic uncertainty relation (EUR). Using the open quantum system approach, we explore how the nature of de Sitter space affects the EUR. When the quantum memory $A$ freely falls in de Sitter space, we demonstrate that the entropic uncertainty acquires an increase resulting from a thermal bath with the Gibbons-Hawking temperature. For the static case, we find that the temperature, coming both from the intrinsic thermal nature of de Sitter space and from the Unruh effect associated with the proper acceleration of $A$, also affects the entropic uncertainty: the higher the temperature, the greater the uncertainty and the more quickly the uncertainty reaches its maximum value. Finally, the possible mechanism behind this phenomenon is explored.

  1. Entropic uncertainty relations in multidimensional position and momentum spaces

    SciTech Connect

    Huang, Yichen

    2011-05-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

  2. Minimum uncertainty states of angular momentum and angular position

    E-print Network

    Zambrini, Roberta

    David T. Pegg, Stephen M. Barnett, et al. … states of linear momentum that satisfy the equality in the Heisenberg uncertainty principle for position and momentum … The corresponding uncertainty relation for angular momentum and angular position …

  3. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
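
    A minimal sketch of the Latin hypercube sampling scheme evaluated above (one stratified sample per equal-probability bin in each input dimension, with bins paired at random); this is the generic construction, not the report's specific implementation.

    ```python
    import numpy as np

    # Latin hypercube sampling: each of d inputs is cut into n equal-probability
    # bins; every bin is hit exactly once per input, with bins paired at random.

    def latin_hypercube(n, d, rng):
        u = (rng.random((n, d)) + np.arange(n)[:, None]) / n  # one point per bin
        for j in range(d):
            rng.shuffle(u[:, j])                              # decouple the columns
        return u                                              # samples in [0, 1)^d

    rng = np.random.default_rng(1)
    samples = latin_hypercube(8, 2, rng)
    # each column covers every one of the 8 bins exactly once:
    print(np.sort((samples * 8).astype(int), axis=0).T)
    ```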

  4. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  5. Angular performance measure for tighter uncertainty relations

    SciTech Connect

    Hradil, Z.; Rehacek, J.; Klimov, A. B.; Rigas, I.; Sanchez-Soto, L. L.

    2010-01-15

    The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance that allows for tighter uncertainty relations for angle and angular momentum. The differences with previous bounds can be significant for particular states and indeed may be amenable to experimental measurement with the present technology.

  6. Uncertainty and Methods for Dealing with Uncertainty.

    E-print Network

    Wiederhold, Gio

    Gio Wiederhold; Claus-Rainer Rollinger; Josef … The variety of uncertainty algebras is large, and it appears that no single algebra will satisfy all applications. Not having a right solution is not a reason to avoid the issue. The power of dealing …

  7. The physical origins of the uncertainty theorem

    NASA Astrophysics Data System (ADS)

    Giese, Albrecht

    2013-10-01

    The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

  8. Time Crystals from Minimum Time Uncertainty

    E-print Network

    Mir Faizal; Mohammed M. Khalil; Saurya Das

    2014-12-29

    Motivated by the Generalized Uncertainty Principle, covariance, and a minimum measurable time, we propose a deformation of the Heisenberg algebra, and show that this leads to corrections to all quantum mechanical systems. We also demonstrate that such a deformation implies a discrete spectrum for time. In other words, time behaves like a crystal.

  9. Uncertainty Relation for Mutual Information

    E-print Network

    James Schneeloch; Curtis J. Broadbent; John C. Howell

    2014-12-17

    We postulate the existence of a universal uncertainty relation between the quantum and classical mutual informations between pairs of quantum systems. Specifically, we propose that the sum of the classical mutual information, determined by two mutually unbiased pairs of observables, never exceeds the quantum mutual information. We call this the complementary-quantum correlation (CQC) relation and prove its validity for pure states, for states with one maximally mixed subsystem, and for all states when one measurement is minimally disturbing. We provide results of a Monte Carlo simulation suggesting the CQC relation is generally valid. Importantly, we also show that the CQC relation represents an improvement to an entropic uncertainty principle in the presence of a quantum memory, and that it can be used to verify an achievable secret key rate in the quantum one-time pad cryptographic protocol.

  10. Anti Heisenberg—Refutation Of Heisenberg's Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Barukšić, Ilija

    2011-03-01

    The quantum mechanical uncertainty principle for position and momentum plays an important role in many treatments of the (philosophical, physical and other) implications of quantum mechanics. Roughly speaking, the more precisely the momentum (position) of a (quantum mechanical) object is given, the less precisely one can say what its position (momentum) is. This quantum mechanical measurement problem is not just an interpretational difficulty; it raises broader issues as well. The measurement (of a property) of a (quantum mechanical) object determines the existence of the measured. In brief, the quantum mechanical uncertainty principle challenges some fundamental principles of science, and especially the principle of causality. In particular, an independently existing (external) objective reality is denied. As we shall see, the quantum mechanical uncertainty principle for position and momentum is based on the assumption that 1 = 0, which is a logical contradiction.

  11. Role of information theoretic uncertainty relations in quantum theory

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-04-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson-Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson-Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
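
    As a concrete instance of the Rényi-entropy relations reviewed above, the Maassen-Uffink bound for conjugate orders $1/\alpha + 1/\beta = 2$ can be checked for a qubit measured in the Z and X bases, where the overlap bound is $\ln 2$. A sketch (illustrative, not the paper's two-energy-level model):

    ```python
    import numpy as np

    # Maassen-Uffink-type Renyi uncertainty relation for a qubit: for conjugate
    # orders 1/a + 1/b = 2, the Renyi entropies of Z-basis and X-basis outcome
    # distributions satisfy H_a(Z) + H_b(X) >= ln 2 for every state.

    def renyi_entropy(p, a):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if abs(a - 1.0) < 1e-10:
            return float(-np.sum(p * np.log(p)))     # Shannon limit
        return float(np.log(np.sum(p**a)) / (1.0 - a))

    a = 2.0
    b = a / (2.0 * a - 1.0)                          # conjugate order, here 2/3
    hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    rng = np.random.default_rng(7)
    for _ in range(3):
        psi = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi /= np.linalg.norm(psi)
        p_z = np.abs(psi)**2                         # Z-basis probabilities
        p_x = np.abs(hadamard @ psi)**2              # X-basis probabilities
        total = renyi_entropy(p_z, a) + renyi_entropy(p_x, b)
        print(f"{total:.4f} >= {np.log(2):.4f}")
    ```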

  12. Experimental testing of entropic uncertainty relations with multiple measurements in pure diamond

    E-print Network

    Jian Xing; Yu-Ran Zhang; Shang Liu; Yan-Chun Chang; Jie-Dong Yue; Heng Fan; Xin-Yu Pan

    2015-10-13

    One unique feature of quantum mechanics is the Heisenberg uncertainty principle, which states that the outcomes of two incompatible measurements cannot simultaneously achieve arbitrary precision. In the information-theoretic context of quantum information, the uncertainty principle can be formulated as entropic uncertainty relations with two measurements for a quantum bit (qubit) in a two-dimensional system. New entropic uncertainty relations are studied for a higher-dimensional quantum state with multiple measurements; the uncertainty bounds can be tighter than those expected from two measurement settings and cannot result from a qubit system with or without a quantum memory. Here we report the first room-temperature experimental test of the entropic uncertainty relations with three measurements in a natural three-dimensional solid-state system: the nitrogen-vacancy center in pure diamond. The experimental results confirm the entropic uncertainty relations for multiple measurements. Our result represents a more precise demonstration of the fundamental uncertainty principle of quantum mechanics.
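
    A numerical companion to the multiple-measurement setting: for a qubit (rather than the paper's three-dimensional NV system), the sum of Shannon entropies over the three mutually unbiased Pauli bases is bounded below by 2 bits, a known qubit bound usually attributed to Sánchez-Ruiz, which the sketch below checks on random pure states.

    ```python
    import numpy as np

    # Qubit analogue of a multiple-measurement entropic relation: over the three
    # mutually unbiased Pauli bases, H(X) + H(Y) + H(Z) >= 2 bits for every pure
    # state (the Sanchez-Ruiz bound).

    def shannon_bits(p):
        p = p[p > 1e-15]
        return float(-np.sum(p * np.log2(p)))

    bases = {                                   # columns are basis vectors
        "Z": np.eye(2, dtype=complex),
        "X": np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2),
        "Y": np.array([[1, 1], [1j, -1j]]) / np.sqrt(2),
    }

    rng = np.random.default_rng(3)
    for _ in range(5):
        psi = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi /= np.linalg.norm(psi)
        total = sum(shannon_bits(np.abs(B.conj().T @ psi)**2) for B in bases.values())
        print(f"H_X + H_Y + H_Z = {total:.3f} bits >= 2")
    ```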

  13. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  14. Uncertainty as knowledge.

    PubMed

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D

    2015-11-28

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  15. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
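
    The Monte Carlo approach described above can be illustrated on a toy runoff-ratio signature. The synthetic data and error magnitudes below are invented for illustration and are not the Mahurangi or Brue values.

    ```python
    import numpy as np

    # Monte Carlo propagation of data uncertainty into a runoff-ratio signature:
    # a year of synthetic daily rainfall and flow, with assumed +/-10% rainfall
    # multipliers and +/-15% rating-curve multipliers per realization.

    rng = np.random.default_rng(0)
    days = 365
    rain = rng.gamma(shape=0.4, scale=12.0, size=days)     # synthetic rainfall, mm/day
    flow = 0.4 * rain + rng.gamma(0.3, 1.0, size=days)     # synthetic flow, mm/day

    n_mc = 10_000
    rain_mult = rng.normal(1.0, 0.10, size=(n_mc, 1))      # rainfall error multipliers
    flow_mult = rng.normal(1.0, 0.15, size=(n_mc, 1))      # rating-curve multipliers
    runoff_ratio = (flow * flow_mult).sum(axis=1) / (rain * rain_mult).sum(axis=1)

    lo, med, hi = np.percentile(runoff_ratio, [2.5, 50, 97.5])
    print(f"runoff ratio: median {med:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
    ```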

  16. Direct Aerosol Forcing Uncertainty

    SciTech Connect

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and the typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of the total uncertainty in calculated DRF and identification of the properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to the uncertainty arising from some individual properties.
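
    The error budget described here can be sketched in a few lines: each contribution is a sensitivity (change in DRF per unit change of a property) multiplied by that property's measurement uncertainty, and a total can be formed in quadrature if the error sources are treated as independent. All numbers below are placeholders, not values from this study:

      import math

      # (sensitivity of DRF to the property, measurement uncertainty of the property)
      # Sensitivities in W m^-2 per unit of the property. Placeholder values.
      contributions = {
          "aerosol optical depth":    (-30.0, 0.01),
          "single scattering albedo": (-25.0, 0.03),
          "asymmetry parameter":      ( 15.0, 0.02),
          "surface albedo":           ( 10.0, 0.01),
      }

      terms = {k: s * u for k, (s, u) in contributions.items()}
      total = math.sqrt(sum(t * t for t in terms.values()))  # quadrature; independence assumed

      for k, t in sorted(terms.items(), key=lambda kv: -abs(kv[1])):
          print(f"{k:26s} {abs(t):5.2f} W m^-2")
      print(f"{'total (quadrature)':26s} {total:5.2f} W m^-2")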

  17. Position Uncertainty in the Heisenberg Uncertainty Relation

    E-print Network

    Seiji Kosugi

    2010-02-26

    Position measurements are examined under the assumption that object position x_t and probe position X_t just after the measurement are expressed by a linear combination of positions x_0 and X_0 just before the measurement. The Heisenberg uncertainty relation between the position uncertainty and momentum disturbance holds when the measurement error \\epsilon(x_t) for the object position x_t is adopted as the position uncertainty. However, the uncertainty in the measurement result obtained for x_0 is the standard deviation of the measurement result, and not the measurement error \\epsilon(x_0). This difference is due to the reduction of a wave packet. The validity of the linearity assumption is examined in detail.
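
    In the notation above, the relation that is shown to hold can be written in the standard error-disturbance form (our hedged restatement; the $\hbar/2$ bound is the conventional one):

      \[
        \epsilon(x_t)\,\eta(p) \;\ge\; \frac{\hbar}{2},
      \]

      where $\epsilon(x_t)$ is the measurement error for the object position and $\eta(p)$ is the disturbance imparted to the momentum.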

  18. Universal Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Friedland, Shmuel; Gheorghiu, Vlad; Gour, Gilad

    2013-12-01

    Uncertainty relations are a distinctive characteristic of quantum theory that impose intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring noncommuting observables. However, there is no fundamental reason for using entropies as quantifiers; any functional relation that characterizes the uncertainty of the measurement outcomes defines an uncertainty relation. Starting from a very reasonable assumption of invariance under mere relabeling of the measurement outcomes, we show that Schur-concave functions are the most general uncertainty quantifiers. We then discover a fine-grained uncertainty relation that is given in terms of the majorization order between two probability vectors, significantly extending a majorization-based uncertainty relation first introduced in M. H. Partovi, Phys. Rev. A 84, 052117 (2011). Such a vector-type uncertainty relation generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary uncertainty quantifiers. Our relation is therefore universal and captures the essence of uncertainty in quantum theory.
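
    A small sketch of the majorization order on which the vector-type relation is built: q majorizes p when every partial sum of q's components, sorted in decreasing order, dominates the corresponding sum for p; any Schur-concave quantifier, such as Shannon entropy, then orders the two distributions consistently. The helper functions are ours:

      import numpy as np

      def majorizes(q, p, tol=1e-12):
          """True if probability vector q majorizes p (both sum to 1)."""
          q = np.sort(np.asarray(q, float))[::-1]
          p = np.sort(np.asarray(p, float))[::-1]
          return bool(np.all(np.cumsum(q) >= np.cumsum(p) - tol))

      def shannon(p):
          """Shannon entropy in bits; a Schur-concave uncertainty quantifier."""
          p = np.asarray(p, float)
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      q = [0.7, 0.2, 0.1]
      p = [0.5, 0.3, 0.2]
      print(majorizes(q, p))            # True: q is "more certain" than p
      print(shannon(q) <= shannon(p))   # True: Schur-concave measures respect the order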

  19. Uncertainties in Gapped Graphene

    E-print Network

    Eylee Jung; Kwang S. Kim; DaeKil Park

    2012-03-20

    Motivated by graphene-based quantum computers, we examine the time dependence of the position-momentum and position-velocity uncertainties in monolayer gapped graphene. The effect of the energy gap on the uncertainties is shown to appear via the Compton-like wavelength $\\lambda_c$. The uncertainties in graphene are mainly contributed by two phenomena, spreading and zitterbewegung. While the former determines the uncertainties at long times, the latter gives rise to rapid oscillations of the uncertainties at short times. The uncertainties in graphene are compared with the corresponding values for the usual free Hamiltonian $\\hat{H}_{free} = (p_1^2 + p_2^2) / 2 M$. It is shown that the uncertainties can be kept under control within the quantum mechanical law if one can choose the gap parameter $\\lambda_c$ freely.

  20. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
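
    As an illustration of propagating uncertainty through these formulae, the sketch below evaluates the ISO 7730 PPD-from-PMV formula and applies first-order (GUM-style) propagation with a numerically estimated sensitivity; the measured PMV value and its standard uncertainty are invented:

      import math

      def ppd(pmv):
          """ISO 7730 predicted percentage dissatisfied as a function of PMV."""
          return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

      def u_ppd(pmv, u_pmv, h=1e-5):
          """First-order (GUM) propagation: u(PPD) ~= |dPPD/dPMV| * u(PMV)."""
          dppd = (ppd(pmv + h) - ppd(pmv - h)) / (2 * h)   # numerical sensitivity
          return abs(dppd) * u_pmv

      pmv, u_pmv = 0.5, 0.25   # illustrative measured PMV and its standard uncertainty
      print(f"PPD = {ppd(pmv):.1f}%  with u(PPD) = {u_ppd(pmv, u_pmv):.1f}%")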

  1. Intolerance of Uncertainty

    PubMed Central

    Beier, Meghan L.

    2015-01-01

    Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700

  2. Information-Disturbance theorem and Uncertainty Relation

    E-print Network

    Takayuki Miyadera; Hideki Imai

    2007-07-31

    It has been shown that the Information-Disturbance theorem can play an important role in the security proof of quantum cryptography. The theorem is interesting in itself, since it can be regarded as an information-theoretic version of the uncertainty principle. Until now, however, it has been applicable only to restricted situations. In this paper the restriction on the source is abandoned, and a general information-disturbance theorem is obtained. The theorem relates the information gain by Eve to the information gain by Bob.

  3. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.

  4. [Ethics, empiricism and uncertainty].

    PubMed

    Porz, R; Zimmermann, H; Exadaktylos, A K

    2011-01-01

    Accidents can lead to difficult boundary situations. Such situations often take place in emergency units. The medical team thus often and inevitably faces professional uncertainty in its decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in treatment or withdrawal of treatment. It does not need to be covered by evidence-based arguments, especially as some singular situations of individual tragedies cannot be grasped in terms of evidence-based medicine. PMID:21181616

  5. Controlling entropic uncertainty bound through memory effects

    NASA Astrophysics Data System (ADS)

    Karpat, Göktuğ; Piilo, Jyrki; Maniscalco, Sabrina

    2015-09-01

    One of the defining traits of quantum mechanics is the uncertainty principle which was originally expressed in terms of the standard deviation of two observables. Alternatively, it can be formulated using entropic measures, and can also be generalized by including a memory particle that is entangled with the particle to be measured. Here we consider a realistic scenario where the memory particle is an open system interacting with an external environment. Through the relation of conditional entropy to mutual information, we provide a link between memory effects and the rate of change of conditional entropy controlling the lower bound of the entropic uncertainty relation. Our treatment reveals that the memory effects stemming from the non-Markovian nature of quantum dynamical maps directly control the lower bound of the entropic uncertainty relation in a general way, independently of the specific type of interaction between the memory particle and its environment.
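
    For orientation, the memory-assisted entropic uncertainty relation whose lower bound is being controlled here is usually written in the form of Berta et al. (our hedged restatement in assumed notation, not a quotation from the paper):

      \[
        S(Q|B) + S(R|B) \;\ge\; \log_2\frac{1}{c} + S(A|B),
        \qquad c = \max_{j,k}\,|\langle\psi_j|\phi_k\rangle|^2,
      \]

      where $S(\cdot|B)$ are conditional von Neumann entropies for measurements of the observables $Q$ or $R$ on particle $A$, $B$ is the memory particle, and $c$ quantifies the complementarity of the two measurement bases.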

  6. Controlling entropic uncertainty bound through memory effects

    E-print Network

    Göktuğ Karpat; Jyrki Piilo; Sabrina Maniscalco

    2015-09-19

    One of the defining traits of quantum mechanics is the uncertainty principle which was originally expressed in terms of the standard deviation of two observables. Alternatively, it can be formulated using entropic measures, and can also be generalized by including a memory particle that is entangled with the particle to be measured. Here we consider a realistic scenario where the memory particle is an open system interacting with an external environment. Through the relation of conditional entropy to mutual information, we provide a link between memory effects and the rate of change of conditional entropy controlling the lower bound of the entropic uncertainty relation. Our treatment reveals that the memory effects stemming from the non-Markovian nature of quantum dynamical maps directly control the lower bound of the entropic uncertainty relation in a general way, independently of the specific type of interaction between the memory particle and its environment.

  7. Deriving the Qubit from Entropy Principles

    E-print Network

    Adam Brandenburger; Pierfrancesco La Mura

    2015-01-22

    The Heisenberg uncertainty principle is one of the most famous features of quantum mechanics. However, the non-determinism implied by the Heisenberg uncertainty principle --- together with other prominent aspects of quantum mechanics such as superposition, entanglement, and nonlocality --- poses deep puzzles about the underlying physical reality, even while these same features are at the heart of exciting developments such as quantum cryptography, algorithms, and computing. These puzzles might be resolved if the mathematical structure of quantum mechanics were built up from physically interpretable axioms, but it is not. We propose three physically-based axioms which together characterize the simplest quantum system, namely the qubit. Our starting point is the class of all no-signaling theories. Each such theory can be regarded as a family of empirical models, and we proceed to associate entropies, i.e., measures of information, with these models. To do this, we move to phase space and impose the condition that entropies are real-valued. This requirement, which we call the Information Reality Principle, arises because in order to represent all no-signaling theories (including quantum mechanics itself) in phase space, it is necessary to allow negative probabilities (Wigner [1932]). Our second and third principles take two important features of quantum mechanics and turn them into deliberately chosen physical axioms. One axiom is an Uncertainty Principle, stated in terms of entropy. The other axiom is an Unbiasedness Principle, which requires that whenever there is complete certainty about the outcome of a measurement in one of three mutually orthogonal directions, there must be maximal uncertainty about the outcomes in each of the two other directions.

  8. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  9. Heisenberg uncertainty principle and economic analogues of basic physical quantities

    E-print Network

    Soloviev, Vladimir

    2011-01-01

    From the positions attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates and economic mass are proposed, based on the analysis of time series, and the concept of an economic Planck's constant is introduced. The theory has been tested on real economic time series, including stock indices, Forex and spot prices; the results are open for discussion.

  10. Heisenberg uncertainty principle and economic analogues of basic physical quantities

    E-print Network

    Vladimir Soloviev; Vladimir Saptsin

    2011-11-10

    From the positions attained by modern theoretical physics in understanding the foundations of the universe, a methodological and philosophical analysis of fundamental physical concepts and of their formal and informal connections with real economic measurements is carried out. Procedures for determining heterogeneous economic time, normalized economic coordinates and economic mass are proposed, based on the analysis of time series, and the concept of an economic Planck's constant is introduced. The theory has been tested on real economic time series, including stock indices, Forex and spot prices; the results are open for discussion.

  11. The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment

    ERIC Educational Resources Information Center

    Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea

    2010-01-01

    An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…

  12. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  13. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  14. Living With Radical Uncertainty. The Exemplary case of Folding Protein

    E-print Network

    Ignazio Licata

    2010-04-21

    Laplace's demon still makes a strong impact on contemporary science, in spite of the fact that the outcomes of mathematical logic, the advent of quantum physics and, more recently, complexity science have pointed out the crucial role of uncertainty in descriptions of the world. We focus here on the typical problem of protein folding as an example of uncertainty, radical emergence and a guide to the "simple" principles for studying complex systems.

  15. The Social Safety Net: An Alternative to Rawls's Two Principles of Justice

    E-print Network

    Chiong, J. Winston

    two principles of justice represent the maximin solution to the choice of principles under uncertainty. When applying the maximin rule to several alternative sets of possible outcomes, individuals assume that the least favorable outcome in each... the alternative of the two principles of justice, they can in large part sidestep the uncertainties of the original position. They can guarantee the protection of their liberties and a reasonably satisfactory standard of life as the conditions...

  16. Uncertainty vs. Interindividual variability

    SciTech Connect

    Bogen, K.T.

    1993-04-01

    Distinct treatment of uncertainty and interindividual variability in variates used to model risk ensures that quantitative assessments of these attributes in modeled risk are maximally relevant to potential regulatory concerns. For example, such a distinction is required for quantitative characterization of uncertainty in population risk or in individual risk. Yet most quantitative uncertainty analyses undertaken as part of environmental health risk assessments have failed to systematically maintain this distinction among modeled distributed input variates, and so have had limited relevance to reasonable concerns that regulators may have about how uncertainty and variability ought to relate to risk acceptability. The distinction is of course impossible if quantitative treatment of distributed input variates is rejected in favor of single-point estimates, due to the perceived impracticality of the complex Monte Carlo analyses erroneously thought to be required. Here, some practical methods are presented that facilitate implementation of the analytic framework for uncertainty and variability proposed by Bogen and Spear. Two types of methodology are discussed: one that facilitates the distinction between uncertainty and variability per se, and another that may be used to simplify quantitative analysis of distributed inputs representing either uncertainty or variability. A simple and a complex form for modeled increased risk are presented and then used to illustrate methods facilitating the distinction between uncertainty and variability in reference to the characterization of both population and individual risk. Finally, a simple form of discrete probability calculus is proposed as an easily implemented, practical alternative to Monte Carlo-based procedures for quantitative integration of uncertainty and variability in risk assessment.
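
    The distinction can be made concrete with a nested, two-dimensional Monte Carlo in which epistemic uncertainty is sampled in an outer loop and interindividual variability in an inner loop. This is a generic illustration of the framework's spirit, not the discrete probability calculus proposed in the paper; the dose-response form and all numbers are invented:

      import numpy as np

      rng = np.random.default_rng(0)

      n_unc, n_var = 1_000, 2_000   # outer: uncertainty; inner: variability

      # Outer loop: epistemic uncertainty in a potency parameter (illustrative lognormal).
      potency = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_unc)

      pop_risk = np.empty(n_unc)
      for i, beta in enumerate(potency):
          # Inner loop: variability of exposure across individuals (illustrative).
          exposure = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=n_var)
          risk = 1.0 - np.exp(-beta * exposure)   # toy dose-response model
          pop_risk[i] = risk.mean()               # population (average individual) risk

      lo, hi = np.percentile(pop_risk, [5, 95])
      print(f"population risk: 90% uncertainty interval [{lo:.2e}, {hi:.2e}]")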

  17. Picture independent quantum action principle

    SciTech Connect

    Mantke, W.J.

    1992-01-01

    The Schwinger action principle for quantum mechanics is extended into a picture independent form. This displays the quantum connection. Time variations are formulated as variations of a time variable and included into the kinematical variations. Kets and bras represent experimental operations. Experimental operations at different times cannot be identified. The ket and the bra spaces are fiber bundles over time. The same applies to the classical configuration space. For the classical action principle the action can be varied by changing the path or the classical variables. The latter variation of classical functions corresponds to kinematical variations of quantum variables. The picture independent formulation represents time evolution by a connection. A standard experiment is represented by a ket, a connection and a bra. For particular start and end times of experiments, the action and the contraction into a transition amplitude are elements of a new tensor space of quantum correspondents of path functionals. The classical correspondent of the transition amplitude is the probability for a specified state to evolve along a particular path segment. The elements of the dual tensor space represent standard experiments or superpositions thereof. The kinematical variations of the quantum variables are commuting numbers. Variations that include the effect of Poincare or gauge transformations have different commutator properties. The Schwinger action principle is derived from the Feynman path integral formulation. The limitations from the time-energy uncertainty relation might be accommodated by superposing experiments that differ in their start- and end-times. In its picture independent form the action principle can be applied to all superpositions of standard experiments. This may involve superpositions of different connections. The extension of the superposition principle to connections allows representation of a quantum field by a part of the connection.

  18. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  19. Principles and Methods Chromatography

    E-print Network

    Lebendiker, Mario

    Catalogue of related handbooks: Affinity Chromatography: Principles and Methods (18-1022-29); Antibody Purification (1142-75); Protein Purification Handbook (18-1132-29); Ion Exchange Chromatography: Principles and Methods (18-1114-21); Hydrophobic Interaction Chromatography: Principles and Methods.

  20. Dasymetric Modeling and Uncertainty

    PubMed Central

    Nagle, Nicholas N.; Buttenfield, Barbara P.; Leyk, Stefan; Spielman, Seth

    2014-01-01

    Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not yet been fully addressed. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology - the Penalized Maximum Entropy Dasymetric Model (P-MEDM) - is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that includes household-level survey data combined with higher spatial resolution data such as from census tracts, block groups, and land cover classifications. PMID:25067846

  1. The equivalence principle in a quantum world

    NASA Astrophysics Data System (ADS)

    Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre

    2015-09-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).

  2. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation comprises high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation on the length of space missions, the evaluation of potential risk mitigation approaches, and the application of the As Low As Reasonably Achievable (ALARA) principle. For long-duration space missions, risks may approach radiation exposure limits; the uncertainties in risk projections therefore become a major safety concern, and methodologies used for ground-based work are not deemed sufficient. NASA limits astronaut exposures to a 3% risk of exposure-induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  3. The propagation of uncertainty for humidity calculations

    NASA Astrophysics Data System (ADS)

    Lovell-Smith, J.

    2009-12-01

    This paper addresses the international humidity community's need for standardization of methods for propagation of uncertainty associated with humidity generators and for handling uncertainty associated with the reference water vapour-pressure and enhancement-factor equations. The paper outlines uncertainty calculations for the mixing ratio, dew-point temperature and relative humidity output from humidity generators, and in particular considers controlling equations for a theoretical hybrid humidity generator combining single-pressure (1-P), two-pressure (2-P) and two-flow (2-F) principles. Also considered is the case where the humidity generator is used as a stable source with traceability derived from a reference hygrometer, i.e. a dew-point meter, a relative humidity meter or a wet-bulb psychrometer. Most humidity generators in use at national metrology institutes can be considered to be special cases of those considered here and sensitivity coefficients for particular types may be extracted. The ability to account for correlations between input variables and between different instances of the evaluation of the reference equations is discussed. The uncertainty calculation examples presented here are representative of most humidity calculations.
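
    A generic numerical sketch of this kind of propagation: with sensitivity coefficients $c_i = \partial f/\partial x_i$ for an output quantity $f$ (a dew-point temperature, say) and an input covariance matrix that can carry the correlations discussed above, the combined variance is $c^T V c$. All values below are placeholders, not the paper's formulas:

      import numpy as np

      # Sensitivity coefficients of the output with respect to the inputs
      # (e.g. pressures and temperatures of a hybrid generator); placeholders.
      c = np.array([0.8, -0.3, 1.2])

      # Standard uncertainties of the inputs and their correlation matrix.
      u = np.array([0.05, 0.10, 0.02])
      r = np.array([[1.0, 0.4, 0.0],
                    [0.4, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])   # e.g. two correlated pressure gauges

      V = np.outer(u, u) * r            # covariance matrix of the inputs
      u_combined = float(np.sqrt(c @ V @ c))
      print(f"combined standard uncertainty: {u_combined:.4f}")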

  4. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T.

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
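
    One plausible way to disperse data within asymmetric bounds for a Monte Carlo-type analysis is a split-normal distribution with different scales below and above the nominal value; this is our sketch of the idea, not necessarily the paper's method:

      import numpy as np

      rng = np.random.default_rng(7)

      def split_normal(nominal, u_minus, u_plus, size):
          """Sample with different spreads below (u_minus) and above (u_plus) nominal.

          Choosing the lower branch with probability u_minus/(u_minus + u_plus)
          keeps the density continuous at the nominal value.
          """
          z = np.abs(rng.normal(0.0, 1.0, size))
          sign = np.where(rng.random(size) < u_minus / (u_minus + u_plus), -1.0, 1.0)
          scale = np.where(sign < 0, u_minus, u_plus)
          return nominal + sign * z * scale

      # e.g. a force coefficient with uncertainty -0.002/+0.006 about 0.0450
      samples = split_normal(0.0450, 0.002, 0.006, 100_000)
      print(np.percentile(samples, [2.5, 50, 97.5]))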

  5. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    SciTech Connect

    Amoroso, Richard L.

    2010-12-22

    For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, $M_{4\pm}C_4$, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF-pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

  6. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  7. On the Role of Information Theoretic Uncertainty Relations in Quantum Theory

    E-print Network

    Petr Jizba; Jacob A. Dunningham; Jaewoo Joo

    2014-06-26

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) R\\'{e}nyi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson-Schr\\"{o}dinger uncertainty relation and Kraus-Maassen Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schr\\"odinger cat states. Again, improvement over both the Robertson-Schr\\"{o}dinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
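
    The prototype of the Rényi-type relations reviewed above is the Maassen-Uffink bound for conjugate entropy orders; we restate it here for orientation (our notation, a hedged restatement rather than a quotation):

      \[
        H_\alpha(A) + H_\beta(B) \;\ge\; -2\log_2 c,
        \qquad \frac{1}{\alpha} + \frac{1}{\beta} = 2,
      \]

      where $H_\alpha$ is the Rényi entropy of the measurement-outcome distribution and $c$ is the maximal overlap between eigenbases of the two observables.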

  8. Serenity in political uncertainty.

    PubMed

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that best determinants of well-being are resilience, uncertainty, social support, and gender that accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding on how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  9. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  10. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  11. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement of many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten times as long* as the latter. The filter correction step also furnishes a statistically rigorous *prediction error* which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
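
    For context, the classical Mahalanobis distance that this metric extends can be computed as follows under a Gaussian assumption (a generic sketch; the names are ours, not the paper's):

      import numpy as np

      def mahalanobis_sq(x, mean, cov):
          """Squared Mahalanobis distance; chi-squared with dim(x) dof if Gaussian."""
          d = np.asarray(x, float) - np.asarray(mean, float)
          return float(d @ np.linalg.solve(cov, d))

      # Toy 2-D example: is an observation statistically consistent with a track?
      mean = np.array([0.0, 0.0])
      cov = np.array([[4.0, 1.0],
                      [1.0, 2.0]])
      print(mahalanobis_sq([2.0, 1.0], mean, cov))  # compare against a chi^2 quantile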

  12. Equivalence of wave-particle duality to entropic uncertainty.

    PubMed

    Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie

    2014-01-01

    Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter. PMID:25524138

  13. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program-approved MGA depletion schedule in the Mass Properties Control Plan. If the specification is an NTE, MGA is inclusive of actual MGA (A5 & A6). If the specification is not an NTE (e.g. nominal), then MGA values are reduced by A5 values and A5 represents the remaining uncertainty. Basic mass = engineering estimate based on design and construction principles with no embedded margin. MGA mass = basic mass * assessed % from the approved MGA schedule. Predicted mass = basic + MGA. Aggregate MGA % = (aggregate predicted - aggregate basic) / aggregate basic.
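
    A minimal sketch of this arithmetic (the item names, masses, and assessed percentages are invented for illustration):

      # Basic mass: engineering estimate with no embedded margin (kg).
      basic = {"structure": 120.0, "avionics": 35.0, "harness": 18.0}

      # MGA percentages assessed per item from an approved depletion schedule.
      mga_pct = {"structure": 0.12, "avionics": 0.20, "harness": 0.30}

      # Predicted mass = basic + MGA, item by item.
      predicted = {k: m * (1.0 + mga_pct[k]) for k, m in basic.items()}

      agg_basic = sum(basic.values())
      agg_pred = sum(predicted.values())
      agg_mga_pct = (agg_pred - agg_basic) / agg_basic   # aggregate MGA %

      print(f"aggregate basic {agg_basic:.1f} kg, predicted {agg_pred:.1f} kg, "
            f"MGA {100 * agg_mga_pct:.1f}%")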

  14. Multiresolutional models of uncertainty generation and reduction

    NASA Technical Reports Server (NTRS)

    Meystel, A.

    1989-01-01

    Kolmogorov's axiomatic principles of probability theory are reconsidered in the scope of their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified in order to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies algorithms of learning which are organized in three groups reflecting the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by, and consistent with, the multiresolutional model of knowledge representation which is reflected in the structure of models and the algorithms of learning.

  15. Comment on "Uncertainty in measurements of distance"

    E-print Network

    Y. Jack Ng; H. van Dam

    2002-09-06

    We have argued that quantum mechanics and general relativity give a lower bound $\\delta l \\gtrsim l^{1/3} l_P^{2/3}$ on the measurement uncertainty of any distance $l$ much greater than the Planck length $l_P$. Recently Baez and Olson have claimed that one can go below this bound by attaching the measuring device to a massive elastic rod. Here we refute their claim. We also reiterate (and invite our critics to ponder on) the intimate relationship and consistency between black hole physics (including the holographic principle) and our bound on distance measurements.

  16. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  17. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

  18. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  19. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    PubMed Central

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail. PMID:26118488

  20. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-Shui

    2015-06-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail.

  1. Prediction uncertainty evaluation methods of core performance parameters in large liquid-metal fast breeder reactors

    SciTech Connect

    Takeda, T.; Yoshimura, A. . Faculty of Engineering); Kamei, T. ); Shirakata, K. )

    1989-10-01

    Formulas for predicting the uncertainty of neutronic performance parameters are derived for three methods: the bias factor method, the adjustment method, and the combined method. The prediction uncertainties are obtained by including both experimental and method errors. The adjustment method, in principle, yields the same uncertainty as the combined method. The derived formulas are applied to a large homogeneous 1000-MW (electric) liquid-metal fast breeder reactor core.

  2. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory.

    PubMed

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail. PMID:26118488

  3. Position-momentum uncertainty relations based on moments of arbitrary order

    SciTech Connect

    Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.

    2011-05-15

    The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.

  4. Principles of Modern Soccer.

    ERIC Educational Resources Information Center

    Beim, George

    This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…

  5. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  6. ESTIMATING UNCERTAINTIES FOR GEOPHYSICAL

    E-print Network

    Kreinovich, Vladik

    Statistical and interval techniques for evaluating the uncertainties associated with geophysical tomographic inversion of the data are presented. In geophysics, we usually know the equations that describe the propagation…

  7. ESTIMATING UNCERTAINTIES FOR GEOPHYSICAL

    E-print Network

    Kreinovich, Vladik

    Statistical and interval techniques for evaluating the uncertainties associated with geophysical tomographic inversion of the data are presented. In geophysics, we usually know the equations that describe the propagation…

  8. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  9. The precautionary principle and ecological hazards of genetically modified organisms.

    PubMed

    Giampietro, Mario

    2002-09-01

    This paper makes three points relevant to the application of the precautionary principle to the regulation of GMOs. i) The unavoidable arbitrariness in the application of the precautionary principle reflects a deeper epistemological problem affecting scientific analyses of sustainability. This requires understanding the difference between the concepts of "risk", "uncertainty" and "ignorance". ii) When dealing with evolutionary processes it is impossible to ban uncertainty and ignorance from scientific models. Hence, traditional risk analysis (probability distributions and exact numerical models) becomes powerless. Other forms of scientific knowledge (general principles or metaphors) may be useful alternatives. iii) The existence of ecological hazards per se should not be used as a reason to stop innovations altogether. However, the precautionary principle entails that scientists move away from the concept of "substantive rationality" (trying to indicate to society optimal solutions) to that of "procedural rationality" (trying to help society to find "satisficing" solutions). PMID:12436844

  10. Deterministic Elaboration of Heisenberg's Uncertainty Relation and the Nowhere Differentiability

    E-print Network

    Faycal Ben Adda; Helene Porchon

    2013-08-18

    In this paper the uncertainty principle is found via characteristics of continuous and nowhere differentiable functions. We prove that any physical system that has a continuous and nowhere differentiable position function is subject to an uncertainty in the simultaneous determination of values of its physical properties. The uncertainty in the simultaneous knowledge of the position deviation and the average rate of change of this deviation is found to be governed by a relation equivalent to the one discovered by Heisenberg in 1927. Conversely, we prove that any physical system with a continuous position function that is subject to an uncertainty relation must have a nowhere differentiable position function, which makes the set of continuous and nowhere differentiable functions a candidate for the quantum world.

  11. The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.

    PubMed

    Antonopoulou, Lila; van Meurs, Philip

    2003-11-01

    The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as it is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof of this principle has been produced based on the "cause-effect" model. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges: the case of "mad cow" disease, the case of production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health. PMID:14585517

  12. Implementing the Precautionary Principle: Incorporating Science, Technology, Fairness, and Accountability in Environmental, Health and Safety Decisions

    E-print Network

    Ashford, Nicholas

    2005-01-01

    The precautionary principle is in sharp political focus today because (1) the nature of scientific uncertainty is changing and (2) there is increasing pressure to base governmental action on allegedly more "rational" ...

  13. Participatory Development Principles and Practice: Reflections of a Western Development Worker.

    ERIC Educational Resources Information Center

    Keough, Noel

    1998-01-01

    Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

  14. Essays on uncertainty in economics

    E-print Network

    Simsek, Alp

    2010-01-01

    This thesis consists of four essays about "uncertainty" and how markets deal with it. Uncertainty is about subjective beliefs, and thus it often comes with heterogeneous beliefs that may be present temporarily or even ...

  15. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2°C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  16. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs Sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
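
    BHPMF itself couples hierarchical priors over the taxonomy with Gibbs sampling, which is beyond a short excerpt. The sketch below (plain regularized matrix factorization on an invented trait matrix, with the spread over an ensemble of random restarts standing in for the per-prediction uncertainty) only illustrates the gap-filling idea; it is not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy sparse "trait matrix": rows = species, cols = traits, NaN = gap.
    X = rng.normal(size=(30, 5)) @ rng.normal(size=(5, 8))   # low-rank truth
    mask = rng.random(X.shape) < 0.6                          # observed entries
    X_obs = np.where(mask, X, np.nan)

    def pmf_fill(X_obs, rank=5, n_iter=200, lam=0.1, seed=0):
        """Fill gaps by regularized alternating least squares (plain PMF)."""
        rng = np.random.default_rng(seed)
        n, m = X_obs.shape
        U = rng.normal(scale=0.1, size=(n, rank))
        V = rng.normal(scale=0.1, size=(m, rank))
        obs = ~np.isnan(X_obs)
        for _ in range(n_iter):
            for i in range(n):                   # update row factors
                j = obs[i]
                A = V[j].T @ V[j] + lam * np.eye(rank)
                U[i] = np.linalg.solve(A, V[j].T @ X_obs[i, j])
            for k in range(m):                   # update column factors
                i = obs[:, k]
                A = U[i].T @ U[i] + lam * np.eye(rank)
                V[k] = np.linalg.solve(A, U[i].T @ X_obs[i, k])
        return U @ V.T

    # Ensemble over random restarts: the spread at each gap is a crude
    # stand-in for the per-prediction uncertainty that BHPMF gets from MCMC.
    fills = np.stack([pmf_fill(X_obs, seed=s) for s in range(10)])
    mean, std = fills.mean(axis=0), fills.std(axis=0)
    print("mean abs error at gaps:", np.abs(mean - X)[~mask].mean())
    ```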

  17. Deterministic uncertainty analysis

    SciTech Connect

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs.
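
    The borehole model itself is not reproduced here, but the heart of the DUA idea (one nominal run plus derivatives, propagated through a first-order surrogate instead of many full model runs) can be sketched as follows; the toy model and input distributions are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def model(x):
        """Toy stand-in for an expensive simulation (e.g., borehole flow)."""
        return np.log(1.0 + x[0] ** 2) + 3.0 * x[1] - 0.5 * x[0] * x[1]

    mu = np.array([1.0, 2.0])      # input means
    sigma = np.array([0.1, 0.2])   # input standard deviations (independent)

    # One nominal run plus finite-difference derivatives (an adjoint code
    # would deliver these gradients far more cheaply for large models).
    y0 = model(mu)
    h = 1e-6
    grad = np.array([(model(mu + h * e) - y0) / h for e in np.eye(2)])

    # First-order surrogate y ~ y0 + grad.(x - mu): for independent inputs
    # the output variance is the sum of squared sensitivity-weighted sigmas.
    std_dua = np.sqrt(np.sum((grad * sigma) ** 2))
    print(f"DUA:         mean={y0:.4f}  std={std_dua:.4f}")

    # Brute-force Monte Carlo reference (the many runs DUA tries to avoid).
    xs = rng.normal(mu, sigma, size=(100_000, 2))
    ys = model(xs.T)
    print(f"Monte Carlo: mean={ys.mean():.4f}  std={ys.std():.4f}")
    ```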

  18. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.

  19. Uncertainty Estimates of the EOF-derived North Atlantic Oscillation

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Magnusdottir, G.; Stern, H.; Tian, X.; Yu, Y.

    2013-12-01

    Different approaches to obtaining uncertainty estimates of the North Atlantic Oscillation (NAO) are explored. The resulting estimates are used to enhance our understanding of spatial variability of the NAO over different time periods. Among the parametric and non-parametric approaches investigated in this study, the bootstrap is non-parametric and not confined to the assumption of normally distributed data. It gives physically plausible uncertainty estimates. The NAO uncertainty estimates depend on sample size, with greater sampling error for smaller samples. The NAO uncertainty varies with time, but common features include that the most uncertain values are centered between the centers of action of the NAO and are asymmetric in the zonal direction (more uncertainty in the eastward direction, or downstream). The bootstrap can also be used to provide direct measures of uncertainty regarding the location of the NAO action centers. The uncertainty of the location of the NAO action centers not only helps assess the shift in the NAO but also provides evidence of more than two action centers. The methods reported on here could in principle be applied to any EOF-derived climate pattern.
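
    A minimal sketch of the bootstrap approach described above, with an invented NAO-like anomaly field: resample the years with replacement, recompute the leading EOF each time, and read the uncertainty off the spread of the resampled patterns.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic anomaly field: 60 "years" x 100 grid points, one dominant
    # dipole-like pattern plus noise (a stand-in for NAO variability).
    n_years, n_grid = 60, 100
    pattern = np.sin(np.linspace(0.0, 2.0 * np.pi, n_grid))
    data = (np.outer(rng.normal(size=n_years), pattern)
            + 0.5 * rng.normal(size=(n_years, n_grid)))

    def leading_eof(X):
        """First EOF (spatial pattern) of an anomaly matrix, via SVD."""
        X = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(X, full_matrices=False)
        v = vt[0]
        return v if v @ pattern > 0 else -v   # resolve the arbitrary sign

    # Bootstrap: resample years with replacement, recompute the leading EOF,
    # and use the pointwise spread as the uncertainty estimate.
    boots = np.stack([leading_eof(data[rng.integers(0, n_years, n_years)])
                      for _ in range(500)])
    spread = boots.std(axis=0)
    print("max pointwise EOF standard error:", spread.max())
    ```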

  20. Using Models that Incorporate Uncertainty

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.

    2002-01-01

    In this article, the author discusses the use in policy analysis of models that incorporate uncertainty. He believes that all models should consider incorporating uncertainty, but that at the same time it is important to understand that sampling variability is not usually the dominant driver of uncertainty in policy analyses. He also argues that…

  1. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty of digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  2. Uncertainty relations as Hilbert space geometry

    NASA Technical Reports Server (NTRS)

    Braunstein, Samuel L.

    1994-01-01

    Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable, not merely what is allowed.
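
    The 'direct bounds' from multiple sampling referred to above are of Cramér–Rao type: for N independent repetitions, any unbiased estimator of a parameter $\theta$ satisfies

    ```latex
    \Delta\theta \;\ge\; \frac{1}{\sqrt{N\,F(\theta)}}
    ```

    where $F(\theta)$ is the Fisher information of a single measurement.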

  3. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  4. Direct tests of measurement uncertainty relations: what it takes.

    PubMed

    Busch, Paul; Stevens, Neil

    2015-02-20

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables. PMID:25763941

  5. Instructional Software Design Principles.

    ERIC Educational Resources Information Center

    Hazen, Margret

    1985-01-01

    Discusses learner/computer interaction, learner control, sequencing of instructional events, and graphic screen design as effective principles for the design of instructional software, including tutorials. (MBR)

  6. Defending principlism well understood.

    PubMed

    Quante, Michael; Vieth, Andreas

    2002-12-01

    After presenting the current version of principlism, in the process repudiating a widespread deductivist misinterpretation, a fundamental metaethical disagreement is developed by outlining the deductivistic critique of principlism. Once the grounds for this critique have been understood, the dispute between casuistry, deductivism and principlism can be restructured, and the model of "application" proven to be the central difference. In the concluding section it is argued that principlism is the most attractive position, if the perceptual model of weak intuitionism is made more explicit. PMID:12607161

  7. Physical principles of hearing

    NASA Astrophysics Data System (ADS)

    Martin, Pascal

    2015-10-01

    The following sections are included: * Psychophysical properties of hearing * The cochlear amplifier * Mechanosensory hair cells * The "critical" oscillator as a general principle of auditory detection * Bibliography

  8. Group environmental preference aggregation: the principle of environmental justice

    SciTech Connect

    Davos, C.A.

    1986-01-01

    The aggregation of group environmental preference presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as environmental justice, is established based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be so decided that their supporters will least mind being anyone at random in the new environment. The application of the principle is also discussed. Its only information requirement is a ranking of alternative choices by each interested party. 25 references.

  9. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. As a result, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  10. Evaluation of measurement uncertainty based on Bayesian information fusion

    NASA Astrophysics Data System (ADS)

    Wang, Shan; Chen, Xiaohuai; Yang, Qiao

    2013-10-01

    This paper presents a new method for evaluating measurement uncertainty that takes account of both historical records and new data. Using Bayesian statistical principles, the prior distribution provided by the records and the likelihood provided by the data are combined into a posterior distribution. The estimates of the statistical characteristic parameters are derived from the posterior distribution, yielding an uncertainty formula that combines the advantages of Type A and Type B evaluations. Simulation and verification show that this method has clear advantages over the alternatives, especially for the analysis of small data sets.
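
    The paper's formula is not reproduced in the abstract; the standard conjugate-Gaussian version of such a fusion (a Type B prior from historical records combined with Type A information from new measurements) would look roughly like this, with invented numbers:

    ```python
    import numpy as np

    # Type B prior from historical records: value ~ N(mu0, tau0^2).
    mu0, tau0 = 10.00, 0.05

    # New measurements (Type A information), assumed N(theta, sigma^2)
    # with known instrument sigma.
    data = np.array([10.03, 10.05, 10.02, 10.06])
    sigma = 0.04
    n, xbar = len(data), data.mean()

    # Conjugate Gaussian update: precisions add, means combine
    # precision-weighted.
    post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
    post_mean = post_var * (mu0 / tau0**2 + n * xbar / sigma**2)

    print(f"posterior mean = {post_mean:.4f}")
    print(f"combined standard uncertainty = {np.sqrt(post_var):.4f}")
    ```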

  11. Principles of Finance (Lecturer: Dick Davies; Tutor: Athanasios Tsekeris)

    E-print Network

    Judd, Martin

    …of financial decision taking and the theory of finance. It will develop the basic principles of valuation, the nature of risk and uncertainty, and the relationship between risk and returns. …

  12. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1972-01-01

    Satellite altitude determination uncertainties will be discussed from the standpoint of the GEOS-C satellite, from the longer range viewpoint afforded by the Geopause concept. The discussion focuses on methods for short-arc tracking which are essentially geometric in nature. One uses combinations of lasers and collocated cameras. The other method relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

  13. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then the collection of PDFs along a given slice is presented vertically above the slice to form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly, since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary for multimodal distributions. The uncertainty of the multi-valued data can also be interpreted from the number of peaks and the widths of the peaks, as shown by the PDF walls.
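
    A rough sketch of the PDF-wall computation (a smooth density estimate per grid cell along a transect, with roughness taken as the number of peaks) might look like the following; the multi-valued data are synthetic and this is not the authors' code.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(3)

    # Multi-valued data along one transect: 20 grid cells, 200 possible
    # outcomes per cell (e.g., from an ensemble of model runs).
    cells = [rng.normal(loc=i * 0.1, scale=0.5 + 0.02 * i, size=200)
             + (rng.random(200) < 0.3) * 2.0    # second mode in some cells
             for i in range(20)]

    grid = np.linspace(-3, 6, 300)
    for i, samples in enumerate(cells):
        pdf = gaussian_kde(samples)(grid)       # smooth density estimate
        # Roughness = number of interior peaks of the estimated PDF.
        peaks = np.sum((pdf[1:-1] > pdf[:-2]) & (pdf[1:-1] > pdf[2:]))
        print(f"cell {i:2d}: roughness (modes) = {peaks}")
    ```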

  14. Mach's holographic principle

    SciTech Connect

    Khoury, Justin; Parikh, Maulik

    2009-10-15

    Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.

  15. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concepts of flow functions, the application of the Reynolds lubrication equation, non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.

  16. Generalizing Landauer's principle

    NASA Astrophysics Data System (ADS)

    Maroney, O. J. E.

    2009-03-01

    In a recent paper [Stud. Hist. Philos. Mod. Phys. 36, 355 (2005)] it is argued that to properly understand the thermodynamics of Landauer’s principle it is necessary to extend the concept of logical operations to include indeterministic operations. Here we examine the thermodynamics of such operations in more detail, extending the work of Landauer to include indeterministic operations and to include logical states with variable entropies, temperatures, and mean energies. We derive the most general statement of Landauer’s principle and prove its universality, extending considerably the validity of previous proofs. This confirms conjectures made that all logical operations may, in principle, be performed in a thermodynamically reversible fashion, although logically irreversible operations would require special, practically rather difficult, conditions to do so. We demonstrate a physical process that can perform any computation without work requirements or heat exchange with the environment. Many widespread statements of Landauer’s principle are shown to be special cases of our generalized principle.
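
    For reference, the special case being generalized is the familiar Landauer bound: erasing one bit of information in an environment at temperature T dissipates at least

    ```latex
    E_{\mathrm{diss}} \;\ge\; k_B T \ln 2
    ```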

  17. Teaching Quantum Uncertainty1

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper2 introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields.2 Both the Schroedinger field, or "matter field," and the EM field are made of "quanta"—spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  18. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  19. Uncertainty relation in Schwarzschild spacetime

    E-print Network

    Jun Feng; Yao-Zhong Zhang; Mark D. Gould; Heng Fan

    2015-02-27

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit $-\log_2 c$. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
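
    The intrinsic limit mentioned above is the complementarity term of the entropic uncertainty relation in the presence of quantum memory (Berta et al.): for observables Q and R measured on a system A entangled with a memory B,

    ```latex
    H(Q|B) + H(R|B) \;\ge\; \log_2\frac{1}{c} + H(A|B),
    \qquad c = \max_{j,k}\,\bigl|\langle \psi_j \,|\, \phi_k \rangle\bigr|^2
    ```

    where the maximum runs over the eigenvectors of Q and R; a negative conditional entropy H(A|B), i.e. entanglement with the memory, can pull the bound below $-\log_2 c$.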

  20. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.

  1. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  2. Multi-band pyrometer uncertainty analysis and improvement

    NASA Astrophysics Data System (ADS)

    Yang, Yongjun; Zhang, Xuecong; Cai, Jing; Wang, Zhongyu

    2011-05-01

    A multi-band pyrometer calculates the 'true' temperature of a measured surface from the ratios of radiant energy in several spectral bands. It has several advantages: it is largely insensitive to the emissivity of the measured surface and to environmental radiation, and it offers a higher signal-to-noise ratio and higher temperature measurement accuracy. This paper introduces the principle of a multi-band pyrometer, and the uncertainty of the measurement result is evaluated using the Monte-Carlo Method (MCM). The results show that the accuracy of the effective wavelength is the largest source of uncertainty, with the reference temperature the other main source. When an ordinary blackbody furnace with continuous temperature is used to provide the reference temperature and calibrate the effective wavelength, these uncertainty components are 2.17 K and 2.48 K respectively, and the combined standard uncertainty is 3.30 K. A new calibration method is introduced: the effective wavelength is calibrated by a monochromator, and the reference temperature is provided by a fixed-point blackbody furnace. The uncertainty components are then decreased to 0.73 K and 0.12 K respectively, and the measurement uncertainty is decreased to 0.74 K. The temperature measurement accuracy is thereby enhanced.

  3. Multi-band pyrometer uncertainty analysis and improvement

    NASA Astrophysics Data System (ADS)

    Yang, Yongjun; Zhang, Xuecong; Cai, Jing; Wang, Zhongyu

    2010-12-01

    A multi-band pyrometer calculates the 'true' temperature of a measured surface from the ratios of radiant energy in several spectral bands. It has several advantages: it is largely insensitive to the emissivity of the measured surface and to environmental radiation, and it offers a higher signal-to-noise ratio and higher temperature measurement accuracy. This paper introduces the principle of a multi-band pyrometer, and the uncertainty of the measurement result is evaluated using the Monte-Carlo Method (MCM). The results show that the accuracy of the effective wavelength is the largest source of uncertainty, with the reference temperature the other main source. When an ordinary blackbody furnace with continuous temperature is used to provide the reference temperature and calibrate the effective wavelength, these uncertainty components are 2.17 K and 2.48 K respectively, and the combined standard uncertainty is 3.30 K. A new calibration method is introduced: the effective wavelength is calibrated by a monochromator, and the reference temperature is provided by a fixed-point blackbody furnace. The uncertainty components are then decreased to 0.73 K and 0.12 K respectively, and the measurement uncertainty is decreased to 0.74 K. The temperature measurement accuracy is thereby enhanced.
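
    The paper's uncertainty budget can be imitated in miniature for the simplest two-band (ratio) case under Wien's approximation; the wavelengths, temperatures, and input standard uncertainties below are invented, not the paper's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    c2 = 1.4388e-2                    # second radiation constant, m*K

    def ratio_temperature(lam1, lam2, signal_ratio):
        """Two-band Wien-approximation temperature from the signal ratio."""
        return c2 * (1.0 / lam2 - 1.0 / lam1) / (
            np.log(signal_ratio) - 5.0 * np.log(lam2 / lam1))

    # Nominal effective wavelengths; the "measured" signal ratio is generated
    # from a reference temperature (all values invented for illustration).
    lam1, lam2 = 0.65e-6, 0.90e-6     # metres
    T_ref = 1800.0                    # K

    # Monte-Carlo: perturb the calibration quantities with their standard
    # uncertainties, regenerate the ratio from the perturbed physics, and
    # invert with the nominal calibration to see the induced spread in T.
    n = 200_000
    lam1_s = rng.normal(lam1, 0.002e-6, n)  # wavelength calibration
    lam2_s = rng.normal(lam2, 0.002e-6, n)
    T_ref_s = rng.normal(T_ref, 1.0, n)     # reference-temperature
    ratio_s = np.exp(5.0 * np.log(lam2_s / lam1_s)
                     + c2 / T_ref_s * (1.0 / lam2_s - 1.0 / lam1_s))

    T_mc = ratio_temperature(lam1, lam2, ratio_s)
    print(f"inferred T: mean={T_mc.mean():.1f} K, std (MCM)={T_mc.std():.2f} K")
    ```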

  4. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.

  5. Parameter uncertainty for ASP models

    SciTech Connect

    Knudsen, J.K.; Smith, C.L.

    1995-10-01

    The steps involved to incorporate parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are comprised of more than one component which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
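
    A minimal sketch of what such propagation looks like once lognormal uncertainty parameters have been assigned (a toy event sequence with invented medians and error factors) follows:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def lognormal_from_median_ef(median, error_factor, size):
        """Sample a lognormal given its median and 95th/50th error factor."""
        mu = np.log(median)
        sigma = np.log(error_factor) / 1.645  # EF = exp(1.645 * sigma)
        return rng.lognormal(mu, sigma, size)

    n = 100_000
    # Basic-event failure probabilities with lognormal uncertainty.
    p_pump = lognormal_from_median_ef(1e-3, 3.0, n)
    p_valve = lognormal_from_median_ef(5e-4, 5.0, n)
    p_power = lognormal_from_median_ef(1e-4, 10.0, n)

    # Toy sequence: (pump fails AND valve fails) OR power fails.
    p_seq = p_pump * p_valve + p_power - p_pump * p_valve * p_power

    lo, med, hi = np.percentile(p_seq, [5, 50, 95])
    print(f"sequence probability: 5%={lo:.2e}  median={med:.2e}  95%={hi:.2e}")
    ```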

  6. Space-Time Uncertainty and Noncommutativity in String Theory

    E-print Network

    Yoneya, T

    2001-01-01

    We analyze the nature of space-time nonlocality in string theory. After giving a brief overview of the conjecture of the space-time uncertainty principle, a (semi-classical) reformulation of string quantum mechanics, in which the dynamics is represented by the noncommutativity between temporal and spatial coordinates, is outlined. The formalism is then compared to the space-time noncommutative field theories associated with nonzero electric B-fields.

  7. How uncertainty bounds the shape index of simple cells.

    PubMed

    Barbieri, D; Citti, G; Sarti, A

    2014-01-01

    We propose a theoretical motivation to quantify actual physiological features, such as the shape index distributions measured by Jones and Palmer in cats and by Ringach in macaque monkeys. We will adopt the uncertainty principle associated with the task of detection of position and orientation as the main tool to provide quantitative bounds on the family of simple cells concretely implemented in primary visual cortex. Mathematics Subject Classification (2010): 62P10, 43A32, 81R15. PMID:24742044

  8. How Uncertainty Bounds the Shape Index of Simple Cells

    PubMed Central

    2014-01-01

    We propose a theoretical motivation to quantify actual physiological features, such as the shape index distributions measured by Jones and Palmer in cats and by Ringach in macaque monkeys. We will adopt the uncertainty principle associated with the task of detection of position and orientation as the main tool to provide quantitative bounds on the family of simple cells concretely implemented in primary visual cortex. Mathematics Subject Classification (2010): 62P10, 43A32, 81R15. PMID:24742044

  9. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1971-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite. GEOS-C will be tracked by a number of the conventional satellite tracking systems, as well as by two advanced systems; a satellite-to-satellite tracking system and lasers capable of decimeter accuracies which are being developed in connection with the Goddard Earth and Ocean Dynamics Applications program. The discussion is organized in terms of a specific type of GEOS-C orbit which would satisfy a number of scientific objectives including the study of the gravitational field by means of both the altimeter and the satellite-to-satellite tracking system, studies of tides, and the Gulf Stream meanders.

  10. Uncertainty Relations and Indistinguishable Particles

    E-print Network

    Cael L. Hasse

    2012-12-06

    We show that for fermion states, measurements of any two finite outcome particle quantum numbers (e.g. spin) are not constrained by a minimum total uncertainty. We begin by defining uncertainties in terms of the outputs of a measurement apparatus. This allows us to compare uncertainties between multi-particle states of distinguishable and indistinguishable particles. Entropic uncertainty relations are derived for both distinguishable and indistinguishable particles. We then derive upper bounds on the minimum total uncertainty for bosons and fermions. These upper bounds apply to any pair of particle quantum numbers and depend only on the number of particles N and the number of outcomes n for the quantum numbers. For general N, these upper bounds necessitate a minimum total uncertainty much lower than that for distinguishable particles. The fermion upper bound on the minimum total uncertainty for N an integer multiple of n is zero. Our results show that uncertainty limits derived for single particle observables are valid only for particles that can be effectively distinguished. Outside this range of validity, the apparent fundamental uncertainty limits can be overcome.

  11. On Uncertainties in Successive Measurements

    E-print Network

    Distler, Jacques

    2012-01-01

    When you measure an observable, A, in Quantum Mechanics, the state of the system changes. This, in turn, affects the quantum-mechanical uncertainty in some non-commuting observable, B. The standard Uncertainty Relation puts a lower bound on the uncertainty of B in the initial state. What is relevant for a subsequent measurement of B, however, is the uncertainty of B in the post-measurement state. We make some remarks on the latter problem, both in the case where A has a pure point spectrum and in the case where A has a continuous spectrum.

  12. Evaluating uncertainty in simulation models

    SciTech Connect

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
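
    As an illustration of the variance-based importance measures mentioned above, the following sketch estimates first-order Sobol'-type indices for a toy model by the crude freeze-one-input, conditional-mean approach (the model and sample sizes are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def model(x1, x2, x3):
        """Toy simulation response."""
        return np.sin(x1) + 5.0 * x2**2 + 0.2 * x3

    n_outer, n_inner = 500, 500
    total = model(*rng.uniform(-1, 1, size=(3, 200_000)))
    var_y = total.var()

    for i, name in enumerate(["x1", "x2", "x3"]):
        # E[Y | X_i] estimated by fixing X_i and averaging over the others.
        cond_means = []
        for xi in rng.uniform(-1, 1, n_outer):
            xs = rng.uniform(-1, 1, size=(3, n_inner))
            xs[i] = xi
            cond_means.append(model(*xs).mean())
        s_i = np.var(cond_means) / var_y      # first-order Sobol' index
        print(f"S_{name} = {s_i:.2f}")
    ```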

  13. Quantum uncertainty in distance measurement: Holography and black hole thermodynamics

    E-print Network

    Michael Maziashvili

    2006-04-01

    Exact conditions on the clock parameters corresponding to the minimal uncertainty in distance measurement are derived in a uniform manner for any number of space-time dimensions. The result espouses the holography principle no matter what the number of space-time dimensions is. In this context the ADD braneworld model is considered. Some remarks are made on deviations from holography, as well as from special relativity, at the scales provided by the cosmological constant. We also comment on the potential influence of the background radiation on the uncertainty in length measurement. The presence of unavoidable quantum uncertainty in length measurement results in fluctuations of the black hole thermodynamics that may be of interest in addressing the information loss problem. The quantum corrections to the black hole entropy obtained in various scenarios are imperceptible because of these fluctuations. At the Planck scale the fluctuations destroy the thermodynamic picture of the black hole.

  14. Climate model uncertainty versus conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-09-01

    Projections of climate change impact are associated with a cascade of uncertainties including in CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.

  15. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  16. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  17. Planning ATES systems under uncertainty

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions form a complex adaptive system, for which agent-based modelling provides a useful analysis framework. This study therefore explores the interactions between endogenous ATES adoption processes and the relative performance of different planning schemes, using an agent-based adoption model coupled with a hydrologic model of the subsurface. The models are parameterized to simulate typical operating conditions for ATES systems in a dense urban area. Furthermore, uncertainties relating to planning parameters, adoption processes, and climatic conditions are explicitly considered using exploratory modelling techniques. Results are therefore presented for the performance of different planning policies over a broad range of plausible scenarios.

  18. Generation under Uncertainty

    E-print Network

    Oliver Lemon, Heriot-Watt University, Edinburgh, United Kingdom. …and Lemon, 2010). The issue of uncertainty for referring expression generation has been discussed before… you cannot know with certainty how they will respond to it (Rieser and Lemon, 2009; Rieser et al.…

  19. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  20. Housing Uncertainty and Childhood Impatience

    ERIC Educational Resources Information Center

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  1. Robust SAR ATR by hedging against uncertainty

    NASA Astrophysics Data System (ADS)

    Hoffman, John R.; Mahler, Ronald P. S.; Ravichandran, Ravi B.; Huff, Melvyn; Musick, Stanton

    2002-07-01

    For the past two years in this conference, we have described techniques for robust identification of motionless ground targets using single-frame Synthetic Aperture Radar (SAR) data. By robust identification, we mean the problem of determining target ID despite the existence of confounding statistically uncharacterizable signature variations. Such variations can be caused by effects such as mud, dents, attachment of nonstandard equipment, nonstandard attachment of standard equipment, turret articulations, etc. When faced with such variations, optimal approaches can often behave badly, e.g., by mis-identifying a target type with high confidence. A basic element of our approach has been to hedge against unknowable uncertainties in the sensor likelihood function by specifying a random error bar (random interval) for each value of the likelihood function corresponding to any given value of the input data. In this paper, we will summarize our recent results. This will include a description of the fuzzy maximum a posteriori (MAP) estimator. The fuzzy MAP estimate is essentially the set of conventional MAP estimates that are plausible, given the assumed uncertainty in the problem. Despite its name, the fuzzy MAP is derived rigorously from first probabilistic principles based on random interval theory.
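
    The fuzzy-MAP construction is developed in the paper itself; the following much-simplified sketch only illustrates the idea of a set of plausible MAP estimates when likelihoods are known only up to intervals (all numbers and target names are invented):

    ```python
    # Interval-valued likelihoods for one observed SAR signature: for each
    # candidate target type, [lower, upper] bounds on p(data | type).
    likelihoods = {
        "T72":   (0.20, 0.60),
        "BMP2":  (0.25, 0.45),
        "BTR70": (0.05, 0.15),
    }
    priors = {"T72": 1 / 3, "BMP2": 1 / 3, "BTR70": 1 / 3}

    # A type is a plausible MAP estimate if its best case (upper posterior
    # weight) is not beaten by some rival's worst case (lower weight).
    def plausible_map(likelihoods, priors):
        upper = {t: priors[t] * hi for t, (_, hi) in likelihoods.items()}
        lower = {t: priors[t] * lo for t, (lo, _) in likelihoods.items()}
        return {t for t in likelihoods
                if all(upper[t] >= lower[r] for r in likelihoods if r != t)}

    print(plausible_map(likelihoods, priors))   # here: {'T72', 'BMP2'}
    ```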

  2. Uncertainty relation in Schwarzschild spacetime

    E-print Network

    Feng, Jun; Gould, Mark D; Fan, Heng

    2015-01-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed by a proper uncertainty game experimentally. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole, which triggers an effectively reduced uncert...

  3. Uncertainty in spatial data mining

    NASA Astrophysics Data System (ADS)

    Mei, Kun; Tian, Yangge; Bian, Fulin

    2007-11-01

    Spatial data mining, i.e., mining knowledge from large amounts of spatial data, is a demanding field since huge amounts of spatial data have been collected in various applications. The collected data far exceeds people's ability to analyze it. Thus, new and efficient methods are needed to discover knowledge from large spatial databases. Most spatial data mining methods do not take into account the uncertainty of spatial information. In our work we use objects with broad boundaries, a concept that absorbs all the uncertainty by which spatial data is commonly affected and allows computations in the presence of uncertainty without rough simplifications of reality. We propose an uncertainty model that enables efficient analysis of such data. The case study of suitable flounder fishery search indicates the benefit of uncertainty research in spatial data mining.

  4. Uncertainties in Arctic precipitation

    NASA Astrophysics Data System (ADS)

    Majhi, Ipshita; Alexeev, Vladimir; Cherry, Jessica; Groisman, Pasha; Cohen, Judah

    2013-04-01

    Precipitation is an essential and highly variable component of the freshwater budget, and solid precipitation, in particular, has a major impact on the local and global climate. The impacts of snow on the surface energy balance are tremendous, as snow has a higher albedo than any other naturally occurring surface condition. Documenting the instrumentally observed precipitation climate records presents its own challenges since the stations themselves undergo many changes in the course of their operation. Though it is crucial to accurately measure precipitation as a means to predict change in future water budgets, estimates of long-term precipitation are riddled with measurement biases. Some of the challenges facing reliable measurement of solid precipitation include missing data, gage change, discontinued stations, trace precipitation, blizzards, wetting losses when emptying the gage, and evaporation between the time of event and the time of measurement. Rain measurements likewise face uncertainties such as splashing of rain out of the gage, evaporation, and extreme events, though the magnitude of these impacts on overall measurement is less than that faced by solid precipitation. In all, biases can be so significant that they present major problems for the use of precipitation data in climate studies.

  5. Uncertainties in Arctic Precipitation

    NASA Astrophysics Data System (ADS)

    Majhi, I.; Alexeev, V. A.; Cherry, J. E.; Cohen, J. L.; Groisman, P. Y.

    2012-12-01

    Arctic precipitation measurements are riddled with biases; addressing this problem is imperative. Our study focuses on comparing various datasets for the region of Siberia, analyzing their biases, and highlighting the caution needed when using them. Five sources of data were used: NOAA's products (the raw data and Bogdanova's correction), Yang's correction technique, and two reanalysis products (ERA-Interim and NCEP). The reanalysis datasets performed better for some months in comparison to Yang's product, which tends to overestimate precipitation, and the raw dataset, which tends to underestimate. The sources of bias vary from topography to wind to missing data. The final three products chosen show higher biases during the winter and spring seasons. Emphasis on equations which incorporate blizzards, blowing snow and higher wind speed is necessary for regions influenced by any or all of these factors. Bogdanova's correction technique is the most robust of all the datasets analyzed and gives the most reasonable results. One of our future goals is to analyze the impact of precipitation uncertainties on water budget analysis for the Siberian Rivers.

  6. The Bayesian brain: phantom percepts resolve sensory uncertainty.

    PubMed

    De Ridder, Dirk; Vanneste, Sven; Freeman, Walter

    2014-07-01

    Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises as to why the brain creates these false percepts in the absence of an external stimulus. The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they are not worth conscious processing. Unpredictable things, on the other hand, are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience and stress and essential for stimulus detection. Associated with sensory cortex hyperactivity and decreased inhibition or map plasticity, this will result in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation. In conclusion, the Bayesian updating of knowledge via active sensory exploration of the environment, driven by the Shannonian free-energy principle, provides an explanation for the generation of phantom percepts, as a way to reduce uncertainty and to make sense of the world. PMID:22516669
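
    The core claim, that only unpredictable input reduces uncertainty, can be made concrete with a toy Bayesian update. The sketch below is illustrative only, not the authors' model; all numbers are invented.

    import math

    def entropy(p):
        """Shannon entropy (bits) of a discrete distribution."""
        return -sum(q * math.log2(q) for q in p if q > 0)

    def bayes_update(prior, likelihoods):
        unnorm = [pr * lk for pr, lk in zip(prior, likelihoods)]
        z = sum(unnorm)
        return [u / z for u in unnorm]

    prior = [0.5, 0.5]          # two competing hypotheses about the world
    informative = [0.9, 0.2]    # P(observation | hypothesis): surprising input
    predictable = [0.5, 0.5]    # completely predictable input

    post = bayes_update(prior, informative)
    print(entropy(prior), "->", entropy(post))   # uncertainty drops
    post = bayes_update(prior, predictable)
    print(entropy(prior), "->", entropy(post))   # uncertainty unchanged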

  7. Applying the four principles.

    PubMed

    Macklin, R

    2003-10-01

    Gillon is correct that the four principles provide a sound and useful way of analysing moral dilemmas. As he observes, the approach using these principles does not provide a unique solution to dilemmas. This can be illustrated by alternatives to Gillon's own analysis of the four case scenarios. In the first scenario, a different set of factual assumptions could yield a different conclusion about what is required by the principle of beneficence. In the second scenario, although Gillon's conclusion is correct, what is open to question is his claim that what society regards as the child's best interest determines what really is in the child's best interest. The third scenario shows how it may be reasonable for the principle of beneficence to take precedence over autonomy in certain circumstances, yet like the first scenario, the ethical conclusion relies on a set of empirical assumptions and predictions of what is likely to occur. The fourth scenario illustrates how one can draw different conclusions based on the importance given to the precautionary principle. PMID:14519836

  8. One-dimensional hydrogen atom with minimal length uncertainty and maximal momentum

    E-print Network

    Pouria Pedram

    2013-02-16

    We present exact energy eigenvalues and eigenfunctions of the one-dimensional hydrogen atom in the framework of the Generalized (Gravitational) Uncertainty Principle (GUP). This form of GUP is consistent with various theories of quantum gravity such as string theory, loop quantum gravity, black-hole physics, and doubly special relativity and implies a minimal length uncertainty and a maximal momentum. We show that the quantized energy spectrum exactly agrees with the semiclassical results.
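
    The record does not quote the deformed algebra itself. A commonly used form consistent with the description (minimal length together with maximal momentum; cf. the Ali-Das-Vagenas proposal) is

    $[X, P] = i\hbar\left(1 - \alpha P + 2\alpha^2 P^2\right)$,

    which yields a minimal length uncertainty $\Delta X_{\min} \sim \alpha\hbar$ and a maximal momentum $P_{\max} \sim 1/\alpha$; the exact algebra and coefficients used in the paper may differ.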

  9. A Higher Order GUP with Minimal Length Uncertainty and Maximal Momentum

    E-print Network

    Pouria Pedram

    2012-10-19

    We present a higher order generalized (gravitational) uncertainty principle (GUP) in the form $[X,P]=i\\hbar/(1-\\beta P^2)$. This form of GUP is consistent with various proposals of quantum gravity such as string theory, loop quantum gravity, doubly special relativity, and predicts both a minimal length uncertainty and a maximal observable momentum. We show that the presence of the maximal momentum results in an upper bound on the energy spectrum of the momentum eigenstates and the harmonic oscillator.
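
    From the quoted commutator, the structure of the bound follows directly; as a sketch,

    $\Delta X \, \Delta P \ge \frac{1}{2}\left|\langle [X,P] \rangle\right| = \frac{\hbar}{2}\left\langle \frac{1}{1-\beta P^2} \right\rangle$,

    and since the right-hand side diverges as $P^2 \to 1/\beta$, physical momenta are confined below $P_{\max} = 1/\sqrt{\beta}$, the maximal observable momentum referred to in the abstract; the precise coefficient of the minimal length uncertainty requires the paper's full minimization.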

  10. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however, a high-accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error, with maxima in the highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.
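
    The statistical step described above reduces, per interrogation window, to taking the mean and dispersion of the disparity vectors. A minimal sketch follows; the array layout and the conversion of dispersion into a random-error estimate are illustrative assumptions, not the published algorithm.

    import numpy as np

    def disparity_uncertainty(disparity):
        """disparity: (N, 2) array of particle-image disparity vectors
        (pixels) for N matched pairs inside one interrogation window."""
        d = np.asarray(disparity, dtype=float)
        n = d.shape[0]
        bias = d.mean(axis=0)              # mean disparity: systematic error
        sigma = d.std(axis=0, ddof=1)      # dispersion of the disparities
        random_err = sigma / np.sqrt(n)    # random error of the window mean
        return bias, random_err

    rng = np.random.default_rng(0)
    fake = rng.normal(loc=[0.05, -0.02], scale=0.1, size=(40, 2))
    bias, rand = disparity_uncertainty(fake)
    print("systematic:", bias, "random:", rand)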

  11. Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation

    SciTech Connect

    Smith, D.L.; Otuka, N.

    2012-12-15

    This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.
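
    The bookkeeping the paper advocates can be illustrated by building a covariance matrix from partial uncertainties, with an uncorrelated statistical component and a fully correlated normalization component. The sketch below is a generic example with invented numbers, not an EXFOR-specific tool.

    import numpy as np

    values = np.array([1.20, 0.95, 0.80])   # measured quantities
    stat = np.array([0.03, 0.02, 0.02])     # statistical (uncorrelated), absolute
    norm_frac = 0.02                        # 2% common normalization uncertainty

    cov = np.diag(stat**2)                  # uncorrelated part
    norm = norm_frac * values               # fully correlated part
    cov += np.outer(norm, norm)

    total = np.sqrt(np.diag(cov))           # total uncertainty per point
    corr = cov / np.outer(total, total)     # implied correlation matrix
    print("total:", total)
    print("correlations:\n", np.round(corr, 3))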

  12. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is that of applied statistics.
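
    The record does not reproduce the derived expressions; the standard result such derivations typically reduce to is the ideal radiometer equation,

    $\Delta T_{\min} = \frac{T_{\rm sys}}{\sqrt{B\,\tau}}$,

    where $T_{\rm sys}$ is the system noise temperature, $B$ the predetection bandwidth, and $\tau$ the integration time; a modulated (Dicke-type) receiver pays a sensitivity penalty of a factor of about 2 relative to the total power configuration.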

  13. Uncertainties in 4πβ-γ coincidence counting

    NASA Astrophysics Data System (ADS)

    Fitzgerald, R.; Bailat, C.; Bobin, C.; Keightley, J. D.

    2015-06-01

    The 4πβ-γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and specific implementation of these methods have been well-documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill in these gaps, the present article shares best practices from a few practitioners of this craft. We explain and demonstrate with examples how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
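
    The record does not restate the formalism; the idealized relation underlying the method is worth recalling. For a source of activity $N_0$ observed with beta efficiency $\varepsilon_\beta$ and gamma efficiency $\varepsilon_\gamma$, the ideal counting rates are $N_\beta = N_0 \varepsilon_\beta$, $N_\gamma = N_0 \varepsilon_\gamma$ and $N_c = N_0 \varepsilon_\beta \varepsilon_\gamma$, so that

    $\frac{N_\beta \, N_\gamma}{N_c} = N_0$,

    independently of the efficiencies; in practice $N_\beta N_\gamma / N_c$ is extrapolated against the inefficiency parameter $(1-\varepsilon_\beta)/\varepsilon_\beta$, and the uncertainty components discussed above enter through that extrapolation.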

  14. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24 h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
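
    A minimal sketch of the resulting error model follows: the counting (Poisson) component is combined with the lognormal normalization component whose log-space sigma (the natural logarithm of the geometric standard deviation) corresponds to the quoted 0.31-0.35 range. The quadrature combination in log space is an illustrative assumption.

    import math

    def total_log_uncertainty(counts, sigma_norm=0.33):
        """Approximate standard deviation of ln(measured activity)."""
        sigma_count = 1.0 / math.sqrt(counts)  # relative Poisson error;
                                               # ~ log-space sigma when small
        return math.sqrt(sigma_count**2 + sigma_norm**2)

    for counts in (25, 100, 10_000):
        s = total_log_uncertainty(counts)
        print(counts, "counts -> geometric SD ~", round(math.exp(s), 3))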

  15. Uncertainty in perception and the Hierarchical Gaussian Filter

    PubMed Central

    Mathys, Christoph D.; Lomakina, Ekaterina I.; Daunizeau, Jean; Iglesias, Sandra; Brodersen, Kay H.; Friston, Karl J.; Stephan, Klaas E.

    2014-01-01

    In its full sense, perception rests on an agent's model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the Hierarchical Gaussian Filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF's hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder–Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient—but at the same time intuitive—framework for the resolution of perceptual uncertainty in behaving agents. PMID:25477800
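
    Schematically, the one-step updates referred to above have the generic precision-weighted prediction-error form

    $\mu_i^{(k)} = \mu_i^{(k-1)} + \frac{\hat{\pi}_{i-1}}{\pi_i}\,\delta_{i-1}$,

    where $\mu_i$ and $\pi_i$ are the posterior mean and precision at hierarchy level $i$, $\hat{\pi}_{i-1}$ is the precision of the prediction about the level below, and $\delta_{i-1}$ is the prediction error there; the exact HGF update equations carry additional level-specific terms given in the paper.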

  16. The principle of agency.

    PubMed

    Rachels, James

    1998-04-01

    The Principle of Agency says that if it would be good for a state of affairs to occur "naturally," then it is permissible to take action to bring it about. This contradicts the views of some bioethicists, who object to euthanasia, in vitro fertilization, and cloning, even though they acknowledge that the states of affairs produced are good. But the principle, or some form of it, seems inescapable. The opposite view -- that we may not, by our action, reproduce "natural" goods -- may owe its appeal to an implicitly religious view of nature. PMID:11655330

  17. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  18. Uncertainty in emissions projections for climate models

    E-print Network

    Webster, Mort David.; Babiker, Mustafa H.M.; Mayer, Monika.; Reilly, John M.; Harnisch, Jochen.; Hyman, Robert C.; Sarofim, Marcus C.; Wang, Chien.

    Future global climate projections are subject to large uncertainties. Major sources of this uncertainty are projections of anthropogenic emissions. We evaluate the uncertainty in future anthropogenic emissions using a ...

  19. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  20. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that the usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

  1. Climate targets: Values and uncertainty

    NASA Astrophysics Data System (ADS)

    Lempert, Robert J.

    2015-10-01

    Policymakers know that the risks associated with climate change mean they need to cut greenhouse-gas emissions. But uncertainty surrounding the likelihood of different scenarios makes choosing specific policies difficult.

  2. Estimations of uncertainties of frequencies

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Nicoletti, Jean-Marc; Morgenthaler, Stephan

    2015-08-01

    Diverse variable phenomena in the Universe are periodic. Astonishingly, many of the periodic signals present in stars have timescales coinciding with human ones (from minutes to years). The periods of signals often have to be deduced from time series which are irregularly sampled and sparse; furthermore, correlations between the brightness measurements and their estimated uncertainties are common. The uncertainty on the frequency estimation is reviewed. We explore the astronomical and statistical literature, in both cases of regular and irregular samplings. The frequency uncertainty depends on the signal-to-noise ratio, the frequency, and the observational timespan. The shape of the light curve should also intervene, since sharp features such as exoplanet transits, stellar eclipses, and rising branches of pulsating stars give stringent constraints. We propose several procedures (parametric and nonparametric) to estimate the uncertainty on the frequency, which are subsequently tested against simulated data to assess their performances.
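
    For orientation, a frequently quoted least-squares result (e.g., Montgomery & O'Donoghue) for a sinusoid of amplitude $A$ sampled $N$ times over a timespan $T$ with Gaussian noise of standard deviation $\sigma$ is, approximately,

    $\sigma_f \approx \sqrt{\frac{6}{N}} \; \frac{1}{\pi T} \; \frac{\sigma}{A}$,

    which displays the dependencies listed above: signal-to-noise ratio through $\sigma/A$ and observational timespan through $1/T$; irregular sampling and correlated noise modify this baseline.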

  3. Uncertainty Relation for Smooth Entropies

    NASA Astrophysics Data System (ADS)

    Tomamichel, Marco; Renner, Renato

    2011-03-01

    Uncertainty relations give upper bounds on the accuracy by which the outcomes of two incompatible measurements can be predicted. While established uncertainty relations apply to cases where the predictions are based on purely classical data (e.g., a description of the system’s state before measurement), an extended relation which remains valid in the presence of quantum information has been proposed recently [Berta et al., Nature Phys. 6, 659 (2010)]. Here, we generalize this uncertainty relation to one formulated in terms of smooth entropies. Since these entropies measure operational quantities such as extractable secret key length, our uncertainty relation is of immediate practical use. To illustrate this, we show that it directly implies security of quantum key distribution protocols. Our security claim remains valid even if the implemented measurement devices deviate arbitrarily from the theoretical model.
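
    For reference, the relation being generalized (Berta et al.) reads, in terms of von Neumann entropies,

    $S(X|B) + S(Z|B) \ge \log_2 \frac{1}{c} + S(A|B)$,

    where $X$ and $Z$ are the outcomes of the two measurements on system $A$, $B$ is the quantum memory, and $c$ is the maximal overlap between the two measurement bases; the paper replaces the von Neumann entropies by their smooth min- and max-entropy counterparts.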

  4. Uncertainty relation for smooth entropies.

    PubMed

    Tomamichel, Marco; Renner, Renato

    2011-03-18

    Uncertainty relations give upper bounds on the accuracy by which the outcomes of two incompatible measurements can be predicted. While established uncertainty relations apply to cases where the predictions are based on purely classical data (e.g., a description of the system's state before measurement), an extended relation which remains valid in the presence of quantum information has been proposed recently [Berta et al., Nature Phys. 6, 659 (2010)]. Here, we generalize this uncertainty relation to one formulated in terms of smooth entropies. Since these entropies measure operational quantities such as extractable secret key length, our uncertainty relation is of immediate practical use. To illustrate this, we show that it directly implies security of quantum key distribution protocols. Our security claim remains valid even if the implemented measurement devices deviate arbitrarily from the theoretical model. PMID:21469854

  5. Predicting System Performance with Uncertainty 

    E-print Network

    Yan, B.; Malkawi, A.

    2012-01-01

    Bin Yan and Ali Malkawi, T.C. Chan Center for Building Simulation and Energy Studies, University of Pennsylvania, 220 South 34th Street, Philadelphia, PA 19104, United States. Email: binyan@design.upenn.edu. Abstract...

  6. Policy Uncertainty and Household Savings

    E-print Network

    Giavazzi, Francesco

    Using German microdata and a quasi-natural experiment, we provide evidence on how households respond to an increase in uncertainty. We find that household saving increases significantly following the increase in political ...

  7. PRINCIPLES OF MODELLING

    EPA Science Inventory

    The scope of modelling the behavior of pollutants in the aquatic environment is now immense. In many practical applications, there are effectively no computational constraints on what is possible. There is accordingly an increasing need for a set of principles of modelling that in ...

  8. PRINCIPLES OF WATER FILTRATION

    EPA Science Inventory

    This paper reviews principles involved in the processes commonly used to filter drinking water for public water systems. The most common approach is to chemically pretreat water and filter it through a deep (2-1/2 to 3 ft) bed of granular media (coal or sand or combinations of th...

  9. Pattern recognition principles

    NASA Technical Reports Server (NTRS)

    Tou, J. T.; Gonzalez, R. C.

    1974-01-01

    The present work gives an account of basic principles and available techniques for the analysis and design of pattern processing and recognition systems. Areas covered include decision functions, pattern classification by distance functions, pattern classification by likelihood functions, the perceptron and the potential function approaches to trainable pattern classifiers, statistical approach to trainable classifiers, pattern preprocessing and feature selection, and syntactic pattern recognition.

  10. Principles of Cancer Screening.

    PubMed

    Pinsky, Paul F

    2015-10-01

    Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike. PMID:26315516

  11. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  12. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

  13. Matters of Principle.

    ERIC Educational Resources Information Center

    Martz, Carlton

    1999-01-01

    This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

  14. Non-scalar uncertainty: Uncertainty in dynamic systems

    NASA Technical Reports Server (NTRS)

    Martinez, Salvador Gutierrez

    1992-01-01

    The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology; applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful to describe the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found to be wiser to get an approximate solution to an accurate model than a precise solution to a model constrained by simplifying assumptions. Precision has a very heavy cost in present physical models, but this formalism allows a trade between uncertainty and simplicity. It was found that modeling reality sometimes requires that state transition probabilities be manipulated as non-scalar quantities, finding at the end that there is always a transformation to get back to scalar probability.

  15. Cosmic rays and tests of fundamental principles

    E-print Network

    Luis Gonzalez-Mestres

    2011-09-22

    It is now widely acknowledged that cosmic-ray experiments can test possible new physics directly generated at the Planck scale or at some other fundamental scale. By studying particle properties at energies far beyond the reach of any man-made accelerator, they can yield unique checks of basic principles. A well-known example is provided by possible tests of special relativity at the highest cosmic-ray energies. But other essential ingredients of standard theories can in principle be tested: quantum mechanics, the uncertainty principle, energy and momentum conservation, effective space-time dimensions, hamiltonian and lagrangian formalisms, postulates of cosmology, vacuum dynamics and particle propagation, quark and gluon confinement, elementariness of particles... Standard particle physics or string-like patterns may have a composite origin able to manifest itself through specific cosmic-ray signatures. Ultra-high energy cosmic rays, but also cosmic rays at lower energies, are probes of both "conventional" and new physics. Status, prospects, new ideas, and open questions in the field are discussed. The Post Scriptum shows that several basic features of modern cosmology naturally appear in a SU(2) spinorial description of space-time without any need for matter, relativity or standard gravitation. New possible effects related to the spinorial space-time structure can also be foreseen. Similarly, the existence of spin-1/2 particles can be naturally related to physics beyond the Planck scale and to a possible pre-Big Bang era.

  16. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  17. The Principle of Maximum Conformality

    SciTech Connect

    Brodsky, Stanley J; Giustino, Di; /SLAC

    2011-04-05

    A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling $\alpha_s(\mu^2)$. It is common practice to guess a physical scale $\mu = Q$ which is of the order of a typical momentum transfer Q in the process, and then vary the scale over a range Q/2 to 2Q. This procedure is clearly problematic since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and thus is a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale-setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. The elimination of the renormalization scheme ambiguity using the PMC will not only increase the precision of QCD tests, but will also increase the sensitivity of colliders to new physics beyond the Standard Model.

  18. Principles of Natural Photosynthesis.

    PubMed

    Krewald, Vera; Retegan, Marius; Pantazis, Dimitrios A

    2016-01-01

    Nature relies on a unique and intricate biochemical setup to achieve sunlight-driven water splitting. Combined experimental and computational efforts have produced significant insights into the structural and functional principles governing the operation of the water-oxidizing enzyme Photosystem II in general, and of the oxygen-evolving manganese-calcium cluster at its active site in particular. Here we review the most important aspects of biological water oxidation, emphasizing current knowledge on the organization of the enzyme, the geometric and electronic structure of the catalyst, and the role of calcium and chloride cofactors. The combination of recent experimental work on the identification of possible substrate sites with computational modeling have considerably limited the possible mechanistic pathways for the critical O-O bond formation step. Taken together, the key features and principles of natural photosynthesis may serve as inspiration for the design, development, and implementation of artificial systems. PMID:26099285

  19. Principles of Buddhist Tantrism

    E-print Network

    Govinda, Lama Anagarika

    [...] of incest and licentiousness is as ridiculous as accusing the Theravadins of condoning matricide and patricide..., a short quotation may suffice to prove our point. "The vital force of the Five Aggregates (Tib. [...]; Skt. [...]), in its real nature, pertaineth to the masculine aspect of the Buddha-principle manifesting through the left psychic nerve (Tib. [...]; Skt. ...

  20. Principles of plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Hutchinson, Ian H.

    The physical principles, techniques, and instrumentation of plasma diagnostics are examined in an introduction and reference work for students and practicing scientists. Topics addressed include basic plasma properties, magnetic diagnostics, plasma particle flux, and refractive-index measurements. Consideration is given to EM emission by free and bound electrons, the scattering of EM radiation, and ion processes. Diagrams, drawings, graphs, sample problems, and a glossary of symbols are provided.

  1. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...

  2. Principles of nuclear geology

    SciTech Connect

    Aswathanarayana, U.

    1985-01-01

    This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology in radiogenic heat and thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes, such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are focussed on.

  3. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and of how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  4. The Top Mass: Interpretation and Theoretical Uncertainties

    E-print Network

    André H. Hoang

    2014-12-11

    Currently the most precise LHC measurements of the top quark mass are determinations of the top quark mass parameter of Monte-Carlo (MC) event generators reaching uncertainties of well below $1$ GeV. However, there is an additional theoretical problem when using the MC top mass $m_t^{\\rm MC}$ as an input for theoretical predictions, because a rigorous relation of $m_t^{\\rm MC}$ to a renormalized field theory mass is, at the very strict level, absent. In this talk I show how - nevertheless - some concrete statements on $m_t^{\\rm MC}$ can be deduced assuming that the MC generator behaves like a rigorous first principles QCD calculator for the observables that are used for the analyses. I give simple conceptual arguments showing that in this context $m_t^{\\rm MC}$ can be interpreted like the mass of a heavy-light top meson, and that there is a conversion relation to field theory top quark masses that requires a non-perturbative input. The situation is in analogy to B physics where a similar relation exists between experimental B meson masses and field theory bottom masses. The relation gives a prescription how to use $m_t^{\\rm MC}$ as an input for theoretical predictions in perturbative QCD. The outcome is that at this time an additional uncertainty of about $1$ GeV has to be accounted for. I discuss limitations of the arguments I give and possible ways to test them, or even to improve the current situation.

  5. Uncertainty Quantification in Solidification Modelling

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2015-06-01

    Numerical models have been used to simulate solidification processes, to gain insight into physical phenomena that cannot be observed experimentally. Often validation of such models has been done through comparison to a few or even single experiments, in which agreement is dependent on both model and experimental uncertainty. As a first step to quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady-state 1D solidification model of continuous casting of weld filler rod. This model, which includes conduction, advection, and release of latent heat, was developed for use in uncertainty quantification of the calculated positions of the liquidus and solidus and of the solidification time. Using this model, a Smolyak sparse grid algorithm constructed a response surface that fit model outputs based on the range of uncertainty in the inputs to the model. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and the sensitivities of the inputs. This process was done for a linear fraction solid-temperature relationship, for which there is an analytical solution, and for a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
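
    The workflow described above (surrogate fit, then propagation) can be sketched in a few lines. A quadratic polynomial stands in for the Smolyak sparse-grid response surface, and the one-input "model" and its distributions are invented for illustration.

    import numpy as np

    def model(k):                 # stand-in for the solidification code,
        return 2.0 + 5.0 / k      # e.g., solidification time vs. conductivity

    # 1. Fit a cheap response surface from a handful of model runs.
    k_train = np.linspace(20.0, 40.0, 7)
    surrogate = np.poly1d(np.polyfit(k_train, model(k_train), deg=2))

    # 2. Propagate input uncertainty through the surrogate by sampling.
    rng = np.random.default_rng(1)
    k_samples = rng.normal(30.0, 2.0, size=100_000)
    out = surrogate(k_samples)

    # 3. Summarize the output PDF of the quantity of interest.
    print("mean:", out.mean(), "std:", out.std())
    hist, edges = np.histogram(out, bins=30, density=True)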

  6. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well-known model that can handle…

  7. Robust optimization of nonlinear impulsive rendezvous with uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, YaZhong; Yang, Zhen; Li, HengNian

    2014-04-01

    The optimal rendezvous trajectory designs in many current research efforts do not incorporate the practical uncertainties into the closed loop of the design. A robust optimization design method for a nonlinear rendezvous trajectory with uncertainty is proposed in this paper. One performance index related to the variances of the terminal state error is termed the robustness performance index, and a two-objective optimization model (including the minimum characteristic velocity and the minimum robustness performance index) is formulated on the basis of the Lambert algorithm. A multi-objective, non-dominated sorting genetic algorithm is employed to obtain the Pareto optimal solution set. It is shown that the proposed approach can be used to quickly obtain several inherent principles of the rendezvous trajectory by taking practical errors into account. Furthermore, this approach can identify the most preferable design space in which a specific solution for the actual application of the rendezvous control should be chosen.
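
    The multi-objective step can be illustrated with a plain non-dominated filter over candidate transfers. The sketch below, with invented objective values (characteristic velocity and robustness index, both to be minimized), shows only the Pareto selection, not the paper's genetic algorithm.

    def dominates(a, b):
        """a dominates b if no worse in both objectives and better in one."""
        return a[0] <= b[0] and a[1] <= b[1] and a != b

    def pareto_front(points):
        return [p for p in points
                if not any(dominates(q, p) for q in points)]

    # (characteristic velocity [km/s], robustness index) per candidate
    candidates = [(3.1, 0.90), (2.8, 1.40), (3.5, 0.55), (2.8, 1.10), (4.0, 0.50)]
    print(pareto_front(candidates))   # the non-dominated trade-off set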

  8. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly each rule is believed is constructed. From this, the idea of how far a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
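
    The split into certain and possible rules follows the rough-set lower and upper approximations: indiscernibility classes wholly inside the target set support certain rules, while classes that merely intersect it support possible rules. The toy data below are invented.

    from collections import defaultdict

    objects = {                    # object -> (condition attributes, decision)
        "o1": (("high", "yes"), "fault"),
        "o2": (("high", "yes"), "ok"),
        "o3": (("low", "no"), "fault"),
        "o4": (("low", "yes"), "ok"),
    }
    target = {o for o, (_, d) in objects.items() if d == "fault"}

    classes = defaultdict(set)     # indiscernibility classes by attributes
    for o, (attrs, _) in objects.items():
        classes[attrs].add(o)

    lower = set().union(*(c for c in classes.values() if c <= target))
    upper = set().union(*(c for c in classes.values() if c & target))

    print("certain rules from:", lower)    # {'o3'}: class fully in target
    print("possible rules from:", upper)   # {'o1','o2','o3'}: overlapping classes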

  9. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  10. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, $\Delta\theta_{\rm HL} \sim 1/\bar{N}_T$, where $\bar{N}_T$ is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty $\Delta\theta \sim 1/\bar{N}_T^{\,k}$, with $k$ an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility to overcome $\Delta\theta_{\rm HL}$ at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.

  11. Davis-Besse uncertainty study

    SciTech Connect

    Davis, C B

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.

  12. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  13. Uncertainty in Integrative Structural Modeling

    PubMed Central

    Schneidman-Duhovny, Dina; Pellarin, Riccardo; Sali, Andrej

    2014-01-01

    Integrative structural modelling uses multiple types of input information and proceeds in four stages: (i) gathering information, (ii) designing model representation and converting information into a scoring function, (iii) sampling good-scoring models, and (iv) analyzing models and information. In the first stage, uncertainty originates from data that are sparse, noisy, ambiguous, or derived from heterogeneous samples. In the second stage, uncertainty can originate from a representation that is too coarse for the available information or a scoring function that does not accurately capture the information. In the third stage, the major source of uncertainty is insufficient sampling. In the fourth stage, clustering, cross-validation, and other methods are used to estimate the precision and accuracy of the models and information. PMID:25173450

  14. Uncertainty in Measurements of Distance

    E-print Network

    John C. Baez; S. Jay Olson

    2002-01-09

    Ng and van Dam have argued that quantum theory and general relativity give a lower bound of L^{1/3} L_P^{2/3} on the uncertainty of any distance, where L is the distance to be measured and L_P is the Planck length. Their idea is roughly that to minimize the position uncertainty of a freely falling measuring device one must increase its mass, but if its mass becomes too large it will collapse to form a black hole. Here we show that one can go below the Ng-van Dam bound by attaching the measuring device to a massive elastic rod. Relativistic limitations on the rod's rigidity, together with the constraint that its length exceeds its Schwarzschild radius, imply that zero-point fluctuations of the rod give an uncertainty greater than or equal to L_P.

  15. Principles for School Drug Education

    ERIC Educational Resources Information Center

    Meyer, Lois

    2004-01-01

    This document presents a revised set of principles for school drug education. The principles for drug education in schools comprise an evolving framework that has proved useful over a number of decades in guiding the development of effective drug education. The first edition of "Principles for Drug Education in Schools" (Ballard et al. 1994) has…

  16. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

  17. Uncertainty in 3D gel dosimetry

    NASA Astrophysics Data System (ADS)

    De Deene, Yves; Jirasek, Andrew

    2015-01-01

    Three-dimensional (3D) gel dosimetry has a unique role to play in safeguarding conformal radiotherapy treatments as the technique can cover the full treatment chain and provides the radiation oncologist with the integrated dose distribution in 3D. It can also be applied to benchmark new treatment strategies such as image-guided and tracking radiotherapy techniques. A major obstacle that has hindered the wider dissemination of gel dosimetry in radiotherapy centres is a lack of confidence in the reliability of the measured dose distribution. Uncertainties in 3D dosimeters are attributed to both dosimeter properties and scanning performance. In polymer gel dosimetry with MRI readout, discrepancies in dose response of large polymer gel dosimeters versus small calibration phantoms have been reported which can lead to significant inaccuracies in the dose maps. The sources of error in polymer gel dosimetry with MRI readout are well understood and it has been demonstrated that with a carefully designed scanning protocol, the overall uncertainty in absolute dose that can currently be obtained falls within 5% on an individual voxel basis, for a minimum voxel size of 5 mm³. However, several research groups have chosen to use polymer gel dosimetry in a relative manner by normalizing the dose distribution towards an internal reference dose within the gel dosimeter phantom. 3D dosimetry with optical scanning has also been mostly applied in a relative way, although in principle absolute calibration is possible. As the optical absorption in 3D dosimeters is less dependent on temperature, it can be expected that the achievable accuracy is higher with optical CT. The precision in optical scanning of 3D dosimeters depends to a large extent on the performance of the detector. 3D dosimetry with x-ray CT readout is a low-contrast imaging modality for polymer gel dosimetry. Sources of error in x-ray CT polymer gel dosimetry (XCT) are currently under investigation and include inherent limitations in dosimeter homogeneity, imaging performance, and errors induced through post-acquisition processing. This overview highlights a number of aspects relating to uncertainties in polymer gel dosimetry.

  18. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
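
    To first order, the transmission of variation through a response function is the familiar delta-method approximation: for $Y = f(X)$ with input mean $\mu_X$,

    $\mathrm{Var}[Y] \approx \left(f'(\mu_X)\right)^2 \mathrm{Var}[X]$,

    and for several inputs with covariance matrix $\Sigma$, $\mathrm{Var}[Y] \approx \nabla f(\mu)^{\mathsf{T}} \, \Sigma \, \nabla f(\mu)$; steep regions of the response surface therefore amplify input variation, which is the effect on designed experiments discussed above.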

  19. LWR meltdown analyses and uncertainties

    SciTech Connect

    Rivard, J.B.; Haskin, F.E.

    1981-01-01

    The forthcoming rulemaking poses significant questions concerning the behavior of LWR fuel during accidents beyond the design bases. These questions cannot readily be addressed based upon experiment results which are currently projected to become available within the rulemaking time frame. Analyses based on available computer codes and experimental data have yielded new insights into the progression of in-vessel fuel degradation and materials redistribution during LWR meltdown accidents. Phenomenological and modeling uncertainties have been identified together with potential destabilizing mechanisms during both unperturbed meltdown sequences and sequences involving reflooding. Specific issues of greatest relative importance have been identified which sharpen the focus of efforts to quantify and reduce the uncertainties.

  20. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Fan, J. D.; Malozovsky, Yuriy M.

    2013-06-01

    In terms of an exact equation for the thermodynamic potential due to the interaction between two particles, and based on Green's function method, we have derived the Landau expansion of the thermodynamic potential in terms of the variation of the quasiparticle distribution function. We have also derived the expansion of the thermodynamic potential in terms of the variation of an exact single-particle distribution (not that of quasiparticles); these derivations lead to the relationship between the interaction function for two quasiparticles and the interaction energy between two particles. Further, in terms of the four-point vertex part we are led to the Pauli exclusion principle.

  1. Principles of Pituitary Surgery.

    PubMed

    Farrell, Christopher J; Nyquist, Gurston G; Farag, Alexander A; Rosen, Marc R; Evans, James J

    2016-02-01

    Since the description of a transnasal approach for treatment of pituitary tumors, transsphenoidal surgery has undergone continuous development. Hirsch developed a lateral endonasal approach before simplifying it to a transseptal approach. Cushing approached pituitary tumors using a transsphenoidal approach but transitioned to the transcranial route. Transsphenoidal surgery was not "rediscovered" until Hardy introduced the surgical microscope. An endoscopic transsphenoidal approach for pituitary tumors has been reported and further advanced. We describe the principles of pituitary surgery including the key elements of surgical decision making and discuss the technical nuances distinguishing the endoscopic from the microscopic approach. PMID:26614830

  2. Nonequilibrium quantum Landauer principle.

    PubMed

    Goold, John; Paternostro, Mauro; Modi, Kavan

    2015-02-13

    Using the operational framework of completely positive, trace preserving operations and thermodynamic fluctuation relations, we derive a lower bound for the heat exchange in a Landauer erasure process on a quantum system. Our bound comes from a nonphenomenological derivation of the Landauer principle which holds for generic nonequilibrium dynamics. Furthermore, the bound depends on the nonunitality of dynamics, giving it a physical significance that differs from other derivations. We apply our framework to the model of a spin-1/2 system coupled to an interacting spin chain at finite temperature. PMID:25723198

  3. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
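    The supplementary R code mentioned above is not reproduced here; as a purely illustrative sketch of the general idea (the measurement model and input distributions below are assumed), a Monte Carlo propagation of Type A and Type B input-quantity distributions might look like this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical measurement model Y = (V / t) * k with three input quantities.
V = rng.normal(10.0, 0.05, n)           # Type A: Gaussian from repeated readings
t = rng.uniform(4.95, 5.05, n)          # Type B: rectangular, from a tolerance
k = rng.triangular(0.98, 1.0, 1.02, n)  # Type B: triangular, from expert judgment

Y = V / t * k

y = Y.mean()                            # estimate of the measurand
u = Y.std(ddof=1)                       # standard uncertainty
lo, hi = np.percentile(Y, [2.5, 97.5])  # 95% coverage interval
print(f"y = {y:.4f}, u(y) = {u:.4f}, 95% interval = [{lo:.4f}, {hi:.4f}]")
```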

  4. Network Flow Optimization under Uncertainty

    E-print Network

    Tesfatsion, Leigh

    Approaches to optimization under uncertainty: stochastic programming and robust optimization. Solving stochastic programs by decomposition: iterate between solving the master (first-stage) problem and the subproblems; this can be more efficient than solving the deterministic equivalent program (DEP) directly and typically exploits special problem structure. The state of the art for network flow optimization is surveyed.

  5. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  6. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  7. Quantification of entanglement via uncertainties

    SciTech Connect

    Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S.

    2007-03-15

    We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.

  8. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  9. Principles of Safety Pharmacology

    PubMed Central

    Pugsley, M K; Authier, S; Curtis, M J

    2008-01-01

    Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacokinetic/pharmacodynamic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and, finally, clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice and the burden of proof, and to highlight areas of intensive activity (such as testing for drug-induced rare-event liability, and the challenge of testing the safety of so-called biologics: antibodies, gene therapy and so on). PMID:18604233

  10. Principle of relative locality

    SciTech Connect

    Amelino-Camelia, Giovanni; Freidel, Laurent; Smolin, Lee; Kowalski-Glikman, Jerzy

    2011-10-15

    We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

  11. Great Lakes Literacy Principles

    NASA Astrophysics Data System (ADS)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  12. Dosimetric Uncertainties: Magnetic Field Coupling to Peripheral Nerve.

    PubMed

    Kavet, Robert

    2015-12-01

    The International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the Institute of Electrical and Electronics Engineers (IEEE) have established magnetic field exposure limits for the general public between 400 Hz (ICNIRP)/759 Hz (IEEE) and 100 kHz to protect against adverse effects associated with peripheral nerve stimulation (PNS). Despite apparent common purpose and similarly stated principles, the two sets of limits diverge between 3.35 and 100 kHz by a factor of about 7.7 with respect to PNS. To address the basis for this difference and the more general issue of dosimetric uncertainty, this paper combines experimental data on PNS thresholds derived from human subjects exposed to magnetic fields together with published estimates of induced in situ electric field PNS thresholds, in order to evaluate the dosimetric relationships of external magnetic fields to induced fields at the threshold of PNS and the uncertainties inherent to such relationships. The analyses indicate that the logarithmic range of magnetic field thresholds constrains the bounds of uncertainty of in situ electric field PNS thresholds and of the coupling coefficients related to the peripheral nerve (the coupling coefficients define the dosimetric relationship of external field to induced electric field). The general public magnetic field exposure limit adopted by ICNIRP uses a coupling coefficient that falls above the bounds of dosimetric uncertainty, while IEEE's is within the bounds of uncertainty, toward the lower end of the distribution. The analyses illustrate that dosimetric estimates can be derived without reliance on computational dosimetry and the associated values of tissue conductivity. With the limits now in place, investigative efforts would be required if a field measurement were to exceed ICNIRP's magnetic field limit (the reference level), even when there is a virtual certainty that the dose limit (the basic restriction) has not been exceeded. The constraints on the range of coupling coefficients described in this paper could facilitate a re-evaluation of the ICNIRP and IEEE dose and exposure limits and possibly lead toward harmonization. PMID:26509623

  13. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
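    As a small illustration of descriptive statistics on interval data (the example values are assumed; the report's algorithms are far more general), statistics that are monotone in each observation have exact, easily computed envelopes:

```python
import numpy as np

# Each measurement is known only to lie within an interval [lo, hi].
data = np.array([[1.0, 1.4], [2.1, 2.3], [0.8, 1.9], [1.5, 1.6]])
lo, hi = data[:, 0], data[:, 1]

# The sample mean is monotone in every observation, so its exact bounds
# are attained at the interval endpoints.
mean_bounds = (lo.mean(), hi.mean())

# The median (or any percentile) is monotone too, so the same trick applies.
median_bounds = (np.median(lo), np.median(hi))

print("mean   in", mean_bounds)
print("median in", median_bounds)
# Non-monotone statistics such as the variance are harder: their exact
# bounds depend on how the intervals overlap, which is why computability
# as a function of the intervals' characteristics matters.
```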

  14. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, the contribution of each component and test step to the final uncertainty is studied. Using the differential method, a mathematical model for the systematic uncertainty transfer function is established, as sketched below. Finally, in a case study, the combined uncertainties of manual operation and automatic operation are compared with each other (6.11% and 5.87%, respectively). The reasonableness and practicality of this newly developed automatic testing system are proved.
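    A minimal sketch of the differential method referred to above (the model function and the numbers are placeholders, not the paper's pumping-speed evaluation): sensitivity coefficients are estimated by central differences and combined in quadrature:

```python
import numpy as np

# Placeholder measurement model, e.g. speed = throughput / pressure;
# the actual ISO 1608 evaluation of the paper is not reproduced here.
def model(q, p):
    return q / p

x0 = np.array([0.05, 1.0e-3])    # nominal input values (assumed)
u  = np.array([1.0e-3, 2.0e-5])  # standard uncertainties of the inputs (assumed)

# Differential method: c_i = df/dx_i estimated by central differences.
c = np.empty_like(x0)
for i in range(len(x0)):
    h = 1e-6 * x0[i]
    xp, xm = x0.copy(), x0.copy()
    xp[i] += h
    xm[i] -= h
    c[i] = (model(*xp) - model(*xm)) / (2 * h)

# Quadrature combination, assuming uncorrelated inputs.
u_combined = np.sqrt(np.sum((c * u) ** 2))
print(f"S = {model(*x0):.2f}, u(S) = {u_combined:.2f}")
```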

  15. On the impact of systematical uncertainties for the CP violation measurement in superbeam experiments

    E-print Network

    Huber, Patrick; Schwetz, Thomas

    2008-01-01

    Superbeam experiments can, in principle, achieve impressive sensitivities for CP violation in neutrino oscillations for large $\theta_{13}$. We study how those sensitivities depend on assumptions about systematical uncertainties. We focus on the second phase of T2K, the so-called T2HK experiment, and we explicitly include a near detector in the analysis. Our main result is that even an idealised near detector cannot remove the dependence on systematical uncertainties completely. Thus additional information is required. We identify certain combinations of uncertainties which are the key to improving the sensitivity to CP violation, for example the ratio of electron to muon neutrino cross sections and efficiencies. For uncertainties on this ratio larger than 2%, T2HK is systematics dominated. We briefly discuss how our results apply to a possible two-far-detector configuration, called T2KK. We do not find a significant advantage with respect to the reduction of systematical errors for the measurement of CP violation.

  16. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    E-print Network

    Xue, Zhenyu; Vlachos, Pavlos P

    2014-01-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise-ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations. In addition, the notion of a valid measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct ...

  17. Use of Combined Uncertainty of Pesticide Residue Results for Testing Compliance with Maximum Residue Limits (MRLs).

    PubMed

    Farkas, Zsuzsa; Slate, Andrew; Whitaker, Thomas B; Suszter, Gabriella; Ambrus, Árpád

    2015-05-13

    The uncertainty of pesticide residue levels in crops due to sampling, estimated for 106 individual crops and 24 crop groups from residue data obtained from supervised trials, was adjusted with a factor of 1.3 to accommodate the larger variability of residues under normal field conditions. Further adjustment may be necessary in the case of mixed lots. The combined uncertainty of residue data, including the contribution of sampling, is used for calculation of an action limit, which should not be exceeded when compliance with maximum residue limits is certified as part of premarketing self-control programs. By contrast, for testing compliance of marketed commodities, the residues measured in composite samples should be greater than or equal to the decision limit calculated only from the combined uncertainty of the laboratory phase of the residue determination. The options for minimizing the combined uncertainty of measured residues are discussed. The principles described are also applicable to other chemical contaminants. PMID:25658668

  18. Kepler and Mach's Principle

    NASA Astrophysics Data System (ADS)

    Barbour, Julian

    The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.

  19. The Thermodynamic Covariance Principle

    E-print Network

    Sonnino, Giorgio

    2014-01-01

    The concept of "equivalent systems" from the thermodynamic point of view, originally introduced by Th. De Donder and I. Prigogine, is deeply investigated and revised. From our point of view, two systems are thermodynamically equivalent if, under transformation of the thermodynamic forces, both the entropy production and the Glansdorff-Prigogine dissipative quantity remain unaltered. This kind of transformations are refereed to as the "Thermodynamic Coordinate Transformations" (TCT). The general class of transformations satisfying the TCT is determined. We shall see that, also in the nonlinear region (i.e., out of the Onsager region), the TCT preserve the reciprocity relations of the transport coefficients. The equivalent character of two transformation under TCT, leads to the concept of "Thermodynamic Covariance Principle" (TCP) stating that all thermodynamic equations involving the thermodynamic forces and flows (e.g., the closure flux-force relations) should be covariant under TCT.

  20. Dynamical principles in neuroscience

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  1. Fault Management Guiding Principles

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

    Regardless of the mission type (deep space or low Earth orbit, robotic or human spaceflight), Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the maturity of FM as an engineering discipline, which lags behind that of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and SE organizational structure, to the relationship between FM designs and mission risk, and the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  2. Dynamical principles in neuroscience

    SciTech Connect

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-15

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  3. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.

  4. Uncertainty in flood risk mapping

    NASA Astrophysics Data System (ADS)

    Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo

    2014-05-01

    A flood refers to a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms will be studied. In this context, flooding occurs when the water runs above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can translate into erroneous risk predictions. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, which depends on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated peak flow values and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered as a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic. With this methodology a fuzzy number is obtained for the peak flow, which indicates all possible peak flow values and the possibility of their occurrence. To produce the LCM, a supervised soft classifier is used to classify a satellite image, and a possibility distribution is assigned to the pixels. These extra data provide additional land cover information at the pixel level and allow the assessment of the classification uncertainty, which is then considered in the identification of the parameter uncertainty used to compute peak flow. The proposed approach was applied to produce vulnerability and risk maps that integrate uncertainty in the urban area of Leiria, Portugal. A SPOT-4 satellite image and DEMs of the region were used, and the peak flow was computed using the Soil Conservation Service method. HEC-HMS, HEC-RAS, Matlab and ArcGIS software programs were used. The analysis of the results for the presented case study enables the order of magnitude of the uncertainty in the watershed peak flow value to be assessed and the areas most susceptible to flood risk to be identified.

  5. Credible Software and Simulation Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; Nixon, David (Technical Monitor)

    1998-01-01

    The utility of software primarily depends on its reliability and performance, whereas its significance depends solely on its credibility for the intended use. The credibility of simulations confirms the credibility of software. The level of veracity and the level of validity of simulations determine the degree of credibility of simulations. The process of assessing this credibility in fields such as computational mechanics (CM) differs from that followed by the Defense Modeling and Simulation Office in operations research. Verification and validation (V&V) of CM simulations is not the same as V&V of CM software. Uncertainty is the measure of simulation credibility. Designers who use software are concerned with the management of simulation uncertainty. Terminology and concepts are presented with a few examples from computational fluid dynamics.

  6. Modeling travel time uncertainty in traffic networks

    E-print Network

    Chen, Daizhuo

    2010-01-01

    Uncertainty in travel time is one of the key factors that could allow us to understand and manage congestion in transportation networks. Models that incorporate uncertainty in travel time need to specify two mechanisms: ...

  7. Multifidelity approaches for design under uncertainty

    E-print Network

    Ng, Leo Wai-Tsun

    2013-01-01

    Uncertainties are present in many engineering applications and it is important to account for their effects during engineering design to achieve robust and reliable systems. One approach is to represent uncertainties as ...

  8. Generalized approach to minimal uncertainty products

    E-print Network

    Mendoza, Douglas M

    2013-01-01

    A general technique to construct quantum states that saturate uncertainty products using variational methods is developed. Such a method allows one to numerically compute uncertainties in cases where the Robertson-Schrodinger ...

  9. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

  10. Towards first-principles electrochemistry

    E-print Network

    Dabo, Ismaila

    2008-01-01

    This doctoral dissertation presents a comprehensive computational approach to describe quantum mechanical systems embedded in complex ionic media, primarily focusing on the first-principles representation of catalytic ...

  11. Uncertainty propagation in nuclear forensics.

    PubMed

    Pommé, S; Jerome, S M; Venchiarutti, C

    2014-07-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined. PMID:24607529
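    For the simplest case of a stable daughter nuclide and a material that was pure parent at separation, the atom ratio gives the age in closed form, and the uncertainty follows by first-order propagation (a minimal sketch; the measured values and the half-life uncertainty below are assumed, and the report's exact formulae are not reproduced):

```python
import numpy as np

# If the material contained no daughter atoms at separation time,
# N_d / N_p = exp(lam * t) - 1, so t = ln(1 + R) / lam.
half_life = 24110.0              # years (Pu-239); uncertainty value assumed
u_half_life = 30.0               # assumed standard uncertainty of the half-life
lam = np.log(2.0) / half_life    # decay constant
R, u_R = 0.00575, 0.00006        # assumed measured daughter/parent atom ratio

t = np.log(1.0 + R) / lam        # age since chemical separation

# First-order propagation, assuming uncorrelated inputs:
dt_dR = 1.0 / (lam * (1.0 + R))
dt_dlam = -t / lam
u_lam = lam * u_half_life / half_life
u_t = np.hypot(dt_dR * u_R, dt_dlam * u_lam)
print(f"age = {t:.1f} y, u(age) = {u_t:.1f} y")
```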

  12. Fuzzy-algebra uncertainty assessment

    SciTech Connect

    Cooper, J.A.; Cooper, D.K.

    1994-12-01

    A significant number of analytical problems (for example, abnormal-environment safety analysis) depend on data that are partly or mostly subjective. Since fuzzy algebra depends on subjective operands, we have been investigating its applicability to these forms of assessment, particularly for portraying uncertainty in the results of PRA (probabilistic risk analysis) and in risk-analysis-aided decision-making. Since analysis results can be a major contributor to a safety-measure decision process, risk management depends on relating uncertainty to only known (not assumed) information. The uncertainties due to abnormal environments are even more challenging than those in normal-environment safety assessments, and therefore require an even more judicious approach. Fuzzy algebra matches these requirements well. One of the most useful aspects of this work is that we have shown the potential for significant differences (especially in perceived margin relative to a decision threshold) between fuzzy assessment and probabilistic assessment, based on subtle factors inherent in the choice of probability distribution models. We have also shown the relation of fuzzy algebra assessment to "bounds" analysis, as well as a description of how analyses can migrate from bounds analysis to fuzzy-algebra analysis, and then to probabilistic analysis as information about the process to be analyzed is obtained. Instructive examples are used to illustrate the points.

  13. Quantification of uncertainties in composites

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Singhal, S. N.; Murthy, P. L. N.; Chamis, Christos C.

    1993-01-01

    An integrated methodology is developed for computationally simulating the probabilistic composite material properties at all composite scales. The simulation requires minimum input consisting of the description of uncertainties at the lowest scale (fiber and matrix constituents) of the composite and in the fabrication process variables. The methodology allows the determination of the sensitivity of the composite material behavior to all the relevant primitive variables. This information is crucial for reducing the undesirable scatter in composite behavior at its macro scale by reducing the uncertainties in the most influential primitive variables at the micro scale. The methodology is computationally efficient. The computational time required by the methodology described herein is an order of magnitude less than that for Monte Carlo Simulation. The methodology has been implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of the methodology/code are demonstrated by simulating the uncertainties in the heat-transfer, thermal, and mechanical properties of a typical laminate and comparing the results with the Monte Carlo simulation method and experimental data. The important observation is that the computational simulation for probabilistic composite mechanics has sufficient flexibility to capture the observed scatter in composite properties.

  14. Uncertainties in dosemeter intercomparison techniques

    NASA Astrophysics Data System (ADS)

    Harrison, R. M.; Rawlings, D. J.

    1996-03-01

    The results of 180 intercomparisons of field dosemeters with an NPL secondary-standard exposure meter (type 2560, with a type 2561 chamber) have been analysed in order to study variations in readings when ionization chambers are interchanged during the comparison process. It is suggested that the percentage standard error of the mean of the combined series of chamber intercomparisons be used to set a target uncertainty (0.3%), action level (0.5%) and limit of acceptability (1.0%). The target uncertainty is consistent with IAEA estimates of uncertainty for this part of the calibration chain. It is shown that if the calibration procedure is repeated, the percentage difference between the geometric means calculated from each calibration (i.e. a single interchange of chambers) may be as high as . This reinforces the need for at least one repetition of the intercomparison procedure.

  15. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
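    As a toy illustration of the statistical flavor of this idea (the model, the property, and all parameters are invented; real probabilistic model checking uses formal logics and dedicated tools rather than a hand-rolled loop), one can estimate the probability that a stochastic epidemic model satisfies a stated property:

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_peak(beta=0.15, gamma=0.1, s0=990, i0=10, days=300):
    """One stochastic SIR run; returns the peak number of infected."""
    s, i, n, peak = s0, i0, s0 + i0, i0
    for _ in range(days):
        new_inf = rng.binomial(s, 1.0 - np.exp(-beta * i / n))
        new_rec = rng.binomial(i, 1.0 - np.exp(-gamma))
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
    return peak

# Property: "the number of simultaneously infected never exceeds 100".
runs = 2000
p = sum(sir_peak() <= 100 for _ in range(runs)) / runs
se = np.sqrt(p * (1.0 - p) / runs)   # confidence in the estimate itself
print(f"P(peak infected <= 100) ~ {p:.3f} +/- {se:.3f}")
```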

  16. Instantaneous quantities and uncertainty concepts for signal-dependent time-frequency distributions

    NASA Astrophysics Data System (ADS)

    Jones, Graeme; Boashash, Boualem

    1991-12-01

    This paper presents a review of some concepts associated with time-frequency distributions (the instantaneous frequency, group delay, instantaneous bandwidth, and marginal properties) and generalizes them in time-frequency via rotation of coordinates. This work emphasizes the need to examine time-frequency distributions in the general time-frequency plane, rather than restricting oneself to a time and/or frequency framework. This analysis leads to a generalized uncertainty principle, which has previously been introduced in radar theory. This uncertainty principle is invariant under rotation in the time-frequency plane, and should be used instead of the traditional definition of Gabor. It is desired to smooth a time-frequency distribution that is an energy density function into one that is an energy function. Most distributions are combinations of density and energy functions, but the Wigner-Ville distribution is purely a density function. By using a local version of the generalized uncertainty principle, the Wigner-Ville distribution is smoothed into a signal-dependent spectrogram using an iterative algorithm. It is believed that this procedure may represent, in some sense, an optimum removal of signal uncertainty in the time-frequency plane.

  17. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Uncertainties need to be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainties: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties have been considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when dealing with a lack of data. Fuzzy system theory involves a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. The term mapping here refers to the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method, as sketched below. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations showed that the proposed method produces more conservative results compared with the conventional finite element method.
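    The sketch below illustrates the chain named above (fuzzification, mapping via the extension principle, defuzzification) for a single triangular fuzzy input; the function and numbers are assumed for illustration, and the paper's finite element coupling is not reproduced:

```python
import numpy as np

def tri_alpha_cut(a, b, c, alpha):
    """Alpha-cut [lo, hi] of the triangular fuzzy number (a, b, c)."""
    return a + alpha * (b - a), c - alpha * (c - b)

def propagate(f, tri, alphas=np.linspace(0.0, 1.0, 6), samples=201):
    """Map a triangular fuzzy input through f, one alpha-cut at a time
    (a common numerical realisation of the extension principle)."""
    cuts = []
    for alpha in alphas:
        lo, hi = tri_alpha_cut(*tri, alpha)
        y = f(np.linspace(lo, hi, samples))  # brute-force sweep of the cut
        cuts.append((alpha, y.min(), y.max()))
    return cuts

# Fuzzification: the crisp input "about 2" becomes the triangle (1.5, 2, 2.5).
cuts = propagate(lambda x: x**2 - x, (1.5, 2.0, 2.5))
for alpha, lo, hi in cuts:
    print(f"alpha = {alpha:.1f}: output in [{lo:.3f}, {hi:.3f}]")

# Defuzzification: recover a crisp output, e.g. the core (alpha = 1) midpoint.
_, lo1, hi1 = cuts[-1]
print("crisp output:", 0.5 * (lo1 + hi1))
```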

  18. Robot Motion Planning with Uncertainty The Challenge

    E-print Network

    Pollefeys, Marc

    The motion of a robot in response to commanded actions is uncertain. With deterministic motion planning, shortest paths may be highly sensitive to uncertainties: the robot may deviate from its planned path. Considering uncertainty in the planning stage more cleanly avoids obstacles and significantly increases the probability of success.

  19. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty

  20. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  1. Quantum principles and free particles. [evaluation of partitions

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The quantum principles that establish the energy levels and degeneracies needed to evaluate the partition functions are explored. The uncertainty principle is associated with the dual wave-particle nature of the model used to describe quantized gas particles. The Schroedinger wave equation is presented as a generalization of Maxwell's wave equation; the former applies to all particles while the Maxwell equation applies to the special case of photon particles. The size of the quantum cell in phase space and the representation of momentum as a space derivative operator follow from the uncertainty principle. A consequence of this is that steady-state problems that are space-time dependent for the classical model become only space dependent for the quantum model and are often easier to solve. The partition function is derived for quantized free particles and, at normal conditions, the result is the same as that given by the classical phase integral. The quantum corrections that occur at very low temperatures or high densities are derived. These corrections for the Einstein-Bose gas qualitatively describe the condensation effects that occur in liquid helium, but are unimportant for most practical purposes otherwise. However, the corrections for the Fermi-Dirac gas are important because they quantitatively describe the behavior of high-density conduction electron gases in metals and explain the zero point energy and low specific heat exhibited in this case.
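    For reference, the classical-limit result alluded to here can be written compactly (standard textbook form, not quoted from the record):

```latex
% Translational partition function of a free particle in a volume V:
% one quantum state per cell $h^{3}$ of phase space, which in the
% classical limit gives
q_{\mathrm{tr}} \;=\; \frac{V}{h^{3}}\,\bigl(2\pi m k_{B} T\bigr)^{3/2}
\;=\; \frac{V}{\Lambda^{3}},
\qquad
\Lambda = \frac{h}{\sqrt{2\pi m k_{B} T}} .
```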

  2. Few Group Collapsing of Covariance Matrix Data Based on a Conservation Principle

    SciTech Connect

    H. Hiruta; G. Palmiotti; M. Salvatores; R. Arcilla, Jr.; R. D. McKnight; G. Aliberti; P. Oblozinsky; W. S. Yang

    2008-12-01

    A new algorithm for a rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that allows the uncertainty calculated in a fine group energy structure for a specific integral parameter, using as weights the associated sensitivity coefficients, to be preserved at a broad energy group structure.

  3. Few group collapsing of covariance matrix data based on a conservation principle

    SciTech Connect

    Hiruta,H.; Palmiotti, G.; Salvatores, M.; Arcilla, Jr., R.; Oblozinsky, P.; McKnight, R.D.

    2008-06-24

    A new algorithm for a rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that allows preserving at a broad energy group structure the uncertainty calculated in a fine group energy structure for a specific integral parameter, using as weights the associated sensitivity coefficients.
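    A sketch of what such a sensitivity-weighted collapse can look like (synthetic data; the exact algorithm of the reports is not reproduced, and the weighting below merely illustrates the conservation idea):

```python
import numpy as np

rng = np.random.default_rng(3)

n_fine = 12
groups = [list(range(0, 4)), list(range(4, 8)), list(range(8, 12))]

A = rng.normal(size=(n_fine, n_fine))
C = A @ A.T / n_fine               # synthetic symmetric fine-group covariance
S = rng.uniform(0.1, 1.0, n_fine)  # synthetic sensitivity coefficients

# Broad-group sensitivities are the sums of the fine-group ones.
S_broad = np.array([S[g].sum() for g in groups])

# Sensitivity-weighted collapse: by construction the integral-parameter
# uncertainty S^T C S is identical in both group structures.
C_broad = np.array([[S[I] @ C[np.ix_(I, J)] @ S[J] / (S_broad[a] * S_broad[b])
                     for b, J in enumerate(groups)]
                    for a, I in enumerate(groups)])

u_fine = np.sqrt(S @ C @ S)
u_broad = np.sqrt(S_broad @ C_broad @ S_broad)
print(f"fine-group  uncertainty: {u_fine:.6f}")
print(f"broad-group uncertainty: {u_broad:.6f}")  # equal, as conserved
```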

  4. Uncertainty and information in classical mechanics formulation. Common ground for thermodynamics and quantum mechanics

    E-print Network

    Adrian Faigon

    2007-11-01

    Mechanics can be founded on a principle relating the uncertainty delta-q in the trajectory of an observable particle to its motion relative to the observer. From this principle, p.delta-q = const., p being the q-conjugated momentum, mechanical laws are derived and the meanings of the Lagrangian and Hamiltonian functions are discussed. The connection between the presented principle and Hamilton's Least Action Principle is examined. Wave mechanics and the Schrodinger equation appear without additional assumptions by choosing the representation for delta-q in the case where the motion is not trajectory-describable. The Cramer-Rao inequality serves that purpose. For a particle hidden from direct observation, the position uncertainty determined by the enclosing boundaries leads to thermodynamics in a straightforward extension of the presented formalism. The introduction of uncertainty into the formulation of classical mechanics enables the translation of mechanical laws into the wide-ranging conceptual framework of information theory. The boundaries between classical mechanics, thermodynamics and quantum mechanics are defined in terms of the informational changes associated with the system evolution. As a direct application of the proposed formulation, upper bounds for the rate of information transfer are derived.

  5. Sticking to its principles.

    PubMed

    1992-03-27

    Planned Parenthood says that rather than accept the Bush administration's gag rule it will give up federal funding of its operations. The gag rule forbids professionals at birth control clinics from even referring to abortion as an option to a pregnant woman, much less recommending one. President Bush has agreed to a policy which allows physicians, but no one else at clinics, to discuss abortion in at least some cases. In his view, according to White House officials, this was an admitted attempt to straddle the issue. Why he would want to straddle is understandable. The right wing of his party, which has always been suspicious of Mr. Bush, is pushing him to uphold what it regards as the Reagan legacy on this issue. The original gag rule, which prevented even physicians from discussing abortion as an option in almost all cases, was issued in the last president's second term and upheld last year by the Supreme Court. Give Planned Parenthood credit for sticking to its principles. A lot of recipients of all sorts of federal funds want it both ways: take the money but not accept federal policy guidelines. When they find they can't, many "rise above principle," take the money and adjust policy accordingly. It is not going to be easy for Planned Parenthood now. Federal funds account for a significant portion of the organizations' budgets. Planned Parenthood of Maryland, for example, gets about $500,000 a year from the federal government, or about 12-13% of its total budget. It will either have to cut back on its services, increase its fundraising from other sources, or charge women more for services, or all of those things. This is not the end of the story. It is certainly not the end of the political story. Pat Buchanan said of the new regulations, "I like the old position, to be quite candid." Thank goodness he never won a primary. George Bush would not have moved even as far as he did on the gag rule. There will be a lot of agreement with the Buchanan view at the Republican national convention. We can only hope that by then the president will be looking to the general election campaign and a Democratic opponent who will be appealing to Republican women on this issue. Perhaps then he will relax the gag order a little more. PMID:12317218

  6. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well-known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
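    As a toy illustration of the Polynomial Chaos idea mentioned above (the "model" is a cheap scalar function standing in for CLM, and the whole setup is assumed, not taken from the work), a regression-based PCE in one Gaussian input yields output moments directly from the coefficients:

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(4)

def expensive_model(x):            # stand-in for a costly climate-model run
    return np.exp(0.3 * x) + 0.1 * x**2

xi = rng.standard_normal(40)       # sparse set of "training runs"
y = expensive_model(xi)

order = 4
Psi = hermevander(xi, order)       # probabilists' Hermite basis He_0..He_4
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)   # regression PCE fit

# For a standard normal input, E[He_k^2] = k!, so moments follow directly.
norms = np.array([factorial(k) for k in range(order + 1)], dtype=float)
mean_pce = coef[0]
var_pce = np.sum(coef[1:] ** 2 * norms[1:])

xi_test = rng.standard_normal(200_000)
y_test = expensive_model(xi_test)
print(f"mean: PCE {mean_pce:.4f} vs MC {y_test.mean():.4f}")
print(f"var : PCE {var_pce:.4f} vs MC {y_test.var():.4f}")
```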

  7. Optimal uncertainty quantification with model uncertainty and legacy data

    NASA Astrophysics Data System (ADS)

    Kamga, P.-H. T.; Li, B.; McKerns, M.; Nguyen, L. H.; Ortiz, M.; Owhadi, H.; Sullivan, T. J.

    2014-12-01

    We present an optimal uncertainty quantification (OUQ) protocol for systems that are characterized by an existing physics-based model and for which only legacy data is available, i.e., no additional experimental testing of the system is possible. Specifically, the OUQ strategy developed in this work consists of using the legacy data to establish, in a probabilistic sense, the level of error of the model, or modeling error, and to subsequently use the validated model as a basis for the determination of probabilities of outcomes. The quantification of modeling uncertainty specifically establishes, to a specified confidence, the probability that the actual response of the system lies within a certain distance of the model. Once the extent of model uncertainty has been established in this manner, the model can be conveniently used to stand in for the actual or empirical response of the system in order to compute probabilities of outcomes. To this end, we resort to the OUQ reduction theorem of Owhadi et al. (2013) in order to reduce the computation of optimal upper and lower bounds on probabilities of outcomes to a finite-dimensional optimization problem. We illustrate the resulting UQ protocol by means of an application concerned with the response to hypervelocity impact of 6061-T6 Aluminum plates by Nylon 6/6 impactors at impact velocities in the range of 5-7 km/s. The ability of the legacy OUQ protocol to process diverse information on the system, and to supply rigorous bounds on system performance under realistic (and less than ideal) scenarios, as demonstrated by the hypervelocity impact application, is remarkable.
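    A minimal sketch of the two-step idea, with invented numbers: use legacy data to bound the model error, then let the model plus that error band stand in for the true response. The max-residual bound here is only a crude stand-in for the concentration-of-measure machinery of the OUQ reduction theorem.

```python
# Step 1: bound the modeling error from legacy (model, measurement) pairs.
# Step 2: use model prediction +/- error bound to assess an outcome.
import numpy as np

# hypothetical legacy pairs: model prediction vs. measured response
model_pred = np.array([1.10, 1.45, 1.80, 2.20, 2.60])
measured   = np.array([1.02, 1.52, 1.71, 2.35, 2.49])
err = measured - model_pred

e_max = np.abs(err).max()        # distribution-free error bound (crude)

new_pred, threshold = 1.92, 2.0  # new scenario: "failure" above threshold
lo_resp, hi_resp = new_pred - e_max, new_pred + e_max
print(f"model says {new_pred}; true response in [{lo_resp:.2f}, {hi_resp:.2f}]")
print("failure cannot be ruled out:", hi_resp > threshold)
```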

  8. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  9. [The principles of homeopathy].

    PubMed

    Hjelvik, M; Mørenskog, E

    1997-06-30

    Homeopathy is a gentle but effective form of treatment which stimulates the natural ability of the organism to heal itself. The word homoeopathy comes from the Greek words "homoios", which means similar, and "pathos", which means disease. This reflects the main principle of homoeopathy, the law of similars, which predicts that a disease can be cured by a medicine which, in healthy people, is able to produce a condition that resembles the disease. The law of similars is probably a basic law of nature. Therefore it is not surprising that examples can also be found in orthodox medicine, where the mode of functioning of some medicines can probably be ascribed to the law of similars. Homoeopathic medicines are likely to work through the body's own curative powers in a way that is best explained by comparison with vaccination. Both the homoeopathic medicine and the vaccine constitute a mild stimulus that causes mobilisation of the body's defence mechanisms and thus increased ability to oppose a pathogenic influence. The homoeopathic medicine does not work at molecular level, but probably through non-materialistic qualities (possibly electromagnetic in nature) in the organism, which are so sensitive that even a mild stimulus is enough to cause a reaction. This means that homoeopathic preparations can still have an effect even when diluted beyond Avogadro's number. PMID:9265314

  10. Interval Uncertainty as the Basis for a General Description of Uncertainty: A Position Paper

    E-print Network

    Kreinovich, Vladik

    Vladik Kreinovich, University of Texas at El Paso, USA (vladik@utep.edu). Uncertainty is ubiquitous: depending on what information we have, we get different types of uncertainty, and in practice the inputs to data processing are known with different types of uncertainty. To avoid this problem, it is necessary to develop a general description of uncertainty; this position paper argues that interval uncertainty can serve as its basis.

  11. Multimedia Principle in Teaching Lessons

    ERIC Educational Resources Information Center

    Kari Jabbour, Khayrazad

    2012-01-01

    Multimedia learning principle occurs when we create mental representations from combining text and relevant graphics into lessons. This article discusses the learning advantages that result from adding multimedia learning principle into instructions; and how to select graphics that support learning. There is a balance that instructional designers…

  12. Meaty Principles for Environmental Educators.

    ERIC Educational Resources Information Center

    Rockcastle, V. N.

    1985-01-01

    Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)

  13. Principles of Play for Soccer

    ERIC Educational Resources Information Center

    Ouellette, John

    2004-01-01

    Soccer coaches must understand the principles of play if they want to succeed. The principles of play are the rules of action that support the basic objectives of soccer and the foundation of a soccer coaching strategy. They serve as a set of permanent criteria that coaches can use to evaluate the efforts of their team. In this article, the author…

  14. Validation of an Experimentally Derived Uncertainty Model

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Cox, D. E.; Balas, G. J.; Juang, J.-N.

    1996-01-01

    The results show that uncertainty models can be obtained directly from system identification data by using a minimum norm model validation approach. The error between the test data and an analytical nominal model is modeled as a combination of unstructured additive and structured input multiplicative uncertainty. Robust controllers which use the experimentally derived uncertainty model show significant stability and performance improvements over controllers designed with assumed ad hoc uncertainty levels. Use of the identified uncertainty model also allowed a strong correlation between design predictions and experimental results.

  15. D-Particle Dynamics and The Space-Time Uncertainty Relation

    E-print Network

    Li, Maozhen; Li, Miao; Yoneya, Tamiaki

    1997-01-01

    We argue that the space-time uncertainty relation of the form $\Delta X \Delta T \gtrsim \alpha'$, relating the uncertainties with respect to time, $\Delta T$, and space, $\Delta X$, is universally valid in string theory including D-branes. This relation has been previously proposed by one (T.Y.) of the present authors as a simple qualitative representation of the perturbative short distance structure of fundamental string theory. We show that the relation, combined with the usual quantum mechanical uncertainty principle, explains the key qualitative features of D-particle dynamics.

  16. Performance of Trajectory Models with Wind Uncertainty

    NASA Technical Reports Server (NTRS)

    Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

    2009-01-01

    Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst various RUC time-lagged ensemble forecasts. This proof of concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members) allowing identification of regional variations in uncertainty.
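    A minimal sketch, with invented wind data and assumed shapes, of the core idea: treat the spread among time-lagged forecasts as a wind-uncertainty estimate and accumulate it into an along-track position uncertainty.

```python
# Ensemble spread -> wind sigma -> along-track position sigma (illustrative).
import numpy as np

rng = np.random.default_rng(1)
n_lags, n_steps, dt = 6, 120, 30.0     # lagged members, 30-second steps
# hypothetical along-track wind forecasts (m/s) from 6 lagged model cycles
winds = 20.0 + rng.normal(0.0, 2.5, size=(n_lags, n_steps)).cumsum(axis=1) * 0.05

wind_mean = winds.mean(axis=0)          # nominal wind used by the predictor
wind_sigma = winds.std(axis=0, ddof=1)  # ensemble spread = uncertainty proxy

airspeed = 230.0                                    # m/s, constant for the sketch
position = np.cumsum((airspeed + wind_mean) * dt)   # nominal along-track position
# independent-per-step assumption: position variance = sum of step variances
pos_sigma = np.sqrt(np.cumsum((wind_sigma * dt) ** 2))

print(f"20-min position uncertainty ~ {pos_sigma[39] / 1000:.2f} km (1-sigma)")
```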

  17. Error models for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Josset, L.; Scheidt, C.; Lunati, I.

    2012-12-01

    In groundwater modeling, uncertainty on the permeability field leads to a stochastic description of the aquifer system, in which the quantities of interests (e.g., groundwater fluxes or contaminant concentrations) are considered as stochastic variables and described by their probability density functions (PDF) or by a finite number of quantiles. Uncertainty quantification is often evaluated using Monte-Carlo simulations, which employ a large number of realizations. As this leads to prohibitive computational costs, techniques have to be developed to keep the problem computationally tractable. The Distance-based Kernel Method (DKM) [1] limits the computational cost of the uncertainty quantification by reducing the stochastic space: first, the realizations are clustered based on the response of a proxy; then, the full model is solved only for a subset of realizations defined by the clustering and the quantiles are estimated from this limited number of realizations. Here, we present a slightly different strategy that employs an approximate model rather than a proxy: we use the Multiscale Finite Volume method (MsFV) [2,3] to compute an approximate solution for each realization, and to obtain a first assessment of the PDF. In this context, DKM is then used to identify a subset of realizations for which the exact model is solved and compared with the solution of the approximate model. This allows highlighting and correcting possible errors introduced by the approximate model, while keeping full statistical information on the ensemble of realizations. Here, we test several strategies to compute the model error, correct the approximate model and achieve an optimal PDF estimation. We present a case study in which we predict the breakthrough curve of an ideal tracer for an ensemble of realizations generated via Multiple Point Direct Sampling [4] with a training image obtained from a 2D section of the Herten permeability field [5]. [1] C. Scheidt and J. Caers, "Representing spatial uncertainty using distances and kernels", Math Geosci (2009) [2] P. Jenny et al., "Multi-Scale finite-volume method for elliptic problems in subsurface flow simulation", J. Comp. Phys., 187(1) (2003) [3] I. Lunati and S.H. Lee, "An operator formulation of the multiscale finite-volume method with correction function", Multiscale Model. Simul. 8(1) (2009) [4] G. Mariethoz, P. Renard, and J. Straubhaar "The Direct Sampling method to perform multiple-point geostatistical simulations", Water Resour. Res., 46 (2010) [5] P. Bayer et al., "Three-dimensional high resolution fluvio-glacial aquifer analog", J. Hydro 405 (2011) 19
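    A minimal sketch of the distance-based selection step described above, with plain k-means on hypothetical proxy breakthrough curves standing in for the kernel machinery of the DKM; the exact model would then be solved only for the selected medoid realizations.

```python
# Cluster realizations by proxy-response distance; keep one medoid per cluster.
import numpy as np

rng = np.random.default_rng(2)
n_real, n_times, k = 200, 50, 10
t = np.linspace(0.0, 1.0, n_times)

# hypothetical proxy breakthrough curves for an ensemble of permeability fields
arrival = rng.lognormal(mean=-1.0, sigma=0.4, size=n_real)
proxy = 1.0 / (1.0 + np.exp(-(t - arrival[:, None]) / 0.05))

centers = proxy[rng.choice(n_real, k, replace=False)]
for _ in range(25):                              # k-means in curve space
    d2 = ((proxy[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    centers = np.array([proxy[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])

selected = []                                    # medoid = member nearest center
for j in range(k):
    members = np.flatnonzero(labels == j)
    if members.size:
        d2m = ((proxy[members] - centers[j]) ** 2).sum(axis=1)
        selected.append(int(members[d2m.argmin()]))
weights = np.bincount(labels, minlength=k) / n_real  # cluster sizes as weights
print("exact-model runs needed:", selected)
print("quantile weights:", np.round(weights, 2))
```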

  18. Physical Principles of Evolution

    NASA Astrophysics Data System (ADS)

    Schuster, Peter

    Theoretical biology is incomplete without a comprehensive theory of evolution, since evolution is at the core of biological thought. Evolution is visualized as a migration process in genotype or sequence space that is either an adaptive walk driven by some fitness gradient or a random walk in the absence of (sufficiently large) fitness differences. The Darwinian concept of natural selection consisting in the interplay of variation and selection is based on a dichotomy: All variations occur on genotypes whereas selection operates on phenotypes, and relations between genotypes and phenotypes, as encapsulated in a mapping from genotype space into phenotype space, are central to an understanding of evolution. Fitness is conceived as a function of the phenotype, represented by a second mapping from phenotype space into nonnegative real numbers. In the biology of organisms, genotype-phenotype maps are enormously complex and relevant information on them is exceedingly scarce. The situation is better in the case of viruses but so far only one example of a genotype-phenotype map, the mapping of RNA sequences into RNA secondary structures, has been investigated in sufficient detail. It provides direct information on RNA selection in vitro and test-tube evolution, and it is a basis for testing in silico evolution on a realistic fitness landscape. Most of the modeling efforts in theoretical and mathematical biology today are done by means of differential equations but stochastic effects are of undeniably great importance for evolution. Population sizes are much smaller than the numbers of genotypes constituting sequence space. Every mutant, after all, has to begin with a single copy. Evolution can be modeled by a chemical master equation, which (in principle) can be approximated by a stochastic differential equation. In addition, simulation tools are available that compute trajectories for master equations. The accessible population sizes in the range of 10^7 ≤ N ≤ 10^8 molecules are commonly too small for problems in chemistry but sufficient for biology.
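    A minimal sketch of the kind of master-equation trajectory simulation mentioned above, using Gillespie's stochastic simulation algorithm for a toy two-genotype system with replication, rare mutation and density-dependent death; all rates and population sizes are illustrative.

```python
# Gillespie SSA for a toy replication-mutation-death system.
import numpy as np

rng = np.random.default_rng(3)
n = np.array([990, 10])          # copies of genotype 0 and of a fitter mutant 1
f = np.array([1.0, 1.1])         # replication rates (fitness)
mu, death, t, t_end = 1e-3, 1.0, 0.0, 15.0

while t < t_end and n.sum() > 0:
    birth = f * n                                    # replication propensities
    rates = np.concatenate([birth, death * n * n.sum() / 1000.0])  # logistic death
    total = rates.sum()
    t += rng.exponential(1.0 / total)                # time to next reaction
    r = rng.choice(4, p=rates / total)
    if r < 2:                                        # replication, maybe mutated
        child = 1 - r if rng.random() < mu else r
        n[child] += 1
    else:                                            # death event
        n[r - 2] -= 1

print(f"t={t:.1f}: genotype counts = {n}, mutant fraction = {n[1] / n.sum():.2f}")
```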

  19. Principles of Rock Mechanics

    NASA Astrophysics Data System (ADS)

    Beeler, N. M.

    Imagine for a moment that you are a field structural geologist, and you have just realized that your star graduate student does not know how to estimate the failure strength of intact rock at 10 km depth in a normal faulting environment. Or perhaps you are a geophysicist with graduate students modeling mantle convection who, as you come to find out, do not know what a dislocation is. You might decide that your students need to take a course in basic rock mechanics, but, and this may be easiest to imagine, you are the only staff member in your department available to teach such a course. If you are developing an introductory course in rock mechanics or you have been teaching such a course without a suitable text, this new book by Ruud Weijermars was written specifically for you and your students. Principles of Rock Mechanics is a textbook for a one-semester course for graduate students and advanced undergraduates. There are 13 chapters, a math review section, and the obligatory introduction and final overview chapters. Each chapter is designed to be covered in two 50-minute lectures and one laboratory session. Following a formal introduction to the topic, the subsequent seven chapters serve as an introduction to the physical concepts and processes: physical quantities in rock mechanics, force and pressure, stress, elasticity, brittle failure, and ductile creep, taking the students to midterm. An unusual and welcome feature appears at the midsemester point—a math review of notation and associated concepts: differentiation of vectors and scalars, differential equations, tensors, matrices and determinants, and complex variables. This review provides an indication of the rigor to follow.

  20. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker
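    A minimal worked example of the GUM procedure the book elaborates on: combine the standard uncertainties of the inputs through the sensitivity coefficients of the measurement model, then expand with a coverage factor. All numbers are invented for illustration.

```python
# Combined and expanded uncertainty for the model R = V / I.
import numpy as np

V, u_V = 10.023, 0.004      # volts, standard uncertainty
I, u_I = 0.50130, 0.00021   # amperes, standard uncertainty (uncorrelated)

R = V / I
c_V = 1.0 / I               # sensitivity coefficient dR/dV
c_I = -V / I**2             # sensitivity coefficient dR/dI
u_R = np.sqrt((c_V * u_V) ** 2 + (c_I * u_I) ** 2)   # combined standard unc.
U = 2.0 * u_R               # expanded uncertainty, coverage factor k = 2

print(f"R = {R:.4f} ohm, u_c = {u_R:.4f} ohm, U(k=2) = {U:.4f} ohm")
```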

  1. Concepts and Practice of Verification, Validation, and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Oberkampf, W. L.

    2014-12-01

    Verification and validation (V&V) are the primary means to assess the numerical and physics modeling accuracy, respectively, in computational simulation. Code verification assesses the reliability of the software coding and the numerical algorithms used in obtaining a solution, while solution verification addresses numerical error estimation of the computational solution of a mathematical model for a specified set of initial and boundary conditions. Validation assesses the accuracy of the mathematical model as compared to experimentally measured response quantities of the system being modeled. As these experimental data are typically available only for simplified subsystems or components of the system, model validation commonly provides limited ability to assess model accuracy directly. Uncertainty quantification (UQ), specifically in regard to predictive capability of a mathematical model, attempts to characterize and estimate the total uncertainty for conditions where no experimental data are available. Specific sources of uncertainty that can impact the total predictive uncertainty are: the assumptions and approximations in the formulation of the mathematical model, the error incurred in the numerical solution of the discretized model, the information available for stochastic input data for the system, and the extrapolation of the mathematical model to conditions where no experimental data are available. This presentation will briefly discuss the principles and practices of VVUQ from both the perspective of computational modeling and simulation, as well as the difficult issue of estimating predictive capability. Contrasts will be drawn between weak and strong code verification testing, and model validation as opposed to model calibration. Closing remarks will address what needs to be done to improve the value of information generated by computational simulation for improved decision-making.
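    A minimal sketch of a standard solution-verification calculation of the kind discussed above: observed order of accuracy, Richardson extrapolation and a grid convergence index from three systematically refined grids. The grid results are hypothetical, and the bookkeeping follows common V&V practice rather than anything specific to this presentation.

```python
# Observed order of accuracy and Richardson extrapolation from three grids.
import numpy as np

f3, f2, f1 = 0.9712, 0.9863, 0.9901   # hypothetical coarse/medium/fine results
r = 2.0                               # grid refinement ratio

p = np.log((f2 - f3) / (f1 - f2)) / np.log(r)         # observed order
f_exact = f1 + (f1 - f2) / (r**p - 1.0)               # extrapolated value
gci_fine = 1.25 * abs((f1 - f2) / f1) / (r**p - 1.0)  # grid convergence index

print(f"observed order p = {p:.2f}")
print(f"extrapolated value = {f_exact:.4f}, GCI(fine) = {100 * gci_fine:.2f}%")
```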

  2. Solving navigational uncertainty using grid cells on robots.

    PubMed

    Milford, Michael J; Wiles, Janet; Wyeth, Gordon F

    2010-01-01

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our study to be a starting point for animal experiments that test navigation in perceptually ambiguous environments. PMID:21085643
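    A minimal sketch of the core idea of maintaining multiple pose hypotheses under perceptual ambiguity. This is a plain histogram filter, not the RatSLAM attractor network itself; the corridor geometry and landmarks are invented.

```python
# Histogram filter: two identical "A" landmarks create two pose hypotheses,
# which persist under motion until a distinguishing "B" observation arrives.
import numpy as np

n_cells = 100                             # discretized 1-D corridor of pose cells
belief = np.ones(n_cells) / n_cells
landmarks = {20: "A", 60: "A", 80: "B"}   # two identical "A" landmarks: ambiguity

def observe(belief, seen, sigma=3.0):
    like = np.full(n_cells, 1e-6)
    x = np.arange(n_cells)
    for pos, tag in landmarks.items():
        if tag == seen:                   # any matching landmark is plausible
            like += np.exp(-0.5 * ((x - pos) / sigma) ** 2)
    belief = belief * like
    return belief / belief.sum()

def move(belief, step=5):                 # odometry shift plus diffusion
    belief = np.roll(belief, step)
    return np.convolve(belief, [0.25, 0.5, 0.25], mode="same")

belief = observe(belief, "A")             # two hypothesis peaks: cells 20 and 60
for _ in range(4):
    belief = move(belief)                 # travel 20 cells; both peaks persist
belief = observe(belief, "B")             # only the pose consistent with B survives
print("resolved pose cell:", belief.argmax())
```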

  3. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest, as shown by the appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system ('PIV-MS'), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the 'PIV-HDR' (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for comparison of the measurement accuracy of existing or newly developed PIV interrogation algorithms. The database is publicly available on the website www.piv.de/uncertainty.

  4. Testing the strong equivalence principle by radio ranging

    NASA Technical Reports Server (NTRS)

    Canuto, V. M.; Goldman, I.; Shapiro, I. I.

    1984-01-01

    Planetary range data offer the most promising means to test the validity of the Strong Equivalence Principle (SEP). Analytical expressions for the perturbation in the 'range' expected from an SEP violation predicted by the 'variation-of-G' method and by the 'two-times' approach are derived and compared. The dominant term in both expressions is quadratic in time. Analysis of existing range data should allow a determination of the coefficient of this term with a one-standard-deviation uncertainty of about 1 part in 100 billion/yr.

  5. Optimality principles for the visual code

    NASA Astrophysics Data System (ADS)

    Pitkow, Xaq

    One way to try to make sense of the complexities of our visual system is to hypothesize that evolution has developed nearly optimal solutions to the problems organisms face in the environment. In this thesis, we study two such principles of optimality for the visual code. In the first half of this dissertation, we consider the principle of decorrelation. Influential theories assert that the center-surround receptive fields of retinal neurons remove spatial correlations present in the visual world. It has been proposed that this decorrelation serves to maximize information transmission to the brain by avoiding transfer of redundant information through optic nerve fibers of limited capacity. While these theories successfully account for several aspects of visual perception, the notion that the outputs of the retina are less correlated than its inputs has never been directly tested at the site of the putative information bottleneck, the optic nerve. We presented visual stimuli with naturalistic image correlations to the salamander retina while recording responses of many retinal ganglion cells using a microelectrode array. The output signals of ganglion cells are indeed decorrelated compared to the visual input, but the receptive fields are only partly responsible. Much of the decorrelation is due to the nonlinear processing by neurons rather than the linear receptive fields. This form of decorrelation dramatically limits information transmission. Instead of improving coding efficiency we show that the nonlinearity is well suited to enable a combinatorial code or to signal robust stimulus features. In the second half of this dissertation, we develop an ideal observer model for the task of discriminating between two small stimuli which move along an unknown retinal trajectory induced by fixational eye movements. The ideal observer is provided with the responses of a model retina and guesses the stimulus identity based on the maximum likelihood rule, which involves sums over all random walk trajectories. These sums can be implemented in a biologically plausible way. The necessary ingredients are: neurons modeled as a cascade of a linear filter followed by a static nonlinearity, a recurrent network with additive and multiplicative interactions between neurons, and divisive global inhibition. This architecture implements Bayesian inference by representing likelihoods as neural activity which can then diffuse through the recurrent network and modulate the influence of later information. We also develop approximation methods for characterizing the performance of the ideal observer. We find that the effect of positional uncertainty is essentially to slow the acquisition of signal. The time scaling is related to the size of the uncertainty region, which is in turn related to both the signal strength and the statistics of the fixational eye movements. These results imply that localization cues should determine the slope of the performance curve in time.

  6. Path planning under spatial uncertainty.

    PubMed

    Wiener, Jan M; Lafon, Matthieu; Berthoz, Alain

    2008-04-01

    In this article, we present experiments studying path planning under spatial uncertainties. In the main experiment, the participants' task was to navigate the shortest possible path to find an object hidden in one of four places and to bring it to the final destination. The probability of finding the object (probability matrix) was different for each of the four places and varied between conditions. Given such uncertainties about the object's location, planning a single path is not sufficient. Participants had to generate multiple consecutive plans (metaplans)--for example: If the object is found in A, proceed to the destination; if the object is not found, proceed to B; and so on. The optimal solution depends on the specific probability matrix. In each condition, participants learned a different probability matrix and were then asked to report the optimal metaplan. Results demonstrate effective integration of the probabilistic information about the object's location during planning. We present a hierarchical planning scheme that could account for participants' behavior, as well as for systematic errors and differences between conditions. PMID:18491490
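    A minimal sketch of the underlying planning problem: brute-force the expected travelled distance over all search orders, given a probability for finding the object at each place. The coordinates and probability matrix are made up, but they illustrate why the optimal metaplan depends on the probabilities.

```python
# Expected path length of a search order under a known probability matrix.
from itertools import permutations

import numpy as np

p = {"A": 0.6, "B": 0.2, "C": 0.15, "D": 0.05}   # P(object is at each place)
pts = {"start": (0, 0), "goal": (10, 0),
       "A": (2, 3), "B": (4, -2), "C": (7, 3), "D": (8, -1)}

def dist(a, b):
    return float(np.hypot(pts[a][0] - pts[b][0], pts[a][1] - pts[b][1]))

def expected_length(order):
    # walk start -> order[0] -> ...; if found at a place, detour to the goal
    e, walked, here = 0.0, 0.0, "start"
    for place in order:
        walked += dist(here, place)
        here = place
        e += p[place] * (walked + dist(place, "goal"))
    return e                      # probabilities sum to 1 over the four places

best = min(permutations(p), key=expected_length)
print("optimal metaplan: search", " then ".join(best),
      f"(expected length {expected_length(best):.2f})")
```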

  7. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients, combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. The approach was evaluated on patient data deformed according to a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.

  8. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
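    A minimal sketch of the propagation exercise described above: sample dispersion parameters from stand-ins for elicited distributions and push them through a ground-level, centerline Gaussian plume formula. The source term, distributions and receptor are all invented.

```python
# Monte Carlo propagation of elicited-style parameter distributions through
# a Gaussian plume model (ground-level, centerline, with ground reflection).
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
Q, u, H = 1.0, 5.0, 50.0            # source (Bq/s), wind (m/s), stack height (m)

# stand-ins for elicited uncertainty distributions of the plume sigmas at 1 km
sigma_y = rng.lognormal(mean=np.log(80.0), sigma=0.3, size=n)
sigma_z = rng.lognormal(mean=np.log(40.0), sigma=0.4, size=n)

C = (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))

q05, q50, q95 = np.quantile(C, [0.05, 0.5, 0.95])
print(f"C at 1 km: median {q50:.2e}, 90% band [{q05:.2e}, {q95:.2e}] Bq/m^3")
```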

  9. Extrema Principles Of Dissipation In Fluids

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Karamcheti, Krishnamurty

    1991-01-01

    Report discusses application of principle of least action and other variational or extrema principles to dissipation of energy and production of entropy in fluids. Principle of least action applied successfully to dynamics of particles and to quantum mechanics, but not universally accepted that variational principles applicable to thermodynamics and hydrodynamics. Report argues for applicability of some extrema principles to some simple flows.

  10. An Inconvenient Principle

    NASA Astrophysics Data System (ADS)

    Bellac, Michel Le

    2014-11-01

    At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle submitted to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces in mechanics a quantity characteristic of optics, the velocity of light. However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".

  11. Sciama's principle and the dynamics of galaxies I: Sciama's principle

    E-print Network

    Rourke, Colin

    Mach's principle: in any dynamic theory there are certain fixed background structures. Sciama's attempt was to base a full theory of dynamics on Mach's principle; his idea is that every particle Q of matter ... For our purposes, we need a statement of the principle which is more precise than the usual one; this is developed in subsequent papers. The discussion also touches on the nature of the big bang (Tod [12]). MSC: 85A05; 85D99, 85A40, 83F05.

  12. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized; techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
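    A minimal sketch of one idea above, that calibration uncertainty becomes a function of the measurement itself: fit a calibration curve with its covariance and propagate that covariance to any reading by the delta method. The data are invented, and this captures only the fit contribution, not standard or modeling-error terms.

```python
# Inverse calibration fit; uncertainty of a calibrated value at a given reading.
import numpy as np

rng = np.random.default_rng(6)
x_std = np.linspace(0.0, 10.0, 11)                           # calibration standards
y_obs = 2.0 * x_std + 0.5 + rng.normal(0, 0.05, x_std.size)  # sensor readings

coef, cov = np.polyfit(y_obs, x_std, deg=1, cov=True)  # x = a*y + b, with cov(a,b)

def measured_value(y):
    """Calibrated value and its standard uncertainty at sensor reading y."""
    g = np.array([y, 1.0])                       # gradient of x w.r.t. (a, b)
    return np.polyval(coef, y), float(np.sqrt(g @ cov @ g))

x, u_x = measured_value(7.3)
print(f"reading 7.3 -> value {x:.3f} +/- {u_x:.3f} (1-sigma, fit only)")
```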

  13. Position-momentum uncertainty relations in the presence of quantum memory

    SciTech Connect

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  14. Position-Momentum Uncertainty Relations in the Presence of Quantum Memory

    E-print Network

    Fabian Furrer; Mario Berta; Marco Tomamichel; Volkher B. Scholz; Matthias Christandl

    2015-01-05

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  15. Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods

    SciTech Connect

    Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

    2013-12-01

    The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15 percent in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters, expected uncertainties are about 20 percent. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations, CILTS should be considered as having something like a 'factor of two' uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
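    A minimal sketch of the first-principles combination step: under the ideal well-mixed assumptions the air change rate is A = E / (C V), so the relative uncertainties of emission rate, time-averaged concentration and volume add in quadrature. All numbers are illustrative, not from the study.

```python
# Quadrature combination of relative uncertainties for A = E / (C * V).
import numpy as np

E, rel_E = 5.0e-6, 0.05      # tracer emission rate (m^3/h) and 5% rel. unc.
C, rel_C = 2.0e-9, 0.10      # time-averaged concentration (volume fraction), 10%
V, rel_V = 300.0, 0.05       # house volume (m^3), 5%

A = E / (C * V)              # air change rate (1/h)
rel_A = np.sqrt(rel_E**2 + rel_C**2 + rel_V**2)
print(f"A = {A:.2f} 1/h +/- {100 * rel_A:.0f}% (ideal, well-mixed assumptions)")
```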

  16. Intelligent Information Retrieval: Diagnosing Information Need. Part II. Uncertainty Expansion in a Prototype of a Diagnostic IR Tool.

    ERIC Educational Resources Information Center

    Cole, Charles; Cantero, Pablo; Sauve, Diane

    1998-01-01

    Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…

  17. Get Provoked: Applying Tilden's Principles.

    ERIC Educational Resources Information Center

    Shively, Carol A.

    1995-01-01

    This address given to the Division of Interpretation, Yellowstone National Park, Interpretive Training, June 1993, examines successes and failures in interpretive programs for adults and children in light of Tilden's principles. (LZ)

  18. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates has recently been published, wherein the experimentally-obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that will reproduce the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered.
Lastly, we show that while the capability of the multispecies measurement presents a step-change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data on global combustion properties are still necessary to predict fuel combustion.
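    A minimal sketch of the constraining mechanism at work above, reduced to a Gaussian-linear (Kalman-type) update: one precise pseudo-observation of a target that depends linearly on two normalized rate parameters shrinks their posterior uncertainty. The sensitivities and uncertainties are invented; the actual thesis method works through polynomial chaos expansions.

```python
# Prior parameter uncertainty + one precise linear measurement -> posterior.
import numpy as np

prior_mean = np.zeros(2)            # normalized log-rate parameters
prior_cov = np.diag([1.0, 1.0])     # prior "factor of uncertainty"

G = np.array([[0.9, 0.3]])          # sensitivity of the measured target
y, u_y = np.array([0.5]), 0.05      # measurement and its standard uncertainty

S = G @ prior_cov @ G.T + u_y**2    # innovation covariance
K = prior_cov @ G.T @ np.linalg.inv(S)
post_mean = prior_mean + (K @ (y - G @ prior_mean)).ravel()
post_cov = prior_cov - K @ G @ prior_cov

print("posterior mean:", np.round(post_mean, 3))
print("prior std    :", np.round(np.sqrt(np.diag(prior_cov)), 3))
print("posterior std:", np.round(np.sqrt(np.diag(post_cov)), 3))
```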

  19. Incorporating Forecast Uncertainty in Utility Control Center

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last minute solutions in the near real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).

  20. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender's beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker's payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
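    A minimal sketch of the probability-box idea: when each payoff observation is known only as an interval, the empirical CDF becomes a band and the expected payoff is bounded rather than pinned to a point. The interval data are invented.

```python
# Interval-valued payoff samples -> bounds on E[payoff] and a CDF band (p-box).
import numpy as np

lo = np.array([1.0, 2.5, 3.0, 4.2, 5.0])   # hypothetical interval lower ends
hi = np.array([2.0, 3.5, 4.5, 5.0, 6.5])   # hypothetical interval upper ends

# bounds on the expectation follow directly from the interval endpoints
print(f"E[payoff] in [{lo.mean():.2f}, {hi.mean():.2f}]")

# p-box: the CDF built from upper endpoints is the lower envelope, and vice versa
grid = np.linspace(0.0, 7.0, 8)
cdf_lower = np.array([(hi <= g).mean() for g in grid])   # pessimistic CDF
cdf_upper = np.array([(lo <= g).mean() for g in grid])   # optimistic CDF
for g, cl, cu in zip(grid, cdf_lower, cdf_upper):
    print(f"P(payoff <= {g:.0f}) in [{cl:.1f}, {cu:.1f}]")
```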

  1. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of PI, thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated variable dependence in the equation for PI.
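    A minimal sketch of the aleatory part of such an evaluation: sample detection probabilities, delays and a response time, then Monte Carlo the probability of interruption PI. The three-layer scenario, all distributions and the simplified timeline logic are invented for illustration.

```python
# Monte Carlo PI: detection at some layer, then enough time left to respond.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
# three protection layers: sampled detection probabilities and delays (s)
p_det = rng.beta(a=[8, 5, 6], b=[2, 5, 3], size=(n, 3))
delay = rng.lognormal(mean=np.log([30, 60, 45]), sigma=0.3, size=(n, 3))
response_time = rng.normal(150.0, 20.0, size=n)      # guard force response

detected = rng.random((n, 3)) < p_det                # which layers detect
cum_delay = np.cumsum(delay, axis=1)
task_time = cum_delay[:, -1]                         # total adversary task time
first = np.where(detected.any(axis=1), detected.argmax(axis=1), -1)
# time remaining after detection at the start of the first detecting layer
time_left = np.where(first >= 0,
                     task_time - cum_delay[np.arange(n), first]
                     + delay[np.arange(n), first], 0.0)
pi = np.mean((first >= 0) & (response_time <= time_left))
print(f"PI ~ {pi:.3f} (mean over aleatory uncertainty)")
```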

  2. The uncertainty of the half-life

    NASA Astrophysics Data System (ADS)

    Pommé, S.

    2015-06-01

    Half-life measurements of radionuclides are undeservedly perceived as 'easy' and the experimental uncertainties are commonly underestimated. Data evaluators, scanning the literature, are faced with bad documentation, lack of traceability, incomplete uncertainty budgets and discrepant results. Poor control of uncertainties has its implications for the end-user community, varying from limitations to the accuracy and reliability of nuclear-based analytical techniques to the fundamental question whether half-lives are invariable or not. This paper addresses some issues from the viewpoints of the user community and of the decay data provider. It addresses the propagation of the uncertainty of the half-life in activity measurements and discusses different types of half-life measurements, typical parameters influencing their uncertainty, a tool to propagate the uncertainties and suggestions for a more complete reporting style. Problems and solutions are illustrated with striking examples from literature.
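    A minimal worked example of the propagation discussed above: the relative uncertainty that a half-life contributes to a decay correction grows linearly with the decay time. Cs-137-like numbers are used purely for illustration.

```python
# Half-life uncertainty propagated into a decay-correction factor.
import numpy as np

T12, u_T12 = 30.05, 0.08        # half-life and its standard uncertainty (years)
t = 20.0                        # decay interval (years)

decay_factor = np.exp(-np.log(2.0) * t / T12)
# |d ln f / d T12| * u_T12 = ln2 * (t / T12) * (u_T12 / T12)
rel_u = np.log(2.0) * (t / T12) * (u_T12 / T12)

print(f"decay factor {decay_factor:.4f} +/- {100 * rel_u:.2f}% (from T1/2 alone)")
```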

  3. False precision, surprise and improved uncertainty assessment.

    PubMed

    Parker, Wendy S; Risbey, James S

    2015-11-28

    An uncertainty report describes the extent of an agent's uncertainty about some matter. We identify two basic requirements for uncertainty reports, which we call faithfulness and completeness. We then discuss two pitfalls of uncertainty assessment that often result in reports that fail to meet these requirements. The first involves adopting a one-size-fits-all approach to the representation of uncertainty, while the second involves failing to take account of the risk of surprises. In connection with the latter, we respond to the objection that it is impossible to account for the risk of genuine surprises. After outlining some steps that both scientists and the bodies who commission uncertainty assessments can take to help avoid these pitfalls, we explain why striving for faithfulness and completeness is important. PMID:26460113

  4. Equivalence Principle and Gravitational Redshift

    SciTech Connect

    Hohensee, Michael A.; Chu, Steven; Mueller, Holger; Peters, Achim

    2011-04-15

    We investigate leading order deviations from general relativity that violate the Einstein equivalence principle in the gravitational standard model extension. We show that redshift experiments based on matter waves and clock comparisons are equivalent to one another. Consideration of torsion balance tests, along with matter-wave, microwave, optical, and Moessbauer clock tests, yields comprehensive limits on spin-independent Einstein equivalence principle-violating standard model extension terms at the 10^-6 level.

  5. Publications: Principles of layout design

    E-print Network

    Boynton, Walter R.

    Slide-deck fragments: "Publications: Principles of layout design," Caroline Wicks, October 2013. Outline: general design principles; determine products & timelines; workshop content. On general design principles: use color wisely; choose a color scheme.

  6. Schild Action and Space-Time Uncertainty Principle in String Theory

    E-print Network

    Yoneya, T

    1997-01-01

    We show that the path-integral quantization of relativistic strings with the Schild action is essentially equivalent to the usual Polyakov quantization at critical space-time dimensions. We then present an interpretation of the Schild action which points towards a derivation of superstring theory as a theory of quantized space-time where the squared string scale, $\ell_s^2 \sim \alpha'$, plays the role of the minimum quantum for space-time areas. A tentative approach towards such a goal is proposed, based on a microcanonical formulation of large N supersymmetric matrix model.

  7. Heisenberg's uncertainty principle for simultaneous measurement of positive-operator-valued measures

    NASA Astrophysics Data System (ADS)

    Miyadera, Takayuki; Imai, Hideki

    2008-11-01

    A limitation on simultaneous measurement of two arbitrary positive-operator-valued measures is discussed. In general, simultaneous measurement of two noncommutative observables is only approximately possible. Following Werner’s formulation, we introduce a distance between observables to quantify an accuracy of measurement. We derive an inequality that relates the achievable accuracy with noncommutativity between two observables. As a byproduct a necessary condition for two positive-operator-valued measures to be simultaneously measurable is obtained.

  8. Linear and nonlinear response functions of the Morse oscillator: Classical divergence and the uncertainty principle

    E-print Network

    Cao, Jianshu

    The quantum Morse oscillator is explored to formulate the coherent state, the phase-space representations of the annihilation and creation operators, and their classical limits. The formulation allows us to calculate linear and nonlinear response functions; since these are intrinsically quantum mechanical, we speculate that the classical divergence can be removed by a careful treatment consistent with the uncertainty principle.

  9. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  10. Does a String-Particle Dualism Indicate the Uncertainty Principle's Philosophical Dichotomy?

    NASA Astrophysics Data System (ADS)

    Mc Leod, David; Mc Leod, Roger

    2007-04-01

    String theory may allow resonances of neutrino-wave-strings to account for all experimentally detected phenomena. Particle theory logically, and physically, provides an alternate, contradictory dualism. Is it contradictory to symbolically and simultaneously state that λp = h, but that the product of position and momentum must be greater than, or equal to, the same (scaled) Planck's constant? Our previous electron and positron models require `membrane' vibrations of string-linked neutrinos, in closed loops, to behave like traveling waves, Tws, intermittently metamorphosing into alternately ascending and descending standing waves, Sws, between the nodes, which advance sequentially through 360 degrees. Accumulated time passages as Tws detail required ``loop currents'' supplying magnetic moments. Remaining time partitions into the Sws' alternately ascending and descending phases: the physical basis of the experimentally established 3D modes of these ``particles.'' Waves seem to indicate that point mass cannot be required to exist instantaneously at one point; Mott's and Sneddon's Wave Mechanics says that a constant, [mass], is present. String-like resonances may also account for homeopathy's efficacy, dark matter, and constellations' ``stick-figure projections,'' as indicated by some traditional cultures, all possibly involving neutrino strings. To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2007.NES07.C2.5

  11. Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.

    ERIC Educational Resources Information Center

    Wassermann, Selma

    2001-01-01

    Argues that reliance on the outcome of quantitative standardized tests to assess student performance is a misplaced quest for certainty in an uncertain world. Reviews and lauds a Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill. (PKP)

  12. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

  13. Induction of models under uncertainty

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter

    1986-01-01

    This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; particularly, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one aimed at keeping the theory structure as simple as possible, while still reflecting any structure discovered in the data.
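
    A minimal sketch of the Bayesian hypothesis choice described above, with invented likelihoods, priors, and data; the prior here plays the simplicity-favoring role the paper discusses.

        # Two candidate class models for a binary feature; numbers invented.
        likelihood = {"H1": 0.8, "H2": 0.3}   # P(feature present | hypothesis)
        prior = {"H1": 0.6, "H2": 0.4}        # simplicity-weighted priors

        observations = [1, 1, 0, 1, 1]        # gathered world data

        def posterior(prior, likelihood, data):
            """Bayes' theorem: multiply the prior by each observation's
            likelihood, then normalize over the hypothesis set."""
            post = dict(prior)
            for x in data:
                for h in post:
                    p = likelihood[h]
                    post[h] *= p if x == 1 else (1 - p)
            z = sum(post.values())
            return {h: v / z for h, v in post.items()}

        print(posterior(prior, likelihood, observations))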

  14. Uncertainties in nuclear decay data evaluations

    NASA Astrophysics Data System (ADS)

    Bé, M.-M.; Chechev, V. P.; Pearce, A.

    2015-06-01

    Over the years, scientists have compiled and evaluated experimental results and combined these with theoretical studies with the goal of obtaining the best values, and their uncertainties, for the quantities related to radioactive decay. To meet the demand, an international group, the Decay Data Evaluation Project, was organised in 1995. The methodology adopted by this group is detailed. Some examples are given explaining how the final uncertainties are assessed from the experimental results and their uncertainties.

  15. Joint quantum measurements with minimum uncertainty

    E-print Network

    Martin Ringbauer; Devon N. Biggerstaff; Matthew A. Broome; Alessandro Fedrizzi; Cyril Branciard; Andrew G. White

    2013-08-28

    Quantum physics constrains the accuracy of joint measurements of incompatible observables. Here we test tight measurement-uncertainty relations using single photons. We implement two independent, idealized uncertainty-estimation methods, the 3-state method and the weak-measurement method, and adapt them to realistic experimental conditions. Exceptional quantum state fidelities of up to 0.99998(6) allow us to verge upon the fundamental limits of measurement uncertainty.

  16. New measure of uncertainty importance for PSA

    SciTech Connect

    Chun, Moon-Hyun; Han, Seok-Jung

    1997-12-01

    A number of uncertainty importance measures have been proposed to provide information on the relative contribution of input uncertainties to output uncertainties, but each has some weakness. This paper proposes a new uncertainty importance measure based on distributional sensitivity analysis. The new measure utilizes the metric distance (MD) calculated from cumulative distribution functions. The MD provides a more useful measure than existing approaches because it considers the characteristics of the entire distribution and can be calculated directly from an output distribution without any prior assumptions.
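
    A hedged sketch of a distributional-sensitivity measure of this kind, using the L1 area between empirical CDFs as the metric distance; the toy model and distributions are invented, and the paper's exact MD definition may differ.

        import numpy as np

        def metric_distance(samples_a, samples_b, n_grid=512):
            """Distance between two empirical CDFs, taken here as the
            area between them (an L1 metric)."""
            lo = min(samples_a.min(), samples_b.min())
            hi = max(samples_a.max(), samples_b.max())
            grid = np.linspace(lo, hi, n_grid)
            cdf_a = np.searchsorted(np.sort(samples_a), grid, side="right") / samples_a.size
            cdf_b = np.searchsorted(np.sort(samples_b), grid, side="right") / samples_b.size
            return np.sum(np.abs(cdf_a - cdf_b)) * (grid[1] - grid[0])

        rng = np.random.default_rng(1)
        model = lambda x1, x2: x1 * np.exp(x2)             # toy model, invented

        base = model(rng.normal(1, 0.3, 5000), rng.normal(0, 0.2, 5000))
        x1_pinned = model(1.0, rng.normal(0, 0.2, 5000))   # x1 fixed at nominal
        print("importance of x1 ~", metric_distance(base, x1_pinned))

    A large distance between the baseline output CDF and the CDF with one input pinned signals that the pinned input is an important contributor to output uncertainty.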

  17. Renyi Entropy and the Uncertainty Relations

    SciTech Connect

    Bialynicki-Birula, Iwo

    2007-02-21

    Quantum mechanical uncertainty relations for the position and the momentum and for the angle and the angular momentum are expressed in the form of inequalities involving the Renyi entropies. These uncertainty relations hold not only for pure but also for mixed states. Analogous uncertainty relations are valid also for a pair of complementary observables (the analogs of x and p) in N-level systems. All these uncertainty relations become more attractive when expressed in terms of the symmetrized Renyi entropies. The mathematical proofs of all the inequalities discussed in this paper can be found in Phys. Rev. A 74, No. 5 (2006); arXiv:quant-ph/0608116.
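
    For context on the N-level complementary-observable case mentioned above, the widely used Rényi-entropic relation of Maassen-Uffink type (quoted from the general literature, not from this paper) reads:

        % Renyi entropic uncertainty relation (Maassen-Uffink form), for
        % measurements in bases {|a_i>} and {|b_j>} of an N-level system:
        H_\alpha(A) + H_\beta(B) \;\ge\; -2 \ln c,
        \qquad c = \max_{i,j} \left| \langle a_i | b_j \rangle \right|,
        \qquad \tfrac{1}{\alpha} + \tfrac{1}{\beta} = 2 .
        % For mutually unbiased bases, c = 1/\sqrt{N}, so the bound is \ln N.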

  18. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper addresses important components of uncertainty in modeling water temperatures, and discusses several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, is meant to supplement the presentation given at this conference.

  19. Output Consensus of Heterogeneous Linear Discrete-Time Multiagent Systems With Structural Uncertainties.

    PubMed

    Li, Shaobao; Feng, Gang; Luo, Xiaoyuan; Guan, Xinping

    2015-12-01

    This paper investigates the output consensus problem of heterogeneous discrete-time multiagent systems with individual agents subject to structural uncertainties and different disturbances. A novel distributed control law based on internal reference models is first presented for output consensus of heterogeneous discrete-time multiagent systems without structural uncertainties, where internal reference models embedded in controllers are designed with the objective of reducing communication costs. Then based on the distributed internal reference models and the well-known internal model principle, a distributed control law is further presented for output consensus of heterogeneous discrete-time multiagent systems with structural uncertainties. It is shown in both cases that the consensus trajectory of the internal reference models determines the output trajectories of agents. Finally, numerical simulation results are provided to illustrate the effectiveness of the proposed control schemes. PMID:25622334

  20. Ideas underlying quantification of margins and uncertainties (QMU): a white paper.

    SciTech Connect

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  1. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    SciTech Connect

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab.
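
    As a hedged illustration of the DUA idea (not the GRESS/ADGEN implementation, which obtains derivatives analytically by computer calculus), a first-order propagation sketch with a toy model and invented uncertainties; finite differences stand in for the analytic derivatives:

        import numpy as np

        def dua_first_order(model, x0, u_x, h=1e-6):
            """First-order propagation u_y^2 = sum_i (dy/dx_i)^2 u_xi^2,
            with derivatives approximated by forward finite differences."""
            x0, u_x = np.asarray(x0, float), np.asarray(u_x, float)
            y0 = model(x0)
            grad = np.empty_like(x0)
            for i in range(x0.size):
                xp = x0.copy()
                xp[i] += h
                grad[i] = (model(xp) - y0) / h
            return y0, np.sqrt(np.sum((grad * u_x) ** 2))

        # Toy model with three uncertain parameters (illustrative only).
        model = lambda x: x[0] * np.log1p(x[1]) + x[2] ** 2
        y, u_y = dua_first_order(model, [2.0, 1.0, 0.5], [0.1, 0.2, 0.05])
        print(f"y = {y:.3f} +/- {u_y:.3f}")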

  2. Evaluation of the measurement uncertainty in screening immunoassays in blood establishments: computation of diagnostic accuracy models.

    PubMed

    Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard

    2015-02-01

    The European Union regulation for blood establishments does not require the evaluation of measurement uncertainty in virology screening tests, which is required by the ISO 15189 guideline following GUM principles. GUM modular approaches have been discussed by medical laboratory researchers, but no consensus has been achieved regarding practical application. Meanwhile, the application of empirical approaches fulfilling GUM principles has gained support. Blood establishments' screening tests accredited by ISO 15189 need to select an appropriate model, even though GUM models are intended uniquely for quantitative examination procedures. Alternative (to GUM) models focused on probability have been proposed for medical laboratories' diagnostic tests. This article reviews, discusses and proposes models for diagnostic accuracy in blood establishments' screening tests. The output of these models is an alternative to VIM's measurement uncertainty concept. Example applications are provided for an anti-HCV test where calculations were performed using a commercial spreadsheet. The results show that these models satisfy ISO 15189 principles and that the estimation of clinical sensitivity, clinical specificity, binary results agreement and area under the ROC curve are alternatives to the measurement uncertainty concept. PMID:25617905
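
    A minimal sketch of the diagnostic-accuracy quantities the article treats as alternatives to measurement uncertainty, computed from an invented 2x2 table (the article's anti-HCV example used a commercial spreadsheet; these counts are illustrative only):

        # Screening results vs. confirmed infection status; counts invented.
        tp, fn, fp, tn = 196, 4, 12, 788

        sensitivity = tp / (tp + fn)                   # clinical sensitivity
        specificity = tn / (tn + fp)                   # clinical specificity
        agreement = (tp + tn) / (tp + fn + fp + tn)    # binary results agreement

        print(f"sensitivity {sensitivity:.3f}, specificity {specificity:.3f}, "
              f"agreement {agreement:.3f}")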

  3. Form of prior for constrained thermodynamic processes with uncertainty

    NASA Astrophysics Data System (ADS)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both the entropy-conserving and the energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation for the prior and the estimated value of temperature obtained therefrom. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.

  4. Stimulating uncertainty: Amplifying the quantum vacuum with superconducting circuits

    E-print Network

    P. D. Nation; J. R. Johansson; M. P. Blencowe; Franco Nori

    2012-01-12

    The ability to generate particles from the quantum vacuum is one of the most profound consequences of Heisenberg's uncertainty principle. Although the significance of vacuum fluctuations can be seen throughout physics, the experimental realization of vacuum amplification effects has until now been limited to a few cases. Superconducting circuit devices, driven by the goal to achieve a viable quantum computer, have been used in the experimental demonstration of the dynamical Casimir effect, and may soon be able to realize the elusive verification of analogue Hawking radiation. This article describes several mechanisms for generating photons from the quantum vacuum and emphasizes their connection to the well-known parametric amplifier from quantum optics. Discussed in detail is the possible realization of each mechanism, or its analogue, in superconducting circuit systems. The ability to selectively engineer these circuit devices highlights the relationship between the various amplification mechanisms.

  5. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

    2014-11-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data as well as experimental measurements. In this work, U_{68.5} uncertainties are estimated at the 68.5% confidence level while U_{95} uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
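
    A hedged sketch of one such correlation-plane SNR metric, the primary peak ratio, evaluated on a synthetic correlation plane; the peak-masking window and all values are illustrative, not the paper's exact procedure.

        import numpy as np

        def primary_peak_ratio(corr):
            """Primary peak ratio (PPR): tallest correlation peak over the
            second-tallest, after subtracting the plane minimum to suppress
            background noise (cf. the minimum subtraction described above)."""
            c = corr - corr.min()
            i, j = np.unravel_index(np.argmax(c), c.shape)
            p1 = c[i, j]
            masked = c.copy()
            # Zero a small neighborhood of the primary peak, then find P2.
            masked[max(0, i - 2):i + 3, max(0, j - 2):j + 3] = 0.0
            p2 = masked.max()
            return p1 / p2 if p2 > 0 else np.inf

        rng = np.random.default_rng(2)
        corr = rng.random((64, 64)) * 0.2
        corr[40, 25] = 1.0                    # synthetic displacement peak
        print("PPR =", round(primary_peak_ratio(corr), 2))

    High PPR indicates a clean, unambiguous correlation peak and thus low measurement uncertainty; values near 1 flag ambiguous, likely invalid vectors.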

  6. Forecast communication through the newspaper Part 2: perceptions of uncertainty

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.

    2015-04-01

    In the first part of this review, I defined the media filter and how it can operate to frame and blame the forecaster for losses incurred during an environmental disaster. In this second part, I explore the meaning and role of uncertainty when a forecast, and its basis, is communicated through the response and decision-making chain to the newspaper, especially during a rapidly evolving natural disaster which has far-reaching business, political, and societal impacts. Within the media-based communication system, there remains a fundamental disconnect in the definition of uncertainty and the interpretation of the delivered forecast between various stakeholders. The definition and use of uncertainty differ especially between scientific, media, business, and political stakeholders. This is a serious problem for the scientific community when delivering forecasts to the public through the press. As reviewed in Part 1, the media filter can result in a negative frame, which itself is a result of bias, slant, spin, and agenda setting introduced during passage of the forecast and its uncertainty through the media filter. The result is invariably one of anger and fury, which causes loss of credibility and blaming of the forecaster. Generation of a negative frame can be aided by opacity of the decision-making process that the forecast is used to support. The impact of the forecast will be determined during passage through the decision-making chain where the precautionary principle and cost-benefit analysis, for example, will likely be applied. Choice of forecast delivery format, vehicle of communication, syntax of delivery, and lack of follow-up measures can further contribute to causing the forecast and its role to be misrepresented. Follow-up measures to negative frames may include appropriately worded press releases and conferences that target forecast misrepresentation or misinterpretation in an attempt to swing the slant back in favor of the forecaster. Review of meteorological, public health, media studies, social science, and psychology literature opens up a vast and interesting library that is not obvious to the volcanologist at first glance. It shows that forecasts and their uncertainty can be phrased, delivered, and followed up upon in a manner that reduces the chance of message distortion. The mass-media delivery vehicle requires careful tracking because the potential for forecast distortion can result in a frame that the scientific response is "absurd", "confused", "shambolic", or "dysfunctional." This can help set up a "frightened", "frustrated", "angry", even "furious" reaction to the forecast and forecaster.

  7. EDITORIAL: Squeezed states and uncertainty relations

    NASA Astrophysics Data System (ADS)

    Jauregue-Renaud, Rocio; Kim, Young S.; Man'ko, Margarita A.; Moya-Cessa, Hector

    2004-06-01

    This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9-13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México. This series of meetings began at the University of Maryland, College Park, USA, in March 1991. The second and third workshops were organized by the Lebedev Physical Institute in Moscow, Russia, in 1992 and by the University of Maryland Baltimore County, USA, in 1993, respectively. Afterwards, it was decided that the workshop series should be held every two years. Thus the fourth meeting took place at the University of Shanxi in China and was supported by the International Union of Pure and Applied Physics (IUPAP). The next three meetings in 1997, 1999 and 2001 were held in Lake Balatonfüred, Hungary, in Naples, Italy, and in Boston, USA, respectively. All of them were sponsored by IUPAP. The ninth workshop will take place in Besançon, France, in 2005. The conference has now become one of the major international meetings on quantum optics and the foundations of quantum mechanics, where most of the active research groups throughout the world present their new results. Accordingly this conference has been able to align itself to the current trend in quantum optics and quantum mechanics. The Puebla meeting covered most extensively the following areas: quantum measurements, quantum computing and information theory, trapped atoms and degenerate gases, and the generation and characterization of quantum states of light. The meeting also covered squeeze-like transformations in areas other than quantum optics, such as atomic physics, nuclear physics, statistical physics and relativity, as well as optical devices. There were many new participants at this meeting, particularly from Latin American countries including, of course, Mexico. There were many talks on the subjects traditionally covered in this conference series, including quantum fluctuations, different forms of squeezing, unlike kinds of nonclassical states of light, and distinct representations of the quantum superposition principle, such as even and odd coherent states. The entanglement phenomenon, frequently in the form of the EPR paradox, is responsible for the main advantages of quantum engineering compared with classical methods. Even though entanglement has been known since the early days of quantum mechanics, its properties, such as the most appropriate entanglement measures, are still under current investigation. The phenomena of dissipations and decoherence of the initial pure states are very important because the fast decoherence can destroy all the advantages of quantum processes in teleportation, quantum computing and image processing. Due to this, methods of controlling the decoherence, such as by the use of different kinds of nonlinearities and deformations, are also under study. From the very beginning of quantum mechanics, the uncertainty relations were basic inequalities distinguishing the classical and quantum worlds. Among the theoretical methods for quantum optics and quantum mechanics, this conference covered phase space and group representations, such as the Wigner and probability distribution functions, which provide an alternative approach to the Schrödinger or Heisenberg picture.
Different forms of probability representations of quantum states are important tools to be applied in studying various quantum phenomena, such as quantum interference, decoherence and quantum tomography. They have been established also as a very useful tool in all branches of classical optics. From the mathematical point of view, it is well known that the coherent and squeezed states are representations of the Lorentz group. It was noted throughout the conference that another form of the Lorentz group, namely, the 2 x 2 representation of the SL(2,c) group, is becoming

  8. Map scale and the communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Lark, Murray

    2015-04-01

    Conventionally the scale at which mapped information is presented in earth sciences reflects the uncertainty in this information. This partly reflects the cartographic sources of error in printed maps, but also conventions on the amount of underpinning observation on which the map is based. In soil surveys a convention is that the number of soil profile observations per unit area of printed map is fixed over a range of scales. For example, for surveys in the Netherlands, Steur (1961) suggested that there should be 5 field observations per cm² of map. Bie and Beckett (1970) showed that there is a consistent relationship between map scale and the field effort of the soil survey. It is now common practice to map variables by geostatistical methods. The output from kriging can be on the support of the original data (point kriging) or can be upscaled to 'blocks' by block kriging. The block kriging prediction is of the spatial mean of the target variable across a block of specified dimensions. In principle the size of the block on which data are presented can be varied arbitrarily. In some circumstances the block size may be determined by operational requirements. However, for general purposes, predictions can be presented for blocks of any size. The same variable, sampled at a fixed intensity, could be presented as estimates for blocks 10 × 10 m on one map and 100 × 100 m on another map. The data user might be tempted to assume that the predictions on smaller blocks provide more information than the larger blocks. However, the prediction variance of the block mean diminishes with block size, so improvement of the notional resolution of the information is accompanied by a reduction in its precision. This precision can be quantified by the block kriging variance; however, this on its own may not serve to indicate whether the block size represents a good compromise between resolution and precision in a particular circumstance, such that the resolution reasonably communicates the uncertainty of information to the data user. In this presentation I show how, in place of the block kriging variance, one can use the model-based correlation between the block-kriged estimate and the true spatial mean of the block as a readily interpreted measure of the quality of block-kriging predictions, as sketched below. Graphs of this correlation as a function of block size, for a given sampling configuration, allow one to assess the suitability of different block sizes in circumstances where these are not fixed by operational requirements. For example, it would be possible to determine a new convention by which block-kriged predictions are routinely presented only for block sizes such that the correlation exceeds some threshold value. Steur, G.G.L. 1961. Methods of soil survey in use in the Netherlands Soil Survey Institute. Boor Spade 11, 59-77. Bie, S.W., Beckett, P.H.T. 1970. The costs of soil survey. Soils and Fertilizers 34, 1-15.
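
    In symbols, the diagnostic is the model-based correlation between the block-kriged prediction and the true block mean; a plausible form, stated here as a sketch rather than the author's exact expression:

        % Correlation between block-kriged prediction \hat{Z}_B and the
        % true spatial mean \bar{Z}_B of block B; a threshold on \rho_B
        % could fix the largest admissible block size.
        \rho_B \;=\; \frac{\operatorname{Cov}\!\left(\hat{Z}_B, \bar{Z}_B\right)}
                          {\sqrt{\operatorname{Var}\!\left(\hat{Z}_B\right)\,
                                 \operatorname{Var}\!\left(\bar{Z}_B\right)}}

    For a fixed sampling configuration, ρ_B grows with block size (the block mean becomes easier to predict), which is precisely the resolution-versus-precision trade-off described above.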

  9. Dimensions of Uncertainty: A Visual Classification of Geospatial Uncertainty Visualization Research

    E-print Network

    Klippel, Alexander

    Department of Geography, State College, USA ({jms1186, dpr173, klippel}@psu.edu). Abstract (fragmentary): classifies the recent body of research on geospatial uncertainty visualization into different dimensions using a visual approach.

  10. Micro-Pulse Lidar Signals: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

    2002-01-01

    Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse (laser-detector cross-talk) and overlap (poor near-range, less than 6 km, focusing). The accurate correction of both afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range, the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is caused by uncertainty in the pulse energy. However, if the laser energy is low, then during mid-day, high solar background levels can significantly reduce the signal-to-noise of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.

  11. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  12. Manipulation Under Voting Rule Uncertainty

    E-print Network

    Elkind, Edith

    School of Physical and Mathematical Sciences. Abstract (fragmentary): a central topic in computational social choice is the complexity of various forms of dishonest behavior, such as manipulation and control; this work studies the complexity of (coalitional) manipulation for the setting where there is uncertainty about the voting rule: the manipulator(s)…

  13. Identifying uncertainties in Arctic climate change projections

    NASA Astrophysics Data System (ADS)

    Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

    2013-06-01

    Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

  14. Uncertainty Analysis by the "Worst Case" Method.

    ERIC Educational Resources Information Center

    Gordon, Roy; And Others

    1984-01-01

    Presents a new method of uncertainty propagation which concentrates on the calculation of upper and lower limits (the "worst cases"), bypassing absolute and relative uncertainties. Includes advantages of this method and its use in freshmen laboratories, advantages of the traditional method, and a numerical example done by both methods. (JN)
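
    A minimal sketch of the worst-case method as described, under the assumption that the extremes occur at the corners of the uncertainty box (true for monotonic functions such as the example below); all values are illustrative.

        from itertools import product

        def worst_case(f, nominals, uncertainties):
            """'Worst case' propagation: evaluate f at every corner of the
            uncertainty box and report the extreme results, bypassing
            absolute/relative uncertainty algebra."""
            corners = product(*[(x - u, x + u)
                                for x, u in zip(nominals, uncertainties)])
            values = [f(*c) for c in corners]
            return min(values), max(values)

        # Example: density rho = m / V with m = 8.4(1) g, V = 3.2(1) cm^3.
        lo, hi = worst_case(lambda m, V: m / V, [8.4, 3.2], [0.1, 0.1])
        print(f"rho between {lo:.3f} and {hi:.3f} g/cm^3")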

  15. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  16. DO MODEL UNCERTAINTY WITH CORRELATED INPUTS

    EPA Science Inventory

    The effect of correlation among the input parameters and variables on the output uncertainty of the Streeter-Phelps water quality model is examined. Three uncertainty analysis techniques are used: sensitivity analysis, first-order error analysis, and Monte Carlo simulation. Modified…

  17. Estimating the uncertainty in underresolved nonlinear dynamics

    SciTech Connect

    Chorin, Alexandre; Hald, Ole

    2013-06-12

    The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.

  18. Nonclassicality in phase-number uncertainty relations

    SciTech Connect

    Matia-Hernando, Paloma; Luis, Alfredo

    2011-12-15

    We show that there are nonclassical states with lesser joint fluctuations of phase and number than any classical state. This is rather paradoxical since one would expect classical coherent states to be always of minimum uncertainty. The same result is obtained when we replace phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using variance and Holevo relation.

  19. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  20. Uncertainty Propagation in an Ecosystem Nutrient Budget.

    EPA Science Inventory

    New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of freedom…

  1. Dealing with Uncertainty in the Semantic Web

    E-print Network

    Theune, Mariët

    M.Sc. Thesis, November 8, 2009; committee: Dr. Paul van der Vet, Dr. Maarten Fokkinga. Abstract (fragmentary): parts of the Semantic Web are yet to be standardized; one of these is dealing with uncertainty. Like classical logic…

  2. Uncertainty and Engagement with Learning Games

    ERIC Educational Resources Information Center

    Howard-Jones, Paul A.; Demetriou, Skevi

    2009-01-01

    Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz…

  3. Uncertainty quantification in the Nuclear Data Program

    NASA Astrophysics Data System (ADS)

    Brown, D. A.; Herman, M.; Hoblit, S.; McCutchan, E. A.; Nobre, G. P. A.; Pritychenko, B.; Sonzogni, A. A.

    2015-03-01

    The US Nuclear Data Program is charged with collecting, analyzing and archiving information critical to basic nuclear research and to the development of nuclear technologies. Users of nuclear data require detailed uncertainty information for a variety of reasons. In this paper, we review some of the main aspects of the generation and use of uncertainty information, linking to structure, astrophysics, and reaction data.

  4. Investment, regulation, and uncertainty: managing new plant breeding techniques.

    PubMed

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  5. Heisenberg uncertainty relation for mixed states

    SciTech Connect

    Luo Shunlong

    2005-10-15

    The Heisenberg uncertainty relation sets a fundamental limit for quantum measurement of incompatible observables. Its standard form derived by Weyl and Robertson is of purely quantum nature when the state is pure. However, for mixed states, because the variance involving a mixed state is a hybrid of classical and quantum uncertainty, the conventional uncertainty relation is of a 'mixed' flavor. It is desirable to seek some decomposition of variance into classical and quantum parts and to cast the Heisenberg uncertainty relation for mixed states in a more quantum form. By use of the skew information introduced by Wigner and Yanase in 1963, we make such an attempt and establish a different uncertainty relation which is stronger than the conventional one.
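
    In the notation of that work (quoted here as a sketch): with the Wigner-Yanase skew information and the variance V_ρ(A), the strengthened mixed-state relation takes the form

        % Skew information (Wigner-Yanase, 1963):
        I_\rho(A) \;=\; -\tfrac{1}{2}\,\operatorname{Tr}\!\left(\left[\sqrt{\rho},\,A\right]^2\right)
        % Luo's mixed-state uncertainty relation, with
        % U_\rho(A) := \sqrt{V_\rho(A)^2 - \left(V_\rho(A) - I_\rho(A)\right)^2}:
        U_\rho(A)\, U_\rho(B) \;\ge\; \tfrac{1}{4}\,\left|\operatorname{Tr}\!\left(\rho\,[A,B]\right)\right|^2

    For pure states, I_ρ(A) coincides with the variance, so U_ρ(A) reduces to V_ρ(A) and the conventional Robertson bound is recovered.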

  6. The virtue of uncertainty in health care.

    PubMed

    Buetow, Stephen

    2011-10-01

    Uncertainty is unavoidable in health care, yet frequently tacit. When uncertainty is acknowledged, it tends to be defined in terms of the unpredictable nature of the care, and limits to human knowledge. It is cast as a problem that evidence-based health care can minimize. Challenging that simplistic perspective, this paper reconstructs uncertainty as a property whose meaning derives from how people are relationally disposed to perceive it in the social context in which they are embedded. Five conditions are suggested to define a need to protect and cultivate uncertainty as a virtue or positive disposition. These conditions are that uncertainty is natural, promotes creativity and a critical attitude, can signify wisdom, nurtures safety, sustains hope and protects against excess. In contrast, certainty is a delusion. Believing in certainty is unscientific and antiscientific because it can obscure and devalue critique in scientific practice. PMID:21848939

  7. Contending with uncertainty in conservation management decisions

    PubMed Central

    McCarthy, Michael A

    2014-01-01

    Efficient conservation management is particularly important because current spending is estimated to be insufficient to conserve the world's biodiversity. However, efficient management is confounded by uncertainty that pervades conservation management decisions. Uncertainties exist in objectives, dynamics of systems, the set of management options available, the influence of these management options, and the constraints on these options. Probabilistic and nonprobabilistic quantitative methods can help contend with these uncertainties. The vast majority of these account for known epistemic uncertainties, with methods optimizing the expected performance or finding solutions that achieve minimum performance requirements. Ignorance and indeterminacy continue to confound environmental management problems. While quantitative methods to account for uncertainty must aid decisions if the underlying models are sufficient approximations of reality, whether such models are sufficiently accurate has not yet been examined. PMID:25138920

  8. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. PV systems show a gradual decline in performance over their lifetime of exposure that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty that includes measurement uncertainty and instrumentation drift is far more difficult to determine. A Monte Carlo simulation approach was chosen to perform a comprehensive uncertainty analysis. The most important factor for accurate degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor, which can be avoided through regular calibration, can lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
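
    A minimal Monte Carlo sketch of the drift effect described above, with an invented system and drift range; it fits a degradation rate to synthetic monthly data contaminated by a randomly drawn sensor drift.

        import numpy as np

        rng = np.random.default_rng(3)
        months = np.arange(120)                     # 10 years, monthly
        true_rate = -0.8                            # %/yr, assumed true rate

        def fitted_rate(drift_per_year):
            """Fit a degradation rate (%/yr) from synthetic performance
            data contaminated by irradiance-sensor drift and noise."""
            perf = 100 + true_rate * months / 12
            perf += drift_per_year * months / 12    # sensor drift, %/yr
            perf += rng.normal(0, 0.5, months.size) # measurement noise
            return np.polyfit(months / 12, perf, 1)[0]

        # Epistemic drift uncertainty: sensor drifts up to +/- 0.3 %/yr.
        rates = [fitted_rate(d) for d in rng.uniform(-0.3, 0.3, 1000)]
        print(f"fitted rate {np.mean(rates):.2f} +/- {np.std(rates):.2f} %/yr")

    The spread of fitted rates is dominated by the drift term, illustrating why an uncalibrated, drifting sensor corrupts degradation rates far more than a constant absolute-calibration offset.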

  9. Contending with uncertainty in conservation management decisions.

    PubMed

    McCarthy, Michael A

    2014-08-01

    Efficient conservation management is particularly important because current spending is estimated to be insufficient to conserve the world's biodiversity. However, efficient management is confounded by uncertainty that pervades conservation management decisions. Uncertainties exist in objectives, dynamics of systems, the set of management options available, the influence of these management options, and the constraints on these options. Probabilistic and nonprobabilistic quantitative methods can help contend with these uncertainties. The vast majority of these account for known epistemic uncertainties, with methods optimizing the expected performance or finding solutions that achieve minimum performance requirements. Ignorance and indeterminacy continue to confound environmental management problems. While quantitative methods to account for uncertainty must aid decisions if the underlying models are sufficient approximations of reality, whether such models are sufficiently accurate has not yet been examined. PMID:25138920

  10. Habitable zone dependence on stellar parameter uncertainties

    SciTech Connect

    Kane, Stephen R.

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the uncertainties of HZ boundaries on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
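
    A minimal sketch of how a stellar parameter uncertainty maps into an HZ boundary uncertainty, using the common scaling d = sqrt(L / S_eff) (AU, with L in solar units) and first-order propagation; the star and the S_eff values are illustrative, model-dependent assumptions, not values from the paper.

        import numpy as np

        def hz_distance(lum, u_lum, s_eff):
            """HZ boundary d = sqrt(L / S_eff), with first-order
            propagation of the luminosity uncertainty: u(d)/d = u(L)/(2L).
            S_eff is treated as fixed (it depends on the HZ model)."""
            d = np.sqrt(lum / s_eff)
            return d, d * u_lum / (2 * lum)

        # Illustrative star: L = 0.60 +/- 0.15 L_sun; representative
        # effective-flux values for inner/outer HZ edges.
        for label, s_eff in [("inner", 1.01), ("outer", 0.35)]:
            d, u = hz_distance(0.60, 0.15, s_eff)
            print(f"{label} HZ edge: {d:.2f} +/- {u:.2f} AU")

    A planet whose orbit falls within these propagated bands has an ambiguous HZ status, which is the kind of classification uncertainty the study quantifies.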

  11. Habitable Zone Dependence on Stellar Parameter Uncertainties

    E-print Network

    Kane, Stephen R

    2014-01-01

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the uncertainties of HZ boundaries on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.

  12. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (ESTSC)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some processes for which input is uncertain and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using the Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample and the user-supplied process model analyses the sample. The SUNS post processor displays statistical results from any existing file that contains sampled input and output values.

  13. Uncertainty reasoning in expert systems

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik

    1993-01-01

    Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.

  14. The 4th Thermodynamic Principle?

    NASA Astrophysics Data System (ADS)

    Montero García, José De La Luz; Novoa Blanco, Jesús Francisco

    2007-04-01

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy and the Measurement or Connected Global Scale or Universal Existential Interval of the Matter makes it possible to arrive at a global formulation of the four "forces" or fundamental interactions of nature. Einstein's golden dream is possible.

  15. The 4th Thermodynamic Principle?

    SciTech Connect

    Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco

    2007-04-28

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy and the Measurement or Connected Global Scale or Universal Existential Interval of the Matter makes it possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

  16. Principles of Virus Structural Organization

    PubMed Central

    Prasad, B.V. Venkataram; Schmid, Michael F

    2013-01-01

    Viruses, the molecular nanomachines infecting hosts ranging from prokaryotes to eukaryotes, come in different sizes, shapes and symmetries. Questions such as what principles govern their structural organization, what factors guide their assembly, and how these viruses integrate multifarious functions into one unique structure have enamored researchers for years. In the last five decades, following Caspar and Klug's elegant conceptualization of how viruses are constructed, high-resolution structural studies using X-ray crystallography and more recently cryo-EM techniques have provided a wealth of information on the structures of a variety of viruses. These studies have significantly furthered our understanding of the principles that underlie structural organization in viruses. Such an understanding has practical impact in providing a rational basis for the design and development of antiviral strategies. In this chapter, we review principles underlying capsid formation in a variety of viruses, emphasizing the recent developments along with some historical perspective. PMID:22297509

  17. Bayes and the Simplicity Principle in Perception

    ERIC Educational Resources Information Center

    Feldman, Jacob

    2009-01-01

    Discussions of the foundations of perceptual inference have often centered on 2 governing principles, the likelihood principle and the simplicity principle. Historically, these principles have usually been seen as opposed, but contemporary statistical (e.g., Bayesian) theory tends to see them as consistent, because for a variety of reasons simpler…

  18. Principle Paradigms Revisiting the Dublin Core 1:1 Principle

    ERIC Educational Resources Information Center

    Urban, Richard J.

    2012-01-01

    The Dublin Core "1:1 Principle" asserts that "related but conceptually different entities, for example a painting and a digital image of the painting, are described by separate metadata records" (Woodley et al., 2005). While this seems to be a simple requirement, studies of metadata quality have found that cultural heritage…

  19. Clocking in the face of unpredictability beyond quantum uncertainty

    NASA Astrophysics Data System (ADS)

    Madjid, F. Hadi; Myers, John M.

    2015-05-01

    In earlier papers we showed unpredictability beyond quantum uncertainty in atomic clocks, ensuing from a proven gap between given evidence and explanations of that evidence. Here we reconceive a clock, not as an isolated entity, but as enmeshed in a self-adjusting communications network adapted to one or another particular investigation, in contact with an unpredictable environment. From the practical uses of clocks, we abstract a clock enlivened with the computational capacity of a Turing machine, modified to transmit and to receive numerical communications. Such "live clocks" phase the steps of their computations to mesh with the arrival of transmitted numbers. We lift this phasing, known in digital communications, to a principle of logical synchronization, distinct from the synchronization defined by Einstein in special relativity. Logical synchronization elevates digital communication to a topic in physics, including applications to biology. One explores how feedback loops in clocking affect numerical signaling among entities functioning in the face of unpredictable influences, making the influences themselves into subjects of investigation. The formulation of communications networks in terms of live clocks extends information theory by expressing the need to actively maintain communications channels, and potentially, to create or drop them. We show how networks of live clocks are presupposed by the concept of coordinates in a spacetime. A network serves as an organizing principle, even when the concept of the rigid body that anchors a special-relativistic coordinate system is inapplicable, as is the case, for example, in a generic curved spacetime.

  20. Clocking in the face of unpredictability beyond quantum uncertainty

    E-print Network

    F. Hadi Madjid; John M. Myers

    2015-04-16

    In earlier papers we showed unpredictability beyond quantum uncertainty in atomic clocks, ensuing from a proven gap between given evidence and explanations of that evidence. Here we reconceive a clock, not as an isolated entity, but as enmeshed in a self-adjusting communications network adapted to one or another particular investigation, in contact with an unpredictable environment. From the practical uses of clocks, we abstract a clock enlivened with the computational capacity of a Turing machine, modified to transmit and to receive numerical communications. Such "live clocks" phase the steps of their computations to mesh with the arrival of transmitted numbers. We lift this phasing, known in digital communications, to a principle of logical synchronization, distinct from the synchronization defined by Einstein in special relativity. Logical synchronization elevates digital communication to a topic in physics, including applications to biology. One explores how feedback loops in clocking affect numerical signaling among entities functioning in the face of unpredictable influences, making the influences themselves into subjects of investigation. The formulation of communications networks in terms of live clocks extends information theory by expressing the need to actively maintain communications channels, and potentially, to create or drop them. We show how networks of live clocks are presupposed by the concept of coordinates in a spacetime. A network serves as an organizing principle, even when the concept of the rigid body that anchors a special-relativistic coordinate system is inapplicable, as is the case, for example, in a generic curved spacetime.

  1. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in the structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of the structural parameters. The probability distributions of the structural parameters were further updated through the Bayesian approach with Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and a pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity, and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
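
    As a hedged sketch of the structural parameters named above (not the study's code), the Python fragment below evaluates an exponential semivariogram model parameterized by nugget, sill, and range, and expresses structural uncertainty by sampling those parameters from assumed scaled beta priors, mirroring the abstract's use of beta distributions:

        import numpy as np

        def semivariogram(h, nugget, sill, rang):
            """Exponential model: gamma(h) = nugget + (sill - nugget)*(1 - exp(-3h/range))."""
            return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rang))

        rng = np.random.default_rng(0)
        lags = np.linspace(0.0, 2000.0, 5)  # lag distances in meters (assumed)

        # Structural uncertainty: scaled beta priors over assumed parameter bounds.
        for _ in range(3):
            nugget = 0.05 * rng.beta(2, 5)
            sill = 0.5 + 1.0 * rng.beta(2, 2)
            rang = 200.0 + 1800.0 * rng.beta(2, 2)
            print(np.round(semivariogram(lags, nugget, sill, rang), 3))

    Each sampled parameter triple yields a different semivariogram, and hence a different kriged conductivity field and capture zone; this is why semivariogram uncertainty can dominate the kriging variance computed for any single fixed semivariogram.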

  2. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. The comparison of a priori uncertainties with the target accuracies allows one to define needs and priorities for uncertainty reduction. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  3. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    SciTech Connect

    Porter, D.W.

    1996-04-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data are acquired. Prior knowledge takes the form of first-principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled, in a form motivated by statistical physics, as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps and for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real-time computer monitoring of remediation. Working with the DOE Office of Technology Development (OTD), the author has developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex, and Fernald DOE sites. Further applications include an Army depot at Letterkenny, PA and commercial industrial sites.
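
    The core updating step can be illustrated with a deliberately simplified conjugate-normal sketch in Python (the actual system fuses full spatial fields via Markov random fields; the single scalar, its prior, and the observation below are assumptions for illustration):

        def normal_update(prior_mean, prior_var, obs, obs_var):
            """Posterior mean/variance of a normal quantity under a normal likelihood."""
            w = prior_var / (prior_var + obs_var)
            post_mean = prior_mean + w * (obs - prior_mean)
            post_var = prior_var * obs_var / (prior_var + obs_var)
            return post_mean, post_var

        # Prior on log10 hydraulic conductivity from a first-principles flow model,
        # updated with a geophysics-derived estimate (all values invented).
        m, v = -4.0, 1.0
        m, v = normal_update(m, v, obs=-3.2, obs_var=0.5)
        print(m, v)  # posterior: mean ~ -3.47, variance ~ 0.33 (uncertainty shrinks)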

  4. Traceable measurement and uncertainty analysis of the gross calorific value of methane determined by isoperibolic calorimetry

    NASA Astrophysics Data System (ADS)

    Haloua, F.; Foulon, E.; Allard, A.; Hay, B.; Filtz, J. R.

    2015-12-01

    As methane is the major component of natural gas and of non-conventional gases such as biogas or mine gas, its energy content has to be measured accurately, regardless of the production site, for fiscal trading of transported and distributed natural gas. The calorific value of fuel gases can only be determined with the lowest uncertainty by a direct method using a reference gas calorimeter. To address this point, LNE developed a few years ago an isoperibolic reference gas calorimeter based on Rossini's principle. The gross calorific value Hs of methane of purity 99.9995% has been measured as 55 507.996 kJ kg-1 (890.485 kJ mol-1) with an expanded relative uncertainty of 0.091% (coverage factor k = 2.101, providing a level of confidence of approximately 95%). These results are based on ten repeated measurements and on an uncertainty assessment performed in accordance with the Guide to the Expression of Uncertainty in Measurement (GUM). The experimental setup and the results are reported here and, for the first time, the fully detailed uncertainty calculation is presented.
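
    A minimal Python sketch of the GUM-style Type A evaluation mentioned above (the readings are invented; the paper's full budget also combines Type B components, and its coverage factor k = 2.101 comes from that combined budget, not from these ten values alone):

        import statistics as st

        readings = [55507.8, 55508.3, 55507.5, 55508.1, 55508.4,
                    55507.7, 55508.0, 55508.2, 55507.9, 55508.1]  # kJ/kg (assumed)

        mean = st.mean(readings)
        u_a = st.stdev(readings) / len(readings) ** 0.5  # standard uncertainty of the mean
        k = 2.101                                        # coverage factor from the paper
        U = k * u_a                                      # expanded uncertainty (Type A part only)
        print(f"Hs = {mean:.3f} kJ/kg, U = {U:.3f} kJ/kg ({100 * U / mean:.4f} %)")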

  5. The legal status of Uncertainty

    NASA Astrophysics Data System (ADS)

    Altamura, M.; Ferraris, L.; Miozzo, D.; Musso, L.; Siccardi, F.

    2011-03-01

    An exponential improvement of numerical weather prediction (NWP) models was observed during the last decade (Lynch, 2008). Civil Protection (CP) systems exploited meteorological services in order to redeploy their actions towards the prediction and prevention of events rather than towards an exclusively response-oriented mechanism1. Nevertheless, experience tells us that NWP models, even if assisted by real-time observations, are far from being deterministic. Complications frequently emerge in medium- to long-range forecasting, which is subject to sudden modifications. On the other hand, short-term forecasts, if seen through the lens of criminal trials2, are to the same extent scarcely reliable (Molini et al., 2009). One particular episode related to wrong forecasts in the Italian panorama deeply frightened CP operators: the NWP model in force missed a meteorological adversity which, in fact, caused death and dealt severe damage in the province of Vibo Valentia (2006). This event turned into a much-discussed trial, lasting over three years, brought against those who held the legal position of guardianship within the CP. A first set of data is now available showing that, in concomitance with the trial of Vibo Valentia, the number of alerts issued rose almost threefold. We sustain the hypothesis that the beginning of the process of overcriminalization (Husak, 2008) of CPs is currently increasing the number of false alerts, with the consequent effect of weakening alert perception and response by the citizenship (Breznitz, 1984). The common misunderstanding of such an issue, i.e. the inherent uncertainty in weather predictions, mainly by prosecutors and judges, and generally by those who deal with law and justice, is creating the basis for a defensive behaviour3 within CPs. This paper intends, thus, to analyse the social and legal relevance of uncertainty in the process of issuing meteo-hydrological alerts by CPs. Footnotes: 1 The Italian Civil Protection has been working in this direction since 1992 (L. 225/92). An example of this effort is clearly given by the Prime Minister Decree (DPCM 20/12/2001, "Linee guida relative ai piani regionali per la programmazione delle attività di previsione, prevenzione e lotta attiva contro gli incendi boschivi" - "Guidelines for regional plans for the planning of prediction, prevention and forest-fire fighting activities") that, already in 2001, emphasized that "the most appropriate approach to pursue the preservation of forests is to promote and encourage prediction and prevention activities rather than giving priority to the emergency phase focused on fire-fighting". 2 Supreme Court of the United States, In re Winship (No. 778), argued 20 January 1970, decided 31 March 1970: proof beyond a reasonable doubt, which is required by the Due Process Clause in criminal trials, is among the "essentials of due process and fair treatment". 3 In Kessler and McClellan (1996): "Defensive medicine is a potentially serious social problem: if fear of liability drives health care providers to administer treatments that do not have worthwhile medical benefits, then the current liability system may generate inefficiencies much larger than the costs of compensating malpractice claimants".

  6. Climate change, uncertainty, and natural resource management

    USGS Publications Warehouse

    Nichols, J.D.; Koneff, M.D.; Heglund, P.J.; Knutson, M.G.; Seamans, M.E.; Lyons, J.E.; Morton, J.M.; Jones, M.T.; Boomer, G.S.; Williams, B.K.

    2011-01-01

    Climate change and its associated uncertainties are of concern to natural resource managers. Although aspects of climate change may be novel (e.g., system change and nonstationarity), natural resource managers have long dealt with uncertainties and have developed corresponding approaches to decision-making. Adaptive resource management is an application of structured decision-making for recurrent decision problems with uncertainty, focusing on management objectives and on the reduction of uncertainty over time. We identified 4 types of uncertainty that characterize problems in natural resource management. We examined ways in which climate change is expected to exacerbate these uncertainties, as well as potential approaches to dealing with them. As a case study, we examined North American waterfowl harvest management and considered problems anticipated to result from climate change and potential solutions. Despite the challenges expected to accompany the use of adaptive resource management to address problems associated with climate change, we conclude that adaptive resource management approaches will be the methods of choice for managers trying to deal with the uncertainties of climate change. © 2010 The Wildlife Society.

  7. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases that statistically account for uncertainties of all types is generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for the selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
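
    The 95%/95% criterion described above is commonly met with order statistics. By first-order Wilks' formula, if n analysis cases are sampled at random, the largest observed value bounds the 95th percentile with 95% confidence once 1 - 0.95^n >= 0.95, which first holds at n = 59. A short Python check (an illustrative aside, not necessarily the report's method):

        def wilks_first_order(coverage=0.95, confidence=0.95):
            """Smallest n such that the max of n random samples bounds the
            coverage quantile with at least the requested confidence."""
            n = 1
            while 1.0 - coverage ** n < confidence:
                n += 1
            return n

        print(wilks_first_order())  # -> 59 runs for the 95/95 criterion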

  8. Constructing the uncertainty of due dates.

    PubMed

    Vos, Sarah C; Anthony, Kathryn E; O'Hair, H Dan

    2014-01-01

    By its nature, the date that a baby is predicted to be born, or the due date, is uncertain. How women construct the uncertainty of their due dates may have implications for when and how women give birth. In the United States as many as 15% of births occur before 39 weeks because of elective inductions or cesarean sections, putting these babies at risk for increased medical problems after birth and later in life. This qualitative study employs a grounded theory approach to understand the decisions women make about how and when to give birth. Thirty-three women who were pregnant or had given birth within the past 2 years participated in key informant or small-group interviews. The results suggest that women interpret the uncertainty of their due dates as a reason to wait for birth and as a reason to start the process early; however, information about a baby's brain development in the final weeks of pregnancy may persuade women to remain pregnant longer. The uncertainties of due dates are analyzed using Babrow's problematic integration, which distinguishes between epistemological and ontological uncertainty. The results point to a third type of uncertainty, axiological uncertainty. Axiological uncertainty is rooted in the values and ethics of outcomes. PMID:24266788

  9. Constructing the Uncertainty of Due Dates

    PubMed Central

    Vos, Sarah C.; Anthony, Kathryn E.; O'Hair, H. Dan

    2015-01-01

    By its nature, the date that a baby is predicted to be born, or the due date, is uncertain. How women construct the uncertainty of their due dates may have implications for when and how women give birth. In the United States as many as 15% of births occur before 39 weeks because of elective inductions or cesarean sections, putting these babies at risk for increased medical problems after birth and later in life. This qualitative study employs a grounded theory approach to understand the decisions women make about how and when to give birth. Thirty-three women who were pregnant or had given birth within the past two years participated in key informant or small-group interviews. The results suggest that women interpret the uncertainty of their due dates as a reason to wait for birth and as a reason to start the process early; however, information about a baby's brain development in the final weeks of pregnancy may persuade women to remain pregnant longer. The uncertainties of due dates are analyzed using Babrow's problematic integration, which distinguishes between epistemological and ontological uncertainty. The results point to a third type of uncertainty, axiological uncertainty. Axiological uncertainty is rooted in the values and ethics of outcomes. PMID:24266788

  10. Uncertainty and global climate change research

    SciTech Connect

    Tonn, B.E.; Weiher, R.

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics, and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment, using decision analytic techniques as a foundation, is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of the timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  11. Numerical uncertainty in computational engineering and physics

    SciTech Connect

    Hemez, Francois M

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes a distant third behind test-analysis comparison and model calibration. This publication aims to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty, including experimental variability, parametric uncertainty, and modeling assumptions. The concepts of consistency, convergence, and truncation error are reviewed to explain the articulation between the exact solution of the continuous equations, the solution of the modified equations, and discrete solutions computed by a code. The current state of the practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds on solution uncertainty in cases where the exact solution of the continuous equations, or of its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
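
    One simple, widely used way to bound discretization uncertainty when the exact solution is unknown (possibly, though not necessarily, the method the report proposes) is Richardson extrapolation: solutions on three systematically refined meshes yield an observed convergence order and an error estimate on the finest grid. A Python sketch with invented sample values:

        import math

        def richardson(f_coarse, f_medium, f_fine, r=2.0):
            """Observed order p and fine-grid error estimate for refinement ratio r."""
            p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
            err = abs(f_medium - f_fine) / (r ** p - 1.0)
            return p, err

        p, err = richardson(0.970000, 0.992500, 0.998125, r=2.0)
        print(f"observed order ~ {p:.2f}, fine-grid error estimate ~ {err:.6f}")
        # For these values (a second-order scheme), p = 2 and the estimate is 0.001875.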

  12. Reversing the Balance Wheel Principle

    ERIC Educational Resources Information Center

    Orkodashvili, Mariam

    2008-01-01

    The paper discusses funding principles and policies of higher education during the recession period. The role of state appropriations for the viability of public higher education institutions is emphasized. State funding affecting institutional behaviour is another issue raised. The paper analyzes the possibility of expanding state funding for…

  13. Public Comment: Principles for Determining

    E-print Network

    Public Comment: Principles for Determining Species At-Risk of IUU Fishing and Seafood Fraud. Comment sessions held May 7, May 12, and June 4, 2015. IUU fishing and seafood fraud are complex and global issues; the effort, established June 2014 by the Departments of Commerce and State, includes public engagement at the Federal level. [Recovered from presentation slides; the remainder of the record is unreadable.]

  14. The Deeper Principles of Prevention

    ERIC Educational Resources Information Center

    Calhoun, John A.

    2004-01-01

    The founder of the National Crime Prevention Council observes that many successful prevention initiatives are driven by deeply held moral or religious beliefs. He proposes that they enhance their "policy-speak" with deeper spiritual principles underlying many effective programs. This article puts forth a glossary of reclaiming concepts rooted in…

  15. Variational Principles for Water Waves

    E-print Network

    Boris Kolev; David H. Sattinger

    2007-12-01

    We describe the Hamiltonian structures, including the Poisson brackets and Hamiltonians, for free boundary problems for incompressible fluid flows with vorticity. The Hamiltonian structure is used to obtain variational principles for stationary gravity waves both for irrotational flows as well as flows with vorticity.

  16. On the Dirichlet's Box Principle

    ERIC Educational Resources Information Center

    Poon, Kin-Keung; Shiu, Wai-Chee

    2008-01-01

    In this note, we focus on several applications of Dirichlet's box principle in Discrete Mathematics lessons and number theory lessons. In addition, the main result is an innovative game on a triangular board developed by the authors. The game has been used in teaching and learning mathematics in Discrete Mathematics and some high schools in…

  17. Demonstrating Fermat's Principle in Optics

    ERIC Educational Resources Information Center

    Paleiov, Orr; Pupko, Ofir; Lipson, S. G.

    2011-01-01

    We demonstrate Fermat's principle in optics by a simple experiment using reflection from an arbitrarily shaped one-dimensional reflector. We investigated a range of possible light paths from a lamp to a fixed slit by reflection in a curved reflector and showed by direct measurement that the paths along which light is concentrated have either…

  18. Electronic Structure Principles and Aromaticity

    ERIC Educational Resources Information Center

    Chattaraj, P. K.; Sarkar, U.; Roy, D. R.

    2007-01-01

    The relationship between aromaticity and stability in molecules on the basis of quantities such as hardness and electrophilicity is explored. The findings reveal that aromatic molecules are less energetic, harder, less polarizable, and less electrophilic as compared to antiaromatic molecules, as expected from the electronic structure principles.

  19. Basic Principles for Adult Education.

    ERIC Educational Resources Information Center

    Office of Vocational and Adult Education, Washington, DC. Div. of Adult Education.

    A basic set of principles for adult education reflects what should be found in each state and local program. First, basic skills should be mastered by all students. Second, course content should be directly related to learner, labor market, and community needs. Third, partnership efforts should be expanded and strengthened. Fourth, programs must…

  20. Predictive uncertainty in auditory sequence processing

    PubMed Central

    Hansen, Niels Chr.; Pearce, Marcus T.

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
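
    The central quantity here, Shannon entropy as a model of prospective uncertainty, is easy to state concretely. In the Python sketch below, two toy predictive distributions over possible melodic continuations (assumed values, not the study's estimates) show how a flat distribution yields high entropy and a peaked one yields low entropy:

        import math

        def shannon_entropy(probs):
            """H(p) = -sum p_i * log2(p_i), in bits; zero-probability events contribute 0."""
            return -sum(p * math.log2(p) for p in probs if p > 0.0)

        low_entropy_context = [0.85, 0.05, 0.05, 0.05]    # one continuation dominates
        high_entropy_context = [0.25, 0.25, 0.25, 0.25]   # all continuations equally likely

        print(shannon_entropy(low_entropy_context))   # ~0.85 bits: low predictive uncertainty
        print(shannon_entropy(high_entropy_context))  # 2.0 bits: high predictive uncertainty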