Science.gov

Sample records for uncertainty principle

  1. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    NASA Astrophysics Data System (ADS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-03-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found.
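
    For context, one of the best-known members of this family is the Maassen-Uffink entropic relation, H(X) + H(Z) >= -log2 max |<x|z>|^2, which the abstract counts among the "existing entropic uncertainty relations". The following minimal sketch (not the authors' formalism; the state and bases are arbitrary illustrative choices) checks it numerically for a qubit:

    ```python
    import numpy as np

    def shannon_bits(p):
        """Shannon entropy in bits, skipping zero-probability outcomes."""
        p = p[p > 1e-12]
        return -np.sum(p * np.log2(p))

    # An arbitrary pure qubit state.
    theta = 0.7
    psi = np.array([np.cos(theta), np.sin(theta)], dtype=complex)

    # Columns are the measurement basis vectors: computational (Z) and Hadamard (X).
    Z = np.eye(2, dtype=complex)
    X = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    pZ = np.abs(Z.conj().T @ psi) ** 2        # outcome probabilities in each basis
    pX = np.abs(X.conj().T @ psi) ** 2

    # Maassen-Uffink bound: H(X) + H(Z) >= -log2 c, c = max overlap of the bases.
    c = np.max(np.abs(X.conj().T @ Z) ** 2)
    print(shannon_bits(pZ) + shannon_bits(pX), ">=", -np.log2(c))  # bound = 1 bit here
    ```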

  2. Extended uncertainty from first principles

    NASA Astrophysics Data System (ADS)

    Costa Filho, Raimundo N.; Braga, João P. M.; Lira, Jorge H. S.; Andrade, José S.

    2016-04-01

    A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arising from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.

  3. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Michael S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
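
    The GUP referred to here is most often quoted with a quadratic momentum correction; in one common convention (a notational assumption, not a quote from this record) it reads

    ```latex
    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right),
    ```

    and minimizing the right-hand side over \Delta p gives the minimal length \Delta x_{\min} = \hbar\sqrt{\beta}, the "minimum uncertainty" scale the abstract mentions.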

  4. Gamma-Ray Telescope and Uncertainty Principle

    ERIC Educational Resources Information Center

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  5. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  6. Disturbance, the uncertainty principle and quantum optics

    NASA Technical Reports Server (NTRS)

    Martens, Hans; Demuynck, Willem M.

    1993-01-01

    It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.

  7. Curriculum in Art Education: The Uncertainty Principle.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1989-01-01

    Identifies curriculum as the pivotal link between theory and practice, noting that all stages of curriculum research and development are characterized by elements of uncertainty. States that this uncertainty principle reflects the reality of practice as it mirrors the contradictory nature of art, the pluralism of schools and society, and the…

  8. Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.

    ERIC Educational Resources Information Center

    McKerrow, K. Kelly; McKerrow, Joan E.

    1991-01-01

    The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)

  9. Uncertainty Principle and Elementary Wavelet

    NASA Astrophysics Data System (ADS)

    Bliznetsov, M.

    This paper is aimed to define the time-and-spectrum characteristics of an elementary wavelet. An uncertainty relation between the width of a pulse amplitude spectrum and its time duration and extension in space is investigated in the paper. Analysis of the uncertainty relation is carried out for causal pulses with minimum-phase spectrum. Amplitude spectra of elementary pulses are calculated using modified Fourier spectral analysis. Modification of the Fourier analysis is justified by the necessity of solving the zero-frequency paradox in amplitude spectra that are calculated with the help of standard Fourier analysis. Modified Fourier spectral analysis has the same resolution along the frequency axis, excludes physically unobservable values from time-and-spectral presentations, and determines that the Heaviside unit step function has an infinitely wide spectrum equal to 1 along the whole frequency range. The Dirac delta function has an infinitely wide spectrum in the infinitely high frequency scope. A difference in propagation between the wave and quasi-wave forms of energy motion is established from the analysis of the uncertainty relation. Unidirectional pulse velocity depends on the relative width of the pulse spectra. Oscillating pulse velocity is constant in a given nondispersive medium. The elementary wavelet has the maximum relative spectrum width and the minimum time duration among all the oscillating pulses whose velocity is equal to the velocity of the causal harmonic components of the pulse spectra. The relative width of the elementary wavelet spectrum with regard to the resonance frequency is the square root of 4/3, approximately equal to 1.1547..., and the relative width of this wavelet spectrum with regard to the center frequency is equal to 1. The more the relative width of a unidirectional pulse spectrum exceeds the relative width of the elementary wavelet spectrum, the higher the velocity of unidirectional pulse propagation. The concept of a velocity-exceeding coefficient is introduced for pulses presenting the quasi-wave form of energy motion known as shock waves. The elementary wavelet and the elastic pulse radiated by a spherical source of seismic waves in an ideal elastic medium have the same spectral characteristic. Planck's radiation law for electric waves is likewise characterized by a relative spectrum width equal to 1 with regard to the center frequency and does not conflict with the introduced concept of the elementary wavelet.

  10. An uncertainty principle for unimodular quantum groups

    SciTech Connect

    Crann, Jason; Kalantar, Mehrdad E-mail: mkalanta@math.carleton.ca

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
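
    On the real line, Hirschman's entropic uncertainty principle (with the sharp constant later supplied by Beckner; the conventions ||f||_2 = 1 and \hat{f}(\xi) = \int f(x) e^{-2\pi i x \xi} dx are assumptions made here for definiteness) takes the form

    ```latex
    h\!\left(|f|^{2}\right) + h\!\left(|\hat{f}|^{2}\right) \;\ge\; \log\frac{e}{2},
    \qquad
    h(\rho) = -\int \rho \log \rho ,
    ```

    with equality for Gaussians.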

  11. A Principle of Uncertainty for Information Seeking.

    ERIC Educational Resources Information Center

    Kuhlthau, Carol C.

    1993-01-01

    Proposes an uncertainty principle for information seeking based on the results of a series of studies that investigated the user's perspective of the information search process. Constructivist theory is discussed as a conceptual framework for studying the user's perspective, and areas for further research are suggested. (Contains 44 references.)

  12. Geometric formulation of the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bosyk, G. M.; Osán, T. M.; Lamberti, P. W.; Portesi, M.

    2014-03-01

    A geometric approach to formulate the uncertainty principle between quantum observables acting on an N-dimensional Hilbert space is proposed. We consider the fidelity between a density operator associated with a quantum system and a projector associated with an observable, and interpret it as the probability of obtaining the outcome corresponding to that projector. We make use of fidelity-based metrics such as angle, Bures, and root infidelity to propose a measure of uncertainty. The triangle inequality allows us to derive a family of uncertainty relations. In the case of the angle metric, we recover the Landau-Pollak inequality for pure states and show, in a natural way, how to extend it to the case of mixed states in arbitrary dimension. In addition, we derive and compare alternative uncertainty relations when using other known fidelity-based metrics.
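
    The Landau-Pollak inequality recovered in the paper states, for a pure state and two orthonormal bases with maximal overlap c, that arccos(sqrt(P_A)) + arccos(sqrt(P_B)) >= arccos(sqrt(c)), where P_A and P_B are the largest outcome probabilities. A minimal numeric check for a qubit (the random state and the two standard bases are illustrative choices, not the paper's setup):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A random pure qubit state.
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi = v / np.linalg.norm(v)

    # Columns are basis vectors: computational basis and Hadamard basis.
    A = np.eye(2, dtype=complex)
    B = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    PA = np.max(np.abs(A.conj().T @ psi) ** 2)   # largest outcome probability in A
    PB = np.max(np.abs(B.conj().T @ psi) ** 2)   # ... and in B
    c = np.max(np.abs(A.conj().T @ B) ** 2)      # maximal overlap between the bases

    lhs = np.arccos(np.sqrt(PA)) + np.arccos(np.sqrt(PB))
    print(lhs, ">=", np.arccos(np.sqrt(c)))      # Landau-Pollak for pure states
    ```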

  13. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022

  14. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Nasser Tawfik, Abdel; Magied Diab, Abdel

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  15. Space tests of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Khodadi, M.

    2015-08-01

    A generalized uncertainty principle admitting a minimal measurable length contains a parameter whose numerical value needs to be fixed. In fact, the application of the Generalized Uncertainty Principle (GUP) to some quantum mechanical problems offers different values for the upper bound of the dimensionless GUP parameter. In this work, by applying a GUP that is linear and quadratic in the momentum to the correction to Newton's law of gravity, and then using the stability condition of the circular orbits of the planets, we propose an upper bound for this parameter. By using the astronomical data of the Solar System objects, a new and severe constraint on the upper bound of the parameter is derived. Also, using the modified Newtonian potential inspired by such a GUP, we investigate the possibility of measuring the relevant parameter through observables provided by the Galileo Navigation Satellite System.

  16. Hardy Uncertainty Principle, Convexity and Parabolic Evolutions

    NASA Astrophysics Data System (ADS)

    Escauriaza, L.; Kenig, C. E.; Ponce, G.; Vega, L.

    2015-11-01

    We give a new proof of the L² version of Hardy's uncertainty principle based on calculus and on its dynamical version for the heat equation. The reasonings rely on new log-convexity properties and the derivation of optimal Gaussian decay bounds for solutions to the heat equation with Gaussian decay at a future time. We extend the result to heat equations with lower-order variable coefficients.
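
    For orientation, the classical pointwise form of Hardy's uncertainty principle (the paper proves an L² variant of it; the Fourier convention \hat{f}(\xi) = \int f(x) e^{-2\pi i x \xi} dx is assumed here) reads:

    ```latex
    |f(x)| \le C e^{-\pi a x^{2}} \;\text{ and }\; |\hat{f}(\xi)| \le C e^{-\pi \xi^{2}/a}
    \;\Longrightarrow\; f(x) = c\, e^{-\pi a x^{2}},
    ```

    and if the decay is strictly stronger on either side, f must vanish identically.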

  17. Quantum randomness certified by the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Vallone, Giuseppe; Marangon, Davide G.; Tomasin, Marco; Villoresi, Paolo

    2014-11-01

    We present an efficient method to extract the amount of true randomness that can be obtained by a quantum random number generator (QRNG). By repeating the measurements of a quantum system and by swapping between two mutually unbiased bases, a lower bound of the achievable true randomness can be evaluated. The bound is obtained thanks to the uncertainty principle of complementary measurements applied to min-entropy and max-entropy. We tested our method with two different QRNGs by using a train of qubits or ququarts and demonstrated the scalability toward practical applications.
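
    The uncertainty relation underlying this bound is of the min-/max-entropy type; in its simplest form, without side information and with c the maximal overlap between the two bases (a schematic statement, not the paper's exact conditional version):

    ```latex
    H_{\min}(X) + H_{\max}(Z) \;\ge\; \log_2 \frac{1}{c},
    \qquad c = \max_{x,z} |\langle x|z\rangle|^{2},
    ```

    so for mutually unbiased bases on a d-dimensional system (c = 1/d) the extractable min-entropy is bounded below by log2(d) - H_max(Z).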

  18. Dilaton cosmology, noncommutativity, and generalized uncertainty principle

    SciTech Connect

    Vakili, Babak

    2008-02-15

    The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. I extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.

  19. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, on the finite and different time scales only. The ultimate origin of such a universal quantum stability is in the fundamental uncertainty principle which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  20. Gravitational tests of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Scardigli, Fabio; Casadio, Roberto

    2015-09-01

    We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a generalized uncertainty principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard general relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.

  1. Lorentz invariance violation and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag

    2016-01-01

    There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum, and thereby a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance (LIV). From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the Hubble parameter in its redshift dependence in early-type galaxies. We find that LIV has two types of contributions to the time-of-flight delay Δt comparable with those observations. Although the OPERA measurement of a faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate Δt and the relative change in the speed of the muon neutrino, Δv, in dependence on the redshift z. Accordingly, the results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR). Here, both the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity and the one assuming a perturbative departure from exact Lorentz invariance are found to be essential ingredients. Fixing the sensitivity factor and its energy dependence is an essential input for reliably confronting our calculations with the UHECR data. The sensitivity factor is related to the special time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter α that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.
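
    A rough feel for the time-of-flight effect can be had from the standard linear-in-energy dispersion modification, v(E) ≈ c(1 - ξE/E_QG); the sketch below uses a fixed propagation distance and ξ = 1 (both assumptions; the paper works with a redshift-dependent formulation):

    ```python
    # Order-of-magnitude time-of-flight delay for a linear Lorentz-invariance-
    # violating dispersion relation, delta_t ~ xi * (E / E_QG) * D / c.
    # All values below are illustrative, not taken from the paper.

    E = 1.0e19        # particle energy in eV (UHECR scale)
    E_QG = 1.22e28    # quantum-gravity scale, here taken as the Planck energy in eV
    D = 1.0e26        # propagation distance in metres (roughly a few Gpc)
    c = 3.0e8         # speed of light in m/s
    xi = 1.0          # dimensionless LIV coefficient (assumed O(1))

    delta_t = xi * (E / E_QG) * D / c
    print(f"time-of-flight delay ~ {delta_t:.2e} s")
    ```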

  2. Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael

    1993-01-01

    Heisenberg's uncertainty principle and the derivative notions of interdeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research examined. Implications are drawn for research in science education. (PR)

  3. Open Timelike Curves Violate Heisenberg's Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Pienaar, J. L.; Ralph, T. C.; Myers, C. R.

    2013-02-01

    Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.

  4. Potential Wells and the Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Owens, Constance; Blado, Gardo; Meyers, Vincent

    2014-03-01

    Out of the four fundamental forces, we have yet to be able to unify gravity with the other three. This predicament has kept scientists from being able to explain systems that involve both general relativity (GR) and quantum mechanics (QM). The quest to quantize gravity, in other words to make GR a quantum theory, has been at the forefront of physics research in recent decades. Incorporating gravity into QM changes the laws of ordinary quantum mechanics. Potential wells are a common tool used to study particle behavior in quantum mechanics. At first they were simply theoretical toy models, but in time it was discovered that potential wells could actually be used to model real-life situations, and thus they have proven to be very useful theoretically and experimentally. For example, the double square well (DSW) can be used to model the potential experienced by an electron in a diatomic molecule. DSWs can also be used to study bilayer systems. In this paper we derive the results for the finite square well and the DSW using a form of the generalized uncertainty principle to study and discuss how the incorporation of gravity modifies these results. We also discuss applications and the effects of gravity on quantum tunneling.

  5. Open timelike curves violate Heisenberg's uncertainty principle.

    PubMed

    Pienaar, J L; Ralph, T C; Myers, C R

    2013-02-01

    Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity. PMID:23432226

  6. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

    Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  7. String theory, scale relativity and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Castro, Carlos

    1997-06-01

    Extensions (modifications) of the Heisenberg uncertainty principle are derived within the framework of the theory of special scale-relativity proposed by Nottale. In particular, generalizations of the stringy uncertainty principle are obtained where the size of the strings is bounded by the Planck scale and the size of the universe. Based on the fractal structures inherent with two dimensional quantum gravity, which has attracted considerable interest recently, we conjecture that the underlying fundamental principle behind string theory should be based on an extension of the scale relativity principle where both dynamics and scales are incorporated on the same footing.

  8. Uncertainty principle for proper time and mass

    NASA Astrophysics Data System (ADS)

    Kudaka, Shoju; Matsumoto, Shuichi

    1999-03-01

    We review Bohr's reasoning in the Bohr-Einstein debate on the photon box experiment. The essential point of his reasoning leads us to an uncertainty relation between the proper time and the rest mass of the clock. It is shown that this uncertainty relation can be derived if only we take the fundamental point of view that the proper time should be included as a dynamic variable in the Lagrangian describing the system of the clock. Some problems and some positive aspects of our approach are then discussed.
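
    The relation discussed has the shape of an energy-time uncertainty relation with the rest energy mc² playing the role of the energy; schematically (a sketch of the form, not the paper's derivation),

    ```latex
    c^{2}\,\Delta m\,\Delta \tau \;\ge\; \frac{\hbar}{2}.
    ```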

  9. Thermodynamics of Black Holes and the Symmetric Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Dutta, Abhijit; Gangopadhyay, Sunandan

    2016-01-01

    In this paper, we have investigated the thermodynamics of Schwarzschild and Reissner-Nordström black holes using the symmetric generalised uncertainty principle, which contains correction terms involving momentum and position uncertainty. The mass-temperature relationship and the heat capacity for these black holes have been computed, from which the critical and remnant masses have been obtained. The entropy is found to satisfy the area law up to leading order logarithmic corrections and corrections of the form A² (which is a new finding in this paper) from the symmetric generalised uncertainty principle.

  10. Risks, scientific uncertainty and the approach of applying precautionary principle.

    PubMed

    Lo, Chang-fa

    2009-03-01

    The paper intends to clarify the nature and aspects of risks and scientific uncertainty and to elaborate an approach to applying the precautionary principle for the purpose of handling risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both in situations where an international treaty has admitted the precautionary principle and in situations where there is no international treaty admitting the precautionary principle or enumerating the conditions to take measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of the measure to cope with the potential risk and to avoid excessive measures. PMID:19705643

  11. Microscopic black hole stabilization via the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Vayenas, Constantinos G.; Grigoriou, Dimitrios

    2015-01-01

    Due to the Heisenberg uncertainty principle, gravitational confinement of rotating two- or three-particle systems can lead to microscopic Planckian or sub-Planckian black holes with a size of the order of their Compton wavelength. Some properties of such states are discussed in terms of the Schwarzschild geodesics of general relativity and compared with properties computed via the combination of special relativity, the equivalence principle, Newton's gravitational law and the Compton wavelength. It is shown that the generalized uncertainty principle (GUP) provides a satisfactory fit of the Schwarzschild radius and Compton wavelength of such microscopic, particle-like, black holes.

  12. Demonstration of the angular uncertainty principle for single photons

    NASA Astrophysics Data System (ADS)

    Jack, B.; Aursand, P.; Franke-Arnold, S.; Ireland, D. G.; Leach, J.; Barnett, S. M.; Padgett, M. J.

    2011-06-01

    We present an experimental demonstration of a form of the angular uncertainty principle for single photons. Producing light from type I down-conversion, we use spatial light modulators to perform measurements on signal and idler photons. By measuring states in the angle and orbital angular momentum basis, we demonstrate the uncertainty relation of Franke-Arnold et al (2004 New J. Phys. 6 103). We consider two manifestations of the uncertainty relation. In the first we herald the presence of a photon by detection of its paired partner and demonstrate the uncertainty relation on this single photon. In the second, we perform orbital angular momentum measurements on one photon and angular measurements on its correlated partner exploring, in this way, the uncertainty relation through non-local measurements.
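
    The relation of Franke-Arnold et al demonstrated here has, for an angle window of width 2π, the form

    ```latex
    \Delta\varphi\,\Delta L_z \;\ge\; \frac{\hbar}{2}\,\bigl|\,1 - 2\pi P(\theta_0)\,\bigr|,
    ```

    where P(θ₀) denotes the angular probability density at the boundary of the chosen window (notation as in the cited reference; quoted here in its commonly stated form).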

  13. Single-Slit Diffraction and the Uncertainty Principle

    ERIC Educational Resources Information Center

    Rioux, Frank

    2005-01-01

    A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
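
    The Fourier picture can be reproduced numerically: a rect aperture of width a transforms to a sinc-shaped amplitude with first zeros at k = ±2π/a, so narrowing the slit (smaller Δx) widens the momentum-space lobe (larger Δk). A small sketch with illustrative units:

    ```python
    import numpy as np

    a = 1.0                                    # slit width (arbitrary units)
    x = np.linspace(-50, 50, 2**14)
    dx = x[1] - x[0]
    aperture = (np.abs(x) <= a / 2).astype(float)

    # Momentum-space amplitude via FFT; intensity ~ sinc^2(k a / 2).
    k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
    amp = np.fft.fftshift(np.fft.fft(aperture)) * dx
    intensity = np.abs(amp) ** 2

    # Locate the first minimum beyond the central peak; compare with 2*pi/a.
    centre = np.argmax(intensity)
    first_zero = next(i for i in range(centre, k.size - 1)
                      if intensity[i + 1] > intensity[i])
    print(k[first_zero], "vs predicted", 2 * np.pi / a)
    ```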

  14. The Uncertainty Principle, Virtual Particles and Real Forces

    ERIC Educational Resources Information Center

    Jones, Goronwy Tudor

    2002-01-01

    This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…
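
    The estimate at the heart of this picture: a virtual particle of mass m can exist for Δt ~ ħ/(2mc²) by the energy-time relation ΔEΔt ≳ ħ/2, giving a force range (the standard Yukawa-type argument, not a quote from the article)

    ```latex
    R \;\sim\; c\,\Delta t \;\sim\; \frac{\hbar}{2 m c},
    ```

    which for a pion-mediated nuclear force comes out at roughly a femtometre.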

  15. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

    An extension of gauge theories is considered under the assumption of a generalized uncertainty principle which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations like the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  16. Nonasymptotic homogenization of periodic electromagnetic structures: Uncertainty principles

    NASA Astrophysics Data System (ADS)

    Tsukerman, Igor; Markel, Vadim A.

    2016-01-01

    We show that artificial magnetism of periodic dielectric or metal/dielectric structures has limitations and is subject to at least two "uncertainty principles." First, the stronger the magnetic response (the deviation of the effective permeability tensor from identity), the less accurate ("certain") the predictions of any homogeneous model. Second, if the magnetic response is strong, then homogenization cannot accurately reproduce the transmission and reflection parameters and, simultaneously, power dissipation in the material. These principles are general and not confined to any particular method of homogenization. Our theoretical analysis is supplemented with a numerical example: a hexagonal lattice of cylindrical air holes in a dielectric host. Even though this case is highly isotropic, which might be thought of as conducive to homogenization, the uncertainty principles remain valid.

  17. The 'Herbivory Uncertainty Principle': application in a cerrado site.

    PubMed

    Gadotti, C A; Batalha, M A

    2010-05-01

    Researchers may alter the ecology of their studied organisms, even when carrying out apparently beneficial activities, as in herbivory studies, where they may alter herbivory damage. We tested whether visit frequency altered herbivory damage, as predicted by the 'Herbivory Uncertainty Principle'. In a cerrado site, we established 80 quadrats, in which we sampled all woody individuals. We used four visit frequencies (high, medium, low, and control), quantifying, at the end of three months, herbivory damage for each species in each treatment. We did not corroborate the 'Herbivory Uncertainty Principle', since visiting frequency did not alter herbivory damage, at least when the whole plant community was taken into account. However, when we analysed each species separately, four out of 11 species presented significant differences in herbivory damage, suggesting that the researcher is not independent of the measurements. The principle could be tested in other ecological studies in which it may occur, such as those on animal behaviour, human ecology, population dynamics, and conservation. PMID:20379648

  18. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  19. Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Oppenheim, Jacob N.; Magnasco, Marcelo O.

    2013-01-01

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
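
    The quoted limit is the Gabor bound for rms widths, Δt·Δf ≥ 1/(4π), saturated by a Gaussian envelope. A quick numeric confirmation of the saturating case (grid and pulse width are illustrative choices):

    ```python
    import numpy as np

    sigma = 0.01                               # Gaussian pulse width in seconds
    t = np.linspace(-0.5, 0.5, 2**16)
    dt = t[1] - t[0]
    s = np.exp(-t**2 / (2 * sigma**2))

    # rms duration from the normalized intensity |s|^2
    w = s**2
    delta_t = np.sqrt(np.sum(t**2 * w) / np.sum(w))

    # rms bandwidth from the power spectrum
    S = np.abs(np.fft.fftshift(np.fft.fft(s))) ** 2
    f = np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
    delta_f = np.sqrt(np.sum(f**2 * S) / np.sum(S))

    print(delta_t * delta_f, "vs", 1 / (4 * np.pi))   # both ~0.0796
    ```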

  20. Quantum black hole in the generalized uncertainty principle framework

    SciTech Connect

    Bina, A.; Moslehi, A.; Jalalzadeh, S.

    2010-01-15

    In this paper we study the effects of the generalized uncertainty principle (GUP) on canonical quantum gravity of black holes. Through the use of modified partition function that involves the effects of the GUP, we obtain the thermodynamical properties of the Schwarzschild black hole. We also calculate the Hawking temperature and entropy for the modification of the Schwarzschild black hole in the presence of the GUP.

  1. Uncertainty principle for Gabor systems and the Zak transform

    SciTech Connect

    Czaja, Wojciech; Zienkiewicz, Jacek

    2006-12-15

    We show that if g ∈ L²(R) is a generator of a Gabor orthonormal basis with the lattice Z×Z, then its Zak transform Z(g) satisfies ∇Z(g) ∉ L²([0,1)²). This is a generalization and extension of the Balian-Low uncertainty principle.
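
    The classical Balian-Low theorem that this result extends can be stated as follows: if the Gabor system {e^{2πimx} g(x - n) : m, n ∈ Z} is an orthonormal basis of L²(R), then

    ```latex
    \left(\int_{\mathbb{R}} x^{2}\,|g(x)|^{2}\,dx\right)
    \left(\int_{\mathbb{R}} \xi^{2}\,|\hat{g}(\xi)|^{2}\,d\xi\right) \;=\; \infty ,
    ```

    i.e. g cannot be simultaneously well localized in time and frequency.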

  2. Uncertainty principle of genetic information in a living cell

    PubMed Central

    Strippoli, Pierluigi; Canaider, Silvia; Noferini, Francesco; D'Addabbo, Pietro; Vitale, Lorenza; Facchin, Federica; Lenzi, Luca; Casadei, Raffaella; Carinci, Paolo; Zannotti, Maria; Frabetti, Flavia

    2005-01-01

    Background Formal description of a cell's genetic information should provide the number of DNA molecules in that cell and their complete nucleotide sequences. We pose the formal problem: can the genome sequence forming the genotype of a given living cell be known with absolute certainty so that the cell's behaviour (phenotype) can be correlated to that genetic information? To answer this question, we propose a series of thought experiments. Results We show that the genome sequence of any actual living cell cannot physically be known with absolute certainty, independently of the method used. There is an associated uncertainty, in terms of base pairs, equal to or greater than μs (where μ is the mutation rate of the cell type and s is the cell's genome size). Conclusion This finding establishes an "uncertainty principle" in genetics for the first time, and its analogy with the Heisenberg uncertainty principle in physics is discussed. The genetic information that makes living cells work is thus better represented by a probabilistic model rather than as a completely defined object. PMID:16197549
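
    A back-of-envelope illustration of the μs bound, using assumed round numbers rather than figures from the paper:

    ```python
    # Assumed illustrative values: per-base mutation rate per cell division
    # and a human-scale diploid genome size. Neither number is from the paper.
    mu = 1e-9      # mutations per base pair per cell division
    s = 6.4e9      # genome size in base pairs

    # The paper's bound: the genome sequence carries an irreducible
    # uncertainty of at least mu * s base pairs.
    print("minimum uncertainty ~", mu * s, "bp")   # ~6 bp per division here
    ```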

  3. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.
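
    The threshold can be seen in a scalar sketch. For x[t+1] = a_t x[t] + b_t u[t] with i.i.d. random (a_t, b_t) and quadratic cost, the Riccati-like value recursion stays bounded only when m = E[a²] - E[ab]²/E[b²] < 1 (a minimal sketch of the effect; parameter values are illustrative, not from the paper):

    ```python
    # Value iteration for a scalar stochastic LQ problem; the cost-to-go
    # diverges when parameter uncertainty pushes m = E[a^2] - E[ab]^2/E[b^2]
    # past the threshold of 1.

    def cost_iteration(Ea2, Eab, Eb2, q=1.0, r=1.0, steps=200):
        K = 0.0
        for _ in range(steps):
            K = q + Ea2 * K - (Eab * K) ** 2 / (r + Eb2 * K)
        return K

    for var_a in (0.1, 2.0):                  # variance of the gain a_t
        Ea2 = 0.8**2 + var_a                  # E[a^2] = mean^2 + variance
        Eab, Eb2 = 0.8 * 1.0, 1.0             # b_t deterministic and equal to 1
        m = Ea2 - Eab**2 / Eb2
        print(f"m = {m:.2f}  ->  K = {cost_iteration(Ea2, Eab, Eb2):.3e}")
    ```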

  4. Generalized uncertainty principle in Bianchi type I quantum cosmology

    NASA Astrophysics Data System (ADS)

    Vakili, B.; Sepangi, H. R.

    2007-07-01

    We study a quantum Bianchi type I model in which the dynamical variables of the corresponding minisuperspace obey the generalized Heisenberg algebra. Such a generalized uncertainty principle has its origin in the existence of a minimal length suggested by quantum gravity and string theory. We present approximate analytical solutions to the corresponding Wheeler-DeWitt equation in the limit where the scale factor of the universe is small and compare the results with the standard commutative and noncommutative quantum cosmology. Similarities and differences of these solutions are also discussed.

  5. Classical Dynamics Based on the Minimal Length Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang

    2016-02-01

    In this paper we consider the quadratic modification of the Heisenberg algebra and its classical limit version, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in the β-deformed classical dynamics. Finally, we consider the (α, β)-deformed classical dynamics in which the minimal length uncertainty principle is given by [x̂, p̂] = iħ(1 + αx̂² + βp̂²). For two small parameters α, β, we discuss the free fall of a particle and a composite system in a uniform gravitational field.
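
    In such a classical limit, Hamilton's equations pick up the deformation factor: with a bracket {x, p} = 1 + αx² + βp², one standard realization gives (a sketch of the generic construction, not necessarily the paper's exact conventions)

    ```latex
    \dot{x} = \left(1+\alpha x^{2}+\beta p^{2}\right)\frac{\partial H}{\partial p},
    \qquad
    \dot{p} = -\left(1+\alpha x^{2}+\beta p^{2}\right)\frac{\partial H}{\partial x},
    ```

    so that free fall with H = p²/2m + mgx acquires position- and momentum-dependent corrections to the Newtonian trajectory.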

  6. Effects of the generalised uncertainty principle on quantum tunnelling

    NASA Astrophysics Data System (ADS)

    Blado, Gardo; Prescott, Trevor; Jennings, James; Ceyanes, Joshuah; Sepulveda, Rafael

    2016-03-01

    In a previous paper (Blado et al 2014 Eur. J. Phys. 35 065011), we showed that quantum gravity effects can be discussed with only a background in non-relativistic quantum mechanics at the undergraduate level by looking at the effect of the generalised uncertainty principle (GUP) on the finite and infinite square wells. In this paper, we derive the GUP corrections to the tunnelling probability of simple quantum mechanical systems which are accessible to undergraduates (alpha decay, simple models of quantum cosmogenesis and gravitational tunnelling radiation) and which employ the WKB approximation, a topic discussed in undergraduate quantum mechanics classes. It is shown that the GUP correction increases the tunnelling probability in each of the examples discussed.

  7. Scalar field cosmology modified by the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Paliathanasis, Andronikos; Pan, Supriya; Pramanik, Souvik

    2015-12-01

    We consider quintessence scalar field cosmology in which the Lagrangian of the scalar field is modified by the generalized uncertainty principle. We show that the perturbation terms that arise from the deformed algebra are equivalent with the existence of a second scalar field, where the two fields interact in the kinetic part. Moreover, we consider a spatially flat Friedmann-Lemaître-Robertson-Walker spacetime, and we derive the gravitational field equations. We show that the modified equation of state parameter w_GUP can cross the phantom divide line; that is, w_GUP < -1. Furthermore, we derive the field equations in the dimensionless parameters; the dynamical system that arises is a singular perturbation system in which we study the existence of the fixed points in the slow manifold. Finally, we perform numerical simulations for some well-known models and we show that for these models with the specific initial conditions, the parameter w_GUP crosses the phantom barrier.

  8. Molecular Response Theory in Terms of the Uncertainty Principle.

    PubMed

    Harde, Hermann; Grischkowsky, Daniel

    2015-08-27

    We investigate the time response of molecular transitions by observing the pulse reshaping of femtosecond THz-pulses propagating through polar vapors. By precisely modeling the pulse interaction with the molecular vapors, we derive detailed insight into this time response after an excitation. The measurements, which were performed by applying the powerful technique of THz time domain spectroscopy, are analyzed directly in the time domain or parallel in the frequency domain by Fourier transforming the pulses and comparing them with the molecular response theory. New analyses of the molecular response allow a generalized unification of the basic collision and line-shape theories of Lorentz, van Vleck-Weisskopf, and Debye described by molecular response theory. In addition, they show that the applied THz experimental setup allows the direct observation of the ultimate time response of molecules to an external applied electric field in the presence of molecular collisions. This response is limited by the uncertainty principle and is determined by the inverse splitting frequency between adjacent levels. At the same time, this response reflects the transition time of a rotational transition to switch from one molecular state to another or to form a coherent superposition of states oscillating with the splitting frequency. The presented investigations are also of fundamental importance for the description of the far-wing absorption of greenhouse gases like water vapor, carbon dioxide, or methane, which have a dominant influence on the radiative exchange in the far-infrared. PMID:26280761

  9. Effect of the Generalized Uncertainty Principle on post-inflation preheating

    SciTech Connect

    Chemissany, Wissam; Das, Saurya; Ali, Ahmed Farag; Vagenas, Elias C. E-mail: saurya.das@uleth.ca E-mail: evagenas@academyofathens.gr

    2011-12-01

    We examine effects of the Generalized Uncertainty Principle, predicted by various theories of quantum gravity to replace Heisenberg's uncertainty principle near the Planck scale, on post-inflation preheating in cosmology, and show that it can predict either an increase or a decrease in parametric resonance and a corresponding change in particle production. Possible implications are considered.

  10. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used a single-slit diffraction of a laser beam for measuring the angular width of zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We will assume that the uncertainty in position is the slit width. For the…

  11. Effect of holographic principle and generalized uncertainty principle on the semiclassical Parikh-Wilczek tunneling radiation of black holes

    NASA Astrophysics Data System (ADS)

    Farmany, A.; Noorizadeh, H.; Sahraei, R.; Majlesara, M. H.; Trohid, S. A. A.

    2012-06-01

    In the present paper, based on the generalized uncertainty principle, the Parikh-Wilczek black hole tunneling radiation is studied. It is shown that the black hole tunneling radiation receives a correction. Furthermore, a bound on the tunneling approach is shown to be valid when, based on the holographic principle, a bound is applied to the black hole entropy.

  12. SOURCE ASSESSMENT: ANALYSIS OF UNCERTAINTY--PRINCIPLES AND APPLICATIONS

    EPA Science Inventory

    This report provides the results of a study that was conducted to analyze the uncertainties involved in the calculation of the decision parameters used in the Source Assessment Program and to determine the effect of these uncertainties on the decision-making procedure. A general ...

  13. Entropy of the Randall-Sundrum brane world with the generalized uncertainty principle

    SciTech Connect

    Kim, Wontae; Park, Young-Jai; Kim, Yong-Wan

    2006-11-15

    By introducing the generalized uncertainty principle, we calculate the entropy of the bulk scalar field on the Randall-Sundrum brane background without any cutoff. We obtain the entropy of the massive scalar field proportional to the horizon area. Here, we observe that the mass contribution to the entropy exists in contrast to all previous results of the usual black hole cases with the generalized uncertainty principle.

  14. Uncertainties.

    PubMed

    Dalla Chiara, Maria Luisa

    2010-09-01

    In contemporary science uncertainty is often represented as an intrinsic feature of natural and of human phenomena. As an example we need only think of two important conceptual revolutions that occurred in physics and logic during the first half of the twentieth century: (1) the discovery of Heisenberg's uncertainty principle in quantum mechanics; (2) the emergence of many-valued logical reasoning, which gave rise to so-called 'fuzzy thinking'. I discuss the possibility of applying the notions of uncertainty, developed in the framework of quantum mechanics, quantum information and fuzzy logics, to some problems of political and social sciences. PMID:19859828

  15. Path Integral for Dirac oscillator with generalized uncertainty principle

    SciTech Connect

    Benzair, H.; Boudjedaa, T.; Merad, M.

    2012-12-15

    The propagator for the Dirac oscillator in (1+1) dimensions, with a deformed commutation relation of the Heisenberg principle, is calculated using a path integral in the quadri-momentum representation. As the mass is related to the momentum, we then adapt the space-time transformation method to evaluate quantum corrections; the latter are dependent on the point-discretization interval.

  16. Uncertainty Principle--Limited Experiments: Fact or Academic Pipe-Dream?

    ERIC Educational Resources Information Center

    Albergotti, J. Clifton

    1973-01-01

    The question of whether modern experiments are limited by the uncertainty principle or by the instruments used to perform the experiments is discussed. Several key experiments show that the instruments limit our knowledge and the principle remains of strictly academic concern. (DF)

  17. Generalized uncertainty principle and the conformally coupled scalar field quantum cosmology

    NASA Astrophysics Data System (ADS)

    Pedram, Pouria

    2015-03-01

    We exactly solve the Wheeler-DeWitt equation for the closed homogeneous and isotropic quantum cosmology in the presence of a conformally coupled scalar field and in the context of the generalized uncertainty principle. This form of generalized uncertainty principle is motivated by the black hole physics and it predicts a minimal length uncertainty proportional to the Planck length. We construct wave packets in momentum minisuperspace which closely follow classical trajectories and strongly peak on them upon choosing appropriate initial conditions. Moreover, based on the DeWitt criterion, we obtain wave packets that exhibit singularity-free behavior.

  18. Uncertainty principle for experimental measurements: Fast versus slow probes.

    PubMed

    Hansmann, P; Ayral, T; Tejeda, A; Biermann, S

    2016-01-01

    The result of a physical measurement depends on the time scale of the experimental probe. In solid-state systems, this simple quantum mechanical principle has far-reaching consequences: the interplay of several degrees of freedom close to charge, spin or orbital instabilities combined with the disparity of the time scales associated to their fluctuations can lead to seemingly contradictory experimental findings. A particularly striking example is provided by systems of adatoms adsorbed on semiconductor surfaces where different experiments - angle-resolved photoemission, scanning tunneling microscopy and core-level spectroscopy - suggest different ordering phenomena. Using most recent first principles many-body techniques, we resolve this puzzle by invoking the time scales of fluctuations when approaching the different instabilities. These findings suggest a re-interpretation of ordering phenomena and their fluctuations in a wide class of solid-state systems ranging from organic materials to high-temperature superconducting cuprates. PMID:26829902

  19. Uncertainty principle for experimental measurements: Fast versus slow probes

    NASA Astrophysics Data System (ADS)

    Hansmann, P.; Ayral, T.; Tejeda, A.; Biermann, S.

    2016-02-01

    The result of a physical measurement depends on the time scale of the experimental probe. In solid-state systems, this simple quantum mechanical principle has far-reaching consequences: the interplay of several degrees of freedom close to charge, spin or orbital instabilities combined with the disparity of the time scales associated to their fluctuations can lead to seemingly contradictory experimental findings. A particularly striking example is provided by systems of adatoms adsorbed on semiconductor surfaces where different experiments – angle-resolved photoemission, scanning tunneling microscopy and core-level spectroscopy – suggest different ordering phenomena. Using most recent first principles many-body techniques, we resolve this puzzle by invoking the time scales of fluctuations when approaching the different instabilities. These findings suggest a re-interpretation of ordering phenomena and their fluctuations in a wide class of solid-state systems ranging from organic materials to high-temperature superconducting cuprates.

  1. Uncertainty principle for experimental measurements: Fast versus slow probes

    PubMed Central

    Hansmann, P.; Ayral, T.; Tejeda, A.; Biermann, S.

    2016-01-01

    The result of a physical measurement depends on the time scale of the experimental probe. In solid-state systems, this simple quantum mechanical principle has far-reaching consequences: the interplay of several degrees of freedom close to charge, spin or orbital instabilities combined with the disparity of the time scales associated to their fluctuations can lead to seemingly contradictory experimental findings. A particularly striking example is provided by systems of adatoms adsorbed on semiconductor surfaces where different experiments – angle-resolved photoemission, scanning tunneling microscopy and core-level spectroscopy – suggest different ordering phenomena. Using most recent first principles many-body techniques, we resolve this puzzle by invoking the time scales of fluctuations when approaching the different instabilities. These findings suggest a re-interpretation of ordering phenomena and their fluctuations in a wide class of solid-state systems ranging from organic materials to high-temperature superconducting cuprates. PMID:26829902

  2. Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics students' depictions

    NASA Astrophysics Data System (ADS)

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-12-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize a picture of students' descriptions of these key quantum concepts. Data for this study were obtained from semistructured in-depth interviews conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students' depictions of the concept wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students' depictions of the concept uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum mechanics uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction that cannot be explained within the framework of classical physics for depicting the wavelike properties of quantum entities. Despite inhospitable conceptions of the uncertainty principle and wave- and particlelike properties of quantum entities in our investigation, the findings presented in this paper are highly consistent with those reported in previous studies. New findings and some implications for instruction and the curricula are discussed.

  3. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
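
    To make the threshold concrete, here is a minimal numerical sketch of the scalar linear-quadratic recursion the note alludes to (the parameter values are invented for illustration, not taken from the paper). The expected cost-to-go coefficient K stays finite only while the uncertainty measure m = E[a^2] - E[ab]^2/E[b^2] is below 1; past the threshold the backward recursion diverges and no infinite-horizon solution exists.

      # Scalar LQ control with random gains: x[t+1] = a*x[t] + b*u[t],
      # cost sum(q*x^2 + r*u^2); a and b are redrawn independently each step.
      def cost_coefficient(a_mean, b_mean, var_a, var_b, cov_ab,
                           q=1.0, r=1.0, steps=200):
          Ea2 = a_mean ** 2 + var_a          # E[a^2]
          Eb2 = b_mean ** 2 + var_b          # E[b^2]
          Eab = a_mean * b_mean + cov_ab     # E[ab]
          K = q                              # terminal cost-to-go coefficient
          for _ in range(steps):             # backward dynamic-programming recursion
              K = q + K * Ea2 - (K * Eab) ** 2 / (r + K * Eb2)
          return K, Ea2 - Eab ** 2 / Eb2     # K and the threshold quantity m

      print(cost_coefficient(1.1, 1.0, 0.1, 0.1, 0.0))  # m = 0.21 < 1: K converges
      print(cost_coefficient(1.1, 1.0, 2.0, 0.1, 0.0))  # m = 2.11 > 1: K blows up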

  4. Double Special Relativity with a Minimum Speed and the Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Nassif, Cláudio

    The present work aims to search for an implementation of a new symmetry in the spacetime by introducing the idea of an invariant minimum speed scale (V). Such a lowest limit V, being unattainable by the particles, represents a fundamental and preferred reference frame connected to a universal background field (a vacuum energy) that breaks Lorentz symmetry. So there emerges a new principle of symmetry in the spacetime at the subatomic level for very low energies close to the background frame (v ≈ V), providing a fundamental understanding for the uncertainty principle, i.e. the uncertainty relations should emerge from the spacetime with an invariant minimum speed.

  5. Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

    NASA Technical Reports Server (NTRS)

    Terazawa, Hidezumi

    1996-01-01

    The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

  6. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  7. Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions

    ERIC Educational Resources Information Center

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-01-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

  8. Generalized uncertainty principle corrections to the simple harmonic oscillator in phase space

    NASA Astrophysics Data System (ADS)

    Das, Saurya; Robbins, Matthew P. G.; Walton, Mark A.

    2016-01-01

    We compute Wigner functions for the harmonic oscillator including corrections from generalized uncertainty principles (GUPs), and study the corresponding marginal probability densities and other properties. We show that the GUP corrections to the Wigner functions can be significant, and comment on their potential measurability in the laboratory.
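
    For orientation, a common form of GUP studied in this literature (an assumption on my part; the abstract does not state which deformation the authors adopt) reads, in LaTeX notation,

      [\hat{x},\hat{p}] = i\hbar\,(1+\beta\hat{p}^{2})
      \quad\Longrightarrow\quad
      \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta\,(\Delta p)^{2}\right),

    which implies a minimal position uncertainty Δx_min = ħ√β; corrections of this type enter observables at leading order in the small deformation parameter β.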

  9. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    SciTech Connect

    Tawfik, A.

    2013-07-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by some approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the complete-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach: the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without the GUP is not negligible.
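
    The standard heuristic behind such calculations (textbook reasoning, not specific to this paper) extracts the Hawking temperature directly from the uncertainty relation: a quantum emitted near the horizon is localized within Δx of order the Schwarzschild radius, so

      \Delta x \sim \frac{2GM}{c^{2}},\qquad
      k_{B}T \sim c\,\Delta p \sim \frac{\hbar c}{\Delta x}
      \quad\Longrightarrow\quad
      T_{H}=\frac{\hbar c^{3}}{8\pi G M k_{B}},

    with the 1/8π factor fixed by matching the exact semiclassical result. Replacing the Heisenberg relation by a GUP-corrected one modifies T_H at small M, which is what halts the evaporation at a finite remnant mass.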

  10. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    ERIC Educational Resources Information Center

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…

  11. Integrating Leonardo da Vinci's principles of demonstration, uncertainty, and cultivation in contemporary nursing education.

    PubMed

    Story, Lachel; Butts, Janie

    2014-03-01

    Nurses today are facing an ever-changing health care system. Stimulated by health care reform and limited resources, nursing education is being challenged to prepare nurses for this uncertain environment. Looking to the past can offer possible solutions to the issues nursing education is confronting. Seven principles of da Vincian thinking have been identified (Gelb, 2004). As a follow-up to an exploration of the curiosità principle (Butts & Story, 2013), this article will explore the three principles of dimostrazione, sfumato, and corporalità. Nursing faculty can set the stage for a meaningful educational experience through these principles of demonstration (dimostrazione), uncertainty (sfumato), and cultivation (corporalità). Preparing nurses not only to manage but also to flourish in the current health care environment will enhance both the nurse's and the patient's experience. PMID:23830068

  12. Quantized Energy Spectrum and Modified Andreev Bound States of a Superconductor with Generalized Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Ke, Sha-Sha; Yao, Xu-Ping; Lü, Hai-Feng

    2016-01-01

    The effect of a minimal uncertainty in position or momentum measurement on a superconductor system is investigated. The Bogoliubov-de Gennes equations for the quasiparticle states in a linear potential are solved exactly, where the positions and momenta are assumed to obey the modified commutation relations. It is found that a quantized energy spectrum of the superconductor can be induced directly by the generalized uncertainty principle (GUP). We also discuss the GUP-corrected Andreev bound states and the supercurrent in an SNS structure. The results imply that the GUP effect in superconductor systems could be testable under present experimental conditions.

  13. Quantum dynamics of the Taub universe in a generalized uncertainty principle framework

    SciTech Connect

    Battisti, Marco Valerio; Montani, Giovanni

    2008-01-15

    The implications of a generalized uncertainty principle on the Taub cosmological model are investigated. The model is studied in the Arnowitt-Deser-Misner reduction of the dynamics, whereby a time variable is singled out. Such a variable is quantized in a canonical way, and the only physical degree of freedom of the system (related to the universe anisotropy) is quantized by means of a modified Heisenberg algebra. The analysis is performed at both the classical and quantum level. In particular, at the quantum level, the motion of wave packets is investigated. The two main results obtained are as follows: (i) The classical singularity is probabilistically suppressed. The universe exhibits a stationary behavior and the probability amplitude is peaked in a determinate region. (ii) The generalized uncertainty principle wave packets provide the right behavior in the establishment of a quasi-isotropic configuration for the universe.

  14. Certifying Einstein-Podolsky-Rosen steering via the local uncertainty principle

    NASA Astrophysics Data System (ADS)

    Zhen, Yi-Zheng; Zheng, Yu-Lin; Cao, Wen-Fei; Li, Li; Chen, Zeng-Bing; Liu, Nai-Le; Chen, Kai

    2016-01-01

    The uncertainty principle lies at the heart of quantum mechanics, while nonlocality is an intriguing phenomenon of quantum mechanics that rules out local causal theories. One subtle form of nonlocality is so-called Einstein-Podolsky-Rosen (EPR) steering, which holds the potential for shared-entanglement verification even if the one-sided measurement device is untrusted. However, certifying EPR steering remains a big challenge at present. Here, we employ the local uncertainty relation to provide an experimentally friendly approach for EPR steering verification. We show that the strength of EPR steering is quantitatively linked to the strength of the uncertainty relation, as well as to the amount of entanglement. We also find that the realignment method works for detecting EPR steering of an arbitrary-dimensional system.

  15. Microscope and spectroscope results are not limited by Heisenberg's Uncertainty Principle!

    NASA Astrophysics Data System (ADS)

    Prasad, Narasimha S.; Roychoudhuri, Chandrasekhar

    2011-09-01

    A review of many published experimental and theoretical papers demonstrates that the resolving powers of microscopes, spectroscopes and telescopes can be enhanced by orders of magnitude beyond the old classical limits by various advanced techniques, including de-convolution of the CW-response function of these instruments. Heisenberg's original analogy of the limited resolution of a microscope, used to support his mathematical uncertainty relation, is no longer justifiable today. Modern techniques of detecting single isolated atoms through fluorescence also over-ride this generalized uncertainty principle. Various nano-technology techniques are also making atoms observable and their locations precisely measurable. Even the traditional time-frequency uncertainty relation or bandwidth limit Δν·Δt ≥ 1 can be circumvented while doing spectrometry with short pulses by deriving and de-convolving the pulse-response function of the spectrometer, just as we do for CW input.
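
    A toy illustration of the response-function deconvolution the authors invoke (my own sketch, not code from the paper): two spectral lines closer than the instrument width merge in the raw data but reappear after a regularized inverse filter is applied.

      import numpy as np

      n = 1024
      x = np.arange(n)
      truth = np.zeros(n)
      truth[500], truth[512] = 1.0, 0.8                # two close spectral lines
      psf = np.exp(-0.5 * ((x - n // 2) / 8.0) ** 2)   # instrument response, width ~8 bins
      psf /= psf.sum()

      H = np.fft.fft(np.fft.ifftshift(psf))            # transfer function of the instrument
      rng = np.random.default_rng(0)
      measured = np.real(np.fft.ifft(np.fft.fft(truth) * H)) + rng.normal(0, 1e-5, n)

      wiener = np.conj(H) / (np.abs(H) ** 2 + 1e-6)    # regularized (Wiener-like) inverse
      restored = np.real(np.fft.ifft(np.fft.fft(measured) * wiener))
      print(np.round(measured[495:520], 2))            # one merged bump
      print(np.round(restored[495:520], 2))            # two peaks re-emerge at bins 500 and 512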

  16. Energy distribution of massless particles on black hole backgrounds with generalized uncertainty principle

    SciTech Connect

    Li Zhongheng

    2009-10-15

    We derive new formulas for the spectral energy density and total energy density of massless particles in a general spherically symmetric static metric from a generalized uncertainty principle. Compared with blackbody radiation, the spectral energy density is strongly damped at high frequencies. For large values of r, the spectral energy density diminishes when r grows, but at the event horizon, the spectral energy density vanishes and therefore thermodynamic quantities near a black hole, calculated via the generalized uncertainty principle, do not require any cutoff parameter. We find that the total energy density can be expressed in terms of Hurwitz zeta functions. It should be noted that at large r (low local temperature), the difference between the total energy density and the Stefan-Boltzmann law is too small to be observed. However, as r approaches an event horizon, the effect of the generalized uncertainty principle becomes more and more important, which may be observable. As examples, the spectral energy densities in the background metric of a Schwarzschild black hole and of a Schwarzschild black hole plus quintessence are discussed. It is interesting to note that the maximum of the distribution shifts to higher frequencies when the quintessence equation of state parameter w decreases.

  17. The uncertainty principle enables non-classical dynamics in an interferometer.

    PubMed

    Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko

    2014-01-01

    The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics. PMID:25105741

  1. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE PAGES Beta

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
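
    For readers unfamiliar with the technique, the enrichment meter principle says that for a sufficiently thick item the net 185.7 keV gamma peak rate is proportional to the U-235 enrichment. A minimal GUM-style propagation sketch follows; all numbers are hypothetical, not from the case study.

      import math

      R, u_R = 1520.0, 12.0   # net 185.7 keV peak rate (counts/s) and its std. uncertainty
      k, u_k = 16.4, 0.15     # calibration constant (counts/s per wt% U-235), from standards

      E = R / k               # enrichment estimate, wt% U-235 ("infinite thickness" assumed)
      u_E = E * math.sqrt((u_R / R) ** 2 + (u_k / k) ** 2)   # first-order GUM propagation
      print(f"enrichment = {E:.2f} +/- {u_E:.2f} wt% U-235")

    Note that the calibration error in k is exactly the kind of systematic, item-correlated component that the abstract identifies as a challenge for the plain GUM treatment.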

  2. Using the uncertainty principle to design simple interactions for targeted self-assembly

    NASA Astrophysics Data System (ADS)

    Edlund, E.; Lindgren, O.; Jacobi, M. Nilsson

    2013-07-01

    We present a method that systematically simplifies isotropic interactions designed for targeted self-assembly. The uncertainty principle is used to show that an optimal simplification is achieved by a combination of heat kernel smoothing and Gaussian screening of the interaction potential in real and reciprocal space. We use this method to analytically design isotropic interactions for self-assembly of complex lattices and of materials with functional properties. The derived interactions are simple enough to narrow the gap between theory and the experimental realization of theory-based designs for self-assembling materials.
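
    A sketch of the simplification step as I read it (the potential and the widths below are invented; the paper derives the optimal widths from the uncertainty principle): damp high wavevectors with a heat kernel, then screen the long real-space tail with a Gaussian.

      import numpy as np

      r = np.linspace(0.0, 20.0, 2048)
      dr = r[1] - r[0]
      V = np.cos(3.0 * r) * np.exp(-0.2 * r)      # stand-in for a designed oscillatory potential

      sigma_k, sigma_r = 0.5, 6.0                 # smoothing and screening widths (assumed)
      k = 2 * np.pi * np.fft.rfftfreq(r.size, d=dr)
      V_k = np.fft.rfft(V) * np.exp(-0.5 * (sigma_k * k) ** 2)      # heat-kernel smoothing
      V_simple = np.fft.irfft(V_k, n=r.size) * np.exp(-0.5 * (r / sigma_r) ** 2)  # screening
      print(np.round(V_simple[:4], 4))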

  3. Lifespan of rotating black hole in the frame of generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    He, Tangmei; Zhang, Jingyi; Yang, Jinbo; Tan, Hongwei

    2016-01-01

    In this paper, the lifespan of a rotating black hole under the generalized uncertainty principle (GUP) is derived through the corrected radiation energy flux and the first law of black hole thermodynamics. The radiation energy flux indicates that there exist a highest temperature and a minimum mass, both of which are related to the initial mass of the black hole, in the final stage of the radiation. The lifespan of the rotating black hole comprises three terms: the dominant term is simply the lifespan in flat spacetime; the other two terms are induced by the rotation and by the GUP, respectively.

  4. Generalized uncertainty principle in f(R) gravity for a charged black hole

    SciTech Connect

    Said, Jackson Levi; Adami, Kristian Zarb

    2011-02-15

    Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this, the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

  5. Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

    SciTech Connect

    Tallacchini, Mariachiara. E-mail: mariachiara.tallacchini@unimi.it

    2005-09-01

    The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

  6. Covariant energy–momentum and an uncertainty principle for general relativity

    SciTech Connect

    Cooperstock, F.I.; Dupre, M.J.

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. -- Highlights: • We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. • Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum. • Localized energy via the Ricci integral is consistent with the energy localization hypothesis. • New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. • Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong-gravity extreme.

  7. Principle and Uncertainty Quantification of an Experiment Designed to Infer Actinide Neutron Capture Cross-Sections

    SciTech Connect

    G. Youinou; G. Palmiotti; M. Salvatores; G. Imel; R. Pardo; F. Kondev; M. Paul

    2010-01-01

    An integral reactor physics experiment devoted to inferring higher actinide (Am, Cm, Bk, Cf) neutron cross sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties related to the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amount of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross-sections to be inferred, since the relation between the two is given by the well-known neutron-induced transmutation equations. This approach has been used in the past, and the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectrometry (AMS) facility located at ANL. While AMS facilities traditionally have been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes – even to A > 200. The detection limit of AMS being orders of magnitude lower than that of standard mass spectrometry techniques, more transmutation products can be measured and, potentially, more cross-sections can be inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another set of totally uncorrelated information.
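
    The inference step is, at its core, a solution of the neutron-induced transmutation (Bateman) equations; in the simplest thin-sample, small-depletion limit it reduces to one line. The numbers below are illustrative assumptions, not MANTRA values.

      phi = 1.0e14    # neutron flux (n/cm^2/s), assumed constant over the irradiation
      T = 3.0e7       # irradiation time (s), roughly one year
      ratio = 0.0042  # capture-product / parent atom ratio measured afterwards (e.g. by AMS)

      # For small depletion, N_product/N_parent ~= sigma * phi * T, hence:
      sigma_cm2 = ratio / (phi * T)
      print(f"energy-integrated capture cross-section ~ {sigma_cm2 / 1e-24:.2f} barn")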

  8. Principles for Robust On-orbit Uncertainties Traceable to the SI (Invited)

    NASA Astrophysics Data System (ADS)

    Shirley, E. L.; Dykema, J. A.; Fraser, G. T.; Anderson, J.

    2009-12-01

    Climate-change research requires space-based measurements of the Earth’s spectral radiance, reflectance, and atmospheric properties with unprecedented accuracy. Increases in measurement accuracy would improve and accelerate the quantitative determination of decadal climate change. The increases would also permit attribution of climate change to anthropogenic causes and foster understanding of climate evolution on an accelerated time scale. Beyond merely answering key questions about global climate change, accurate measurements would also be of benefit by testing and refining climate models to enhance and quantify their predictive value. Accurate measurements imply traceability to the SI system of units. In this regard, traceability is a property of the result of a measurement, or the value of a standard, whereby it can be related to international standards through an unbroken chain of comparisons, all having stated (and realistic) uncertainties. SI-traceability allows one to compare measurements independent of locale, time, or sensor. In this way, SI-traceability alleviates the urgency to maintain a false assurance of measurement accuracy by having an unbroken time series of observations continually adjusted so that measurement results obtained with a given instrument match the measurement results of its recent predecessors. Moreover, to make quantitative inferences from measurement results obtained in various contexts, which might range, for instance, from radiometry to atmospheric chemistry, having SI-traceability throughout all work is essential. One can derive principles for robust claims of SI-traceability from lessons learned by the scientific community. In particular, National Measurement Institutes (NMIs), such as NIST, use several strategies in their realization of practical SI-traceable measurements of the highest accuracy: (1.) basing ultimate standards on fundamental physical phenomena, such as the Quantum Hall resistance, instead of measurement artifacts; (2.) developing a variety of approaches to measure a given physical quantity; (3.) conducting intercomparisons of measurements performed by different institutions; (4.) perpetually seeking complete understanding of all sources of measurement bias and uncertainty; (5.) rigorously analyzing measurement uncertainties; and (6.) maintaining a high level of transparency that permits peer review of measurement practices. It is imperative to establish SI-traceability at the beginning of an environmental satellite program. This includes planning for system-level pre-launch and, in particular, on-orbit instrument calibration. On-orbit calibration strategies should be insensitive to reasonably expected perturbations that arise during launch or on orbit, and one should employ strategies to validate on-orbit traceability. As a rule, optical systems with simple designs tend to be more amenable to robust calibration schemes.

  9. Revisiting the Calculation of I/V Profiles in Molecular Junctions Using the Uncertainty Principle.

    PubMed

    Ramos-Berdullas, Nicolás; Mandado, Marcos

    2014-04-17

    Ortiz and Seminario (J. Chem. Phys. 2007, 127, 111106/1-3) proposed some years ago a simple and direct approach to obtain I/V profiles from the combination of ab initio equilibrium electronic structure calculations and the uncertainty principle as an alternative or complementary tool to more sophisticated nonequilibrium Green's functions methods. In this work, we revisit the fundamentals of this approach and reformulate accordingly the expression of the electric current. By analogy to the spontaneous electron decay process in electron transitions, in our revision, the current is calculated upon the relaxing process from the "polarized" state induced by the external electric field to the electronic ground state. The electric current is obtained from the total charge transferred through the molecule and the corresponding electronic energy relaxation. The electric current expression proposed is more general compared with the previous expression employed by Ortiz and Seminario, where the charge variation must be tested among different slabs of atoms at the contact. This new approach has been tested on benzene-1,4-dithiolate attached to different gold clusters that represent the contact with the electrodes. Analysis of the total electron deformation density induced by the external electric voltage and properties associated with the electron deformation orbitals supports the conclusions obtained from the I/V profiles. PMID:24689867
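
    Schematically (my paraphrase of the approach, not a formula quoted from the paper), the current follows from pairing the transferred charge with a time scale supplied by the energy-time relation:

      I=\frac{\Delta Q}{\Delta t},\qquad
      \Delta t \sim \frac{\hbar}{2\,\Delta E}
      \quad\Longrightarrow\quad
      I \sim \frac{2\,\Delta Q\,\Delta E}{\hbar},

    where ΔQ is the charge moved by the field-induced polarization and ΔE the accompanying electronic relaxation energy, both taken from equilibrium electronic structure calculations.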

  10. Black-hole thermodynamics with modified dispersion relations and generalized uncertainty principles

    NASA Astrophysics Data System (ADS)

    Amelino-Camelia, Giovanni; Arzano, Michele; Ling, Yi; Mandanici, Gianluca

    2006-04-01

    In several approaches to the quantum-gravity problem evidence has emerged of the validity of a 'GUP' (a generalized position-momentum uncertainty principle) and/or a 'MDR' (a modification of the energy-momentum dispersion relation), but very little is known about the implications of GUPs and MDRs for black-hole thermodynamics, another key topic for quantum-gravity research. We investigate an apparent link, already suggested in an earlier exploratory study involving two of us, between the possibility of a GUP and/or an MDR and the possibility of a log term in the area-entropy black-hole formula. We then obtain, from that same perspective, a modified relation between the mass of a black hole and its temperature, and we examine the validity of the 'generalized second law of black-hole thermodynamics' in theories with a GUP and/or an MDR. After an analysis of GUP- and MDR-modifications of the black-body radiation spectrum, we conclude the study with a description of the black-hole evaporation process.

  11. Femtoscopic scales in p + p and p + Pb collisions in view of the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Braun-Munzinger, P.; Karpenko, Iu. A.; Sinyukov, Yu. M.

    2013-08-01

    A method for quantum corrections of Hanbury-Brown/Twiss (HBT) interferometric radii produced by semi-classical event generators is proposed. These corrections account for the basic indistinguishability and mutual coherence of closely located emitters caused by the uncertainty principle. A detailed analysis is presented for pion interferometry in p + p collisions at LHC energy (√s = 7 TeV). A prediction is also presented of pion interferometric radii for p + Pb collisions at √s = 5.02 TeV. The hydrodynamic/hydrokinetic model with UrQMD cascade as 'afterburner' is utilized for this aim. It is found that quantum corrections to the interferometry radii improve significantly the event generator results, which typically overestimate the experimental radii of small systems. A successful description of the interferometry structure of p + p collisions within the corrected hydrodynamic model requires the study of the problem of the thermalization mechanism, still a fundamental issue for ultrarelativistic A + A collisions, and also for high-multiplicity p + p and p + Pb events.

  12. f(R)-modified gravity, Wald entropy, and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Hammad, Fayçal

    2015-08-01

    Wald's entropy formula allows one to find the entropy of black holes' event horizons within any diffeomorphism-invariant theory of gravity. When applied to general relativity, the formula yields the Bekenstein-Hawking result but, for any other gravitational action that departs from the Hilbert action, the resulting entropy acquires an additional multiplicative factor that depends on the global geometry of the background spacetime. On the other hand, the generalized uncertainty principle (GUP) has recently been used extensively to investigate corrections to the Bekenstein-Hawking entropy formula, with the conclusion that the latter always comes multiplied by a factor that depends on the area of the event horizon. We show, by considering the case of an f(R)-modified gravity, that the usual black hole entropy derivation based on the GUP might be modified in such a way that the two methods yield the same corrections to the Bekenstein-Hawking formula. The procedure turns out to be an interesting method for seeking modified gravity theories. Two different versions of the GUP are used, and it is found that only one of them yields a viable modified gravity model. Conversely, it is possible to find a general formulation of the GUP that would reproduce the Wald entropy formula for any f(R) theory of gravity.

  13. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    PubMed

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction. PMID:26022837
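
    Gabor's tradeoff is easy to reproduce numerically; this toy sketch (parameters arbitrary, unrelated to the study's stimuli) shows the spectral bandwidth of a 1 kHz tone burst growing as its duration shrinks, which is why sub-cycle pulses carry no stable pitch cue.

      import numpy as np

      fs, f0 = 44100, 1000.0                       # sample rate (Hz), nominal tone frequency
      for cycles in (0.5, 2, 10):
          dur = cycles / f0
          t = np.arange(int(fs * dur)) / fs
          spec = np.abs(np.fft.rfft(np.sin(2 * np.pi * f0 * t), n=1 << 16))
          freqs = np.fft.rfftfreq(1 << 16, 1.0 / fs)
          band = freqs[spec >= spec.max() / 2]     # crude half-maximum bandwidth
          print(f"{cycles:4} cycles ({dur * 1e3:5.2f} ms): "
                f"bandwidth ~ {band.max() - band.min():7.1f} Hz")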

  14. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized, rather than the usual, uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon, without any cutoff and without any constraint on the bulk's configuration. The system's density of states and free energy are convergent in the neighborhood of the horizon. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  15. The effect of generalized uncertainty principle on square well, a case study

    SciTech Connect

    Ma, Meng-Sen; Zhao, Ren

    2014-08-15

    According to a special case of the generalized uncertainty relation (with one deformation parameter set to zero), we derive the energy eigenvalues of the infinite potential well. It is shown that the obtained energy levels differ from the usual result by correction terms, and that these correction terms depend only on the remaining deformation parameter. The eigenstates, however, depend on two additional parameters besides it.

  16. Corrected Bekenstein-Hawking entropy of warped AdS3 rotating black hole with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Mahanta, Chandra Rekha; Misra, Rajesh

    2015-08-01

    In the Generalized Uncertainty Principle (GUP), there should be a minimal black hole whose size is comparable to the minimal length so that it cannot evaporate completely through the thermal radiation. Again, the black hole is not allowed to have a mass less than a scale of order Planck mass, which suggested a black hole remnant. We study the warped AdS3 rotating black hole and calculate the entropy, heat capacity and critical mass with the help of GUP. We compute the area theorem with GUP correction.

  17. Reverse-reconciliation continuous-variable quantum key distribution based on the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Furrer, Fabian

    2014-10-01

    A big challenge in continuous-variable quantum key distribution is to prove security against arbitrary coherent attacks including realistic assumptions such as finite-size effects. Recently, such a proof has been presented in [Phys. Rev. Lett. 109, 100502 (2012), 10.1103/PhysRevLett.109.100502] for a two-mode squeezed state protocol based on a novel uncertainty relation with quantum memories. But the transmission distances were fairly limited due to a direct reconciliation protocol. We prove here security against coherent attacks of a reverse-reconciliation protocol under similar assumptions but allowing distances of over 16 km for experimentally feasible parameters. We further clarify the limitations when using the uncertainty relation with quantum memories in security proofs of continuous-variable quantum key distribution.

  18. Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators

    SciTech Connect

    Marchiolli, M.A.; Mendonça, P.E.M.F.

    2013-09-15

    We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar-Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener-Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar-Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. -- Highlights: • Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators. • Determination of new restrictions upon the selective process of signals and wavelet bases. • Demonstration of looser bounds interpolating between the tightest bound and the Massar-Spindel inequality. • Construction of finite ground states properly describing the tightest bound. • Establishment of an important connection with the discrete Weyl function.

  19. Chaos and the way of Zen: psychiatric nursing and the 'uncertainty principle'.

    PubMed

    Barker, P J

    1996-08-01

    The biological sciences have been dominated by 'classicist' science, predicated on the post-Enlightenment belief that a real world exists which behaves according to notions of causality and consistency. Although medicine, and by implication psychiatric nursing, derives its explanatory power from such a science, much of its focus (illness) is not amenable to causal explanation or prediction. The theoretical developments of the 'new physics' have been used to redefine science and, as a result, have challenged traditional constructions of reality. The new physics is usually framed in terms of the physical world, or used to construe consciousness. In this paper I shall consider the implications of chaos (a relative of the new physics) for psychiatric nursing practice. As nursing appears to crave a 'certainty principle' to govern the theoretical underpinnings of practice, this study considers how chaos might contribute to a metaparadigm of nursing. PMID:8997984

  1. Our Electron Model Vindicates Schrödinger's Incomplete Results and Requires Restatement of Heisenberg's Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    McLeod, David; McLeod, Roger

    2008-04-01

    The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy, physiology of hearing, 1961, performed pressure input Rect x inputs that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves sign sense of EMF vectors, and is hard wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

  2. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    NASA Technical Reports Server (NTRS)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, Δt. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width ΔE, it is observed that the other member's wave packet collapses upon coincidence detection to a duration Δt, such that ΔE·Δt ≈ h/2π, where this duration Δt is an inner time, in the sense of Aharonov and Bohm. We have measured Δt by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.

  3. On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Lapshin, Aleksey

    2013-04-01

    The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. Because of Heisenberg's uncertainty principle, the SFT exhibits weak spatial localization, which manifests in the following ways. Firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies present in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (window), is used for this purpose. A narrow enough window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, this results in a correspondingly large uncertainty in frequency. Choosing a wide window will, according to the same principle, increase the time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to establish the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at the current moment of time: it is possible to speak only about a range of frequencies. Besides, it is impossible to specify precisely the time moment of the presence of this or that frequency: it is possible to speak only about a time frame. It is this feature that imposes major constraints on the applicability of the STFT. In spite of the fact that the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independent of the transform applied, there is a possibility to analyze any signal using an alternative approach - the multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis. Thanks to it, low frequencies can be shown in a more detailed form with respect to time, and high ones - with respect to frequency. The paper presents the results of calculating the components of the deflection of the vertical, done by the SFT, STFT and WT. The results are presented in the form of 3-D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the application of the wavelet transform to calculate the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
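
    The window tradeoff described above can be made concrete with a short sketch (a generic illustration requiring SciPy, unrelated to the gravimetric data): halving the frequency-bin width necessarily doubles the time step, and their product stays fixed.

      import numpy as np
      from scipy.signal import stft

      fs = 1000
      t = np.arange(0, 2.0, 1.0 / fs)
      # 50 Hz tone throughout; a 120 Hz tone switches on at t = 1 s
      sig = np.sin(2 * np.pi * 50 * t) + (t > 1.0) * np.sin(2 * np.pi * 120 * t)

      for nperseg in (32, 128, 512):               # narrow, medium, wide analysis windows
          f, tt, Z = stft(sig, fs=fs, nperseg=nperseg)
          df, dt = f[1] - f[0], tt[1] - tt[0]
          print(f"window {nperseg:4d}: df = {df:6.2f} Hz, dt = {dt:5.3f} s, df*dt = {df * dt}")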

  4. Adding a strategic edge to human factors/ergonomics: principles for the management of uncertainty as cornerstones for system design.

    PubMed

    Grote, Gudela

    2014-01-01

    It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation human factors/ergonomics based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore. PMID:23622735

  5. The special theory of Brownian relativity: equivalence principle for dynamic and static random paths and uncertainty relation for diffusion.

    PubMed

    Mezzasalma, Stefano A

    2007-03-15

    The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected. PMID:17223124

  6. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics and too narrow uncertainty ranges will identify few or no regions with similar climate, while too few indicators and too wide uncertainty ranges will mark overly large regions as climatically similar, which may not be correct. Similarity cannot be explored merely by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, such as maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved. For Climate Twins identification it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague to conduct a useful Climate Twins region search. The Climate Twins tool currently compares future climate conditions of a certain source area in the Greater Alpine Region with current climate conditions of the whole of Europe and the neighbouring southern and south-eastern areas as target regions. A next version will integrate web-crawling features for searching information about climate-related local adaptations observed today in the target region, which may turn out to be appropriate solutions for the source region under future climate conditions. The contribution will present the current tool functionality and will discuss which indicator sets, similarity conditions and uncertainty ranges work best to deliver scientifically sound climate comparisons and distinct mapping results.
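
    A minimal sketch of the similarity test described above might look as follows; the indicator names, values and tolerances are invented for illustration and are not taken from the tool:

    ```python
    # A cell counts as a "Climate Twin" only if every indicator of its current
    # climate falls within the uncertainty range around the source's future climate.
    source = {"t_mean": 12.4, "t_max": 31.0, "precip": 610.0}   # future climate (assumed)
    tolerance = {"t_mean": 1.0, "t_max": 1.5, "precip": 75.0}   # uncertainty ranges (assumed)

    def is_climate_twin(candidate: dict) -> bool:
        return all(abs(candidate[k] - source[k]) <= tolerance[k] for k in source)

    print(is_climate_twin({"t_mean": 12.9, "t_max": 30.2, "precip": 640.0}))  # True
    print(is_climate_twin({"t_mean": 14.1, "t_max": 30.2, "precip": 640.0}))  # False
    ```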

  7. Two new kinds of uncertainty relations

    NASA Technical Reports Server (NTRS)

    Uffink, Jos

    1994-01-01

    We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.

  8. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  9. Entropic uncertainty relations for multiple measurements

    NASA Astrophysics Data System (ADS)

    Liu, Shang; Mu, Liang-Zhu; Fan, Heng

    2015-04-01

    We present the entropic uncertainty relations for multiple measurement settings which demonstrate the uncertainty principle of quantum mechanics. Those uncertainty relations are obtained for both cases with and without the presence of quantum memory, and can be proven by a unified method. Our results recover some well known entropic uncertainty relations for two observables, which show the uncertainties about the outcomes of two incompatible measurements. The bounds of those relations which quantify the extent of the uncertainty take concise forms and are easy to calculate. Those uncertainty relations might play important roles in the foundations of quantum theory. Potential experimental demonstration of those entropic uncertainty relations is discussed.
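
    For readers who want a concrete instance of the two-observable relations that such results recover, the sketch below (an illustration under stated assumptions, not the paper's multi-measurement bound) checks the Maassen-Uffink relation H(X) + H(Z) >= -2 log2 c for a qubit, where c is the largest overlap between the two measurement bases:

    ```python
    import numpy as np

    def shannon(p):
        p = p[p > 1e-12]
        return float(-(p * np.log2(p)).sum())

    psi = np.array([np.cos(0.3), np.sin(0.3)])     # arbitrary pure qubit state (assumed)
    X = np.eye(2)                                  # computational basis
    Z = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # conjugate (Hadamard) basis

    pX, pZ = np.abs(X @ psi)**2, np.abs(Z @ psi)**2
    c = max(abs(float(X[i] @ Z[j])) for i in range(2) for j in range(2))
    print(shannon(pX) + shannon(pZ), ">=", -2*np.log2(c))   # e.g. 1.19 >= 1.0
    ```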

  10. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  11. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimating measurement uncertainty is often not trivial; several strategies that have been developed for this purpose are described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
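
    The compliance decision mentioned at the end can be captured in a few lines; this is a generic guard-band sketch with invented numbers, not a rule taken from the chapter:

    ```python
    # Compare the result plus/minus its expanded uncertainty against a limit.
    result, U, limit = 48.2, 2.5, 50.0   # measured value, expanded uncertainty (k=2), limit (assumed)
    if result + U <= limit:
        print("compliant, even allowing for the uncertainty")
    elif result - U > limit:
        print("non-compliant, even allowing for the uncertainty")
    else:
        print("inside the guard band: no firm compliance statement possible")
    ```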

  12. Laser triangulation: fundamental uncertainty in distance measurement.

    PubMed

    Dorsch, R G; Häusler, G; Herrmann, J M

    1994-03-01

    We discuss the uncertainty limit in distance sensing by laser triangulation. The uncertainty in distance measurement of laser triangulation sensors and other coherent sensors is limited by speckle noise. Speckle arises because of the coherent illumination in combination with rough surfaces. A minimum limit on the distance uncertainty is derived through speckle statistics. This uncertainty is a function of wavelength, observation aperture, and speckle contrast in the spot image. Surprisingly, it is the same distance uncertainty that we obtained from a single-photon experiment and from Heisenberg's uncertainty principle. Experiments confirm the theory. An uncertainty principle connecting lateral resolution and distance uncertainty is introduced. Design criteria for a sensor with minimum distance uncertainty are determined: small temporal coherence, small spatial coherence, and a large observation aperture. PMID:20862156

  13. Quantal localization and the uncertainty principle

    SciTech Connect

    Leopold, J.G.; Richards, D.

    1988-09-01

    We give a dynamical explanation for the localization of the wave function for the one-dimensional hydrogen atom, with the Coulomb singularity, in a high-frequency electric field, which leads to a necessary condition for classical dynamics to be valid. Numerical tests confirm the accuracy of the condition. Our analysis is relevant to the comparison between the classical and quantal dynamics of the kicked rotor and standard map.

  14. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  15. Reformulating the Quantum Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-01

    Uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations, the operator form concerning the variances of physical observables and the entropy form related to entropic quantities. Both these forms are inequalities involving pairwise observables, and are found to be nontrivial to incorporate multiple observables. In this work we introduce a new form of uncertainty relation which may give out complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the “triviality” problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.

  16. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    Uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations, the operator form concerning the variances of physical observables and the entropy form related to entropic quantities. Both these forms are inequalities involving pairwise observables, and are found to be nontrivial to incorporate multiple observables. In this work we introduce a new form of uncertainty relation which may give out complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  17. Reformulating the Quantum Uncertainty Relation

    PubMed Central

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    Uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations, the operator form concerning the variances of physical observables and the entropy form related to entropic quantities. Both these forms are inequalities involving pairwise observables, and are found to be nontrivial to incorporate multiple observables. In this work we introduce a new form of uncertainty relation which may give out complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum state dependent or not directly measurable, our bounds for variances of observables are quantum state independent and immune from the triviality problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for the reason why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197

  18. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  19. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  1. Rényi entropy uncertainty relation for successive projective measurements

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-06-01

    We investigate the uncertainty principle for two successive projective measurements in terms of Rényi entropy, based on a single quantum system. Our results cover a large family of entropic uncertainty relations (including the Shannon entropy) with an optimal lower bound. We compare our relation with other formulations of the uncertainty principle for two spin observables measured on a pure qubit state. It is shown that the lower bound of our uncertainty relation is tighter.
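
    As a hedged illustration of the quantities involved (not the paper's optimal bound), the sketch below computes Rényi entropies for two successive projective measurements on a qubit, where the second measurement acts on the post-measurement state of the first:

    ```python
    import numpy as np

    def renyi(p, alpha):
        return float(np.log2((p**alpha).sum()) / (1.0 - alpha))

    psi = np.array([np.cos(0.3), np.sin(0.3)])     # pure qubit state (assumed)
    A = np.eye(2)                                  # first measurement basis
    B = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # second measurement basis

    pA = np.abs(A @ psi)**2                        # first-outcome distribution
    # second-outcome distribution, averaged over post-measurement states of A
    pB = np.array([sum(pA[i] * (B[j] @ A[i])**2 for i in range(2)) for j in range(2)])
    print(renyi(pA, 2.0) + renyi(pB, 2.0))         # sum of the two outcome entropies
    ```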

  2. Majorization formulation of uncertainty in quantum mechanics

    SciTech Connect

    Partovi, M. Hossein

    2011-11-15

    Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.
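
    The partial order at the heart of this formulation is easy to test numerically; the following sketch (illustrative only) checks whether one probability vector is majorized by another:

    ```python
    import numpy as np

    def majorized_by(p, q):
        """True if p is majorized by q, i.e. p is 'more uncertain' than q."""
        p, q = np.sort(p)[::-1], np.sort(q)[::-1]
        return bool(np.all(np.cumsum(p) <= np.cumsum(q) + 1e-12))

    print(majorized_by([0.5, 0.3, 0.2], [0.7, 0.2, 0.1]))   # True
    print(majorized_by([0.7, 0.2, 0.1], [0.5, 0.3, 0.2]))   # False
    ```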

  3. Generalized Entropic Uncertainty Relations with Tsallis' Entropy

    NASA Technical Reports Server (NTRS)

    Portesi, M.; Plastino, A.

    1996-01-01

    A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
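
    For reference, the q-entropy underlying such generalized measures reduces to the Shannon entropy as q tends to 1; a minimal sketch of the definition:

    ```python
    import numpy as np

    def tsallis(p, q):
        """Tsallis q-entropy S_q = (1 - sum p_i^q) / (q - 1)."""
        p = np.asarray(p, dtype=float)
        if abs(q - 1.0) < 1e-9:                       # Shannon limit
            return float(-(p[p > 0] * np.log(p[p > 0])).sum())
        return float((1.0 - (p**q).sum()) / (q - 1.0))

    p = [0.7, 0.2, 0.1]
    for q in (0.5, 1.0, 2.0):
        print(q, tsallis(p, q))
    ```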

  4. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
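
    The statistical-sampling route discussed above is easy to demonstrate; the sketch below (the toy model and parameter ranges are assumptions) draws a Latin Hypercube sample with scipy and pushes it through a stand-in for a large computer code:

    ```python
    import numpy as np
    from scipy.stats import qmc

    def model(x):                                  # toy stand-in for a computer code
        return 3.0 * x[:, 0] + x[:, 1]**2

    sampler = qmc.LatinHypercube(d=2, seed=0)
    u = sampler.random(n=256)                      # stratified samples in [0, 1)^2
    x = qmc.scale(u, l_bounds=[0.0, 1.0], u_bounds=[1.0, 2.0])
    y = model(x)
    print(f"output mean {y.mean():.3f}, std {y.std(ddof=1):.3f}")
    ```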

  5. Principles of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Landé, Alfred

    2013-10-01

    Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ? (x) and ? (p); 11. Complementarity; 12. Mathematical relation between ? (x) and ? (p) for free particles; 13. General relation between ? (q) and ? (p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ? (t) and ? (?); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ? and ?; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for ?p (q) and Xq (p); 39. Differential equation for ?? (q); 40. The general probability amplitude ??' (Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schrödinger's equation for non-conservative systems; 46. Perturbation theory; 47. Orthogonality, normalization and Hermitian conjugacy; 48. General matrix elements; Part IV. The Principle of Correspondence: 49. Contact transformations in classical mechanics; 50. Point transformations; 51. Contact transformations in quantum mechanics; 52. Constants of motion and angular co-ordinates; 53. Periodic orbits; 54. De Broglie and Schrödinger function; correspondence to classical mechanics; 55. Packets of probability; 56. Correspondence to hydrodynamics; 57. Motion and scattering of wave packets; 58. Formal correspondence between classical and quantum mechanics; Part V. Mathematical Appendix: Principle of Invariance: 59. The general theorem of transformation; 60. Operator calculus; 61. Exchange relations; three criteria for conjugacy; 62. First method of canonical transformation; 63. Second method of canonical transformation; 64. Proof of the transformation theorem; 65. Invariance of the matrix elements against unitary transformations; 66. Matrix mechanics; Index of literature; Index of names and subjects.

  6. Uncertainty in the Classroom--Teaching Quantum Physics

    ERIC Educational Resources Information Center

    Johansson, K. E.; Milstead, D.

    2008-01-01

    The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how…

  7. Entropic uncertainty relations in multidimensional position and momentum spaces

    SciTech Connect

    Huang Yichen

    2011-05-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

  8. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so that they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

  9. Abolishing the maximum tension principle

    NASA Astrophysics Data System (ADS)

    Dąbrowski, Mariusz P.; Gohar, H.

    2015-09-01

    We find a series of example theories for which the relativistic limit of maximum tension F_max = c^4/(4G), represented by the entropic force, can be abolished. Among them are varying-constants theories, some generalized entropy models applied to both cosmological and black-hole horizons, as well as some generalized uncertainty principle models.

  10. Principled leadership.

    PubMed

    Lexa, Frank James

    2010-07-01

    Leadership is not just a set of activities; it is also about vision and character. Principles matter: for you, for your coworkers, and for the group or institution you serve. Individuals and groups can succeed only through a climate of commitment and trust. Your integrity and principled leadership are the cornerstones for building an effective team. Following principles doesn't mean that you will win every time, but having a plan and sticking to it even in tough times is a strong element of long-term success. PMID:20630389

  11. Messaging climate change uncertainty

    NASA Astrophysics Data System (ADS)

    Cooke, Roger M.

    2015-01-01

    Climate change is full of uncertainty and the messengers of climate science are not getting the uncertainty narrative right. To communicate uncertainty one must first understand it, and then avoid repeating the mistakes of the past.

  12. Angular performance measure for tighter uncertainty relations

    SciTech Connect

    Hradil, Z.; Rehacek, J.; Klimov, A. B.; Rigas, I.; Sanchez-Soto, L. L.

    2010-01-15

    The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance measure that allows for tighter uncertainty relations for angle and angular momentum. The differences with previous bounds can be significant for particular states and indeed may be amenable to experimental measurement with the present technology.

  13. Angular performance measure for tighter uncertainty relations

    NASA Astrophysics Data System (ADS)

    Hradil, Z.; Řeháček, J.; Klimov, A. B.; Rigas, I.; Sánchez-Soto, L. L.

    2010-01-01

    The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance measure that allows for tighter uncertainty relations for angle and angular momentum. The differences with previous bounds can be significant for particular states and indeed may be amenable to experimental measurement with the present technology.

  14. The physical origins of the uncertainty theorem

    NASA Astrophysics Data System (ADS)

    Giese, Albrecht

    2013-10-01

    The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

  15. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  16. Uncertainty as knowledge.

    PubMed

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D

    2015-11-28

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  17. Psychosomatic Principles

    PubMed Central

    Cleghorn, R. A.

    1965-01-01

    There are four lines of development that might be called psychosomatic principles. The first represents the work initiated by Claude Bernard, Cannon, and others, in neurophysiology and endocrinology in relationship to stress. The second is the application of psychoanalytic formulations to the understanding of illness. The third is in the development of the social sciences, particularly anthropology, social psychology and sociology with respect to the emotional life of man, and, fourth, there is an increased application of epidemiological techniques to the understanding and incidence of disease and its causes. These principles can be applied to the concepts of comprehensive medicine and they bid fair to be unifying and helpful in its study. This means that future practitioners, as well as those working in the field of psychosomatic medicine, are going to have to have a much more precise knowledge of the influence of emotions on bodily processes. PMID:14259334

  18. Radar principles

    NASA Technical Reports Server (NTRS)

    Sato, Toru

    1989-01-01

    Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.

  19. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of 10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around 10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from 10% to 15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
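
    The Monte Carlo idea is simple to sketch; in the toy example below (all numbers invented), a systematic multiplicative discharge error stands in for rating-curve uncertainty and a low-flow signature is recomputed for each realization:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    q_obs = rng.lognormal(mean=1.0, sigma=0.8, size=3650)   # synthetic daily flows

    def q95(q):                        # low-flow signature: flow exceeded 95% of the time
        return np.percentile(q, 5)

    # one multiplicative error per realization, a crude stand-in for rating bias
    samples = [q95(q_obs * rng.normal(1.0, 0.10)) for _ in range(2000)]
    lo, hi = np.percentile(samples, [2.5, 97.5])
    print(f"Q95 = {q95(q_obs):.3f}, 95% uncertainty interval [{lo:.3f}, {hi:.3f}]")
    ```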

  20. Role of information theoretic uncertainty relations in quantum theory

    SciTech Connect

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  1. Role of information theoretic uncertainty relations in quantum theory

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo

    2015-04-01

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson-Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson-Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  2. Anti Heisenberg - Refutation of Heisenberg's Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Barukčić, Ilija

    2011-03-01

    The quantum mechanical uncertainty principle for position and momentum plays an important role in many treatments of the (philosophical, physical and other) implications of quantum mechanics. Roughly speaking, the more precisely the momentum (position) of a (quantum mechanical) object is given, the less precisely can one say what its position (momentum) is. This quantum mechanical measurement problem is not just an interpretational difficulty; it raises broader issues as well. The measurement (of a property) of a (quantum mechanical) object determines the existence of the measured. In brief, the quantum mechanical uncertainty principle challenges some fundamental principles of science, especially the principle of causality. In particular, an independently existing (external) objective reality is denied. As we shall see, the quantum mechanical uncertainty principle for position and momentum is based on the assumption that 1 = 0, which is a logical contradiction.

  3. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    McComiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
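
    The propagation described (uncertainty contribution = sensitivity times measurement uncertainty, combined in quadrature) can be sketched as follows; the sensitivities and uncertainties below are invented, not the record's values:

    ```python
    import math

    sens = {"aod": 25.0, "ssa": 60.0, "asym": 10.0, "albedo": 8.0}   # W m-2 per unit (assumed)
    meas = {"aod": 0.01, "ssa": 0.03, "asym": 0.02, "albedo": 0.01}  # measurement unc. (assumed)

    contrib = {k: sens[k] * meas[k] for k in sens}
    total = math.sqrt(sum(v**2 for v in contrib.values()))
    for k, v in sorted(contrib.items(), key=lambda kv: -kv[1]):
        print(f"{k:7s}: {v:.2f} W m-2")
    print(f"total  : {total:.2f} W m-2")    # single scattering albedo dominates here
    ```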

  4. Regulating under uncertainty: newsboy for exposure limits.

    PubMed

    Cooke, Roger M; Macdonell, Margaret

    2008-06-01

    Setting action levels or limits for health protection is complicated by uncertainty in the dose-response relation across a range of hazards and exposures. To address this issue, we consider the classic newsboy problem. The principles used to manage uncertainty for that case are applied to two stylized exposure examples, one for high dose and high dose rate radiation and the other for ammonia. Both incorporate expert judgment on uncertainty quantification in the dose-response relationship. The mathematical technique of probabilistic inversion also plays a key role. We propose a coupled approach, whereby scientists quantify the dose-response uncertainty using techniques such as structured expert judgment with performance weights and probabilistic inversion, and stakeholders quantify associated loss rates. PMID:18643816
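
    The classic newsboy solution is the critical fractile q* = F^-1(cu / (cu + co)); the hedged sketch below adapts it to a protective exposure limit under an assumed lognormal threshold distribution (all numbers are illustrative, not from the paper):

    ```python
    from scipy.stats import lognorm

    cu = 9.0        # cost of under-protection (limit set too high), assumed
    co = 1.0        # cost of over-regulation (limit set too low), assumed
    fractile = cu / (cu + co)                     # = 0.9

    threshold = lognorm(s=0.5, scale=100.0)       # uncertain safe-exposure threshold (assumed)
    limit = threshold.ppf(1.0 - fractile)         # protective low quantile
    print(f"exposure limit: {limit:.1f} (10th percentile of the threshold)")
    ```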

  5. Uncertainty in audiometer calibration

    NASA Astrophysics Data System (ADS)

    Aurélio Pedroso, Marcos; Gerges, Samir N. Y.; Gonçalves, Armando A., Jr.

    2004-02-01

    The objective of this work is to present a metrology study necessary for the accreditation of audiometer calibration procedures at the National Brazilian Institute of Metrology Standardization and Industrial Quality—INMETRO. A model for the calculation of measurement uncertainty was developed. Metrological aspects relating to audiometer calibration, traceability and measurement uncertainty were quantified through comparison between results obtained at the Industrial Noise Laboratory—LARI of the Federal University of Santa Catarina—UFSC and the Laboratory of Electric/acoustics—LAETA of INMETRO. Similar metrological performance of the measurement system used in both laboratories was obtained, indicating that the interlaboratory results are compatible with the expected values. The uncertainty calculation was based on the documents: EA-4/02 Expression of the Uncertainty of Measurement in Calibration (European Co-operation for Accreditation 1999 EA-4/02 p 79) and Guide to the Expression of Uncertainty in Measurement (International Organization for Standardization 1993 1st edn, corrected and reprinted in 1995, Geneva, Switzerland). Some sources of uncertainty were calculated theoretically (uncertainty type B) and other sources were measured experimentally (uncertainty type A). The global value of uncertainty calculated for the sound pressure levels (SPLs) is similar to that given by other calibration institutions. The results of uncertainty related to measurements of SPL were compared with the maximum uncertainties Umax given in the standard IEC 60645-1: 2001 (International Electrotechnical Commission 2001 IEC 60645-1 Electroacoustics—Audiological Equipment—Part 1:—Pure-Tone Audiometers).
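
    A minimal GUM-style budget of the kind used in such studies combines Type A and Type B components in quadrature and expands with a coverage factor k = 2; the component values below are invented for illustration:

    ```python
    import math

    type_a = [0.12]                         # dB, repeatability (measured)
    type_b = [0.20 / math.sqrt(3),          # dB, reference spec, rectangular distribution
              0.10 / math.sqrt(3),          # dB, resolution, rectangular distribution
              0.15]                         # dB, from a calibration certificate (k=1)

    u_c = math.sqrt(sum(u**2 for u in type_a + type_b))
    print(f"combined u_c = {u_c:.3f} dB, expanded U (k=2) = {2*u_c:.3f} dB")
    ```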

  6. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing body of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  7. Uncertainty quantification for Markov chain models

    NASA Astrophysics Data System (ADS)

    Meidani, Hadi; Ghanem, Roger

    2012-12-01

    Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
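
    A crude numerical illustration of propagating transition-matrix uncertainty for a two-state disease-spread chain follows; Dirichlet-distributed rows are a convenient stand-in here, not the paper's maximum-entropy measure:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = np.array([[80.0, 20.0],         # pseudo-counts, row "healthy" (assumed)
                      [30.0, 70.0]])        # pseudo-counts, row "infected" (assumed)

    def infected_after(P, steps=30, p0=(0.99, 0.01)):
        p = np.asarray(p0)
        for _ in range(steps):
            p = p @ P                       # evolve the state distribution
        return p[1]

    samples = [infected_after(np.vstack([rng.dirichlet(a) for a in alpha]))
               for _ in range(5000)]
    print(f"infected fraction at t=30: {np.mean(samples):.3f} +/- {np.std(samples):.3f}")
    ```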

  8. Intolerance of Uncertainty

    PubMed Central

    Beier, Meghan L.

    2015-01-01

    Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of intolerance of uncertainty has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700

  9. [Ethics, empiricism and uncertainty].

    PubMed

    Porz, R; Zimmermann, H; Exadaktylos, A K

    2011-01-01

    Accidents can lead to difficult boundary situations. Such situations often take place in the emergency units. The medical team thus often and inevitably faces professional uncertainty in their decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in treatment or withdrawal of treatment. It does not need to be covered in evidence-based arguments, especially as some singular situations of individual tragedies cannot be grasped in terms of evidence-based medicine. PMID:21181616

  10. Generalized uncertainty principle and self-adjoint operators

    SciTech Connect

    Balasubramanian, Venkat; Das, Saurya; Vagenas, Elias C.

    2015-09-15

    In this work we explore the self-adjointness of the GUP-modified momentum and Hamiltonian operators over different domains. In particular, we utilize von Neumann's theorem for symmetric operators in order to determine whether the momentum and Hamiltonian operators are self-adjoint or not, or whether they have self-adjoint extensions over the given domain. In addition, a simple example of the Hamiltonian operator describing a particle in a box is given. The solutions of the boundary conditions that describe the self-adjoint extensions of the specific Hamiltonian operator are obtained.

  11. The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment

    ERIC Educational Resources Information Center

    Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea

    2010-01-01

    An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…

  13. Time-energy uncertainty relation and Lorentz covariance

    SciTech Connect

    Hussar, P.E.; Kim, Y.S.; Noz, M.E.

    1985-02-01

    The uncertainty relations applicable to space and time separations between the quarks in a hadron are discussed. It is pointed out that the time-energy uncertainty relation between the time and energy separations is the same as the relationship between the widths and lifetimes of unstable states. It is then shown that this relation can be combined with Heisenberg's position-momentum uncertainty relation to give the uncertainty principle in a covariant form. It is pointed out that this effect manifests itself in Feynman's parton picture.

  14. Physics and Operational Research: measure of uncertainty via Nonlinear Programming

    NASA Astrophysics Data System (ADS)

    Davizon-Castillo, Yasser A.

    2008-03-01

    Physics and Operational Research present an interdisciplinary interaction in problems such as Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming) subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review addresses problems in kinematics and thermal physics, developing uncertainty relations for each case of study as a novel way to quantify uncertainty.
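
    In the spirit of the abstract, one can pose a textbook problem as an NLP: minimize the harmonic-oscillator energy over the dispersions (dx, dp) subject to the Heisenberg constraint dx*dp >= hbar/2. The sketch below (units and solver choice are assumptions, not the paper's formulation) recovers the ground-state energy hbar*w/2:

    ```python
    from scipy.optimize import minimize

    hbar = m = w = 1.0                            # natural units (assumed)

    energy = lambda v: v[1]**2 / (2*m) + 0.5 * m * w**2 * v[0]**2   # E(dx, dp)
    cons = {"type": "ineq", "fun": lambda v: v[0]*v[1] - hbar/2}    # dx*dp >= hbar/2

    res = minimize(energy, x0=[1.0, 1.0], constraints=[cons],
                   bounds=[(1e-6, None), (1e-6, None)])
    print(f"E_min = {res.fun:.3f}  (exact: hbar*w/2 = {hbar*w/2})")
    ```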

  15. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
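
    Because PPD depends only on PMV, an uncertainty in PMV propagates directly into PPD; the sketch below uses the ISO 7730 PPD formula with an assumed PMV value and standard uncertainty (the numbers are illustrative, not from the paper):

    ```python
    import numpy as np

    def ppd(pmv):
        """ISO 7730: PPD = 100 - 95*exp(-0.03353*PMV^4 - 0.2179*PMV^2)."""
        return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

    rng = np.random.default_rng(0)
    pmv = rng.normal(loc=0.5, scale=0.15, size=100_000)   # PMV = 0.50(15), assumed
    print(f"PPD = {ppd(pmv).mean():.1f} % with standard uncertainty {ppd(pmv).std():.1f} %")
    ```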

  16. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  17. Deterministic uncertainty analysis

    SciTech Connect

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig.

  18. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  19. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  20. Time crystals from minimum time uncertainty

    NASA Astrophysics Data System (ADS)

    Faizal, Mir; Khalil, Mohammed M.; Das, Saurya

    2016-01-01

    Motivated by the Generalized Uncertainty Principle, covariance, and a minimum measurable time, we propose a deformation of the Heisenberg algebra and show that this leads to corrections to all quantum mechanical systems. We also demonstrate that such a deformation implies a discrete spectrum for time. In other words, time behaves like a crystal. As an application of our formalism, we analyze the effect of such a deformation on the rate of spontaneous emission in a hydrogen atom.

  1. Controlling entropic uncertainty bound through memory effects

    NASA Astrophysics Data System (ADS)

    Karpat, Göktuğ; Piilo, Jyrki; Maniscalco, Sabrina

    2015-09-01

    One of the defining traits of quantum mechanics is the uncertainty principle which was originally expressed in terms of the standard deviation of two observables. Alternatively, it can be formulated using entropic measures, and can also be generalized by including a memory particle that is entangled with the particle to be measured. Here we consider a realistic scenario where the memory particle is an open system interacting with an external environment. Through the relation of conditional entropy to mutual information, we provide a link between memory effects and the rate of change of conditional entropy controlling the lower bound of the entropic uncertainty relation. Our treatment reveals that the memory effects stemming from the non-Markovian nature of quantum dynamical maps directly control the lower bound of the entropic uncertainty relation in a general way, independently of the specific type of interaction between the memory particle and its environment.

  2. The Uncertainty Relation for Quantum Propositions

    NASA Astrophysics Data System (ADS)

    Zizzi, Paola

    2013-01-01

    Logical propositions with the fuzzy modality "Probably" are shown to obey an uncertainty principle very similar to that of Quantum Optics. In the case of such propositions, the partial truth values are in fact probabilities. The corresponding assertions in the metalanguage have complex assertion degrees which can be interpreted as probability amplitudes. In the logical case, the uncertainty relation is about the assertion degree, which plays the role of the phase, and the total number of atomic propositions, which plays the role of the number of modes. In analogy with coherent states in quantum physics, we define as "quantum coherent propositions" those which minimize the above logical uncertainty relation. Finally, we show that there is only one kind of compound quantum-coherent proposition: the "cat state" propositions.

  3. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications. PMID:22042902

  4. 3 Component PIV Uncertainty

    NASA Astrophysics Data System (ADS)

    Warner, Scott; Smith, Barton

    2013-11-01

    The random uncertainty of 2-component (2C) Particle Image Velocimetry (PIV) has recently been addressed by three distinct methods: the Uncertainty Surface Method (USM) from Utah State University, the Image Matching (IM) method from LaVision and Delft, and correlation Signal-to-Noise Ratio (SNR) methods from Virginia Tech. Since 3C (stereo) PIV velocity fields are derived from two 2C fields, random uncertainties from the 2C fields clearly propagate into the 3C field. In this work, we will demonstrate such a propagation using commercial PIV software and the USM method, although the propagation works similarly for any 2C random uncertainty method. Stereo calibration information is needed to perform this propagation. As a starting point, a pair of 2C uncertainty fields will be combined in exactly the same manner as velocity fields to form a 3C uncertainty field using commercial software. Correlated uncertainties between the components in the two 2C fields will be addressed. These results will then be compared to a more rigorous propagation, which requires access to the calibration information. Thanks to the Nuclear Science & Technology Directorate at Idaho National Laboratory. The work was supported through the U.S. Department of Energy, Laboratory Directed Research & Development grant under DOE Contract 122440 (Project Number: 12-045).
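
    A hedged sketch of the propagation idea: for an idealized symmetric stereo arrangement with camera half-angle alpha, the out-of-plane component is w = (u1 - u2) / (2 tan alpha), so a first-order propagation combines the two 2C uncertainties and their correlation. The geometry, function name, and numbers below are illustrative assumptions, not the USM or any commercial implementation.

    ```python
    import numpy as np

    def stereo_w_uncertainty(sigma_u1, sigma_u2, alpha, rho12=0.0):
        """First-order uncertainty of the out-of-plane velocity w.

        Assumes a simplified symmetric stereo geometry in which
        w = (u1 - u2) / (2 tan(alpha)); real systems use the full
        calibration mapping instead. rho12 is the correlation between
        the two 2C uncertainties at this grid point.
        """
        var = (sigma_u1**2 + sigma_u2**2
               - 2.0 * rho12 * sigma_u1 * sigma_u2) / (2.0 * np.tan(alpha))**2
        return np.sqrt(var)

    # Example: 0.1 px uncertainty in each 2C field, 35 degree half-angle
    print(stereo_w_uncertainty(0.1, 0.1, np.radians(35.0)))
    ```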

  5. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  6. Adaptive framework for uncertainty analysis in electromagnetic field measurements.

    PubMed

    Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano

    2015-04-01

    Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has been adopted internationally as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. The framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures, and it incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a 28% reduction in measurement uncertainty. PMID:25143178
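
    A minimal sketch of the fusion idea, assuming a plain Gaussian likelihood in place of the paper's kernel mixture: draw samples from the prior, weight them by the likelihood of the current readings (importance sampling), and read the posterior uncertainty off the weighted ensemble. All distributions and numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative data: repeated EMF field-strength readings (V/m)
    readings = np.array([1.02, 0.97, 1.10, 1.05, 0.99])
    sigma_meas = 0.06                           # assumed instrument noise

    # Prior knowledge about the true field strength (assumed)
    theta = rng.normal(1.0, 0.2, size=100_000)  # proposal = prior

    # Importance weights: likelihood of the data at each prior sample
    z = (readings[:, None] - theta[None, :]) / sigma_meas
    log_w = -0.5 * (z**2).sum(axis=0)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    post_mean = np.sum(w * theta)
    post_std = np.sqrt(np.sum(w * (theta - post_mean)**2))
    print(f"field strength = {post_mean:.3f} +/- {post_std:.3f} V/m")
    ```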

  7. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  8. A Generalized Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Chen, Zhengli; Liang, Lili; Li, Haojing; Wang, Wenhua

    2015-08-01

    By using a generalization of the Wigner-Yanase-Dyson skew information, a quantity is introduced in this paper for every Hilbert-Schmidt operator A on a Hilbert space H, and a related uncertainty relation is established. The obtained inequality generalizes a known uncertainty relation. Moreover, a counterexample gives a negative answer to a conjecture posed in Dou and Du (Int. J. Theor. Phys. 53, 952-958, 2014).

  9. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements that vary over time had been studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, applicable under more general uncertainty conditions, that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainties. Robust optimization was first introduced in the operations research literature as a framework that incorporates information about the uncertainty sets for the parameters in the optimization model. Even though robust optimization originated from tackling uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving examples of how it can be applied to common network planning problems under uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to other network planning problems.
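
    A toy instance of the robust idea under a box uncertainty set, where the robust counterpart simply sizes capacity against the worst-case demand; the routes, costs, and demand figures are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Two candidate routes serving one traffic demand (illustrative)
    cost = np.array([1.0, 1.4])          # cost per unit of capacity
    route_cap = np.array([80.0, 120.0])  # physical capacity limits

    d_nominal, d_deviation = 100.0, 25.0
    d_worst = d_nominal + d_deviation    # box uncertainty: plan for worst case

    # minimize cost @ x   s.t.   x1 + x2 >= d_worst,  0 <= x <= route_cap
    res = linprog(c=cost,
                  A_ub=np.array([[-1.0, -1.0]]),  # -(x1 + x2) <= -d_worst
                  b_ub=np.array([-d_worst]),
                  bounds=list(zip([0.0, 0.0], route_cap)))
    print(res.x, res.fun)                # capacity plan and its cost
    ```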

  10. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  11. Serenity in political uncertainty.

    PubMed

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly equal percentages of males and females (53.2% vs 46.8%), most of whom lived with their family (92.5%) and whose family reported high income levels (68.4%). Multiple regression analyses revealed that the best determinants of well-being are resilience, uncertainty, social support, and gender, which together accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, the Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding of how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  12. Picture independent quantum action principle

    SciTech Connect

    Mantke, W.J.

    1992-01-01

    The Schwinger action principle for quantum mechanics is extended into a picture independent form. This displays the quantum connection. Time variations are formulated as variations of a time variable and included into the kinematical variations. Kets and bras represent experimental operations. Experimental operations at different times cannot be identified. The ket and the bra spaces are fiber bundles over time. The same applies to the classical configuration space. For the classical action principle the action can be varied by changing the path or the classical variables. The latter variation of classical functions corresponds to kinematical variations of quantum variables. The picture independent formulation represents time evolution by a connection. A standard experiment is represented by a ket, a connection and a bra. For particular start and end times of experiments, the action and the contraction into a transition amplitude are elements of a new tensor space of quantum correspondents of path functionals. The classical correspondent of the transition amplitude is the probability for a specified state to evolve along a particular path segment. The elements of the dual tensor space represent standard experiments or superpositions thereof. The kinematical variations of the quantum variables are commuting numbers. Variations that include the effect of Poincare or gauge transformations have different commutator properties. The Schwinger action principle is derived from the Feynman path integral formulation. The limitations from the time-energy uncertainty relation might be accommodated by superposing experiments that differ in their start- and end-times. In its picture independent form the action principle can be applied to all superpositions of standard experiments. This may involve superpositions of different connections. The extension of the superposition principle to connections allows representation of a quantum field by a part of the connection.

  13. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances, one of which is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. A generalization to multi-observable cases is also given, and an optimal lower bound for the weighted sum of the variances is obtained in the general quantum situation. PMID:26984295

  14. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  15. The value of uncertainty.

    PubMed

    Feldman, Michael

    2013-01-01

    The author discusses some of the characteristics of Roy Schafer's contributions to psychoanalysis that he finds most valuable, such as his openness to uncertainty, his anti-reductive view of analytic constructions, his unique formulation of the analyst's role, and his close attention to how the patient engenders particular emotional reactions in the analyst. The author also presents a clinical vignette illustrating the value of the analyst's tolerance of uncertainty in the face of the patient's push for interpretations, explanations, and reassurance. PMID:23457099

  16. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement of many SSA functions, including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments such as air and missile defense make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of the state, the covariance, and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment, and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants, which gives the level sets a distinctive "banana" or "boomerang" shape, and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten times as long as the latter. The filter correction step also furnishes a statistically rigorous prediction error, which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods, with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
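
    The classical Mahalanobis gate that the new metric extends can be sketched as follows; the state variables, covariance, and gate probability are illustrative assumptions, and the paper's non-Gaussian extension is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def mahalanobis_sq(x, mean, cov):
        """Squared Mahalanobis distance of an observation from a track."""
        d = x - mean
        return float(d @ np.linalg.solve(cov, d))

    # Illustrative 2D example: does an observation associate with a track?
    mean = np.array([7000.0, 1.50])   # e.g. semi-major axis (km), incl. (deg)
    cov = np.diag([25.0, 0.01])
    obs = np.array([7009.0, 1.45])

    m2 = mahalanobis_sq(obs, mean, cov)
    gate = chi2.ppf(0.99, df=2)       # 99% gate for 2 degrees of freedom
    print(m2, gate, m2 <= gate)
    ```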

  17. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  18. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation on the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore, the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  19. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  20. Quasar uncertainty study

    SciTech Connect

    Khatib-Rahbar, M.; Park, C.; Davis, R.; Nourbakhsh, H.; Lee, M.; Cazzoli, E.; Schmidt, E.

    1986-10-01

    Over the last decade, substantial development and progress have been made in the understanding of the nature of severe accidents and the associated fission product release and transport. As part of this continuing effort, the United States Nuclear Regulatory Commission (USNRC) sponsored the development of the Source Term Code Package (STCP), which models core degradation, fission product release from the damaged fuel, and the subsequent migration of the fission products from the primary system to the containment and finally to the environment. The objectives of the QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors) program are: (1) to address the uncertainties associated with input parameters and phenomenological models used in the STCP; and (2) to define reasonable and technically defensible parameter ranges and modeling assumptions for use in the STCP. The uncertainties in the radiological releases to the environment can be defined as the degree of current knowledge associated with the magnitude, timing, duration, and other pertinent characteristics of the release following a severe nuclear reactor accident. These uncertainties can be quantified by probability density functions (PDFs) using the Source Term Code Package as the physical model. An attempt will also be made to address the phenomenological issues not adequately modeled by the STCP, using more advanced, mechanistic models.

  1. Coping with Uncertainty.

    ERIC Educational Resources Information Center

    Wargo, John

    1985-01-01

    Draws conclusions on the scientific uncertainty surrounding most chemical use regulatory decisions, examining the evolution of law and science, benefit analysis, and improving information. Suggests: (1) rapid development of knowledge of chemical risks and (2) a regulatory system which is flexible to new scientific knowledge. (DH)

  2. Hybrid uncertainty theory

    SciTech Connect

    Oblow, E.M.

    1985-05-13

    A hybrid uncertainty theory for artificial intelligence problems combining the strengths of fuzzy-set theory and Dempster/Shafer theory is presented. The basic operations for combining uncertain information are given with an indication of their applicability in expert systems and robot planning problems.

  3. Reciprocity and uncertainty.

    PubMed

    Bereby-Meyer, Yoella

    2012-02-01

    Guala points to a discrepancy between strong negative reciprocity observed in the lab and the way cooperation is sustained "in the wild." This commentary suggests that in lab experiments, strong negative reciprocity is limited when uncertainty exists regarding the players' actions and intentions. Thus, costly punishment is indeed a limited mechanism for sustaining cooperation in an uncertain environment. PMID:22289307

  4. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. Realistically assessing uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory; therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  5. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed, and are being implemented today, do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method has been developed for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way, for use within Monte Carlo-type analyses.
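
    The dispersion step might look like the sketch below, which uses a two-piece (split) normal as one plausible way to sample within asymmetric bounds; the paper's actual dispersion method is not reproduced, and the coefficient values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_split_normal(center, sigma_minus, sigma_plus, size):
        """Draw samples with different spreads below and above the center.

        Two-piece normal: pick a side with probability proportional to
        that side's sigma, then draw a half-normal deviate on it.
        """
        p_plus = sigma_plus / (sigma_plus + sigma_minus)
        side = rng.random(size) < p_plus
        mag = np.abs(rng.standard_normal(size))
        return center + np.where(side, sigma_plus * mag, -sigma_minus * mag)

    # Disperse a pitching-moment coefficient within asymmetric bounds
    samples = sample_split_normal(center=0.012, sigma_minus=0.001,
                                  sigma_plus=0.004, size=10_000)
    print(samples.mean(), np.percentile(samples, [2.5, 97.5]))
    ```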

  6. The equivalence principle in a quantum world

    NASA Astrophysics Data System (ADS)

    Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre

    2015-09-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).

  7. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    SciTech Connect

    Amoroso, Richard L.

    2010-12-22

    For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string-theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, M4±C4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF-pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror-symmetric spacetime backcloth.

  8. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is NTE (not-to-exceed), MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE (e.g., nominal), then MGA values are reduced by A5 values, and A5 is representative of the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
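
    The mass bookkeeping above is plain arithmetic; a minimal sketch with hypothetical line items:

    ```python
    def predicted_mass(basic_mass_kg, mga_fraction):
        """Predicted = Basic + MGA, where MGA = Basic * assessed fraction."""
        return basic_mass_kg * (1.0 + mga_fraction)

    # (basic mass in kg, assessed MGA fraction) for hypothetical items
    items = [(120.0, 0.15), (45.0, 0.20), (300.0, 0.05)]

    basic_total = sum(b for b, _ in items)
    predicted_total = sum(predicted_mass(b, f) for b, f in items)

    # Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic
    aggregate_mga = (predicted_total - basic_total) / basic_total
    print(f"aggregate MGA = {aggregate_mga:.1%}")
    ```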

  9. Multiresolutional models of uncertainty generation and reduction

    NASA Technical Reports Server (NTRS)

    Meystel, A.

    1989-01-01

    Kolmogorov's axiomatic principles of probability theory are reconsidered in the scope of their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies algorithms of learning which are organized in three groups reflecting the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by, and consistent with, the multiresolutional model of knowledge representation which is reflected in the structure of models and the algorithms of learning.

  10. Separability conditions from the Landau-Pollak uncertainty relation

    SciTech Connect

    Vicente, Julio I. de; Sanchez-Ruiz, Jorge

    2005-05-15

    We obtain a collection of necessary (sufficient) conditions for a bipartite system of qubits to be separable (entangled), which are based on the Landau-Pollak formulation of the uncertainty principle. These conditions are tested and compared with previously stated criteria by applying them to states whose separability limits are already known. Our results are also extended to multipartite and higher-dimensional systems.

  11. The 4P Approach to Dealing with Scientific Uncertainty.

    ERIC Educational Resources Information Center

    Costanza, Robert; Cornwell, Laura

    1992-01-01

    Suggests a new approach to environmental protection that requires users of environmental resources to post a bond adequate to cover uncertain future environmental damages. Summarized as the "precautionary polluter pays principle," or the 4P approach, it shifts the burden of proof and the cost of uncertainty from the public to the resource user.…

  12. Intuitions, principles and consequences.

    PubMed

    Shaw, A B

    2001-02-01

    Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences. PMID:11233371

  13. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  14. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

  15. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2 °C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  16. Principles of Modern Soccer.

    ERIC Educational Resources Information Center

    Beim, George

    This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…

  17. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  18. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
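
    The classical calibration formulation described above, minimizing the squared model-data misfit while treating the model as exact, can be sketched in a few lines; the decay model and data are invented, and CUU would extend this by modeling error in both the simulation and the experiment.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Toy computer model: exponential decay with unknown rate k
    def model(k, t):
        return np.exp(-k * t)

    t_obs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y_obs = np.array([1.00, 0.62, 0.36, 0.22, 0.14])  # noisy observations

    # Classical calibration: parameter minimizing the squared misfit
    fit = least_squares(lambda k: model(k[0], t_obs) - y_obs, x0=[0.3])
    print("calibrated k =", fit.x[0])
    ```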

  19. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose, we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs-sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
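
    A minimal sketch of the underlying PMF idea, gradient descent on the observed entries only; BHPMF additionally imposes the taxonomic hierarchy and obtains per-prediction uncertainty via Gibbs sampling, neither of which appears in this illustrative snippet.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Sparse trait matrix: rows = species, cols = traits; NaN = missing
    X = np.array([[1.2, np.nan, 3.1],
                  [1.0, 2.2, np.nan],
                  [np.nan, 2.0, 2.9],
                  [0.9, 1.9, 2.8]])
    mask = ~np.isnan(X)
    Xf = np.nan_to_num(X)

    rank, lam, lr = 2, 0.1, 0.05
    U = 0.1 * rng.standard_normal((X.shape[0], rank))
    V = 0.1 * rng.standard_normal((X.shape[1], rank))

    for _ in range(2000):
        E = mask * (U @ V.T - Xf)       # residuals on observed entries only
        U -= lr * (E @ V + lam * U)     # gradient steps on the PMF objective
        V -= lr * (E.T @ U + lam * V)

    print(np.round(U @ V.T, 2))         # gap-filled trait matrix
    ```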

  1. Deterministic uncertainty analysis

    SciTech Connect

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs.
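
    The derivative-based idea can be sketched on a toy response (not the report's borehole model): propagate input standard deviations through first-order sensitivities and compare against a brute-force statistical estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def f(x):                    # toy response function
        return x[0]**2 + 3.0 * x[1]

    mu = np.array([2.0, 1.0])    # input means
    sig = np.array([0.1, 0.2])   # input standard deviations (independent)

    # Derivative-based propagation: var = sum_i (df/dx_i)^2 * sigma_i^2
    grad = np.array([2.0 * mu[0], 3.0])      # analytic sensitivities at mu
    sigma_first_order = np.sqrt(np.sum((grad * sig)**2))

    # Statistical (Monte Carlo) reference using many model runs
    samples = f(rng.normal(mu[:, None], sig[:, None], size=(2, 200_000)))
    print(sigma_first_order, samples.std())
    ```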

  2. Equivalence of wave-particle duality to entropic uncertainty.

    PubMed

    Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie

    2014-01-01

    Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter. PMID:25524138

  3. Using Models that Incorporate Uncertainty

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.

    2002-01-01

    In this article, the author discusses the use in policy analysis of models that incorporate uncertainty. He believes that all models should consider incorporating uncertainty, but that at the same time it is important to understand that sampling variability is not usually the dominant driver of uncertainty in policy analyses. He also argues that…

  4. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    PubMed Central

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail. PMID:26118488
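
    The quantum-memory relation referred to here, S(X|B) + S(Z|B) >= log2(1/c) + S(A|B) in the formulation of Berta et al., can be checked numerically for a Werner state with two mutually unbiased qubit measurements; the mixing parameter below is an illustrative assumption.

    ```python
    import numpy as np

    def entropy(rho):
        """von Neumann entropy in bits."""
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-np.sum(ev * np.log2(ev)))

    def trace_out_A(rho):
        r = rho.reshape(2, 2, 2, 2)     # indices a, b, a', b'
        return np.einsum('abad->bd', r)

    # Werner state: p |psi-><psi-| + (1 - p) I/4
    p = 0.8
    psi_m = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
    rho = p * np.outer(psi_m, psi_m) + (1 - p) * np.eye(4) / 4

    # Two mutually unbiased bases on qubit A (overlap c = 1/2)
    Z = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
    X = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

    def cond_entropy_after_meas(basis):
        """S(K|B) for the classical-quantum state after measuring A."""
        rho_KB = np.zeros((4, 4))
        for k, v in enumerate(basis):
            P = np.kron(np.outer(v, v), np.eye(2))
            rho_KB[2*k:2*k+2, 2*k:2*k+2] = trace_out_A(P @ rho @ P)
        return entropy(rho_KB) - entropy(trace_out_A(rho))

    lhs = cond_entropy_after_meas(X) + cond_entropy_after_meas(Z)
    rhs = 1.0 + entropy(rho) - entropy(trace_out_A(rho))  # log2(1/c) = 1
    print(lhs, ">=", rhs, lhs >= rhs - 1e-9)
    ```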

  5. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory.

    PubMed

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail. PMID:26118488

  6. Prediction uncertainty evaluation methods of core performance parameters in large liquid-metal fast breeder reactors

    SciTech Connect

    Takeda, T.; Yoshimura, A.; Kamei, T.; Shirakata, K.

    1989-10-01

    Formulas for predicting the uncertainty of neutronic performance parameters are derived for three methods: the bias factor method, the adjustment method, and the combined method. The prediction uncertainties are obtained by including both experimental and method errors. The adjustment method, in principle, yields the same uncertainty as the combined method. The derived formulas are applied to a large homogeneous 1000-MW (electric) liquid-metal fast breeder reactor core.

  7. Position-momentum uncertainty relations based on moments of arbitrary order

    SciTech Connect

    Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.

    2011-05-15

    The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
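
    For the quadratic-moment (Heisenberg) case that these moment-based relations generalize, the bound Delta-x * Delta-p >= 1/2 (with hbar = 1) is easy to verify numerically; the Gaussian wave packet below saturates it. Grid sizes are illustrative.

    ```python
    import numpy as np

    N, L = 2048, 80.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = x[1] - x[0]

    s = 1.3                                    # position width parameter
    psi = np.exp(-x**2 / (4 * s**2))
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

    p = 2 * np.pi * np.fft.fftfreq(N, d=dx)    # momentum grid (hbar = 1)
    dp = 2 * np.pi / (N * dx)
    prob_p = np.abs(np.fft.fft(psi))**2
    prob_p /= prob_p.sum() * dp                # normalized momentum density

    dx_std = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)
    dp_std = np.sqrt(np.sum(p**2 * prob_p) * dp)
    print(dx_std * dp_std)                     # ~0.5: Gaussian saturates it
    ```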

  8. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-Shui

    2015-06-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail.

  9. Schwarzschild mass uncertainty

    NASA Astrophysics Data System (ADS)

    Davidson, Aharon; Yellin, Ben

    2014-02-01

    Applying Dirac's procedure to r-dependent constrained systems, we derive a reduced total Hamiltonian, resembling an upside-down harmonic oscillator, which generates the Schwarzschild solution in the mini super-spacetime. Associated with the now r-dependent Schrödinger equation is a tower of localized Guth-Pi-Barton wave packets, orthonormal and non-singular, admitting equally spaced average-"energy" levels. Our approach is characterized by a universal quantum mechanical uncertainty structure which enters the game already at the flat spacetime level, and accompanies the massive Schwarzschild sector for any arbitrary mean mass. The average black hole horizon surface area is linearly quantized.

  10. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells, such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column, or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate of the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly, since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary for multimodal distributions. The uncertainty of the multi-valued data can also be interpreted from the number of peaks and the widths of the peaks, as shown by the PDF walls.
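
    The per-cell density estimation and the roughness (peak-count) summary described above can be sketched with standard tools; the bimodal sample is synthetic and the kernel bandwidth is SciPy's default.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde
    from scipy.signal import find_peaks

    rng = np.random.default_rng(0)

    # Possible outcomes at one grid cell: a bimodal set of values
    values = np.concatenate([rng.normal(20.0, 1.5, 400),
                             rng.normal(27.0, 1.0, 200)])

    kde = gaussian_kde(values)             # smooth density estimate (PDF)
    grid = np.linspace(values.min(), values.max(), 512)
    pdf = kde(grid)

    peaks, _ = find_peaks(pdf)             # modes ("bumps") of the PDF
    roughness = len(peaks)                 # number of peaks, per the text
    print(f"roughness = {roughness}; modes near {grid[peaks]}")
    ```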

  11. Uncertainty relations as Hilbert space geometry

    NASA Technical Reports Server (NTRS)

    Braunstein, Samuel L.

    1994-01-01

    Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds on the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position and momentum); we show how this can be understood simply in terms of Hilbert space geometry. Another striking feature of these bounds on parameter uncertainty is that, for a large enough number of repetitions of the measurements, all quantum states are 'minimum uncertainty' states, not just Gaussian wave packets. Thus, these bounds tell us what precision is achievable, as well as merely what is allowed.

  12. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  13. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper [2] introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields [2]. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta": spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems, such as the viewing screen, by collapsing into an atom instantaneously and randomly, in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  14. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represents the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).

  15. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  16. The maintenance of uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    Contents: Introduction; Preliminaries; State-space dynamics; Linearized dynamics of infinitesimal uncertainties; Instantaneous infinitesimal dynamics; Finite-time evolution of infinitesimal uncertainties; Lyapunov exponents and predictability; The Baker's apprentice map; Infinitesimals and predictability; Dimensions; The Grassberger-Procaccia algorithm; Towards a better estimate from Takens' estimators; Space-time-separation diagrams; Intrinsic limits to the analysis of geometry; Takens' theorem; The method of delays; Noise; Prediction, prophecy, and pontification; Introduction; Simulations, models and physics; Ground rules; Data-based models: dynamic reconstructions; Analogue prediction; Local prediction; Global prediction; Accountable forecasts of chaotic systems; Evaluating ensemble forecasts; The annulus; Prophecies; Aids for more reliable nonlinear analysis; Significant results: surrogate data, synthetic data and self-deception; Surrogate data and the bootstrap; Surrogate predictors: Is my model any good?; Hints for the evaluation of new techniques; Avoiding simple straw men; Feasibility tests for the identification of chaos; On detecting "tiny" data sets; Building models consistent with the observations; Cost functions; ε-shadowing: Is my model any good? (reprise); Casting infinitely long shadows (out-of-sample); Distinguishing model error and system sensitivity; Forecast error and model sensitivity; Accountability; Residual predictability; Deterministic or stochastic dynamics?; Using ensembles to distinguish the expectation from the expected; Numerical Weather Prediction; Probabilistic prediction with a deterministic model; The analysis; Constructing and interpreting ensembles; The outlook(s) for today; Conclusion; Summary.

  17. Uncertainty in Wildfire Behavior

    NASA Astrophysics Data System (ADS)

    Finney, M.; Cohen, J. D.

    2013-12-01

    The challenge of predicting or modeling fire behavior is well recognized by the scientists and managers who attempt predictions of fire spread rate or growth. At the scale of the spreading fire, the uncertainty in winds, moisture, fuel structure, and fire location makes accurate predictions difficult, and the non-linear response of fire spread to these conditions means that average behavior is poorly represented by average environmental parameters. Even more difficult are estimations of threshold behaviors (e.g. spread/no-spread, crown fire initiation, ember generation and spotting), because the fire responds as a step function to small changes in one or more environmental variables, translating into dynamical feedbacks and unpredictability. Recent research shows that ignition of fuel particles, itself a threshold phenomenon, depends on flame contact, which is neither steady nor uniform. Recent studies of flame structure in both spreading and stationary fires reveal that much of the non-steadiness of the flames as they contact fuel particles results from buoyant instabilities that produce quasi-periodic flame structures. With fuel particle ignition produced by time-varying heating and short-range flame contact, future improvements in fire behavior modeling will likely require statistical approaches to deal with the uncertainty at all scales, including the level of heat transfer, the fuel arrangement, and the weather.
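
    The claim that "average behavior is poorly represented by average environmental parameters" follows directly from a thresholded, nonlinear response. A hedged sketch (the response function and wind statistics below are hypothetical, not the authors' model):

```python
import numpy as np

# Hypothetical threshold response: no spread below a critical wind u_c,
# nonlinear growth above it. Numbers are illustrative only.
rng = np.random.default_rng(1)

def spread_rate(u, u_c=4.0, k=0.8):
    """Stand-in spread rate (m/min) as a step-like function of wind (m/s)."""
    return k * np.clip(u - u_c, 0.0, None) ** 1.5

wind = rng.normal(loc=3.5, scale=1.5, size=100_000)   # gusty wind samples

print("spread at the MEAN wind:   ", spread_rate(wind.mean()))
print("MEAN spread over the gusts:", spread_rate(wind).mean())
# The mean wind sits below threshold (zero spread), yet gusts above u_c
# give a nonzero average spread: mean inputs misrepresent mean behavior.
```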

  18. The precautionary principle and ecological hazards of genetically modified organisms.

    PubMed

    Giampietro, Mario

    2002-09-01

    This paper makes three points relevant to the application of the precautionary principle to the regulation of GMOs. i) The unavoidable arbitrariness in the application of the precautionary principle reflects a deeper epistemological problem affecting scientific analyses of sustainability. This requires understanding the difference between the concepts of "risk", "uncertainty" and "ignorance". ii) When dealing with evolutionary processes it is impossible to ban uncertainty and ignorance from scientific models. Hence, traditional risk analysis (probability distributions and exact numerical models) becomes powerless. Other forms of scientific knowledge (general principles or metaphors) may be useful alternatives. iii) The existence of ecological hazards per se should not be used as a reason to stop innovations altogether. However, the precautionary principle entails that scientists move away from the concept of "substantive rationality" (trying to indicate to society optimal solutions) to that of "procedural rationality" (trying to help society to find "satisficing" solutions). PMID:12436844

  19. Defending principlism well understood.

    PubMed

    Quante, Michael; Vieth, Andreas

    2002-12-01

    After presenting the current version of principlism, in the process repudiating a widespread deductivist misinterpretation, a fundamental metaethical disagreement is developed by outlining the deductivistic critique of principlism. Once the grounds for this critique have been understood, the dispute between casuistry, deductivism and principlism can be restructured, and the model of "application" proves to be the central difference. In the concluding section it is argued that principlism is the most attractive position if the perceptual model of weak intuitionism is made more explicit. PMID:12607161

  20. Physical principles of hearing

    NASA Astrophysics Data System (ADS)

    Martin, Pascal

    2015-10-01

    The following sections are included: * Psychophysical properties of hearing * The cochlear amplifier * Mechanosensory hair cells * The "critical" oscillator as a general principle of auditory detection * Bibliography

  1. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessments made just after strong earthquakes, when worldwide systems are applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage to the buildings is, by far, not precisely known. The paper analyzes how the reliability of expected loss estimates at regional and global scale is influenced by uncertainties in strong-event parameters determined by alert seismological surveys; by the simulation models used at all stages, from estimating shaking intensity to assessing the damage to different elements at risk; and by the databases on the elements at risk themselves, such as population and building stock distribution, as well as the characteristics of critical facilities.

  2. The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.

    PubMed

    Antonopoulou, Lila; van Meurs, Philip

    2003-11-01

    The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. The initial section concentrates on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not merely contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk even though no proof based on the "cause-effect" model has been produced. The main part of the study examines cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges, including "mad cow" disease and the production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship acting as a mechanism to counteract the material and social conditions that pose risks for human health. PMID:14585517

  3. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally through a proper uncertainty game. We first investigate an uncertainty game between a freely falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract the entanglement generated during the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log₂ c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
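
    For orientation: in entropic uncertainty relations of the Berta et al. type, c is the maximal overlap between eigenbases of the two measured observables, and -log₂ c is the state-independent floor that the abstract says can be effectively violated. A minimal qubit check of the quantity itself (our illustration, not the paper's relativistic calculation):

```python
import numpy as np

# c = max_{i,j} |<x_i|z_j>|^2 over eigenvectors of the two observables;
# the entropic bound is -log2(c). Example: Pauli X and Z on a qubit.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

_, vx = np.linalg.eigh(X)                 # columns = eigenvectors
_, vz = np.linalg.eigh(Z)

c = (np.abs(vx.conj().T @ vz) ** 2).max()
print("c =", c.round(3), " bound -log2(c) =", -np.log2(c))   # 0.5 -> 1 bit
```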

  4. Direct tests of measurement uncertainty relations: what it takes.

    PubMed

    Busch, Paul; Stevens, Neil

    2015-02-20

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform to the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables. PMID:25763941

  5. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  6. Medical decisions under uncertainty.

    PubMed

    Carmi, A

    1993-01-01

    The court applies the criteria of the reasonable doctor and common practice in order to judge the behaviour of a defendant physician. Our demand that the doctor anticipate the implications of his or her acts or omissions means only that, given the present circumstances and the limited knowledge embodied in common practice, the future course of events may be assumed in spite of the fog of uncertainty which surrounds us. The miracles and wonders of creation are concealed from us, and we are not fully aware of the way and the nature of our bodily functioning. There seems, therefore, to be no way to avoid mistakes, because in several cases the correct diagnosis cannot be determined even with the most advanced application of all available information. Doctors find it difficult to admit that they grope in the dark. They wish to form clear and accurate diagnoses for their patients. The fact that their profession is faced with innumerable and unavoidable risks and mistakes is hard to swallow, and many of them claim that in their everyday work this does not happen. They should not content themselves with changing their style; a radical metamorphosis is needed. They should not be tempted to formulate their diagnoses in 'neutral' statements in order to be on the safe side. Uncertainty should be accepted and acknowledged by the profession and by the public at large as a human phenomenon, as an integral part of any human decision, and as a clear characteristic of any legal or medical diagnosis. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:8231694

  7. Uncertainty As Knowledge: Harnessing Ambiguity and Uncertainty into Policy Constraints

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.

    2014-12-01

    There are numerous sources of uncertainty that impact policy decisions relating to climate change: there is scientific uncertainty, as encapsulated, for example, in estimates of climate sensitivity; there is policy uncertainty, which arises when mitigation efforts are erratic or are reversed (as recently happened in Australia); and there is technological uncertainty, which affects the mitigation pathway. How can policy decisions be informed in light of these multiple sources of uncertainty? We propose an "ordinal" approach that relies on comparisons such as "greater than" or "lesser than," which can help sidestep disagreement about specific parameter estimates (e.g., climate sensitivity). To illustrate, recent analyses (Lewandowsky et al., 2014, Climatic Change) have shown that the magnitude of uncertainty about future temperature increases is directly linked with the magnitude of future risk: the greater the uncertainty, the greater the risk of mitigation failure (defined as exceeding a carbon budget for a predetermined threshold). Here we extend this approach to other sources of uncertainty, with a particular focus on "ambiguity" or "second-order" uncertainty, which arises when there is dissent among experts.
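
    The ordinal claim cited above can be illustrated with a few lines of Monte Carlo (all numbers hypothetical): holding the central estimate fixed, widening the uncertainty raises the probability of exceeding a fixed threshold that lies above the mean, whatever the exact figures are.

```python
import numpy as np

# Hypothetical numbers throughout: fixed central warming estimate, three
# widening uncertainty levels, one fixed threshold above the mean.
rng = np.random.default_rng(2)
mean_warming, threshold = 2.0, 3.0   # degrees C

for sigma in (0.3, 0.6, 1.2):        # increasing (second-order) uncertainty
    samples = rng.normal(mean_warming, sigma, 1_000_000)
    p_exceed = (samples > threshold).mean()
    print(f"sigma = {sigma}: P(exceed {threshold} C) = {p_exceed:.3f}")
```

    The ranking across the three sigmas is stable even if the assumed distribution changes, which is what makes the "greater than" comparison robust to disputes over parameter values.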

  8. Principles of learning.

    PubMed

    Voith, V L

    1986-12-01

    This article discusses some general principles of learning as well as possible constraints and how such principles can apply to horses. A brief review is presented of experiments that were designed to assess learning in horses. The use of behavior modification techniques to treat behavior problems in horses is discussed and several examples of the use of these techniques are provided. PMID:3492241

  10. Principled Grammar Teaching

    ERIC Educational Resources Information Center

    Batstone, Rob; Ellis, Rod

    2009-01-01

    A key aspect of the acquisition of grammar for second language learners involves learning how to make appropriate connections between grammatical forms and the meanings which they typically signal. We argue that learning form/function mappings involves three interrelated principles. The first is the Given-to-New Principle, where existing world…

  11. Evaluation of measurement uncertainty based on Bayesian information fusion

    NASA Astrophysics Data System (ADS)

    Wang, Shan; Chen, Xiaohuai; Yang, Qiao

    2013-10-01

    This paper proposes a new method for evaluating measurement uncertainty that takes account of both historical records and current data. Using Bayesian statistical principles, the prior distribution provided by the records is combined with the current data to form a posterior distribution. Estimates of the statistical parameters are derived from the posterior distribution, yielding an uncertainty formula that combines the advantages of Type A and Type B evaluation. Simulation and verification show that this method has clear advantages over the alternatives, especially for the analysis of small data sets.
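
    A minimal conjugate-normal sketch of the idea (the specific formulas and numbers here are our assumptions, not necessarily the authors'): a prior from historical records, playing the role of Type B knowledge, is fused with a small current sample, the Type A side, and the posterior standard deviation serves as the combined uncertainty.

```python
import numpy as np

# Conjugate normal-normal fusion (all numbers hypothetical):
# prior N(mu0, tau0^2) from historical records, known noise sigma per point.
mu0, tau0 = 10.00, 0.05                   # prior mean and std (records)
sigma = 0.08                              # assumed measurement noise std
data = np.array([10.06, 10.11, 10.04])    # small current sample

n, xbar = len(data), data.mean()
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + n * xbar / sigma**2)
print(f"posterior mean = {post_mean:.4f}, combined std = {post_var**0.5:.4f}")
# The posterior std acts as the combined (Type A + Type B) standard
# uncertainty; with only three points the prior still carries weight.
```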

  12. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics and discuss how each is quantified in current efforts.

  13. Parameter uncertainty for ASP models

    SciTech Connect

    Knudsen, J.K.; Smith, C.L.

    1995-10-01

    The steps involved in incorporating parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that comprise more than one component, which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be used to propagate parameter uncertainty for event assessments.
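
    As a hedged sketch of how such lognormal parameters are commonly used (the convention below, a median plus an error factor EF = 95th percentile/median, is widespread in PRA practice, but the report's exact parameterization may differ), parameter uncertainty can be propagated through a supercomponent by Monte Carlo:

```python
import numpy as np

# Hedged PRA-style sketch (not the report's numbers): a lognormal basic
# event is given by a median m and an error factor EF (95th pct / median),
# so sigma = ln(EF)/1.645. A "supercomponent" is modeled as the OR of two
# failure modes, and parameter uncertainty is propagated by Monte Carlo.
rng = np.random.default_rng(3)

def lognormal_event(median, ef, n):
    sigma = np.log(ef) / 1.645              # 1.645 = z-score of the 95th pct
    return rng.lognormal(np.log(median), sigma, n)

p1 = lognormal_event(1e-3, 3.0, 100_000)    # failure mode 1 (hypothetical)
p2 = lognormal_event(5e-4, 10.0, 100_000)   # failure mode 2 (hypothetical)
p_super = 1.0 - (1.0 - np.minimum(p1, 1.0)) * (1.0 - np.minimum(p2, 1.0))

print("mean:", p_super.mean())
print("5th/50th/95th percentiles:", np.percentile(p_super, [5, 50, 95]))
```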

  14. Mach's holographic principle

    SciTech Connect

    Khoury, Justin; Parikh, Maulik

    2009-10-15

    Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.

  15. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation and of flow outside the lubrication-equation regime, friction and wear, and seal lubrication regimes are explained.

  16. Participatory Development Principles and Practice: Reflections of a Western Development Worker.

    ERIC Educational Resources Information Center

    Keough, Noel

    1998-01-01

    Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

  17. Uncertainty analysis of thermoreflectance measurements

    NASA Astrophysics Data System (ADS)

    Yang, Jia; Ziade, Elbara; Schmidt, Aaron J.

    2016-01-01

    We derive a generally applicable formula to calculate the precision of multi-parameter measurements that apply least squares algorithms. This formula, which accounts for experimental noise and uncertainty in the controlled model parameters, is then used to analyze the uncertainty of thermal property measurements with pump-probe thermoreflectance techniques. We compare the uncertainty of time domain thermoreflectance and frequency domain thermoreflectance (FDTR) when measuring bulk materials and thin films, considering simultaneous measurements of various combinations of thermal properties, including thermal conductivity, heat capacity, and thermal boundary conductance. We validate the uncertainty analysis using Monte Carlo simulations on data from FDTR measurements of an 80 nm gold film on fused silica.
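
    The standard noise-only core of such a precision formula is the covariance sigma^2 (J^T J)^(-1) built from the model Jacobian; the paper's formula additionally propagates uncertainty in the controlled model parameters, which the sketch below (ours; model and numbers hypothetical) omits:

```python
import numpy as np

# Sketch of the noise-only core of a least-squares precision estimate:
# Cov(theta) ~ sigma^2 (J^T J)^{-1}, with J the model Jacobian at the fit.
# Uncertainty in controlled parameters (part of the paper's full formula)
# is omitted here. Model and numbers are hypothetical.
def fit_covariance(model, theta, t, noise_sigma, eps=1e-6):
    """Numerical Jacobian -> parameter covariance for white noise."""
    theta = np.asarray(theta, dtype=float)
    J = np.empty((len(t), len(theta)))
    for k in range(len(theta)):
        dt = np.zeros_like(theta)
        dt[k] = eps
        J[:, k] = (model(t, theta + dt) - model(t, theta - dt)) / (2 * eps)
    return noise_sigma**2 * np.linalg.inv(J.T @ J)

# Example: a thermal decay y = A * exp(-t / tau), with 1% absolute noise
model = lambda t, th: th[0] * np.exp(-t / th[1])
t = np.linspace(0.0, 5.0, 50)
cov = fit_covariance(model, [1.0, 2.0], t, noise_sigma=0.01)
print("std(A), std(tau):", np.sqrt(np.diag(cov)))
```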

  18. Evaluating uncertainty in simulation models

    SciTech Connect

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist; however, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.

  19. Uncertainty and Anticipation in Anxiety

    PubMed Central

    Grupe, Dan W.; Nitschke, Jack B.

    2014-01-01

    Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact, and thus results in anxiety. Here, we view the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations to the neural instantiation of these processes result in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199

  20. Climate model uncertainty versus conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-09-01

    Projections of climate change impact are associated with a cascade of uncertainties, including those in CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors, including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projections of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.
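
    The separation of the two sources can be sketched with a simple two-way decomposition over the 6 x 11 ensemble (synthetic numbers below; the study's actual metrics are model outputs, not random draws):

```python
import numpy as np

# Synthetic stand-in for the 6 geologies x 11 climate models ensemble:
# a geology effect is deliberately made dominant, as the study found for
# travel time and capture zones (all magnitudes hypothetical).
rng = np.random.default_rng(4)
geo = rng.normal(0.0, 2.0, size=(6, 1))     # geology effects
cli = rng.normal(0.0, 0.5, size=(1, 11))    # climate-model effects
change = geo + cli + rng.normal(0.0, 0.2, size=(6, 11))

geology_means = change.mean(axis=1)         # average out the climates
climate_means = change.mean(axis=0)         # average out the geologies
print("spread across geologies:     ", geology_means.std(ddof=1))
print("spread across climate models:", climate_means.std(ddof=1))
```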

  1. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-04-01

    Projections of climate change impact are associated with a cascade of uncertainties, including the CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors, including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.

  2. Uncertainty analysis in RECCAP

    NASA Astrophysics Data System (ADS)

    Enting, I. G.

    2010-12-01

    The Global Carbon Project RECCAP exercise aims to produce regional analyses of net carbon fluxes between the atmosphere and the land and ocean carbon systems. The project aims to synthesise multiple sources of information from modelling, inversions and inventory studies. A careful analysis of uncertainty is essential, both for the final synthesis and for assuring consistency in the process of combining disparate inputs. A unifying approach is to treat the overall analysis as a process of statistical estimation. The broadest-scale grouping of approaches is 'top-down' vs. 'bottom-up' techniques, but each of these needs to be further partitioned. Top-down approaches generally take the form of inversions, using measurements of carbon dioxide concentrations either to deduce surface fluxes or to deduce parameters in spatially explicit process-based models. These two types of inversion will have somewhat different statistical characteristics, but each will achieve only limited spatial resolution due to the ill-conditioned nature of the inversion. Bottom-up techniques aim to resolve great spatial detail. They comprise both census-type studies (mainly for anthropogenic emissions) and modelling studies with remotely sensed data to provide spatially and temporally explicit forcing or constraints. Again, these two types of approach are likely to have quite different statistical characteristics. An important issue in combining information is consistency between the definitions used for the disparate components. Cases where there is significant potential for ambiguity include wildfire and delayed responses to land-use change. A particular concern is the potential for 'double counting' when combining bottom-up estimates with the results of inversion techniques that have incorporated Bayesian constraints using the same data as the bottom-up estimates. The communication of the distribution of uncertainty in one time and two space dimensions poses particular challenges. Temporal variability can be usefully characterised in terms of long-term trends, seasonal cycles and irregular variability. Additional choices need to be made concerning the frequency ranges that define each of these components. Spatial resolution remains problematic, with the diffuse boundaries of top-down approaches failing to match the sharp boundaries from bottom-up techniques.

  3. Group environmental preference aggregation: the principle of environmental justice

    SciTech Connect

    Davos, C.A.

    1986-01-01

    The aggregation of group environmental preference presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as environmental justice, is established based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be so decided that their supporters will least mind being anyone at random in the new environment. The application of the principle is also discussed; its only information requirement is a ranking of alternative choices by each interested party. 25 references.

  4. Physical principles in quantum field theory and in covariant harmonic oscillator formalism

    SciTech Connect

    Han, D.; Kim, Y.S.; Noz, M.E.

    1981-12-01

    It is shown that both covariant harmonic oscillator formalism and quantum field theory are based on common physical principles which include Poincaré covariance, Heisenberg's space-momentum uncertainty relation, and Dirac's "C-number" time-energy uncertainty relation. It is shown in particular that the oscillator wave functions are derivable from the physical principles which are used in the derivation of the Klein-Nishina formula.

  5. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, wave patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  6. [Bioethics of principles].

    PubMed

    Pérez-Soba Díez del Corral, Juan José

    2008-01-01

    Bioethics emerges around the technological problems of acting on human life, and with it the problem of determining moral limits, since these seem exterior to the practice itself. The bioethics of principles takes its rationality from teleological thinking and from autonomism. This divergence manifests the epistemological fragility and the great difficulty of moral thinking. This is evident in the formulation of the principle of autonomy, which lacks the ethical content of Kant's proposal. We need a new ethical rationality, with a fresh reflection on principles that emerge from basic ethical experiences. PMID:18402229

  7. Multi-band pyrometer uncertainty analysis and improvement

    NASA Astrophysics Data System (ADS)

    Yang, Yongjun; Zhang, Xuecong; Cai, Jing; Wang, Zhongyu

    2010-12-01

    From the ratio of the energies radiated in multiple bands by the measured surface, a multi-band pyrometer can calculate the 'true' temperature. A multi-band pyrometer has many advantages: it is hardly affected by the emissivity of the measured surface or by environmental radiation, and it has a higher signal-to-noise ratio and higher temperature measurement accuracy. This paper introduces the principle of a multi-band pyrometer, and the uncertainty of the measurement result is evaluated using the Monte Carlo Method (MCM). The results show that the accuracy of the effective wavelength is the largest source of uncertainty, and the other main source is the reference temperature. When an ordinary blackbody furnace with continuous temperature control is used to provide the reference temperature and calibrate the effective wavelength, these two uncertainty components are 2.17 K and 2.48 K, respectively, and the combined standard uncertainty is 3.30 K. A new calibration method is introduced: the effective wavelength is calibrated with a monochromator, and the reference temperature is provided by a fixed-point blackbody furnace. The uncertainty components then decrease to 0.73 K and 0.12 K, respectively, and the measurement uncertainty decreases to 0.74 K. The temperature measurement accuracy is thereby enhanced.
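
    In the spirit of the paper's MCM evaluation, the sketch below (our construction; two bands, Wien approximation, hypothetical wavelengths and uncertainties) propagates effective-wavelength uncertainty into the inferred two-band ratio temperature:

```python
import numpy as np

# Monte-Carlo sketch: effective-wavelength uncertainty -> temperature
# uncertainty for a two-band (ratio) pyrometer in the Wien approximation.
# Wavelengths, temperature, and the 1 nm standard uncertainty are all
# hypothetical, chosen only to show the mechanics of the MCM evaluation.
rng = np.random.default_rng(5)
C2 = 1.4388e-2                      # second radiation constant (m K)
l1, l2 = 0.65e-6, 0.90e-6           # nominal effective wavelengths (m)
T_true = 1500.0                     # kelvin

# Band ratio an ideal instrument would record at T_true:
ratio = (l2 / l1) ** 5 * np.exp((C2 / T_true) * (1.0 / l2 - 1.0 / l1))

def invert(r, lam1, lam2):
    """Solve the Wien-approximation ratio equation for temperature."""
    return C2 * (1.0 / lam2 - 1.0 / lam1) / (np.log(r) - 5.0 * np.log(lam2 / lam1))

n = 100_000
lam1 = rng.normal(l1, 1e-9, n)      # 1 nm effective-wavelength uncertainty
lam2 = rng.normal(l2, 1e-9, n)
T = invert(ratio, lam1, lam2)
print(f"std(T) from wavelength uncertainty alone: {T.std():.2f} K")
```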

  8. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty of digital image quantification in images. Statisticians are refining these numbers as part of a UQ effort.

  10. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  11. Housing Uncertainty and Childhood Impatience

    ERIC Educational Resources Information Center

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  12. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  13. Hydrology, society, change and uncertainty

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, Demetris

    2014-05-01

    Heraclitus, who asserted that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial, and decision making under certainty is mostly trivial as well. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure, and credulous toward a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is compounded by a misconception in the scientific community confusing science with the elimination of uncertainty. However, recognizing that uncertainty is inevitable and tightly connected with change will help us appreciate the positive sides of both. Uncertainty thus becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

  14. Planning ATES systems under uncertainty

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations, largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans, which take into account the collective arrangement of multiple systems, have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions form a complex adaptive system, for which agent-based modelling provides a useful analysis framework. This study therefore explores the interactions between endogenous ATES adoption processes and the relative performance of different planning schemes, using an agent-based adoption model coupled with a hydrologic model of the subsurface. The models are parameterized to simulate typical operating conditions for ATES systems in a dense urban area. Furthermore, uncertainties relating to planning parameters, adoption processes, and climatic conditions are explicitly considered using exploratory modelling techniques. Results are therefore presented for the performance of different planning policies over a broad range of plausible scenarios.

  15. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  16. Global ethics and principlism.

    PubMed

    Gordon, John-Stewart

    2011-09-01

    This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together. PMID:22073817

  17. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  18. Energy: Principles, problems, alternatives

    SciTech Connect

    Priest, J.

    1984-01-01

    This book is a relatively elementary pedagogical work designed to teach physical principles as they relate to the societal uses of energy. Contents, abridged: Physical basis for energy. Fossil fuels. Electric energy. Nuclear power. Solar energy. Other energy systems. Index.

  19. Psychotherapy: some guiding principles.

    PubMed

    Haas, W M

    1997-01-01

    A veteran therapist reflects on his first 50 years of practice and underlines some important principles that can contribute to successful therapeutic outcomes: (1) Know Yourself; (2) Know Your Freud; (3) Protect Privacy and Your Principles; (4) Develop Your Own Style; (5) Utilize Here-and-Now Family Experience; (6) Connect with the Covert Inner Child; (7) Appreciate Eclectic Options; (8) Help Clients Get Beyond Their Anger; (9) Face Mortality; (10) Enjoy Your Practice. PMID:9470964

  20. How uncertainty bounds the shape index of simple cells.

    PubMed

    Barbieri, D; Citti, G; Sarti, A

    2014-01-01

    We propose a theoretical motivation to quantify actual physiological features, such as the shape index distributions measured by Jones and Palmer in cats and by Ringach in macaque monkeys. We will adopt the uncertainty principle associated with the task of detection of position and orientation as the main tool to provide quantitative bounds on the family of simple cells concretely implemented in primary visual cortex. Mathematics Subject Classification (2010): 62P10, 43A32, 81R15. PMID:24742044
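
    As a one-dimensional illustration of the bound being invoked (the paper's actual analysis couples position and orientation detection on the group SE(2); the simplified check below is ours), a Gaussian profile saturates the classical position-frequency uncertainty product:

```python
import numpy as np

# 1-D check (ours): a Gaussian-windowed profile attains the minimum of the
# classical position-frequency uncertainty product, dx * dk = 1/2.
x = np.linspace(-20.0, 20.0, 1 << 14)
dx = x[1] - x[0]
sigma = 1.3
psi = np.exp(-x**2 / (2.0 * sigma**2))         # Gaussian window

p = np.abs(psi) ** 2
p /= p.sum() * dx                              # normalized position density
var_x = (p * x**2).sum() * dx                  # centered second moment

k = 2.0 * np.pi * np.fft.fftfreq(x.size, d=dx)
phi = np.fft.fft(psi)
q = np.abs(phi) ** 2
dk = k[1] - k[0]
q /= q.sum() * dk                              # normalized frequency density
var_k = (q * k**2).sum() * dk

print("uncertainty product:", np.sqrt(var_x * var_k))   # ~ 0.5
```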

  2. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.

  3. Estimating uncertainties in watershed studies

    NASA Astrophysics Data System (ADS)

    Campbell, John; Yanai, Ruth; Green, Mark

    2011-06-01

    Quantifying Uncertainty in Ecosystem Studies (QUEST) Workshop: Uncertainty in Hydrologic Fluxes of Elements at the Small Watershed Scale; Boston, Massachusetts, 14-15 March 2011; Small watersheds have been used widely to quantify chemical fluxes and cycling in terrestrial ecosystems for about the past half century. The small watershed approach has been valuable in characterizing hydrologic and nutrient budgets, for instance, in estimating the net gain or loss of solutes in response to disturbance. However, the uncertainty in these ecosystem budget calculations is generally ignored. Without uncertainty estimates in watershed studies, it is difficult to evaluate the significance of observed differences between watersheds or changes in budgets over time, and erroneous conclusions may be drawn. The historical lack of attention given to uncertainty has been due at least in part to the lack of appropriate analytical tools and approaches. The issue of uncertainty has been confronted more rigorously in other disciplines, yet the advances made have not been comprehensively applied to biogeochemical input-output budgets. In recent years, there has been growing recognition that estimates of uncertainty are essential for coming to sound scientific conclusions, identifying which budget components most need improvement, and developing more efficient monitoring strategies, thereby maximizing information gained per unit cost.

  4. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore an alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed, and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
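
    Why pooling many cases beats per-case standard deviations at a median of 3-4 replicates can be seen in a synthetic sketch (our construction with made-up excretion values; the paper's Poisson-lognormal treatment is more elaborate):

```python
import numpy as np

# Synthetic sketch: pooled estimation of a lognormal scatter from many
# small replicate sets. Excretion levels and ln(GSD) = 0.33 are made up;
# the paper pools 1099 measurements over 63 cases and reports 0.31-0.35.
rng = np.random.default_rng(6)
ln_gsd_true = 0.33

# 63 "cases", each with 3 or 4 replicate 24h measurements
cases = [np.exp(np.log(level) + rng.normal(0.0, ln_gsd_true, rng.integers(3, 5)))
         for level in rng.uniform(5.0, 50.0, 63)]

# Pooled log-scale variance: sums of squares over pooled degrees of freedom
ss = sum(((np.log(c) - np.log(c).mean()) ** 2).sum() for c in cases)
dof = sum(len(c) - 1 for c in cases)
print("pooled ln(GSD) estimate:", np.sqrt(ss / dof))
# A per-case std with n = 3-4 is itself very noisy, which is why a pooled
# (or fully Bayesian) treatment is preferred.
```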

  5. Principles for electric power policy

    SciTech Connect

    Not Available

    1984-01-01

    Principles for Electric Power Policy analyzes the probable interactions of future electrical power developments and governmental policies. This book demonstrates that the role of electric power will be even larger in America's future than in its past; that the uncertainties that currently hamper electric planning will continue for at least a decade; and that the single most efficient means of achieving the policy flexibility necessary in such an uncertain environment is a renewed commitment to both technological and non-technological "electric power" research. The book presents six scenarios for future growth in electric power, based on various assumptions about society and various rates of growth in GNP and in energy demand. Six background chapters summarize relevant aspects of various fields that affect electric power; they also direct the reader to references from which highly detailed information and technical support may be drawn. The concluding chapter provides a historical perspective on the evolution of electric power. The principal appendix presents the complete results of an expert opinion survey covering a broad range of economic, environmental, demographic, and energy-related subjects. Its bibliography is a rich source of references to both historical and technical data.

  6. Uncertainty of testing methods--what do we (want to) know?

    PubMed

    Paparella, Martin; Daneshian, Mardas; Hornek-Gausterer, Romana; Kinzl, Maximilian; Mauritz, Ilse; Mühlegger, Simone

    2013-01-01

    It is important to stimulate innovation for regulatory testing methods. Scrutinizing the knowledge of the (un)certainty of data from current standard in vivo methods could foster interest in new testing approaches. Since standard in vivo data often are used as reference data for model development, improved accounting of uncertainty also would support the validation of new in vitro and in silico methods, as well as the definition of acceptance criteria for the new methods. Hazard and risk estimates that are transparent about their uncertainty could further support the 3Rs, since they may help focus additional information requirements on the aspects of highest uncertainty. Here we provide an overview of the various types of uncertainty in quantitative and qualitative terms and suggest improving this knowledge base. We also reference principal concepts for how to use uncertainty information for improved hazard characterization and the development of new testing methods. PMID:23665803

  7. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  9. Uncertainty relations for characteristic functions

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz; Tasca, D. S.; Walborn, S. P.

    2016-02-01

    We present the uncertainty relation for the characteristic functions (ChUR) of the quantum mechanical position and momentum probability distributions. This inequality is more general than the Heisenberg uncertainty relation and is saturated in two extreme cases for wave functions described by periodic Dirac combs. We further discuss a broad spectrum of applications of the ChUR; in particular, we constrain quantum optical measurements involving general detection apertures and provide the uncertainty relation that is relevant for loop quantum cosmology. A method to measure the characteristic function directly using an auxiliary qubit is also briefly discussed.

  10. The traveltime holographic principle

    NASA Astrophysics Data System (ADS)

    Huang, Yunsong; Schuster, Gerard T.

    2015-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the 'traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.

  11. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.
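
    A minimal sketch of the disparity-statistics idea (the function and numbers below are illustrative, not taken from the paper): within one interrogation window, the mean disparity estimates the systematic error, the dispersion estimates the random error reduced by the number of matched pairs, and one simple way to combine the two is in quadrature.

        import numpy as np

        def disparity_uncertainty(disparities):
            # disparities: (N, 2) residual distances in pixels between matched
            # particle images within one interrogation window.
            d = np.asarray(disparities, dtype=float)
            bias = d.mean(axis=0)             # mean disparity -> systematic error
            sigma = d.std(axis=0, ddof=1)     # dispersion -> random error
            # One simple combination for the window-averaged displacement:
            return np.sqrt(bias**2 + (sigma / np.sqrt(len(d)))**2)

        rng = np.random.default_rng(1)
        pairs = rng.normal([0.05, 0.0], 0.1, size=(20, 2))   # synthetic pairs
        print(disparity_uncertainty(pairs))                  # per-component, px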

  12. Non-scalar uncertainty: Uncertainty in dynamic systems

    NASA Technical Reports Server (NTRS)

    Martinez, Salvador Gutierrez

    1992-01-01

    The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, whether the unavoidable quantum uncertainty that arises at sufficiently small scales, uncertainty deliberately admitted by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with using ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework that gives insight into the nature of uncertainty and provides a practical tool in those cases in which scalar uncertainty is not enough, such as the study of highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology; applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of quantum mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful for representing dynamic systems with models that are closer to reality and have relatively simpler solutions. It was found wiser to obtain an approximate solution to an accurate model than a precise solution to a model constrained by simplifying assumptions: precision has a very heavy cost in present physical models, but this formalism allows a trade between uncertainty and simplicity. It was also found that modeling reality sometimes requires that state transition probabilities be manipulated as non-scalar quantities, with the result that there is always a transformation to get back to scalar probability.

  13. The Bayesian brain: phantom percepts resolve sensory uncertainty.

    PubMed

    De Ridder, Dirk; Vanneste, Sven; Freeman, Walter

    2014-07-01

    Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises why the brain creates these false percepts in the absence of an external stimulus. The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they do not warrant conscious processing. Unpredictable things, on the other hand, are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience and stress and essential for stimulus detection. Together with sensory cortex hyperactivity and decreased inhibition or map plasticity, this results in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation. In conclusion, the Bayesian updating of knowledge via active sensory exploration of the environment, driven by the Shannonian free-energy principle, provides an explanation for the generation of phantom percepts, as a way to reduce uncertainty, to make sense of the world. PMID:22516669

  14. Climate targets: Values and uncertainty

    NASA Astrophysics Data System (ADS)

    Lempert, Robert J.

    2015-10-01

    Policymakers know that the risks associated with climate change mean they need to cut greenhouse-gas emissions. But uncertainty surrounding the likelihood of different scenarios makes choosing specific policies difficult.

  15. Uncertainty Relation for Smooth Entropies

    NASA Astrophysics Data System (ADS)

    Tomamichel, Marco; Renner, Renato

    2011-03-01

    Uncertainty relations give upper bounds on the accuracy by which the outcomes of two incompatible measurements can be predicted. While established uncertainty relations apply to cases where the predictions are based on purely classical data (e.g., a description of the system's state before measurement), an extended relation which remains valid in the presence of quantum information has been proposed recently [Berta et al., Nature Phys. 6, 659 (2010)]. Here, we generalize this uncertainty relation to one formulated in terms of smooth entropies. Since these entropies measure operational quantities such as extractable secret key length, our uncertainty relation is of immediate practical use. To illustrate this, we show that it directly implies security of quantum key distribution protocols. Our security claim remains valid even if the implemented measurement devices deviate arbitrarily from the theoretical model.

  16. Uncertainty relation for smooth entropies.

    PubMed

    Tomamichel, Marco; Renner, Renato

    2011-03-18

    Uncertainty relations give upper bounds on the accuracy by which the outcomes of two incompatible measurements can be predicted. While established uncertainty relations apply to cases where the predictions are based on purely classical data (e.g., a description of the system's state before measurement), an extended relation which remains valid in the presence of quantum information has been proposed recently [Berta et al., Nature Phys. 6, 659 (2010)]. Here, we generalize this uncertainty relation to one formulated in terms of smooth entropies. Since these entropies measure operational quantities such as extractable secret key length, our uncertainty relation is of immediate practical use. To illustrate this, we show that it directly implies security of quantum key distribution protocols. Our security claim remains valid even if the implemented measurement devices deviate arbitrarily from the theoretical model. PMID:21469854

  17. Uncertainty analysis of thermoreflectance measurements.

    PubMed

    Yang, Jia; Ziade, Elbara; Schmidt, Aaron J

    2016-01-01

    We derive a generally applicable formula to calculate the precision of multi-parameter measurements that apply least squares algorithms. This formula, which accounts for experimental noise and uncertainty in the controlled model parameters, is then used to analyze the uncertainty of thermal property measurements with pump-probe thermoreflectance techniques. We compare the uncertainty of time domain thermoreflectance and frequency domain thermoreflectance (FDTR) when measuring bulk materials and thin films, considering simultaneous measurements of various combinations of thermal properties, including thermal conductivity, heat capacity, and thermal boundary conductance. We validate the uncertainty analysis using Monte Carlo simulations on data from FDTR measurements of an 80 nm gold film on fused silica. PMID:26827342
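
    The Monte Carlo validation step generalizes readily; here is a sketch under a hypothetical two-parameter decay model standing in for the real thermal model (model, values, and noise level are all illustrative): perturb synthetic data at the experimental noise level, refit each realization by least squares, and read the precision off the spread of the fitted parameters.

        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, a, tau):
            # Hypothetical two-parameter decay standing in for the thermal model.
            return a * np.exp(-t / tau)

        t = np.linspace(0.1, 5.0, 50)        # pump-probe delay (arbitrary units)
        clean = model(t, 1.0, 1.5)           # "true" parameters a=1.0, tau=1.5

        noise, trials = 0.02, 1000           # noise level and MC repetitions
        rng = np.random.default_rng(0)
        fits = np.array([curve_fit(model, t,
                                   clean + rng.normal(0, noise, t.size),
                                   p0=(0.8, 1.0))[0] for _ in range(trials)])
        print("Monte Carlo std dev of (a, tau):", fits.std(axis=0, ddof=1))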

  18. Visualizing uncertainty about the future.

    PubMed

    Spiegelhalter, David; Pearson, Mike; Short, Ian

    2011-09-01

    We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge. PMID:21903802

  19. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  20. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is that of applied statistics.
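
    For the sensitivity expectations mentioned above, the commonly quoted textbook forms (assumed here; the report derives its own expressions) are ΔT = T_sys/√(Bτ) for a total-power receiver and twice that for a modulated (Dicke) receiver, which spends half its time on the reference load:

        import math

        def radiometer_dT(t_sys, bandwidth, integration, modulated=False):
            # Ideal sensitivity in kelvin: dT = T_sys / sqrt(B * tau); a Dicke
            # (modulated) receiver pays a factor of 2 for observing the
            # reference load half of the time.
            dt = t_sys / math.sqrt(bandwidth * integration)
            return 2.0 * dt if modulated else dt

        print(radiometer_dT(500.0, 100e6, 1.0))        # total power: ~0.05 K
        print(radiometer_dT(500.0, 100e6, 1.0, True))  # Dicke:       ~0.10 K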

  1. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  2. Principles of quantum electronics

    SciTech Connect

    Marcuse, D.

    1980-01-01

    The principles that govern quantum electronics devices are developed, and their theoretical applications to typical problems are presented. Attention is given to field quantization, interaction between fields and charges, photon emission by 'free' electrons, interaction of bound electrons with radiation, noise and counting statistics, the density matrix method, multiple-photon processes, losses in quantum electronics, and Maxwell's theory as quantum theory of the photon. The principles that are developed are applied to explain the physics of masers, lasers, optical parametric effects, the Raman effect, and the fundamental noise limit of optical detectors.

  3. Principles of Optics

    NASA Astrophysics Data System (ADS)

    Born, Max; Wolf, Emil

    1999-10-01

    Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.

  4. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  5. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  6. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.
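
    One simple way to probe such a hierarchy numerically (a sketch with two hypothetical model structures, not the authors' methodology): propagate parameter and stochastic uncertainty within each candidate structure, then compare the within-model spread with the between-model spread.

        import numpy as np

        rng = np.random.default_rng(3)

        def model_a(theta, z): return theta * np.exp(z)    # candidate structure A
        def model_b(theta, z): return theta * (1.0 + z)    # candidate structure B

        results = {}
        for name, m in [("A", model_a), ("B", model_b)]:
            out = []
            for _ in range(500):
                theta = rng.normal(1.0, 0.1)   # parameter uncertainty
                z = rng.normal(0.0, 0.2)       # stochastic variability
                out.append(m(theta, z))
            results[name] = np.array(out)

        within = np.mean([v.var() for v in results.values()])
        between = np.var([v.mean() for v in results.values()])
        print(f"within-model variance: {within:.4f}, between-model: {between:.4f}")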

  7. Uncertainty in perception and the Hierarchical Gaussian Filter

    PubMed Central

    Mathys, Christoph D.; Lomakina, Ekaterina I.; Daunizeau, Jean; Iglesias, Sandra; Brodersen, Kay H.; Friston, Karl J.; Stephan, Klaas E.

    2014-01-01

    In its full sense, perception rests on an agent's model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the Hierarchical Gaussian Filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF's hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder–Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient—but at the same time intuitive—framework for the resolution of perceptual uncertainty in behaving agents. PMID:25477800
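
    The update mechanism that the HGF stacks hierarchically can be illustrated with a minimal single-level Gaussian belief update (a Kalman-style sketch, not the actual HGF equations; all numbers are illustrative): the posterior mean moves by a prediction error weighted by the relative precision of the new datum.

        def gaussian_update(mu, pi_prior, y, pi_obs):
            # One Bayesian update of a Gaussian belief (precision = 1/variance):
            # the mean moves by a precision-weighted prediction error.
            pi_post = pi_prior + pi_obs
            weight = pi_obs / pi_post          # in [0, 1]: trust in the new datum
            return mu + weight * (y - mu), pi_post

        mu, pi = 0.0, 1.0                      # initial belief
        for y in [0.8, 1.1, 0.9, 1.2]:         # observations with precision 4
            mu, pi = gaussian_update(mu, pi, y, 4.0)
        print(mu, pi)                          # belief after four updates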

  8. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  9. PRINCIPLES OF MODELLING

    EPA Science Inventory

    The scope of modelling the behavior of pollutants in the aquatic environment is now immense. In many practical applications, there are effectively no computational constraints on what is possible. There is accordingly an increasing need for a set of principles of modelling that in ...

  10. Matters of Principle.

    ERIC Educational Resources Information Center

    Martz, Carlton

    1999-01-01

    This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

  11. Pattern recognition principles

    NASA Technical Reports Server (NTRS)

    Tou, J. T.; Gonzalez, R. C.

    1974-01-01

    The present work gives an account of basic principles and available techniques for the analysis and design of pattern processing and recognition systems. Areas covered include decision functions, pattern classification by distance functions, pattern classification by likelihood functions, the perceptron and the potential function approaches to trainable pattern classifiers, statistical approach to trainable classifiers, pattern preprocessing and feature selection, and syntactic pattern recognition.

  13. Principles of Cancer Screening.

    PubMed

    Pinsky, Paul F

    2015-10-01

    Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike. PMID:26315516

  14. Business Principles 201.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education, Winnipeg.

    This teaching guide consists of guidelines for conducting a secondary-level course on business principles. Intended as part of an office skills or accounting/data processing program, the course provides the management viewpoint toward the planning and operation of a business. First, the goals and objectives of the course are outlined. Provided…

  15. STANDARD SETTING PRINCIPLES

    EPA Science Inventory

    The basis for setting drinking water standards has not changed much in principle during the past decade, but the procedure for creating them in an open manner has caused the United States, at least, to go through a much more elaborate process to obtain approval and support from t...

  16. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  17. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

  19. PRINCIPLES OF WATER FILTRATION

    EPA Science Inventory

    This paper reviews principles involved in the processes commonly used to filter drinking water for public water systems. The most common approach is to chemically pretreat water and filter it through a deep (2-1/2 to 3 ft) bed of granular media (coal or sand or combinations of th...

  20. The Idiom Principle Revisited

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  1. First Principles of Instruction.

    ERIC Educational Resources Information Center

    Merrill, M. David

    2002-01-01

    Examines instructional design theories and elaborates principles about when learning is promoted, i.e., when learners are engaged in solving real-world problems, when existing knowledge is activated as a foundation for new knowledge, and when new knowledge is demonstrated to the learner, applied by the learner, and integrated into the learner's…

  2. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance of the physical measurements performed with nondestructive assay instruments throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for nonroutine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to a shared understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice, and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.
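
    A minimal GUM-style sketch of an uncertainty budget (component names and values below are hypothetical): independent relative standard uncertainties combine in quadrature, and an expanded uncertainty is reported with a coverage factor.

        import math

        budget = [                      # (component, relative std. uncertainty)
            ("counting statistics",   0.020),
            ("calibration standard",  0.015),
            ("matrix correction",     0.030),
            ("geometry/positioning",  0.010),
        ]

        u_c = math.sqrt(sum(u**2 for _, u in budget))   # quadrature combination
        print(f"combined relative standard uncertainty: {u_c:.3f}")
        print(f"expanded uncertainty (k=2, ~95%): {2 * u_c:.3f}")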

  3. Communicating Uncertainties on Climate Change

    NASA Astrophysics Data System (ADS)

    Planton, S.

    2009-09-01

    The term 'uncertainty' is confusing in common language, since in one of its most usual senses it refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion through the media to a wide public. From its scientific basis to impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Given this diversity of approaches, communicating uncertainties on climate change is a great challenge, further complicated by the diversity of audiences for climate change communication, from stakeholders and policy makers to the wider public. We will present the process chosen by the IPCC for communicating uncertainties in its assessment reports, taking as an example the guidance note for lead authors of the Fourth Assessment Report. Concerning the communication of uncertainties to a wide public, we will give some examples that illustrate how to avoid the above-mentioned ambiguity when dealing with this kind of communication.

  4. Uncertainty in Integrative Structural Modeling

    PubMed Central

    Schneidman-Duhovny, Dina; Pellarin, Riccardo; Sali, Andrej

    2014-01-01

    Integrative structural modelling uses multiple types of input information and proceeds in four stages: (i) gathering information, (ii) designing model representation and converting information into a scoring function, (iii) sampling good-scoring models, and (iv) analyzing models and information. In the first stage, uncertainty originates from data that are sparse, noisy, ambiguous, or derived from heterogeneous samples. In the second stage, uncertainty can originate from a representation that is too coarse for the available information or a scoring function that does not accurately capture the information. In the third stage, the major source of uncertainty is insufficient sampling. In the fourth stage, clustering, cross-validation, and other methods are used to estimate the precision and accuracy of the models and information. PMID:25173450

  5. Climate negotiations under scientific uncertainty.

    PubMed

    Barrett, Scott; Dannenberg, Astrid

    2012-10-23

    How does uncertainty about "dangerous" climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners' dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners' dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  6. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, Δϕ_HL ∝ 1/N̄_T, where N̄_T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ ∝ 1/N̄_T^k, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility to overcome Δϕ_HL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.

  7. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
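
    A generic sketch of the sampling-based workflow such a plan describes (the three-factor dose model below is a toy, not one of the HEDR codes): draw the uncertain inputs from their distributions, propagate them through the model, and rank the inputs by their rank correlation with the output.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(42)
        n = 2000

        release = rng.lognormal(0.0, 0.5, n)     # source-term factor
        dispersion = rng.uniform(0.5, 2.0, n)    # transport factor
        intake = rng.normal(1.0, 0.2, n)         # exposure-pathway factor
        dose = release * dispersion * np.clip(intake, 0.0, None)

        for name, x in [("release", release), ("dispersion", dispersion),
                        ("intake", intake)]:
            rho, _ = spearmanr(x, dose)          # rank-based sensitivity measure
            print(f"{name:10s} Spearman rho = {rho:+.2f}")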

  8. Uncertainties in atmospheric neutrino fluxes

    NASA Astrophysics Data System (ADS)

    Barr, G. D.; Robbins, S.; Gaisser, T. K.; Stanev, T.

    2006-11-01

    An evaluation of the principal uncertainties in the computation of neutrino fluxes produced in cosmic ray showers in the atmosphere is presented. The neutrino flux predictions are needed for comparison with experiment to perform neutrino oscillation studies. The paper concentrates on the main limitations, which are due to hadron production uncertainties. It also treats primary cosmic ray flux uncertainties, which are at a lower level. The absolute neutrino fluxes are found to have errors of around 15% in the neutrino energy region important for contained events underground. Large cancellations of these errors occur when ratios of fluxes are considered: in particular, the errors on the ν_μ/ν̄_μ ratio below E_ν = 1 GeV, on the (ν_μ + ν̄_μ)/(ν_e + ν̄_e) ratio below E_ν = 10 GeV, and on the up/down ratios above E_ν = 1 GeV are at the 1% level. A detailed breakdown of the origin of these errors and cancellations is presented.
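
    The cancellation mechanism can be seen in a toy Monte Carlo (all numbers illustrative, not from the paper): give two flavour fluxes a shared production error plus small independent parts; the shared part divides out of the ratio.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        common = rng.normal(1.0, 0.15, n)    # shared hadron-production error
        indep_mu = rng.normal(1.0, 0.01, n)  # small flavour-specific parts
        indep_e = rng.normal(1.0, 0.01, n)

        flux_mu = 2.0 * common * indep_mu    # toy fluxes, ~2:1 flavour ratio
        flux_e = 1.0 * common * indep_e
        ratio = flux_mu / flux_e

        print("relative error, mu flux:", flux_mu.std() / flux_mu.mean())  # ~15%
        print("relative error, ratio  :", ratio.std() / ratio.mean())      # ~1.4%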

  10. The precautionary principle in environmental science.

    PubMed Central

    Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M

    2001-01-01

    Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy. PMID:11673114

  11. Common Principles and Multiculturalism

    PubMed Central

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed the suitable tools to determine which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We will discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720

  12. Principles of Glacier Mechanics

    NASA Astrophysics Data System (ADS)

    Waddington, Edwin D.

    Glaciers are awesome in size and move at a majestic pace, and they frequently occupy spectacular mountainous terrain. Naturally, many Earth scientists are attracted to glaciers. Some of us are even fortunate enough to make a career of studying glacier flow. Many others work on the large, flat polar ice sheets where there is no scenery. As a leader of one of the foremost research projects now studying the flow of mountain glaciers (Storglaciären, Sweden), Roger Hooke is well qualified to describe the principles of glacier mechanics. Principles of Glacier Mechanics is written for upper-level undergraduate students and graduate students with an interest in glaciers and the landforms that glaciers produce. While most of the examples in the text are drawn from valley glacier studies, much of the material is also relevant to glacier flow on the polar ice sheets.

  14. The Shakespearean Principle Revisited

    PubMed Central

    Fred, Herbert L.

    2012-01-01

    Let every eye negotiate for itself and trust no agent. That line is from William Shakespeare's Much Ado About Nothing.1 To me, it is a fundamental doctrine of patient care, and I have named it the Shakespearean Principle.2 It stimulates skepticism,3 promotes doubt,4 improves communication, fosters proper decision-making, and protects against a malady that currently plagues our profession: herd mentality.5 This editorial shows what can happen when doctors violate the Shakespearean Principle. The story is real and tells of a woman whose doctor unintentionally killed her. To ensure anonymity, the time and place of the tragedy, as well as the players involved, have been changed. PMID:22412219

  15. Principles of Natural Photosynthesis.

    PubMed

    Krewald, Vera; Retegan, Marius; Pantazis, Dimitrios A

    2016-01-01

    Nature relies on a unique and intricate biochemical setup to achieve sunlight-driven water splitting. Combined experimental and computational efforts have produced significant insights into the structural and functional principles governing the operation of the water-oxidizing enzyme Photosystem II in general, and of the oxygen-evolving manganese-calcium cluster at its active site in particular. Here we review the most important aspects of biological water oxidation, emphasizing current knowledge on the organization of the enzyme, the geometric and electronic structure of the catalyst, and the role of calcium and chloride cofactors. The combination of recent experimental work on the identification of possible substrate sites with computational modeling have considerably limited the possible mechanistic pathways for the critical O-O bond formation step. Taken together, the key features and principles of natural photosynthesis may serve as inspiration for the design, development, and implementation of artificial systems. PMID:26099285

  16. The Principle of Maximum Conformality

    SciTech Connect

    Brodsky, Stanley J; Giustino, Di; /SLAC

    2011-04-05

    A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q which is of order of a typical momentum transfer Q in the process, and then vary the scale over a range Q/2 to 2Q. This procedure is clearly problematic since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and thus is a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale-setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. The elimination of the renormalization scheme ambiguity using the PMC will not only increase the precision of QCD tests, but will also increase the sensitivity of colliders to new physics beyond the Standard Model.

  17. Principles of plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Hutchinson, Ian H.

    The physical principles, techniques, and instrumentation of plasma diagnostics are examined in an introduction and reference work for students and practicing scientists. Topics addressed include basic plasma properties, magnetic diagnostics, plasma particle flux, and refractive-index measurements. Consideration is given to EM emission by free and bound electrons, the scattering of EM radiation, and ion processes. Diagrams, drawings, graphs, sample problems, and a glossary of symbols are provided.

  18. Principles of nuclear geology

    SciTech Connect

    Aswathanarayana, U.

    1985-01-01

    This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution, and ore deposits of uranium and thorium. The application of nuclear methodology to radiogenic heat and the thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes, and cosmic-ray-produced isotopes is covered. Geological processes such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are also focused on.

  19. Computational principles of memory.

    PubMed

    Chaudhuri, Rishidev; Fiete, Ila

    2016-02-23

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506

  20. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  1. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin ½ (in units of h/2π, where h is ...

  2. Geographic Uncertainty in Environmental Security

    NASA Astrophysics Data System (ADS)

    Ahlquist, Jon

    2008-06-01

    This volume contains 17 papers presented at the NATO Advanced Research Workshop on Fuzziness and Uncertainty held in Kiev, Ukraine, 28 June to 1 July 2006. Eleven of the papers deal with fuzzy set concepts, while the other six (papers 5, 7, 13, 14, 15, and 16) are not fuzzy. A reader with no prior exposure to fuzzy set theory would benefit from having an introductory text at hand, but the papers are accessible to a wide audience. In general, the papers deal with broad issues of classification and uncertainty in geographic information.

  3. Uncertainties in climate data sets

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1992-01-01

    Climate diagnostics are constructed either from analyzed fields or from observational data sets. Those that have been commonly used are normally considered ground truth. However, in most of these collections, errors and uncertainties exist which are generally ignored due to the consistency of usage over time. Examples of uncertainties and errors are described in NMC and ECMWF analyses and in satellite observational sets: OLR, TOVS, and SMMR. It is suggested that these errors can be large, systematic, and not negligible in climate analysis.

  4. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
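
    A standard first-order account of transmitted variation is the delta method, Var[f(X)] ≈ f'(μ)² Var[X]; the sketch below checks it against Monte Carlo for an illustrative response function (all choices here are illustrative, not from the presentation).

        import numpy as np

        def f(x):
            return np.exp(0.5 * x)           # illustrative response function

        mu, sigma = 1.0, 0.2                 # input mean and standard deviation

        fprime = 0.5 * np.exp(0.5 * mu)      # f'(mu)
        sd_delta = abs(fprime) * sigma       # first-order transmitted variation

        rng = np.random.default_rng(0)
        sd_mc = f(rng.normal(mu, sigma, 1_000_000)).std()   # Monte Carlo check

        print(f"delta method: {sd_delta:.4f}, Monte Carlo: {sd_mc:.4f}")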

  5. Awe, uncertainty, and agency detection.

    PubMed

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events. PMID:24247728

  6. Probing Mach's principle

    NASA Astrophysics Data System (ADS)

    Annila, Arto

    2012-06-01

    The principle of least action in its original form à la Maupertuis is used to explain geodetic and frame-dragging precessions, which are customarily accounted for by a curved space-time in general relativity. The least-time equations of motion agree with observations and are also in concert with general relativity. Yet according to the least-time principle, gravitation does not relate to the mathematical metric of space-time, but to a tangible energy density embodied by photons. The density of free space is in balance with the total mass of the Universe, in accord with the Planck law. Likewise, a local photon density and its phase distribution are in balance with the mass and charge distribution of a local body. Here gravitational force is understood as an energy density difference that will diminish when the oppositely polarized pairs of photons co-propagate from the energy-dense system of bodies to the energy-sparse system of the surrounding free space. Thus when the body changes its state of motion, the surrounding energy density must accommodate the change. The concurrent resistance in restructuring the surroundings, ultimately involving the entire Universe, is known as inertia. The all-around propagating energy density couples everything with everything else, in accord with Mach's principle.

  8. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  9. Uncertainty in 3D gel dosimetry

    NASA Astrophysics Data System (ADS)

    De Deene, Yves; Jirasek, Andrew

    2015-01-01

    Three-dimensional (3D) gel dosimetry has a unique role to play in safeguarding conformal radiotherapy treatments as the technique can cover the full treatment chain and provides the radiation oncologist with the integrated dose distribution in 3D. It can also be applied to benchmark new treatment strategies such as image guided and tracking radiotherapy techniques. A major obstacle that has hindered the wider dissemination of gel dosimetry in radiotherapy centres is a lack of confidence in the reliability of the measured dose distribution. Uncertainties in 3D dosimeters are attributed to both dosimeter properties and scanning performance. In polymer gel dosimetry with MRI readout, discrepancies in dose response of large polymer gel dosimeters versus small calibration phantoms have been reported, which can lead to significant inaccuracies in the dose maps. The sources of error in polymer gel dosimetry with MRI readout are well understood and it has been demonstrated that with a carefully designed scanning protocol, the overall uncertainty in absolute dose that can currently be obtained falls within 5% on an individual voxel basis, for a minimum voxel size of 5 mm³. However, several research groups have chosen to use polymer gel dosimetry in a relative manner by normalizing the dose distribution towards an internal reference dose within the gel dosimeter phantom. 3D dosimetry with optical scanning has also been mostly applied in a relative way, although in principle absolute calibration is possible. As the optical absorption in 3D dosimeters is less dependent on temperature, it can be expected that the achievable accuracy is higher with optical CT. The precision in optical scanning of 3D dosimeters depends to a large extent on the performance of the detector. 3D dosimetry with X-ray CT readout is a low contrast imaging modality for polymer gel dosimetry. Sources of error in X-ray CT polymer gel dosimetry (XCT) are currently under investigation and include inherent limitations in dosimeter homogeneity, imaging performance, and errors induced through post-acquisition processing. This overview highlights a number of aspects relating to uncertainties in polymer gel dosimetry.

  10. Entropic uncertainty from effective anticommutators

    NASA Astrophysics Data System (ADS)

    Kaniewski, Jedrzej; Tomamichel, Marco; Wehner, Stephanie

    2014-07-01

    We investigate entropic uncertainty relations for two or more binary measurements, for example, spin-1/2 or polarization measurements. We argue that the effective anticommutators of these measurements, i.e., the anticommutators evaluated on the state prior to measuring, are an expedient measure of measurement incompatibility. Based on the knowledge of pairwise effective anticommutators we derive a class of entropic uncertainty relations in terms of conditional Rényi entropies. Our uncertainty relations are formulated in terms of effective measures of incompatibility, which can be certified in a device-independent fashion. Consequently, we discuss potential applications of our findings to device-independent quantum cryptography. Moreover, to investigate the tightness of our analysis we consider the simplest (and very well studied) scenario of two measurements on a qubit. We find that our results outperform the celebrated bound due to Maassen and Uffink [Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103] and provide an analytical expression for the minimum uncertainty which also outperforms some recent bounds based on majorization.
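
    The qubit benchmark mentioned above can be made concrete. As a minimal sketch (an illustration of the cited Maassen-Uffink bound, not of the paper's new anticommutator-based relations), the bound H(A) + H(B) >= -log2 max_{i,j} |<a_i|b_j>|^2 can be evaluated directly from the overlaps of the two measurement bases; the Z and X bases below are illustrative choices.

    ```python
    import numpy as np

    # Sketch: the Maassen-Uffink entropic uncertainty bound for two projective
    # measurements, H(A) + H(B) >= -log2 max_ij |<a_i|b_j>|^2.
    def maassen_uffink_bound(basis_a, basis_b):
        c = max(abs(np.vdot(a, b)) ** 2 for a in basis_a for b in basis_b)
        return -np.log2(c)

    z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
    x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

    # Complementary qubit measurements give the maximal bound of 1 bit.
    print(maassen_uffink_bound(z_basis, x_basis))  # -> 1.0
    ```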

  11. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  12. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  14. Uncertainties in radiation flow experiments

    NASA Astrophysics Data System (ADS)

    Fryer, C. L.; Dodd, E.; Even, W.; Fontes, C. J.; Greeff, C.; Hungerford, A.; Kline, J.; Mussack, K.; Tregillis, I.; Workman, J. B.; Benstead, J.; Guymer, T. M.; Moore, A. S.; Morton, J.

    2016-03-01

    Although the fundamental physics behind radiation and matter flow is understood, many uncertainties remain in the exact behavior of macroscopic fluids in systems ranging from pure turbulence to coupled radiation hydrodynamics. Laboratory experiments play an important role in studying this physics to allow scientists to test their macroscopic models of these phenomena. However, because the fundamental physics is well understood, precision experiments are required to validate existing codes already tested by a suite of analytic, manufactured and convergence solutions. To conduct such high-precision experiments requires a detailed understanding of the experimental errors and the nature of their uncertainties on the observed diagnostics. In this paper, we study the uncertainties plaguing many radiation-flow experiments, focusing on those using a hohlraum (dynamic or laser-driven) source and a foam-density target. This study focuses on the effect these uncertainties have on the breakout time of the radiation front. We find that, even if the errors in the initial conditions and numerical methods are Gaussian, the errors in the breakout time are asymmetric, leading to a systematic bias in the observed data. We must understand these systematics to produce the high-precision experimental results needed to study this physics.
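
    The asymmetry mechanism described above is easy to reproduce in a toy calculation: symmetric Gaussian errors in a drive parameter, pushed through a nonlinear dependence of breakout time on that parameter, emerge skewed. The model below is a hypothetical stand-in with an invented exponent, not the paper's radiation-hydrodynamics setup.

    ```python
    import numpy as np

    # Toy model (not the paper's): breakout time scaling as a negative power of
    # drive temperature. Gaussian temperature errors then yield a right-skewed
    # breakout-time distribution, biasing the mean away from the median.
    rng = np.random.default_rng(0)
    alpha = 3.3                            # hypothetical sensitivity exponent
    T = rng.normal(1.0, 0.05, 100_000)     # symmetric 5% drive-temperature errors
    t_breakout = T ** (-alpha)

    print(f"mean   = {t_breakout.mean():.4f}")
    print(f"median = {np.median(t_breakout):.4f}")  # mean > median: systematic bias
    ```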

  15. The face of uncertainty eats.

    PubMed

    Corwin, Rebecca L W

    2011-09-01

    The idea that foods rich in fat and sugar may be addictive has generated much interest, as well as controversy, among both scientific and lay communities. Recent research indicates that fatty and sugary food in and of itself is not addictive. Rather, the food and the context in which it is consumed interact to produce an addiction-like state. One of the contexts that appears to be important is the intermittent opportunity to consume foods rich in fat and sugar in environments where food is plentiful. Animal research indicates that, under these conditions, intake of the fatty sugary food escalates across time and binge-type behavior develops. However, the mechanisms that account for the powerful effect of intermittency on ingestive behavior have only begun to be elucidated. In this review, it is proposed that intermittency stimulates appetitive behavior that is associated with uncertainty regarding what, when, and how much of the highly palatable food to consume. Uncertainty may stimulate consumption of optional fatty and sugary treats due to differential firing of midbrain dopamine neurons, activation of the stress axis, and involvement of orexin signaling. In short, uncertainty may produce an aversive state that bingeing on palatable food can alleviate, however temporarily. "Food addiction" may not be "addiction" to food at all; it may be a response to uncertainty within environments of food abundance. PMID:21999691

  16. Quantification of entanglement via uncertainties

    SciTech Connect

    Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S.

    2007-03-15

    We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.

  17. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  18. Who plays dice? Subjective uncertainty in deterministic quantum world

    NASA Astrophysics Data System (ADS)

    Carter, Brandon

    2006-11-01

    Einstein's 1905 recognition that light consists of discrete "quanta" inaugurated the duality (wave versus particle) paradox that was resolved 20 years later by Born's introduction of the probability interpretation on which modern quantum theory is based. Einstein's refusal to abandon the classical notion of deterministic evolution - despite the unqualified success of the new paradigm on a local scale - foreshadowed the restoration of determinism in the attempt to develop a global treatment applicable to cosmology by Everett, who failed however to provide a logically coherent treatment of subjective uncertainty at a local level. This drawback has recently been overcome in an extended formulation allowing deterministic description of a physical universe in which the uncertainty concerns only our own particular local situations, whose probability is prescribed by an appropriate micro-anthropic principle.

  19. Robust optimization of nonlinear impulsive rendezvous with uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, YaZhong; Yang, Zhen; Li, HengNian

    2014-04-01

    The optimal rendezvous trajectory designs in many current research efforts do not incorporate the practical uncertainties into the closed loop of the design. A robust optimization design method for a nonlinear rendezvous trajectory with uncertainty is proposed in this paper. One performance index related to the variances of the terminal state error is termed the robustness performance index, and a two-objective optimization model (including the minimum characteristic velocity and the minimum robustness performance index) is formulated on the basis of the Lambert algorithm. A multi-objective, non-dominated sorting genetic algorithm is employed to obtain the Pareto optimal solution set. It is shown that the proposed approach can be used to quickly obtain several inherent principles of the rendezvous trajectory by taking practical errors into account. Furthermore, this approach can identify the most preferable design space in which a specific solution for the actual application of the rendezvous control should be chosen.
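
    The core of the two-objective step can be illustrated without the Lambert and genetic-algorithm machinery: given candidate designs scored on characteristic velocity and a robustness index, the Pareto set is simply the non-dominated subset. The sketch below uses synthetic objective values as stand-ins, not actual rendezvous solutions.

    ```python
    import numpy as np

    # Hedged sketch: non-dominated filtering for two objectives, both minimized
    # (delta-v and a terminal-error variance index). Values are synthetic.
    rng = np.random.default_rng(1)
    delta_v = rng.uniform(1.0, 5.0, 200)
    robustness = 10.0 / delta_v + rng.normal(0.0, 0.3, 200)  # toy tradeoff
    designs = np.column_stack([delta_v, robustness])

    def non_dominated(pts):
        keep = np.ones(len(pts), dtype=bool)
        for i, p in enumerate(pts):
            # q dominates p if q <= p in every objective and q < p in at least one
            keep[i] = not np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        return pts[keep]

    front = non_dominated(designs)
    print(f"{len(front)} Pareto-optimal designs out of {len(designs)}")
    ```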

  20. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency. Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules a corresponding measure of how much these rules are believed is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
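
    The split into certain and possible rules follows directly from rough-set lower and upper approximations of a decision concept. A minimal sketch with hypothetical data:

    ```python
    # Minimal rough-set sketch (hypothetical data): objects indiscernible under
    # the chosen attributes share an equivalence class. Classes wholly inside
    # the target concept support certain rules; classes merely overlapping it
    # support possible rules.
    def approximations(equiv_classes, target):
        lower = {x for c in equiv_classes if c <= target for x in c}   # certain
        upper = {x for c in equiv_classes if c & target for x in c}    # possible
        return lower, upper

    classes = [{1, 2}, {3, 4}, {5, 6}]   # equivalence classes of the universe
    faulty = {1, 2, 3}                   # decision concept to approximate

    lower, upper = approximations(classes, faulty)
    print(lower)  # {1, 2}        -> supports certain rules
    print(upper)  # {1, 2, 3, 4}  -> supports possible rules
    ```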

  1. Nonequilibrium quantum Landauer principle.

    PubMed

    Goold, John; Paternostro, Mauro; Modi, Kavan

    2015-02-13

    Using the operational framework of completely positive, trace preserving operations and thermodynamic fluctuation relations, we derive a lower bound for the heat exchange in a Landauer erasure process on a quantum system. Our bound comes from a nonphenomenological derivation of the Landauer principle which holds for generic nonequilibrium dynamics. Furthermore, the bound depends on the nonunitality of dynamics, giving it a physical significance that differs from other derivations. We apply our framework to the model of a spin-1/2 system coupled to an interacting spin chain at finite temperature. PMID:25723198
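
    For orientation, the phenomenological statement that this nonequilibrium derivation generalizes is the standard Landauer bound; the paper's sharper, nonunitality-dependent bound is not reproduced here.

    ```latex
    % Standard Landauer bound: erasing one bit in contact with a bath at
    % temperature T dissipates at least k_B T ln 2 of heat; equivalently, with
    % \Delta S the entropy decrease of the system,
    \langle Q \rangle \ge k_{\mathrm{B}} T \ln 2,
    \qquad
    \beta \langle Q \rangle \ge \Delta S, \quad \beta = \frac{1}{k_{\mathrm{B}} T}.
    ```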

  2. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.
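
    The access-matrix model sketched above reduces to a simple mediation rule: every access attempt by a subject is checked against the matrix entry for the (subject, object) pair. A minimal sketch with invented names, not the paper's notation:

    ```python
    # Hedged sketch of an access matrix: rows are subjects (processes), columns
    # are objects, entries are the permitted rights. Deny by default.
    access_matrix = {
        ("process_1", "file_a"): {"read", "write"},
        ("process_1", "printer"): {"use"},
        ("process_2", "file_a"): {"read"},
    }

    def check_access(subject, obj, right):
        """Mediate an access attempt tagged with the subject's identity."""
        return right in access_matrix.get((subject, obj), set())

    print(check_access("process_2", "file_a", "read"))   # True
    print(check_access("process_2", "file_a", "write"))  # False: denied
    ```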

  3. Principles of Pituitary Surgery.

    PubMed

    Farrell, Christopher J; Nyquist, Gurston G; Farag, Alexander A; Rosen, Marc R; Evans, James J

    2016-02-01

    Since the description of a transnasal approach for treatment of pituitary tumors, transsphenoidal surgery has undergone continuous development. Hirsch developed a lateral endonasal approach before simplifying it to a transseptal approach. Cushing approached pituitary tumors using a transsphenoidal approach but transitioned to the transcranial route. Transsphenoidal surgery was not "rediscovered" until Hardy introduced the surgical microscope. An endoscopic transsphenoidal approach for pituitary tumors has been reported and further advanced. We describe the principles of pituitary surgery including the key elements of surgical decision making and discuss the technical nuances distinguishing the endoscopic from the microscopic approach. PMID:26614830

  4. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
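
    As a flavor of the descriptive statistics the report covers: for interval data the sample mean is itself an interval, bounded by the means of the endpoint values (harder statistics, such as the variance, require the optimization arguments the report reviews). A minimal sketch with made-up intervals:

    ```python
    # Interval-valued sample mean: with each measurement known only as
    # [lo_i, hi_i], the mean lies in [mean(lo), mean(hi)].
    data = [(1.0, 1.4), (2.1, 2.2), (0.8, 1.5), (1.9, 2.4)]  # hypothetical intervals

    lo_mean = sum(lo for lo, _ in data) / len(data)
    hi_mean = sum(hi for _, hi in data) / len(data)
    print(f"interval mean: [{lo_mean:.3f}, {hi_mean:.3f}]")  # [1.450, 1.875]
    ```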

  5. Academic Principles: A Brief Introduction

    ERIC Educational Resources Information Center

    Association of American Universities, 2013

    2013-01-01

    For many decades certain core principles have guided the conduct of teaching, research, and scholarship at American universities, as well as the ways in which these institutions are governed. There is ample evidence that these principles have strongly contributed to the quality of American universities. The principles have also made these…

  6. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…
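
    For reference, the familiar flat-space statement that the general-coordinates derivation must recover in the appropriate limit:

    ```latex
    % Archimedes' principle in its elementary form: the buoyant force equals
    % the weight of the displaced fluid.
    F_{\mathrm{buoyant}} = \rho_{\mathrm{fluid}}\, g\, V_{\mathrm{displaced}}
    ```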

  8. Great Lakes Literacy Principles

    NASA Astrophysics Data System (ADS)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These Great Lakes Literacy Principles represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  9. [Principles of callus distraction].

    PubMed

    Hankemeier, S; Bastian, L; Gosling, T; Krettek, C

    2004-10-01

    Callus distraction is based on the principle of regenerating bone by continuous distraction of proliferating callus tissue. It has become the standard treatment of significant leg shortening and large bone defects. Due to many problems and complications, exact preoperative planning, operative technique and careful postoperative follow-up are essential. External fixators can be used for all indications of callus distraction. However, due to pin tract infections, pain and loss of mobility caused by soft tissue transfixation, fixators are mainly applied in patients with open growth plates, simultaneous lengthening with continuous deformity corrections, and increased risk of infection. Distraction over an intramedullary nail allows removal of the external fixator at the end of distraction before callus consolidation (monorail method). The intramedullary nail protects newly formed callus tissue and reduces the risk of axial deviation and refractures. Recently developed, fully intramedullary lengthening devices eliminate fixator-associated complications and accelerate return to normal daily activities. This review describes principles of callus distraction, potential complications and their management. PMID:15452653

  10. Principle of relative locality

    SciTech Connect

    Amelino-Camelia, Giovanni; Freidel, Laurent; Smolin, Lee; Kowalski-Glikman, Jerzy

    2011-10-15

    We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

  11. Principles of Safety Pharmacology

    PubMed Central

    Pugsley, M K; Authier, S; Curtis, M J

    2008-01-01

    Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice, the burden of proof and to highlight areas of intensive activity (such as testing for drug-induced rare event liability, and the challenge of testing the safety of so-called biologics (antibodies, gene therapy and so on)). PMID:18604233

  12. Revisiting Tversky's diagnosticity principle.

    PubMed

    Evers, Ellen R K; Lakens, Daniël

    2014-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638
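
    The feature-based model underlying the diagnosticity effect is Tversky's contrast model, in which similarity weighs common features against distinctive ones. A minimal sketch, with set cardinality standing in for the salience function f, and illustrative weights and feature sets:

    ```python
    # Tversky's (1977) contrast model:
    #   s(a, b) = theta*f(A & B) - alpha*f(A - B) - beta*f(B - A)
    # Here f is set cardinality; weights and feature sets are illustrative.
    def tversky_similarity(a, b, theta=1.0, alpha=0.5, beta=0.5):
        return theta * len(a & b) - alpha * len(a - b) - beta * len(b - a)

    cuba = {"island", "caribbean", "spanish_speaking", "communist_era"}
    jamaica = {"island", "caribbean", "english_speaking"}

    print(tversky_similarity(cuba, jamaica))  # 2*1.0 - 2*0.5 - 1*0.5 = 0.5
    ```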

  14. Encoding uncertainty in the hippocampus.

    PubMed

    Harrison, L M; Duggins, A; Friston, K J

    2006-06-01

    The medial temporal lobe may play a critical role in binding successive events into memory while encoding contextual information in implicit and explicit memory tasks. Information theory provides a quantitative basis to model contextual information engendered by conditional dependence between, or conditional uncertainty about, consecutive events in a sequence. We show that information theoretic indices characterizing contextual dependence within a sequential reaction time task (SRTT) predict regional responses, measured by fMRI, in areas associated with sequence learning and navigation. Specifically, activity of a distributed paralimbic system, centered on the left hippocampus, correlated selectively with predictability as measured with mutual information. This is clear evidence that the brain is sensitive to the probabilistic context in which events are encountered. This is potentially important for theories about how the brain represents uncertainty and makes perceptual inferences, particularly those based on predictive coding and hierarchical Bayes. PMID:16527453
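
    The contextual-dependence index described above, the mutual information between consecutive events, can be estimated from empirical pair counts. A minimal sketch on a synthetic sequence:

    ```python
    import numpy as np

    # Mutual information I(next; previous) of a sequence, estimated from pair
    # counts over consecutive events. The sequence here is synthetic.
    def mutual_information(x, y, k):
        joint = np.zeros((k, k))
        for a, b in zip(x, y):
            joint[a, b] += 1.0
        joint /= joint.sum()
        px, py = joint.sum(axis=1), joint.sum(axis=0)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])))

    seq = np.array([0, 1, 0, 1, 1, 0] * 100)   # synthetic event sequence
    print(f"{mutual_information(seq[:-1], seq[1:], 2):.3f} bits")
    ```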

  15. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
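
    The proposed budget evaluation can be caricatured with a small Monte Carlo: a human error that survives the quality system with some residual probability, and shifts the result when it occurs, widens the combined uncertainty. All numbers below are illustrative, not the published expert-judgment values.

    ```python
    import numpy as np

    # Toy Monte Carlo for the human-error contribution to an uncertainty budget.
    rng = np.random.default_rng(2)
    n = 200_000
    base = rng.normal(7.00, 0.02, n)    # e.g. pH result, u = 0.02 without errors
    occurs = rng.random(n) < 0.01       # residual risk of a human error: 1%
    effect = rng.normal(0.10, 0.02, n)  # shift caused by the error when it occurs
    result = base + occurs * effect

    print("u without human error: 0.020")
    print(f"u with human error:    {result.std():.3f}")
    # Slightly larger: not negligible, yet not dominant, as in the abstract.
    ```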

  16. Alignment uncertainty and genomic analysis.

    PubMed

    Wong, Karen M; Suchard, Marc A; Huelsenbeck, John P

    2008-01-25

    The statistical methods applied to the analysis of genomic data do not account for uncertainty in the sequence alignment. Indeed, the alignment is treated as an observation, and all of the subsequent inferences depend on the alignment being correct. This may not have been too problematic for many phylogenetic studies, in which the gene is carefully chosen for, among other things, ease of alignment. However, in a comparative genomics study, the same statistical methods are applied repeatedly on thousands of genes, many of which will be difficult to align. Using genomic data from seven yeast species, we show that uncertainty in the alignment can lead to several problems, including different alignment methods resulting in different conclusions. PMID:18218900

  17. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state of the art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation of the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the on-going organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends, funding opportunities and to discuss the future of structural dynamics.

  18. Uncertainty in flood risk mapping

    NASA Astrophysics Data System (ADS)

    Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo

    2014-05-01

    A flood refers to a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms will be studied. In this context, flooding occurs when the water runs above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic, and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the obtained flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can be translated into erroneous risk prediction. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, which depends on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated values of the peak flow and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered as a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic. With this methodology a fuzzy number is obtained for the peak flow, which indicates all possible peak flow values and the possibility of their occurrence. To produce the LCM a supervised soft classifier is used to perform the classification of a satellite image and a possibility distribution is assigned to the pixels. These extra data provide additional land cover information at the pixel level and allow the assessment of the classification uncertainty, which is then considered in the identification of the parameter uncertainty used to compute the peak flow. The proposed approach was applied to produce vulnerability and risk maps that integrate uncertainty in the urban area of Leiria, Portugal. A SPOT-4 satellite image and DEMs of the region were used, and the peak flow was computed using the Soil Conservation Service method. HEC-HMS, HEC-RAS, Matlab and ArcGIS software programs were used. The analysis of the results obtained for the presented case study enables the order of magnitude of the uncertainty in the watershed peak flow value to be assessed and the areas most susceptible to flood risk to be identified.
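
    The fuzzy-arithmetic step can be sketched with alpha-cuts: at each membership level the fuzzy inputs reduce to intervals, interval arithmetic yields a peak-flow interval, and the stacked cuts form the fuzzy peak flow. The sketch below uses the simple rational formula Q = C i A as a stand-in for the Soil Conservation Service computation, with invented fuzzy inputs.

    ```python
    # Alpha-cut fuzzy arithmetic for a peak flow Q = C * i * A (toy stand-in
    # for the SCS method). Inputs are triangular fuzzy numbers (lo, mode, hi)
    # mimicking LCM- and DEM-derived uncertainty; all values are invented.
    def alpha_cut(tri, alpha):
        lo, mode, hi = tri
        return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

    C = (0.30, 0.40, 0.50)   # fuzzy runoff coefficient (land-cover uncertainty)
    A = (11.0, 12.0, 13.5)   # fuzzy catchment area (DEM uncertainty)
    i = 20.0                 # crisp rainfall intensity

    for alpha in (0.0, 0.5, 1.0):
        (c0, c1), (a0, a1) = alpha_cut(C, alpha), alpha_cut(A, alpha)
        # All quantities are positive, so interval products are endpoint-wise.
        print(f"alpha={alpha}: Q in [{c0 * i * a0:.0f}, {c1 * i * a1:.0f}]")
    ```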

  19. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
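
    A concrete instance of the Type B evaluations discussed above: when expert or manufacturer knowledge only brackets a quantity within ±a of its nominal value, the GUM's rectangular-distribution assignment gives a standard uncertainty of a/√3.

    ```python
    import math

    # Type B evaluation with a rectangular (uniform) distribution over a stated
    # tolerance of +/- a: u = a / sqrt(3). The tolerance value is illustrative.
    a = 0.05
    u = a / math.sqrt(3)
    print(f"standard uncertainty u = {u:.4f}")  # 0.0289
    ```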

  1. Ozone Uncertainties Study Algorithm (OUSA)

    NASA Technical Reports Server (NTRS)

    Bahethi, O. P.

    1982-01-01

    An algorithm to carry out sensitivities, uncertainties and overall imprecision studies to a set of input parameters for a one dimensional steady ozone photochemistry model is described. This algorithm can be used to evaluate steady state perturbations due to point source or distributed ejection of H2O, CLX, and NOx, besides, varying the incident solar flux. This algorithm is operational on IBM OS/360-91 computer at NASA/Goddard Space Flight Center's Science and Applications Computer Center (SACC).

  2. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

  3. Uncertainty propagation in nuclear forensics.

    PubMed

    Pommé, S; Jerome, S M; Venchiarutti, C

    2014-07-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined. PMID:24607529
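
    For the simplest case the abstract describes, a parent decaying to an effectively stable daughter after complete chemical separation, the atom ratio gives the age in closed form, and first-order propagation gives its uncertainty. The sketch below uses illustrative numbers, not the paper's worked pairs.

    ```python
    import math

    # Age from the daughter/parent atom ratio R = N_d/N_p for a stable daughter:
    #   R = exp(lambda * t) - 1   =>   t = ln(1 + R) / lambda
    # First-order uncertainty propagation: u(t) = u(R) / (lambda * (1 + R)).
    half_life = 24_110.0                  # parent half-life in years (e.g. Pu-239)
    lam = math.log(2.0) / half_life
    R, u_R = 1.15e-3, 2.0e-5              # measured atom ratio and its uncertainty

    t = math.log1p(R) / lam
    u_t = u_R / (lam * (1.0 + R))
    print(f"age = {t:.1f} +/- {u_t:.1f} years")  # ~40.0 +/- 0.7
    ```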

  4. Quantification of uncertainties in composites

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Singhal, S. N.; Murthy, P. L. N.; Chamis, Christos C.

    1993-01-01

    An integrated methodology is developed for computationally simulating the probabilistic composite material properties at all composite scales. The simulation requires minimum input consisting of the description of uncertainties at the lowest scale (fiber and matrix constituents) of the composite and in the fabrication process variables. The methodology allows the determination of the sensitivity of the composite material behavior to all the relevant primitive variables. This information is crucial for reducing the undesirable scatter in composite behavior at its macro scale by reducing the uncertainties in the most influential primitive variables at the micro scale. The methodology is computationally efficient. The computational time required by the methodology described herein is an order of magnitude less than that for Monte Carlo Simulation. The methodology has been implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of the methodology/code are demonstrated by simulating the uncertainties in the heat-transfer, thermal, and mechanical properties of a typical laminate and comparing the results with the Monte Carlo simulation method and experimental data. The important observation is that the computational simulation for probabilistic composite mechanics has sufficient flexibility to capture the observed scatter in composite properties.

  5. Basic Principles in Oncology

    NASA Astrophysics Data System (ADS)

    Vogl, Thomas J.

    The evolving field of interventional oncology can only be considered as a small integrative part in the complex area of oncology. The new field of interventional oncology needs a standardization of the procedures, the terminology, and criteria to facilitate the effective communication of ideas and appropriate comparison between treatments and new integrative technology. In principle, ablative therapy is a part of locoregional oncological therapy and is defined either as chemical ablation using ethanol or acetic acid, or thermotherapies such as radiofrequency, laser, microwave, and cryoablation. All these new evolving therapies have to be exactly evaluated and an adequate terminology has to be used to define imaging findings and pathology. All the different technologies and evaluated therapies have to be compared, and the results have to be analyzed in order to improve the patient outcome.

  6. Principles of Induction Accelerators

    NASA Astrophysics Data System (ADS)

    Briggs, Richard J.

    The basic concepts involved in induction accelerators are introduced in this chapter. The objective is to provide a foundation for the more detailed coverage of key technology elements and specific applications in the following chapters. A wide variety of induction accelerators are discussed in the following chapters, from the high current linear electron accelerator configurations that have been the main focus of the original developments, to circular configurations like the ion synchrotrons that are the subject of more recent research. The main focus in the present chapter is on the induction module containing the magnetic core that plays the role of a transformer in coupling the pulsed power from the modulator to the charged particle beam. This is the essential common element in all these induction accelerators, and an understanding of the basic processes involved in its operation is the main objective of this chapter. (See [1] for a useful and complementary presentation of the basic principles in induction linacs.)
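
    The transformer role of the core can be summarized by the standard volt-second constraint: the accelerating voltage pulse a core can support is limited by its cross-sectional area and the available flux swing of the core material.

    ```latex
    % Volt-second constraint on an induction core of cross-section A_c with
    % usable flux swing \Delta B over a pulse of length \tau:
    \int_0^{\tau} V \, dt \le A_c \, \Delta B
    ```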

  7. Kepler and Mach's Principle

    NASA Astrophysics Data System (ADS)

    Barbour, Julian

    The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.

  9. Dynamical principles in neuroscience

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.

    2006-10-01

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

  10. Fault Management Guiding Principles

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

    2011-01-01

    Regardless of the mission type: deep space or low Earth orbit, robotic or human spaceflight, Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as the maturity of FM as an engineering discipline, which lags behind the maturity of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and SE organizational structure, the relationship between FM designs and mission risk, and the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

  11. Rotating Torsion Balance Tests of the Weak Equivalence Principle

    NASA Astrophysics Data System (ADS)

    Wagner, Todd A.

    We used a rotating torsion balance to make the most precise laboratory search for equivalence-principle violation. We used a beryllium-aluminum composition dipole to complement our previous measurement with a beryllium-titanium composition dipole. We improved the tilt stability of the apparatus and reduced the temperature-gradient feed-through to improve the uncertainty by 30% compared to our beryllium-titanium result. Using the beryllium-aluminum test bodies, we found η_⊕ = (-1.3 ± 1.2) × 10^-13. The combined limits using both test-body pairs generally limit any new equivalence-principle-violating force that couples to ordinary neutral matter. We also measured test bodies with compositions that mimic the difference in composition between the earth and moon to provide a model-independent weak equivalence principle limit of η_CD = (1.2 ± 1.1) × 10^-13 for comparison with lunar laser ranging strong equivalence principle measurements. The combined lunar laser ranging and weak equivalence principle measurements limit equivalence-principle violation for gravitational binding energy to ≤ 6 × 10^-4 at 1σ.
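
    The quoted numbers are limits on the Eötvös parameter, the fractional differential acceleration of two test-body materials toward a common attractor; in standard notation:

    ```latex
    % Eotvos parameter for test bodies 1 and 2 with accelerations a_1, a_2:
    \eta = 2\,\frac{a_1 - a_2}{a_1 + a_2}
    ```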

  12. Accounting for Calibration Uncertainty in Detectors for High-Energy Astrophysics

    NASA Astrophysics Data System (ADS)

    Xu, Jin

    Systematic instrumental uncertainties in astronomical analyses have been generally ignored in data analysis due to the lack of robust principled methods, though the importance of incorporating instrumental calibration uncertainty is widely recognized by both users and instrument builders. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. Lee et al. (2011) introduced a so-called pragmatic Bayesian method to address this problem. The method is "pragmatic" in that it introduces an ad hoc technique that simplifies computation by assuming that the current data is not useful in narrowing the uncertainty for the calibration product, i.e., that the prior and posterior distributions for the calibration products are the same. In the thesis, we focus on incorporating calibration uncertainty into a principled Bayesian X-ray spectral analysis; specifically, we account for uncertainty in the so-called effective area curve and the photon redistribution matrix. X-ray spectral analysis models the distribution of the energies of X-ray photons emitted from an astronomical source. The effective area curve of an X-ray detector describes its sensitivity as a function of the energy of incoming photons, and the photon redistribution matrix describes the probability distribution of the recorded (discrete) energy of a photon as a function of the true (discretized) energy. Starting with the effective area curve, we follow Lee et al. (2011) and use a principal component analysis (PCA) to efficiently represent the uncertainty. Here, however, we leverage this representation to enable a principled, fully Bayesian method to account for calibration uncertainty in high-energy spectral analysis. For the photon redistribution matrix, we first model each conditional distribution as a normal distribution and then apply PCA to the parameters describing the normal models. This results in an efficient low-dimensional summary of the uncertainty in the redistribution matrix. Our methods for both calibration products are compared with standard analysis techniques and the pragmatic Bayesian method of Lee et al. (2011). The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product; we demonstrate this for the effective area curve. In this way, our fully Bayesian approach can yield more accurate and efficient estimates of the source parameters, and valid estimates of their uncertainty. Moreover, the fully Bayesian approach is the only method that allows us to make a valid inference about the effective area curve itself, quantifying which possible curves are most consistent with the data.
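
    The PCA compression step can be sketched as follows: an ensemble of plausible effective-area curves is reduced to its mean plus a few leading components, and replicate curves are drawn by perturbing the component scores. The ensemble below is synthetic, not a real instrument calibration product.

    ```python
    import numpy as np

    # PCA summary of calibration uncertainty: represent an ensemble of curves
    # by mean + leading principal components, then sample new curves cheaply.
    rng = np.random.default_rng(3)
    n_curves, n_bins = 500, 100
    mean_shape = np.exp(-np.linspace(-1.0, 2.0, n_bins) ** 2)  # toy effective area
    noise = np.cumsum(rng.standard_normal((n_curves, n_bins)), axis=1)
    ensemble = mean_shape + 0.01 * noise / np.sqrt(n_bins)     # correlated wiggles

    mu = ensemble.mean(axis=0)
    _, s, Vt = np.linalg.svd(ensemble - mu, full_matrices=False)
    k = 8                                    # keep the leading components
    scales = s[:k] / np.sqrt(n_curves - 1)   # standard deviation of each score

    replicate = mu + (rng.standard_normal(k) * scales) @ Vt[:k]  # one sampled curve
    print(replicate.shape)                                       # (100,)
    ```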

  13. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well-known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
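
    A one-dimensional caricature of the Polynomial Chaos step: expand the model output in Hermite polynomials of a standard-normal input, fit the coefficients to a sparse set of runs, and then propagate uncertainty through the cheap surrogate instead of the expensive model. The "model" below is a toy function, not CLM.

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    # Polynomial Chaos surrogate in one standard-normal variable xi: fit Hermite
    # coefficients by least squares to a handful of (expensive) model runs, then
    # evaluate the surrogate on many samples for cheap uncertainty propagation.
    rng = np.random.default_rng(4)
    xi_train = rng.standard_normal(40)                    # sparse training runs
    y_train = np.sin(0.8 * xi_train) + 0.1 * xi_train**2  # stand-in model output

    degree = 5
    Psi = hermevander(xi_train, degree)                   # (40, degree+1) basis
    coeffs, *_ = np.linalg.lstsq(Psi, y_train, rcond=None)

    xi_mc = rng.standard_normal(100_000)
    y_mc = hermevander(xi_mc, degree) @ coeffs            # surrogate evaluations
    print(f"output mean ~ {y_mc.mean():.3f}, std ~ {y_mc.std():.3f}")
    ```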

  14. Confronting uncertainty in peatland ecohydrology

    NASA Astrophysics Data System (ADS)

    Morris, P. J.; Waddington, J. M.; Baird, A. J.; Belyea, L. R.

    2011-12-01

    Background and Rationale: Peatlands are heavily water-controlled systems; long-term peat accumulation relies on slow organic matter decay in cool, saturated soil conditions. This interdependence of ecological, hydrological and biogeochemical processes makes peatlands prime examples of ecohydrological systems. Peatland ecohydrology exhibits a number of facets of complexity in the form of multiple mutual interdependencies between physical and biological processes and structures. Uncertainty as to the underlying mechanisms that control complex systems arises from a wide variety of sources; in this paper we explore three types of uncertainty in reference to peatland ecohydrology. 1) Parameterization. Analysis of complex systems such as peatlands lends itself naturally to a simulation modelling approach. An obvious source of uncertainty under a modelling approach is that of parameterization. A central theme in modelling studies is often that of sensitivity analysis: parameters to which model behavior is sensitive must be understood with high fidelity; in less sensitive areas of a model a greater level of uncertainty may be tolerated. Using a simple peatland water-budget model we demonstrate the importance of separating uncertainty from sensitivity. Using a Monte Carlo approach to analyze the model's behavior, we identify those parameters that are both uncertain and to which the model's behavior is sensitive, and which therefore exhibit the most pressing need for further research. 2) Model structure. A more subtle form of uncertainty surrounds the assumed algorithmic structure of a model. We analyze the behavior of a simple ecohydrological model of long-term peatland development. By sequentially switching different feedbacks on and off, we demonstrate that the level of complexity represented in the model is of central importance to the model's behavior, distinct from parameterization. 3) Spatial heterogeneity. We examine the role of horizontal spatial heterogeneity by extending the 1-D model used in section (2) to include a horizontal dimension. The spatially explicit model simulates the growth of a domed bog over 5,000 years using the same equations, algorithmic structures and parameter values as the one-dimensional model. However, the behavior of the two models' shared state variables (peat thickness, central water-table depth) differs substantially. The inclusion of spatial heterogeneity therefore not only leads to the prediction of spatial structures that simply cannot be represented in 1-D models, but also exerts an independent effect on state variables. This finding adds weight to the argument that spatial interactions play a non-trivial role in governing the behavior of ecohydrological systems, and that failure to take account of spatial heterogeneity may fundamentally undermine models of ecohydrological systems. Synthesis: We demonstrate how exploring and confronting sources of uncertainty in peatland ecohydrology can reduce the complexity of these and other systems and clearly identify the most urgent priorities for future observational research.
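
    The parameterization point, that research priority is set by the product of uncertainty and sensitivity, can be illustrated with a toy Monte Carlo analysis. The water-budget function and priors below are hypothetical stand-ins, not the authors' model; the standardized regression coefficients combine each parameter's sensitivity with its prior spread.

    ```python
    # Minimal sketch of separating "uncertain" from "sensitive" parameters in
    # a Monte Carlo analysis: a parameter matters most when the model is
    # sensitive to it AND its value is poorly known. All values hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 5000

    # Hypothetical priors (mean, sd): 'Ksat' is very uncertain; 'porosity' is not.
    priors = {"precip": (1.10, 0.05), "evapot": (0.55, 0.05),
              "Ksat":   (0.30, 0.15), "porosity": (0.45, 0.02)}
    names = list(priors)
    X = np.column_stack([rng.normal(m, s, n) for m, s in priors.values()])

    def water_table_depth(p, e, k, phi):
        """Toy annual water-budget response (illustrative, not a peat model)."""
        return (e - p) / phi + 2.0 * k

    y = water_table_depth(*X.T)

    # Standardized regression coefficients: sensitivity times prior spread.
    Xc = (X - X.mean(0)) / X.std(0)
    beta, *_ = np.linalg.lstsq(Xc, (y - y.mean()) / y.std(), rcond=None)
    for name, b in sorted(zip(names, beta), key=lambda t: -abs(t[1])):
        print(f"{name:9s} standardized effect = {b:+.2f}")
    ```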

  15. Optimal uncertainty quantification with model uncertainty and legacy data

    NASA Astrophysics Data System (ADS)

    Kamga, P.-H. T.; Li, B.; McKerns, M.; Nguyen, L. H.; Ortiz, M.; Owhadi, H.; Sullivan, T. J.

    2014-12-01

    We present an optimal uncertainty quantification (OUQ) protocol for systems that are characterized by an existing physics-based model and for which only legacy data is available, i.e., no additional experimental testing of the system is possible. Specifically, the OUQ strategy developed in this work consists of using the legacy data to establish, in a probabilistic sense, the level of error of the model, or modeling error, and to subsequently use the validated model as a basis for the determination of probabilities of outcomes. The quantification of modeling uncertainty specifically establishes, to a specified confidence, the probability that the actual response of the system lies within a certain distance of the model. Once the extent of model uncertainty has been established in this manner, the model can be conveniently used to stand in for the actual or empirical response of the system in order to compute probabilities of outcomes. To this end, we resort to the OUQ reduction theorem of Owhadi et al. (2013) in order to reduce the computation of optimal upper and lower bounds on probabilities of outcomes to a finite-dimensional optimization problem. We illustrate the resulting UQ protocol by means of an application concerned with the response to hypervelocity impact of 6061-T6 aluminum plates by nylon 6/6 impactors at impact velocities in the range of 5-7 km/s. The hypervelocity impact application demonstrates the remarkable ability of the legacy OUQ protocol to process diverse information on the system and to supply rigorous bounds on system performance under realistic, less-than-ideal scenarios.
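
    The reduction theorem can be illustrated on a toy problem: the worst-case probability of an outcome over all distributions consistent with a single moment constraint is attained by a discrete measure with very few atoms, so a brute-force search over two-point measures recovers the sharp (Markov) bound. Everything below is a schematic illustration of the reduction idea, not the paper's impact application.

    ```python
    # Toy OUQ reduction: maximize P(X >= t) over ALL distributions on [0,1]
    # with E[X] = m. The optimum is attained by a two-point measure, so the
    # infinite-dimensional problem collapses to a small search.
    import numpy as np

    m, t = 0.3, 0.8                          # E[X] = m; event {X >= t}
    xs = np.linspace(0.0, 1.0, 401)
    x1, x2 = np.meshgrid(xs, xs, indexing="ij")
    with np.errstate(divide="ignore", invalid="ignore"):
        w = (m - x2) / (x1 - x2)             # weight enforcing the mean
    valid = (w >= 0) & (w <= 1) & np.isfinite(w)
    prob = np.where(valid, w * (x1 >= t) + (1 - w) * (x2 >= t), -np.inf)
    print("numerical upper bound on P(X >= t):", prob.max())   # -> 0.375
    print("analytic Markov bound m/t:        ", m / t)
    ```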

  16. Dosimetric Uncertainties: Magnetic Field Coupling to Peripheral Nerve.

    PubMed

    Kavet, Robert

    2015-12-01

    The International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the Institute of Electrical and Electronics Engineers (IEEE) have established magnetic field exposure limits for the general public between 400 Hz (ICNIRP)/759 Hz (IEEE) and 100 kHz to protect against adverse effects associated with peripheral nerve stimulation (PNS). Despite apparent common purpose and similarly stated principles, the two sets of limits diverge between 3.35 and 100 kHz by a factor of about 7.7 with respect to PNS. To address the basis for this difference and the more general issue of dosimetric uncertainty, this paper combines experimental data on PNS thresholds derived from human subjects exposed to magnetic fields with published estimates of induced in situ electric field PNS thresholds, in order to evaluate dosimetric relationships of external magnetic fields to induced fields at the threshold of PNS and the uncertainties inherent in such relationships. The analyses indicate that the logarithmic range of magnetic field thresholds constrains the bounds of uncertainty of in situ electric field PNS thresholds and of coupling coefficients related to the peripheral nerve (the coupling coefficients define the dosimetric relationship of external field to induced electric field). The general public magnetic field exposure limit adopted by ICNIRP uses a coupling coefficient that falls above the bounds of dosimetric uncertainty, while IEEE's is within the bounds of uncertainty, toward the lower end of the distribution. The analyses illustrate that dosimetric estimates can be derived without reliance on computational dosimetry and the associated values of tissue conductivity. With the limits now in place, investigative efforts would be required if a field measurement were to exceed ICNIRP's magnetic field limit (the reference level), even when there is a virtual certainty that the dose limit (the basic restriction) has not been exceeded. The constraints on the range of coupling coefficients described in this paper could facilitate a re-evaluation of ICNIRP and IEEE dose and exposure limits and possibly lead toward harmonization. PMID:26509623
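
    For a rough sense of what a coupling coefficient is, the textbook circular-loop (Faraday) model gives an induced electric field E = pi * f * r * B for a uniform sinusoidal field. The sketch below uses this elementary model with assumed values for radius, frequency and field; it is not the paper's dosimetry, which is derived from human threshold data.

    ```python
    # Back-of-envelope coupling estimate from the textbook Faraday-loop model:
    # a uniform sinusoidal field B at frequency f induces E = pi * f * r * B
    # along a tissue loop of radius r. All values below are illustrative.
    import math

    def induced_e_field(b_tesla, f_hz, r_m):
        """Peak induced electric field (V/m) in a circular loop of radius r."""
        return math.pi * f_hz * r_m * b_tesla

    f = 20e3          # 20 kHz, inside the 3.35-100 kHz band discussed
    r = 0.07          # ~7 cm loop radius (assumed)
    b = 100e-6        # 100 microtesla external field (illustrative)
    print(f"coupling coefficient: {math.pi * f * r:.0f} (V/m) per tesla")
    print(f"induced E at B = 100 uT: {induced_e_field(b, f, r) * 1e3:.0f} mV/m")
    ```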

  17. Scientific basis for the Precautionary Principle

    SciTech Connect

    Vineis, Paolo. E-mail: p.vineis@imperial.ac.uk

    2005-09-01

    The Precautionary Principle is based on two general criteria: (a) appropriate public action should be taken in response to limited, but plausible and credible, evidence of likely and substantial harm; (b) the burden of proof is shifted from demonstrating the presence of risk to demonstrating the absence of risk. Not much has been written about the scientific basis of the precautionary principle, apart from the uncertainty that characterizes epidemiologic research on chronic disease, and the use of surrogate evidence when human evidence cannot be provided. It is proposed in this paper that a new scientific paradigm, based on the theory of evolution, is emerging; this might offer stronger support to the need for precaution in the regulation of environmental risks. Environmental hazards do not consist only of direct attacks on the integrity of DNA or other macromolecules. They can consist of changes that take place already in utero and that condition disease risks many years later. Also, environmental exposures can act as 'stressors', inducing hypermutability (the mutator phenotype) as an adaptive response. Finally, environmental changes should be evaluated against the background of a not-so-easily modifiable genetic make-up, inherited from a period in which humans were mainly hunter-gatherers and had dietary habits very different from current ones.

  18. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information, and the approach taken to quantify uncertainty in metrology, are addressed. The paper then discusses some of the controversial aspects of the statistical foundations of the concept of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement, as well as of characteristic limits according to ISO 11929, are described, and the need for a revision of the latter standard is explained. PMID:26688360
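
    As a concrete illustration of the ISO 11929 characteristic limits mentioned above, the sketch below computes a decision threshold and detection limit for a simple gross-minus-background counting measurement. The count times, background rate and coverage factors are illustrative assumptions.

    ```python
    # Minimal sketch of ISO 11929-style characteristic limits for a simple
    # counting measurement (gross sample count vs separate background count).
    import math

    k = 1.645                      # k_(1-alpha) = k_(1-beta), alpha = beta = 5 %
    t_g, t_b = 3600.0, 3600.0      # gross and background counting times (s)
    r_b = 0.05                     # background count rate (1/s), illustrative

    # Decision threshold y*: a measured net rate above it asserts a true
    # signal. Under the null (no net signal), u^2(0) = r_b/t_g + r_b/t_b.
    y_star = k * math.sqrt(r_b / t_g + r_b / t_b)

    # Detection limit y#: smallest true net rate detected with power 1 - beta;
    # fixed point of y# = y* + k * u(y#), with u^2(y) = (y + r_b)/t_g + r_b/t_b.
    y_hash = y_star
    for _ in range(50):
        y_hash = y_star + k * math.sqrt((y_hash + r_b) / t_g + r_b / t_b)

    print(f"decision threshold y* = {y_star * 1000:.2f} counts/ks")
    print(f"detection limit   y# = {y_hash * 1000:.2f} counts/ks")
    ```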

  19. Shannon Revisited: Information in Terms of Uncertainty.

    ERIC Educational Resources Information Center

    Cole, Charles

    1993-01-01

    Discusses the meaning of information in terms of Shannon's mathematical theory of communication and the concept of uncertainty. The uncertainty associated with the transmission of the signal is argued to have more significance for information science than the uncertainty associated with the selection of a message from a set of possible messages.
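
    Shannon's measure ties the two uncertainties together: the entropy of the message ensemble is the average uncertainty resolved by selecting one message. A two-distribution illustration:

    ```python
    # Shannon entropy H = -sum p_i log2 p_i: the average uncertainty removed
    # when one message is selected and transmitted without error.
    import math

    def entropy(probs):
        """Shannon entropy in bits of a discrete distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    uniform = [0.25] * 4            # four equally likely messages
    skewed = [0.7, 0.1, 0.1, 0.1]   # same alphabet, less uncertainty
    print(f"H(uniform) = {entropy(uniform):.2f} bits")  # 2.00
    print(f"H(skewed)  = {entropy(skewed):.2f} bits")   # 1.36
    ```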

  20. Regarding Uncertainty in Teachers and Teaching

    ERIC Educational Resources Information Center

    Helsing, Deborah

    2007-01-01

    The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective…

  1. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zo E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  2. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  4. Sticking to its principles.

    PubMed

    1992-03-27

    Planned Parenthood says that rather than accept the Bush administration's gag rule it will give up federal funding of its operations. The gag rule forbids professionals at birth control clinics from even referring to abortion as an option to a pregnant woman, much less recommending one. President Bush has agreed to a policy which allows physicians, but no one else at clinics, to discuss abortion in at least some cases. According to White House officials, this was an admitted attempt to straddle the issue. Why he would want to straddle is understandable. The right wing of his party, which has always been suspicious of Mr. Bush, is pushing him to uphold what it regards as the Reagan legacy on this issue. The original gag rule, which prevented even physicians from discussing abortion as an option in almost all cases, was issued in the last president's second term and upheld last year by the Supreme Court. Give Planned Parenthood credit for sticking to its principles. A lot of recipients of all sorts of federal funds want it both ways: take the money but not accept federal policy guidelines. When they find they can't, many "rise above principle," take the money and adjust policy accordingly. It is not going to be easy for Planned Parenthood now. Federal funds account for a significant portion of the organization's budget. Planned Parenthood of Maryland, for example, gets about $500,000 a year from the federal government, or about 12-13% of its total budget. It will either have to cut back on its services, increase its fundraising from other sources, or charge women more for services--or all of those things. This is not the end of the story. It is certainly not the end of the political story. Pat Buchanan said of the new regulations, "I like the old position, to be quite candid." Thank goodness he never won a primary. George Bush would not have moved even as far as he did on the gag rule. There will be a lot of agreement with the Buchanan view at the Republican national convention. We can only hope that by then the president will be looking to the general election campaign and a Democratic opponent who will be appealing to Republican women on this issue. Perhaps then he will relax the gag order a little more. PMID:12317218

  5. Uncertainty of absolute gravity measurements

    NASA Astrophysics Data System (ADS)

    van Camp, Michel; Williams, Simon D. P.; Francis, Olivier

    2005-05-01

    A total of 96 absolute gravity (AG) measurements at the Membach station and 221 at the Proudman Oceanographic Laboratory (POL) are analyzed for noise content. The lengths of the series were around 10 years (POL) and 8 years (Membach). First, the noise at frequencies lower than 1 cpd is studied. This noise consists of setup-dependent offsets and colored geophysical sources. The setup white noise is estimated using continuous relative superconducting gravity (SG) measurements at Membach. The colored environmental noise affecting both AG and SG is estimated using the maximum likelihood estimation technique to fit two types of stochastic model to the SG time series: power-law noise and first-order Gauss-Markov (FOGM) noise. We estimate the noise amplitudes of a white noise plus power-law model, simultaneously solving for the spectral index; the noise amplitudes of a white noise plus FOGM model are also estimated. The gravity rate of change and the associated uncertainties as a function of the noise structure are then computed. At frequencies higher than 1 cpd, a time-varying white noise component usually dominates AG time series. Finally, the POL and Membach experiments are used to estimate the uncertainties for AG campaigns repeated once or twice a year to monitor crustal deformation. Such repeated AG measurements should allow one to constrain the gravity rate of change with an uncertainty of 1 nm s-2 yr-1 (or 0.5 mm yr-1) after 14 or 24 years, depending on the noise model. Therefore, long-term measurements using absolute gravimeters are appropriate for monitoring slow vertical tectonic deformation.

  6. Error models for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Josset, L.; Scheidt, C.; Lunati, I.

    2012-12-01

    In groundwater modeling, uncertainty on the permeability field leads to a stochastic description of the aquifer system, in which the quantities of interest (e.g., groundwater fluxes or contaminant concentrations) are considered as stochastic variables and described by their probability density functions (PDF) or by a finite number of quantiles. Uncertainty quantification is often evaluated using Monte Carlo simulations, which employ a large number of realizations. As this leads to prohibitive computational costs, techniques have to be developed to keep the problem computationally tractable. The Distance-based Kernel Method (DKM) [1] limits the computational cost of the uncertainty quantification by reducing the stochastic space: first, the realizations are clustered based on the response of a proxy; then, the full model is solved only for a subset of realizations defined by the clustering, and the quantiles are estimated from this limited number of realizations. Here, we present a slightly different strategy that employs an approximate model rather than a proxy: we use the Multiscale Finite Volume method (MsFV) [2,3] to compute an approximate solution for each realization, and to obtain a first assessment of the PDF. In this context, DKM is then used to identify a subset of realizations for which the exact model is solved and compared with the solution of the approximate model. This allows highlighting and correcting possible errors introduced by the approximate model, while keeping full statistical information on the ensemble of realizations. Here, we test several strategies to compute the model error, correct the approximate model, and achieve an optimal PDF estimation. We present a case study in which we predict the breakthrough curve of an ideal tracer for an ensemble of realizations generated via Multiple Point Direct Sampling [4] with a training image obtained from a 2D section of the Herten permeability field [5]. [1] C. Scheidt and J. Caers, "Representing spatial uncertainty using distances and kernels", Math Geosci (2009) [2] P. Jenny et al., "Multi-scale finite-volume method for elliptic problems in subsurface flow simulation", J. Comp. Phys., 187(1) (2003) [3] I. Lunati and S.H. Lee, "An operator formulation of the multiscale finite-volume method with correction function", Multiscale Model. Simul. 8(1) (2009) [4] G. Mariethoz, P. Renard, and J. Straubhaar, "The Direct Sampling method to perform multiple-point geostatistical simulations", Water Resour. Res., 46 (2010) [5] P. Bayer et al., "Three-dimensional high resolution fluvio-glacial aquifer analog", J. Hydrol. 405 (2011) 19
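
    The selection step of the DKM can be sketched compactly: cluster the realizations on a cheap response, run the expensive model only on one representative per cluster, and weight representatives by cluster size when estimating quantiles. The proxy and "exact" responses below are synthetic stand-ins for the MsFV and full flow solutions.

    ```python
    # Schematic of distance-based selection: cluster realizations on a cheap
    # proxy response, run the expensive model only on cluster representatives.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    n_real, n_clusters = 500, 10

    # Hypothetical scalar feature per realization (e.g., proxy breakthrough time).
    proxy = rng.lognormal(mean=0.0, sigma=0.4, size=(n_real, 1))

    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(proxy)
    # Representative = realization closest to each cluster centroid.
    reps = [int(np.argmin(np.abs(proxy[:, 0] - c)))
            for c in km.cluster_centers_[:, 0]]

    # "Exact" model run only on representatives (synthetic response here),
    # each weighted by its cluster size when estimating quantiles.
    exact = 1.2 * proxy[reps, 0] + rng.normal(0, 0.05, n_clusters)
    weights = np.bincount(km.labels_, minlength=n_clusters) / n_real
    order = np.argsort(exact)
    cdf = np.cumsum(weights[order])
    p10 = exact[order][np.searchsorted(cdf, 0.10)]
    p90 = exact[order][np.searchsorted(cdf, 0.90)]
    print(f"P10 = {p10:.2f}, P90 = {p90:.2f} from only {n_clusters} full runs")
    ```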

  7. Use of Combined Uncertainty of Pesticide Residue Results for Testing Compliance with Maximum Residue Limits (MRLs).

    PubMed

    Farkas, Zsuzsa; Slate, Andrew; Whitaker, Thomas B; Suszter, Gabriella; Ambrus, Árpád

    2015-05-13

    The uncertainty of pesticide residue levels in crops due to sampling, estimated for 106 individual crops and 24 crop groups from residue data obtained from supervised trials, was adjusted with a factor of 1.3 to accommodate the larger variability of residues under normal field conditions. Further adjustment may be necessary in the case of mixed lots. The combined uncertainty of residue data, including the contribution of sampling, is used for calculation of an action limit, which should not be exceeded when compliance with maximum residue limits is certified as part of premarketing self-control programs. Conversely, for testing the compliance of marketed commodities, the residues measured in composite samples should be greater than or equal to the decision limit, calculated only from the combined uncertainty of the laboratory phase of the residue determination, before noncompliance is declared. Options for minimizing the combined uncertainty of measured residues are discussed. The principles described are also applicable to other chemical contaminants. PMID:25658668
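
    One simple way to construct the two limits described above is sketched below; the MRL, coverage factor and coefficients of variation are illustrative assumptions, not values from the paper.

    ```python
    # Sketch of the two compliance limits (all numbers illustrative): the
    # premarket action limit subtracts the combined (sampling + analysis)
    # expanded uncertainty from the MRL; the enforcement decision limit adds
    # only the laboratory component.
    import math

    MRL = 0.5                 # mg/kg, maximum residue limit (illustrative)
    k = 2.0                   # coverage factor, ~95 % confidence

    cv_sampling = 0.28 * 1.3  # supervised-trial sampling CV, field-adjusted by 1.3
    cv_lab = 0.20             # laboratory (analysis) CV, illustrative
    cv_combined = math.hypot(cv_sampling, cv_lab)

    action_limit = MRL * (1 - k * cv_combined)    # premarket self-control
    decision_limit = MRL * (1 + k * cv_lab)       # enforcement of marketed lots

    print(f"combined CV = {cv_combined:.2f}")
    print(f"action limit   = {action_limit:.2f} mg/kg (certify if residue below)")
    print(f"decision limit = {decision_limit:.2f} mg/kg (violation if at or above)")
    ```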

  8. [The principles of homeopathy].

    PubMed

    Hjelvik, M; Mrenskog, E

    1997-06-30

    Homeopathy is a gentle but effective form of treatment which stimulates the natural ability of the organism to heal itself. The word homoeopathy comes from the Greek words "homoios", which means similar, and "pathos", which means disease. This reflects the main principle of homoeopathy, the law of similars, which predicts that a disease can be cured by a medicine which, in healthy people, is able to produce a condition that resembles the disease. The law of similars is probably a basic law of nature. Therefore it is not surprising that examples can also be found in orthodox medicine, where the mode of functioning of some medicines can probably be ascribed to the law of similars. Homoeopathic medicines are likely to work through the body's own curative powers in a way that is best explained by comparison with vaccination. Both the homoeopathic medicine and the vaccine constitute a mild stimulus that causes mobilisation of the body's defence mechanisms and thus an increased ability to oppose a pathogenic influence. The homoeopathic medicine does not work at the molecular level, but probably through non-materialistic qualities (possibly electromagnetic in nature) in the organism, which are so sensitive that even a mild stimulus is enough to cause a reaction. This means that homoeopathic preparations can still have an effect even when diluted beyond Avogadro's number. PMID:9265314

  9. Principles of Bioremediation Assessment

    NASA Astrophysics Data System (ADS)

    Madsen, E. L.

    2001-12-01

    Although microorganisms have successfully and spontaneously maintained the biosphere since its inception, industrialized societies now produce undesirable chemical compounds at rates that outpace naturally occurring microbial detoxification processes. This presentation provides an overview of both the complexities of contaminated sites and the methodological limitations in environmental microbiology that impede the documentation of biodegradation processes in the field. An essential step toward attaining reliable bioremediation technologies is the development of criteria which prove that microorganisms in contaminated field sites are truly active in metabolizing the contaminants of interest. These criteria, which rely upon genetic, biochemical, physiological, and ecological principles and apply to both in situ and ex situ bioremediation strategies, include: (i) internal conservative tracers; (ii) added conservative tracers; (iii) added radioactive tracers; (iv) added isotopic tracers; (v) stable isotopic fractionation patterns; (vi) detection of intermediary metabolites; (vii) replicated field plots; (viii) microbial metabolic adaptation; (ix) molecular biological indicators; (x) gradients of coreactants and/or products; (xi) in situ rates of respiration; (xii) mass balances of contaminants, coreactants, and products; and (xiii) computer modeling that incorporates transport and reactive stoichiometries of electron donors and acceptors. The ideal goal is achieving a quantitative understanding of the geochemistry, hydrogeology, and physiology of complex real-world systems.

  10. Cryogenic Equivalence Principle Experiment

    NASA Technical Reports Server (NTRS)

    Everitt, C. W. F.; Worden, P. W.

    1985-01-01

    The purpose of this project is to test the equivalence of inertial and passive gravitational mass in an Earth-orbiting satellite. A ground-based experiment is now well developed. It consists of comparing the motions of two cylindrical test masses suspended in precision superconducting magnetic bearings and free to move along the horizontal (axis) direction. The masses are made of niobium and lead-plated aluminum. A position detector based on a SQUID magnetometer measures the differential motion between the masses. The periods of the masses are matched by adjustment of the position detector until the system is insensitive to common-mode signals, which also makes the experiment less sensitive to seismic vibration. The apparatus is contained in a twelve-inch helium dewar suspended in a vibration isolation stand. The stand achieves 30 dB isolation from horizontal motions between 0.1 and 60 Hz by simulating the motion of a 200-meter-long pendulum with an air bearing. With this attenuation of seismic noise and a common-mode rejection ratio of 10 to the 5th power in the differential mode, the ground-based apparatus should have a sensitivity to equivalence principle violations of one part in 10 to the 13th power; the satellite version might have a sensitivity of one part in 10 to the 17th power.

  11. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist, engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  12. Validation of an Experimentally Derived Uncertainty Model

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Cox, D. E.; Balas, G. J.; Juang, J.-N.

    1996-01-01

    The results show that uncertainty models can be obtained directly from system identification data by using a minimum norm model validation approach. The error between the test data and an analytical nominal model is modeled as a combination of unstructured additive and structured input multiplicative uncertainty. Robust controllers which use the experimentally derived uncertainty model show significant stability and performance improvements over controllers designed with assumed ad hoc uncertainty levels. Use of the identified uncertainty model also allowed a strong correlation between design predictions and experimental results.

  13. Visualization of Information Uncertainty: Progress and Challenges

    NASA Astrophysics Data System (ADS)

    Pham, Binh; Streit, Alex; Brown, Ross

    Information uncertainty, which is inherent in many real-world applications, brings more complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2) to review and assess current progress; and (3) to discuss remaining research challenges. We focus on four areas: information uncertainty modelling; visualisation techniques; management of information uncertainty modelling, propagation and visualisation; and the uptake of uncertainty visualisation in application domains.

  14. Toward an uncertainty budget for measuring nanoparticles by AFM

    NASA Astrophysics Data System (ADS)

    Delvallée, A.; Feltin, N.; Ducourtieux, S.; Trabelsi, M.; Hochepied, J. F.

    2016-02-01

    This article reports on the evaluation of an uncertainty budget associated with the measurement of the mean diameter of a nanoparticle (NP) population by Atomic Force Microscopy. The measurement principle consists of measuring the height of a spherical-like NP population to determine the mean diameter and the size distribution. This method assumes that the NPs are well dispersed on the substrate and isolated enough to avoid measurement errors due to agglomeration. Since the measurement is directly impacted by the substrate roughness, the NPs have been deposited on a mica sheet presenting a very low roughness. A complete metrological characterization of the instrument has been carried out and the main error sources have been evaluated. The measuring method has been tested on a population of SiO2 NPs. Homemade software has been used to build the height distribution histogram, taking into account only isolated NPs. Finally, the uncertainty budget including the main components has been established for the mean diameter measurement of this NP population. The most important components of this uncertainty budget are the calibration process along the Z-axis, the influence of scanning speed, and the vertical noise level.
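
    The final combination step of such a budget is a standard quadrature sum of the component standard uncertainties. The component values below are illustrative placeholders, not the paper's numbers.

    ```python
    # Condensed sketch of combining budget components in quadrature
    # (GUM-style) for the mean NP diameter. All values illustrative.
    import math

    mean_diameter_nm = 60.0
    components_nm = {                       # standard uncertainties, u_i
        "Z-axis calibration":     0.9,
        "vertical noise":         0.4,
        "substrate roughness":    0.3,
        "scan speed":             0.5,
        "statistics (s/sqrt(N))": 0.6,
    }
    u_c = math.sqrt(sum(u * u for u in components_nm.values()))
    U = 2.0 * u_c                           # expanded uncertainty, k = 2
    print(f"combined standard uncertainty u_c = {u_c:.2f} nm")
    print(f"result: D = {mean_diameter_nm:.1f} nm +/- {U:.1f} nm (k = 2)")
    ```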

  15. Forest management under uncertainty for multiple bird population objectives

    USGS Publications Warehouse

    Moore, C.T.; Plummer, W.T.; Conroy, M.J.

    2005-01-01

    We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest-interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
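
    The belief-updating step at the core of this adaptive approach is a direct application of Bayes' rule over the alternative models. The model predictions and the monitored observation below are hypothetical, chosen only to show how one cycle of monitoring shifts the weights.

    ```python
    # Minimal sketch of updating belief in alternative system models from one
    # cycle of monitoring data (all numbers hypothetical).
    import numpy as np
    from scipy.stats import norm

    models = ["strong trade-off", "weak trade-off", "no trade-off"]
    prior = np.array([1 / 3, 1 / 3, 1 / 3])

    # Each model's prediction of the monitored response (mean, sd), and the
    # value actually observed by the monitoring program.
    predictions = [(2.0, 1.5), (5.0, 1.5), (8.0, 1.5)]
    observed = 6.1

    likelihood = np.array([norm.pdf(observed, m, s) for m, s in predictions])
    posterior = prior * likelihood / (prior * likelihood).sum()
    for name, w in zip(models, posterior):
        print(f"P({name}) = {w:.2f}")
    ```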

  16. Performance of Trajectory Models with Wind Uncertainty

    NASA Technical Reports Server (NTRS)

    Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

    2009-01-01

    Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread among the various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
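
    The essence of the time-lagged ensemble idea fits in a few lines: forecasts launched at different times but valid at the same moment are treated as an ensemble whose spread estimates the wind uncertainty, which then propagates into position uncertainty. The wind values and the 30-minute segment below are synthetic, not RUC output.

    ```python
    # Schematic of the time-lagged ensemble: treat recent forecasts valid at
    # the same time as an ensemble; use their spread as the wind uncertainty.
    import numpy as np

    # u-wind (m/s) at one location/valid time from forecasts launched 1..6 h ago.
    lagged_u = np.array([23.1, 24.7, 22.5, 25.2, 23.8, 21.9])

    u_best = lagged_u[0]                 # most recent forecast as best estimate
    u_spread = lagged_u.std(ddof=1)      # ensemble spread ~ forecast uncertainty

    # Translate wind uncertainty into along-track position uncertainty after a
    # 30-minute dead-reckoning segment (simple linear propagation).
    dt = 1800.0                          # s
    print(f"wind estimate {u_best:.1f} m/s, sigma {u_spread:.1f} m/s")
    print(f"30-min along-track position sigma ~ {u_spread * dt / 1000:.1f} km")
    ```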

  17. Uncertainty Quantification and Transdimensional Inversion

    NASA Astrophysics Data System (ADS)

    Sambridge, M.; Hawkins, R.

    2014-12-01

    Over recent years, transdimensional inference methods have grown in popularity and found applications in fields ranging from solid Earth geophysics to geochemistry. In all applications of inversion, assumptions are made about the nature of the model parametrisation, complexity and data noise characteristics, and results can depend significantly on those assumptions. Often these are in the form of fixed choices imposed a priori, e.g., the grid size of the model or the noise level in the data. A transdimensional approach allows these assumptions to be relaxed by incorporating the relevant parameters as unknowns in the inference problem: the number of model parameters becomes a variable, as do the form of the basis functions and the variance of the data noise. In this way, uncertainty due to parameterisation effects or data noise choices may be incorporated into the inference process. Probabilistic sampling techniques, such as birth-death Markov chain Monte Carlo and the reversible jump algorithm, allow sampling over complex posterior probability density functions, providing information on constraint, trade-offs and uncertainty in the unknowns. This talk will present a review of transdimensional inference and its application in geophysical inversion, and highlight some emerging trends such as multi-scale McMC, parallel tempering and sequential McMC, which hold the promise of further extending the range of problems where these methods are practical.

  18. Path planning under spatial uncertainty.

    PubMed

    Wiener, Jan M; Lafon, Matthieu; Berthoz, Alain

    2008-04-01

    In this article, we present experiments studying path planning under spatial uncertainty. In the main experiment, the participants' task was to navigate the shortest possible path to find an object hidden in one of four places and to bring it to the final destination. The probability of finding the object (probability matrix) was different for each of the four places and varied between conditions. Given such uncertainty about the object's location, planning a single path is not sufficient. Participants had to generate multiple consecutive plans (metaplans)--for example: if the object is found in A, proceed to the destination; if the object is not found, proceed to B; and so on. The optimal solution depends on the specific probability matrix. In each condition, participants learned a different probability matrix and were then asked to report the optimal metaplan. Results demonstrate effective integration of the probabilistic information about the object's location during planning. We present a hierarchical planning scheme that could account for participants' behavior, as well as for systematic errors and differences between conditions. PMID:18491490
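
    With four candidate locations, the planning problem is small enough to solve by enumeration, which makes the optimal metaplan easy to exhibit. The sketch below scores every search order by its expected travel cost; the cost matrix and probability matrix are invented for illustration.

    ```python
    # Toy evaluation of "metaplans" under location uncertainty: for each
    # visiting order of four possible hiding places, compute the expected
    # travel cost, stopping as soon as the object is found.
    import itertools
    import numpy as np

    places = [0, 1, 2, 3]                      # candidate hiding places
    p = np.array([0.5, 0.3, 0.15, 0.05])       # probability object is at each

    # Symmetric travel costs between start S(=4), places 0-3, destination D(=5).
    rng = np.random.default_rng(7)
    cost = rng.uniform(1, 10, (6, 6))
    cost = (cost + cost.T) / 2
    np.fill_diagonal(cost, 0.0)
    START, DEST = 4, 5

    def expected_cost(order):
        total, here, travelled = 0.0, START, 0.0
        for place in order:
            travelled += cost[here, place]
            here = place
            # If found here, we go straight to the destination.
            total += p[place] * (travelled + cost[here, DEST])
        return total                           # p sums to 1 over the 4 places

    best = min(itertools.permutations(places), key=expected_cost)
    print("optimal metaplan: search order", best,
          f"with expected cost {expected_cost(best):.2f}")
    ```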

  19. The inconstant "principle of constancy".

    PubMed

    Kanzer, M

    1983-01-01

    A review of the principle of constancy, as it appeared in Freud's writings, shows that it was inspired by his clinical observations, first with Breuer in the field of cathartic therapy and then through experiences in the early usage of psychoanalysis. The recognition that memories repressed in the unconscious created increasing tension, and that this was relieved with dischargelike phenomena when the unconscious was made conscious, was the basis for his claim to originality in this area. The two principles of "neuronic inertia" Freud expounded in the Project (1895) are found to offer the key to the ambiguous definition of the principle of constancy he was to offer in later years. The "original" principle, which sought the complete discharge of energy (or elimination of stimuli), became the forerunner of the death drive; the "extended" principle achieved balances that were relatively constant, but succumbed in the end to complete discharge. This was the predecessor of the life drives. The relation between the constancy and pleasure-unpleasure principles was maintained for twenty-five years largely on an empirical basis which invoked the concept of psychophysical parallelism between "quantity" and "quality." As the links between the two principles were weakened by clinical experiences attendant upon the growth of ego psychology, a revision of the principle of constancy was suggested, and it was renamed the Nirvana principle. Actually it was shifted from alignment with the "extended" principle of inertia to the original, so that "constancy" was incongruously identified with self-extinction. The former basis for the constancy principle, the extended principle of inertia, became identified with Eros. Only a few commentators seem aware of this radical transformation, which has been overlooked in the Standard Edition of Freud's writings. Physiological biases in the history and conception of the principle of constancy are noted in the Standard Edition. The historical antecedents of the principle of constancy, especially in relation to the teachings and influence of J. F. Herbart (1776-1841), do much to bridge the gap between psychological and neurophysiological aspects of Freud's ideas about constancy and its associated doctrine, psychic determinism. Freud's later teachings about the Nirvana principle and Eros suggest a continuum of "constancies" embodied in the structural and functional development of the mental apparatus as it evolves from primal unity with the environment (e.g., the mother-child unit) and differentiates in patterns that organize the inner and outer worlds in relation to each other. PMID:6681436

  20. Physical Principles of Evolution

    NASA Astrophysics Data System (ADS)

    Schuster, Peter

    Theoretical biology is incomplete without a comprehensive theory of evolution, since evolution is at the core of biological thought. Evolution is visualized as a migration process in genotype or sequence space that is either an adaptive walk driven by some fitness gradient or a random walk in the absence of (sufficiently large) fitness differences. The Darwinian concept of natural selection consisting in the interplay of variation and selection is based on a dichotomy: All variations occur on genotypes whereas selection operates on phenotypes, and relations between genotypes and phenotypes, as encapsulated in a mapping from genotype space into phenotype space, are central to an understanding of evolution. Fitness is conceived as a function of the phenotype, represented by a second mapping from phenotype space into nonnegative real numbers. In the biology of organisms, genotype-phenotype maps are enormously complex and relevant information on them is exceedingly scarce. The situation is better in the case of viruses but so far only one example of a genotype-phenotype map, the mapping of RNA sequences into RNA secondary structures, has been investigated in sufficient detail. It provides direct information on RNA selection in vitro and test-tube evolution, and it is a basis for testing in silico evolution on a realistic fitness landscape. Most of the modeling efforts in theoretical and mathematical biology today are done by means of differential equations but stochastic effects are of undeniably great importance for evolution. Population sizes are much smaller than the numbers of genotypes constituting sequence space. Every mutant, after all, has to begin with a single copy. Evolution can be modeled by a chemical master equation, which (in principle) can be approximated by a stochastic differential equation. In addition, simulation tools are available that compute trajectories for master equations. The accessible population sizes in the range of 10^7 ≤ N ≤ 10^8 molecules are commonly too small for problems in chemistry but sufficient for biology.
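
    The trajectory-simulation tools mentioned at the end are typified by the Gillespie algorithm, which samples exact trajectories of a chemical master equation. A minimal birth-death version is sketched below, with illustrative rates rather than a full genotype-space model.

    ```python
    # Minimal Gillespie-type stochastic simulation of a birth-death master
    # equation: one replicating/dying genotype with per-individual birth
    # rate b and death rate d (parameters illustrative).
    import numpy as np

    rng = np.random.default_rng(11)
    b, d = 1.1, 1.0            # per-individual birth and death rates
    n, t, t_end = 50, 0.0, 20.0

    history = [(t, n)]
    while t < t_end and n > 0:
        rate_birth, rate_death = b * n, d * n
        total = rate_birth + rate_death
        t += rng.exponential(1.0 / total)      # waiting time to next event
        n += 1 if rng.random() < rate_birth / total else -1
        history.append((t, n))

    print(f"final population at t = {history[-1][0]:.1f}: {history[-1][1]}")
    ```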

  1. Principles of ecosystem sustainability

    SciTech Connect

    Chapin, F.S. III; Torn, M.S.; Tateno, Masaki

    1996-12-01

    Many natural ecosystems are self-sustaining, maintaining a characteristic mosaic of vegetation types for hundreds to thousands of years. In this article we present a new framework for defining the conditions that sustain natural ecosystems and apply these principles to the sustainability of managed ecosystems. A sustainable ecosystem is one that, over the normal cycle of disturbance events, maintains its characteristic diversity of major functional groups, productivity, and rates of biogeochemical cycling. These traits are determined by a set of four "interactive controls" (climate, soil resource supply, major functional groups of organisms, and disturbance regime) that both govern and respond to ecosystem processes. Ecosystems cannot be sustained unless the interactive controls oscillate within stable bounds. This occurs when negative feedbacks constrain changes in these controls. For example, negative feedbacks associated with food availability and predation often constrain changes in the population size of a species. Linkages among ecosystems in a landscape can contribute to sustainability by creating or extending the feedback network beyond a single patch. The sustainability of managed systems can be increased by maintaining interactive controls so that they form negative feedbacks within ecosystems and by using laws and regulations to create negative feedbacks between ecosystems and human activities, such as between ocean ecosystems and marine fisheries. Degraded ecosystems can be restored through practices that enhance positive feedbacks to bring the ecosystem to a state where the interactive controls are commensurate with desired ecosystem characteristics. The possible combinations of interactive controls that govern ecosystem traits are limited by the environment, constraining the extent to which ecosystems can be managed sustainably for human purposes. 111 refs., 3 figs., 2 tabs.

  2. Application of fuzzy system theory in addressing the presence of uncertainties

    SciTech Connect

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-03

    In this paper, the combination of fuzzy system theory with the finite element method is presented and discussed as a means of dealing with uncertainties. Accounting for the presence of uncertainties is necessary to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory involves a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented. Defuzzification is an important process that allows the conversion of fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
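
    The alpha-cut view of the extension principle can be sketched very compactly for a monotone response: each alpha-level interval of the fuzzy input maps to an interval of the output. The triangular fuzzy load and the response function below are made-up stand-ins for a finite element response.

    ```python
    # Compact sketch of the extension principle via alpha-cuts: a triangular
    # fuzzy input is propagated through a monotone response function by
    # mapping the interval endpoints of each alpha-level set.

    def tri_alpha_cut(a, m, b, alpha):
        """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
        return a + alpha * (m - a), b - alpha * (b - m)

    def response(load):
        """Hypothetical monotone structural response, e.g. tip deflection (mm)."""
        return 0.8 * load + 0.002 * load ** 2

    for alpha in (0.0, 0.5, 1.0):
        lo, hi = tri_alpha_cut(90.0, 100.0, 115.0, alpha)   # fuzzy load (kN)
        print(f"alpha={alpha:.1f}: deflection in "
              f"[{response(lo):.1f}, {response(hi):.1f}] mm")
    ```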

  4. Meaty Principles for Environmental Educators.

    ERIC Educational Resources Information Center

    Rockcastle, V. N.

    1985-01-01

    Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)

  5. Ideario Educativo (Principles of Education).

    ERIC Educational Resources Information Center

    Consejo Nacional Tecnico de la Educacion (Mexico).

    This document is an English-language abstract (approximately 1,500 words) which discusses an overall educational policy for Mexico based on Constitutional principles and those of humanism. The basic principles that should guide Mexican education as seen by the National Technical Council for Education are the following: (1) love of country; (2)…

  7. Multimedia Principle in Teaching Lessons

    ERIC Educational Resources Information Center

    Kari Jabbour, Khayrazad

    2012-01-01

    The multimedia learning principle holds that learning occurs when we create mental representations by combining text and relevant graphics in lessons. This article discusses the learning advantages that result from applying the multimedia learning principle to instruction, and how to select graphics that support learning. There is a balance that instructional designers…

  8. Principles of Instructed Language Learning

    ERIC Educational Resources Information Center

    Ellis, Rod

    2005-01-01

    This article represents an attempt to draw together findings from a range of second language acquisition studies in order to formulate a set of general principles for language pedagogy. These principles address such issues as the nature of second language (L2) competence (as formulaic and rule-based knowledge), the contributions of both focus on…

  10. [Dignity, founding principle of law].

    PubMed

    Mathieu, Bertrand

    2010-09-01

    The principle of dignity made a noted appearance in the legal field on the occasion of the adoption of the first texts concerning bioethics. There is in fact an obvious correlation between the need to provide a framework for certain practices and the principle of human dignity. This recognition, which can be seen in international and European law as much as in national law, is marked by certain ambiguities as to its meaning and its impact. So this principle should be subjected to a legal analysis. From this point of view, it presents three main characteristics: it is a matrix principle, it cannot be waived, and it constitutes an objective right. Today, beyond its formal recognition, the effectiveness of the principle of dignity is weakened by a tendency to give prevalence to the requirement of freedom as a subjective right. Beyond the ideological debate on this issue, it is the protection of the individual that is at stake. PMID:21456303

  11. Precautionary principle in international law.

    PubMed

    Saladin, C

    2000-01-01

    The deregulatory nature of trade rules frequently brings them into conflict with the precautionary principle. These rules dominate debate over the content and legal status of the precautionary principle at the international level. The World Trade Organization (WTO), because of its power in settling disputes, is a key player. Many States are concerned to define the precautionary principle consistent with WTO rules, which generally means defining it as simply a component of risk analysis. At the same time, many States, especially environmental and public health policymakers, see the principle as the legal basis for preserving domestic and public health measures in the face of deregulatory pressures from the WTO. The precautionary principle has begun to acquire greater content and to move into the operative articles of legally binding international agreements. It is important to continue this trend. PMID:11114120

  12. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
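
    The linear analysis described above reduces, in its simplest Bayesian form, to matrix algebra on a sensitivity (Jacobian) matrix. The two-parameter sketch below shows a posterior parameter covariance, the variance of a prediction, and a per-parameter split of that variance; all numbers are illustrative and the model is not the Yucca Mountain model itself.

    ```python
    # Bare-bones linear (first-order, second-moment) predictive uncertainty:
    # posterior parameter covariance from a Jacobian and a noise level, then
    # the prediction variance and each parameter's contribution to it.
    import numpy as np

    J = np.array([[1.0, 0.2],        # sensitivities of 3 observations
                  [0.8, 0.5],        # to 2 parameters
                  [0.1, 0.9]])
    sigma_obs = 0.1                  # observation noise (illustrative)
    C_prior = np.diag([1.0, 1.0])    # prior parameter covariance

    # Posterior (Bayesian linear) parameter covariance.
    C_post = np.linalg.inv(np.linalg.inv(C_prior) + J.T @ J / sigma_obs**2)

    y = np.array([0.7, 1.4])         # sensitivity of the prediction to parameters
    var_pred = y @ C_post @ y
    contrib = y * (C_post @ y)       # splits var_pred across the parameters
    print(f"prediction sigma = {np.sqrt(var_pred):.3f}")
    print("per-parameter contributions:", np.round(contrib, 5))
    ```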

  13. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.

  14. The Role of Uncertainty in Climate Science

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  15. An Inconvenient Principle

    NASA Astrophysics Data System (ADS)

    Bellac, Michel Le

    2014-11-01

    At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle submitted to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces in mechanics a quantity characteristic of optics, the velocity of light. However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".
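
    The Mach-Zehnder example the chapter builds on can be computed in a few lines: each 50/50 beam splitter is a 2x2 unitary, a phase shift in one arm is a diagonal matrix, and the output intensities follow from the squared amplitudes. The sketch below is the standard textbook calculation, with one conventional choice of beam-splitter matrix.

    ```python
    # Mach-Zehnder output probabilities: two 50/50 beam splitters and a
    # relative phase phi between the arms. With phi = 0 all photons exit one
    # port; the outputs oscillate as sin^2(phi/2) and cos^2(phi/2).
    import numpy as np

    BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)    # 50/50 beam splitter

    def output_probs(phi):
        phase = np.diag([np.exp(1j * phi), 1.0])      # phase shift in one arm
        out = BS @ phase @ BS @ np.array([1.0, 0.0])  # photon enters port 1
        return np.abs(out) ** 2

    for phi in (0.0, np.pi / 2, np.pi):
        p1, p2 = output_probs(phi)
        print(f"phi = {phi:4.2f}: P(port1) = {p1:.2f}, P(port2) = {p2:.2f}")
    ```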

  16. Quantum principles and free particles [evaluation of partition functions]

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The quantum principles that establish the energy levels and degeneracies needed to evaluate the partition functions are explored. The uncertainty principle is associated with the dual wave-particle nature of the model used to describe quantized gas particles. The Schroedinger wave equation is presented as a generalization of Maxwell's wave equation; the former applies to all particles while the Maxwell equation applies to the special case of photon particles. The size of the quantum cell in phase space and the representation of momentum as a space derivative operator follow from the uncertainty principle. A consequence of this is that steady-state problems that are space-time dependent for the classical model become only space dependent for the quantum model and are often easier to solve. The partition function is derived for quantized free particles and, at normal conditions, the result is the same as that given by the classical phase integral. The quantum corrections that occur at very low temperatures or high densities are derived. These corrections for the Einstein-Bose gas qualitatively describe the condensation effects that occur in liquid helium, but are unimportant for most practical purposes otherwise. However, the corrections for the Fermi-Dirac gas are important because they quantitatively describe the behavior of high-density conduction electron gases in metals and explain the zero point energy and low specific heat exhibited in this case.
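
    For reference, the classical-limit result the abstract alludes to is the standard translational partition function (textbook form, not quoted from the report),

        $$ q_{\mathrm{trans}} = V \left( \frac{2\pi m k_B T}{h^2} \right)^{3/2}, $$

    with the leading quantum correction entering through the virial expansion $PV/(N k_B T) = 1 \mp n\lambda^3/2^{5/2} + \cdots$, where $\lambda = h/\sqrt{2\pi m k_B T}$; the upper (lower) sign applies to Einstein-Bose (Fermi-Dirac) statistics.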

  17. The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes

    NASA Technical Reports Server (NTRS)

    Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

    1993-01-01

    A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of Weber-type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies to and differences from the framework of squeezed states in quantum optics are discussed.
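
    The "fundamental limit" referred to here is the standard quantum limit; for a bar detector modeled as an oscillator of mass m and angular frequency ω, repeated position measurements cannot resolve displacements much below

        $$ \Delta x_{\mathrm{SQL}} = \sqrt{\frac{\hbar}{2 m \omega}}, $$

    which is precisely the barrier that quantum non-demolition schemes are designed to circumvent (a textbook estimate, not taken from the paper).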

  18. Does a String-Particle Dualism Indicate the Uncertainty Principle's Philosophical Dichotomy?

    NASA Astrophysics Data System (ADS)

    Mc Leod, David; Mc Leod, Roger

    2007-04-01

    String theory may allow resonances of neutrino-wave-strings to account for all experimentally detected phenomena. Particle theory logically, and physically, provides an alternate, contradictory dualism. Is it contradictory to state simultaneously that λp = h but that the product of position and momentum must be greater than, or equal to, the same (scaled) Planck's constant? Our previous electron and positron models require 'membrane' vibrations of string-linked neutrinos, in closed loops, to behave like traveling waves, Tws, intermittently metamorphosing into alternately ascending and descending standing waves, Sws, between the nodes, which advance sequentially through 360 degrees. Accumulated time passages as Tws detail required "loop currents" supplying magnetic moments. Remaining time partitions into the Sws' alternately ascending and descending phases: the physical basis of the experimentally established 3D modes of these "particles." Waves seem to indicate that point mass cannot be required to exist instantaneously at one point; Mott's and Sneddon's Wave Mechanics says that a constant, [mass], is present. String-like resonances may also account for homeopathy's efficacy, dark matter, and constellations' "stick-figure projections," as indicated by some traditional cultures, all possibly involving neutrino strings. To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2007.NES07.C2.5

  19. Heisenberg's uncertainty principle for simultaneous measurement of positive-operator-valued measures

    NASA Astrophysics Data System (ADS)

    Miyadera, Takayuki; Imai, Hideki

    2008-11-01

    A limitation on the simultaneous measurement of two arbitrary positive-operator-valued measures is discussed. In general, simultaneous measurement of two noncommutative observables is only approximately possible. Following Werner's formulation, we introduce a distance between observables to quantify the accuracy of a measurement. We derive an inequality that relates the achievable accuracy to the noncommutativity between the two observables. As a byproduct, a necessary condition for two positive-operator-valued measures to be simultaneously measurable is obtained.
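
    For orientation, the sharp-observable analogue of such accuracy-noncommutativity trade-offs is the textbook Robertson relation (the paper's POVM inequality itself is not reproduced in the abstract):

        $$ \Delta A \, \Delta B \ge \tfrac{1}{2} \bigl| \langle [A, B] \rangle \bigr|. $$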

  1. Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.

    ERIC Educational Resources Information Center

    Wassermann, Selma

    2001-01-01

    Argues that reliance on the outcome of quantitative standardized tests to assess student performance is misplaced quest for certainty in an uncertain world. Reviews and lauds Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill. (PKP)

  2. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.
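
    The paper's exact definition of coupling entropy is not given in the abstract; the sketch below (Python with numpy; the function names and the order-3 patterns are illustrative assumptions) only shows the kind of ordinal-pattern machinery on which such permutation-based measures are built:

        import numpy as np
        from math import log

        def ordinal_patterns(x, m=3):
            """Map each length-m window of x to the permutation that sorts it."""
            return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

        def pattern_entropy(patterns):
            """Shannon entropy (nats) of the empirical ordinal-pattern distribution."""
            counts = {}
            for p in patterns:
                counts[p] = counts.get(p, 0) + 1
            n = len(patterns)
            return -sum((c / n) * log(c / n) for c in counts.values())

        # Toy usage: one noisy series driving another at lag 1.
        rng = np.random.default_rng(0)
        x = rng.normal(size=1000)
        y = np.roll(x, 1) + 0.1 * rng.normal(size=1000)
        print(pattern_entropy(ordinal_patterns(x)), pattern_entropy(ordinal_patterns(y)))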

  3. Induction of models under uncertainty

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter

    1986-01-01

    This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; particularly, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one aimed at keeping the theory structure as simple as possible, while still reflecting any structure discovered in the data.
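
    The decision rule sketched here is Bayes' theorem over a set of candidate theories,

        $$ P(H_i \mid D) = \frac{P(D \mid H_i)\, P(H_i)}{\sum_j P(D \mid H_j)\, P(H_j)}, $$

    with the priors P(H_i) chosen, as the abstract notes, on the pragmatic ground of favouring simpler theory structures.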

  4. Dopamine, uncertainty and TD learning

    PubMed Central

    Niv, Yael; Duff, Michael O; Dayan, Peter

    2005-01-01

    Substantial evidence suggests that the phasic activities of dopaminergic neurons in the primate midbrain represent a temporal difference (TD) error in predictions of future reward, with increases above and decreases below baseline consequent on positive and negative prediction errors, respectively. However, dopamine cells have very low baseline activity, which implies that the representation of these two sorts of error is asymmetric. We explore the implications of this seemingly innocuous asymmetry for the interpretation of dopaminergic firing patterns in experiments with probabilistic rewards which bring about persistent prediction errors. In particular, we show that when averaging the non-stationary prediction errors across trials, a ramping in the activity of the dopamine neurons should be apparent, whose magnitude is dependent on the learning rate. This exact phenomenon was observed in a recent experiment, though being interpreted there in antipodal terms as a within-trial encoding of uncertainty. PMID:15953384
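
    For readers outside the field, the temporal difference error that the phasic dopamine signal is proposed to encode has the standard form

        $$ \delta_t = r_t + \gamma V(s_{t+1}) - V(s_t), $$

    and the asymmetry discussed above arises because negative δ must be expressed as dips below an already low firing baseline.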

  5. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker
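
    The core machinery the book builds on is the GUM law of propagation of uncertainty: for a measurand y = f(x_1, ..., x_N),

        $$ u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^2(x_i) + 2 \sum_{i<j} \frac{\partial f}{\partial x_i} \frac{\partial f}{\partial x_j}\, u(x_i, x_j), $$

    with the expanded uncertainty U = k·u_c(y) for a chosen coverage factor k.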

  6. Solving navigational uncertainty using grid cells on robots.

    PubMed

    Milford, Michael J; Wiles, Janet; Wyeth, Gordon F

    2010-01-01

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our study to be a starting point for animal experiments that test navigation in perceptually ambiguous environments. PMID:21085643
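
    RatSLAM itself is an attractor-network model rather than an explicit probabilistic filter, but the multiple-hypothesis pose estimation it approximates is conventionally written as the Bayes filter recursion

        $$ P(x_t \mid z_{1:t}) \propto P(z_t \mid x_t) \int P(x_t \mid x_{t-1})\, P(x_{t-1} \mid z_{1:t-1})\, dx_{t-1}, $$

    whose posterior can remain multimodal until an unambiguous cue collapses it (a standard formulation offered for context, not the authors' notation).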

  7. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (PIV-MS), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the PIV-HDR (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for comparison of the measurement accuracy of existing or newly developed PIV interrogation algorithms. The database is publicly available on the website www.piv.de/uncertainty.

  8. Self as the feminine principle.

    PubMed

    Weisstub, E B

    1997-07-01

    In analytical psychology, ego is associated with consciousness and the masculine principle. Although the feminine principle generally characterizes the unconscious, it was not assigned a psychic structure equivalent to the ego. This paper proposes a model of the psyche where self and ego are the major modes of psychic experience. The self as the 'being' mode represents the feminine principle and functions according to primary process; the ego represents 'doing', the masculine principle and secondary process. Feminine and masculine principles are considered to be of equal significance in both men and women and are not limited to gender. Jung's concept of the self is related to the Hindu metaphysical concepts of Atman and Brahman, whose source was the older Aryan nature-oriented, pagan religion. The prominence of self in analytical psychology and its predominantly 'feminine' symbolism can be understood as Jung's reaction to the psychoanalytic emphasis on ego and to Freud's 'patriarchal' orientation. In Kabbalah, a similar development took place when the feminine principle of the Shekinah emerged in a central, redemptive role, as a mythic compensation to the overtly patriarchal Judaic religion. In the proposed model of the psyche neither ego nor self represents the psychic totality. The interplay of both psychic modes/principles constitutes the psyche and the individuation process. PMID:9246929

  9. Damage assessment of the truss system with uncertainty using frequency response function based damage identification method

    NASA Astrophysics Data System (ADS)

    Zhao, Jie; DeSmidt, Hans; Yao, Wei

    2015-04-01

    A novel vibration-based damage identification methodology for truss systems with mass and stiffness uncertainties is proposed and demonstrated. This approach utilizes the damage-induced changes of frequency response functions (FRF) to assess the severity and location of structural damage in the system. The damage identification algorithm is developed based on the least-squares and Newton-Raphson methods. The dynamical model of the system is built using the finite element method and the Lagrange principle, while the crack model is based on fracture mechanics. The method is demonstrated via numerical examples for a truss system, showing its effectiveness in detecting both stiffness and mass uncertainties in the system.
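
    The abstract does not spell out the update equations; a generic least-squares/Newton-Raphson iteration of the kind described would take the Gauss-Newton form (notation assumed, with θ collecting the damage and uncertainty parameters)

        $$ \theta_{k+1} = \theta_k + \left( J_k^{\top} J_k \right)^{-1} J_k^{\top} r_k, \qquad r_k = H_{\mathrm{meas}} - H_{\mathrm{model}}(\theta_k), $$

    where J_k is the Jacobian of the modeled FRF with respect to θ at the current iterate.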

  10. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of the probability of interruption (PI), thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated variable dependence in the equation for PI.
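
    A minimal sketch of the numerical sampling idea described above, in Python, with an entirely hypothetical two-layer path model and made-up distributions (the report's actual scenario models are far richer):

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # Monte Carlo samples of the aleatory variables

        # Hypothetical two-layer path: detection probabilities and delays are uncertain.
        p_detect = rng.beta(8, 2, size=(N, 2))                    # detection probability per layer
        delay = rng.lognormal(mean=3.0, sigma=0.4, size=(N, 2))   # delay per layer, seconds
        response_time = 90.0                                      # assumed fixed response time, seconds

        # Delay remaining downstream of each layer (reverse cumulative sum).
        remaining = delay[:, ::-1].cumsum(axis=1)[:, ::-1]

        # Interruption requires detection at some layer with enough delay left
        # after that detection for the response force to arrive.
        detected_in_time = (rng.random((N, 2)) < p_detect) & (remaining > response_time)
        print(f"Estimated PI = {detected_in_time.any(axis=1).mean():.3f}")

    Epistemic uncertainty about adversary resources would then be handled by an outer loop over alternative parameter sets rather than by the inner random sampling.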

  11. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender's beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker's payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff-generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
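
    Formally, the probability boxes mentioned above bound the unknown payoff distribution F by a pair of cumulative distribution functions,

        $$ \underline{F}(x) \le F(x) \le \overline{F}(x) \quad \text{for all } x, $$

    so aleatory variability is carried inside each CDF while epistemic ignorance is expressed by the width of the band.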

  12. The uncertainty of the half-life

    NASA Astrophysics Data System (ADS)

    Pommé, S.

    2015-06-01

    Half-life measurements of radionuclides are undeservedly perceived as easy and the experimental uncertainties are commonly underestimated. Data evaluators, scanning the literature, are faced with bad documentation, lack of traceability, incomplete uncertainty budgets and discrepant results. Poor control of uncertainties has its implications for the end-user community, varying from limitations to the accuracy and reliability of nuclear-based analytical techniques to the fundamental question whether half-lives are invariable or not. This paper addresses some issues from the viewpoints of the user community and of the decay data provider. It addresses the propagation of the uncertainty of the half-life in activity measurements and discusses different types of half-life measurements, typical parameters influencing their uncertainty, a tool to propagate the uncertainties and suggestions for a more complete reporting style. Problems and solutions are illustrated with striking examples from literature.
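
    A worked form of the propagation discussed above: with A(t) = A_0 · exp(−ln 2 · t/T_{1/2}), the half-life contribution to the relative uncertainty of a decay-corrected activity grows linearly with elapsed time,

        $$ \frac{u(A)}{A} = \ln 2 \cdot \frac{t}{T_{1/2}} \cdot \frac{u(T_{1/2})}{T_{1/2}}, $$

    so a half-life known to 0.1% contributes about 0.07% after one half-life and about 0.7% after ten.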

  13. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    PubMed

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results. PMID:26357158

  14. Few group collapsing of covariance matrix data based on a conservation principle

    SciTech Connect

    Hiruta, H.; Palmiotti, G.; Salvatores, M.; Arcilla, R., Jr.; Oblozinsky, P.; McKnight, R.D.

    2008-06-24

    A new algorithm for a rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that preserves, at a broad energy group structure, the uncertainty calculated in a fine energy group structure for a specific integral parameter, using the associated sensitivity coefficients as weights.
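
    In the usual sandwich-rule notation, the conservation condition described above amounts to requiring that the broad-group covariance matrix C_b reproduce the fine-group uncertainty of the chosen integral parameter R,

        $$ S_b^{\top} C_b S_b = S_f^{\top} C_f S_f = \operatorname{var}(R), $$

    where S_b and S_f are the sensitivity-coefficient vectors in the broad and fine group structures (a formalization of the abstract's wording, not a quotation of the paper's algorithm).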

  15. Principles of Pharmacotherapy: I. Pharmacodynamics

    PubMed Central

    Pallasch, Thomas J.

    1988-01-01

    This paper and the ensuing series present the principles guiding and affecting the ability of drugs to produce therapeutic benefit or untoward harm. The principles of pharmacodynamics and pharmacokinetics, the physiologic basis of adverse drug reactions and suitable antidotal therapy, and the biologic basis of drug allergy, drug-drug interactions, pharmacogenetics, teratology and hematologic reactions to chemicals are explored. These principles serve to guide those administering and using drugs to attain the maximum benefit and least attendant harm from their use. Such is the goal of rational therapeutics. PMID:3046440

  16. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is increasingly becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated on patient data deformed according to a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
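
    To first order, the sensitivity combination described under Methods can be summarized as (notation assumed, not the authors')

        $$ u^2\bigl(D(\mathbf{x})\bigr) \approx \sum_k \left( \frac{\partial D(\mathbf{x})}{\partial c_k} \right)^{2} u^2(c_k), $$

    where the c_k are b-spline coefficients, u(c_k) reflects how weakly the registration metric constrains each coefficient, and ∂D/∂c_k is large wherever the mapped dose gradient is steep.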

  17. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  18. Notes on the effect of dose uncertainty

    SciTech Connect

    Morris, M.D.

    1987-01-01

    The apparent dose-response relationship between amount of exposure to acute radiation and level of mortality in humans is affected by uncertainties in the dose values. It is apparent that one of the greatest concerns regarding the human data from Hiroshima and Nagasaki is the unexpectedly shallow slope of the dose response curve. This may be partially explained by uncertainty in the dose estimates. Some potential effects of dose uncertainty on the apparent dose-response relationship are demonstrated.
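
    The standard statistical mechanism here is regression dilution: if true doses x are observed as x + u with independent error u, the fitted slope is attenuated toward zero,

        $$ \hat{\beta} \approx \beta \, \frac{\sigma_x^2}{\sigma_x^2 + \sigma_u^2}, $$

    so dose uncertainty alone can make a dose-response curve appear shallower than the underlying relationship.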

  19. Optimality principles for the visual code

    NASA Astrophysics Data System (ADS)

    Pitkow, Xaq

    One way to try to make sense of the complexities of our visual system is to hypothesize that evolution has developed nearly optimal solutions to the problems organisms face in the environment. In this thesis, we study two such principles of optimality for the visual code. In the first half of this dissertation, we consider the principle of decorrelation. Influential theories assert that the center-surround receptive fields of retinal neurons remove spatial correlations present in the visual world. It has been proposed that this decorrelation serves to maximize information transmission to the brain by avoiding transfer of redundant information through optic nerve fibers of limited capacity. While these theories successfully account for several aspects of visual perception, the notion that the outputs of the retina are less correlated than its inputs has never been directly tested at the site of the putative information bottleneck, the optic nerve. We presented visual stimuli with naturalistic image correlations to the salamander retina while recording responses of many retinal ganglion cells using a microelectrode array. The output signals of ganglion cells are indeed decorrelated compared to the visual input, but the receptive fields are only partly responsible. Much of the decorrelation is due to the nonlinear processing by neurons rather than the linear receptive fields. This form of decorrelation dramatically limits information transmission. Instead of improving coding efficiency we show that the nonlinearity is well suited to enable a combinatorial code or to signal robust stimulus features. In the second half of this dissertation, we develop an ideal observer model for the task of discriminating between two small stimuli which move along an unknown retinal trajectory induced by fixational eye movements. The ideal observer is provided with the responses of a model retina and guesses the stimulus identity based on the maximum likelihood rule, which involves sums over all random walk trajectories. These sums can be implemented in a biologically plausible way. The necessary ingredients are: neurons modeled as a cascade of a linear filter followed by a static nonlinearity, a recurrent network with additive and multiplicative interactions between neurons, and divisive global inhibition. This architecture implements Bayesian inference by representing likelihoods as neural activity which can then diffuse through the recurrent network and modulate the influence of later information. We also develop approximation methods for characterizing the performance of the ideal observer. We find that the effect of positional uncertainty is essentially to slow the acquisition of signal. The time scaling is related to the size of the uncertainty region, which is in turn related to both the signal strength and the statistics of the fixational eye movements. These results imply that localization cues should determine the slope of the performance curve in time.
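
    The maximum likelihood rule with an unknown trajectory amounts to marginalizing over paths; schematically (notation assumed),

        $$ L(s) = \sum_{x_{1:T}} P(x_{1:T}) \prod_{t=1}^{T} P(r_t \mid s, x_t), $$

    where x_{1:T} is the random-walk trajectory induced by fixational eye movements and r_t the retinal response at time t; the recurrent network described above evaluates this sum incrementally as evidence arrives.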

  20. Updated uncertainty budgets for NIST thermocouple calibrations

    NASA Astrophysics Data System (ADS)

    Meyer, C. W.; Garrity, K. M.

    2013-09-01

    We have recently updated the uncertainty budgets for calibrations in the NIST Thermocouple Calibration Laboratory. The purpose for the updates has been to 1) revise the estimated values of the relevant uncertainty elements to reflect the current calibration facilities and methods, 2) provide uncertainty budgets for every standard calibration service offered, and 3) make the uncertainty budgets more understandable to customers by expressing all uncertainties in units of temperature (°C) rather than emf. We have updated the uncertainty budgets for fixed-point calibrations of type S, R, and B thermocouples and comparison calibrations of type R and S thermocouples using a type S reference standard. In addition, we have constructed new uncertainty budgets for comparison calibrations of type B thermocouples using a type B reference standard as well as using both a type S and type B reference standard (for calibration over a larger range). We have updated the uncertainty budgets for comparison calibrations of base-metal thermocouples using a type S reference standard and alternately using a standard platinum resistance thermometer reference standard. Finally, we have constructed new uncertainty budgets for comparison tests of noble-metal and base-metal thermoelements using a type S reference standard. A description of these updates is presented in this paper.
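
    The conversion underlying point 3 is the standard one: an emf uncertainty maps into temperature through the local thermoelectric sensitivity of the thermocouple,

        $$ u(T) = \frac{u(E)}{dE/dT}, $$

    so the same emf uncertainty corresponds to different temperature uncertainties for different thermocouple types and at different temperatures.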

  1. Get Provoked: Applying Tilden's Principles.

    ERIC Educational Resources Information Center

    Shively, Carol A.

    1995-01-01

    This address given to the Division of Interpretation, Yellowstone National Park, Interpretive Training, June 1993, examines successes and failures in interpretive programs for adults and children in light of Tilden's principles. (LZ)

  2. Dye laser principles, with applications

    SciTech Connect

    Duarte, F.J. (Dept. of Physics); Hillman, L.W. (Dept. of Physics)

    1990-01-01

    This book contains papers which explain dye laser principles. Topics covered include: laser dynamics, femtosecond dye lasers, CW dye lasers, technology of pulsed dye lasers, photochemistry of laser dyes, and laser applications.

  3. 12 Principles of Knowledge Management.

    ERIC Educational Resources Information Center

    Allee, Verna

    1997-01-01

    Understanding knowledge is the first step in managing it effectively. Twelve guiding principles and the core competencies of knowledge can help organizations make the most of their knowledge assets. (JOW)

  4. Extrema Principles Of Dissipation In Fluids

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Karamcheti, Krishnamurty

    1991-01-01

    Report discusses application of principle of least action and other variational or extrema principles to dissipation of energy and production of entropy in fluids. Principle of least action applied successfully to dynamics of particles and to quantum mechanics, but not universally accepted that variational principles applicable to thermodynamics and hydrodynamics. Report argues for applicability of some extrema principles to some simple flows.

  5. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates has recently been published, wherein the experimentally-obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that will reproduce the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered.
Lastly, we show that while the capability of the multispecies measurement presents a step-change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data of global combustion properties are still necessary to predict fuel combustion.
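
    The polynomial chaos machinery invoked above expands each model prediction in polynomials that are orthogonal with respect to the distribution of the normalized rate parameters ξ; generically,

        $$ \eta(\boldsymbol{\xi}) \approx \sum_{k=0}^{P} a_k \Psi_k(\boldsymbol{\xi}), $$

    so that propagating and minimizing uncertainty become operations on the expansion coefficients a_k (a generic PCE form, not the dissertation's exact notation).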

  6. Equivalence Principle and Gravitational Redshift

    SciTech Connect

    Hohensee, Michael A.; Chu, Steven; Mueller, Holger; Peters, Achim

    2011-04-15

    We investigate leading order deviations from general relativity that violate the Einstein equivalence principle in the gravitational standard model extension. We show that redshift experiments based on matter waves and clock comparisons are equivalent to one another. Consideration of torsion balance tests, along with matter-wave, microwave, optical, and Mössbauer clock tests, yields comprehensive limits on spin-independent Einstein equivalence principle-violating standard model extension terms at the 10^-6 level.

  7. Testing the strong equivalence principle by radio ranging

    NASA Technical Reports Server (NTRS)

    Canuto, V. M.; Goldman, I.; Shapiro, I. I.

    1984-01-01

    Planetary range data offer the most promising means to test the validity of the Strong Equivalence Principle (SEP). Analytical expressions for the perturbation in the 'range' expected from an SEP violation predicted by the 'variation-of-G' method and by the 'two-times' approach are derived and compared. The dominant term in both expressions is quadratic in time. Analysis of existing range data should allow a determination of the coefficient of this term with a one-standard-deviation uncertainty of about 1 part in 100 billion/yr.

  8. Uncertainty in Future Tropical Precipitation Change

    NASA Astrophysics Data System (ADS)

    Rowell, Dave

    2010-05-01

    This study aims to describe, and begin to understand, the substantial uncertainties in future changes in local precipitation due to differences in model formulation. Four multi-model ensembles (MMEs) are studied: a 268-member perturbed physics slab-model ensemble, two 17-member perturbed physics coupled-model ensembles (one with perturbed carbon cycle parameters), and a 16-member coupled-multi-model ensemble utilising the CMIP3 database. The focus is primarily on the vulnerable tropical regions. An unbiased metric is developed to map the contribution made by uncertainties in model formulation to projected changes in local precipitation, relative to the contribution made by natural variability. This essentially provides a first attempt at evaluating the potential for reducing uncertainties in local precipitation change through model improvements. The role of modeling uncertainty is found to vary substantially, with a rich spatial structure being evident on a range of scales. Many key large-scale features are broadly similar between the different MMEs, suggesting that more detailed investigation of the large slab-model ensemble may have wider applicability. At regional scales, an important finding is that much larger spatial noise is apparent in the smaller MMEs, suggesting that 10s or 100s of model versions are required to robustly map the role of modeling uncertainty in local precipitation change. The causes of some of these large-scale patterns in the role of modeling uncertainty are explored. The intention is that an enhanced understanding of uncertain physical processes will help develop observational constraints for regional uncertainty. In particular, it is found that the uncertainty due to model formulation is larger in equatorial regions than in the subtropics, and in the former always markedly exceeds that due to natural variability. This is consistent across the four MMEs, and is examined further, and separately, for tropical land and for each of the major tropical oceans. It is found that the physical mechanisms, and the resulting pattern and magnitude, of modeling uncertainty differ somewhat between these regions. Furthermore, throughout the tropics, uncertainty in global climate sensitivity contributes little to uncertainty in local precipitation changes. Subsequent analysis then estimates which of the model parameters contribute most to uncertainty in the large perturbed physics ensemble. It is found that there is considerable sensitivity to some parameters within the cloud scheme, and that over tropical land regions uncertainty in modeling the carbon cycle may be as important as uncertainties in modeling atmospheric processes. The importance of some other parameter uncertainties is more surprising, and these too are discussed.

  9. Uncertainties in selected river water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2007-02-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. Empirical quality of river water quality data is rarely certain and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected river water quality data, i.e. suspended sediment, nitrogen fraction, phosphorus fraction, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2005). A literature review was carried out including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited to field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of river water quality data. Temporal autocorrelation of river water quality data is present but literature on general behaviour of water quality compounds is rare. For meso-scale river catchments (500-3000 km2), reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.

  10. Uncertainties in selected surface water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2006-09-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. Empirical quality of surface water quality data is rarely certain and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected surface water quality data, i.e. suspended sediment, nitrogen fraction, phosphorus fraction, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2006). A literature review was carried out including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited to field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of surface water quality data. Temporal autocorrelation of surface water quality data is present but literature on general behaviour of water quality compounds is rare. For meso-scale river catchments, reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.

  11. Strong majorization entropic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz; Puchała, Zbigniew; Życzkowski, Karol

    2014-05-01

    We analyze entropic uncertainty relations in a finite-dimensional Hilbert space and derive several strong bounds for the sum of two entropies obtained in projective measurements with respect to any two orthogonal bases. We improve the recent bounds by Coles and Piani [P. Coles and M. Piani, Phys. Rev. A 89, 022112 (2014), 10.1103/PhysRevA.89.022112], which are known to be stronger than the well-known result of Maassen and Uffink [H. Maassen and J. B. M. Uffink, Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103]. Furthermore, we find a bound based on majorization techniques, which also happens to be stronger than the recent results involving the largest singular values of submatrices of the unitary matrix connecting both bases. The first set of bounds gives better results for unitary matrices close to the Fourier matrix, while the second one provides a significant improvement in the opposite sectors. Some results derived admit generalization to arbitrary mixed states, so that corresponding bounds are increased by the von Neumann entropy of the measured state. The majorization approach is finally extended to the case of several measurements.
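
    For reference, the Maassen-Uffink bound that these results strengthen reads

        $$ H(A) + H(B) \ge -2 \log c, \qquad c = \max_{i,j} \bigl| \langle a_i \mid b_j \rangle \bigr|, $$

    where H denotes the Shannon entropy of the outcome distributions in the two orthonormal bases.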

  12. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (ESTSC)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some processes for which input is uncertain and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using the Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample and the user-supplied process model analyses the sample. The SUNS post processor displays statistical results from any existing file that contains sampled input and output values.

  13. Uncertainty reasoning in expert systems

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik

    1993-01-01

    Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
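
    A toy illustration of the three choices enumerated above (membership shapes, the "and" operation, and defuzzification), in Python; all shapes and parameters are illustrative assumptions, not the report's recommendations:

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function on [a, c] peaking at b."""
            return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

        # (1) Representations of "big velocity" and "small distance".
        v, d = 80.0, 12.0                       # current velocity and distance
        mu_big_v = tri(v, 40.0, 100.0, 160.0)
        mu_small_d = tri(d, 0.0, 0.001, 30.0)   # nearly left-shouldered triangle

        # (2) "and" as a t-norm: minimum is one common choice among several.
        firing = min(mu_big_v, mu_small_d)

        # (3) Defuzzify a clipped "brake hard" output set by its centroid.
        u = np.linspace(0.0, 1.0, 501)          # brake-command axis
        mu_out = np.minimum(firing, tri(u, 0.5, 1.0, 1.5))
        brake = (u * mu_out).sum() / mu_out.sum()
        print(f"rule strength {firing:.2f} -> brake command {brake:.2f}")

    The project's point is exactly that each of these three choices can be varied, and that group-theoretic arguments can single out optimal ones.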

  14. EDITORIAL: Squeezed states and uncertainty relations

    NASA Astrophysics Data System (ADS)

    Jauregue-Renaud, Rocio; Kim, Young S.; Man'ko, Margarita A.; Moya-Cessa, Hector

    2004-06-01

    This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9-13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México. This series of meetings began at the University of Maryland, College Park, USA, in March 1991. The second and third workshops were organized by the Lebedev Physical Institute in Moscow, Russia, in 1992 and by the University of Maryland Baltimore County, USA, in 1993, respectively. Afterwards, it was decided that the workshop series should be held every two years. Thus the fourth meeting took place at the University of Shanxi in China and was supported by the International Union of Pure and Applied Physics (IUPAP). The next three meetings in 1997, 1999 and 2001 were held in Lake Balatonfüred, Hungary, in Naples, Italy, and in Boston, USA, respectively. All of them were sponsored by IUPAP. The ninth workshop will take place in Besançon, France, in 2005. The conference has now become one of the major international meetings on quantum optics and the foundations of quantum mechanics, where most of the active research groups throughout the world present their new results. Accordingly this conference has been able to align itself to the current trend in quantum optics and quantum mechanics. The Puebla meeting covered most extensively the following areas: quantum measurements, quantum computing and information theory, trapped atoms and degenerate gases, and the generation and characterization of quantum states of light. The meeting also covered squeeze-like transformations in areas other than quantum optics, such as atomic physics, nuclear physics, statistical physics and relativity, as well as optical devices. There were many new participants at this meeting, particularly from Latin American countries including, of course, Mexico. There were many talks on the subjects traditionally covered in this conference series, including quantum fluctuations, different forms of squeezing, unlike kinds of nonclassical states of light, and distinct representations of the quantum superposition principle, such as even and odd coherent states. The entanglement phenomenon, frequently in the form of the EPR paradox, is responsible for the main advantages of quantum engineering compared with classical methods. Even though entanglement has been known since the early days of quantum mechanics, its properties, such as the most appropriate entanglement measures, are still under current investigation. The phenomena of dissipations and decoherence of the initial pure states are very important because the fast decoherence can destroy all the advantages of quantum processes in teleportation, quantum computing and image processing. Due to this, methods of controlling the decoherence, such as by the use of different kinds of nonlinearities and deformations, are also under study. From the very beginning of quantum mechanics, the uncertainty relations were basic inequalities distinguishing the classical and quantum worlds. Among the theoretical methods for quantum optics and quantum mechanics, this conference covered phase space and group representations, such as the Wigner and probability distribution functions, which provide an alternative approach to the Schrödinger or Heisenberg picture. 
Different forms of probability representations of quantum states are important tools to be applied in studying various quantum phenomena, such as quantum interference, decoherence and quantum tomography. They have also been established as a very useful tool in all branches of classical optics. From the mathematical point of view, it is well known that the coherent and squeezed states are representations of the Lorentz group. It was noted throughout the conference that another form of the Lorentz group, namely the 2 × 2 representation of the group SL(2,C), is becoming

  15. Map scale and the communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Lark, Murray

    2015-04-01

    Conventionally the scale at which mapped information is presented in the earth sciences reflects the uncertainty in this information. This partly reflects the cartographic sources of error in printed maps, but also conventions on the amount of underpinning observation on which the map is based. In soil surveys a convention is that the number of soil profile observations per unit area of printed map is fixed over a range of scales. For example, for surveys in the Netherlands, Steur (1961) suggested that there should be 5 field observations per cm² of map. Bie and Beckett (1970) showed that there is a consistent relationship between map scale and the field effort of the soil survey. It is now common practice to map variables by geostatistical methods. The output from kriging can be on the support of the original data (point kriging) or can be upscaled to 'blocks' by block kriging. The block kriging prediction is of the spatial mean of the target variable across a block of specified dimensions. In principle the size of the block on which data are presented can be varied arbitrarily. In some circumstances the block size may be determined by operational requirements. However, for general purposes, predictions can be presented for blocks of any size. The same variable, sampled at a fixed intensity, could be presented as estimates for blocks 10 × 10 m on one map and 100 × 100 m on another map. The data user might be tempted to assume that the predictions on smaller blocks provide more information than those on larger blocks. However, the prediction variance of the block mean diminishes as block size increases, so improvement of the notional resolution of the information is accompanied by a reduction in its precision. This precision can be quantified by the block kriging variance; however, this on its own may not indicate whether the block size represents a good compromise between resolution and precision in a particular circumstance, such that the resolution reasonably communicates the uncertainty of the information to the data user. In this presentation I show how, in place of the block kriging variance, one can use the model-based correlation between the block kriged estimate and the true spatial mean of the block as a readily interpreted measure of the quality of block-kriging predictions. Graphs of this correlation as a function of block size, for a given sampling configuration, allow one to assess the suitability of different block sizes in circumstances where these are not fixed by operational requirements. For example, it would be possible to determine a new convention by which block kriged predictions are routinely presented only for block sizes such that the correlation exceeds some threshold value. Steur, G.G.L. 1961. Methods of soil survey in use at the Netherlands Soil Survey Institute. Boor en Spade 11, 59-77. Bie, S.W., Beckett, P.H.T. 1970. The costs of soil survey. Soils and Fertilizers 34, 1-15.
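
    The resolution-precision trade-off described above can be made concrete with a short numerical sketch (an editorial illustration, not from the presentation; the exponential covariance model, its parameters and the block sizes are all assumed). The variance of the spatial mean of a block is the average of the covariance function over all pairs of points in the block, and it falls as the block grows:

        import numpy as np

        def exp_cov(h, sill=1.0, range_m=50.0):
            # Exponential covariance model: C(h) = sill * exp(-h / range)
            return sill * np.exp(-h / range_m)

        def block_mean_variance(side, n=20):
            # Variance of the spatial mean over a side-by-side block,
            # approximated by discretizing the block into an n x n grid
            # and averaging the covariance over all point pairs.
            xs = (np.arange(n) + 0.5) * side / n
            X, Y = np.meshgrid(xs, xs)
            pts = np.column_stack([X.ravel(), Y.ravel()])
            d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
            return exp_cov(d).mean()

        for side in (10.0, 50.0, 100.0, 200.0):
            print(f"{side:5.0f} m block: variance of block mean = "
                  f"{block_mean_variance(side):.3f}")

    Larger blocks average away more of the short-range variation, so their spatial means are known more precisely; the block kriging variance behaves analogously for predicted block means.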

  16. Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods

    SciTech Connect

    Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

    2013-12-01

    The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations, CILTS should be considered as having something like a "factor of two" uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
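
    As a sketch of what a first-principles analysis involves, consider the idealized steady-state single-zone relation Q = F/C (airflow equals tracer emission rate divided by mean tracer concentration) with independent inputs. The numbers below are hypothetical, and a first-order propagation of this kind captures only the precision terms, not the mixing and flow-variation biases that dominate in real deployments:

        import math

        # Idealized single-zone CILTS relation: Q = F / C, with F the tracer
        # emission rate and C the time-averaged tracer concentration.
        # First-order (GUM-style) propagation for independent inputs:
        #   (u_Q / Q)^2 = (u_F / F)^2 + (u_C / C)^2
        F, u_F = 2.5e-9, 1.25e-10   # emission rate (m3/s), 5% uncertainty
        C, u_C = 5.0e-8, 5.0e-9     # mean concentration (volume fraction), 10%

        Q = F / C
        rel_u_Q = math.sqrt((u_F / F) ** 2 + (u_C / C) ** 2)
        print(f"Q = {Q:.3f} m3/s, relative uncertainty = {100 * rel_u_Q:.0f}%")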

  17. Position-momentum uncertainty relations in the presence of quantum memory

    SciTech Connect

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies, providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove the security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes, not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies the security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
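
    For reference, the finite-outcome prototype of such memory-assisted relations is the tripartite inequality of Berta et al. (2010); loosely speaking, the work described here extends this kind of bound to observables with infinite discrete or continuous spectra such as position and momentum:

        % Entropic uncertainty relation with quantum memory (finite outcomes):
        % measurements X and Z on system A, with quantum memory B.
        H(X \mid B) + H(Z \mid B) \;\geq\; \log_2 \frac{1}{c} + H(A \mid B),
        \qquad
        c = \max_{j,k} \bigl| \langle \psi_j \vert \phi_k \rangle \bigr|^2

    Here the |psi_j> and |phi_k> are the eigenvectors of X and Z, and the conditional von Neumann entropy H(A|B) can be negative for entangled states, which is precisely how entanglement reduces the uncertainty bound.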

  18. Estimating epistemic and aleatory uncertainties during hydrologic modeling: An information theoretic approach

    NASA Astrophysics Data System (ADS)

    Gong, Wei; Gupta, Hoshin V.; Yang, Dawen; Sricharan, Kumar; Hero, Alfred O.

    2013-04-01

    With growing interest in understanding the magnitudes and sources of uncertainty in hydrological modeling, the difficult problem of characterizing model structure adequacy is now attracting considerable attention. Here, we examine this problem via a model-structure-independent approach based on information theory. In particular, we (a) discuss how to assess and compute the information content in multivariate hydrological data, (b) present practical methods for quantifying the uncertainty and shared information in data while accounting for heteroscedasticity, (c) show how these tools can be used to estimate the best achievable predictive performance of a model (for a system, given the available data), and (d) show how model adequacy can be characterized in terms of the magnitude and nature of its aleatory uncertainty, which cannot be diminished (and is resolvable only up to specification of its density), and its epistemic uncertainty, which can, in principle, be suitably resolved by improving the model. An illustrative modeling example is provided using catchment-scale data from three river basins, the Leaf and Chunky River basins in the United States and the Chuzhou basin in China. Our analysis shows that the aleatory uncertainty associated with making catchment simulations using this data set is significant (~50%). Further, estimated epistemic uncertainties of the HyMod, SAC-SMA, and Xinanjiang model hypotheses indicate that considerable room for model structural improvement remains.
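
    The "shared information" computation at the heart of step (b) can be sketched with a plug-in histogram estimator (the authors use k-nearest-neighbour entropy estimators; the histogram version below is a simpler stand-in, and the data are synthetic):

        import numpy as np

        def mutual_info_bits(x, y, bins=30):
            # Plug-in histogram estimate of I(X;Y) in bits: the information
            # the input shares with the output, which bounds the best
            # achievable predictive performance for these data.
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of x
            py = pxy.sum(axis=0, keepdims=True)   # marginal of y
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        rng = np.random.default_rng(0)
        x = rng.normal(size=10_000)
        y = x + rng.normal(scale=0.5, size=10_000)   # 'aleatory' noise floor
        print(f"I(X;Y) ~= {mutual_info_bits(x, y):.2f} bits")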

  19. The GUM revision: the Bayesian view toward the expression of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Lira, I.

    2016-03-01

    The ‘Guide to the Expression of Uncertainty in Measurement’ (GUM) has been in use for more than 20 years, serving its purposes worldwide at all levels of metrology, from scientific to industrial and commercial applications. However, the GUM presents some inconsistencies, both internally and with respect to its two later Supplements. For this reason, the Joint Committee for Guides in Metrology, which is responsible for these documents, has decided that a major revision of the GUM is needed. This will be done by following the principles of Bayesian statistics, a concise summary of which is presented in this article. Those principles should be useful in physics and engineering laboratory courses that teach the fundamentals of data analysis and measurement uncertainty evaluation.
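
    A minimal example of the Bayesian treatment (illustrative, not taken from the article): for n repeated indications with unknown Gaussian dispersion and the standard noninformative prior, the posterior for the measurand is a scaled and shifted Student-t distribution, so a coverage interval follows directly:

        import numpy as np
        from scipy import stats

        def bayesian_type_a(readings, coverage=0.95):
            # Posterior for the measurand mu given i.i.d. Gaussian indications
            # and the noninformative prior p(mu, sigma) ~ 1/sigma:
            # mu | data is Student-t with n-1 dof, located at the sample mean,
            # scaled by s / sqrt(n).
            x = np.asarray(readings, dtype=float)
            n, xbar, s = x.size, x.mean(), x.std(ddof=1)
            half = stats.t.ppf(0.5 + coverage / 2, df=n - 1) * s / np.sqrt(n)
            return xbar, half

        xbar, half = bayesian_type_a([10.03, 10.01, 10.04, 9.99, 10.02])
        print(f"measurand: {xbar:.3f} +/- {half:.3f} (95% credible interval)")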

  20. Uncertainty Analysis by the "Worst Case" Method.

    ERIC Educational Resources Information Center

    Gordon, Roy; And Others

    1984-01-01

    Presents a new method of uncertainty propagation which concentrates on the calculation of upper and lower limits (the "worst cases"), bypassing absolute and relative uncertainties. Includes advantages of this method and its use in freshmen laboratories, advantages of the traditional method, and a numerical example done by both methods. (JN)
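
    The contrast between the two approaches can be sketched for a derived quantity rho = m/V (hypothetical numbers, not the article's own example): worst-case limits push every input error in the unfavorable direction, while the conventional treatment combines relative uncertainties in quadrature:

        # Worst-case limits versus quadrature propagation for rho = m / V.
        m, u_m = 25.0, 0.2   # mass (g) and its uncertainty
        V, u_V = 10.0, 0.1   # volume (cm^3) and its uncertainty

        rho = m / V
        rho_hi = (m + u_m) / (V - u_V)   # every error inflates rho
        rho_lo = (m - u_m) / (V + u_V)   # every error deflates rho
        u_quad = rho * ((u_m / m) ** 2 + (u_V / V) ** 2) ** 0.5

        print(f"rho = {rho:.3f} g/cm^3")
        print(f"worst case:  [{rho_lo:.3f}, {rho_hi:.3f}]")
        print(f"quadrature:  +/- {u_quad:.3f}")

    The worst-case interval is always at least as wide as the quadrature result, since it assumes all errors conspire at their limits simultaneously.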

  1. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed (TTB) facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relevant to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  2. The Stock Market: Risk vs. Uncertainty.

    ERIC Educational Resources Information Center

    Griffitts, Dawn

    2002-01-01

    This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…

  3. Nonclassicality in phase-number uncertainty relations

    SciTech Connect

    Matia-Hernando, Paloma; Luis, Alfredo

    2011-12-15

    We show that there are nonclassical states with smaller joint fluctuations of phase and number than any classical state. This is rather paradoxical, since one would expect classical coherent states to always be of minimum uncertainty. The same result is obtained when we replace the phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using the variance and the Holevo relation.

  4. Uncertainty Propagation in an Ecosystem Nutrient Budget.

    EPA Science Inventory

    New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of fr...

  5. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  6. Microform calibration uncertainties of Rockwell diamond indenters

    SciTech Connect

    Song, J.F.; Rudder, F.F. Jr.; Vorburger, T.V.; Smith, J.H.

    1995-09-01

    The Rockwell hardness test is a mechanical testing method for evaluating a property of metal products. National and international comparisons in Rockwell hardness tests show significant differences. Uncertainties in the geometry of the Rockwell diamond indenters are largely responsible for these differences. By using a stylus instrument, with a series of calibration and check standards, and calibration and uncertainty calculation procedures, the authors have calibrated the microform geometric parameters of Rockwell diamond indenters. These calibrations are traceable to fundamental standards. The expanded uncertainties are ±0.3 μm for the least-squares radius, ±0.01° for the cone angle, and ±0.025° for the holder axis alignment calibrations. Under ISO and NIST guidelines for expressing measurement uncertainties, the calibration and uncertainty calculation procedure, error sources, and uncertainty components are described, and the expanded uncertainties are calculated. The instrumentation and calibration procedure also allows the measurement of profile deviation from the least-squares radius and cone flank straightness. The surface roughness and the shape of the spherical tip of the diamond indenter can also be explored and quantified. The calibration approach makes it possible to quantify the uncertainty, uniformity, and reproducibility of Rockwell diamond indenter microform geometry, as well as to unify the Rockwell hardness standards, through fundamental measurements rather than performance comparisons.

  7. Micro-Pulse Lidar Signals: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

    2002-01-01

    Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse (laser-detector cross-talk) and overlap (poor near-range focusing, below roughly 6 km). Accurate correction of both afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is the uncertainty in the pulse energy. However, if the laser energy is low, then during midday high solar background levels can significantly reduce the signal-to-noise ratio of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.

  8. Uncertainty--We Do Need It.

    ERIC Educational Resources Information Center

    Cothern, C. Richard; Cothern, Margaret Fogt

    1980-01-01

    The precision of measurements in today's society is discussed and is related to the range of uncertainty or variation of measurement. Numerous examples provide insight into the margin of error in any measurement. The issue of uncertainty is particularly applicable to levels of toxic chemicals in the environment. (SA)

  9. Accounting for uncertainty in marine reserve design.

    PubMed

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds spacing rules similar to those proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals. PMID:16958861

  10. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose-versus-depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
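
    The ray-count trade-off follows the usual Monte Carlo scaling: the statistical uncertainty of a ray-traced estimate falls as 1/sqrt(N) while cost grows linearly with N, so doubling-N comparisons quantify the remaining discretization uncertainty. A toy sketch (the uniform "per-ray result" is a stand-in, not the radiation transport code):

        import numpy as np

        rng = np.random.default_rng(1)
        for n in (100, 1_000, 10_000, 100_000):
            rays = rng.random(n)                 # stand-in for per-ray results
            est = rays.mean()
            se = rays.std(ddof=1) / np.sqrt(n)   # standard error ~ 1/sqrt(n)
            print(f"N = {n:>7}: estimate = {est:.4f} +/- {se:.4f}")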

  11. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  12. Identifying uncertainties in Arctic climate change projections

    NASA Astrophysics Data System (ADS)

    Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

    2013-06-01

    Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

  14. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  15. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  16. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  17. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  18. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  19. Critical analysis of uncertainties during particle filtration

    NASA Astrophysics Data System (ADS)

    Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

    2012-09-01

    Using the law of propagation of uncertainties, we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainty (CSU) in filter permeability and in modelling the results for polystyrene latex microsphere filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in the dynamic viscosity and volumetric flowrate of the microsphere suspension have the greatest influence on the overall CSU in filter permeability, which agrees excellently with results obtained from Monte Carlo simulations. Two model parameters, "maximum critical retention concentration" and "minimum injection velocity", and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least-squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in the critical retention concentration. The model with the internal cake porosity reproduces the experimental critical-retention-concentration-versus-velocity data better than the second model, which contains the total electrostatic force, whose value and uncertainty could not be reliably calculated due to the lack of experimental dielectric data.
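
    The fitting step described here can be sketched generically (this is the standard weighted least-squares construction, not the authors' code; variable names are illustrative): parameter uncertainties are read off the covariance matrix (X^T W X)^-1:

        import numpy as np

        def wls_quadratic(v, y, sigma_y):
            # Weighted least squares for y = a*v^2 + b*v + c with per-point
            # standard uncertainties sigma_y. The parameter covariance is
            # (X^T W X)^{-1}; the square roots of its diagonal entries are
            # the standard uncertainties of (a, b, c).
            v = np.asarray(v, dtype=float)
            y = np.asarray(y, dtype=float)
            X = np.column_stack([v ** 2, v, np.ones_like(v)])
            W = np.diag(1.0 / np.asarray(sigma_y, dtype=float) ** 2)
            cov = np.linalg.inv(X.T @ W @ X)
            beta = cov @ X.T @ W @ y
            return beta, np.sqrt(np.diag(cov))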

  20. Critical analysis of uncertainties during particle filtration.

    PubMed

    Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

    2012-09-01

    Using the law of propagation of uncertainties, we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainty (CSU) in filter permeability and in modelling the results for polystyrene latex microsphere filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in the dynamic viscosity and volumetric flowrate of the microsphere suspension have the greatest influence on the overall CSU in filter permeability, which agrees excellently with results obtained from Monte Carlo simulations. Two model parameters, "maximum critical retention concentration" and "minimum injection velocity", and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least-squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in the critical retention concentration. The model with the internal cake porosity reproduces the experimental critical-retention-concentration-versus-velocity data better than the second model, which contains the total electrostatic force, whose value and uncertainty could not be reliably calculated due to the lack of experimental dielectric data. PMID:23020418

  1. DO MODEL UNCERTAINTY WITH CORRELATED INPUTS

    EPA Science Inventory

    The effect of correlation among the input parameters and variables on the output uncertainty of the Streeter-Phelps water quality model is examined. Three uncertainty analysis techniques are used: sensitivity analysis, first-order error analysis, and Monte Carlo simulation. Modifie...

  2. Uncertainties in the JPL planetary ephemeris

    NASA Astrophysics Data System (ADS)

    Folkner, W. M.

    2011-10-01

    The numerically integrated planetary ephemerides by JPL, IMCCE, and IPA are largely based on the same observation set and dynamical models. The differences between ephemerides are expected to be consistent within uncertainties. Uncertainties in the orbits of the major planets and the dwarf planet Pluto based on recent analysis at JPL are described.

  3. Estimating the uncertainty in underresolved nonlinear dynamics

    SciTech Connect

    Chorin, Alexandre; Hald, Ole

    2013-06-12

    The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.

  4. Uncertainty quantification in hybrid dynamical systems

    NASA Astrophysics Data System (ADS)

    Sahai, Tuhin; Pasini, José Miguel

    2013-03-01

    Uncertainty quantification (UQ) techniques are frequently used to ascertain output variability in systems with parametric uncertainty. Traditional algorithms for UQ are either system-agnostic and slow (such as Monte Carlo) or fast with stringent assumptions on smoothness (such as polynomial chaos and Quasi-Monte Carlo). In this work, we develop a fast UQ approach for hybrid dynamical systems by extending the polynomial chaos methodology to these systems. To capture discontinuities, we use a wavelet-based Wiener-Haar expansion. We develop a boundary layer approach to propagate uncertainty through separable reset conditions. We also introduce a transport theory based approach for propagating uncertainty through hybrid dynamical systems. Here the expansion yields a set of hyperbolic equations that are solved by integrating along characteristics. The solution of the partial differential equation along the characteristics allows one to quantify uncertainty in hybrid or switching dynamical systems. The above methods are demonstrated on example problems.
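
    The machinery being extended can be illustrated with plain non-intrusive polynomial chaos for a smooth map (the paper's Wiener-Haar wavelet basis, which this sketch does not implement, is what handles discontinuities in hybrid systems):

        import math
        import numpy as np

        def hermite_pce(f, order=4, nquad=32):
            # Non-intrusive polynomial chaos: project f(xi), xi ~ N(0,1), onto
            # probabilists' Hermite polynomials He_k using Gauss-Hermite
            # quadrature. Mean = c_0; variance = sum_{k>=1} k! * c_k^2, since
            # E[He_j He_k] = k! delta_jk under the standard normal measure.
            nodes, weights = np.polynomial.hermite_e.hermegauss(nquad)
            weights = weights / weights.sum()   # normalize to N(0,1) weights
            fvals = f(nodes)
            coeffs = []
            for k in range(order + 1):
                Hk = np.polynomial.hermite_e.hermeval(nodes, [0.0] * k + [1.0])
                coeffs.append(np.sum(weights * fvals * Hk) / math.factorial(k))
            mean = coeffs[0]
            var = sum(math.factorial(k) * c ** 2
                      for k, c in enumerate(coeffs) if k >= 1)
            return mean, var

        mean, var = hermite_pce(np.exp)   # exact mean is e**0.5 ~ 1.6487
        print(f"PC mean = {mean:.4f}, PC variance = {var:.4f}")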

  5. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their lifetime of exposure, PV systems show a gradual decline in performance that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, the total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important consideration for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor, which can be corrected through regular calibration, can otherwise lead to substantially erroneous degradation rates. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
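
    A toy Monte Carlo along these lines (all numbers invented for illustration) shows why drift matters while a fixed calibration offset hardly does: a slow sensor drift masquerades as extra degradation, whereas a constant offset only rescales the series:

        import numpy as np

        rng = np.random.default_rng(42)
        months = np.arange(120)      # ten years of monthly performance ratios
        true_rate = -0.005 / 12      # -0.5 %/year expressed per month

        def fitted_rate(drift_per_month=0.0, offset=0.0):
            # Simulate a performance-ratio series with measurement noise, a
            # constant sensor offset (absolute accuracy) and a slow linear
            # sensor drift; return the apparent degradation rate in %/year
            # from a straight-line fit.
            pr = 1.0 + true_rate * months
            measured = (pr + rng.normal(0.0, 0.005, months.size)) \
                       * (1.0 + offset) * (1.0 + drift_per_month * months)
            return np.polyfit(months, measured, 1)[0] * 12 * 100

        drift = [fitted_rate(drift_per_month=-1e-4) for _ in range(500)]
        fixed = [fitted_rate(offset=0.02) for _ in range(500)]
        print("true rate              : -0.50 %/yr")
        print(f"with -0.01 %/mo drift  : {np.mean(drift):.2f} %/yr")
        print(f"with +2 % fixed offset : {np.mean(fixed):.2f} %/yr")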

  6. Habitable zone dependence on stellar parameter uncertainties

    SciTech Connect

    Kane, Stephen R.

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the dependence of HZ boundary uncertainties on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.

  7. Managing uncertainties in the surgical scheduling.

    PubMed

    Wiyartanti, Lisa; Park, Myon Woong; Chung, Dahee; Kim, Jae Kwan; Sohn, Young Tae; Kwon, Gyu Hyun

    2015-01-01

    Current surgical scheduling systems have difficulty handling unpredictable events and uncertainties. Sources of uncertainty may come from the patient or the surgery itself, and several cases require immediate changes to the schedule, such as when a surgery delay or cancellation occurs on the same day. This study aimed to model these uncertainties for managing identified uncertainties during continuous scheduling, framed by the resilience concept to cope with the system's fragility. In order to be able to control and adjust any changes which may affect the surgery schedule of the day, we provide alternative solutions rather than strictly deciding on the best-valued option. We identified dimensions of uncertainty and categorized them based on the resilience concept, and computed the impact value of potentially conflicting resources as a result of a schedule change. With the model applied, we can provide a list of the most acceptable and least vulnerable alternatives for the anesthesiologist, as scheduler, to build resilience into the surgical scheduling. PMID:25991171

  8. Contending with uncertainty in conservation management decisions

    PubMed Central

    McCarthy, Michael A

    2014-01-01

    Efficient conservation management is particularly important because current spending is estimated to be insufficient to conserve the world's biodiversity. However, efficient management is confounded by uncertainty that pervades conservation management decisions. Uncertainties exist in objectives, dynamics of systems, the set of management options available, the influence of these management options, and the constraints on these options. Probabilistic and nonprobabilistic quantitative methods can help contend with these uncertainties. The vast majority of these account for known epistemic uncertainties, with methods optimizing the expected performance or finding solutions that achieve minimum performance requirements. Ignorance and indeterminacy continue to confound environmental management problems. While quantitative methods to account for uncertainty must aid decisions if the underlying models are sufficient approximations of reality, whether such models are sufficiently accurate has not yet been examined. PMID:25138920

  9. The principle of finiteness - a guideline for physical laws

    NASA Astrophysics Data System (ADS)

    Sternlieb, Abraham

    2013-04-01

    I propose a new principle in physics: the principle of finiteness (FP). It stems from the definition of physics as a science that deals with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, FP postulates that the mathematical formulation of legitimate laws in physics should prevent exactly zero or infinite solutions. I propose finiteness as a postulate, as opposed to a statement whose validity has to be corroborated by, or derived theoretically or experimentally from, other facts, theories or principles. Some consequences of FP are discussed, first in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The corrected Lorentz transformations include an additional translation term depending on the minimum length epsilon. The relativistic gamma is replaced by a corrected gamma that is finite for v = c. To comply with FP, physical laws should include the relevant extremum finite values in their mathematical formulation. An important prediction of FP is that there is a maximum attainable relativistic mass/energy which is the same for all subatomic particles, meaning that there is a maximum theoretical value for cosmic-ray energy. The Generalized Uncertainty Principle required by Quantum Gravity is actually a necessary consequence of FP at the Planck scale. Therefore, FP may contribute to the axiomatic foundation of Quantum Gravity.

  10. Form of prior for constrained thermodynamic processes with uncertainty

    NASA Astrophysics Data System (ADS)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about the likely values of the uncertain parameters. The priors are derived explicitly for both entropy-conserving and energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation of the prior and the estimated value of temperature obtained from it. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.

  11. The legal status of Uncertainty

    NASA Astrophysics Data System (ADS)

    Altamura, M.; Ferraris, L.; Miozzo, D.; Musso, L.; Siccardi, F.

    2011-03-01

    An exponential improvement of numerical weather prediction (NWP) models was observed during the last decade (Lynch, 2008). Civil Protection (CP) systems have exploited meteorological services in order to redeploy their actions towards the prediction and prevention of events rather than towards an exclusively response-oriented mechanism1. Nevertheless, experience tells us that NWP models, even if assisted by real-time observations, are far from being deterministic. Complications frequently emerge in medium- to long-range forecasting, which is subject to sudden modifications. On the other hand, short-term forecasts, if seen through the lens of criminal trials2, are to the same extent scarcely reliable (Molini et al., 2009). One particular episode related to wrong forecasts, in the Italian panorama, has deeply frightened CP operators: the NWP model in force missed a meteorological adversity which, in fact, caused death and dealt severe damage in the province of Vibo Valentia (2006). This event turned into a much-discussed trial, lasting over three years, brought against those who held the legal position of guardianship within the CP. A first set of data is now available showing that, in concomitance with the trial of Vibo Valentia, the number of alerts issued rose almost threefold. We sustain the hypothesis that the beginning of the process of overcriminalization (Husak, 2008) of CPs is currently increasing the number of false alerts, with the consequent effect of weakening alert perception and response by the citizenry (Breznitz, 1984). The common misunderstanding of such an issue, i.e. the inherent uncertainty in weather predictions, mainly by prosecutors and judges, and generally by those who deal with law and justice, is creating the basis for a defensive behaviour3 within CPs. This paper intends, thus, to analyse the social and legal relevance of uncertainty in the process of issuing meteo-hydrological alerts by CPs. Footnotes: 1 The Italian Civil Protection has been working in this direction since 1992 (L. 225/92). An example of this effort is clearly given by the Prime Minister Decree (DPCM 20/12/2001, "Linee guida relative ai piani regionali per la programmazione delle attività di previsione, prevenzione e lotta attiva contro gli incendi boschivi" - Guidelines for regional plans for the planning of prediction, prevention and forest fire fighting activities) which, already in 2001, emphasized that "the most appropriate approach to pursue the preservation of forests is to promote and encourage prediction and prevention activities rather than giving priority to the emergency phase focused on fire-fighting". 2 Supreme Court of the United States, In re Winship (No. 778), argued: 20 January 1970, decided: 31 March 1970: proof beyond a reasonable doubt, which is required by the Due Process Clause in criminal trials, is among the "essentials of due process and fair treatment". 3 In Kessler and McClellan (1996): "Defensive medicine is a potentially serious social problem: if fear of liability drives health care providers to administer treatments that do not have worthwhile medical benefits, then the current liability system may generate inefficiencies much larger than the costs of compensating malpractice claimants".

  12. The water sensitive city: principles for practice.

    PubMed

    Wong, T H F; Brown, R R

    2009-01-01

    With the widespread realisation of the significance of climate change, urban communities are increasingly seeking to ensure resilience to future uncertainties in urban water supplies, yet change seems slow, with many cities facing ongoing investment in the conventional approach. This is because transforming cities into more sustainable urban water cities, or Water Sensitive Cities, requires a major overhaul of the hydro-social contract that underpins conventional approaches. This paper provides an overview of the emerging research and practice focused on system resilience and principles of sustainable urban water management. Three key pillars that need to underpin the development and practice of a Water Sensitive City are proposed: (i) access to a diversity of water sources underpinned by a diversity of centralised and decentralised infrastructure; (ii) provision of ecosystem services for the built and natural environment; and (iii) socio-political capital for sustainability and water-sensitive behaviours. While there is not one example in the world of a Water Sensitive City, there are cities that lead on distinct and varying attributes of the water sensitive approach, and examples from Australia and Singapore are presented. PMID:19657162

  13. Forecast communication through the newspaper Part 2: perceptions of uncertainty

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.

    2015-04-01

    In the first part of this review, I defined the media filter and how it can operate to frame and blame the forecaster for losses incurred during an environmental disaster. In this second part, I explore the meaning and role of uncertainty when a forecast, and its basis, is communicated through the response and decision-making chain to the newspaper, especially during a rapidly evolving natural disaster which has far-reaching business, political, and societal impacts. Within the media-based communication system, there remains a fundamental disconnect between various stakeholders over the definition of uncertainty and the interpretation of the delivered forecast. The definition and use of uncertainty differ especially between scientific, media, business, and political stakeholders. This is a serious problem for the scientific community when delivering forecasts to the public through the press. As reviewed in Part 1, the media filter can result in a negative frame, which is itself a result of bias, slant, spin, and agenda setting introduced during passage of the forecast and its uncertainty through the media filter. The result is invariably one of anger and fury, which causes loss of credibility and blaming of the forecaster. Generation of a negative frame can be aided by opacity of the decision-making process that the forecast is used to support. The impact of the forecast will be determined during passage through the decision-making chain, where the precautionary principle and cost-benefit analysis, for example, will likely be applied. Choice of forecast delivery format, vehicle of communication, syntax of delivery, and lack of follow-up measures can further contribute to causing the forecast and its role to be misrepresented. Follow-up measures to negative frames may include appropriately worded press releases and conferences that target forecast misrepresentation or misinterpretation in an attempt to swing the slant back in favor of the forecaster. A review of the meteorological, public health, media studies, social science, and psychology literature opens up a vast and interesting library that is not obvious to the volcanologist at first glance. It shows that forecasts and their uncertainty can be phrased, delivered, and followed up upon in a manner that reduces the chance of message distortion. The mass-media delivery vehicle requires careful tracking because the potential for forecast distortion can result in a frame that the scientific response is "absurd", "confused", "shambolic", or "dysfunctional". This can help set up a "frightened", "frustrated", "angry", even "furious" reaction to the forecast and forecaster.

  14. Dealing with uncertainties - communication between disciplines

    NASA Astrophysics Data System (ADS)

    Overbeek, Bernadet; Bessembinder, Janette

    2013-04-01

    Climate adaptation research inevitably involves uncertainty issues - whether people are building a model, using climate scenarios, or evaluating policy processes. However, do researchers know which uncertainties are relevant in their field of work? And which uncertainties exist in the data from other disciplines that they use (e.g. climate data, land use, hydrological data), and how do they propagate? From experience in Dutch research programmes on climate change we know that disciplines often deal differently with uncertainties. This complicates communication between disciplines, and also with the various users of data and information on climate change and its impacts. In October 2012 an autumn school was organized within the Knowledge for Climate Research Programme in the Netherlands, with dealing with and communicating about uncertainties - in climate and socio-economic scenarios, in impact models, and in the decision-making process - as its central theme. The lectures and discussions contributed to the development of a common frame of reference (CFR) for dealing with uncertainties. The common frame contains the following: 1. Common definitions (typology of uncertainties, robustness); 2. Common understanding (why do we consider it important to take uncertainties into account) and aspects on which we disagree (how far should scientists go in communication?); 3. Documents that are considered important by all participants; 4. Do's and don'ts in dealing with uncertainties and communicating about uncertainties (e.g. know your audience, check how your figures are interpreted); 5. Recommendations for further action (e.g. the need for a platform to exchange experiences). The CFR is meant to help researchers in climate adaptation to work together and communicate on climate change (better interaction between disciplines). It is also meant to help researchers explain to others (e.g. decision makers) why and when researchers agree, and when and why they disagree, and on what exactly. During the presentation some results of this autumn school will be presented.

  15. Intelligent Information Retrieval: Diagnosing Information Need. Part II. Uncertainty Expansion in a Prototype of a Diagnostic IR Tool.

    ERIC Educational Resources Information Center

    Cole, Charles; Cantero, Pablo; Sauve, Diane

    1998-01-01

    Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…

  16. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    SciTech Connect

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab.
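
    The flavor of the DUA method can be sketched as first-order propagation of moments: model sensitivities (which GRESS and ADGEN obtain by automated differentiation of the model source; central differences stand in for them here) are combined analytically with input standard deviations, with no sampling. The model and numbers below are placeholders:

        import numpy as np

        def dua_stddev(model, x0, u):
            # First-order deterministic propagation:
            #   u_y^2 = sum_i (dy/dx_i)^2 * u_i^2
            # Derivatives via central differences here; GRESS/ADGEN would
            # generate them from the model source code instead.
            x0 = np.asarray(x0, dtype=float)
            u = np.asarray(u, dtype=float)
            grad = np.empty_like(x0)
            for i in range(x0.size):
                h = 1e-6 * max(abs(x0[i]), 1.0)
                xp, xm = x0.copy(), x0.copy()
                xp[i] += h
                xm[i] -= h
                grad[i] = (model(xp) - model(xm)) / (2.0 * h)
            return float(np.sqrt(np.sum((grad * u) ** 2)))

        model = lambda x: x[0] * np.exp(-x[1]) + x[2]   # placeholder model
        print(dua_stddev(model, x0=[2.0, 0.5, 1.0], u=[0.1, 0.05, 0.2]))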

  17. Introducing a Simple Guide for the evaluation and expression of the uncertainty of NIST measurement results

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio

    2016-02-01

    The current guidelines for the evaluation and expression of the uncertainty of NIST measurement results were originally published in 1993 as NIST Technical Note 1297, which was last revised in 1994. NIST is now updating its principles and procedures for uncertainty evaluation to address current and emerging needs in measurement science that Technical Note 1297 could not have anticipated or contemplated when it was first conceived. Although progressive and forward-looking, this update is also conservative because it does not require that current practices for uncertainty evaluation be abandoned or modified where they are fit for purpose and when there is no compelling reason to do otherwise. The updated guidelines are offered as a Simple Guide intended to be deployed under the NIST policy on Measurement Quality, and are accompanied by a rich collection of examples of application drawn from many different fields of measurement science.

  18. Ideas underlying quantification of margins and uncertainties(QMU): a white paper.

    SciTech Connect

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  19. Evaluation of the measurement uncertainty in screening immunoassays in blood establishments: computation of diagnostic accuracy models.

    PubMed

    Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard

    2015-02-01

    The European Union regulation for blood establishments does not require the evaluation of measurement uncertainty in virology screening tests, which is required by the ISO 15189 guideline following GUM principles. GUM modular approaches have been discussed by medical laboratory researchers, but no consensus has been achieved regarding practical application. Meanwhile, the application of empirical approaches fulfilling GUM principles has gained support. Blood establishments' screening tests accredited under ISO 15189 need an appropriately selected model, even though GUM models are intended solely for quantitative examination procedures. Alternative (to GUM) models focused on probability have been proposed for medical laboratories' diagnostic tests. This article reviews, discusses and proposes models for diagnostic accuracy in blood establishments' screening tests. The output of these models is an alternative to the VIM's measurement uncertainty concept. Example applications are provided for an anti-HCV test, where calculations were performed using a commercial spreadsheet. The results show that these models satisfy ISO 15189 principles and that the estimation of clinical sensitivity, clinical specificity, binary results agreement and the area under the ROC curve are alternatives to the measurement uncertainty concept. PMID:25617905
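
    A minimal sketch of the accuracy quantities the article proposes (the counts and the choice of Wilson score intervals are illustrative assumptions, not taken from the article):

        import math

        def wilson(k, n, z=1.96):
            # Wilson score interval for a binomial proportion k/n.
            p = k / n
            denom = 1 + z ** 2 / n
            centre = (p + z ** 2 / (2 * n)) / denom
            half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
            return centre - half, centre + half

        # Hypothetical anti-HCV screening outcomes versus a confirmatory reference:
        TP, FN, TN, FP = 98, 2, 395, 5
        sens, spec = TP / (TP + FN), TN / (TN + FP)
        lo, hi = wilson(TP, TP + FN)
        print(f"clinical sensitivity = {sens:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
        lo, hi = wilson(TN, TN + FP)
        print(f"clinical specificity = {spec:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")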

  20. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

    2014-11-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross-correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a valid measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an outlier measurement. Finally, the type and significance of the error distribution function are investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data and experimental measurements. In this work, U_68.5 uncertainties are estimated at the 68.5% confidence level while U_95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimating the uncertainty of individual PIV measurements.
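
    One commonly used correlation-plane SNR metric in this line of work, the primary peak ratio, can be sketched as follows (a simplified stand-in for the paper's full set of metrics; the minimum subtraction mirrors the background correction described above):

        import numpy as np
        from scipy.ndimage import maximum_filter

        def primary_peak_ratio(corr):
            # Primary peak ratio: height of the tallest correlation peak
            # divided by the second-tallest local maximum, after subtracting
            # the plane minimum to remove the background-noise offset.
            # Higher PPR suggests lower expected displacement uncertainty.
            c = corr - corr.min()
            is_peak = (c == maximum_filter(c, size=3))
            peaks = np.sort(c[is_peak])[::-1]
            return float(peaks[0] / peaks[1]) if peaks.size > 1 else float("inf")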