Extended uncertainty from first principles
NASA Astrophysics Data System (ADS)
Costa Filho, Raimundo N.; Braga, João P. M.; Lira, Jorge H. S.; Andrade, José S.
2016-04-01
A translation operator acting in a space with a diagonal metric is introduced to describe the motion of a particle in a quantum system. We show that the momentum operator and, as a consequence, the uncertainty relation now depend on the metric. It is also shown that, for any metric expanded up to second order, this formalism naturally leads to an extended uncertainty principle (EUP) with a minimum momentum dispersion. The Ehrenfest theorem is modified to include an additional term related to a tidal force arising from the space curvature introduced by the metric. For one-dimensional systems, we show how to map a harmonic potential to an effective potential in Euclidean space using different metrics.
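As a worked illustration (a generic textbook form of the EUP, not an equation quoted from this abstract), a metric-induced term proportional to (Δx)² yields a minimum momentum dispersion:

```latex
\Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}\Bigl(1 + \alpha\,(\Delta x)^{2}\Bigr)
\quad\Longrightarrow\quad
\Delta p_{\min} = \hbar\sqrt{\alpha}\,,
```

since the right-hand side divided by Δx is minimized at Δx = 1/√α, with α > 0 set by the curvature scale.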
Quantum mechanics and the generalized uncertainty principle
Bang, Jang Young; Berger, Michael S.
2006-12-15
The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
Gamma-Ray Telescope and Uncertainty Principle
ERIC Educational Resources Information Center
Shivalingaswamy, T.; Kagali, B. A.
2012-01-01
Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…
The Species Delimitation Uncertainty Principle
Adams, Byron J.
2001-01-01
If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874
Disturbance, the uncertainty principle and quantum optics
NASA Technical Reports Server (NTRS)
Martens, Hans; Demuynck, Willem M.
1993-01-01
It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.
Extrapolation, uncertainty factors, and the precautionary principle.
Steel, Daniel
2011-09-01
This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards.
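The arithmetic behind such uncertainty factors is simple; the default two-factor scheme (a standard regulatory convention, not a figure taken from this essay) divides the no-observed-adverse-effect level by one 10-fold factor for animal-to-human extrapolation and another for human variability:

```latex
\text{RfD} \;=\; \frac{\text{NOAEL}}{UF_{\text{interspecies}} \times UF_{\text{intraspecies}}}
\;=\; \frac{\text{NOAEL}}{10 \times 10}
\;=\; \frac{\text{NOAEL}}{100}.
```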
Curriculum in Art Education: The Uncertainty Principle.
ERIC Educational Resources Information Center
Sullivan, Graeme
1989-01-01
Identifies curriculum as the pivotal link between theory and practice, noting that all stages of curriculum research and development are characterized by elements of uncertainty. States that this uncertainty principle reflects the reality of practice as it mirrors the contradictory nature of art, the pluralism of schools and society, and the…
Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.
ERIC Educational Resources Information Center
McKerrow, K. Kelly; McKerrow, Joan E.
1991-01-01
The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)
An uncertainty principle for unimodular quantum groups
Crann, Jason; Kalantar, Mehrdad
2014-08-15
We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
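For orientation, the simplest instance of the classical result being generalized (the discrete Fourier transform on the cyclic group Z_n; a standard form, not quoted from the paper) reads:

```latex
H\bigl(|f|^{2}\bigr) + H\bigl(|\hat f|^{2}\bigr) \;\ge\; \log n,
\qquad
H(p) = -\sum_{j} p_{j}\log p_{j},
```

for unit-normalized f; the quantum-group statements replace Z_n and its dual by a quantum group and the entropies by noncommutative analogues.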
A Principle of Uncertainty for Information Seeking.
ERIC Educational Resources Information Center
Kuhlthau, Carol C.
1993-01-01
Proposes an uncertainty principle for information seeking based on the results of a series of studies that investigated the user's perspective of the information search process. Constructivist theory is discussed as a conceptual framework for studying the user's perspective, and areas for further research are suggested. (Contains 44 references.)…
Geometric formulation of the uncertainty principle
NASA Astrophysics Data System (ADS)
Bosyk, G. M.; Osán, T. M.; Lamberti, P. W.; Portesi, M.
2014-03-01
A geometric approach to formulate the uncertainty principle between quantum observables acting on an N-dimensional Hilbert space is proposed. We consider the fidelity between a density operator associated with a quantum system and a projector associated with an observable, and interpret it as the probability of obtaining the outcome corresponding to that projector. We make use of fidelity-based metrics such as angle, Bures, and root infidelity to propose a measure of uncertainty. The triangle inequality allows us to derive a family of uncertainty relations. In the case of the angle metric, we recover the Landau-Pollak inequality for pure states and show, in a natural way, how to extend it to the case of mixed states in arbitrary dimension. In addition, we derive and compare alternative uncertainty relations when using other known fidelity-based metrics.
A review of the generalized uncertainty principle.
Tawfik, Abdel Nasser; Diab, Abdel Magied
2015-12-01
Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022
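The minimal-length statement reviewed here can be illustrated by the most common GUP form (a representative expression; the deformation parameter β is model-dependent):

```latex
\Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}\Bigl(1 + \beta\,(\Delta p)^{2}\Bigr)
\quad\Longrightarrow\quad
\Delta x_{\min} = \hbar\sqrt{\beta}\,,
```

so no measurement can localize a particle below ħ√β, usually tied to the Planck length.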
Semiquantum chaos and the uncertainty principle
NASA Astrophysics Data System (ADS)
Kowalski, A. M.; Martin, M. T.; Nuñez, J.; Plastino, A.; Proto, A. N.
2000-01-01
With reference to a recently advanced semi-classical model (Cooper et al., Phys. Rev. Lett. 72 (1994) 1337), we study the quantum-averaged behaviour of the coupling between a classical and a quantum system. This composite system is seen to be described in the language of a classical dynamical system. We show that some characteristics of the putative classical-quantum border become amenable to a type of quantitative analysis involving the uncertainty principle.
Hardy Uncertainty Principle, Convexity and Parabolic Evolutions
NASA Astrophysics Data System (ADS)
Escauriaza, L.; Kenig, C. E.; Ponce, G.; Vega, L.
2016-09-01
We give a new proof of the L² version of Hardy's uncertainty principle based on calculus and on its dynamical version for the heat equation. The reasoning relies on new log-convexity properties and the derivation of optimal Gaussian decay bounds for solutions to the heat equation with Gaussian decay at a future time. We extend the result to heat equations with lower-order variable coefficients.
Uncertainty principle in human visual perception
NASA Astrophysics Data System (ADS)
Trifonov, Mikhael I.; Ugolev, Dmitry A.
1994-05-01
The orthodox data concerning contrast sensitivity estimation for sine-wave gratings were formally analyzed. Our analysis yields a threshold energy value ΔE, an energetic equivalent of a quantum of perception: ΔE = αΔLΔX², where α is a proportionality coefficient, ΔL is the threshold luminance, and ΔX is a half-period of the grating. The value of ΔE is constant for a given mean luminance L of the grating and for the middle spatial-frequency region. Thus an 'exchange' between the luminance threshold ΔL and the spatial resolution ΔX² takes place: an increase of one is followed by a decrease of the other. We treat this phenomenon as an uncertainty principle in human visual perception and prove its correctness for other spatial frequencies. Taking into account the threshold wavelength Δλ and time Δt, the uncertainty principle may be extended to a wider class of visual-perception problems, including the recognition of colored and flickering objects. We therefore suggest that the uncertainty principle proposed above is one of the cornerstones of the evolution of cognitive systems.
Path detection and the uncertainty principle
NASA Astrophysics Data System (ADS)
Storey, Pippa; Tan, Sze; Collett, Matthew; Walls, Daniel
1994-02-01
Quantum mechanics predicts that any detector capable of determining the path taken by a particle through a double slit will destroy the interference. This follows from the principle of complementarity formulated by Niels Bohr: simultaneous observation of wave and particle behaviour is prohibited. But such a description makes no reference to the physical mechanism by which the interference is lost. In the best-studied welcher Weg ('which path') detection schemes [1,2], interference is lost by the transfer of momentum to the particle whose path is being determined, the extent of momentum transfer satisfying the position-momentum uncertainty relation. This has prompted the question of whether complementarity is always enforced in welcher Weg schemes by momentum transfer. Scully et al. [3] have recently responded in the negative, suggesting that complementarity must be accepted as an independent component of quantum mechanics, rather than as simply a consequence of the uncertainty principle. But we show here that, in any path detection scheme involving a fixed double slit, the amount of momentum transferred to the particle by a perfectly efficient detector (one capable of resolving the path unambiguously) is related to the slit separation in accordance with the uncertainty principle. If less momentum than this is transferred, interference is not completely destroyed and the path detector cannot be perfectly efficient.
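Schematically (our notation, not the authors'), the trade-off for slit separation d can be written as:

```latex
\Delta p_{\text{kick}} \;\gtrsim\; \frac{\hbar}{d}\,,
```

i.e. a detector that resolves the path unambiguously must transfer a momentum of order ħ/d, which is comparable to the momentum-space fringe spacing and hence sufficient to wash out the interference.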
Quantum randomness certified by the uncertainty principle
NASA Astrophysics Data System (ADS)
Vallone, Giuseppe; Marangon, Davide G.; Tomasin, Marco; Villoresi, Paolo
2014-11-01
We present an efficient method to extract the amount of true randomness that can be obtained by a quantum random number generator (QRNG). By repeating the measurements of a quantum system and by swapping between two mutually unbiased bases, a lower bound on the achievable true randomness can be evaluated. The bound is obtained thanks to the uncertainty principle of complementary measurements applied to min-entropy and max-entropy. We tested our method with two different QRNGs, using trains of qubits or ququarts, and demonstrated the scalability toward practical applications.
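The certification step can be sketched numerically. This is the idealized i.i.d. form of the min-/max-entropy bound for a pair of mutually unbiased bases (the paper works with smooth entropies and finite-size effects); the function names are ours:

```python
import math

def max_entropy(probs):
    """Renyi max-entropy: H_max = 2 * log2(sum_j sqrt(p_j))."""
    return 2 * math.log2(sum(math.sqrt(p) for p in probs))

def certified_min_entropy(check_probs, d):
    """Lower bound H_min(Z) >= log2(d) - H_max(X) for mutually
    unbiased bases Z (generation basis) and X (check basis)."""
    return math.log2(d) - max_entropy(check_probs)

# Deterministic check-basis statistics certify a full bit per qubit;
# uniform check-basis statistics certify nothing.
full = certified_min_entropy([1.0, 0.0], d=2)   # -> 1.0
none = certified_min_entropy([0.5, 0.5], d=2)   # -> 0.0
```

The certified min-entropy then feeds a randomness extractor that compresses the raw output to that many near-uniform bits.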
The uncertainty principle and quantum chaos
NASA Technical Reports Server (NTRS)
Chirikov, Boris V.
1993-01-01
The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, on the finite and different time scales only. The ultimate origin of such a universal quantum stability is in the fundamental uncertainty principle which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.
Dilaton cosmology, noncommutativity, and generalized uncertainty principle
Vakili, Babak
2008-02-15
The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momenta operators. I extend these deformed commutating relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.
Lorentz invariance violation and generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag
2016-01-01
There are several theoretical indications that the quantum gravity approaches may have predictions for a minimal measurable length, and a maximal observable momentum and throughout a generalization for Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification in the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations for Hubble parameter in redshift-dependence in early-type galaxies. We find that LIV has two types of contributions to the time of flight delay Δ t comparable with that observations. Although the wrong OPERA measurement on faster-than-light muon neutrino anomaly, Δ t, and the relative change in the speed of muon neutrino Δ v in dependence on redshift z turn to be wrong, we utilize its main features to estimate Δ v. Accordingly, the results could not be interpreted as LIV. A third comparison is made with the ultra high-energy cosmic rays (UHECR). It is found that an essential ingredient of the approach combining string theory, loop quantum gravity, black hole physics and doubly spacial relativity and the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence are essential inputs for a reliable confronting of our calculations to UHECR. The sensitivity factor is related to the special time of flight delay and the time structure of the signal. Furthermore, the upper and lower bounds to the parameter, a that characterizes the generalized uncertainly principle, have to be fixed in related physical systems such as the gamma rays bursts.
Signals on Graphs: Uncertainty Principle and Sampling
NASA Astrophysics Data System (ADS)
Tsitsvero, Mikhail; Barbarossa, Sergio; Di Lorenzo, Paolo
2016-09-01
In many applications, the observations can be represented as a signal defined over the vertices of a graph. The analysis of such signals requires the extension of standard signal processing tools. In this work, first, we provide a class of graph signals that are maximally concentrated on the graph domain and on its dual. Then, building on this framework, we derive an uncertainty principle for graph signals and illustrate the conditions for the recovery of band-limited signals from a subset of samples. We show an interesting link between uncertainty principle and sampling and propose alternative signal recovery algorithms, including a generalization to frame-based reconstruction methods. After showing that the performance of signal recovery algorithms is significantly affected by the location of samples, we suggest and compare a few alternative sampling strategies. Finally, we provide the conditions for perfect recovery of a useful signal corrupted by sparse noise, showing that this problem is also intrinsically related to vertex-frequency localization properties.
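The band-limited recovery condition can be illustrated in a few lines. This is a minimal numerical sketch, not the authors' algorithms; the path graph, bandwidth, and sample set are arbitrary choices for demonstration:

```python
import numpy as np

n, k = 8, 3                                   # vertices, bandwidth
# Combinatorial Laplacian of an n-vertex path graph.
L = (np.diag([1] + [2] * (n - 2) + [1])
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))
_, U = np.linalg.eigh(L)                      # graph Fourier basis (columns)
Uk = U[:, :k]                                 # k lowest-frequency modes

rng = np.random.default_rng(0)
x = Uk @ rng.standard_normal(k)               # a k-band-limited graph signal

S = [0, 3, 7]                                 # sampled vertices, |S| >= k
c, *_ = np.linalg.lstsq(Uk[S, :], x[S], rcond=None)
x_hat = Uk @ c                                # recovery is exact when
                                              # Uk[S, :] has full rank k
```

The rank (and conditioning) of `Uk[S, :]` is exactly where the sampling-set selection strategies discussed in the abstract enter.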
Generalized uncertainty principle and black hole thermodynamics
NASA Astrophysics Data System (ADS)
Gangopadhyay, Sunandan; Dutta, Abhijit; Saha, Anirban
2014-02-01
We study Schwarzschild and Reissner-Nordström black hole thermodynamics using the simplest form of the generalized uncertainty principle (GUP) proposed in the literature. The expressions for the mass-temperature relation, heat capacity and entropy are obtained in both cases, from which the critical and remnant masses are computed. Our results are exact and reveal that these masses are identical and larger than the so-called singular mass, for which the thermodynamic quantities become ill-defined. The expression for the entropy reveals the well-known area theorem in terms of the horizon area in both cases, up to leading-order corrections from the GUP. The area theorem written in terms of a new variable, which can be interpreted as the reduced horizon area, arises only when the computation is carried out to the next-higher-order correction from the GUP.
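For context, the uncorrected Schwarzschild quantities that the GUP terms modify are (standard results, not expressions from this paper):

```latex
T_{H} \;=\; \frac{\hbar c^{3}}{8\pi G k_{B} M},
\qquad
S \;=\; \frac{k_{B} A}{4\,\ell_{p}^{2}},
```

with the GUP adding subleading corrections to both, e.g. logarithmic-in-area terms in S whose coefficients are model-dependent.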
Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.
ERIC Educational Resources Information Center
Roth, Wolff-Michael
1993-01-01
Heisenberg's uncertainty principle and the derivative notions of interdeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research examined. Implications are drawn for research in science education. (PR)
Incorporation of generalized uncertainty principle into Lifshitz field theories
Faizal, Mir; Majumder, Barun
2015-06-15
In this paper, we incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. We then incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.
Open timelike curves violate Heisenberg's uncertainty principle.
Pienaar, J L; Ralph, T C; Myers, C R
2013-02-01
Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.
Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?
ERIC Educational Resources Information Center
Robertson, Bill
2016-01-01
Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…
The uncertainty principle determines the nonlocality of quantum mechanics.
Oppenheim, Jonathan; Wehner, Stephanie
2010-11-19
Two central concepts of quantum mechanics are Heisenberg's uncertainty principle and a subtle form of nonlocality that Einstein famously called "spooky action at a distance." These two fundamental features have thus far been distinct concepts. We show that they are inextricably and quantitatively linked: Quantum mechanics cannot be more nonlocal with measurements that respect the uncertainty principle. In fact, the link between uncertainty and nonlocality holds for all physical theories. More specifically, the degree of nonlocality of any theory is determined by two factors: the strength of the uncertainty principle and the strength of a property called "steering," which determines which states can be prepared at one location given a measurement at another.
Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.
ERIC Educational Resources Information Center
Bartell, Lawrence S.
1985-01-01
Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)
Uncertainty Principle, Squeezing, and Quantum Groups
NASA Astrophysics Data System (ADS)
Rajagopal, A. K.; Gupta, Virendra
It is shown that the complete form of the Heisenberg Uncertainty Relation (HUR) must be employed in introducing the concepts of squeezing and coherent states in q-quantum mechanics. An important feature of this form of the HUR is that it is invariant under unitary transformations of the operators appearing in it; consequences of this are pointed out.
Entanglement, Identical Particles and the Uncertainty Principle
NASA Astrophysics Data System (ADS)
Rigolin, Gustavo
2016-08-01
A new uncertainty relation (UR) is obtained for a system of N identical pure entangled particles if we use symmetrized observables when deriving the inequality. This new expression can be written in a form where we identify a term which explicitly shows the quantum correlations among the particles that constitute the system. For the particular cases of two and three particles, making use of the Schwarz inequality, we obtain new lower bounds for the UR that are different from the standard one.
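The "standard" lower bound against which the new relation is compared is the Robertson inequality (textbook form, not quoted from the paper):

```latex
\Delta A\,\Delta B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|,
```

which, for the symmetrized N-particle observables used here, acquires the extra inter-particle correlation term described in the abstract.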
Thermodynamics of Black Holes and the Symmetric Generalized Uncertainty Principle
NASA Astrophysics Data System (ADS)
Dutta, Abhijit; Gangopadhyay, Sunandan
2016-06-01
In this paper, we have investigated the thermodynamics of Schwarzschild and Reissner-Nordström black holes using the symmetric generalised uncertainty principle, which contains correction terms involving both momentum and position uncertainties. The mass-temperature relationship and the heat capacity for these black holes have been computed, from which the critical and remnant masses have been obtained. The entropy is found to satisfy the area law up to leading-order logarithmic corrections and corrections of the form A² (a new finding of this paper) arising from the symmetric generalised uncertainty principle.
Risks, scientific uncertainty and the approach of applying precautionary principle.
Lo, Chang-fa
2009-03-01
The paper clarifies the nature and aspects of risk and scientific uncertainty and elaborates an approach for applying the precautionary principle to handle risks arising from scientific uncertainty. It explains the relation between risks and the application of the precautionary principle at the international and domestic levels. Both where an international treaty has admitted the precautionary principle and where no treaty admits it or enumerates the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the content of measures to cope with a potential risk and to avoid excessive measures.
Erythropoietin, uncertainty principle and cancer related anaemia
Clark, Otavio; Adams, Jared R; Bennett, Charles L; Djulbegovic, Benjamin
2002-01-01
Background This study was designed to evaluate if erythropoietin (EPO) is effective in the treatment of cancer-related anemia, and if its effect remains unchanged when data are analyzed according to various clinical and methodological characteristics of the studies. We also wanted to demonstrate that cumulative meta-analysis (CMA) can be used to resolve uncertainty regarding clinical questions. Methods Systematic Review (SR) of the published literature on the role of EPO in cancer-related anemia. A cumulative meta-analysis (CMA) using a conservative approach was performed to determine the point in time when uncertainty about the effect of EPO on transfusion-related outcomes could be considered resolved. Participants: Patients included in randomized studies that compared EPO versus no therapy or placebo. Main outcome measures: Number of patients requiring transfusions. Results Nineteen trials were included. The pooled results indicated a significant effect of EPO in reducing the number of patients requiring transfusions [odds ratio (OR) = 0.41; 95% CI: 0.33 to 0.5; p < 0.00001; relative risk (RR) = 0.61; 95% CI: 0.54 to 0.68]. The results remain unchanged after the sensitivity analyses were performed according to the various clinical and methodological characteristics of the studies. The heterogeneity was less pronounced when OR was used instead of RR as the measure of the summary point estimate. Analysis according to OR was not heterogeneous, but the pooled RR was highly heterogeneous. A stepwise metaregression analysis did point to the possibility that treatment effect could have been exaggerated by inadequacy in allocation concealment and that larger treatment effects are seen at Hb level > 11.5 g/dl. We identified 1995 as the point in time when a statistically significant effect of EPO was demonstrated and after which we considered that uncertainty about EPO efficacy was resolved. Conclusion EPO is effective in the treatment of anemia in cancer patients. This
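The pooling step underlying such a meta-analysis can be sketched as follows. This is a generic fixed-effect (inverse-variance) pooled odds ratio; the 2×2 counts in the example are made up for illustration and are not data from the included trials:

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.
    Each study is a 2x2 table (a, b, c, d):
    a = events (treated),  b = non-events (treated),
    c = events (control),  d = non-events (control)."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))   # study log odds ratio
        var = 1/a + 1/b + 1/c + 1/d            # its approximate variance
        num += log_or / var                    # inverse-variance weighting
        den += 1 / var
    return math.exp(num / den)

# Hypothetical trials: fewer transfused patients in the treated arms.
example = pooled_or([(10, 90, 20, 80), (15, 85, 25, 75)])
```

A cumulative meta-analysis repeats this pooling after each successive trial, ordered by publication date, and asks when the confidence interval first excludes 1.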
An uncertainty principle underlying the functional architecture of V1.
Barbieri, Davide; Citti, Giovanna; Sanguinetti, Gonzalo; Sarti, Alessandro
2012-01-01
We present a model of the morphology of orientation maps in V1 based on the uncertainty principle of the SE(2) group. Starting from the symmetries of the cortex, suitable harmonic analysis instruments are used to obtain coherent states in the Fourier domain as minimizers of the uncertainty. Cortical activities related to orientation maps are then obtained by projection on a suitable cortical Fourier basis.
Single-Slit Diffraction and the Uncertainty Principle
ERIC Educational Resources Information Center
Rioux, Frank
2005-01-01
A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
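The Fourier-transform route can be made concrete numerically. The sketch below is our construction, and it uses a Gaussian aperture rather than a hard-edged slit (the slit's sinc² momentum distribution has a divergent variance); it checks that the position and momentum spreads of a Fourier-transform pair saturate Δx·Δp = ħ/2:

```python
import numpy as np

hbar = 1.0
N, box = 4096, 40.0
x = np.linspace(-box / 2, box / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 1.3
psi = np.exp(-x**2 / (4 * sigma**2))          # Gaussian "aperture" amplitude
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize on the grid

# Momentum-space amplitude via FFT; p = hbar * k.
phi = np.fft.fftshift(np.fft.fft(np.fft.fftshift(psi)))
p = hbar * 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))

def spread(u, amp, du):
    prob = np.abs(amp)**2
    prob /= prob.sum() * du                   # renormalize the density
    mean = np.sum(u * prob) * du
    return np.sqrt(np.sum((u - mean)**2 * prob) * du)

dx_sigma = spread(x, psi, dx)                 # ~ sigma
dp_sigma = spread(p, phi, p[1] - p[0])        # ~ hbar / (2 * sigma)
product = dx_sigma * dp_sigma                 # ~ hbar / 2 (minimum uncertainty)
```

A hard-edged slit of width a instead gives the familiar sinc² pattern with its first zero at p = h/a, the usual qualitative statement of the same trade-off.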
The Uncertainty Principle, Virtual Particles and Real Forces
ERIC Educational Resources Information Center
Jones, Goronwy Tudor
2002-01-01
This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…
The 'Herbivory Uncertainty Principle': application in a cerrado site.
Gadotti, C A; Batalha, M A
2010-05-01
Researchers may alter the ecology of the organisms they study even when carrying out apparently beneficial activities; in herbivory studies, for instance, they may alter the very herbivory damage they measure. We tested whether visit frequency altered herbivory damage, as predicted by the 'Herbivory Uncertainty Principle'. In a cerrado site, we established 80 quadrats, in which we sampled all woody individuals. We used four visit frequencies (high, medium, low, and control), quantifying, at the end of three months, herbivory damage for each species in each treatment. We did not corroborate the 'Herbivory Uncertainty Principle', since visiting frequency did not alter herbivory damage, at least when the whole plant community was taken into account. However, when we analysed each species separately, four out of 11 species presented significant differences in herbivory damage, suggesting that the researcher is not independent of their measurements. The principle could be tested in other ecological studies in which it may occur, such as those on animal behaviour, human ecology, population dynamics, and conservation.
Gauge theories under incorporation of a generalized uncertainty principle
Kober, Martin
2010-10-15
We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle that implies a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.
NASA Technical Reports Server (NTRS)
Athans, M.; Ku, R.; Gershwin, S. B.
1977-01-01
This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle
NASA Astrophysics Data System (ADS)
Oppenheim, Jacob N.; Magnasco, Marcelo O.
2013-01-01
The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
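The 1/(4π) bound quoted above is saturated by a Gaussian pulse, which makes it easy to check numerically. A minimal sketch (an illustration, not from the paper), assuming a unit-width Gaussian and its analytically known Fourier transform, with Δt and Δf taken as the standard deviations of the |signal|² densities in time and frequency:

```python
import math

# A Gaussian g(t) = exp(-t^2 / (2 sigma^2)) has |g(t)|^2 = exp(-t^2 / sigma^2)
# and |G(f)|^2 proportional to exp(-4 pi^2 sigma^2 f^2). Computing both
# spreads by numerical quadrature should reproduce dt * df = 1/(4 pi).
sigma = 1.0
N, T = 20001, 20.0
step = 2 * T / (N - 1)

def spread(xs, density):
    """Standard deviation of an (unnormalized) density sampled on xs."""
    w = sum(density)
    mean = sum(x * p for x, p in zip(xs, density)) / w
    var = sum((x - mean) ** 2 * p for x, p in zip(xs, density)) / w
    return math.sqrt(var)

xs = [-T + i * step for i in range(N)]
time_density = [math.exp(-x * x / sigma ** 2) for x in xs]
freq_density = [math.exp(-4 * math.pi ** 2 * sigma ** 2 * x * x) for x in xs]

dt, df = spread(xs, time_density), spread(xs, freq_density)
print(round(dt * df, 5))    # ~ 1/(4 pi) = 0.07958
```

The subjects in the study beat this product by large factors, which is possible because the bound constrains linear-filter analyses of the signal, not every conceivable estimator of timing and frequency.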
Constraining the generalized uncertainty principle with cold atoms
NASA Astrophysics Data System (ADS)
Gao, Dongfeng; Zhan, Mingsheng
2016-07-01
Various theories of quantum gravity predict the existence of a minimum length scale, which implies Planck-scale modifications of the Heisenberg uncertainty principle to a so-called generalized uncertainty principle (GUP). Previous studies of the GUP focused on its implications for high-energy physics, cosmology, and astrophysics. Here the application of the GUP to low-energy quantum systems, and particularly cold atoms, is studied. Results from the 87Rb atom recoil experiment are used to set upper bounds on parameters in three different GUP proposals. A bound at the 10^14 level on the Ali-Das-Vagenas proposal is found, which is the second best bound so far. A bound at the 10^26 level on Maggiore's proposal is obtained, which turns out to be the best available bound on it.
Quantum black hole in the generalized uncertainty principle framework
Bina, A.; Moslehi, A.; Jalalzadeh, S.
2010-01-15
In this paper we study the effects of the generalized uncertainty principle (GUP) on canonical quantum gravity of black holes. Through the use of modified partition function that involves the effects of the GUP, we obtain the thermodynamical properties of the Schwarzschild black hole. We also calculate the Hawking temperature and entropy for the modification of the Schwarzschild black hole in the presence of the GUP.
Generalized uncertainty principle: implications for black hole complementarity
NASA Astrophysics Data System (ADS)
Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han
2014-12-01
At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg Uncertainty Principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large N rescaling if one assumes only Heisenberg's Uncertainty Principle, the application of the GUP may save complementarity, but only if a certain N-dependence is also assumed. This raises two important questions beyond the scope of this work: whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.
NASA Technical Reports Server (NTRS)
Athans, M.; Ku, R.; Gershwin, S. B.
1976-01-01
The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.
The Precautionary Principle and statistical approaches to uncertainty.
Keiding, Niels; Budtz-Jørgensen, Esben
2004-01-01
The central challenge from the Precautionary Principle to statistical methodology is to help delineate (preferably quantitatively) the possibility that some exposure is hazardous, even in cases where this is not established beyond reasonable doubt. The classical approach to hypothesis testing is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model uncertainty: usually these procedures assume that the class of models describing dose/response is known with certainty; this assumption is, however, often violated, perhaps particularly often when epidemiological data form the source of the risk assessment, and regulatory authorities have occasionally resorted to some average based on competing models. The recent methodology of the Bayesian model averaging might be a systematic version of this, but is this an arena for the Precautionary Principle to come into play?
Nonequilibrium fluctuation-dissipation inequality and nonequilibrium uncertainty principle.
Fleming, C H; Hu, B L; Roura, Albert
2013-07-01
The fluctuation-dissipation relation is usually formulated for a system interacting with a heat bath at finite temperature, and often in the context of linear response theory, where only small deviations from the mean are considered. We show that for an open quantum system interacting with a nonequilibrium environment, where temperature is no longer a valid notion, a fluctuation-dissipation inequality exists. Instead of being proportional, quantum fluctuations are bounded below by quantum dissipation, whereas classically the fluctuations vanish at zero temperature. The lower bound of this inequality is exactly satisfied by (zero-temperature) quantum noise and is in accord with the Heisenberg uncertainty principle, in both its microscopic origins and its influence upon systems. Moreover, it is shown that there is a coupling-dependent nonequilibrium fluctuation-dissipation relation that determines the nonequilibrium uncertainty relation of linear systems in the weak-damping limit.
Nondivergent classical response functions from uncertainty principle: quasiperiodic systems.
Kryvohuz, Maksym; Cao, Jianshu
2005-01-01
Time divergence in linear and nonlinear classical response functions can be removed by taking a phase-space average within the quantized uncertainty volume O(ħ^n) around the microcanonical energy surface. For a quasiperiodic system, the replacement of the microcanonical distribution density in the classical response function with the quantized uniform distribution density results in agreement of quantum and classical expressions through Heisenberg's correspondence principle: each matrix element ⟨u|α(t)|v⟩ corresponds to the (u-v)th Fourier component of α(t) evaluated along the classical trajectory with mean action (J_u + J_v)/2. Numerical calculations for one- and two-dimensional systems show good agreement between quantum and classical results. The generalization to the case of N degrees of freedom is made. Thus, phase-space averaging within the quantized uncertainty volume provides a useful way to establish the classical-quantum correspondence for the linear and nonlinear response functions of a quasiperiodic system.
Minisuperspace dynamics in a generalized uncertainty principle framework
Battisti, Marco Valerio; Montani, Giovanni
2008-01-03
The minisuperspace dynamics of the Friedmann-Robertson-Walker (FRW) and Taub universes in the context of a generalized uncertainty principle is analyzed in detail. In particular, the motion of the wave packets is investigated and, in both models, the classical singularity appears to be probabilistically suppressed. Moreover, the FRW wave packets approach the Planckian region in a stationary way, and no evidence for a Big Bounce, as predicted in Loop Quantum Cosmology, appears. On the other hand, the Taub wave packets provide the right behavior in predicting an isotropic universe.
Classical Dynamics Based on the Minimal Length Uncertainty Principle
NASA Astrophysics Data System (ADS)
Chung, Won Sang
2016-02-01
In this paper we consider the quadratic modification of the Heisenberg algebra and its classical limit version, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in the β-deformed classical dynamics. Finally, we consider the (α, β)-deformed classical dynamics, in which the minimal length uncertainty principle is given by [x̂, p̂] = iħ(1 + αx̂² + βp̂²). For two small parameters α, β, we discuss the free fall of a particle and a composite system in a uniform gravitational field.
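The minimal-length behavior implied by a commutator of this form can be sketched with a short scan; the snippet below is an illustration (not from the paper), setting ħ = 1, α = 0, and an arbitrary β, for which the GUP gives Δx ≥ (1/2)(1/Δp + βΔp) with minimum Δx_min = √β at Δp = 1/√β.

```python
beta = 0.04  # deformation parameter, chosen arbitrarily (units with hbar = 1)

def dx_bound(dp):
    # GUP lower bound on the position uncertainty for [x, p] = i(1 + beta p^2):
    # dx >= (1/2) * (1/dp + beta * dp)
    return 0.5 * (1.0 / dp + beta * dp)

# Scan dp to locate the minimum of the bound numerically.
dps = [0.01 * i for i in range(1, 5001)]          # dp from 0.01 to 50
best_dp = min(dps, key=dx_bound)
print(round(best_dp, 2))              # -> 5.0  (= 1/sqrt(beta))
print(round(dx_bound(best_dp), 3))    # -> 0.2  (= sqrt(beta))
```

Unlike the ordinary Heisenberg bound, the curve does not fall to zero as Δp grows: the βΔp term turns it back up, leaving a nonzero floor on Δx.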
Generalized uncertainty principle in Bianchi type I quantum cosmology
NASA Astrophysics Data System (ADS)
Vakili, B.; Sepangi, H. R.
2007-07-01
We study a quantum Bianchi type I model in which the dynamical variables of the corresponding minisuperspace obey the generalized Heisenberg algebra. Such a generalized uncertainty principle has its origin in the existence of a minimal length suggested by quantum gravity and string theory. We present approximate analytical solutions to the corresponding Wheeler-DeWitt equation in the limit where the scale factor of the universe is small and compare the results with the standard commutative and noncommutative quantum cosmology. Similarities and differences of these solutions are also discussed.
Conflict between the Uncertainty Principle and wave mechanics
NASA Astrophysics Data System (ADS)
Bourdillon, Antony
The traveling wave group that is defined on conserved physical values is the vehicle of transmission for a unidirectional photon or free particle having a wide wave front. As a stable wave packet, it expresses internal periodicity combined with group localization. Heisenberg's Uncertainty Principle is precisely derived from it. The wave group demonstrates serious conflict between the Principle and wave mechanics. Also derived is the phase velocity beyond the horizon set by the speed of light. In this space occurs the reduction of the wave packet that takes place in measurement, represented by comparing phase velocities in the direction of propagation with those in the transverse plane. The new description of the wavefunction for the stable free particle or antiparticle contains variables that were previously ignored. Deterministic physics must always appear probabilistic when hidden variables are bypassed. Secondary hidden variables always occur in measurement. The wave group turns out to be probabilistic. It is ubiquitous in physics and has many consequences.
Uncertainty, imprecision, and the precautionary principle in climate change assessment.
Borsuk, M E; Tomassini, L
2005-01-01
Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
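The minimum-upper-expected-cost criterion described above can be illustrated with a toy example; all costs, scenarios, and candidate probability measures below are invented and stand in for the climate and economics models of the abstract.

```python
# Illustrative sketch: choose the emissions policy whose *upper* expected
# cost, taken over a set of candidate probability measures, is smallest.
# costs[policy] lists the cost under three climate-response scenarios
# (mild, medium, severe); each candidate measure assigns them probabilities.
costs = {
    "high_emissions": [1.0, 4.0, 9.0],
    "low_emissions":  [3.0, 3.5, 4.0],
}
candidate_measures = [
    [0.5, 0.3, 0.2],
    [0.2, 0.3, 0.5],
    [0.1, 0.2, 0.7],
]

def upper_expected_cost(cost_row):
    """Worst-case expectation over the whole set of probability measures."""
    return max(sum(p * c for p, c in zip(ps, cost_row))
               for ps in candidate_measures)

choice = min(costs, key=lambda policy: upper_expected_cost(costs[policy]))
print(choice)                                         # -> low_emissions
print(round(upper_expected_cost(costs[choice]), 2))   # -> 3.8
```

Because expected costs are bounds rather than point values here, the usual minimum-expected-cost rule is indeterminate; minimizing the upper bound resolves the choice in the precautionary direction, as the abstract argues.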
Generalized Uncertainty Principle and Parikh-Wilczek Tunneling
NASA Astrophysics Data System (ADS)
Mehdipour, S. Hamid
We investigate the modifications of the Hawking radiation by the Generalized Uncertainty Principle (GUP) and the tunneling process. By using the GUP-corrected de Broglie wavelength, the squeezing of the fundamental momentum cell, and consequently a GUP-corrected energy, we find the nonthermal effects which lead to a nonzero statistical correlation function between probabilities of tunneling of two massive particles with different energies. Then the recovery of part of the information from the black hole radiation is feasible. From another point of view, the inclusion of the effects of quantum gravity via the GUP expression can halt the evaporation process, so that a stable black hole remnant is left behind, containing the other part of the black hole information content. Therefore, these features of the Planck-scale corrections may solve the information problem in black hole evaporation.
Generalized Uncertainty Principle and Thermostatistics: A Semiclassical Approach
NASA Astrophysics Data System (ADS)
Abbasiyan-Motlaq, Mohammad; Pedram, Pouria
2016-04-01
We present an exact treatment of the thermodynamics of physical systems in the framework of the generalized uncertainty principle (GUP). Our purpose is to study and compare the consequences of two GUPs, one of which implies a minimal length while the other predicts a minimal length and a maximal momentum. Using a semiclassical method, we exactly calculate the modified internal energies and heat capacities in the presence of generalized commutation relations. We show that the total shift in these quantities depends only on the deformed algebra, not on the system under study. Finally, the modified internal energy for a specific physical system, such as the ideal gas, is obtained in the framework of the two different GUPs.
Molecular Response Theory in Terms of the Uncertainty Principle.
Harde, Hermann; Grischkowsky, Daniel
2015-08-27
We investigate the time response of molecular transitions by observing the pulse reshaping of femtosecond THz pulses propagating through polar vapors. By precisely modeling the pulse interaction with the molecular vapors, we derive detailed insight into this time response after an excitation. The measurements, which were performed by applying the powerful technique of THz time domain spectroscopy, are analyzed directly in the time domain or, in parallel, in the frequency domain by Fourier transforming the pulses and comparing them with the molecular response theory. New analyses of the molecular response allow a generalized unification of the basic collision and line-shape theories of Lorentz, van Vleck-Weisskopf, and Debye described by molecular response theory. In addition, they show that the applied THz experimental setup allows the direct observation of the ultimate time response of molecules to an externally applied electric field in the presence of molecular collisions. This response is limited by the uncertainty principle and is determined by the inverse splitting frequency between adjacent levels. At the same time, this response reflects the transition time of a rotational transition to switch from one molecular state to another or to form a coherent superposition of states oscillating with the splitting frequency. The presented investigations are also of fundamental importance for the description of the far-wing absorption of greenhouse gases like water vapor, carbon dioxide, or methane, which have a dominant influence on the radiative exchange in the far-infrared. PMID:26280761
Effect of the Generalized Uncertainty Principle on post-inflation preheating
Chemissany, Wissam; Das, Saurya; Ali, Ahmed Farag; Vagenas, Elias C.
2011-12-01
We examine the effects of the Generalized Uncertainty Principle, predicted by various theories of quantum gravity to replace Heisenberg's uncertainty principle near the Planck scale, on post-inflation preheating in cosmology, and show that it can predict either an increase or a decrease in parametric resonance and a corresponding change in particle production. Possible implications are considered.
Verification of the Uncertainty Principle by Using Diffraction of Light Waves
ERIC Educational Resources Information Center
Nikolic, D.; Nesic, Lj
2011-01-01
We described a simple idea for experimental verification of the uncertainty principle for light waves. We used a single-slit diffraction of a laser beam for measuring the angular width of zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We will assume that the uncertainty in position is the slit width. For the…
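A back-of-envelope version of such a verification can be written down directly; the laser and slit parameters below are invented for illustration. Taking the slit width a as the position uncertainty Δx and the transverse wavenumber spread out to the first diffraction minimum, Δk = k sin θ₁ with sin θ₁ = λ/a, the product Δx·Δk comes out as 2π regardless of the slit width, comfortably above the quantum lower bound of 1/2.

```python
import math

# Invented experimental parameters for a single-slit laser diffraction setup.
wavelength = 650e-9   # red laser, metres
a = 100e-6            # slit width, metres (taken as the position uncertainty)

k = 2 * math.pi / wavelength
sin_theta1 = wavelength / a          # angle of the first diffraction minimum
dx_dk = a * (k * sin_theta1)         # dx * dk = a * (2 pi / lambda) * (lambda / a)
print(round(dx_dk / math.pi, 6))     # -> 2.0, i.e. dx * dk = 2 pi
```

The slit width and wavelength cancel, which is why the experiment verifies the form of the uncertainty relation rather than pinning down a particular scale.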
On different types of uncertainties in the context of the precautionary principle.
Aven, Terje
2011-10-01
Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean.
Entropy of the Randall-Sundrum brane world with the generalized uncertainty principle
Kim, Wontae; Park, Young-Jai; Kim, Yong-Wan
2006-11-15
By introducing the generalized uncertainty principle, we calculate the entropy of the bulk scalar field on the Randall-Sundrum brane background without any cutoff. We obtain the entropy of the massive scalar field proportional to the horizon area. Here, we observe that the mass contribution to the entropy exists in contrast to all previous results of the usual black hole cases with the generalized uncertainty principle.
van Asselt, M B A; Vos, E
2005-01-01
This article explores the use of the precautionary principle in situations of intermingled uncertainty and risk. It analyses how the so-called uncertainty paradox works out by examining the Pfizer case. It reveals regulatory complexities that result from contradictions in precautionary thinking. In conclusion, a plea is made for embedment of uncertainty information, while stressing the need to rethink regulatory reform in the broader sense. PMID:16304932
A discussion on the Heisenberg uncertainty principle from the perspective of special relativity
NASA Astrophysics Data System (ADS)
Nanni, Luca
2016-09-01
In this note, we consider the implications of the Heisenberg uncertainty principle (HUP) when computing uncertainties that affect the main dynamical quantities, from the perspective of special relativity. Using the well-known formula for propagating statistical errors, we prove that the uncertainty relations between the moduli of conjugate observables are not relativistically invariant. The new relationships show that, in experiments involving relativistic particles, limitations of the precision of a quantity obtained by indirect calculations may affect the final result.
Path Integral for Dirac oscillator with generalized uncertainty principle
Benzair, H.; Boudjedaa, T.; Merad, M.
2012-12-15
The propagator for the Dirac oscillator in (1+1) dimensions, with a deformed commutation relation of the Heisenberg algebra, is calculated using the path integral in quadri-momentum representation. Since the mass is related to the momentum, we adapt the space-time transformation method to evaluate the quantum corrections, which depend on the point-discretization interval.
Uncertainty Principle--Limited Experiments: Fact or Academic Pipe-Dream?
ERIC Educational Resources Information Center
Albergotti, J. Clifton
1973-01-01
The question of whether modern experiments are limited by the uncertainty principle or by the instruments used to perform the experiments is discussed. Several key experiments show that the instruments limit our knowledge and the principle remains of strictly academic concern. (DF)
Uncertainty principle for experimental measurements: Fast versus slow probes.
Hansmann, P; Ayral, T; Tejeda, A; Biermann, S
2016-01-01
The result of a physical measurement depends on the time scale of the experimental probe. In solid-state systems, this simple quantum mechanical principle has far-reaching consequences: the interplay of several degrees of freedom close to charge, spin or orbital instabilities combined with the disparity of the time scales associated to their fluctuations can lead to seemingly contradictory experimental findings. A particularly striking example is provided by systems of adatoms adsorbed on semiconductor surfaces where different experiments--angle-resolved photoemission, scanning tunneling microscopy and core-level spectroscopy--suggest different ordering phenomena. Using most recent first principles many-body techniques, we resolve this puzzle by invoking the time scales of fluctuations when approaching the different instabilities. These findings suggest a re-interpretation of ordering phenomena and their fluctuations in a wide class of solid-state systems ranging from organic materials to high-temperature superconducting cuprates. PMID:26829902
Nonlinear Schrödinger equation from generalized exact uncertainty principle
NASA Astrophysics Data System (ADS)
Rudnicki, Łukasz
2016-09-01
Inspired by the generalized uncertainty principle, which adds gravitational effects to the standard description of quantum uncertainty, we extend the exact uncertainty principle approach of Hall and Reginatto (2002 J. Phys. A: Math. Gen. 35 3289) and obtain a (quasi)nonlinear Schrödinger equation. This quantum evolution equation of unusual form enjoys several desired properties, such as separation of non-interacting subsystems or plane-wave solutions for free particles. Starting with the harmonic oscillator example, we show that every solution of this equation respects the gravitationally induced minimal position uncertainty proportional to the Planck length. Quite surprisingly, our result successfully merges the core of classical physics with non-relativistic quantum mechanics in its extremal form. We predict that the commonly accepted phenomenon, namely a modification of the free-particle dispersion relation due to quantum gravity, might not occur in reality.
NASA Technical Reports Server (NTRS)
Athans, M.; Ku, R.; Gershwin, S. B.
1977-01-01
This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
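The scalar example can be sketched numerically. The following is a minimal illustration, assuming the standard expected-cost Riccati recursion for x[k+1] = a*x[k] + b*u[k] with independent random gains a and b and quadratic cost; the threshold quantity m and the condition m < 1 are the form in which this principle is usually quoted:

```python
def riccati_step(K, a_mean, a_var, b_mean, b_var, q=1.0, r=1.0):
    # One step of the expected-cost Riccati recursion for the scalar system
    # x[k+1] = a*x[k] + b*u[k] with independent random a and b.
    Ea2 = a_mean ** 2 + a_var
    Eb2 = b_mean ** 2 + b_var
    Eab = a_mean * b_mean            # independence: E[ab] = E[a]E[b]
    return q + Ea2 * K - (Eab * K) ** 2 / (r + Eb2 * K)

def uncertainty_threshold(a_mean, a_var, b_mean, b_var):
    # m = E[a^2] - E[ab]^2 / E[b^2]; the infinite-horizon solution
    # exists only when m < 1.
    Eab = a_mean * b_mean
    return (a_mean ** 2 + a_var) - Eab ** 2 / (b_mean ** 2 + b_var)

def iterate(a_mean, a_var, b_mean, b_var, steps=200):
    K = 0.0
    for _ in range(steps):
        K = riccati_step(K, a_mean, a_var, b_mean, b_var)
    return K

K_stable = iterate(1.0, 0.0, 1.0, 0.0)    # m = 0 < 1: converges
K_unstable = iterate(1.0, 2.0, 1.0, 0.0)  # m = 2 > 1: cost-to-go diverges
```

With no parameter uncertainty the recursion settles to the usual LQ cost-to-go; pushing the variance of a above the threshold makes the cost-to-go grow without bound, which is exactly the non-existence of the infinite-horizon solution described in the note.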
The entropy of the noncommutative acoustic black hole based on generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Anacleto, M. A.; Brito, F. A.; Passos, E.; Santos, W. P.
2014-10-01
In this paper we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole based on the generalized uncertainty principle. We obtain an area entropy and a correction term associated with the noncommutative acoustic black hole when the parameter λ introduced in the generalized uncertainty principle takes a specific value. In this method there is no need to introduce an ultraviolet cutoff, and divergences are eliminated. Moreover, the small-mass approximation required in the original brick-wall model is not necessary.
Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models
NASA Technical Reports Server (NTRS)
Terazawa, Hidezumi
1996-01-01
The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.
Tameshtit, Allan
2012-04-01
High-temperature and white-noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white-noise approximation is avoided, it is shown that if the zero-point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle.
The uncertainty principle enables non-classical dynamics in an interferometer.
Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko
2014-08-08
The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics.
Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.
Rogers, Michael D
2003-06-01
Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.
Principles and applications of measurement and uncertainty analysis in research and calibration
Wells, C.V.
1992-11-01
Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with examples drawn from research activities in the radiometry measurement and calibration discipline.
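The core combination step described here can be sketched in a few lines. This is a minimal illustration, not the full ANSI/ASME PTC 19.1 or GUM procedure; sensitivity coefficients are taken as 1 and the components are assumed independent:

```python
import math

def combined_standard_uncertainty(components):
    # Root-sum-square of independent standard-uncertainty components
    # (Type A and Type B alike), with unit sensitivity coefficients.
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2.0):
    # Coverage factor k = 2 gives roughly 95% coverage when the
    # combined distribution is near-normal.
    return k * combined_standard_uncertainty(components)

# e.g. repeatability 0.3 and calibration 0.4, in the units of the measurand
u_c = combined_standard_uncertainty([0.3, 0.4])   # combined: 0.5
U = expanded_uncertainty([0.3, 0.4])              # expanded (k = 2): 1.0
```

An uncertainty statement then reports the measured value together with U, the coverage factor, and the dominant components, which is the "informational content" Ku refers to.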
NASA Astrophysics Data System (ADS)
Mejjaoli, Hatem; Trimèche, Khalifa
2016-06-01
In this paper, we prove various mathematical aspects of the qualitative uncertainty principle, including Hardy's theorem, the Cowling-Price theorem, Morgan's theorem, and the Beurling, Gelfand-Shilov, and Miyachi theorems.
ERIC Educational Resources Information Center
Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie
2011-01-01
Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…
Tawfik, A.
2013-07-01
We investigate the impact of the Generalized Uncertainty Principle (GUP), as proposed by some approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As evaporation consumes the entire black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating completely, implying the existence of remnants at which the specific heat vanishes; the same role is played by the Heisenberg uncertainty principle in the stability of the hydrogen atom. We discuss how the linear GUP approach solves this complete-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited by the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without GUP is not negligible.
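The abstract does not give the corrected temperature explicitly; as an illustration of the remnant behaviour, the widely used quadratic-GUP result of Adler, Chen and Santiago (in Planck units, with k_B = 1 and GUP parameter alpha) can be sketched as:

```python
import math

def hawking_temperature(M):
    # Standard Hawking temperature of a Schwarzschild black hole,
    # in Planck units: T = 1 / (8*pi*M).
    return 1.0 / (8.0 * math.pi * M)

def gup_temperature(M, alpha=1.0):
    # Quadratic-GUP corrected temperature (Adler-Chen-Santiago form):
    # T = M / (4*pi*alpha^2) * (1 - sqrt(1 - alpha^2 / M^2)).
    # Below M = alpha there is no real solution: evaporation halts,
    # leaving a remnant of mass ~ alpha.
    if M < alpha:
        raise ValueError("no real temperature below the remnant mass")
    return M / (4.0 * math.pi * alpha ** 2) * (1.0 - math.sqrt(1.0 - (alpha / M) ** 2))

T_far = gup_temperature(100.0)      # indistinguishable from the Hawking value
T_remnant = gup_temperature(1.0)    # finite maximal temperature at the remnant
```

For M much larger than alpha the correction is negligible; as M approaches alpha the temperature stays finite and evaporation stops, which is the remnant scenario the abstract describes.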
ERIC Educational Resources Information Center
Harbola, Varun
2011-01-01
In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…
Story, Lachel; Butts, Janie
2014-03-01
Nurses today are facing an ever-changing health care system. Stimulated by health care reform and limited resources, nursing education is being challenged to prepare nurses for this uncertain environment. Looking to the past can offer possible solutions to the issues nursing education is confronting. Seven principles of da Vincian thinking have been identified (Gelb, 2004). As a follow-up to an exploration of the curiosità principle (Butts & Story, 2013), this article will explore the three principles of dimostrazione, sfumato, and corporalita. Nursing faculty can set the stage for a meaningful educational experience through these principles of demonstration (dimostrazione), uncertainty (sfumato), and cultivation (corporalita). Preparing nurses not only to manage but also to flourish in the current health care environment will enhance both the nurse's and the patient's experience.
Genome wide expression profiling of angiogenic signaling and the Heisenberg uncertainty principle.
Huber, Peter E; Hauser, Kai; Abdollahi, Amir
2004-11-01
We performed genome-wide DNA expression profiling coupled with antibody array experiments, using endostatin to probe the angiogenic signaling network in human endothelial cells. The results reveal constraints on the measuring process of a similar kind to those implied by the uncertainty principle of quantum mechanics as described by Werner Heisenberg. We describe this analogy and argue for its heuristic utility in the conceptualization of angiogenesis as an important step in tumor formation.
Energy-Time Uncertainty Principle and Lower Bounds on Sojourn Time
NASA Astrophysics Data System (ADS)
Asch, Joachim; Bourget, Olivier; Cortés, Victor; Fernandez, Claudio
2016-09-01
One manifestation of quantum resonances is a large sojourn time, or autocorrelation, for states which are initially localized. We elaborate on Lavine's time-energy uncertainty principle and give an estimate on the sojourn time. For the case of perturbed embedded eigenstates the bound is explicit and involves Fermi's Golden Rule. It is valid for a very general class of systems. We illustrate the theory by applications to resonances for time dependent systems including the AC Stark effect as well as multistate systems.
Certifying Einstein-Podolsky-Rosen steering via the local uncertainty principle
NASA Astrophysics Data System (ADS)
Zhen, Yi-Zheng; Zheng, Yu-Lin; Cao, Wen-Fei; Li, Li; Chen, Zeng-Bing; Liu, Nai-Le; Chen, Kai
2016-01-01
The uncertainty principle lies at the heart of quantum mechanics, while nonlocality is an intriguing quantum phenomenon that rules out local causal theories. One subtle form of nonlocality is so-called Einstein-Podolsky-Rosen (EPR) steering, which holds the potential for shared-entanglement verification even if the one-sided measurement device is untrusted. However, certifying EPR steering remains a big challenge at present. Here, we employ the local uncertainty relation to provide an experimentally friendly approach for EPR steering verification. We show that the strength of EPR steering is quantitatively linked to the strength of the uncertainty relation, as well as to the amount of entanglement. We also find that the realignment method works for detecting EPR steering of an arbitrary-dimensional system.
A computational model of limb impedance control based on principles of internal model uncertainty.
Mitrovic, Djordje; Klanke, Stefan; Osu, Rieko; Kawato, Mitsuo; Vijayakumar, Sethu
2010-10-26
Efficient human motor control is characterized by an extensive use of joint impedance modulation, which is achieved by co-contracting antagonistic muscles in a way that is beneficial to the specific task. While there is much experimental evidence that the nervous system employs such strategies, no generally valid computational model of impedance control derived from first principles has been proposed so far. Here we develop a new impedance control model for antagonistic limb systems based on minimizing the uncertainties in internal model predictions. In contrast to previously proposed models, our framework predicts a wide range of impedance control patterns during stationary and adaptive tasks. This indicates that many well-known impedance control phenomena naturally emerge from the first principles of a stochastic optimization process that minimizes internal model prediction uncertainties along with energy and accuracy demands. The insights from this computational model could be used to interpret existing experimental impedance control data from the viewpoint of optimality, or could even govern the design of future experiments based on principles of internal model uncertainty.
Li Zhongheng
2009-10-15
We derive new formulas for the spectral energy density and total energy density of massless particles in a general spherically symmetric static metric from a generalized uncertainty principle. Compared with blackbody radiation, the spectral energy density is strongly damped at high frequencies. For large values of r, the spectral energy density diminishes when r grows, but at the event horizon, the spectral energy density vanishes and therefore thermodynamic quantities near a black hole, calculated via the generalized uncertainty principle, do not require any cutoff parameter. We find that the total energy density can be expressed in terms of Hurwitz zeta functions. It should be noted that at large r (low local temperature), the difference between the total energy density and the Stefan-Boltzmann law is too small to be observed. However, as r approaches an event horizon, the effect of the generalized uncertainty principle becomes more and more important, which may be observable. As examples, the spectral energy densities in the background metric of a Schwarzschild black hole and of a Schwarzschild black hole plus quintessence are discussed. It is interesting to note that the maximum of the distribution shifts to higher frequencies when the quintessence equation of state parameter w decreases.
NASA Astrophysics Data System (ADS)
Bosyk, G. M.; Portesi, M.; Holik, F.; Plastino, A.
2013-06-01
We revisit the connection between the complementarity and uncertainty principles of quantum mechanics within the framework of Mach-Zehnder interferometry. We focus our attention on the trade-off relation between complementary path information and fringe visibility. This relation is equivalent to the uncertainty relation of Schrödinger and Robertson for a suitably chosen pair of observables. We show that it is equivalent as well to the uncertainty inequality provided by Landau and Pollak. We also study the relationship of this trade-off relation with a family of entropic uncertainty relations based on Rényi entropies. There is no equivalence in this case, but the different values of the entropic parameter do define regimes that provide us with a tool to discriminate between non-trivial states of minimum uncertainty. The existence of such regimes agrees with previous results of Luis (2011 Phys. Rev. A 84 034101), although their meaning was not sufficiently clear. We discuss the origin of these regimes with the intention of gaining a deeper understanding of entropic measures.
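The Rényi-entropy relations discussed here can be made concrete for a qubit. A minimal sketch, using the Maassen-Uffink bound, which for conjugate Rényi orders (1/α + 1/β = 2) and two mutually unbiased qubit bases reads H_α + H_β ≥ 1 bit:

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Renyi entropy (in bits) of a probability vector;
    # alpha -> 1 recovers the Shannon entropy.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

# Qubit state |0>: Z-basis outcomes are certain, X-basis maximally uncertain.
p_z = [1.0, 0.0]
p_x = [0.5, 0.5]
alpha, beta = 2.0, 2.0 / 3.0     # conjugate orders: 1/alpha + 1/beta = 2
total = renyi_entropy(p_z, alpha) + renyi_entropy(p_x, beta)
# total = 0 + 1 = 1 bit, saturating the Maassen-Uffink bound for this pair
```

Sweeping the entropic parameter α (with β adjusted to keep the orders conjugate) is one way to probe the regimes of minimum-uncertainty states the abstract refers to.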
Burr, Tom; Croft, Stephen; Jarman, Kenneth D.
2015-09-05
The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
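The enrichment meter principle mentioned in the case study can be sketched with first-order uncertainty propagation. This is a schematic illustration; the numbers and the calibration constant are hypothetical stand-ins, and a real assay would follow the ASTM guide's full procedure:

```python
import math

def enrichment_estimate(net_rate, k, u_net_rate, u_k):
    # Enrichment meter principle: for an item "infinitely thick" at
    # 186 keV, U-235 enrichment is proportional to the net 186 keV
    # peak count rate, E = k * R.  Independent relative uncertainties
    # combine in quadrature (first-order GUM-style propagation).
    E = k * net_rate
    rel_u = math.sqrt((u_net_rate / net_rate) ** 2 + (u_k / k) ** 2)
    return E, E * rel_u

# Hypothetical numbers: R = 40.0 counts/s with u = 0.4 (counting statistics),
# calibration constant k = 0.117 wt% per (count/s) with u = 0.002.
E, u_E = enrichment_estimate(40.0, 0.117, 0.4, 0.002)
```

The UQ challenges the authors raise (errors in predictors during calibration, model error, item-specific biases) are exactly the terms this simple quadrature sum leaves out.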
Using the uncertainty principle to design simple interactions for targeted self-assembly.
Edlund, E; Lindgren, O; Jacobi, M Nilsson
2013-07-14
We present a method that systematically simplifies isotropic interactions designed for targeted self-assembly. The uncertainty principle is used to show that an optimal simplification is achieved by a combination of heat kernel smoothing and Gaussian screening of the interaction potential in real and reciprocal space. We use this method to analytically design isotropic interactions for self-assembly of complex lattices and of materials with functional properties. The derived interactions are simple enough to narrow the gap between theory and the experimental realization of theory-based design of self-assembling materials.
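The two operations named here, heat-kernel smoothing in reciprocal space and Gaussian screening in real space, can be sketched for a radial potential sampled on a grid. This is a schematic illustration; the paper derives the optimal widths, which are free parameters below, and the model potential is invented:

```python
import numpy as np

def simplify_interaction(V, dx, sigma_k, sigma_r):
    # Heat-kernel smoothing: Gaussian damping of high-frequency Fourier
    # components of the sampled potential V(r).
    k = 2.0 * np.pi * np.fft.fftfreq(V.size, d=dx)
    V_smooth = np.fft.ifft(np.fft.fft(V) * np.exp(-0.5 * (sigma_k * k) ** 2)).real
    # Gaussian screening: suppress the long-range real-space tail.
    r = dx * np.arange(V.size)
    return V_smooth * np.exp(-0.5 * (r / sigma_r) ** 2)

# A rapidly oscillating, long-ranged model potential on a radial grid.
r_grid = 0.01 * np.arange(1, 2001)
V = np.cos(40.0 * r_grid) / (1.0 + r_grid)
V_simple = simplify_interaction(V, 0.01, 0.2, 3.0)
```

The result is a short-ranged, slowly varying potential: the smoothing removes fine oscillatory features and the screening truncates the tail, which is the sense in which the designed interaction becomes "simple enough" to implement.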
Generalized uncertainty principle in f(R) gravity for a charged black hole
Said, Jackson Levi; Adami, Kristian Zarb
2011-02-15
Using f(R) gravity in the Palatini formalism, the metric for a charged, spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; from this the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini black holes.
The Symplectic Camel and the Uncertainty Principle: The Tip of an Iceberg?
NASA Astrophysics Data System (ADS)
de Gosson, Maurice A.
2009-02-01
We show that the strong form of Heisenberg’s inequalities due to Robertson and Schrödinger can be formally derived using only classical considerations. This is achieved using a statistical tool known as the “minimum volume ellipsoid” together with the notion of symplectic capacity, which we view as a topological measure of uncertainty invariant under Hamiltonian dynamics. This invariant provides the right measurement tool to define what the “quantum scale” is. We take the opportunity to discuss the principle of the symplectic camel, which is at the origin of the definition of symplectic capacities, and which provides an interesting link between classical and quantum physics.
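For a single degree of freedom, the Robertson-Schrödinger inequality discussed here reduces to a condition on the determinant of the covariance matrix, which also makes the symplectic invariance manifest. A minimal numerical check (with ħ = 1; the matrices are illustrative):

```python
import numpy as np

def satisfies_robertson_schrodinger(cov, hbar=1.0):
    # cov = [[Var x, Cov(x,p)], [Cov(x,p), Var p]].  The RS inequality
    # Var(x)Var(p) - Cov(x,p)^2 >= (hbar/2)^2 is det(cov) >= (hbar/2)^2,
    # and det(cov) is invariant under linear symplectic (canonical) maps.
    return float(np.linalg.det(cov)) >= (hbar / 2.0) ** 2 - 1e-12

# A coherent-state covariance saturates the bound.
coherent = np.array([[0.5, 0.0], [0.0, 0.5]])
# Squeezing (det S = 1, hence symplectic for one mode) preserves the bound.
S = np.array([[2.0, 0.0], [0.0, 0.5]])
squeezed = S @ coherent @ S.T
```

Shrinking the ellipsoid in all directions at once (e.g. multiplying the covariance by a factor below 1) violates the determinant bound, which is the classical shadow of the symplectic camel obstruction the paper describes.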
Uncertainty principle for control of ensembles of oscillators driven by common noise
NASA Astrophysics Data System (ADS)
Goldobin, D. S.
2014-04-01
We discuss control techniques for noisy self-sustained oscillators with a focus on reliability, stability of the response to noisy driving, and oscillation coherence understood in the sense of constancy of oscillation frequency. For any kind of linear feedback control — single and recursive delay feedback, linear frequency filter, etc. — the phase diffusion constant, quantifying coherence, and the Lyapunov exponent, quantifying reliability, can be efficiently controlled but their ratio remains constant. Thus, an "uncertainty principle" can be formulated: the loss of reliability occurs when coherence is enhanced and, vice versa, coherence is weakened when reliability is enhanced. Treatment of this principle for ensembles of oscillators synchronized by common noise or global coupling reveals a substantial difference between the cases of slightly non-identical oscillators and identical ones with intrinsic noise.
Before and beyond the precautionary principle: Epistemology of uncertainty in science and law
Tallacchini, Mariachiara
2005-09-01
The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.
Covariant energy–momentum and an uncertainty principle for general relativity
Cooperstock, F.I.; Dupre, M.J.
2013-12-15
We introduce a naturally defined, totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduce to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum, in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum.
Highlights:
• We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity.
• The demand that the general expression reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum.
• Localized energy via the Ricci integral is consistent with the energy localization hypothesis.
• The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving.
• We suggest that the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong-gravity extreme.
Galilean and Lorentz Transformations in a Space with Generalized Uncertainty Principle
NASA Astrophysics Data System (ADS)
Tkachuk, V. M.
2016-07-01
We consider a space with a Generalized Uncertainty Principle (GUP), which can be obtained in the framework of deformed commutation relations. In the space with GUP we have found transformations relating the coordinates and times of moving and rest frames of reference, to first order in the parameter of deformation. In the non-relativistic case we find the deformed Galilean transformation, which is a rotation in Euclidean space-time. This transformation is similar to the Lorentz one, but written for Euclidean space-time, with the speed of light replaced by some velocity related to the parameter of deformation. We show that for a relativistic particle in the space with GUP, the coordinates of the rest and moving frames of reference satisfy the Lorentz transformation with some effective speed of light.
NASA Astrophysics Data System (ADS)
Feng, Z. W.; Li, H. L.; Zu, X. T.; Yang, S. Z.
2016-04-01
We investigate the thermodynamics of the Schwarzschild-Tangherlini black hole in the context of the generalized uncertainty principle (GUP). The corrections to the Hawking temperature, entropy and heat capacity are obtained via the modified Hamilton-Jacobi equation. These modifications show that the GUP changes the evolution of the Schwarzschild-Tangherlini black hole. In particular, the GUP effect becomes significant when the radius or mass of the black hole approaches the Planck scale: the black hole stops radiating, leaving a black hole remnant. The Planck-scale remnant is confirmed through an analysis of the heat capacity. These phenomena imply that the GUP may offer a way to resolve the information paradox. We also investigate the possibility of observing such a black hole at the Large Hadron Collider (LHC); the results indicate that the black hole cannot be produced at current LHC energies.
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
G. Youinou; G. Palmiotti; M. Salvatorre; G. Imel; R. Pardo; F. Kondev; M. Paul
2010-01-01
An integral reactor physics experiment devoted to inferring higher actinide (Am, Cm, Bk, Cf) neutron cross-sections will take place in the US. This report presents the principle of the planned experiment as well as a first exercise aimed at quantifying the uncertainties in the inferred quantities. It has been funded in part by the DOE Office of Science in the framework of the Recovery Act and has been given the name MANTRA, for Measurement of Actinides Neutron TRAnsmutation. The principle is to irradiate different pure actinide samples in a test reactor like INL's Advanced Test Reactor and, after a given time, determine the amount of the different transmutation products. The precise characterization of the nuclide densities before and after neutron irradiation allows the energy-integrated neutron cross-sections to be inferred, since the two are related by the well-known neutron-induced transmutation equations. This approach has been used in the past; the principal novelty of this experiment is that the atom densities of the different transmutation products will be determined with the Accelerator Mass Spectrometry (AMS) facility located at ANL. While AMS facilities have traditionally been limited to the assay of low-to-medium atomic mass materials, i.e., A < 100, there has been recent progress in extending AMS to heavier isotopes, even to A > 200. Since the detection limit of AMS is orders of magnitude lower than that of standard mass spectrometry techniques, more transmutation products can be measured and, potentially, more cross-sections inferred from the irradiation of a single sample. Furthermore, measurements will be carried out at the INL using more standard methods in order to have another, totally uncorrelated set of information.
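For a single capture chain with one-group (energy-integrated) cross-sections, the neutron-induced transmutation equations mentioned above reduce to simple exponentials, which is what lets a cross-section be inferred from atom densities measured before and after irradiation. A minimal sketch; the flux, cross-sections, and irradiation time below are illustrative assumptions, not values from the MANTRA experiment:

```python
import math

# Illustrative one-group transmutation chain N1 -(capture)-> N2.
# All numerical values are made up for demonstration.
phi = 1e14           # neutron flux [n/cm^2/s] (assumed)
sigma1 = 50e-24      # capture cross-section of parent [cm^2] (assumed)
sigma2 = 10e-24      # capture cross-section of daughter [cm^2] (assumed)
t = 3600 * 24 * 100  # 100 days of irradiation [s]

N1_0 = 1e20          # initial parent atom density [1/cm^3]
# Analytic solution of dN1/dt = -sigma1*phi*N1
N1 = N1_0 * math.exp(-sigma1 * phi * t)
# Bateman solution for the daughter (initially absent)
N2 = N1_0 * sigma1 / (sigma2 - sigma1) * (
    math.exp(-sigma1 * phi * t) - math.exp(-sigma2 * phi * t))

# Inverse problem: infer the energy-integrated cross-section from
# densities measured (e.g. by AMS) before and after irradiation.
sigma1_inferred = math.log(N1_0 / N1) / (phi * t)
print(sigma1_inferred)  # recovers sigma1 up to floating-point error
```

Real analyses use full Bateman chains over many nuclides and account for flux spectra and self-shielding; the point here is only the inverse step from measured densities to an effective cross-section.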
Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar
2012-05-01
Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L associated with concentrations of creatinine below 120 μmol/L and of 10% associated with concentrations above 120 μmol/L was estimated. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. This diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L associated with concentrations of creatinine below 100 μmol/L and 14% associated with concentrations above 100 μmol/L.
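The two-component combination described above (within-laboratory reproducibility plus the uncertainty of the bias) can be sketched as follows. The numbers are illustrative, not the creatinine data of the paper, and the bias treatment (RMS of observed biases combined with the reference-value uncertainty) is one common reading of the Nordtest recipe:

```python
import math

# Nordtest-style combination of uncertainty components.
# All input values below are illustrative, not the paper's data.
u_Rw = 4.0                   # within-laboratory reproducibility [umol/L]
biases = [2.0, -1.5, 3.0]    # biases observed against reference values
u_Cref = 1.0                 # uncertainty of the reference values [umol/L]

rms_bias = math.sqrt(sum(b * b for b in biases) / len(biases))
u_bias = math.sqrt(rms_bias ** 2 + u_Cref ** 2)

u_c = math.sqrt(u_Rw ** 2 + u_bias ** 2)  # combined standard uncertainty
U = 2 * u_c                               # expanded uncertainty, k = 2 (~95%)
print(round(U, 2))
```

The same arithmetic applies whether the components are expressed in absolute units or as relative percentages, which is why the paper can quote the expanded uncertainty in μmol/L below a threshold concentration and in % above it.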
Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.
Hsieh, I-Hui; Saberi, Kourosh
2016-02-01
How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction. PMID:26022837
Femtoscopic scales in p + p and p + Pb collisions in view of the uncertainty principle
NASA Astrophysics Data System (ADS)
Shapoval, V. M.; Braun-Munzinger, P.; Karpenko, Iu. A.; Sinyukov, Yu. M.
2013-08-01
A method for quantum corrections of Hanbury-Brown/Twiss (HBT) interferometric radii produced by semi-classical event generators is proposed. These corrections account for the basic indistinguishability and mutual coherence of closely located emitters caused by the uncertainty principle. A detailed analysis is presented for pion interferometry in p + p collisions at LHC energy (√s = 7 TeV). A prediction is also presented of pion interferometric radii for p + Pb collisions at √s = 5.02 TeV. The hydrodynamic/hydrokinetic model with the UrQMD cascade as 'afterburner' is utilized for this aim. It is found that quantum corrections to the interferometry radii significantly improve the event generator results, which typically overestimate the experimental radii of small systems. A successful description of the interferometry structure of p + p collisions within the corrected hydrodynamic model requires studying the problem of the thermalization mechanism, still a fundamental issue for ultrarelativistic A + A collisions, also for high-multiplicity p + p and p + Pb events.
Revisiting the Calculation of I/V Profiles in Molecular Junctions Using the Uncertainty Principle.
Ramos-Berdullas, Nicolás; Mandado, Marcos
2014-04-17
Ortiz and Seminario (J. Chem. Phys. 2007, 127, 111106/1-3) proposed some years ago a simple and direct approach to obtain I/V profiles from the combination of ab initio equilibrium electronic structure calculations and the uncertainty principle as an alternative or complementary tool to more sophisticated nonequilibrium Green's function methods. In this work, we revisit the fundamentals of this approach and reformulate accordingly the expression of the electric current. By analogy with the spontaneous electron decay process in electronic transitions, in our revision the current is calculated from the relaxation of the "polarized" state induced by the external electric field to the electronic ground state. The electric current is obtained from the total charge transferred through the molecule and the corresponding electronic energy relaxation. The proposed electric current expression is more general than the previous expression employed by Ortiz and Seminario, in which the charge variation must be tested among different slabs of atoms at the contact. This new approach has been tested on benzene-1,4-dithiolate attached to different gold clusters that represent the contact with the electrodes. Analysis of the total electron deformation density induced by the external electric voltage and of properties associated with the electron deformation orbitals supports the conclusions obtained from the I/V profiles.
Lierman, S; Veuchelen, L
2005-01-01
The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other holds that the effects are underestimated. Both views have good scientific arguments in their favour. The nuclear field, in both industry and medicine, has had to deal with this controversy for many decades. One can argue that the optimisation approach of keeping effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because of these stochastic effects, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field, and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues will be dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law, and what lessons can be learned from the field of radiation protection? 2. What does ALARA have to offer the discussion of the Precautionary Principle and vice versa, in particular as far as legal sanctions and liability are concerned? It will be shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law.
A violation of the uncertainty principle implies a violation of the second law of thermodynamics.
Hänggi, Esther; Wehner, Stephanie
2013-01-01
Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature.
Liu Molin; Gui Yuanxing; Liu Hongya
2008-12-15
In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon without any cutoff and without any constraint on the bulk's configuration, in contrast to the case of the usual uncertainty principle. The system's density of states and free energy are convergent in the neighborhood of the horizons. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal position uncertainty Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.
NASA Astrophysics Data System (ADS)
Ahmad, Zeeshan; Viswanathan, Venkatasubramanian
2016-08-01
Computationally guided material discovery is being increasingly employed using a descriptor-based screening through the calculation of a few properties of interest. A precise understanding of the uncertainty associated with first-principles density functional theory calculated property values is important for the success of descriptor-based screening. The Bayesian error estimation approach has been built into several recently developed exchange-correlation functionals, which allows an estimate of the uncertainty associated with properties related to the ground state energy, for example, adsorption energies. Here, we propose a robust and computationally efficient method for quantifying uncertainty in mechanical properties, which depend on the derivatives of the energy. The procedure involves calculating energies around the equilibrium cell volume with different strains and fitting the obtained energies to the corresponding energy-strain relationship. At each strain, we use, instead of a single energy, an ensemble of energies, giving us an ensemble of fits and thereby an ensemble of mechanical properties associated with each fit, whose spread can be used to quantify its uncertainty. The generation of the ensemble of energies is only a post-processing step involving a perturbation of the parameters of the exchange-correlation functional and solving for the energy non-self-consistently. The proposed method is computationally efficient and provides a more robust uncertainty estimate compared to the approach of self-consistent calculations employing several different exchange-correlation functionals. We demonstrate the method by calculating the uncertainty bounds for several materials belonging to different classes and having different structures. We show that the calculated uncertainty bounds the property values obtained using three different GGA functionals: PBE, PBEsol, and RPBE. Finally, we apply the approach to calculate the uncertainty
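The ensemble procedure described above, one energy-volume fit per ensemble member and the spread of the derived property as its uncertainty, can be sketched with a synthetic Gaussian ensemble standing in for a real Bayesian-error-estimation (BEEF-type) ensemble; the equilibrium volume, curvature, and noise level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference quadratic energy-volume curve; all values are illustrative.
V0, E0, k = 20.0, -10.0, 0.5               # equilibrium volume, energy, curvature
V = V0 * (1 + np.linspace(-0.02, 0.02, 9))  # strained cell volumes
E_ref = E0 + 0.5 * k * (V - V0) ** 2

# Stand-in for a BEEF-type ensemble: perturb the energies non-self-consistently.
E_ens = E_ref + rng.normal(0.0, 1e-4, size=(2000, V.size))

# One quadratic fit per ensemble member -> one bulk-modulus value per member.
x = V - V0                                 # shift for numerical conditioning
bulk_moduli = []
for E in E_ens:
    a, b, c = np.polyfit(x, E, 2)          # E ~ a x^2 + b x + c
    V_min = V0 - b / (2 * a)               # fitted equilibrium volume
    bulk_moduli.append(V_min * 2 * a)      # B = V * d2E/dV2 at the minimum
bulk_moduli = np.array(bulk_moduli)

# The spread of the ensemble quantifies the uncertainty of the property.
print(bulk_moduli.mean(), bulk_moduli.std())
```

Because each ensemble member only requires re-evaluating (not re-converging) the energies, the extra cost over a single calculation is a fitting loop like the one above.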
Quantifying uncertainties in first-principles alloy thermodynamics using cluster expansions
NASA Astrophysics Data System (ADS)
Aldegunde, Manuel; Zabaras, Nicholas; Kristensen, Jesper
2016-10-01
The cluster expansion is a popular surrogate model for alloy modeling that avoids costly quantum mechanical simulations. As its practical implementations require approximations, its use trades accuracy for efficiency. Furthermore, the coefficients of the model need to be determined from some known data set (training set). These two sources of error, if not quantified, decrease the confidence we can put in the results obtained from the surrogate model. This paper presents a framework for the determination of the cluster expansion coefficients using a Bayesian approach, which allows for the quantification of uncertainties in the predictions. In particular, a relevance vector machine is used to automatically select the most relevant terms of the model while retaining an analytical expression for the predictive distribution. This methodology is applied to two binary alloys, SiGe and MgLi, including the temperature dependence in their effective cluster interactions. The resulting cluster expansions are used to calculate the uncertainty in several thermodynamic quantities: the ground state line, including the uncertainty in which structures are thermodynamically stable at 0 K, phase diagrams and phase transitions. The uncertainty in the ground state line is found to be of the order of meV/atom, showing that the cluster expansion is reliable to ab initio level accuracy even with limited data. We found that the uncertainty in the predicted phase transition temperature increases when including the temperature dependence of the effective cluster interactions. Also, the use of the bond stiffness versus bond length approximation to calculate temperature-dependent properties from a reduced set of alloy configurations showed similar uncertainty to the approach where all training configurations are considered, but at a much reduced computational cost.
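The core of such a framework, a Bayesian linear model over cluster correlation functions whose posterior yields a predictive distribution, can be illustrated with conjugate Bayesian ridge regression standing in for the paper's relevance vector machine; the correlation functions and effective cluster interactions (ECIs) below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy cluster expansion: structure energies are linear in the cluster
# correlation functions. All data below are synthetic assumptions.
n_structures, n_clusters = 40, 5
X = rng.uniform(-1, 1, size=(n_structures, n_clusters))  # correlation functions
eci_true = np.array([0.3, -0.1, 0.05, 0.0, 0.02])        # "true" ECIs [eV]
noise = 0.01                                             # training noise scale
y = X @ eci_true + rng.normal(0.0, noise, n_structures)  # DFT-like energies

# Conjugate Bayesian linear regression (a simpler stand-in for the RVM):
alpha, beta = 1.0, 1.0 / noise**2          # prior precision, noise precision
S = np.linalg.inv(alpha * np.eye(n_clusters) + beta * X.T @ X)
m = beta * S @ X.T @ y                     # posterior mean of the ECIs

# Predictive distribution for a new structure: mean and variance.
x_new = rng.uniform(-1, 1, n_clusters)
pred_mean = x_new @ m
pred_var = 1.0 / beta + x_new @ S @ x_new  # predictive uncertainty
print(pred_mean, np.sqrt(pred_var))
```

The analytical predictive variance is what lets uncertainties be propagated to derived quantities such as the ground state line or transition temperatures, e.g. by sampling ECIs from the posterior.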
Oye, K A
2005-01-01
Disputes over invocation of precaution in the presence of uncertainty are building. This essay finds: (1) analysis of past WTO panel decisions and current EU-US regulatory conflicts suggests that appeals to scientific risk assessment will not resolve emerging conflicts; (2) Bayesian updating strategies, with commitments to modify policies as information emerges, may ameliorate conflicts over precaution in environmental and security affairs.
The effect of generalized uncertainty principle on square well, a case study
Ma, Meng-Sen; Zhao, Ren
2014-08-15
According to a special case (β = 0) of the generalized uncertainty relation, we derive the energy eigenvalues of the infinite potential well. The obtained energy levels differ from the usual result by correction terms. These correction terms of the energy eigenvalues depend on no parameter other than α, but the eigenstates depend on two additional parameters besides α.
NASA Astrophysics Data System (ADS)
McLeod, David; McLeod, Roger
2008-04-01
The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy, physiology of hearing, 1961, performed pressure input Rect x inputs that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves sign sense of EMF vectors, and is hard wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.
Marchiolli, M.A.; Mendonça, P.E.M.F.
2013-09-15
We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory.
Highlights:
• Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators.
• Determination of new restrictions upon the selective process of signals and wavelet bases.
• Demonstration of looser bounds interpolating between the tightest bound and the Massar–Spindel inequality.
• Construction of finite ground states properly describing the tightest bound.
• Establishment of an important connection with the discrete Weyl function.
Mehra, J.
1987-05-01
In this paper, the main outlines of the discussions between Niels Bohr and Albert Einstein, Werner Heisenberg, and Erwin Schrödinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory, formulated in fall 1926 by Dirac, London, and Jordan, Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such, formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930, were continued during the next decades. All these aspects are briefly summarized.
NASA Technical Reports Server (NTRS)
Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.
1992-01-01
The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, Δt. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons have broad spectra, but their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width ΔE, it is observed that the other member's wave packet collapses upon coincidence detection to a duration Δt, such that ΔE·Δt ≈ h/2π, where this duration Δt is an inner time in the sense of Aharonov and Bohm. We have measured Δt by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
NASA Astrophysics Data System (ADS)
Mazurova, Elena; Lapshin, Aleksey
2013-04-01
The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT, owing to Heisenberg's uncertainty principle, exhibits weak spatial localization, which manifests as follows: first, to compute the SFT one must know the initial digital signal on the complete number line (for a one-dimensional transform) or in the whole two-dimensional space (for a two-dimensional transform). Second, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the Fourier coefficients are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies present in the digital signal throughout the whole time period. To overcome this limitation it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (a window), is used for this purpose. A narrow enough window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, this results in significant uncertainty in frequency. If one chooses a wide enough window, the same principle implies increased time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum is, on the contrary, spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to determine the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell
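The tradeoff described above, a global transform that finds every frequency but loses all timing versus a windowed transform that recovers when each frequency occurs, can be demonstrated in a few lines of NumPy (the sample rate, tone frequencies, and window length are arbitrary illustrative choices):

```python
import numpy as np

fs = 1000                          # sample rate [Hz] (arbitrary)
t = np.arange(0, 1, 1 / fs)
# A tone whose frequency jumps at t = 0.5 s: 50 Hz first, then 120 Hz.
x = np.where(t < 0.5, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))

# Global Fourier transform: both frequencies appear, but all timing is lost.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, 1 / fs)  # 1 Hz bins

# Short-time Fourier transform by hand: Hann-windowed 100 ms segments.
win = np.hanning(100)

def stft_peak(segment):
    """Dominant frequency of one windowed segment."""
    mag = np.abs(np.fft.rfft(segment * win))
    return np.fft.rfftfreq(segment.size, 1 / fs)[np.argmax(mag)]

early = stft_peak(x[100:200])      # a segment from the first half
late = stft_peak(x[800:900])       # a segment from the second half
print(early, late)                 # the windowed transform recovers the timing
```

With a 100 ms window the frequency resolution is 10 Hz; narrowing the window sharpens the timing but coarsens the frequency estimate, which is exactly the Heisenberg/Gabor tradeoff discussed in the abstract.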
NASA Astrophysics Data System (ADS)
Bougouffa, Smail; Ficek, Zbigniew
2016-06-01
The link of two concepts, indistinguishability and entanglement, with the energy-time uncertainty principle is demonstrated in a system composed of two strongly coupled bosonic modes. Working in the limit of a short interaction time, we find that the inclusion of the antiresonant terms in the coupling Hamiltonian leads the system to relax to a state which is not the ground state of the system. This effect occurs passively, by the mere presence of the antiresonant terms, and is explained in terms of the time-energy uncertainty principle: at a very short interaction time, the uncertainty in the energy is of the order of the energy of a single excitation, thereby leading to a distribution of the population among the zero-, singly and doubly excited states. The population distribution, correlations, and entanglement are shown to depend substantially on whether the modes decay independently or collectively to an exterior reservoir. In particular, when the modes decay independently with equal rates, entanglement with complete distinguishability of the modes is observed. The modes can be made mutually coherent if they decay with unequal rates; however, the visibility in single-photon interference cannot exceed 50%. When the modes experience collective damping, they are indistinguishable even if they decay with equal rates, and the visibility can, in principle, be as large as unity. We find that this feature derives from the decay of the system to a pure entangled state rather than the expected mixed state. When the modes decay with equal rates, the steady-state values of the density matrix elements are found to depend on their initial values.
Grote, Gudela
2014-01-01
It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation human factors/ergonomics based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore.
Mezzasalma, Stefano A
2007-03-15
The theoretical basis of a recent theory of Brownian relativity for polymer solutions is deepened and reexamined. After the problem of relative diffusion in polymer solutions is addressed, its two postulates are formulated in all generality. The former builds a statistical equivalence between (uncorrelated) timelike and shapelike reference frames, that is, among dynamical trajectories of liquid molecules and static configurations of polymer chains. The latter defines the "diffusive horizon" as the invariant quantity to work with in the special version of the theory. Particularly, the concept of universality in polymer physics corresponds in Brownian relativity to that of covariance in the Einstein formulation. Here, a "universal" law consists of a privileged observation, performed from the laboratory rest frame and agreeing with any diffusive reference system. From the joint lack of covariance and simultaneity implied by the Brownian Lorentz-Poincaré transforms, a relative uncertainty arises, in a certain analogy with quantum mechanics. It is driven by the difference between local diffusion coefficients in the liquid solution. The same transformation class can be used to infer Fick's second law of diffusion, playing here the role of a gauge invariance preserving covariance of the spacetime increments. An overall, noteworthy conclusion emerging from this view concerns the statistics of (i) static macromolecular configurations and (ii) the motion of liquid molecules, which would be much more related than expected. PMID:17223124
NASA Astrophysics Data System (ADS)
Mehra, Jagdish
1987-05-01
In this paper, the main outlines of the discussions of Niels Bohr with Albert Einstein, Werner Heisenberg, and Erwin Schrödinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory—formulated in fall 1926 by Dirac, London, and Jordan—Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such—formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930—were continued during the next decades. All these aspects are briefly summarized.
NASA Astrophysics Data System (ADS)
Chen, Shao-Guang
The average number density of CMB photons is about 200/cm³ (or 5.9/cm) as measured on the U2 airplane. The reciprocal of 5.9/cm, 0.17 cm, is just the average free path S of a particle colliding with CMB photons. The virtual photons possess the e₀ and p₀ of CMB photons owing to energy exchange during their long coexistence. The measured value of the Casimir force shows that the number density of virtual photons is far larger than that of CMB photons. Most collisions of virtual photons with a particle have no measurable effect (the momentum balance is self-cancelling). The residual virtual photons in imbalanced collisions with CMB photons are again in a dynamical balance, and both number densities and both average free paths will be equal when a particle has no macroscopic displacement. In cosmic space, where the virtual photons and CMB photons act together, the total effective average free path of a particle will be equal to 0.085 cm. The action quantity p₀S on a particle from CMB photons and virtual photons is p₀S = 1.24×10⁻²⁶ g·cm·s⁻¹ × 0.085 cm = 1.054×10⁻²⁷ erg·s. The measured Planck constant is h/2π = 1.0546×10⁻²⁷ erg·s. It is worth noting that p₀S and h/2π have the same dimension and that their magnitudes are very close. If we take the quantum effect to come from the action on the particle by the vacuum virtual photons and CMB photons, then the action quantity 2πp₀S is just the Planck constant h, and Δx·Δp = h. This is just the uncertainty principle, now obtained as the measured result of Doppler effects in two opposite directions. Wave-particle duality is then a quasi-Brownian motion of a particle in vacuum. The nonzero duration of any measurement and the particle's quasi-Brownian motion make it impossible to measure accurately the position x and the momentum p of a particle. The uncertainty principle thus becomes a measurement theorem of the generalized Newtonian mechanics.
Cope, F W
1981-01-01
The Weber psychophysical law, which describes much experimental data on perception by man, is derived from the Heisenberg uncertainty principle on the assumption that human perception occurs by energy detection in superconductive microregions within man. This suggests that psychophysical perception by man might be considered merely a special case of physical measurement in general. The reverse derivation, i.e., derivation of the Heisenberg principle from the Weber law, may be of even greater interest. It suggests that physical measurements could be regarded as relative to the perceptions by the detectors within man. Thus one may develop a "human" theory of relativity that could have the advantage of eliminating hidden assumptions by forcing physical theories to conform more completely to the measurements made by man rather than to concepts that might not accurately describe nature.
Sanderson, H; Stahl, C H; Irwin, R; Rogers, M D
2005-01-01
Quantitative uncertainty assessments and the distribution of risk are under scrutiny, and significant criticism has been made of null hypothesis testing when careful consideration of Type I (false positive) and Type II (false negative) error rates has not been taken into account. An alternative method, equivalence testing, is discussed, yielding more transparency and potentially more precaution in quantifiable uncertainty assessments. With thousands of chemicals needing regulation in the near future and low public trust in the regulatory process, decision models with transparency and learning processes are required to manage this task. Adaptive, iterative, and learning decision-making tools and processes can help decision makers evaluate the significance of Type I or Type II errors on decision alternatives and can reduce the risk of committing Type III errors (accurate answers to the wrong questions). Simplistic cost-benefit decision-making tools do not incorporate the complex interconnectedness characterizing environmental risks, nor do they enhance learning or participation, or include social values and ambiguity. Hence, better decision-making tools are required, and MIRA is an attempt to include some of these critical aspects.
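The equivalence-testing idea discussed above can be sketched with the standard two-one-sided-tests (TOST) scheme. This is a minimal, generic illustration (the function name, margin, and data are hypothetical, and a large-sample normal approximation stands in for the exact t-based test):

```python
import math
from statistics import mean, stdev

def norm_cdf(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def tost_equivalence(sample, target, margin, alpha=0.05):
    """Two one-sided tests (large-sample z approximation): declare
    equivalence to `target` within +/- `margin` only if BOTH one-sided
    nulls (difference <= -margin, difference >= +margin) are rejected
    at level alpha. Unlike a plain null-hypothesis test, failing to
    find a difference is NOT enough to claim equivalence."""
    n = len(sample)
    d = mean(sample) - target
    se = stdev(sample) / math.sqrt(n)
    p_lower = 1.0 - norm_cdf((d + margin) / se)  # H0: d <= -margin
    p_upper = norm_cdf((d - margin) / se)        # H0: d >= +margin
    return max(p_lower, p_upper) < alpha

# A sample tightly clustered near the target is declared equivalent...
near = [10.0 + 0.01 * ((i % 7) - 3) for i in range(50)]
print(tost_equivalence(near, target=10.0, margin=0.5))
# ...while one centred well outside the margin is not.
far = [11.0 + 0.01 * ((i % 7) - 3) for i in range(50)]
print(tost_equivalence(far, target=10.0, margin=0.5))
```

The burden of proof is thereby reversed relative to null hypothesis testing, which is the source of the added precaution the abstract refers to.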
Two new kinds of uncertainty relations
NASA Technical Reports Server (NTRS)
Uffink, Jos
1994-01-01
We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.
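As a concrete illustration of the entropic strengthening referred to above, the standard Maassen-Uffink bound H(X) + H(Z) >= -2 log2 c (with c the largest overlap between eigenbases) can be checked numerically for a qubit. This is a generic sketch, not the paper's own formulation:

```python
import math

def shannon(probs):
    # Shannon entropy in bits; zero-probability terms contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 1e-12)

def entropies(t):
    """H(Z) + H(X) for the qubit state cos(t)|0> + sin(t)|1>,
    measured in the Z basis and in the X (Hadamard) basis."""
    a, b = math.cos(t), math.sin(t)
    pz = [a * a, b * b]
    px = [((a + b) ** 2) / 2, ((a - b) ** 2) / 2]
    return shannon(pz) + shannon(px)

# For these two bases the overlap is c = 1/sqrt(2), so the bound is 1 bit.
bound = -2 * math.log2(1 / math.sqrt(2))
# The bound holds for every state (sampled here over t in [0, pi]):
assert all(entropies(0.01 * k) >= bound - 1e-9 for k in range(315))
```

Note that the lower bound depends only on the pair of measurements, not on the state, which is the sense in which the entropic relation strengthens the variance-based one.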
Uncertainty in Computational Aerodynamics
NASA Technical Reports Server (NTRS)
Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.
2003-01-01
An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.
Bartley, David; Lidén, Göran
2008-08-01
The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during its establishment and application are combined component-wise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can often be established simply as the uncertainty in the bias. However, beyond simply arriving at a value for the uncertainty, its meaning can sometimes be developed in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between the concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. "Without a measureless and perpetual uncertainty, the drama of human life would be destroyed" (Winston Churchill).
Uncertainty in QSAR predictions.
Sahlin, Ullrika
2013-03-01
It is relevant to consider uncertainty in individual predictions when quantitative structure-activity (or property) relationships (QSARs) are used to support decisions of high societal concern. Successful communication of uncertainty in the integration of QSARs in chemical safety assessment under the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system can be facilitated by a common understanding of how to define, characterise, assess and evaluate uncertainty in QSAR predictions. A QSAR prediction is, compared to experimental estimates, subject to added uncertainty that comes from the use of a model instead of empirically-based estimates. A framework is provided to aid the distinction between different types of uncertainty in a QSAR prediction: quantitative, i.e. for regressions related to the error in a prediction and characterised by a predictive distribution; and qualitative, by expressing our confidence in the model for predicting a particular compound based on a quantitative measure of predictive reliability. It is possible to assess a quantitative (i.e. probabilistic) predictive distribution, given the supervised learning algorithm, the underlying QSAR data, a probability model for uncertainty and a statistical principle for inference. The integration of QSARs into risk assessment may be facilitated by the inclusion of the assessment of predictive error and predictive reliability into the "unambiguous algorithm", as outlined in the second OECD principle.
NASA Astrophysics Data System (ADS)
Koch, Michael
Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimating measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different ways of taking uncertainty into account in compliance assessment are explained.
Comparison of Classical and Quantum Mechanical Uncertainties.
ERIC Educational Resources Information Center
Peslak, John, Jr.
1979-01-01
Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)
Quantal localization and the uncertainty principle
Leopold, J.G.; Richards, D.
1988-09-01
We give a dynamical explanation for the localization of the wave function for the one-dimensional hydrogen atom, with the Coulomb singularity, in a high-frequency electric field, which leads to a necessary condition for classical dynamics to be valid. Numerical tests confirm the accuracy of the condition. Our analysis is relevant to the comparison between the classical and quantal dynamics of the kicked rotor and standard map.
Nab: Measurement Principles, Apparatus and Uncertainties
Pocanic, Dinko; Bowman, James D; Cianciolo, Vince; Greene, Geoffrey; Grzywacz, Robert; Penttila, Seppo; Rykaczewski, Krzysztof Piotr; Young, Glenn R; The, Nab
2009-01-01
The Nab collaboration will perform a precise measurement of a, the electron-neutrino correlation parameter, and b, the Fierz interference term in neutron beta decay, in the Fundamental Neutron Physics Beamline at the SNS, using a novel electric/magnetic field spectrometer and detector design. The experiment is aiming at the 10⁻³ accuracy level in Δa/a, and will provide an independent measurement of λ = G_A/G_V, the ratio of axial-vector to vector coupling constants of the nucleon. Nab also plans to perform the first ever measurement of b in neutron decay, which will provide an independent limit on the tensor weak coupling.
Interpreting uncertainty terms.
Holtgraves, Thomas
2014-08-01
Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.
Reformulating the Quantum Uncertainty Relation.
Li, Jun-Li; Qiao, Cong-Feng
2015-01-01
The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both forms are inequalities involving pairwise observables and are found nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197
ERIC Educational Resources Information Center
Duerdoth, Ian
2009-01-01
The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…
Majorization formulation of uncertainty in quantum mechanics
Partovi, M. Hossein
2011-11-15
Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.
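The majorization order underlying this formulation reduces to a comparison of partial sums of descending-sorted probability vectors, which can be sketched as follows (a generic illustration of the order itself, not the paper's construction of the measurement bounds):

```python
def majorizes(p, q, tol=1e-12):
    """True if probability vector p majorizes q: every partial sum of
    the descending-sorted p dominates the corresponding partial sum
    for q. p majorizing q means p is the 'less uncertain' vector."""
    ps = sorted(p, reverse=True)
    qs = sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(ps, qs):
        sp += a
        sq += b
        if sp < sq - tol:
            return False
    return True

# A point mass majorizes everything (least uncertain), and every
# distribution majorizes the uniform one (most uncertain).
assert majorizes([1.0, 0.0, 0.0], [0.5, 0.3, 0.2])
assert majorizes([0.5, 0.3, 0.2], [1/3, 1/3, 1/3])
assert not majorizes([1/3, 1/3, 1/3], [0.5, 0.3, 0.2])
```

Because not every pair of vectors is comparable in this order, it is a partial order, which is why the abstract speaks of a "partial uncertainty order" rather than a single scalar measure.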
Generalized Entropic Uncertainty Relations with Tsallis' Entropy
NASA Technical Reports Server (NTRS)
Portesi, M.; Plastino, A.
1996-01-01
A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
ERIC Educational Resources Information Center
Hewitt, Paul G.
2004-01-01
Some teachers have difficulty understanding Bernoulli's principle, particularly when the principle is applied to aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
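The stratified structure of Latin Hypercube Sampling referred to above can be sketched as follows. This is a minimal illustration of plain LHS on the unit hypercube (variable names are hypothetical; it does not reproduce the rank transformations or regression steps of the study):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """One LHS draw: each dimension's [0, 1) range is split into
    n_samples equal strata; each stratum is hit exactly once, with
    strata paired across dimensions by independent random
    permutations. This guarantees marginal coverage that simple
    random sampling does not."""
    rng = random.Random(seed)
    sample = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i, s in enumerate(strata):
            sample[i][d] = (s + rng.random()) / n_samples
    return sample

pts = latin_hypercube(10, 2)
# Stratification check: each dimension has exactly one point per decile.
for d in range(2):
    deciles = sorted(int(p[d] * 10) for p in pts)
    assert deciles == list(range(10))
```

The stratified marginals are what make LHS useful for estimating output distributions, while, as the abstract notes, they do not by themselves yield the per-parameter sensitivity information that the adjoint method provides.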
Entropic uncertainty relation in de Sitter space
NASA Astrophysics Data System (ADS)
Jia, Lijuan; Tian, Zehua; Jing, Jiliang
2015-02-01
The uncertainty principle restricts our ability to simultaneously predict the measurement outcomes of two incompatible observables of a quantum particle. This uncertainty can, however, be reduced and quantified by a new entropic uncertainty relation (EUR). Using the open-quantum-system approach, we explore how the nature of de Sitter space affects the EUR. When the quantum memory A freely falls in de Sitter space, we demonstrate that the entropic uncertainty acquires an increase resulting from a thermal bath with the Gibbons-Hawking temperature. For the static case, we find that the temperature arising both from the intrinsic thermal nature of de Sitter space and from the Unruh effect associated with the proper acceleration of A also affects the entropic uncertainty: the higher the temperature, the greater the uncertainty and the quicker the uncertainty reaches its maximal value. Finally, the possible mechanism behind this phenomenon is explored.
Clarifying types of uncertainty: when are models accurate, and uncertainties small?
Cox, Louis Anthony Tony
2011-10-01
Professor Aven has recently noted the importance of clarifying the meaning of terms such as "scientific uncertainty" for use in risk management and policy decisions, such as when to trigger application of the precautionary principle. This comment examines some fundamental conceptual challenges for efforts to define "accurate" models and "small" input uncertainties by showing that increasing uncertainty in model inputs may reduce uncertainty in model outputs; that even correct models with "small" input uncertainties need not yield accurate or useful predictions for quantities of interest in risk management (such as the duration of an epidemic); and that accurate predictive models need not be accurate causal models.
The link between entropic uncertainty and nonlocality
NASA Astrophysics Data System (ADS)
Tomamichel, Marco; Hänggi, Esther
2013-02-01
Two of the most intriguing features of quantum physics are the uncertainty principle and the occurrence of nonlocal correlations. The uncertainty principle states that there exist pairs of incompatible measurements on quantum systems such that their outcomes cannot both be predicted. On the other hand, nonlocal correlations of measurement outcomes at different locations cannot be explained by classical physics, but appear in the presence of entanglement. Here, we show that these two fundamental quantum effects are quantitatively related. Namely, we provide an entropic uncertainty relation for the outcomes of two binary measurements, where the lower bound on the uncertainty is quantified in terms of the maximum Clauser-Horne-Shimony-Holt value that can be achieved with these measurements. We discuss applications of this uncertainty relation in quantum cryptography, in particular, to certify quantum sources using untrusted devices.
Uncertainty in the Classroom--Teaching Quantum Physics
ERIC Educational Resources Information Center
Johansson, K. E.; Milstead, D.
2008-01-01
The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how…
Entropic uncertainty relations in multidimensional position and momentum spaces
Huang Yichen
2011-05-15
Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.
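For the one-mode case, the relationship between entropic and variance-based principles mentioned above can be illustrated with a Gaussian wavepacket, which saturates the Bialynicki-Birula-Mycielski bound h(x) + h(p) >= ln(e*pi) (in units with hbar = 1). This is a generic check using the closed-form Gaussian entropies, not the article's multidimensional derivation:

```python
import math

def entropy_sum(sigma):
    """Sum of position and momentum differential entropies for a 1-D
    Gaussian wavepacket of position spread sigma (hbar = 1), whose
    momentum-space density is Gaussian with spread 1/(2*sigma)."""
    h_x = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
    h_p = 0.5 * math.log(2 * math.pi * math.e / (4 * sigma ** 2))
    return h_x + h_p

bound = math.log(math.e * math.pi)  # entropic lower bound, ln(e*pi)
for sigma in (0.1, 1.0, 7.5):
    # Gaussians saturate the entropic bound for every width...
    assert abs(entropy_sum(sigma) - bound) < 1e-12
    # ...and likewise saturate the variance bound dx * dp >= 1/2.
    assert abs(sigma * (1 / (2 * sigma)) - 0.5) < 1e-12
```

That the sigma-dependence cancels in the sum is the entropic counterpart of the familiar fact that squeezing position uncertainty inflates momentum uncertainty by exactly the reciprocal factor.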
Investment, regulation, and uncertainty
Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose
2014-01-01
As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745
Calculating Measurement Uncertainties for Mass Spectrometry Data
NASA Astrophysics Data System (ADS)
Essex, R. M.; Goldberg, S. A.
2006-12-01
A complete and transparent characterization of measurement uncertainty is fundamentally important to the interpretation of analytical results. We have observed that the calculation and reporting of uncertainty estimates for isotopic measurements from a variety of analytical facilities are inconsistent, making it difficult to compare and evaluate data. Therefore, we recommend an approach to uncertainty estimation that has been adopted by both US national metrology facilities and is becoming widely accepted within the analytical community. This approach is outlined in the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM). The GUM approach to uncertainty estimation includes four major steps: 1) Specify the measurand; 2) Identify uncertainty sources; 3) Quantify components by determining the standard uncertainty (u) for each component; and 4) Calculate the combined standard uncertainty (u_c) by using established propagation laws to combine the various components. To obtain a desired confidence level, the combined standard uncertainty is multiplied by a coverage factor (k) to yield an expanded uncertainty (U). To be consistent with the GUM principles, it is also necessary to create an uncertainty budget, which is a listing of all the components comprising the uncertainty and their relative contributions to the combined standard uncertainty. In mass spectrometry, Step 1 is normally the determination of an isotopic ratio for a particular element. Step 2 requires the identification of the many potential sources of measurement variability and bias, including: gain, baseline, cup efficiency, Schottky noise, counting statistics, CRM uncertainties, yield calibrations, linearity calibrations, run conditions, and filament geometry. Then an equation expressing the relationship of all of the components to the measurement value must be written. To complete Step 3, these potential sources of uncertainty must be characterized (Type A or Type B) and quantified. This information
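The component-wise combination in Step 4 can be sketched for the simple uncorrelated case, where the propagation law reduces to a root-sum-of-squares of sensitivity-weighted components. The budget entries below are illustrative numbers only, not from any real measurement:

```python
import math

def combined_standard_uncertainty(components):
    """GUM-style combination for uncorrelated inputs:
    u_c = sqrt(sum((c_i * u_i)**2)), where c_i is the sensitivity
    coefficient of input i and u_i its standard uncertainty.
    Correlated inputs would require additional covariance terms."""
    return math.sqrt(sum((c * u) ** 2 for c, u in components))

# Hypothetical budget for a dimensionless isotope-ratio measurement
# (sensitivity coefficient, standard uncertainty) per component:
budget = [
    (1.0, 0.0003),  # counting statistics (Type A)
    (1.0, 0.0002),  # gain/baseline calibration (Type B)
    (1.0, 0.0004),  # reference material certificate (Type B)
]
u_c = combined_standard_uncertainty(budget)
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (~95%)
print(f"u_c = {u_c:.2e}, U(k=2) = {U:.2e}")
```

Listing each component alongside its contribution, as in `budget`, is exactly the uncertainty-budget bookkeeping the GUM requires; the dominant term (here the reference material) shows where effort to reduce u_c should go.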
Entropic uncertainty and measurement reversibility
NASA Astrophysics Data System (ADS)
Berta, Mario; Wehner, Stephanie; Wilde, Mark M.
2016-07-01
The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
ERIC Educational Resources Information Center
MacBeath, John; Swaffield, Sue; Frost, David
2009-01-01
This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…
NASA Astrophysics Data System (ADS)
Lamport, Leslie
2012-08-01
Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so that they have an infinitesimal probability of failing to make a decision quickly enough. Ignorance of the principle could have serious consequences.
Abolishing the maximum tension principle
NASA Astrophysics Data System (ADS)
Dąbrowski, Mariusz P.; Gohar, H.
2015-09-01
We find a series of example theories for which the relativistic limit of maximum tension F_max = c^4/4G, represented by the entropic force, can be abolished. Among them are varying-constants theories, some generalized entropy models applied to both cosmological and black hole horizons, as well as some generalized uncertainty principle models.
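For scale, the conjectured maximum force can be evaluated directly from the defining constants (a trivial arithmetic sketch):

```python
# Numerical value of the conjectured maximum force F_max = c^4 / (4G)
c = 2.99792458e8      # speed of light, m/s (exact by definition)
G = 6.67430e-11       # CODATA 2018 gravitational constant, m^3 kg^-1 s^-2

F_max = c**4 / (4 * G)
print(f"{F_max:.3e} N")  # roughly 3.0e43 N
```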
Laser wavelength meter: analysis of measurement uncertainties
NASA Astrophysics Data System (ADS)
Skrzeczanowski, Wojciech; Zyczkowski, Marek; Dlugaszek, Andrzej
1999-08-01
The principle of operation of a laser wavelength meter based on a Fabry-Perot interferometer and a linear CCD camera is presented in the paper. A relation from which the laser wavelength can be calculated is derived, and a method for determining all component uncertainties of the measurement is shown. An analysis of their influence, together with examples of measurement uncertainty evaluation for four wavelength-meter structural sets with different objective focal lengths, is presented.
Angular performance measure for tighter uncertainty relations
Hradil, Z.; Rehacek, J.; Klimov, A. B.; Rigas, I.; Sanchez-Soto, L. L.
2010-01-15
The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance measure that allows for tighter uncertainty relations for angle and angular momentum. The differences from previous bounds can be significant for particular states and may indeed be amenable to experimental measurement with present technology.
Estimating uncertainties in complex joint inverse problems
NASA Astrophysics Data System (ADS)
Afonso, Juan Carlos
2016-04-01
Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be taken with the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related
Communication and Uncertainty Management.
ERIC Educational Resources Information Center
Brashers, Dale E.
2001-01-01
Suggests the fundamental challenge for refining theories of communication and uncertainty is to abandon the assumption that uncertainty will produce anxiety. Outlines and extends a theory of uncertainty management and reviews current theory and research. Concludes that people want to reduce uncertainty because it is threatening, but uncertainty…
The physical origins of the uncertainty theorem
NASA Astrophysics Data System (ADS)
Giese, Albrecht
2013-10-01
The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.
Uncertainty relations for general unitary operators
NASA Astrophysics Data System (ADS)
Bagchi, Shrobona; Pati, Arun Kumar
2016-10-01
We derive several uncertainty relations for two arbitrary unitary operators acting on physical states of a Hilbert space. We show that our bounds are tighter in various cases than the ones existing in the current literature. Using the uncertainty relation for the unitary operators, we obtain the tight state-independent lower bound for the uncertainty of two Pauli observables and anticommuting observables in higher dimensions. With regard to the minimum-uncertainty states, we derive the minimum-uncertainty state equation by the analytic method and relate this to the ground-state problem of the Harper Hamiltonian. Furthermore, the higher-dimensional limit of the uncertainty relations and minimum-uncertainty states are explored. From an operational point of view, we show that the uncertainty in the unitary operator is directly related to the visibility of quantum interference in an interferometer where one arm of the interferometer is affected by a unitary operator. This shows a principle of preparation uncertainty, i.e., for any quantum system, the amount of visibility for two general noncommuting unitary operators is nontrivially upper bounded.
Aspects of complementarity and uncertainty
NASA Astrophysics Data System (ADS)
Vathsan, Radhika; Qureshi, Tabish
2016-08-01
The two-slit experiment with quantum particles provides many insights into the behavior of quantum mechanics, including Bohr’s complementarity principle. Here, we analyze Einstein’s recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert-Greenberger-Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.
Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.
2015-01-01
This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108
Principles of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Landé, Alfred
2013-10-01
Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ (x) and σ (p); 11. Complementarity; 12. Mathematical relation between ρ (x) and σ (p) for free particles; 13. General relation between ρ (q) and σ (p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ (t) and σ (є); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψp (q) and Xq (p); 39. Differential equation for фβ (q); 40. The general probability amplitude Φβ' (Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schr
Abuelo, D
1987-01-01
The author discusses the basic principles of genetics, including the classification of genetic disorders and a consideration of the rules and mechanisms of inheritance. The most common pitfalls in clinical genetic diagnosis are described, with emphasis on the problem of the negative or misleading family history.
Role of information theoretic uncertainty relations in quantum theory
Jizba, Petr; Dunningham, Jacob A.; Joo, Jaewoo
2015-04-15
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
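The Rényi entropy underlying these ITURs generalizes the Shannon entropy through a parameter alpha; a short sketch of the discrete case (an illustration only, not code from the paper):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (in bits) of a discrete distribution; alpha = 1 is the Shannon limit."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))            # Shannon entropy as the alpha -> 1 limit
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

p = np.array([0.5, 0.25, 0.125, 0.125])
# Rényi entropy is non-increasing in alpha, so the three values below decrease.
print(renyi_entropy(p, 0.5), renyi_entropy(p, 1.0), renyi_entropy(p, 2.0))
```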
Hydrological model uncertainty assessment in southern Africa
NASA Astrophysics Data System (ADS)
Hughes, D. A.; Kapangaziwiri, E.; Sawunyama, T.
2010-06-01
The importance of hydrological uncertainty analysis has been emphasized in recent years and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures used in the southern Africa region. The region is characterized by a paucity of accurate data and limited human resources, but the need for informed development decisions is critical to social and economic development. One of the main sources of uncertainty is related to the estimation of the parameters of hydrological models. This paper proposes a framework for establishing parameter values, exploring parameter inter-dependencies and setting parameter uncertainty bounds for a monthly time-step rainfall-runoff model (Pitman model) that is widely used in the region. The method is based on well-documented principles of sensitivity and uncertainty analysis, but recognizes the limitations that exist within the region (data scarcity and accuracy, model user attitudes, etc.). Four example applications taken from different climate and physiographic regions of South Africa illustrate that the methods are appropriate for generating behavioural stream flow simulations which include parameter uncertainty. The parameters that dominate the model response and their degree of uncertainty vary between regions. Some of the results suggest that the uncertainty bounds will be too wide for effective water resources decision making. Further work is required to reduce some of the subjectivity in the methods and to investigate other approaches for constraining the uncertainty. The paper recognizes that probability estimates of uncertainty and methods to include input climate data uncertainties need to be incorporated into the framework in the future.
Fission Spectrum Related Uncertainties
G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores
2007-10-01
The paper presents a preliminary uncertainty analysis related to potential uncertainties in the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However, the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.
Direct Aerosol Forcing Uncertainty
Mccomiskey, Allison
2008-01-15
Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and the typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of the total uncertainty in calculated DRF and identification of the properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m^-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests that the contribution of modeling error is small compared to the total uncertainty, although comparable to the uncertainty arising from some individual properties.
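The computation described, a per-property contribution equal to sensitivity times measurement uncertainty, can be sketched as follows; the numbers and the quadrature combination (which assumes independent error sources) are illustrative assumptions, not values from the study:

```python
import math

# Hypothetical sensitivities dDRF/dx (W m^-2 per unit of each property) and
# measurement uncertainties u(x); values are illustrative, not from the study.
properties = {
    "aerosol_optical_depth":    {"sensitivity": -25.0, "uncertainty": 0.01},
    "single_scattering_albedo": {"sensitivity": 30.0,  "uncertainty": 0.03},
    "asymmetry_parameter":      {"sensitivity": 10.0,  "uncertainty": 0.02},
    "surface_albedo":           {"sensitivity": 5.0,   "uncertainty": 0.02},
}

# Per-property contribution = sensitivity * measurement uncertainty;
# total combined in quadrature, assuming independent error sources.
contributions = {k: v["sensitivity"] * v["uncertainty"] for k, v in properties.items()}
total = math.sqrt(sum(c ** 2 for c in contributions.values()))
largest = max(contributions, key=lambda k: abs(contributions[k]))
print(f"total uncertainty ~ {total:.2f} W m^-2, dominated by {largest}")
```

With these illustrative inputs the single scattering albedo term dominates, matching the qualitative conclusion of the abstract.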
Reproducibility and uncertainty of wastewater turbidity measurements.
Joannis, C; Ruban, G; Gromaire, M-C; Chebbo, G; Bertrand-Krajewski, J-L; Joannis, C; Ruban, G
2008-01-01
Turbidity monitoring is a valuable tool for operating sewer systems, but it is often considered a somewhat tricky parameter for assessing water quality, because measured values depend on the sensor model, and even on the operator. This paper details the main components of the uncertainty in turbidity measurements, with a special focus on reproducibility, and provides guidelines for improving the reproducibility of measurements in wastewater through proper calibration procedures. Calibration appears to be the main source of uncertainty, and proper procedures must account for uncertainties in standard solutions as well as the nonlinearity of the calibration curve. With such procedures, the uncertainty and reproducibility of field measurements can be kept below 5% or 25 FAU. On the other hand, reproducibility has no meaning if different measuring principles (attenuation vs. nephelometry) or very different wavelengths are used.
Optimising uncertainty in physical sample preparation.
Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger
2005-11-01
Uncertainty associated with the result of a measurement can be dominated by the physical sample preparation stage of the measurement process. In view of this, the Optimised Uncertainty (OU) methodology has been further developed to allow the optimisation of the uncertainty from this source, in addition to that from the primary sampling and the subsequent chemical analysis. This new methodology for the optimisation of physical sample preparation uncertainty (u(prep), estimated as s(prep)) is applied for the first time, to a case study of myclobutanil in retail strawberries. An increase in expenditure (+7865%) on the preparatory process was advised in order to reduce s(prep) by the recommended 69%. This reduction is desirable given the predicted overall saving, under optimised conditions, of 33,000 pounds sterling per batch. This new methodology has been shown to provide guidance on the appropriate distribution of resources between the three principal stages of a measurement process, including physical sample preparation.
NASA Technical Reports Server (NTRS)
Sato, Toru
1989-01-01
Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.
Uncertainty and cognitive control.
Mushtaq, Faisal; Bland, Amy R; Schaefer, Alexandre
2011-01-01
A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the "need for control"; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.
[Ethics, empiricism and uncertainty].
Porz, R; Zimmermann, H; Exadaktylos, A K
2011-01-01
Accidents can lead to difficult boundary situations. Such situations often take place in emergency units. The medical team thus often and inevitably faces professional uncertainty in its decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in the treatment or withdrawal of treatment. It does not need to be covered by evidence-based arguments, especially as some singular situations of individual tragedy cannot be grasped in terms of evidence-based medicine.
Uncertainty in hydrological signatures
NASA Astrophysics Data System (ADS)
McMillan, Hilary; Westerberg, Ida
2015-04-01
Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
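The Monte Carlo approach described can be sketched for a single signature; the synthetic flow record, the 10% multiplicative error model, and the choice of a Q95 low-flow signature are all illustrative assumptions, not details from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observed" daily flow record (m^3/s); purely illustrative.
days = 365
flow = np.exp(rng.normal(1.0, 0.8, size=days))

def signature_q95(q):
    """Low-flow signature: the flow exceeded 95% of the time."""
    return np.percentile(q, 5)

# Monte Carlo propagation: each realisation perturbs the record with a
# multiplicative error standing in for rating-curve uncertainty (here 10%).
n_realisations = 2000
samples = [
    signature_q95(flow * rng.normal(1.0, 0.10, size=days))
    for _ in range(n_realisations)
]
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"Q95 = {signature_q95(flow):.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

The same resampling loop can wrap any signature function (runoff ratio, mean annual flood, recession shape), which is what makes the method generally applicable.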
Deterministic uncertainty analysis
Worley, B.A.
1987-01-01
Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig.
Uncertainty Analysis of Thermal Comfort Parameters
NASA Astrophysics Data System (ADS)
Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages
2015-08-01
International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
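The PPD formula of ISO 7730, and a first-order propagation of an uncertainty in PMV through it, can be sketched as follows (the propagation step is a generic illustration, not the paper's procedure):

```python
import math

def ppd(pmv):
    """Predicted percentage dissatisfied as a function of PMV (ISO 7730)."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

def ppd_uncertainty(pmv, u_pmv):
    """First-order propagation: u(PPD) = |dPPD/dPMV| * u(PMV)."""
    dppd = 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2) \
           * (4 * 0.03353 * pmv**3 + 2 * 0.2179 * pmv)
    return abs(dppd) * u_pmv

print(ppd(0.0))                   # 5.0 — even a neutral mean vote leaves 5% dissatisfied
print(ppd_uncertainty(0.5, 0.1))  # u(PPD) for PMV = 0.5 with u(PMV) = 0.1
```

Note that at PMV = 0 the derivative vanishes, so first-order propagation understates the PPD uncertainty there; this is one reason a fuller treatment, as in the paper, is needed.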
The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment
ERIC Educational Resources Information Center
Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea
2010-01-01
An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…
Generalized uncertainty principle and self-adjoint operators
Balasubramanian, Venkat; Das, Saurya; Vagenas, Elias C.
2015-09-15
In this work we explore the self-adjointness of the GUP-modified momentum and Hamiltonian operators over different domains. In particular, we utilize the theorem by von-Neumann for symmetric operators in order to determine whether the momentum and Hamiltonian operators are self-adjoint or not, or they have self-adjoint extensions over the given domain. In addition, a simple example of the Hamiltonian operator describing a particle in a box is given. The solutions of the boundary conditions that describe the self-adjoint extensions of the specific Hamiltonian operator are obtained.
NFkappaB in neurons? The uncertainty principle in neurobiology.
Massa, Paul T; Aleyasin, Hossein; Park, David S; Mao, Xianrong; Barger, Steven W
2006-05-01
Nuclear factor kappaB (NFkappaB) is a dynamically modulated transcription factor with an extensive literature pertaining to widespread actions across species, cell types and developmental stages. Analysis of NFkappaB in a complex environment such as neural tissue suffers from a difficulty in simultaneously establishing both activity and location. Much of the available data indicate a profound recalcitrance of NFkappaB activation in neurons, as compared with most other cell types. Few studies to date have sought to distinguish between the various combinatorial dimers of NFkappaB family members. Recent research has illuminated the importance of these problems, as well as opportunities to move past them to the nuances manifest through variable activation pathways, subunit complexity and target sequence preferences.
Economic uncertainty and econophysics
NASA Astrophysics Data System (ADS)
Schinckus, Christophe
2009-10-01
The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty, and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework, in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.
Physical Uncertainty Bounds (PUB)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Beier, Meghan L.
2015-01-01
Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700
NASA Astrophysics Data System (ADS)
Sciacchitano, Andrea; Wieneke, Bernhard
2016-08-01
This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5-10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
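The scaling noted above for statistical quantities — random uncertainty of the mean falling with the square root of the effective number of independent samples — can be illustrated with a minimal sketch (synthetic data, not PIV measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "velocity" samples: true mean 1.0, standard deviation 0.2.
N = 10_000
u = rng.normal(1.0, 0.2, size=N)

# Random uncertainty of the mean value for N_eff independent samples:
# U_mean = sigma / sqrt(N_eff). Here samples are independent, so N_eff = N.
sigma = u.std(ddof=1)
U_mean = sigma / np.sqrt(N)

print(f"sample mean      : {u.mean():.4f}")
print(f"uncertainty (1s) : {U_mean:.5f}")
```

With correlated samples (as in time-resolved PIV), N would be replaced by a smaller effective sample count, inflating U_mean accordingly.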
Uncertainty relations and precession of perihelion
NASA Astrophysics Data System (ADS)
Scardigli, Fabio; Casadio, Roberto
2016-03-01
We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the perihelion precession for planets in the solar system, and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.
Uncertainty in quantum mechanics: faith or fantasy?
Penrose, Roger
2011-12-13
The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.
Optimal Universal Uncertainty Relations
Li, Tao; Xiao, Yunlong; Ma, Teng; Fei, Shao-Ming; Jing, Naihuan; Li-Jost, Xianqing; Wang, Zhi-Xi
2016-01-01
We study universal uncertainty relations and present a method called the joint probability distribution diagram to improve the majorization bounds constructed independently in [Phys. Rev. Lett. 111, 230401 (2013)] and [J. Phys. A 46, 272002 (2013)]. The results give rise to state-independent uncertainty relations satisfied by any nonnegative Schur-concave functions. On the other hand, a remarkable recent result on entropic uncertainty relations is the direct-sum majorization relation. In this paper, we illustrate our bounds by showing how they complement the one in [Phys. Rev. A 89, 052115 (2014)]. PMID:27775010
Adaptive framework for uncertainty analysis in electromagnetic field measurements.
Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano
2015-04-01
Misinterpretation of uncertainty in the measurement of electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of the required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has been internationally adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. The framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed with measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a 28% reduction in measurement uncertainty.
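A minimal sketch of the two ingredients named in the abstract — a kernel-mixture likelihood over the readings and importance sampling against a prior — assuming hypothetical EMF readings and a flat prior (our reconstruction of the idea, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical repeated field-strength readings (V/m).
readings = rng.normal(1.2, 0.15, size=20)
# Silverman-style bandwidth for the Gaussian kernel mixture.
h = 1.06 * readings.std(ddof=1) * len(readings) ** -0.2

def kernel_mixture(theta):
    # Gaussian kernel-mixture density of the readings, evaluated at theta.
    z = (theta[:, None] - readings[None, :]) / h
    return np.mean(np.exp(-0.5 * z**2), axis=1) / (h * np.sqrt(2 * np.pi))

# Importance sampling: draw from a flat prior over plausible field values,
# weight each draw by the kernel-mixture likelihood.
prior = rng.uniform(0.0, 3.0, size=50_000)
w = kernel_mixture(prior)
w /= w.sum()

posterior_mean = np.sum(w * prior)
u = np.sqrt(np.sum(w * (prior - posterior_mean) ** 2))
print(f"field = {posterior_mean:.2f} +/- {u:.2f} V/m")
```

Swapping in an informative prior only changes the `prior` draws; the weighting step is unchanged, which is what makes the framework adaptive.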
Boltz, J P; Daigger, G T
2010-01-01
While biofilm reactors may be classified as one of seven different types, the design of each is unified by fundamental biofilm principles. It follows that state-of-the-art design of each biofilm reactor type is subject to the same uncertainties (although the degree of uncertainty may vary). This paper describes unifying biofilm principles and uncertainties of importance in biofilm reactor design. This approach to biofilm reactor design represents a shift from the historical approach, which was based on empirical criteria and design formulations. The use of such design criteria was largely due to inherent uncertainty over reactor-scale hydrodynamics and biofilm dynamics, which correlate with biofilm thickness, structure and function. An understanding of two fundamental concepts is required to rationally design biofilm reactors: bioreactor hydrodynamics and biofilm dynamics (with particular emphasis on mass transfer resistances). Bulk-liquid hydrodynamics influences biofilm thickness control, surface area, and development. Biofilm dynamics influences biofilm thickness, structure and function. While the complex hydrodynamics of some biofilm reactors such as trickling filters and biological filters have prevented the widespread use of fundamental biofilm principles and mechanistic models in practice, reactors utilizing integrated fixed-film activated sludge or moving bed technology provide a bulk-liquid hydrodynamic environment allowing for their application. From a substrate transformation perspective, mass transfer in biofilm reactors defines the primary difference between suspended growth and biofilm systems: suspended growth systems are kinetically (i.e., biomass) limited and biofilm reactors are primarily diffusion (i.e., biofilm growth surface area) limited.
Communicating scientific uncertainty
Fischhoff, Baruch; Davis, Alex L.
2014-01-01
All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390
Evaluating prediction uncertainty
McKay, M.D.
1995-03-01
The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
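The variance-ratio idea — comparing the prediction distribution with conditional prediction distributions to find important inputs — can be sketched with a crude binned estimator on a toy model (a stand-in for the paper's replicated Latin hypercube estimator; the model is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x1, x2, x3):
    # Toy model in which x1 dominates the output variance.
    return 10.0 * x1 + x2 + 0.1 * x3

N = 200_000
X = rng.uniform(0.0, 1.0, size=(N, 3))
Y = model(*X.T)
varY = Y.var()

def importance(i, bins=50):
    # Crude variance ratio Var(E[Y | X_i]) / Var(Y), estimated by
    # binning on input i and averaging Y within each bin.
    idx = np.digitize(X[:, i], np.linspace(0, 1, bins + 1)[1:-1])
    cond_means = np.array([Y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / varY

ratios = [importance(i) for i in range(3)]
print("importance ratios:", np.round(ratios, 4))
```

A ratio near 1 flags an input as a dominant cause of prediction uncertainty; note that no linearity assumption is needed, matching the abstract's point.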
Conundrums with uncertainty factors.
Cooke, Roger
2010-03-01
The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767
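The probabilistic reinterpretation of uncertainty factors that the paper critiques can be sketched in a few lines; the lognormal parameters and the NOAEL value here are purely illustrative, not drawn from IRIS:

```python
import numpy as np

rng = np.random.default_rng(2)

# Probabilistic reading of uncertainty factors: each factor is treated
# as a lognormal random variable whose median equals the nominal value
# of 10. Names and parameters are hypothetical.
N = 100_000
uf_animal_to_human = rng.lognormal(mean=np.log(10), sigma=0.5, size=N)
uf_human_variability = rng.lognormal(mean=np.log(10), sigma=0.5, size=N)

noael = 50.0  # mg/kg-day, hypothetical
rfd = noael / (uf_animal_to_human * uf_human_variability)

print(f"deterministic RfD : {noael / 100:.3f}")
print(f"median MC RfD     : {np.median(rfd):.3f}")
```

The Monte Carlo median matches the deterministic reference value here only because medians of independent lognormals multiply; the paper's point is that the independence and rate-ratio assumptions behind such calculations are far stronger than they appear.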
[The precautionary principle and the environment].
de Cózar Escalante, José Manuel
2005-01-01
The precautionary principle is a response to uncertainty in the face of risks to health or the environment. In general, it involves taking measures to avoid potential harm, despite lack of scientific certainty. In recent years it has been applied, not without difficulties, as a legal and political principle in many countries, particularly on the European and International level. In spite of the controversy, the precautionary principle has become an integral component of a new paradigm for the creation of public policies needed to meet today's challenges and those of the future.
Dasymetric Modeling and Uncertainty
Nagle, Nicholas N.; Buttenfield, Barbara P.; Leyk, Stefan; Spielman, Seth
2014-01-01
Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not yet been fully addressed. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology - the Penalized Maximum Entropy Dasymetric Model (P-MEDM) - is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that combines household-level survey data with higher-spatial-resolution data such as census tracts, block groups, and land cover classifications. PMID:25067846
Classification images with uncertainty
Tjan, Bosco S.; Nandy, Anirvan S.
2009-01-01
Classification image and other similar noise-driven linear methods have found increasingly wider applications in revealing psychophysical receptive field structures or perceptual templates. These techniques are relatively easy to deploy, and the results are simple to interpret. However, being a linear technique, the utility of the classification-image method is believed to be limited. Uncertainty about the target stimuli on the part of an observer will result in a classification image that is the superposition of all possible templates for all the possible signals. In the context of a well-established uncertainty model, which pools the outputs of a large set of linear frontends with a max operator, we show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not be combined, as is conventionally done. We show that when the signal contrast is high, the subimages from the error trials contain a clear high-contrast image that is negatively correlated with the perceptual template associated with the presented signal, relatively unaffected by uncertainty. The subimages also contain a “haze” that is of a much lower contrast and is positively correlated with the superposition of all the templates associated with the erroneous response. In the case of spatial uncertainty, we show that the spatial extent of the uncertainty can be estimated from the classification subimages. We link intrinsic uncertainty to invariance and suggest that this signal-clamped classification-image method will find general applications in uncovering the underlying representations of high-level neural and psychophysical mechanisms. PMID:16889477
Network planning under uncertainties
NASA Astrophysics Data System (ADS)
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, network planning problems with traffic requirements that vary over time were studied. Such problems are still actively researched today, especially for VPN network design. Another class of network planning problems under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact for up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, under more general uncertainty conditions, that allows a more systematic way to solve them. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a
NASA Astrophysics Data System (ADS)
Jones, P. W.; Strelitz, R. A.
2012-12-01
The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous uncertainty-density field. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by that density. The problem thus devolves into a minimization. Computation of such a spatial decomposition is O(N^2), and it can be computed iteratively, making it possible to update the decomposition easily, and quickly, over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are analogous to the cartograms familiar to the information visualization community from depictions of, for example, voting results per state. Furthermore, one can dispense with the mesh or edges entirely, to be replaced by symbols or glyphs
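The equal-content idea is easiest to see in one dimension, where the weighted Voronoi tessellation reduces to splitting the cumulative uncertainty into equal parts (a 1-D illustration only, with a synthetic density; not the paper's 2-D/3-D algorithm):

```python
import numpy as np

# Synthetic uncertainty-density field on [0, 1] with a hot spot at x = 0.7.
x = np.linspace(0.0, 1.0, 1001)
density = 1.0 + 5.0 * np.exp(-((x - 0.7) ** 2) / 0.005)

# Cumulative uncertainty, normalized to 1.
cdf = np.cumsum(density)
cdf /= cdf[-1]

# Cell edges chosen so every cell holds the same total uncertainty:
# invert the CDF at equally spaced levels.
N_cells = 8
edges = np.interp(np.linspace(0, 1, N_cells + 1), cdf, x)
widths = np.diff(edges)
print(np.round(widths, 3))  # cells narrow sharply near x = 0.7
```

As in the abstract, a small cell marks a concentration of uncertainty; the hot spot is visible purely from the cell widths, without touching the data visualization itself.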
Equivalence principles and electromagnetism
NASA Technical Reports Server (NTRS)
Ni, W.-T.
1977-01-01
The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.
Measurement uncertainty relations
Busch, Paul; Lahti, Pekka; Werner, Reinhard F.
2014-04-15
Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
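In standard notation, the preparation relation for canonically conjugate observables, and the measurement analogue the paper establishes (stated here for the quadratic-mean case), read:

```latex
% Preparation uncertainty, for standard deviations in a state \rho:
\Delta_\rho(Q)\,\Delta_\rho(P) \;\ge\; \frac{\hbar}{2},
\qquad
% Measurement uncertainty, for error measures \varepsilon of an
% approximate joint measurement of Q and P:
\varepsilon(Q)\,\varepsilon(P) \;\ge\; \frac{\hbar}{2}.
```

Per the abstract, the optimal constants for the two relations coincide, and both generalize from quadratic means to means of order α.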
Serenity in political uncertainty.
Doumit, Rita; Afifi, Rema A; Devon, Holli A
2015-01-01
College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that best determinants of well-being are resilience, uncertainty, social support, and gender that accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding on how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930
Uncertainty and calibration analysis
Coutts, D.A.
1991-03-01
All measurements contain some deviation from the true value which is being measured. In the common vernacular this deviation between the true value and the measured value is called an inaccuracy, an error, or a mistake. Since all measurements contain errors, it is necessary to accept that there is a limit to how accurate a measurement can be. The uncertainty interval, combined with the confidence level, is one measure of the accuracy of a measurement or value. Without a statement of uncertainty (or a similar parameter) it is not possible to evaluate whether the accuracy of the measurement, or data, is appropriate. The preparation of technical reports, calibration evaluations, and design calculations should consider the accuracy of the measurements and data being used. There are many methods to accomplish this. This report provides a consistent method for the handling of measurement tolerances, calibration evaluations, and uncertainty calculations. The SRS Quality Assurance (QA) Program requires that the uncertainty of technical data and instrument calibrations be acknowledged and estimated. The QA Program makes some specific technical requirements related to the subject but does not provide a philosophy or method for how uncertainty should be estimated. This report was prepared to provide a technical basis to support the calculation of uncertainties and the calibration of measurement and test equipment for any activity within the Experimental Thermal-Hydraulics (ETH) Group. The methods proposed in this report provide a graded approach for estimating the uncertainty of measurements, data, and calibrations. The method is based on the national consensus standard, ANSI/ASME PTC 19.1.
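The root-sum-square combination in the spirit of ANSI/ASME PTC 19.1 — systematic (bias) and random contributions combined into a single uncertainty interval — reduces to one line; the numbers below are illustrative, not from the report:

```python
import math

# Single-result uncertainty combination, PTC 19.1 style:
# systematic and random contributions combined by root-sum-square.
bias_limit = 0.8   # estimate of the systematic uncertainty, e.g. deg C
random_std = 0.3   # sample standard deviation of the mean
t_95 = 2.0         # coverage factor (large sample, ~95 % confidence)

U_rss = math.sqrt(bias_limit**2 + (t_95 * random_std)**2)
print(f"U_RSS = +/-{U_rss:.2f}")
```

The result would be reported as "measured value +/- U_RSS at 95 % confidence", which is exactly the "uncertainty interval combined with the confidence level" the report calls for.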
Weighted Uncertainty Relations
Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming
2016-01-01
Recently, Maccone and Pati gave two stronger uncertainty relations based on the sum of variances, one of which is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. A generalization to the multi-observable case is also given, and an optimal lower bound for the weighted sum of the variances is obtained for general quantum states. PMID:26984295
NASA Technical Reports Server (NTRS)
Brown, Laurie M.
1993-01-01
An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930s.
Measurement Uncertainty Estimation in Amperometric Sensors: A Tutorial Review
Helm, Irja; Jalukse, Lauri; Leito, Ivo
2010-01-01
This tutorial focuses on measurement uncertainty estimation in amperometric sensors (for both liquid- and gas-phase measurements). The main uncertainty sources are reviewed and their contributions are discussed in relation to the principles of operation of the sensors, the measurement conditions and the properties of the measured samples. The discussion is illustrated by case studies based on the two major approaches to uncertainty evaluation: the ISO GUM modeling approach and the Nordtest approach. This tutorial is expected to be of interest to workers in different fields of science who use measurements with amperometric sensors and need to evaluate the uncertainty of the obtained results but are new to the concept of measurement uncertainty. It is also intended to be educational, helping to make measurement results more accurate. PMID:22399887
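The ISO GUM modeling approach mentioned above combines input uncertainties through sensitivity coefficients. A generic sketch of that combination for uncorrelated inputs — the sensor contributions and all values here are hypothetical, not taken from the tutorial:

```python
import math

# GUM-style combined standard uncertainty for a measurement model
# y = f(x1, ..., xn):  u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2
# for uncorrelated inputs.
def combined_uncertainty(sensitivities, uncertainties):
    return math.sqrt(sum((c * u) ** 2
                         for c, u in zip(sensitivities, uncertainties)))

# Hypothetical amperometric O2 reading (mg/L): contributions from
# calibration, temperature dependence, and repeatability.
u_c = combined_uncertainty([1.0, 0.05, 1.0], [0.08, 0.5, 0.04])
U_expanded = 2.0 * u_c  # coverage factor k = 2 (~95 % confidence)
print(f"u_c = {u_c:.4f}, U = {U_expanded:.4f}")
```

The Nordtest approach differs mainly in how the input terms are obtained (from validation and control-chart data rather than a cause-and-effect model); the final quadrature combination is the same.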
Asymptotic entropic uncertainty relations
NASA Astrophysics Data System (ADS)
Adamczak, Radosław; Latała, Rafał; Puchała, Zbigniew; Życzkowski, Karol
2016-03-01
We analyze entropic uncertainty relations for two orthogonal measurements on an N-dimensional Hilbert space, performed in two generic bases. It is assumed that the unitary matrix U relating both bases is distributed according to the Haar measure on the unitary group. We provide lower bounds on the average Shannon entropy of probability distributions related to both measurements. The bounds are stronger than those obtained with use of the entropic uncertainty relation by Maassen and Uffink, and they are optimal up to additive constants. We also analyze the case of a large number of measurements and obtain strong entropic uncertainty relations, which hold with high probability with respect to the random choice of bases. The lower bounds we obtain are optimal up to additive constants and allow us to prove a conjecture by Wehner and Winter on the asymptotic behavior of constants in entropic uncertainty relations as the dimension tends to infinity. As a tool we develop estimates on the maximum operator norm of a submatrix of a fixed size of a random unitary matrix distributed according to the Haar measure, which are of independent interest.
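For reference, the Maassen–Uffink relation that these bounds are compared against can be stated compactly (standard textbook form; the notation is assumed here, not taken from the paper):

```latex
H(p) + H(q) \;\ge\; -2\ln c ,
\qquad
c = \max_{j,k}\,\lvert U_{jk}\rvert ,
```

where $H$ denotes the Shannon entropy and $p$, $q$ are the outcome distributions of the two measurements whose eigenbases are related by the unitary $U$.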
Uncertainties in repository modeling
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Bereby-Meyer, Yoella
2012-02-01
Guala points to a discrepancy between strong negative reciprocity observed in the lab and the way cooperation is sustained "in the wild." This commentary suggests that in lab experiments, strong negative reciprocity is limited when uncertainty exists regarding the players' actions and the intentions. Thus, costly punishment is indeed a limited mechanism for sustaining cooperation in an uncertain environment. PMID:22289307
An uncertainty inventory demonstration - a primary step in uncertainty quantification
Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J
2009-01-01
Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and PRAs therefore lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.
Uncertainty Analysis in Space Radiation Protection
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.
2011-01-01
Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.
Courtney, H; Kirkland, J; Viguerie, P
1997-01-01
At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.
The propagation of uncertainty for humidity calculations
NASA Astrophysics Data System (ADS)
Lovell-Smith, J.
2009-12-01
This paper addresses the international humidity community's need for standardization of methods for propagation of uncertainty associated with humidity generators and for handling uncertainty associated with the reference water vapour-pressure and enhancement-factor equations. The paper outlines uncertainty calculations for the mixing ratio, dew-point temperature and relative humidity output from humidity generators, and in particular considers controlling equations for a theoretical hybrid humidity generator combining single-pressure (1-P), two-pressure (2-P) and two-flow (2-F) principles. Also considered is the case where the humidity generator is used as a stable source with traceability derived from a reference hygrometer, i.e. a dew-point meter, a relative humidity meter or a wet-bulb psychrometer. Most humidity generators in use at national metrology institutes can be considered to be special cases of those considered here and sensitivity coefficients for particular types may be extracted. The ability to account for correlations between input variables and between different instances of the evaluation of the reference equations is discussed. The uncertainty calculation examples presented here are representative of most humidity calculations.
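The GUM-style propagation the paper standardizes combines input uncertainties through sensitivity coefficients. A minimal numeric sketch (generic illustration only, not the paper's controlling equations for any particular humidity generator; the toy ratio model below is an assumption):

```python
import math

def combined_uncertainty(f, x, u, h=1e-6):
    """GUM-style combined standard uncertainty for uncorrelated inputs.

    f : callable taking a list of input values
    x : list of input estimates
    u : list of standard uncertainties of the inputs
    Sensitivity coefficients df/dx_i are estimated by central differences,
    and contributions are combined in root-sum-square fashion.
    """
    total = 0.0
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        c_i = (f(xp) - f(xm)) / (2 * h)   # sensitivity coefficient df/dx_i
        total += (c_i * u[i]) ** 2
    return math.sqrt(total)

# Toy model: a humidity-like ratio of two measured pressures, y = e / e_s
ratio = lambda v: v[0] / v[1]
u_rh = combined_uncertainty(ratio, [1.2, 2.4], [0.01, 0.02])
```

Correlated inputs would add cross terms with the covariances, which is exactly the bookkeeping the paper argues should be standardized.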
Asymmetric Uncertainty Expression for High Gradient Aerodynamics
NASA Technical Reports Server (NTRS)
Pinier, Jeremy T
2012-01-01
When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
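The paper's dispersion method is not reproduced here. One common way to disperse data within asymmetric bounds, offered only as a plausible sketch of the idea, is a two-piece (split) normal with different scales on each side of the nominal value (all names and numbers below are illustrative):

```python
import random

def sample_split_normal(nominal, sigma_lo, sigma_hi, rng=random):
    """Draw one sample from a two-piece normal: scale sigma_lo below the
    nominal value, sigma_hi above it. The side is chosen with probability
    proportional to its scale, which keeps the density continuous at the
    nominal value."""
    p_lo = sigma_lo / (sigma_lo + sigma_hi)
    if rng.random() < p_lo:
        return nominal - abs(rng.gauss(0.0, sigma_lo))
    return nominal + abs(rng.gauss(0.0, sigma_hi))

# Disperse a coefficient-like value with a tight lower and loose upper bound
samples = [sample_split_normal(0.025, 0.001, 0.004) for _ in range(10000)]
```

By construction, roughly sigma_hi / (sigma_lo + sigma_hi) of the Monte Carlo draws land above the nominal value, reproducing the asymmetry of the bounds.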
[Stereotactic body radiation therapy: uncertainties and margins].
Lacornerie, T; Marchesi, V; Reynaert, N
2014-01-01
The principles governing stereotactic body radiation therapy are tight margins and large dose gradients around targets. Every step of treatment preparation and delivery must be evaluated before applying this technique in the clinic. Uncertainties remain in each of these steps: delineation, prescription with the biological equivalent dose, treatment planning, patient set-up taking into account movements, the machine accuracy. The calculation of margins to take into account uncertainties differs from conventional radiotherapy because of the delivery of few fractions and large dose gradients around the target. The quest for high accuracy is complicated by the difficulty of achieving it and by the lack of consensus regarding the prescription. Many dose/number-of-fractions schemes are described in clinical studies, and there are differences in the way the delivered doses are described. While waiting for the ICRU report dedicated to this technique, it seems desirable to use the quantities proposed in ICRU Report 83 (IMRT) to report the dose distribution. PMID:25023588
Simple Resonance Hierarchy for Surmounting Quantum Uncertainty
Amoroso, Richard L.
2010-12-22
For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy dependent spacetime metric, M{sub 4{+-}}C{sub 4} with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.
Majorization entropic uncertainty relations
NASA Astrophysics Data System (ADS)
Puchała, Zbigniew; Rudnicki, Łukasz; Życzkowski, Karol
2013-07-01
Entropic uncertainty relations in a finite-dimensional Hilbert space are investigated. Making use of the majorization technique we derive explicit lower bounds for the sum of Rényi entropies describing probability distributions associated with a given pure state expanded in eigenbases of two observables. Obtained bounds are expressed in terms of the largest singular values of submatrices of the unitary rotation matrix. Numerical simulations show that for a generic unitary matrix of size N = 5, our bound is stronger than the well-known result of Maassen and Uffink (MU) with a probability larger than 98%. We also show that the bounds investigated are invariant under the dephasing and permutation operations. Finally, we derive a classical analogue of the MU uncertainty relation, which is formulated for stochastic transition matrices. Dedicated to Iwo Białynicki-Birula on the occasion of his 80th birthday.
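The Maassen–Uffink (MU) benchmark used in this comparison is easy to evaluate numerically. A sketch (function and variable names are illustrative): draw a Haar-random unitary via the QR decomposition of a complex Ginibre matrix, then compute the MU lower bound on the sum of the two Shannon entropies:

```python
import numpy as np

def haar_unitary(n, rng):
    """Haar-distributed random unitary via QR of a complex Ginibre matrix."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))   # fix column phases so the distribution is Haar

def maassen_uffink_bound(u):
    """MU lower bound on H(p) + H(q) for two bases related by the unitary u:
    -2 ln c with c the largest modulus of an entry of u."""
    c = np.max(np.abs(u))
    return -2.0 * np.log(c)

rng = np.random.default_rng(42)
u = haar_unitary(5, rng)
bound = maassen_uffink_bound(u)
```

Since every column of a unitary has unit norm, c lies between 1/sqrt(N) and 1, so the bound always falls between 0 and ln N.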
Uncertainties in transpiration estimates.
Coenders-Gerrits, A M J; van der Ent, R J; Bogaard, T A; Wang-Erlandsson, L; Hrachowitz, M; Savenije, H H G
2014-02-13
Arising from S. Jasechko et al. Nature 496, 347-350 (2013); doi:10.1038/nature11983. How best to assess the respective importance of plant transpiration over evaporation from open waters, soils and short-term storage such as tree canopies and understories (interception) has long been debated. On the basis of data from lake catchments, Jasechko et al. conclude that transpiration accounts for 80-90% of total land evaporation globally (Fig. 1a). However, another choice of input data, together with more conservative accounting of the related uncertainties, reduces and widens the transpiration ratio estimation to 35-80%. Hence, climate models do not necessarily conflict with observations, but more measurements on the catchment scale are needed to reduce the uncertainty range. There is a Reply to this Brief Communications Arising by Jasechko, S. et al. Nature 506, http://dx.doi.org/10.1038/nature12926 (2014).
Fulford, J.M.; Davies, W.J.
2005-01-01
The U.S. Geological Survey is investigating the performance of radars used for stage (or water-level) measurement. This paper presents a comparison of estimated uncertainties and data for radar water-level measurements with float, bubbler, and wire weight water-level measurements. The radar sensor was also temperature-tested in a laboratory. The uncertainty estimates indicate that radar measurements are more accurate than uncorrected pressure sensors at higher water stages, but are less accurate than pressure sensors at low stages. Field data at two sites indicate that radar sensors may have a small negative bias. Comparison of field radar measurements with wire weight measurements found that the radar tends to measure slightly lower values as stage increases. Copyright ASCE 2005.
Mass Uncertainty and Application For Space Systems
NASA Technical Reports Server (NTRS)
Beech, Geoffrey
2013-01-01
Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is an NTE (not-to-exceed), MGA is inclusive of actual MGA (A5 & A6). If the specification is not an NTE (e.g., nominal), then MGA values are reduced by A5 values and A5 is representative of the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
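The mass bookkeeping above can be sketched directly; a hypothetical mini-calculator (function names and the sample numbers are illustrative, not from any NASA tool):

```python
def predicted_mass(basic_mass, mga_fraction):
    """Predicted mass = basic mass (no embedded margin) + mass growth allowance."""
    mga = basic_mass * mga_fraction          # MGA % from the approved schedule
    return basic_mass + mga

def aggregate_mga_percent(items):
    """Aggregate MGA fraction for a roll-up of (basic_mass, mga_fraction) pairs:
    (aggregate predicted - aggregate basic) / aggregate basic."""
    basic = sum(b for b, _ in items)
    predicted = sum(predicted_mass(b, f) for b, f in items)
    return (predicted - basic) / basic

# Hypothetical three-subsystem roll-up (masses in kg, MGA as fractions)
system = [(120.0, 0.15), (80.0, 0.25), (40.0, 0.05)]
```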
Multiresolutional models of uncertainty generation and reduction
NASA Technical Reports Server (NTRS)
Meystel, A.
1989-01-01
Kolmogorov's axiomatic principles of probability theory are reconsidered in the scope of their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified in order to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies algorithms of learning which are organized in three groups reflecting the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by and consistent with the multiresolutional model of knowledge representation which is reflected in the structure of models and the algorithms of learning.
Uncertainties in climate stabilization
Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.
2009-11-01
We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2 °C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.
Uncertainty quantified trait predictions
NASA Astrophysics Data System (ADS)
Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter
2015-04-01
Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
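BHPMF itself couples a taxonomic hierarchy with Gibbs sampling. As a minimal sketch of only the underlying PMF idea (gap-filling a sparse matrix by low-rank factorization; all names and the toy data are illustrative, and no hierarchy or uncertainty estimation is included):

```python
import numpy as np

def pmf_impute(x, mask, rank=1, lr=0.02, reg=0.01, steps=3000, seed=0):
    """Fill missing entries of x (mask == 1 where observed) with a low-rank fit,
    via gradient descent on the squared error over observed entries only."""
    rng = np.random.default_rng(seed)
    n, m = x.shape
    u = 0.1 * rng.standard_normal((n, rank))   # row (e.g. species) factors
    v = 0.1 * rng.standard_normal((m, rank))   # column (e.g. trait) factors
    for _ in range(steps):
        err = mask * (u @ v.T - x)             # residual on observed entries
        u -= lr * (err @ v + reg * u)
        v -= lr * (err.T @ u + reg * v)
    return u @ v.T                             # dense completed matrix

# Rank-1 toy "trait matrix" with ~30% of entries missing
rng = np.random.default_rng(1)
truth = np.outer(rng.uniform(1, 2, 8), rng.uniform(1, 2, 5))
mask = (rng.random(truth.shape) > 0.3).astype(float)
filled = pmf_impute(truth * mask, mask, rank=1)
```

The point of the sketch is the mask: only observed cells contribute to the loss, so the factorization generalizes the correlation structure to the empty cells.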
Calibration Under Uncertainty.
Swiler, Laura Painton; Trucano, Timothy Guy
2005-03-01
This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This report presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
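The classical formulation criticized above (parameters chosen to minimize the squared model-data difference, with the model treated as exact) can be sketched in a few lines; a toy one-parameter example, not taken from the report:

```python
import numpy as np

def calibrate(model, xs, ys, thetas):
    """Classical calibration: pick the parameter minimizing the sum of
    squared differences between model predictions and experimental data.
    No model-form error is acknowledged -- exactly the limitation that
    Calibration under Uncertainty (CUU) is meant to address."""
    sse = [np.sum((model(xs, t) - ys) ** 2) for t in thetas]
    return thetas[int(np.argmin(sse))]

model = lambda x, k: k * x                        # assumed "true" model
xs = np.linspace(0.0, 1.0, 20)
rng = np.random.default_rng(7)
ys = 3.0 * xs + 0.05 * rng.standard_normal(20)    # data with error bars
k_hat = calibrate(model, xs, ys, np.linspace(0.0, 5.0, 501))
```

A CUU treatment would instead return a distribution over k (and a discrepancy term), not the single point estimate this grid search produces.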
Equivalence of wave-particle duality to entropic uncertainty.
Coles, Patrick J; Kaniewski, Jedrzej; Wehner, Stephanie
2014-01-01
Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics. Furthermore, it leads to a robust framework for deriving novel WPDRs by applying entropic uncertainty relations to interferometric models. As an illustration, we derive a novel relation that captures the coherence in a quantum beam splitter. PMID:25524138
The equivalence principle in a quantum world
NASA Astrophysics Data System (ADS)
Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre
2015-09-01
We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).
Collective uncertainty entanglement test.
Rudnicki, Łukasz; Horodecki, Paweł; Zyczkowski, Karol
2011-10-01
For a given pure state of a composite quantum system we analyze the product of its projections onto a set of locally orthogonal separable pure states. We derive a bound for this product analogous to the entropic uncertainty relations. For bipartite systems the bound is saturated for maximally entangled states and it allows us to construct a family of entanglement measures, which we shall call collectibility. As these quantities are experimentally accessible, the approach advocated contributes to the task of experimental quantification of quantum entanglement, while for a three-qubit system it is capable of identifying genuine three-party entanglement.
Schwarzschild mass uncertainty
NASA Astrophysics Data System (ADS)
Davidson, Aharon; Yellin, Ben
2014-02-01
Applying Dirac's procedure to -dependent constrained systems, we derive a reduced total Hamiltonian, resembling an upside-down harmonic oscillator, which generates the Schwarzschild solution in the mini super-spacetime. Associated with the now -dependent Schrödinger equation is a tower of localized Guth-Pi-Barton wave packets, orthonormal and non-singular, admitting equally spaced average-'energy' levels. Our approach is characterized by a universal quantum mechanical uncertainty structure which enters the game already at the flat spacetime level, and accompanies the massive Schwarzschild sector for any arbitrary mean mass. The average black hole horizon surface area is linearly quantized.
Fundamental "Uncertainty" in Science
NASA Astrophysics Data System (ADS)
Reichl, Linda E.
The conference on "Uncertainty and Surprise" was concerned with our fundamental inability to predict future events. How can we restructure organizations to effectively function in an uncertain environment? One concern is that many large complex organizations are built on mechanical models, but mechanical models cannot always respond well to "surprises." An underlying assumption about mechanical models is that, if we give them enough information about the world, they will know the future accurately enough that there will be few or no surprises. The assumption is that the future is basically predictable and deterministic.
Picturing Data With Uncertainty
NASA Technical Reports Server (NTRS)
Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex
2004-01-01
NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, a multi-valued data represents both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary information for multimodal distributions. The uncertainty of the multi
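The "roughness" summary defined above (the number of peaks in a cell's distribution) can be sketched with a kernel density estimate; a hypothetical mini-implementation, not NASA's visualization code (the bandwidth rule and grid size are assumptions):

```python
import numpy as np

def roughness(samples, grid_points=256):
    """Count modes (peaks) of a Gaussian kernel density estimate.

    Bandwidth follows Silverman's rule of thumb; 'roughness' is the
    number of interior local maxima of the estimated PDF on the grid.
    """
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    bw = 1.06 * samples.std() * n ** (-0.2)   # Silverman's rule of thumb
    grid = np.linspace(samples.min(), samples.max(), grid_points)
    z = (grid[:, None] - samples[None, :]) / bw
    pdf = np.exp(-0.5 * z * z).sum(axis=1) / (n * bw * np.sqrt(2.0 * np.pi))
    peaks = (pdf[1:-1] > pdf[:-2]) & (pdf[1:-1] > pdf[2:])
    return int(peaks.sum())

rng = np.random.default_rng(0)
bimodal = np.concatenate([rng.normal(0, 0.5, 500), rng.normal(10, 0.5, 500)])
```

Evaluating this per grid cell along a slice yields exactly the per-PDF summary the wall display is meant to convey.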
Satellite altitude determination uncertainties
NASA Technical Reports Server (NTRS)
Siry, J. W.
1972-01-01
Satellite altitude determination uncertainties will be discussed from the standpoint of the GEOS-C satellite and from the longer range viewpoint afforded by the Geopause concept. Attention is focused on methods for short-arc tracking which are essentially geometric in nature. One method uses combinations of lasers and collocated cameras; the other relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.
Position-momentum uncertainty relations based on moments of arbitrary order
Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo; Dehesa, Jesus S.
2011-05-15
The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
Uncertain LDA: Including Observation Uncertainties in Discriminative Transforms.
Saeidi, Rahim; Astudillo, Ramon Fernandez; Kolossa, Dorothea
2016-07-01
Linear discriminant analysis (LDA) is a powerful technique in pattern recognition to reduce the dimensionality of data vectors. It maximizes discriminability by retaining only those directions that minimize the ratio of within-class and between-class variance. In this paper, using the same principles as for conventional LDA, we propose to employ uncertainties of the noisy or distorted input data in order to estimate maximally discriminant directions. We demonstrate the efficiency of the proposed uncertain LDA on two applications using state-of-the-art techniques. First, we experiment with an automatic speech recognition task, in which the uncertainty of observations is imposed by real-world additive noise. Next, we examine a full-scale speaker recognition system, considering the utterance duration as the source of uncertainty in authenticating a speaker. The experimental results show that when employing an appropriate uncertainty estimation algorithm, uncertain LDA outperforms its conventional LDA counterpart.
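One plausible reading of the idea, sketched below, is to let each observation's noise covariance inflate the within-class scatter before solving the usual generalized eigenproblem; the paper's actual formulation may differ, and all names and data here are illustrative:

```python
import numpy as np

def uncertain_lda(X, y, obs_cov, n_dims):
    """LDA directions with per-observation uncertainty (illustrative sketch).

    X: (n, d) noisy feature vectors; y: class labels;
    obs_cov: (n, d, d) noise covariance per observation. Here uncertainty
    simply inflates the within-class scatter -- one plausible reading of
    'uncertain LDA', not necessarily the paper's exact method.
    """
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for cls in classes:
        Xc = X[y == cls]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc) + obs_cov[y == cls].sum(axis=0)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    # maximize between/within ratio: generalized eigenproblem Sb w = l Sw w
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:n_dims]]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(2, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
obs_cov = np.tile(0.1 * np.eye(3), (100, 1, 1))   # isotropic noise per sample
W = uncertain_lda(X, y, obs_cov, 1)
print(W.shape)  # (3, 1)
```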
Principles of project management
NASA Technical Reports Server (NTRS)
1982-01-01
The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.
Intuitions, principles and consequences.
Shaw, A B
2001-02-01
Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences.
Chemical Principles Exemplified
ERIC Educational Resources Information Center
Plumb, Robert C.
1973-01-01
Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)
Identification of severe accident uncertainties
Rivard, J.B.; Behr, V.L.; Easterling, R.G.; Griesmeyer, J.M.; Haskin, F.E.; Hatch, S.W.; Kolaczkowski, A.M.; Lipinski, R.J.; Sherman, M.P.; Taig, A.R.
1984-09-01
Understanding of severe accidents in light-water reactors is currently beset with uncertainty. Because the uncertainties that are present limit the capability to analyze the progression and possible consequences of such accidents, they restrict the technical basis for regulatory actions by the US Nuclear Regulatory Commission (NRC). It is thus necessary to attempt to identify the sources and quantify the influence of these uncertainties. As a part of ongoing NRC severe-accident programs at Sandia National Laboratories, a working group was formed to pool relevant knowledge and experience in assessing the uncertainties attending present (1983) knowledge of severe accidents. This initial report of the Severe Accident Uncertainty Analysis (SAUNA) working group has as its main goal the identification of a consolidated list of uncertainties that affect in-plant processes and systems. Many uncertainties have been identified. A set of key uncertainties summarizes many of the identified uncertainties. Quantification of the influence of these uncertainties, a necessary second step, is not attempted in the present report, although attempts are made qualitatively to demonstrate the relevance of the identified uncertainties.
The maintenance of uncertainty
NASA Astrophysics Data System (ADS)
Smith, L. A.
Introduction Preliminaries State-space dynamics Linearized dynamics of infinitesimal uncertainties Instantaneous infinitesimal dynamics Finite-time evolution of infinitesimal uncertainties Lyapunov exponents and predictability The Baker's apprentice map Infinitesimals and predictability Dimensions The Grassberger-Procaccia algorithm Towards a better estimate from Takens' estimators Space-time-separation diagrams Intrinsic limits to the analysis of geometry Takens' theorem The method of delays Noise Prediction, prophecy, and pontification Introduction Simulations, models and physics Ground rules Data-based models: dynamic reconstructions Analogue prediction Local prediction Global prediction Accountable forecasts of chaotic systems Evaluating ensemble forecasts The annulus Prophecies Aids for more reliable nonlinear analysis Significant results: surrogate data, synthetic data and self-deception Surrogate data and the bootstrap Surrogate predictors: Is my model any good? Hints for the evaluation of new techniques Avoiding simple straw men Feasibility tests for the identification of chaos On detecting "tiny" data sets Building models consistent with the observations Cost functions ι-shadowing: Is my model any good? (reprise) Casting infinitely long shadows (out-of-sample) Distinguishing model error and system sensitivity Forecast error and model sensitivity Accountability Residual predictability Deterministic or stochastic dynamics? Using ensembles to distinguish the expectation from the expected Numerical Weather Prediction Probabilistic prediction with a deterministic model The analysis Constructing and interpreting ensembles The outlook(s) for today Conclusion Summary
Uncertainty in adaptive capacity
NASA Astrophysics Data System (ADS)
Adger, W. Neil; Vincent, Katharine
2005-03-01
The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).
Antarctic Photochemistry: Uncertainty Analysis
NASA Technical Reports Server (NTRS)
Stewart, Richard W.; McConnell, Joseph R.
1999-01-01
Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
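The temperature dependence of kinetic imprecision mentioned here is commonly encoded, for example in the JPL kinetics evaluations, as an uncertainty factor f(T) = f(298) · exp(g · |1/T − 1/298|); a sketch with made-up parameter values:

```python
import math

def uncertainty_factor(f298, g, T):
    """JPL-evaluation-style uncertainty factor for a rate coefficient.

    f(T) = f(298) * exp(g * |1/T - 1/298|); f298 and g are taken from
    kinetics evaluation tables (the values used below are illustrative only).
    """
    return f298 * math.exp(g * abs(1.0 / T - 1.0 / 298.0))

# illustrative reaction: 40% uncertainty at 298 K, g = 100 K;
# the factor grows at the low temperatures typical of Antarctic conditions
for T in (298.0, 273.0, 240.0):
    print(T, round(uncertainty_factor(1.4, 100.0, T), 3))
```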
Uncertainty in Wildfire Behavior
NASA Astrophysics Data System (ADS)
Finney, M.; Cohen, J. D.
2013-12-01
The challenge of predicting or modeling fire behavior is well recognized by scientists and managers who attempt predictions of fire spread rate or growth. At the scale of the spreading fire, the uncertainty in winds, moisture, fuel structure, and fire location make accurate predictions difficult, and the non-linear response of fire spread to these conditions means that average behavior is poorly represented by average environmental parameters. Even more difficult are estimations of threshold behaviors (e.g. spread/no-spread, crown fire initiation, ember generation and spotting) because the fire responds as a step-function to small changes in one or more environmental variables, translating to dynamical feedbacks and unpredictability. Recent research shows that ignition of fuel particles, itself a threshold phenomenon, depends on flame contact which is absolutely not steady or uniform. Recent studies of flame structure in both spreading and stationary fires reveal that much of the non-steadiness of the flames as they contact fuel particles results from buoyant instabilities that produce quasi-periodic flame structures. With fuel particle ignition produced by time-varying heating and short-range flame contact, future improvements in fire behavior modeling will likely require statistical approaches to deal with the uncertainty at all scales, including the level of heat transfer, the fuel arrangement, and weather.
Probabilistic Mass Growth Uncertainties
NASA Technical Reports Server (NTRS)
Plumer, Eric; Elliott, Darren
2013-01-01
Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
ERIC Educational Resources Information Center
Beim, George
This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…
Chemical Principles Exemplified
ERIC Educational Resources Information Center
Plumb, Robert C.
1970-01-01
This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…
Uncertainties in risk assessment at USDOE facilities
Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.
1994-01-01
The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.
Uncertainty relations as Hilbert space geometry
NASA Technical Reports Server (NTRS)
Braunstein, Samuel L.
1994-01-01
Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position, and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable as well as merely what is allowed.
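The classical multiple-sampling statistics invoked here rest on the familiar 1/√N shrinkage of the standard error over N repetitions, which a short simulation illustrates (the setup is generic, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean, sigma = 0.7, 1.0

observed = {}
for N in (100, 1_000, 10_000):
    # 200 independent repetitions of an N-sample mean estimate
    means = rng.normal(true_mean, sigma, size=(200, N)).mean(axis=1)
    observed[N] = means.std()
    # empirical spread of the estimator vs the sigma/sqrt(N) prediction
    print(N, observed[N], sigma / np.sqrt(N))
```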
Improvement of Statistical Decisions under Parametric Uncertainty
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis
2011-10-01
A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.
Uncertainty and precaution in environmental management.
Krayer von Krauss, M; van Asselt, M B A; Henze, M; Ravetz, J; Beck, M B
2005-01-01
In this paper, two different visions of the relationship between science and policy are contrasted with one another: the "modern" vision and the "precautionary" vision. Conditions which must apply in order to invoke the Precautionary Principle are presented, as are some of the main challenges posed by the principle. The following central question remains: If scientific certainty cannot be provided, what may then justify regulatory interventions, and what degree of intervention is justifiable? The notion of "quality of information" is explored, and it is emphasized that there can be no absolute definition of good or bad quality. Collective judgments of quality are only possible through deliberation on the characteristics of the information, and on the relevance of the information to the policy context. Reference to a relative criterion therefore seems inevitable and legal complexities are to be expected. Uncertainty is presented as a multidimensional concept, reaching far beyond the conventional statistical interpretation of the concept. Of critical importance is the development of methods for assessing qualitative categories of uncertainty. Model quality assessment should observe the following rationale: identify a model that is suited to the purpose, yet bears some reasonable resemblance to the "real" phenomena. In this context, "purpose" relates to the policy and societal contexts in which the assessment results are to be used. It is therefore increasingly agreed that judgment of the quality of assessments necessarily involves the participation of non-modellers and non-scientists. A challenging final question is: How to use uncertainty information in policy contexts? More research is required in order to answer this question.
Earthquake Loss Estimation Uncertainties
NASA Astrophysics Data System (ADS)
Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander
2013-04-01
The paper addresses the reliability of loss assessment following strong earthquakes, with worldwide systems applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case the information about possible damage and expected number of casualties is critical for taking decisions about search and rescue operations and offering humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the disaster scope. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about physical phenomena and uncertainties on the parameters used to describe them; global adequacy of modeling techniques to the actual physical phenomena; actual distribution of population at risk at the very time of the shaking (with respect to immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is not precisely known, by far. The paper analyzes the influence of uncertainties in strong event parameters determined by Alert Seismological Surveys, and of simulation models used at all stages from estimating shaking intensity
Uncertainty relation in Schwarzschild spacetime
NASA Astrophysics Data System (ADS)
Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng
2015-04-01
We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer; it could therefore be witnessed by a proper uncertainty game experimentally. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
A genetic uncertainty problem.
Tautz, D
2000-11-01
The existence of genes that, when knocked out, result in no obvious phenotype has puzzled biologists for many years. The phenomenon is often ascribed to redundancy in regulatory networks, caused by duplicated genes. However, a recent systematic analysis of data from the yeast genome projects does not support a link between gene duplications and redundancies. An alternative explanation suggests that genes might also evolve by very weak selection, which would mean that their true function cannot be studied in normal laboratory experiments. This problem is comparable to Heisenberg's uncertainty relationship in physics. It is possible to formulate an analogous relationship for biology, which, at its extreme, predicts that the understanding of the full function of a gene might require experiments on an evolutionary scale, involving the entire effective population size of a given species.
NASA Astrophysics Data System (ADS)
Petzinger, Tom
I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.
Rau, N.; Fong, C.C.; Grigg, C.H.; Silverstein, B.
1994-11-01
In the electric utility industry, only one thing can be guaranteed with absolute certainty: one lives and works with many unknowns. Thus, the industry has embraced probability methods to varying degrees over the last 25 years. These techniques aid decision makers in planning, operations, and maintenance by quantifying uncertainty. Examples include power system reliability, production costing simulation, and assessment of environmental factors. A series of brainstorming sessions was conducted by the Application of Probability Methods (APM) Subcommittee of the IEEE Power Engineering Society to identify research and development needs and to ask the question, "Where should we go from here?" The subcommittee examined areas of need in data development, applications, and methods for decision making. The purpose of this article is to share the thoughts of APM members with a broader audience, to report the findings, and to invite comments and participation.
Generalized uncertainty relations
NASA Astrophysics Data System (ADS)
Akten, Burcu Elif
1999-12-01
The Heisenberg uncertainty relation has been put into a stronger form by Schrödinger and Robertson. This inequality is also canonically invariant. We ask if there are other independent inequalities for higher orders. The aim is to find a systematic way for writing these inequalities. After an overview of the Heisenberg and Schrödinger-Robertson inequalities and their minimal states in Chapter 1, we start by constructing the higher order invariants in Chapter 2. We construct some of the simpler invariants by direct calculation, which suggests a schematic way of representing all invariants. Diagrams describing invariants help us see their structure and their symmetries immediately and various simplifications in their calculations are obtained as a result. With these new tools, a more systematic approach to construct and classify invariants using group theory is introduced next. In Chapter 4, various methods of obtaining higher order inequalities are discussed and compared. First, the original approach of HUR is applied to the next order and a new inequality is obtained by working in a specific frame where the expectation value tensor is in its simplest form. However, this method can not be used for higher orders as the significant simplifications of a specific frame is no longer available. The second method consists of working with a state vector written as a sum of the eigenvectors of the operator (qp)s and has a Gaussian distribution about the state which makes
Direct tests of measurement uncertainty relations: what it takes.
Busch, Paul; Stevens, Neil
2015-02-20
The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation, are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables. PMID:25763941
Parameter uncertainty for ASP models
Knudsen, J.K.; Smith, C.L.
1995-10-01
The steps involved to incorporate parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are composed of more than one component, which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
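A minimal sketch of the kind of propagation described, sampling lognormal basic-event probabilities (parameterized by median and error factor) through a toy accident sequence; the events and numbers are invented, not taken from the ASP models:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

def lognormal_from_median_ef(median, error_factor, size):
    """Sample a lognormal given its median and 95th-percentile error factor."""
    sigma = np.log(error_factor) / 1.645   # EF = exp(1.645 * sigma)
    return rng.lognormal(np.log(median), sigma, size)

# three hypothetical basic events (medians and error factors are made up)
a = lognormal_from_median_ef(1e-3, 3.0, n)
b = lognormal_from_median_ef(5e-4, 5.0, n)
c = lognormal_from_median_ef(2e-3, 3.0, n)

# toy sequence: (A AND B) OR C, rare-event approximation
top = a * b + c
print("median %.2e  mean %.2e  95th %.2e"
      % (np.median(top), top.mean(), np.quantile(top, 0.95)))
```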
Impact of discharge data uncertainty on nutrient load uncertainty
NASA Astrophysics Data System (ADS)
Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars
2016-04-01
Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorous and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorous and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
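The propagation chain described above can be sketched with a toy Monte Carlo: sample rating-curve parameters, convert a stage series to discharge realisations, and accumulate annual loads from interpolated monthly concentrations. This is a drastic simplification of the Voting Point workflow, and every number below is made up:

```python
import numpy as np

rng = np.random.default_rng(3)
n_curves, n_days = 2000, 365

# hypothetical power-law rating curve Q = a*(h - h0)^b with uncertain a, b
a = rng.normal(5.0, 0.4, n_curves)
b = rng.normal(1.8, 0.1, n_curves)
h0 = 0.2

# synthetic daily stage (m) and ~monthly concentration samples (mg/l),
# linearly interpolated to daily values as in the load methodology
h = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, n_days))
sample_days = np.arange(0, n_days, 30)
conc = np.interp(np.arange(n_days), sample_days,
                 rng.uniform(0.02, 0.08, sample_days.size))

# propagate each rating-curve realisation to an annual load (tonnes):
# m3/s * g/m3 * 86400 s/day * 1e-6 t/g
Q = a[:, None] * (h[None, :] - h0) ** b[:, None]
daily_load = Q * conc[None, :] * 86400 * 1e-6
annual = daily_load.sum(axis=1)
print("annual load: median %.1f t, 90%% interval [%.1f, %.1f] t"
      % (np.median(annual), np.quantile(annual, 0.05),
         np.quantile(annual, 0.95)))
```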
The precautionary principle and ecological hazards of genetically modified organisms.
Giampietro, Mario
2002-09-01
This paper makes three points relevant to the application of the precautionary principle to the regulation of GMOs. i) The unavoidable arbitrariness in the application of the precautionary principle reflects a deeper epistemological problem affecting scientific analyses of sustainability. This requires understanding the difference between the concepts of "risk", "uncertainty" and "ignorance". ii) When dealing with evolutionary processes it is impossible to ban uncertainty and ignorance from scientific models. Hence, traditional risk analysis (probability distributions and exact numerical models) becomes powerless. Other forms of scientific knowledge (general principles or metaphors) may be useful alternatives. iii) The existence of ecological hazards per se should not be used as a reason to stop innovations altogether. However, the precautionary principle entails that scientists move away from the concept of "substantive rationality" (trying to indicate to society optimal solutions) to that of "procedural rationality" (trying to help society to find "satisficing" solutions). PMID:12436844
Uncertainty quantification in lattice QCD calculations for nuclear physics
Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.
2015-02-05
The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. Here, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.
Uncertainty analysis of thermoreflectance measurements
NASA Astrophysics Data System (ADS)
Yang, Jia; Ziade, Elbara; Schmidt, Aaron J.
2016-01-01
We derive a generally applicable formula to calculate the precision of multi-parameter measurements that apply least squares algorithms. This formula, which accounts for experimental noise and uncertainty in the controlled model parameters, is then used to analyze the uncertainty of thermal property measurements with pump-probe thermoreflectance techniques. We compare the uncertainty of time domain thermoreflectance and frequency domain thermoreflectance (FDTR) when measuring bulk materials and thin films, considering simultaneous measurements of various combinations of thermal properties, including thermal conductivity, heat capacity, and thermal boundary conductance. We validate the uncertainty analysis using Monte Carlo simulations on data from FDTR measurements of an 80 nm gold film on fused silica.
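The linearized precision formula and its Monte Carlo validation can be illustrated on a toy two-parameter model. The exponential model, noise level, and parameter values below are assumptions for illustration, not the pump-probe thermoreflectance model from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a fitted measurement model: two parameters
# (amplitude A, decay rate k) estimated by least squares.
t = np.linspace(0.1, 5, 50)

def model(p, t):
    A, k = p
    return A * np.exp(-k * t)

p_true = np.array([1.0, 0.8])
sigma = 0.01  # assumed experimental noise level

# Linearized precision: cov(p) = sigma^2 * (J^T J)^{-1},
# with J the Jacobian of the model w.r.t. the parameters.
eps = 1e-6
J = np.column_stack([
    (model(p_true + eps * np.eye(2)[i], t) - model(p_true, t)) / eps
    for i in range(2)
])
cov_lin = sigma**2 * np.linalg.inv(J.T @ J)

# Monte Carlo check: refit many noisy realizations via the linearized
# normal equations (adequate near the optimum).
fits = []
for _ in range(2000):
    y = model(p_true, t) + rng.normal(0, sigma, t.size)
    dp, *_ = np.linalg.lstsq(J, y - model(p_true, t), rcond=None)
    fits.append(p_true + dp)
cov_mc = np.cov(np.array(fits).T)

print("linearized std:", np.sqrt(np.diag(cov_lin)))
print("Monte Carlo std:", np.sqrt(np.diag(cov_mc)))
```

The two standard deviations should agree closely, which is the kind of cross-check the abstract describes performing with Monte Carlo simulations on FDTR data.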
Uncertainty in Air Quality Modeling.
NASA Astrophysics Data System (ADS)
Fox, Douglas G.
1984-01-01
Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions. The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and should also strive to quantify that uncertainty. How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications. A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that
Simplified propagation of standard uncertainties
Shull, A.H.
1997-06-09
An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or can one calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
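A minimal sketch of the shortcut described above, for the common case of a standard prepared as a quotient of measured quantities: relative standard uncertainties combine in quadrature (for sums or differences, absolute uncertainties would). The instrument values and uncertainties are hypothetical:

```python
import math

# Standard concentration C = m / V (solute mass over solution volume).
m, u_m = 100.0, 0.05   # mg, absolute uncertainty of the balance (assumed)
V, u_V = 1.000, 0.002  # L, absolute uncertainty of the flask (assumed)

C = m / V

# For a quotient, relative uncertainties combine by root-sum-square,
# with no partial derivatives needed.
u_C_rel = math.sqrt((u_m / m) ** 2 + (u_V / V) ** 2)
u_C = C * u_C_rel

print(f"C = {C:.2f} +/- {u_C:.2f} mg/L (k=1)")
```

Grouping terms into relative (multiplicative) and absolute (additive) subgroups like this is the spreadsheet-friendly simplification the paper advocates.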
Quantum maximum entropy principle for fractional exclusion statistics.
Trovato, M; Reggiani, L
2013-01-11
Using the Wigner representation, compatibly with the uncertainty principle, we formulate a quantum maximum entropy principle for the fractional exclusion statistics. By considering anyonic systems satisfying fractional exclusion statistics, all the results available in the literature are generalized in terms of both the kind of statistics and a nonlocal description for excluson gases. Gradient quantum corrections are explicitly given at different levels of degeneracy and classical results are recovered when ℏ→0.
Analysis of Infiltration Uncertainty
R. McCurley
2003-10-27
The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the ''Total System Performance Assessment-License Application Methods and Approach'' (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein, and it is based on the use of the models developed in or for ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the
Antonopoulou, Lila; van Meurs, Philip
2003-11-01
The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty (as the preventive principle does) but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof based on the "cause-effect" model has been produced. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges: the case of "mad cow" disease, the case of production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health. PMID:14585517
Pandemic influenza: certain uncertainties
Morens, David M.; Taubenberger, Jeffery K.
2011-01-01
For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672
Physical principles of hearing
NASA Astrophysics Data System (ADS)
Martin, Pascal
2015-10-01
The following sections are included: * Psychophysical properties of hearing * The cochlear amplifier * Mechanosensory hair cells * The "critical" oscillator as a general principle of auditory detection * Bibliography
Mama Software Features: Uncertainty Testing
Ruggiero, Christy E.; Porter, Reid B.
2014-05-30
This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.
Housing Uncertainty and Childhood Impatience
ERIC Educational Resources Information Center
Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma
2011-01-01
The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…
Quantification of Emission Factor Uncertainty
Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...
Uncertainty in Integrated Assessment Scenarios
Mort Webster
2005-10-17
The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean
Participatory Development Principles and Practice: Reflections of a Western Development Worker.
ERIC Educational Resources Information Center
Keough, Noel
1998-01-01
Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)
Planning ATES systems under uncertainty
NASA Astrophysics Data System (ADS)
Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin
2015-04-01
Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions
Uncertainties in the Astronomical Ephemeris as Constraints on New Physics
NASA Astrophysics Data System (ADS)
Warecki, Zoey; Overduin, J.
2014-01-01
Most extensions of the standard model of particle physics predict composition-dependent violations of the universality of free fall (equivalence principle). We test this idea using observational uncertainties in mass, range and mean motion for the Moon and planets, as well as orbit uncertainties for Trojan asteroids and Saturnian satellites. For suitable pairs of solar-system bodies, we derive linearly independent constraints on relative difference in gravitational and inertial mass from modifications to Kepler's third law, the migration of stable Lagrange points, and orbital polarization (the Nordtvedt effect). These constraints can be combined with data on bulk composition to extract limits on violations of the equivalence principle for individual elements relative to one another. These limits are weaker than those from laboratory experiments, but span a much larger volume in composition space.
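A minimal sketch of how a Kepler's-third-law constraint of this kind arises (standard Newtonian reasoning under a stated assumption; the symbols δ, n, a are generic, not the paper's notation):

```latex
% For a body with gravitational-to-inertial mass ratio m_g/m_i = 1 + \delta
% orbiting a primary of mass M, the equation of motion
\ddot{\mathbf r} = -\frac{m_g}{m_i}\,\frac{GM}{r^3}\,\mathbf r
\quad\Longrightarrow\quad
n^2 a^3 = GM\,(1+\delta)
% modifies Kepler's third law. For two bodies orbiting the same primary,
% observational uncertainties in mean motion n and semimajor axis a then
% bound the composition-dependent difference:
\left|\delta_1 - \delta_2\right| \;\lesssim\;
2\,\frac{\Delta n}{n} \;+\; 3\,\frac{\Delta a}{a}.
```

This is why smaller ephemeris uncertainties translate directly into tighter equivalence-principle limits for pairs of solar-system bodies.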
Voith, V L
1986-12-01
This article discusses some general principles of learning as well as possible constraints and how such principles can apply to horses. A brief review is presented of experiments that were designed to assess learning in horses. The use of behavior modification techniques to treat behavior problems in horses is discussed and several examples of the use of these techniques are provided. PMID:3492241
Hamilton's Principle for Beginners
ERIC Educational Resources Information Center
Brun, J. L.
2007-01-01
I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…
Clouser, K D; Gert, B
1990-04-01
The authors use the term "principlism" to refer to the practice of using "principles" to replace both moral theory and particular moral rules and ideals in dealing with the moral problems that arise in medical practice. The authors argue that these "principles" do not function as claimed, and that their use is misleading both practically and theoretically. The "principles" are in fact not guides to action, but rather they are merely names for a collection of sometimes superficially related matters for consideration when dealing with a moral problem. The "principles" lack any systematic relationship to each other, and they often conflict with each other. These conflicts are unresolvable, since there is no unified moral theory from which they are all derived. For comparison the authors sketch the advantages of using a unified moral theory. PMID:2351895
How uncertainty bounds the shape index of simple cells.
Barbieri, D; Citti, G; Sarti, A
2014-04-17
We propose a theoretical motivation to quantify actual physiological features, such as the shape index distributions measured by Jones and Palmer in cats and by Ringach in macaque monkeys. We will adopt the uncertainty principle associated with the task of detection of position and orientation as the main tool to provide quantitative bounds on the family of simple cells concretely implemented in primary visual cortex. Mathematics Subject Classification (2010): 62P10, 43A32, 81R15.
The source dilemma hypothesis: Perceptual uncertainty contributes to musical emotion.
Bonin, Tanor L; Trainor, Laurel J; Belyk, Michel; Andrews, Paul W
2016-09-01
Music can evoke powerful emotions in listeners. Here we provide the first empirical evidence that the principles of auditory scene analysis and evolutionary theories of emotion are critical to a comprehensive theory of musical emotion. We interpret these data in light of a theoretical framework termed "the source dilemma hypothesis," which predicts that uncertainty in the number, identity or location of sound objects elicits unpleasant emotions by presenting the auditory system with an incoherent percept, thereby motivating listeners to resolve the auditory ambiguity. We describe two experiments in which source location and timbre were manipulated to change uncertainty in the auditory scene. In both experiments, listeners rated tonal and atonal melodies with congruent auditory scene cues as more pleasant than melodies with incongruent auditory scene cues. These data suggest that music's emotive capacity relies in part on the perceptual uncertainty it produces regarding the auditory scene.
Group environmental preference aggregation: the principle of environmental justice
Davos, C.A.
1986-01-01
The aggregation of group environmental preference presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as environmental justice, is established based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be decided so that their supporters will least mind being anyone at random in the new environment. The application of the principle is also discussed. Its only information requirement is a ranking of alternative choices by each interested party. 25 references.
Uncertainties of Mayak urine data
Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir
2008-01-01
For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24-h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore, this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
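The Poisson lognormal measurement model can be sketched numerically. The activity level below is hypothetical; only the ln(GSD) ≈ 0.33 is taken from the abstract's empirical 0.31 to 0.35 range:

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed count = Poisson variate about (true activity x lognormal
# normalization factor); illustrative numbers throughout.
true_activity = 50.0   # expected counts from the true excretion (assumed)
ln_gsd = 0.33          # ln of the normalization geometric SD (abstract)

n = 100_000
norm = rng.lognormal(mean=0.0, sigma=ln_gsd, size=n)
counts = rng.poisson(true_activity * norm)

# Total relative uncertainty: Poisson counting term 1/sqrt(N) combined
# with the lognormal normalization term, approximately in quadrature.
rel_total = counts.std() / counts.mean()
rel_expected = np.sqrt(1.0 / true_activity + (np.exp(ln_gsd**2) - 1))

print(f"simulated relative SD: {rel_total:.3f}, "
      f"quadrature estimate: {rel_expected:.3f}")
```

The simulation illustrates why, at moderate count levels, the normalization term dominates the counting term in the overall measurement uncertainty.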
Credible Computations: Standard and Uncertainty
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)
1995-01-01
The discipline of computational fluid dynamics (CFD) is at a crossroad. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is a consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing the credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products (computer codes and computed results) expects the following: a computer code, in terms of its logic, numerics, and fluid dynamics, and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled through verification and validation of these requirements. The verification process assesses whether the problem is solved correctly and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and the results of this analysis highlight which computed quantities and integrated quantities in computations need to be determined accurately and which quantities do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties
Uncertainty of testing methods--what do we (want to) know?
Paparella, Martin; Daneshian, Mardas; Hornek-Gausterer, Romana; Kinzl, Maximilian; Mauritz, Ilse; Mühlegger, Simone
2013-01-01
It is important to stimulate innovation for regulatory testing methods. Scrutinizing the knowledge of (un)certainty of data from actual standard in vivo methods could foster interest in new testing approaches. Since standard in vivo data often are used as reference data for model development, improved uncertainty accountability also would support the validation of new in vitro and in silico methods, as well as the definition of acceptance criteria for the new methods. Hazard and risk estimates that are transparent about their uncertainty could further support the 3Rs, since they may help focus additional information requirements on aspects of highest uncertainty. Here we provide an overview on the various types of uncertainties in quantitative and qualitative terms and suggest improving this knowledge base. We also reference principle concepts on how to use uncertainty information for improved hazard characterization and development of new testing methods.
PIV uncertainty quantification by image matching
NASA Astrophysics Data System (ADS)
Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio
2013-04-01
A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the
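The disparity-vector statistics at the core of the method can be sketched as follows. This is not the full image-matching pipeline; the particle count, bias, and noise values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Within one interrogation window, each matched particle image pair
# leaves a residual disparity vector d_i (in pixels). The ensemble mean
# indicates a systematic error; the dispersion of the mean gives the
# random error of the velocity estimate.
n_particles = 30
bias = np.array([0.05, -0.02])  # hypothetical systematic error (px)
noise = 0.10                    # per-particle matching noise (px)
disparity = bias + rng.normal(0, noise, (n_particles, 2))

mean_disp = disparity.mean(axis=0)  # systematic error estimate
rand_err = disparity.std(axis=0, ddof=1) / np.sqrt(n_particles)

print("systematic error estimate (px):", mean_disp)
print("random error estimate (px):", rand_err)
```

Repeating this per interrogation window yields the instantaneous spatial distribution of the estimated measurement error that the abstract describes.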
Visualizing uncertainty about the future.
Spiegelhalter, David; Pearson, Mike; Short, Ian
2011-09-01
We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge.
Uncertainty analysis of thermoreflectance measurements.
Yang, Jia; Ziade, Elbara; Schmidt, Aaron J
2016-01-01
We derive a generally applicable formula to calculate the precision of multi-parameter measurements that apply least squares algorithms. This formula, which accounts for experimental noise and uncertainty in the controlled model parameters, is then used to analyze the uncertainty of thermal property measurements with pump-probe thermoreflectance techniques. We compare the uncertainty of time domain thermoreflectance and frequency domain thermoreflectance (FDTR) when measuring bulk materials and thin films, considering simultaneous measurements of various combinations of thermal properties, including thermal conductivity, heat capacity, and thermal boundary conductance. We validate the uncertainty analysis using Monte Carlo simulations on data from FDTR measurements of an 80 nm gold film on fused silica. PMID:26827342
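The paper's Monte Carlo validation strategy can be illustrated in the simplest least-squares setting. The sketch below is not the authors' multi-parameter formula; it compares the empirical scatter of a fitted slope across simulated noisy data sets with the textbook analytic uncertainty sigma / sqrt(Sxx), the same kind of cross-check the abstract describes.

```python
import math
import random
import statistics

def fit_slope(xs, ys):
    """Ordinary least-squares slope of y = a + b*x; also returns Sxx."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    return b, sxx

random.seed(42)
xs = [0.1 * i for i in range(50)]
sigma = 0.2                      # known measurement noise
slopes = []
for _ in range(2000):            # Monte Carlo replications
    ys = [1.0 + 2.0 * x + random.gauss(0.0, sigma) for x in xs]
    slopes.append(fit_slope(xs, ys)[0])

_, sxx = fit_slope(xs, xs)           # Sxx depends on xs only
analytic = sigma / math.sqrt(sxx)    # textbook slope uncertainty
empirical = statistics.stdev(slopes)
```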
Estimations of uncertainties of frequencies
NASA Astrophysics Data System (ADS)
Eyer, Laurent; Nicoletti, Jean-Marc; Morgenthaler, Stephan
2015-08-01
Diverse variable phenomena in the Universe are periodic. Astonishingly, many of the periodic signals present in stars have timescales coinciding with human ones (from minutes to years). The periods of signals often have to be deduced from time series which are irregularly sampled and sparse; furthermore, correlations between the brightness measurements and their estimated uncertainties are common. The uncertainty on the frequency estimation is reviewed. We explore the astronomical and statistical literature for both regular and irregular samplings. The frequency uncertainty depends on the signal-to-noise ratio, the frequency, and the observational timespan. The shape of the light curve also intervenes, since sharp features such as exoplanet transits, stellar eclipses, and the rising branches of pulsating stars give stringent constraints. We propose several procedures (parametric and nonparametric) to estimate the uncertainty on the frequency, which are subsequently tested against simulated data to assess their performance.
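For the regularly sampled case, one widely quoted analytic approximation (in the spirit of the parametric procedures the abstract mentions) relates the frequency uncertainty to exactly the quantities listed above: the noise-to-amplitude ratio, the number of points N, and the timespan T. The formula below is a standard textbook result, not taken from this paper, and holds for a single sinusoid in white Gaussian noise.

```python
import math

def sinusoid_frequency_uncertainty(amplitude, noise_sigma, n_points, timespan):
    """Approximate 1-sigma frequency uncertainty of a sinusoid fitted to
    N uniformly sampled points with white Gaussian noise:

        sigma_f ~ sqrt(6 / N) * sigma / (pi * A * T)

    It captures the dependencies named in the abstract: signal-to-noise
    (sigma / A), the number of points N, and the timespan T.
    """
    return (math.sqrt(6.0 / n_points) * noise_sigma
            / (math.pi * amplitude * timespan))

# Doubling the timespan halves the frequency uncertainty:
u1 = sinusoid_frequency_uncertainty(1.0, 1.0, 100, 10.0)
u2 = sinusoid_frequency_uncertainty(1.0, 1.0, 100, 20.0)
```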
Climate Projections and Uncertainty Communication.
Joslyn, Susan L; LeClerc, Jared E
2016-01-01
Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995
Maximum predictive power and the superposition principle
NASA Technical Reports Server (NTRS)
Summhammer, Johann
1994-01-01
In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.
Uncertainty-induced quantum nonlocality
NASA Astrophysics Data System (ADS)
Wu, Shao-xiong; Zhang, Jun; Yu, Chang-shui; Song, He-shan
2014-01-01
Based on the skew information, we present a quantity, uncertainty-induced quantum nonlocality (UIN), to measure the quantum correlation. It can be considered as an updated version of the original measurement-induced nonlocality (MIN), preserving its good computability but eliminating the non-contractivity problem. For 2×d-dimensional states, it is shown that UIN can be given in a closed form. In addition, we also investigate the maximal uncertainty-induced nonlocality.
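The building block here, the Wigner-Yanase skew information I(rho, K) = -1/2 Tr([sqrt(rho), K]^2), can be computed directly for qubits. The sketch below is illustrative only (it does not reproduce the paper's closed-form UIN expression) and uses a hand-rolled 2x2 eigendecomposition for the matrix square root; for a pure state the skew information reduces to the ordinary variance, and for the maximally mixed state it vanishes.

```python
import math

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

def sqrt_2x2_density(rho):
    """Square root of a 2x2 density matrix via its eigenvalues.
    Writes sqrt(rho) = p*I + q*rho, solving sqrt(lam) = p + q*lam
    for the two eigenvalues lam1, lam2."""
    a, c = rho[0][0].real, rho[1][1].real
    disc = math.sqrt(((a - c) / 2.0) ** 2 + abs(rho[0][1]) ** 2)
    lam1, lam2 = (a + c) / 2.0 + disc, (a + c) / 2.0 - disc
    lam2 = max(lam2, 0.0)                      # clip tiny negatives
    if abs(lam1 - lam2) < 1e-12:               # degenerate: rho = lam*I
        s = math.sqrt(lam1)
        return [[s, 0.0], [0.0, s]]
    q = (math.sqrt(lam1) - math.sqrt(lam2)) / (lam1 - lam2)
    p = math.sqrt(lam1) - q * lam1
    return [[p + q * rho[0][0], q * rho[0][1]],
            [q * rho[1][0], p + q * rho[1][1]]]

def skew_information(rho, K):
    """Wigner-Yanase skew information I(rho, K) = -1/2 Tr([sqrt(rho), K]^2)."""
    r = sqrt_2x2_density(rho)
    comm = mat_sub(mat_mul(r, K), mat_mul(K, r))
    sq = mat_mul(comm, comm)
    return -0.5 * (sq[0][0] + sq[1][1]).real

sigma_x = [[0.0, 1.0], [1.0, 0.0]]
rho_pure = [[1.0, 0.0], [0.0, 0.0]]    # |0><0|: skew info = variance = 1
rho_mixed = [[0.5, 0.0], [0.0, 0.5]]   # I/2: commutes, skew info = 0
```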
Dynamical Realism and Uncertainty Propagation
NASA Astrophysics Data System (ADS)
Park, Inkwan
In recent years, Space Situational Awareness (SSA) has become increasingly important as the number of tracked Resident Space Objects (RSOs) continues to grow. One of the most significant technical discussions in SSA is how to propagate state uncertainty in a way consistent with the highly nonlinear dynamical environment. To keep pace with this situation, various methods have been proposed to propagate uncertainty accurately by capturing the nonlinearity of the dynamical system. We notice that all of these methods focus on describing the dynamical system as precisely as possible from a mathematical perspective. This study proposes a new perspective based on understanding the dynamics of the evolution of uncertainty itself. We expect that profound insight into the dynamical system could open the possibility of developing a new method for accurate uncertainty propagation. These considerations lead naturally to the goals of the study. First, we investigate the most dominant factors in the evolution of uncertainty in order to realize the dynamical system more rigorously. Second, we aim to develop a new method, based on the first investigation, that enables efficient orbit uncertainty propagation while maintaining accuracy. We eliminate the short-period variations from the dynamical system, yielding a simplified dynamical system (SDS), to investigate the most dominant factors. To achieve this goal, the Lie transformation method is introduced, since this transformation can define the solutions for each variation separately. From the first investigation, we conclude that the secular variations, including the long-period variations, are dominant for the propagation of uncertainty, i.e., short-period variations are negligible. Then, we develop the new method by combining the SDS and a higher-order nonlinear expansion method, state transition tensors (STTs). The new method retains the advantages of the SDS and the STTs and propagates
Archimedes' Principle in Action
ERIC Educational Resources Information Center
Kires, Marian
2007-01-01
The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)
Chemical Principles Exemplified
ERIC Educational Resources Information Center
Plumb, Robert C.
1972-01-01
Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)
Archimedes' principle in action
NASA Astrophysics Data System (ADS)
Kireš, Marián
2007-09-01
The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers.
Uncertainty of empirical correlation equations
NASA Astrophysics Data System (ADS)
Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.
2016-08-01
The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
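The key step described above, propagating the data covariance into a parameter covariance matrix, can be shown in miniature. The sketch below is a straight-line fit with a diagonal data covariance (weighted least squares, the simplest special case of GLS; full GLS replaces the per-point weights by the inverse of a full covariance matrix). The parameter covariance is (X^T W X)^{-1}, from which uncertainties of derived quantities could then be propagated.

```python
def wls_line_fit(xs, ys, variances):
    """Weighted least-squares fit of y = a + b*x.

    Returns the fitted parameters (a, b) and their covariance matrix
    (X^T W X)^{-1}, where W is the inverse of the (diagonal) data
    covariance.  Full GLS generalizes W to a full inverse covariance.
    """
    w = [1.0 / v for v in variances]
    S = sum(w)
    Sx = sum(wi * x for wi, x in zip(w, xs))
    Sy = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / det
    b = (S * Sxy - Sx * Sy) / det
    cov = [[Sxx / det, -Sx / det],      # parameter covariance matrix
           [-Sx / det, S / det]]
    return (a, b), cov

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]              # exactly y = 1 + 2x
params, cov = wls_line_fit(xs, ys, [0.1] * 4)
```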
Wildfire Decision Making Under Uncertainty
NASA Astrophysics Data System (ADS)
Thompson, M.
2013-12-01
Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
The Bayesian brain: phantom percepts resolve sensory uncertainty.
De Ridder, Dirk; Vanneste, Sven; Freeman, Walter
2014-07-01
Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises: why does the brain create these false percepts in the absence of an external stimulus? The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they are not worthy of conscious processing. Unpredictable things, on the other hand, are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience and stress and essential for stimulus detection. Associated with sensory cortex hyperactivity and decreased inhibition or map plasticity, this will result in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation. In conclusion, the
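The "probability machine" picture, predict, compare with the senses, update, has a one-line quantitative core. This is a generic Gaussian Bayesian update, not the paper's neurobiological model: the posterior mean shifts by a precision-weighted prediction error, and the posterior variance (the remaining uncertainty) shrinks with each observation.

```python
def bayes_update(prior_mean, prior_var, obs, obs_var):
    """One-step Gaussian Bayesian update.

    gain             : how much the observation is trusted relative to
                       the prior (precision weighting).
    prediction_error : the mismatch that drives the belief update.
    """
    gain = prior_var / (prior_var + obs_var)
    prediction_error = obs - prior_mean
    post_mean = prior_mean + gain * prediction_error
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Equally trusted prior and observation: the belief moves halfway,
# and the uncertainty halves.
mean, var = bayes_update(0.0, 1.0, 1.0, 1.0)
```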
Principles of Tendon Transfer.
Wilbur, Danielle; Hammert, Warren C
2016-08-01
Tendon transfers provide a substitute, either temporary or permanent, when function is lost due to neurologic injury in stroke, cerebral palsy or central nervous system lesions, peripheral nerve injuries, or injuries to the musculotendinous unit itself. This article reviews the basic principles of tendon transfer, which are important when planning surgery and essential for an optimal outcome. In addition, concepts for coapting the tendons during surgery and general principles to be followed during the rehabilitation process are discussed. PMID:27387072
Structural model uncertainty in stochastic simulation
McKay, M.D.; Morrison, J.D.
1997-09-01
Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.
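The lower two levels of the hierarchy above, stochastic variability and parameter uncertainty, can be separated numerically with a nested Monte Carlo and the law of total variance. The sketch below uses a toy model (Gaussian output around an uncertain parameter; all numbers are illustrative) and does not attempt the top level, structural model uncertainty, which is the paper's point of departure.

```python
import random
import statistics

random.seed(7)

def inner_model(theta, n_runs, noise_sd=0.5):
    """Stochastic simulation runs for a fixed parameter value theta."""
    return [random.gauss(theta, noise_sd) for _ in range(n_runs)]

# Outer loop: parameter uncertainty; inner loop: stochastic variability.
outer_means, inner_vars = [], []
for _ in range(400):
    theta = random.gauss(0.0, 1.0)           # uncertain input parameter
    runs = inner_model(theta, 100)
    outer_means.append(statistics.mean(runs))
    inner_vars.append(statistics.variance(runs))

# Law of total variance: Var(Y) = E[Var(Y|theta)] + Var(E[Y|theta])
stochastic_part = statistics.mean(inner_vars)      # ~ 0.5**2 = 0.25
parameter_part = statistics.variance(outer_means)  # ~ 1.0
total = stochastic_part + parameter_part           # ~ 1.25
```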
ERIC Educational Resources Information Center
Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.
2002-01-01
Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…
Uncertainty estimation using fuzzy measures for multiclass classification.
Graves, Kynan E; Nagarajah, Romesh
2007-01-01
Uncertainty arises in classification problems when the input pattern is not perfect or measurement error is unavoidable. In many applications, it would be beneficial to obtain an estimate of the uncertainty associated with a new observation and its membership within a particular class. Although statistical classification techniques base decision boundaries according to the probability distributions of the patterns belonging to each class, they are poor at supplying uncertainty information for new observations. Previous research has documented a multiarchitecture, monotonic function neural network model for the representation of uncertainty associated with a new observation for two-class classification. This paper proposes a modification to the monotonic function model to estimate the uncertainty associated with a new observation for multiclass classification. The model, therefore, overcomes a limitation of traditional classifiers that base decisions on sharp classification boundaries. As such, it is believed that this method will have advantages for applications such as biometric recognition in which the estimation of classification uncertainty is an important issue. This approach is based on the transformation of the input pattern vector relative to each classification class. Separate, monotonic, single-output neural networks are then used to represent the "degree-of-similarity" between each input pattern vector and each class. An algorithm for the implementation of this approach is proposed and tested with publicly available face-recognition data sets. The results indicate that the suggested approach provides similar classification performance to conventional principal component analysis (PCA) and linear discriminant analysis (LDA) techniques for multiclass pattern recognition problems as well as providing uncertainty information caused by misclassification.
Uncertainty in perception and the Hierarchical Gaussian Filter.
Mathys, Christoph D; Lomakina, Ekaterina I; Daunizeau, Jean; Iglesias, Sandra; Brodersen, Kay H; Friston, Karl J; Stephan, Klaas E
2014-01-01
In its full sense, perception rests on an agent's model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the Hierarchical Gaussian Filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF's hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder-Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient, but at the same time intuitive, framework for the resolution of perceptual uncertainty in behaving agents. PMID:25477800
RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY
Salaymeh, S.; Ashley, W.; Jeffcoat, R.
2010-06-17
It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in the nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.
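The arithmetic core of an uncertainty budget of the kind discussed above is the GUM-style combination of independent components in quadrature, followed by an expanded uncertainty with a coverage factor. The component values below are illustrative placeholders, not from any actual NDA budget.

```python
import math

def combined_standard_uncertainty(components):
    """GUM-style combination of independent uncertainty components,
    each given as (sensitivity_coefficient, standard_uncertainty):

        u_c = sqrt(sum((c_i * u_i)**2))
    """
    return math.sqrt(sum((c * u) ** 2 for c, u in components))

def expanded_uncertainty(u_c, k=2.0):
    """Expanded uncertainty U = k * u_c; k = 2 gives ~95% coverage
    when the output distribution is approximately normal."""
    return k * u_c

# Hypothetical budget: (sensitivity, standard uncertainty) per component,
# e.g. counting statistics and calibration.
budget = [(1.0, 0.3), (2.0, 0.2)]
u_c = combined_standard_uncertainty(budget)   # sqrt(0.09 + 0.16) = 0.5
U = expanded_uncertainty(u_c)                 # 1.0
```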
Uncertainty limits for the macroscopic elastic moduli of random polycrystalline aggregates
NASA Astrophysics Data System (ADS)
Chinh, Pham Duc
2000-08-01
Practical polycrystalline aggregates are expected to have macroscopic properties that depend upon the properties of constituent crystals and the aggregate geometry. Since that microgeometry is usually random, there will be some uncertainty in the observed macroscopic behavior of the aggregates. The general shape-independent upper and lower estimates for those uncertainty intervals for the elastic moduli of completely random polycrystals are constructed from the minimum energy and complementary energy principles. Applications to aggregates of cubic crystals are presented.
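The classical starting point for such bounds, also derived from the minimum-energy and complementary-energy principles, is the Voigt/Reuss pair; the paper constructs tighter, shape-independent estimates, which are not reproduced here. The sketch below computes the standard Voigt (upper) and Reuss (lower) bounds on the shear modulus of a random aggregate of cubic crystals (the bulk modulus of a cubic aggregate is the same in both bounds). The copper constants are standard literature values used for illustration.

```python
def cubic_voigt_reuss(c11, c12, c44):
    """Voigt (upper) and Reuss (lower) bounds on the effective shear
    modulus of a random aggregate of cubic crystals, plus the bulk
    modulus, which is exact for cubic symmetry."""
    k = (c11 + 2.0 * c12) / 3.0
    cp = c11 - c12                       # 2x the tetragonal shear constant
    g_voigt = (cp + 3.0 * c44) / 5.0
    g_reuss = 5.0 * cp * c44 / (4.0 * c44 + 3.0 * cp)
    return k, g_voigt, g_reuss

# Copper (GPa): strongly anisotropic, so the uncertainty interval is wide.
k, g_v, g_r = cubic_voigt_reuss(168.4, 121.4, 75.4)
```

Any observed macroscopic shear modulus of the random aggregate must fall between g_r and g_v, which is exactly the kind of uncertainty interval the abstract describes.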
On the Minimal Length Uncertainty Relation and the Foundations of String Theory
Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; Takeuchi, Tatsu
2011-01-01
We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.
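A common concrete form of the minimal length uncertainty relation, which we take as an assumption here (ħ = 1 and the deformation parameter β = 0.04 are purely illustrative), follows from the modified commutator [x, p] = iħ(1 + βp²): Δx ≥ (ħ/2)(1/Δp + βΔp). Unlike the ordinary Heisenberg relation, the bound cannot be pushed to zero by letting Δp grow; it has a minimum Δx_min = ħ√β at Δp = 1/√β.

```python
import math

def gup_position_bound(dp, hbar=1.0, beta=0.04):
    """Lower bound on position uncertainty from a common generalized
    uncertainty principle (GUP):

        dx >= (hbar / 2) * (1/dp + beta * dp)

    For beta = 0 this reduces to the ordinary Heisenberg bound."""
    return 0.5 * hbar * (1.0 / dp + beta * dp)

# Scan dp: the bound has a floor dx_min = hbar * sqrt(beta) at
# dp = 1/sqrt(beta), i.e. a minimal observable length.
dps = [0.1 * i for i in range(1, 200)]
dx_min = min(gup_position_bound(dp) for dp in dps)
```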
The traveltime holographic principle
NASA Astrophysics Data System (ADS)
Huang, Yunsong; Schuster, Gerard T.
2015-01-01
Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the `traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
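The construction can be illustrated in the simplest possible setting. The sketch below assumes a homogeneous medium and a circular boundary of sources, and uses a triangle-inequality (maximum-of-differences) reading of Fermat's interferometric principle: since τsq − τsp ≤ τpq, with equality when p lies on the ray from s to q, the interior time is recovered as the maximum of the exterior traveltime differences over boundary sources. All coordinates and the velocity are illustrative, not from the paper.

```python
import math

VELOCITY = 2.0   # homogeneous medium velocity (illustrative)

def travel_time(p, q):
    """Direct traveltime between two points in a homogeneous medium."""
    return math.dist(p, q) / VELOCITY

# Sources s on a circular boundary B enclosing the interior points.
boundary = [(10.0 * math.cos(2.0 * math.pi * k / 720),
             10.0 * math.sin(2.0 * math.pi * k / 720)) for k in range(720)]

def interior_time_from_boundary(p, q):
    """Recover the interior traveltime tau_pq from exterior times only:
    tau_pq = max over boundary sources s of (tau_sq - tau_sp).
    No ray tracing between interior points is performed."""
    return max(travel_time(s, q) - travel_time(s, p) for s in boundary)

p, q = (1.0, 0.0), (-2.0, 1.0)
tau_est = interior_time_from_boundary(p, q)
tau_true = travel_time(p, q)        # ground truth for this toy medium
```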
Communicating Uncertainties on Climate Change
NASA Astrophysics Data System (ADS)
Planton, S.
2009-09-01
The term 'uncertainty' in common language is confusing, since in one of its most usual senses it refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion through the media to a wide public. From its scientific basis to impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Faced with this diversity of approaches, communicating uncertainties on climate change is a great challenge. It is further complicated by the diversity of the targets of communication on climate change, from stakeholders and policy makers to the wide public. We will present the process chosen by the IPCC to communicate uncertainties in its assessment reports, taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples illustrating how to avoid the above-mentioned ambiguity when dealing with this kind of communication.
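The guidance note mentioned above maps assessed probabilities to calibrated likelihood language, so that every author uses the same words for the same probability ranges. A sketch of that mapping, following the AR4 likelihood scale:

```python
def ipcc_likelihood(p):
    """Map a probability of occurrence to the calibrated likelihood
    language of the IPCC AR4 uncertainty guidance note."""
    if p > 0.99:
        return "virtually certain"
    if p > 0.90:
        return "very likely"
    if p > 0.66:
        return "likely"
    if p >= 0.33:
        return "about as likely as not"
    if p >= 0.10:
        return "unlikely"
    if p >= 0.01:
        return "very unlikely"
    return "exceptionally unlikely"
```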
Spaceborne receivers: Basic principles
NASA Technical Reports Server (NTRS)
Stacey, J. M.
1984-01-01
The underlying principles of operation of microwave receivers for space observations of planetary surfaces are examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principles of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.
Sub-Heisenberg phase uncertainties
NASA Astrophysics Data System (ADS)
Pezzé, Luca
2013-12-01
Phase shift estimation with uncertainty below the Heisenberg limit, Δϕ_HL ∝ 1/N̄_T, where N̄_T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ ∝ 1/N̄_T^k, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility of overcoming Δϕ_HL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.
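The role of bias here has a simple classical-statistics analogy (this is not the paper's quantum computation): a deliberately biased estimator can show a variance below the bound that applies to unbiased estimators, yet its mean-squared error, which accounts for the bias, does not beat the bound. The sketch shrinks a sample mean toward zero and checks both quantities by Monte Carlo.

```python
import random
import statistics

random.seed(3)

def shrunk_mean(samples, a=0.5):
    """A deliberately biased estimator: shrink the sample mean by a."""
    return a * statistics.mean(samples)

mu, sigma, n = 1.0, 1.0, 4
unbiased_bound = sigma ** 2 / n     # variance bound for unbiased estimators

estimates = []
for _ in range(20000):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(shrunk_mean(xs))

var = statistics.variance(estimates)              # ~ a^2 sigma^2 / n < bound
mse = statistics.mean((e - mu) ** 2 for e in estimates)   # var + bias^2 > bound
```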
Uncertainty and Sensitivity Analyses Plan
Simpson, J.C.; Ramsdell, J.V. Jr.
1993-04-01
Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
Fertility behaviour under income uncertainty.
Ranjan, P
1999-03-01
A two-period stochastic model of fertility behavior was developed in order to provide an explanation for the staggering decrease in birth rates in former Soviet Republics and Eastern European countries. A link between income uncertainty and fertility behavior was proposed. The increase in uncertainty about future income could lead people to postpone their childbearing decision. This is attributable to the irreversibility of the childbearing decision and the ease with which it may be postponed. A threshold effect is the result, so that individuals above the threshold level of income tend to have a stronger desire to have a child immediately, and those below the threshold tend to wait until the income uncertainty is past. This behavioral pattern could account for the recent decline in birth rates that has accompanied a decreasing per capita income level in most of the former Soviet Republics and the East European countries.
Uncertainty formulations for multislit interferometry
NASA Astrophysics Data System (ADS)
Biniok, Johannes C. G.
2014-12-01
In the context of (far-field) multislit interferometry we investigate the utility of two formulations of uncertainty in accounting for the complementarity of spatial localization and fringe width. We begin with a characterization of the relevant observables and general considerations regarding the suitability of different types of measures. The detailed analysis shows that both uncertainty formulations yield qualitatively similar results, confirming that they correctly capture the relevant tradeoff. One approach, based on an idea of Aharonov and co-workers, is intuitively appealing and relies on a modification of the Heisenberg uncertainty relation. The other approach, developed by Uffink and Hilgevoord for single- and double-slit experiments, is readily applied to multislits. However, one of its underlying concepts is found to require generalization, and the choice of parameters demands more care than previously recognized.
NASA Technical Reports Server (NTRS)
Hankins, D. B.; Wake, W. H.
1981-01-01
The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.
Itch Management: General Principles.
Misery, Laurent
2016-01-01
Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to get an appropriate diagnosis to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause, the etiology is undetermined, there are several causes, or the etiological treatment is not effective enough to alleviate itch completely. This is also why there is need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. PMID:27578069
Uncertainties in offsite consequence analysis
Young, M.L.; Harper, F.T.; Lui, C.H.
1996-03-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in, and the results of, this ongoing joint effort.
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
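The notion of transmitted variation can be illustrated with a short Monte Carlo sketch; the response function and input distributions below are illustrative assumptions, not taken from the presentation.

```python
import random
import statistics

# Hypothetical response function (an assumption for illustration):
# the output depends nonlinearly on the two inputs.
def response(x, y):
    return 3.0 * x + x * y

random.seed(0)

# Variation present in the input variables...
xs = [random.gauss(10.0, 0.5) for _ in range(100_000)]
ys = [random.gauss(2.0, 0.1) for _ in range(100_000)]

# ...is transmitted via the response function to the output variable.
zs = [response(x, y) for x, y in zip(xs, ys)]

print(f"input  sd(x) = {statistics.stdev(xs):.3f}")
print(f"output sd(z) = {statistics.stdev(zs):.3f}")  # larger than sd(x): variation is amplified
```

Comparing the input and output standard deviations shows how much variation the response function transmits and amplifies; subsequent analyses that ignore this inflated output uncertainty will understate their error.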
Awe, uncertainty, and agency detection.
Valdesolo, Piercarlo; Graham, Jesse
2014-01-01
Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events. PMID:24247728
Linear Programming Problems for Generalized Uncertainty
ERIC Educational Resources Information Center
Thipwiwatpotjana, Phantipa
2010-01-01
Uncertainty occurs when more than one realization can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty, and their relationships, are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…
NASA Astrophysics Data System (ADS)
Fan, Ya-Jing; Cao, Huai-Xin; Meng, Hui-Xian; Chen, Liang
2016-09-01
The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. In this paper, we prove a Schrödinger-type uncertainty relation in terms of generalized metric adjusted skew information and correlation measure by using operator monotone functions, which reads $U_\rho^{(g,f)}(A)\,U_\rho^{(g,f)}(B) \ge \frac{f(0)^2 l}{k}\bigl|\mathrm{Corr}_\rho^{s(g,f)}(A,B)\bigr|^2$ for some operator monotone functions $f$ and $g$, all $n$-dimensional observables $A$, $B$, and a non-singular density matrix $\rho$. As applications, we derive some new uncertainty relations for the Wigner-Yanase skew information and the Wigner-Yanase-Dyson skew information.
Fine-grained lower limit of entropic uncertainty in the presence of quantum memory.
Pramanik, T; Chowdhury, P; Majumdar, A S
2013-01-11
The limitation on obtaining precise outcomes of measurements performed on two noncommuting observables of a particle as set by the uncertainty principle in its entropic form can be reduced in the presence of quantum memory. We derive a new entropic uncertainty relation based on fine graining, which leads to an ultimate limit on the precision achievable in measurements performed on two incompatible observables in the presence of quantum memory. We show that our derived uncertainty relation tightens the lower bound set by entropic uncertainty for members of the class of two-qubit states with maximally mixed marginals, while accounting for the recent experimental results using maximally entangled pure states and mixed Bell-diagonal states. An implication of our uncertainty relation on the security of quantum key generation protocols is pointed out.
Principles of sound ecotoxicology.
Harris, Catherine A; Scott, Alexander P; Johnson, Andrew C; Panter, Grace H; Sheahan, Dave; Roberts, Mike; Sumpter, John P
2014-03-18
We have become progressively more concerned about the quality of some published ecotoxicology research. Others have also expressed concern. It is not uncommon for basic, but extremely important, factors to apparently be ignored. For example, exposure concentrations in laboratory experiments are sometimes not measured, and hence there is no evidence that the test organisms were actually exposed to the test substance, let alone at the stated concentrations. To try to improve the quality of ecotoxicology research, we suggest 12 basic principles that should be considered, not at the point of publication of the results, but during the experimental design. These principles range from carefully considering essential aspects of experimental design through to accurately defining the exposure, as well as unbiased analysis and reporting of the results. Although not all principles will apply to all studies, we offer these principles in the hope that they will improve the quality of the science that is available to regulators. Science is an evidence-based discipline and it is important that we and the regulators can trust the evidence presented to us. Significant resources often have to be devoted to refuting the results of poor research when those resources could be utilized more effectively.
ERIC Educational Resources Information Center
Kamat, R. V.
1991-01-01
A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)
ERIC Educational Resources Information Center
Rosen, Joe
1981-01-01
Discusses the meaning of symmetry of the laws of physics and symmetry of the universe and the connection between symmetries and asymmetries of the laws of physics and those of the universe. An explanation of Hamilton's principle is offered. The material is suitable for informal discussions with students. (Author/SK)
ERIC Educational Resources Information Center
Martz, Carlton
1999-01-01
This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…
2000-01-01
The Denver principles articulate the self-empowerment movement of People With AIDS (PWA). The statements, written in 1983 by the Advisory Committee of the People With AIDS, include recommendations on how to support those with the disease, as well as suggestions for people who have AIDS. The document concludes by listing the "rights of people with AIDS."
Reprographic Principles Made Easy.
ERIC Educational Resources Information Center
Young, J. B.
Means for reproducing graphic materials are explained. There are several types of processes: those using light sensitive material, those using heat sensitive material, those using photo conductive materials (electrophotography), and duplicating processes using ink. For each of these, the principles behind them are explained, the necessary…
ERIC Educational Resources Information Center
Siyanova-Chanturia, Anna; Martinez, Ron
2015-01-01
John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…
Principles of Biomedical Ethics
Athar, Shahid
2012-01-01
In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498
Principles of Teaching. Module.
ERIC Educational Resources Information Center
Rhoades, Joseph W.
This module on principles of teaching is 1 in a series of 10 modules written for vocational education teacher education programs. It is designed to enable the teacher to do the following: (1) identify subject matter and integrate that subject matter with thought-provoking questions; (2) organize and demonstrate good questioning techniques; and (3)…
Basic Comfort Heating Principles.
ERIC Educational Resources Information Center
Dempster, Chalmer T.
The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…
Hydrogen evolution: Guiding principles
NASA Astrophysics Data System (ADS)
Xia, Zhenhai
2016-10-01
Lower-cost alternatives to platinum electrocatalysts are being explored for the sustainable production of hydrogen, but often trial-and-error approaches are used for their development. Now, principles are elucidated that suggest pathways to rationally design efficient metal-free electrocatalysts based on doped graphene.
Assessment of uncertainty in chemical models by Bayesian probabilities: Why, when, how?
Sahlin, Ullrika
2015-07-01
A prediction of a chemical property or activity is subject to uncertainty. Which types of uncertainty to consider, whether to account for them in a differentiated manner, and with which methods, depends on the practical context. In chemical modelling, general guidance on the assessment of uncertainty is hindered by the high variety of underlying modelling algorithms, high-dimensionality problems, the acknowledgement of both qualitative and quantitative dimensions of uncertainty, and the fact that statistics offers alternative principles for uncertainty quantification. Here, a view of the assessment of uncertainty in predictions is presented with the aim of overcoming these issues. The assessment sets out to quantify uncertainty representing error in predictions and is based on probability modelling of errors, where uncertainty is measured by Bayesian probabilities. Even though well motivated, the choice to use Bayesian probabilities is a challenge to statistics and chemical modelling. Fully Bayesian modelling, Bayesian meta-modelling and bootstrapping are discussed as possible approaches. Deciding how to assess uncertainty is an active choice, and should not be constrained by tradition or by a lack of validated and reliable ways of doing it.
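Of the approaches mentioned, bootstrapping is the simplest to sketch. The residual values below are invented for illustration and do not come from the paper; the idea is only to show how resampling yields an uncertainty interval for a model's mean prediction error.

```python
import random
import statistics

random.seed(1)

# Toy residuals (observed minus predicted) from a hypothetical
# chemical-property model; illustrative numbers only.
residuals = [0.12, -0.05, 0.30, -0.22, 0.08, -0.15, 0.25, 0.02, -0.10, 0.18]

# Bootstrap the mean prediction error: resample the residuals with
# replacement many times and collect the resampled means.
boot_means = []
for _ in range(5000):
    sample = [random.choice(residuals) for _ in residuals]
    boot_means.append(statistics.mean(sample))

boot_means.sort()
lo, hi = boot_means[int(0.025 * 5000)], boot_means[int(0.975 * 5000)]
print(f"95% bootstrap interval for mean error: [{lo:.3f}, {hi:.3f}]")
```

The spread of the bootstrap distribution expresses uncertainty about the model's systematic error without requiring an explicit prior, which is why the paper discusses bootstrapping alongside fully Bayesian alternatives.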
NASA Astrophysics Data System (ADS)
Schwabe, Oliver; Shehab, Essam; Erkoyuncu, John
2016-07-01
Quantification and forecasting of cost uncertainty for aerospace innovations is challenged by conditions of small data, which arise from having few measurement points, little prior experience, unknown history, low data quality, and deep uncertainty. The literature suggests that no frameworks exist which specifically address cost estimation under such conditions. To give contemporary cost-estimating techniques an innovative perspective on these challenges, a framework based on the principles of spatial geometry is described. The framework consists of a method for visualising cost uncertainty and a dependency model for quantifying and forecasting cost uncertainty. Cost uncertainty is taken to mean manifested and unintended future cost variance that occurs with a probability of 100% but has an unknown magnitude; innovative starting conditions are considered to exist when no verified and accurate cost model is available. The shape of the data serves as an organising principle, and the geometrical symmetry of cost-variance point clouds is used to quantify cost uncertainty. The results of the investigation suggest that the uncertainty of a cost estimate at any future point in time may be determined by the geometric symmetry of the cost variance data in its point-cloud form at the time of estimation. Recommendations for future research include using the framework to determine the "most likely values" of estimates in Monte Carlo simulations and generalising the dependency model introduced. Future work is also recommended to reduce the framework limitations noted.
Entropic uncertainty from effective anticommutators
NASA Astrophysics Data System (ADS)
Kaniewski, Jedrzej; Tomamichel, Marco; Wehner, Stephanie
2014-07-01
We investigate entropic uncertainty relations for two or more binary measurements, for example, spin-1/2 or polarization measurements. We argue that the effective anticommutators of these measurements, i.e., the anticommutators evaluated on the state prior to measuring, are an expedient measure of measurement incompatibility. Based on the knowledge of pairwise effective anticommutators we derive a class of entropic uncertainty relations in terms of conditional Rényi entropies. Our uncertainty relations are formulated in terms of effective measures of incompatibility, which can be certified in a device-independent fashion. Consequently, we discuss potential applications of our findings to device-independent quantum cryptography. Moreover, to investigate the tightness of our analysis we consider the simplest (and very well studied) scenario of two measurements on a qubit. We find that our results outperform the celebrated bound due to Maassen and Uffink [Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103] and provide an analytical expression for the minimum uncertainty which also outperforms some recent bounds based on majorization.
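For context, the Maassen-Uffink bound that the authors compare against can be stated for projective measurements $A$ and $B$ with eigenbases $\{|a_i\rangle\}$ and $\{|b_j\rangle\}$ as:

```latex
H(A) + H(B) \;\ge\; -2\log_2 c,
\qquad
c \;=\; \max_{i,j}\,\bigl|\langle a_i \mid b_j \rangle\bigr| .
```

For two mutually unbiased qubit bases, $c = 1/\sqrt{2}$ and the bound is one bit; the abstract reports anticommutator-based bounds that outperform this value in the two-measurement qubit scenario.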
Quantification of entanglement via uncertainties
Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S.
2007-03-15
We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.
Uncertainties in radiation flow experiments
NASA Astrophysics Data System (ADS)
Fryer, C. L.; Dodd, E.; Even, W.; Fontes, C. J.; Greeff, C.; Hungerford, A.; Kline, J.; Mussack, K.; Tregillis, I.; Workman, J. B.; Benstead, J.; Guymer, T. M.; Moore, A. S.; Morton, J.
2016-03-01
Although the fundamental physics behind radiation and matter flow is understood, many uncertainties remain in the exact behavior of macroscopic fluids in systems ranging from pure turbulence to coupled radiation hydrodynamics. Laboratory experiments play an important role in studying this physics to allow scientists to test their macroscopic models of these phenomena. However, because the fundamental physics is well understood, precision experiments are required to validate existing codes already tested by a suite of analytic, manufactured and convergence solutions. To conduct such high-precision experiments requires a detailed understanding of the experimental errors and the nature of their uncertainties on the observed diagnostics. In this paper, we study the uncertainties plaguing many radiation-flow experiments, focusing on those using a hohlraum (dynamic or laser-driven) source and a foam-density target. This study focuses on the effect these uncertainties have on the breakout time of the radiation front. We find that, even if the errors in the initial conditions and numerical methods are Gaussian, the errors in the breakout time are asymmetric, leading to a systematic bias in the observed data. We must understand these systematics to produce the high-precision experimental results needed to study this physics.
Saccade Adaptation and Visual Uncertainty
Souto, David; Gegenfurtner, Karl R.; Schütz, Alexander C.
2016-01-01
Visual uncertainty may affect saccade adaptation in two complementary ways. First, an ideal adaptor should take into account the reliability of visual information for determining the amount of correction, predicting that increasing visual uncertainty should decrease adaptation rates. We tested this by comparing observers' direction discrimination and adaptation rates in an intra-saccadic-step paradigm. Second, clearly visible target steps may generate a slower adaptation rate since the error can be attributed to an external cause, instead of an internal change in the visuo-motor mapping that needs to be compensated. We tested this prediction by measuring saccade adaptation to different step sizes. Most remarkably, we found little correlation between estimates of visual uncertainty and adaptation rates and no slower adaptation rates with more visible step sizes. Additionally, we show that for low contrast targets backward steps are perceived as stationary after the saccade, but that adaptation rates are independent of contrast. We suggest that the saccadic system uses different position signals for adapting dysmetric saccades and for generating a trans-saccadic stable visual percept, explaining that saccade adaptation is found to be independent of visual uncertainty. PMID:27252635
Uncertainty quantification and error analysis
Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip
2010-01-01
UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.
Exploring Uncertainty with Projectile Launchers
ERIC Educational Resources Information Center
Orzel, Chad; Reich, Gary; Marr, Jonathan
2012-01-01
The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…
Impact of orifice metering uncertainties
Stuart, J.W. )
1990-12-01
A recent utility study attributed 38% of the company's unaccounted-for (UAF) gas to orifice-metering uncertainty bias caused by straightening vanes. How this was determined, and how it applies to the company's orifice meters, is described. Almost all (97%) of the company's UAF gas was attributed to identifiable accounting procedures, measurement problems, theft, and leakage.
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos (Applied Biomathematics, Setauket, New York)
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
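The flavor of interval statistics can be seen in the simplest case the report covers, the sample mean; the interval data below are illustrative, not from the report.

```python
# Each measurement is known only up to an interval [lo, hi]
# (epistemic uncertainty); the numbers are illustrative.
data = [(1.0, 1.4), (2.1, 2.3), (0.8, 1.5), (1.9, 2.6)]

n = len(data)
# The sample mean of interval data is itself an interval, obtained by
# averaging the endpoints, because the mean is monotone in each observation.
mean_lo = sum(lo for lo, hi in data) / n
mean_hi = sum(hi for lo, hi in data) / n
print(f"mean lies in [{mean_lo:.2f}, {mean_hi:.2f}]")  # [1.45, 1.95]
```

The mean is the easy case precisely because of monotonicity; statistics such as the variance require optimization over all point configurations consistent with the intervals, which is why the report discusses computability as a function of interval overlap and width.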
The Principle of Maximum Conformality
Brodsky, Stanley J.; Di Giustino, Leonardo; SLAC
2011-04-05
A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q of the order of a typical momentum transfer Q in the process, and then to vary the scale over the range Q/2 to 2Q. This procedure is clearly problematic, since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods of setting the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling that is not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and is thus a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. Eliminating the renormalization-scheme ambiguity using the PMC will not only increase the precision of QCD tests but also increase the sensitivity of colliders to new physics beyond the Standard Model.
Measuring uncertainty by extracting fuzzy rules using rough sets
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.
1991-01-01
Despite the advancements in the computer industry over the past 30 years, one major deficiency remains: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. We examine the methods of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much the rules are believed is constructed. From this, we study the extent to which a fuzzy diagnosis is definable in terms of a set of fuzzy attributes.
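The certain/possible rule split maps directly onto the rough-set notions of lower and upper approximation; a minimal sketch on a toy diagnosis table (all data illustrative, not from the paper):

```python
# Objects are grouped into indiscernibility classes by their attribute
# values; the target set X holds the objects carrying the diagnosis.
classes = [{1, 2}, {3, 4}, {5}]   # partition induced by the attributes
X = {1, 2, 3}                     # objects with the diagnosis of interest

# Lower approximation: classes fully inside X -> certain rules.
lower = set().union(*(c for c in classes if c <= X))
# Upper approximation: classes intersecting X -> possible rules.
upper = set().union(*(c for c in classes if c & X))

print("certain: ", sorted(lower))  # [1, 2]: definitely have the diagnosis
print("possible:", sorted(upper))  # [1, 2, 3, 4]: possibly have the diagnosis
```

Objects in the upper but not the lower approximation (here 3 and 4) form the boundary region, which is where fuzzy membership grades can quantify how strongly each possible rule is believed.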
Error-disturbance uncertainty relations studied in neutron optics
NASA Astrophysics Data System (ADS)
Sponar, Stephan; Sulyok, Georg; Demirel, Bulent; Hasegawa, Yuji
2016-09-01
Heisenberg's uncertainty principle is probably the most famous statement of quantum physics, and its essential aspects are well described by formulations in terms of standard deviations. However, a naive Heisenberg-type error-disturbance relation is not valid. An alternative, universally valid relation was derived by Ozawa in 2003. Though universally valid, Ozawa's relation is not optimal. Recently, Branciard derived a tight error-disturbance uncertainty relation (EDUR) describing the optimal trade-off between error and disturbance. Here, we report a neutron-optical experiment that records the error of a spin-component measurement, as well as the disturbance caused on another spin component, to test EDURs. We demonstrate that Heisenberg's original EDUR is violated, while Ozawa's and Branciard's EDURs are valid over a wide range of experimental parameters, applying a new measurement procedure referred to as the two-state method.
Visualizing Flow of Uncertainty through Analytical Processes.
Wu, Yingcai; Yuan, Guo-Xun; Ma, Kwan-Liu
2012-12-01
Uncertainty can arise at any stage of a visual analytics process, especially in data-intensive applications with a sequence of data transformations. Additionally, throughout the process of multidimensional, multivariate data analysis, uncertainty due to data transformation and integration may split, merge, increase, or decrease. This dynamic characteristic, along with other features of uncertainty, poses a great challenge to effective uncertainty-aware visualization. This paper presents a new framework for modeling uncertainty and characterizing how uncertainty information evolves through analytical processes. Based on the framework, we have designed a visual metaphor called uncertainty flow to visually and intuitively summarize how uncertainty information propagates over the whole analysis pipeline. Our system allows analysts to interact with and analyze the uncertainty information at different levels of detail. Three experiments were conducted to demonstrate the effectiveness and intuitiveness of our design.
Quantification of uncertainty in geochemical reactions
NASA Astrophysics Data System (ADS)
Srinivasan, Gowri; Tartakovsky, Daniel M.; Robinson, Bruce A.; Aceves, Alejandro B.
2007-12-01
Predictions of reactive transport in the subsurface are routinely compromised by both model (structural) and parametric uncertainties. We present a set of computational tools for quantifying these two types of uncertainties. The model uncertainty is resolved at the molecular scale where epistemic uncertainty incorporates aleatory uncertainty. The parametric uncertainty is resolved at both molecular and continuum (Darcy) scales. We use the proposed approach to quantify uncertainty in modeling the sorption of neptunium through a competitive ion exchange. This radionuclide is of major concern for various high-level waste storage projects because of its relatively long half-life and its high-solubility and low-sorption properties. We demonstrate how parametric and model uncertainties affect one's ability to estimate the distribution coefficient. The uncertainty quantification tools yield complete probabilistic descriptions of key parameters affecting the fate and migration of neptunium in the subsurface rather than the lower statistical moments. This is important, since these distributions are highly skewed.
Clocks, Computers, Black Holes, Spacetime Foam, and Holographic Principle
NASA Astrophysics Data System (ADS)
Ng, Y. Jack
2002-08-01
What do simple clocks, simple computers, black holes, space-time foam, and holographic principle have in common? I will show that the physics behind them is inter-related, linking together our concepts of information, gravity, and quantum uncertainty. Thus, the physics that sets the limits to computation and clock precision also yields Hawking radiation of black holes and the holographic principle. Moreover, the latter two strongly imply that space-time undergoes much larger quantum fluctuations than what the folklore suggests -- large enough to be detected with modern gravitational-wave interferometers through future refinements.
Common Principles and Multiculturalism
Zahedi, Farzaneh; Larijani, Bagher
2009-01-01
Judgment on the rightness and wrongness of beliefs and behaviors is a central issue in bioethics. For centuries, philosophers and ethicists have debated which tools are suitable for determining whether an act is morally sound. The emergence of contemporary bioethics in the West has produced the misconception that absolute, Westernized principles are appropriate tools for ethical decision making in all cultures. We discuss this issue by introducing a clinical case. Considering the variety of cultural beliefs around the world, and although it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles rather than going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, are consistent with this idea. PMID:23908720
Principles of Natural Photosynthesis.
Krewald, Vera; Retegan, Marius; Pantazis, Dimitrios A
2016-01-01
Nature relies on a unique and intricate biochemical setup to achieve sunlight-driven water splitting. Combined experimental and computational efforts have produced significant insights into the structural and functional principles governing the operation of the water-oxidizing enzyme Photosystem II in general, and of the oxygen-evolving manganese-calcium cluster at its active site in particular. Here we review the most important aspects of biological water oxidation, emphasizing current knowledge on the organization of the enzyme, the geometric and electronic structure of the catalyst, and the role of calcium and chloride cofactors. The combination of recent experimental work on the identification of possible substrate sites with computational modeling have considerably limited the possible mechanistic pathways for the critical O-O bond formation step. Taken together, the key features and principles of natural photosynthesis may serve as inspiration for the design, development, and implementation of artificial systems. PMID:26099285
Principles of Glacier Mechanics
NASA Astrophysics Data System (ADS)
Waddington, Edwin D.
Glaciers are awesome in size, move at a majestic pace, and frequently occupy spectacular mountainous terrain. Naturally, many Earth scientists are attracted to glaciers. Some of us are even fortunate enough to make a career of studying glacier flow. Many others work on the large, flat polar ice sheets where there is no scenery. As a leader of one of the foremost research projects now studying the flow of mountain glaciers (Storglaciären, Sweden), Roger Hooke is well qualified to describe the principles of glacier mechanics. Principles of Glacier Mechanics is written for upper-level undergraduate students and graduate students with an interest in glaciers and the landforms that glaciers produce. While most of the examples in the text are drawn from valley glacier studies, much of the material is also relevant to “glacier flatland” on the polar ice sheets.
Pauli exclusion principle
NASA Astrophysics Data System (ADS)
Murdin, P.
2000-11-01
A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...
Aswathanarayana, U.
1985-01-01
This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. It covers the application of nuclear methodology to the radiogenic heat and thermal regime of the Earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray-produced isotopes. It also focuses on geological processes such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate.
Principles of lake sedimentology
Janasson, L.
1983-01-01
This book presents a comprehensive outline of the basic sedimentological principles for lakes, focusing on environmental aspects and matters related to lake management and control, and on lake ecology rather than lake geology. It is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents abridged: Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.
NASA Astrophysics Data System (ADS)
Hughes, Barry D.; Ninham, Barry W.
2016-02-01
A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.
Computational principles of memory.
Chaudhuri, Rishidev; Fiete, Ila
2016-03-01
The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506
Heisenberg's observability principle
NASA Astrophysics Data System (ADS)
Wolff, Johanna
2014-02-01
Werner Heisenberg's 1925 paper 'Quantum-theoretical re-interpretation of kinematic and mechanical relations' marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be 'founded exclusively upon relationships between quantities which in principle are observable'. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of 'observability' along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.
Teaching professionalism: general principles.
Cruess, Richard L; Cruess, Sylvia R
2006-05-01
There are educational principles that apply to the teaching of professionalism during undergraduate education and postgraduate training. It is axiomatic that there is a single cognitive base that applies with increasing moral force as students enter medical school, progress to residency or registrar training, and enter practice. While parts of this body of knowledge are easier to teach and learn at different stages of an individual's career, it remains a definable whole at all times and should be taught as such. While self-reflection on theoretical and real issues encountered in the life of a student, resident or practitioner remains essential to experiential learning and to the internalization of the values and behaviors of the professional, the opportunities to provide situations where this can take place will change as an individual progresses through the system, as will the sophistication of the level of learning. Teaching the cognitive base of professionalism and providing opportunities for the internalization of its values and behaviors are the cornerstones of the organization of the teaching of professionalism at all levels. Situated learning theory appears to provide practical guidance as to how this may be implemented. While the application of this theory will vary with the type of curriculum, the institutional culture and the resources available, the principles outlined should remain constant.
Attention, Uncertainty, and Free-Energy
Feldman, Harriet; Friston, Karl J.
2010-01-01
We suggested recently that attention can be understood as inferring the level of uncertainty or precision during hierarchical perception. In this paper, we try to substantiate this claim using neuronal simulations of directed spatial attention and biased competition. These simulations assume that neuronal activity encodes a probabilistic representation of the world that optimizes free-energy in a Bayesian fashion. Because free-energy bounds surprise or the (negative) log-evidence for internal models of the world, this optimization can be regarded as evidence accumulation or (generalized) predictive coding. Crucially, both predictions about the state of the world generating sensory data and the precision of those data have to be optimized. Here, we show that if the precision depends on the states, one can explain many aspects of attention. We illustrate this in the context of the Posner paradigm, using the simulations to generate both psychophysical and electrophysiological responses. These simulated responses are consistent with attentional bias or gating, competition for attentional resources, attentional capture and associated speed-accuracy trade-offs. Furthermore, if we present both attended and non-attended stimuli simultaneously, biased competition for neuronal representation emerges as a principled and straightforward property of Bayes-optimal perception. PMID:21160551
Adjoint-Based Uncertainty Quantification with MCNP
Seifried, Jeffrey E.
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying the sensitivities and uncertainties of its important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
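The procedure sketched in this abstract, folding a residual risk of human error into a measurement uncertainty budget by Monte Carlo simulation, can be illustrated with a minimal sketch. The error probabilities and bias magnitudes below are hypothetical stand-ins for the expert-elicited values the paper uses; only the mechanics of combining them with the analytical uncertainty follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Hypothetical pH measurement with analytical standard uncertainty 0.03 pH
analytical = rng.normal(0.0, 0.03, n)

# Residual human-error scenarios (illustrative, standing in for the
# expert-elicited probabilities and biases in the paper): each scenario
# occurs independently with a small probability and shifts the result.
p_err = np.array([0.005, 0.002])     # residual occurrence probabilities
bias = np.array([0.10, -0.15])       # pH shift caused by each error type
occurs = rng.random((n, 2)) < p_err
human = occurs.astype(float) @ bias

u_analytical = analytical.std(ddof=1)
u_combined = (analytical + human).std(ddof=1)   # budget including human error
```

With these illustrative numbers the human-error term inflates the combined standard uncertainty by a few percent: noticeable but not dominant, mirroring the paper's conclusion.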
Credible Software and Simulation Uncertainty
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B.; Nixon, David (Technical Monitor)
1998-01-01
The utility of software depends primarily on its reliability and performance, whereas its significance depends solely on its credibility for the intended use. The credibility of simulations confirms the credibility of software. The level of veracity and the level of validity of simulations determine the degree of credibility of simulations. The process of assessing this credibility in fields such as computational mechanics (CM) differs from that followed by the Defense Modeling and Simulation Office in operations research. Verification and validation (V&V) of CM simulations is not the same as V&V of CM software. Uncertainty is the measure of simulation credibility. Designers who use software are concerned with management of simulation uncertainty. Terminology and concepts are presented with a few examples from computational fluid dynamics.
Individual differences in causal uncertainty.
Weary, G; Edwards, J A
1994-08-01
This article presents a scale that measures chronic individual differences in people's uncertainty about their ability to understand and detect cause-and-effect relationships in the social world: the Causal Uncertainty Scale (CUS). The results of Study 1 indicated that the scale has good internal and adequate test-retest reliability. Additionally, the results of a factor analysis suggested that the scale appears to be tapping a single construct. Study 2 examined the convergent and discriminant validity of the scale, and Studies 3 and 4 examined the predictive and incremental validity of the scale. The importance of the CUS to work on depressives' social information processing and for basic research and theory on human social judgment processes is discussed.
Uncertainty versus computer response time
Rowe, W.D.
1994-12-31
Interactive on-line presentation of risk analysis results with immediate "what if" capability is now possible with available microcomputer technology. This can provide an effective means of presenting the risk results, the decision possibilities, and the underlying assumptions to decision makers, stakeholders, and the public. However, the limited computational power of microcomputers requires a trade-off between the precision of the analysis and the computing and display response time. Fortunately, the uncertainties in the risk analysis are usually so large that extreme precision is often unwarranted. Therefore, risk analyses used for this purpose must include trade-offs between precision and processing time, and the uncertainties introduced must be put into perspective.
On the quantum mechanical solutions with minimal length uncertainty
NASA Astrophysics Data System (ADS)
Shababi, Homa; Pedram, Pouria; Chung, Won Sang
2016-06-01
In this paper, we study two generalized uncertainty principles (GUPs), [X,P] = iℏ(1 + βP^(2j)) and [X,P] = iℏ(1 + βP^2 + kβ^2 P^4), which imply minimal measurable lengths. Using two momentum representations, for the former GUP we find eigenvalues and eigenfunctions of the free particle and the harmonic oscillator in terms of generalized trigonometric functions. Also, for the latter GUP, we obtain quantum mechanical solutions of a particle in a box and the harmonic oscillator. Finally, we investigate the statistical properties of the harmonic oscillator, including the partition function, internal energy, and heat capacity, in the context of the first GUP.
Does quantum uncertainty have a place in everyday applied statistics?
Gelman, Andrew; Betancourt, Michael
2013-06-01
We are sympathetic to the general ideas presented in the article by Pothos & Busemeyer (P&B): Heisenberg's uncertainty principle seems naturally relevant in the social and behavioral sciences, in which measurements can affect the people being studied. We propose that the best approach for developing quantum probability models in the social and behavioral sciences is not by directly using the complex probability-amplitude formulation proposed in the article, but rather, more generally, to consider marginal probabilities that need not be averages over conditionals.
Quantifying uncertainty from material inhomogeneity.
Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee
2009-09-01
Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femtosecond laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the
Ozone Uncertainties Study Algorithm (OUSA)
NASA Technical Reports Server (NTRS)
Bahethi, O. P.
1982-01-01
An algorithm for carrying out sensitivity, uncertainty and overall imprecision studies on a set of input parameters for a one-dimensional steady-state ozone photochemistry model is described. This algorithm can be used to evaluate steady-state perturbations due to point-source or distributed ejection of H2O, ClX, and NOx, as well as variations in the incident solar flux. The algorithm is operational on the IBM OS/360-91 computer at NASA/Goddard Space Flight Center's Science and Applications Computer Center (SACC).
Age models and their uncertainties
NASA Astrophysics Data System (ADS)
Marwan, N.; Rehfeld, K.; Goswami, B.; Breitenbach, S. F. M.; Kurths, J.
2012-04-01
The usefulness of a proxy record is largely dictated by the accuracy and precision of its age model, i.e., its depth-age relationship. Only if age model uncertainties are minimized can correlations or lead-lag relations be reliably studied. Moreover, due to different dating strategies (14C, U-series, OSL dating, or counting of varves), dating errors or diverging age models lead to difficulties in comparing different palaeo proxy records. Uncertainties in the age model are even more important if exact dating is necessary in order to calculate, e.g., data series of flux or rates (like dust flux records or pollen deposition rates). Several statistical approaches exist to handle the dating uncertainties themselves and to estimate the age-depth relationship. Nevertheless, linear interpolation is still the most commonly used method for age modeling, and the uncertainty of a certain event at a given time due to dating errors is often completely neglected. Here we demonstrate the importance of considering dating errors and their implications for the interpretation of variations in palaeo-climate proxy records from stalagmites (U-series dated). We present a simple approach for estimating age models and their confidence levels based on Monte Carlo methods and non-linear interpolation. This novel algorithm also allows for removing age reversals. Our approach delivers a time series of a proxy record with a value range for each depth and, if desired, on an equidistant time axis. The algorithm is implemented in interactive scripts for use with MATLAB®, Octave, and FreeMat.
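A minimal sketch of the kind of Monte Carlo age-depth modeling described above (not the authors' MATLAB/Octave implementation): sample ages from the dating errors, reject realizations that contain age reversals, interpolate each realization onto a common depth axis, and read off a median model with a confidence band. The dates, depths, and errors are invented, and linear interpolation stands in for the paper's non-linear scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical U-series dates: depth (mm), mean age (ka), 1-sigma error (ka)
depth = np.array([10.0, 50.0, 120.0, 200.0, 310.0])
age = np.array([1.2, 3.4, 6.1, 9.8, 14.2])
sigma = np.array([0.1, 0.2, 0.3, 0.3, 0.5])

grid = np.linspace(depth[0], depth[-1], 200)   # common depth axis
n_sim, realizations = 5000, []
while len(realizations) < n_sim:
    draw = rng.normal(age, sigma)              # perturb dates by their errors
    if np.all(np.diff(draw) > 0):              # reject age reversals
        realizations.append(np.interp(grid, depth, draw))

ens = np.array(realizations)
median_age = np.median(ens, axis=0)
lo_band, hi_band = np.percentile(ens, [2.5, 97.5], axis=0)  # 95% envelope
```

Because every accepted realization is monotone in depth, the pointwise median age model is monotone as well, so the procedure never reintroduces reversals.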
Evaluating the uncertainty of input quantities in measurement models
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Elster, Clemens
2014-06-01
uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
Blade tip timing (BTT) uncertainties
NASA Astrophysics Data System (ADS)
Russhard, Pete
2016-06-01
Blade Tip Timing (BTT) is an alternative technique for characterising blade vibration in which non-contact timing probes (e.g. capacitance or optical probes), typically mounted on the engine casing (figure 1), are used to measure the time at which a blade passes each probe. This time is compared with the time at which the blade would have passed the probe had it been undergoing no vibration. For a number of years the aerospace industry has been sponsoring research into Blade Tip Timing technologies, which have been developed as tools to obtain rotor blade tip deflections. These have been successful in demonstrating the potential of the technology, but have rarely produced quantitative data along with a demonstration of a traceable value for measurement uncertainty. BTT technologies have been developed under a cloak of secrecy by the gas turbine OEMs because of the competitive advantage offered if the technique could be shown to work. BTT measurements are sensitive to many variables, and there is a need to quantify the measurement uncertainty of the complete technology and to define a set of guidelines as to how BTT should be applied to different vehicles. The data shown in figure 2 were developed from a US government sponsored program that brought together four different tip timing systems and a gas turbine engine test. Comparisons showed that the systems were just capable of obtaining measurements within a +/-25% uncertainty band when compared to strain gauges, even when using the same input data sets.
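The core of BTT is the conversion from an arrival-time offset to a tip deflection: a tip at radius r on a rotor spinning at angular speed ω moves at speed rω, so a timing offset Δt corresponds to a deflection d = rωΔt, and the probe's timing resolution maps to deflection uncertainty in the same way. A small sketch with illustrative numbers (the radius, speed, and timing figures below are not from this work):

```python
import math

def tip_deflection(r_m, rpm, dt_s):
    """Tip deflection (m) from an arrival-time offset: d = r * omega * dt."""
    omega = rpm * 2.0 * math.pi / 60.0   # rad/s
    return r_m * omega * dt_s

# 0.4 m tip radius, 10 000 rpm, blade arrives 2 microseconds early
d = tip_deflection(0.4, 10_000.0, 2.0e-6)        # deflection in metres

# First-order uncertainty: a 50 ns timing resolution maps to u_d = r*omega*u_dt
u_d = tip_deflection(0.4, 10_000.0, 50.0e-9)
```

With these numbers the tip speed is about 419 m/s, so a 2 µs offset is a deflection of roughly 0.84 mm, and 50 ns of timing uncertainty already amounts to about 21 µm of deflection uncertainty, which is why BTT results are so sensitive to the timing chain.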
Quantifying Uncertainty in Epidemiological Models
Ramanathan, Arvind; Jha, Sumit Kumar
2012-01-01
Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
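As a minimal illustration of the statistical-model-checking idea described above (not the authors' formal-methods tooling), one can estimate the probability that a stochastic SIR model satisfies a property, here "the epidemic peak stays below 30% of the population", by repeated simulation with a binomial confidence interval. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def sir_run(n=1000, i0=5, beta=0.2, gamma=0.1, days=200):
    """One stochastic SIR trajectory (discrete-time chain binomial); returns peak I."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(days):
        new_inf = rng.binomial(s, 1.0 - np.exp(-beta * i / n))
        new_rec = rng.binomial(i, 1.0 - np.exp(-gamma))
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

# Property phi: the epidemic peak stays below 30% of the population.
# Statistical model checking: estimate P(phi) from repeated simulation.
runs = 2000
hits = sum(sir_run() < 300 for _ in range(runs))
p_hat = hits / runs
half_width = 1.96 * np.sqrt(p_hat * (1 - p_hat) / runs)  # 95% CI half-width
```

Formal statistical model checkers sharpen this basic recipe with sequential tests that stop sampling as soon as the probability estimate is decided to the requested confidence.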
Uncertainty propagation in nuclear forensics.
Pommé, S; Jerome, S M; Venchiarutti, C
2014-07-01
Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined.
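For the simplest case, a daughter nuclide that was completely removed at separation, the atom-ratio age equation inverts in closed form, and the uncertainty can be propagated numerically by Monte Carlo as a cross-check on analytical formulae of the kind the paper derives. The 241Am/241Pu ratio and half-life figures below are illustrative values, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
LN2 = np.log(2.0)

def age_from_atom_ratio(R, t_half_p, t_half_d):
    """Time since separation, assuming the daughter was fully removed at t = 0.
    R is the daughter/parent atom ratio; half-lives in the same time unit."""
    lp, ld = LN2 / t_half_p, LN2 / t_half_d
    return -np.log(1.0 - R * (ld - lp) / lp) / (ld - lp)

# Hypothetical measurement: 241Am/241Pu atom ratio, half-lives in years
R, u_R = 0.105, 0.002       # measured ratio and its standard uncertainty
tp, u_tp = 14.325, 0.006    # 241Pu half-life
td, u_td = 432.6, 0.6       # 241Am half-life

m = 100_000                 # Monte Carlo propagation of all three inputs
samples = age_from_atom_ratio(rng.normal(R, u_R, m),
                              rng.normal(tp, u_tp, m),
                              rng.normal(td, u_td, m))
t_est, u_t = samples.mean(), samples.std(ddof=1)
```

With these inputs the ratio uncertainty dominates the age uncertainty, which is typical; tightening the half-life data, as the abstract argues, matters most once the ratio is measured very precisely.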
Characterizing Epistemic Uncertainty for Launch Vehicle Designs
NASA Technical Reports Server (NTRS)
Novack, Steven D.; Rogers, Jim; Al Hassan, Mohammad; Hark, Frank
2016-01-01
NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper describes how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
Principles of tendon transfers.
Coulet, B
2016-04-01
Tendon transfers are carried out to restore functional deficits by rerouting the remaining intact muscles. Transfers are highly attractive in the context of hand surgery because of the possibility of restoring the patient's ability to grip. In palsy cases, tendon transfers are only used when a neurological procedure is contraindicated or has failed. The strategy used to restore function follows a common set of principles, no matter the nature of the deficit. The first step is to clearly distinguish between deficient muscles and muscles that could be transferred. Next, the type of palsy dictates the scope of the program and the complexity of the gripping movements that can be restored. Based on this reasoning, a surgical strategy that matches the means (transferable muscles) with the objectives (functions to restore) is established and clearly explained to the patient. Every paralyzed hand can be described using three parameters: 1) deficient segments: wrist, thumb and long fingers; 2) mechanical performance of the muscle groups being revived: high energy (wrist extension and finger flexion), which requires strong transfers with long excursion, and low energy (wrist flexion and finger extension), which is less demanding mechanically because it can be accomplished through gravity alone in some cases; 3) condition of the two primary motors in the hand: extrinsics (flexors and extensors) and intrinsics (facilitators). No matter the type of palsy, the transfer surgery follows the same technical principles: exposure, release, fixation, tensioning and rehabilitation. By performing an in-depth analysis of each case and by following strict technical principles, tendon transfer surgery leads to reproducible results; this allows the surgeon to establish clear objectives for the patient preoperatively. PMID:27117119
Accounting for Calibration Uncertainty in Detectors for High-Energy Astrophysics
NASA Astrophysics Data System (ADS)
Xu, Jin
Systematic instrumental uncertainties in astronomical analyses have generally been ignored in data analysis due to the lack of robust, principled methods, though the importance of incorporating instrumental calibration uncertainty is widely recognized by both users and instrument builders. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. Lee et al. (2011) introduced a so-called pragmatic Bayesian method to address this problem. The method is "pragmatic" in that it introduces an ad hoc technique that simplifies computation by assuming that the current data are not useful in narrowing the uncertainty for the calibration product, i.e., that the prior and posterior distributions for the calibration products are the same. In the thesis, we focus on incorporating calibration uncertainty into a principled Bayesian X-ray spectral analysis; specifically, we account for uncertainty in the so-called effective area curve and the photon redistribution matrix. X-ray spectral analysis models the distribution of the energies of X-ray photons emitted from an astronomical source. The effective area curve of an X-ray detector describes its sensitivity as a function of the energy of incoming photons, and the photon redistribution matrix describes the probability distribution of the recorded (discrete) energy of a photon as a function of the true (discretized) energy. Starting with the effective area curve, we follow Lee et al. (2011) and use a principal component analysis (PCA) to efficiently represent the uncertainty. Here, however, we leverage this representation to enable a principled, fully Bayesian method to account for calibration uncertainty in high-energy spectral analysis. For the photon redistribution matrix, we first model each conditional distribution as a normal distribution and then apply PCA to the parameters describing the normal models. This results in an
Reznik, Ed; Chaudhary, Osman; Segrè, Daniel
2013-09-01
The Michaelis-Menten equation for an irreversible enzymatic reaction depends linearly on the enzyme concentration. Even if the enzyme concentration changes in time, this linearity implies that the amount of substrate depleted during a given time interval depends only on the average enzyme concentration. Here, we use a time re-scaling approach to generalize this result to a broad category of multi-reaction systems, whose constituent enzymes have the same dependence on time, e.g. they belong to the same regulon. This "average enzyme principle" provides a natural methodology for jointly studying metabolism and its regulation.
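The "average enzyme principle" can be checked numerically for a single irreversible Michaelis-Menten reaction: two enzyme time-courses with the same time average deplete (essentially) the same amount of substrate, because the time-rescaled dynamics depend only on the integral of E(t). A sketch with assumed rate constants, not values from the paper:

```python
def deplete(E_profile, S0=10.0, kcat=1.0, Km=1.0, dt=1e-4):
    """Forward-Euler integration of dS/dt = -kcat * E(t) * S / (Km + S)."""
    S = S0
    for E in E_profile:
        S -= dt * kcat * E * S / (Km + S)
    return S

n = 20000                                      # total time T = n * dt = 2.0
constant = [2.0] * n                           # E(t) = 2 throughout
ramp = [4.0 * i / (n - 1) for i in range(n)]   # E(t) ramps 0 -> 4, mean 2

# equal average enzyme level -> (nearly) equal substrate depleted
assert abs(deplete(constant) - deplete(ramp)) < 0.05
```

The small residual difference is discretization error of the Euler scheme; the underlying continuous-time result is exact for this single-reaction case.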
Equivalence Principle in Cosmology
NASA Astrophysics Data System (ADS)
Kopeikin, Sergei
2014-01-01
We analyse the Einstein equivalence principle (EEP) for a Hubble observer in Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime. We show that the affine structure of the light cone in the FLRW spacetime should be treated locally in terms of the optical metric gαβ which is not reduced to the Minkowski metric fαβ due to the nonuniform parametrization of the local equations of light propagation with the proper time of the observer's clock. The physical consequence of this difference is that the Doppler shift of radio waves measured locally is affected by the Hubble expansion.
Talus fractures: surgical principles.
Rush, Shannon M; Jennings, Meagan; Hamilton, Graham A
2009-01-01
Surgical treatment of talus fractures can challenge even the most skilled foot and ankle surgeon. Complicated fracture patterns combined with joint dislocation of variable degrees require accurate assessment, a sound understanding of the principles of fracture care, and broad command of the internal fixation techniques needed for successful surgical care. Elimination of unnecessary soft tissue dissection, a low threshold for surgical reduction, liberal use of malleolar osteotomy to expose body fractures, and detailed attention to fracture reduction and joint alignment are critical to the success of treatment. Even with the best surgical care, complications are common and seem to correlate with injury severity and open injuries. PMID:19121756
Bhuvaneswaran, Mohan
2010-01-01
An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result not depend on looks alone. Our ultimate goal as clinicians is to achieve a pleasing composition in the smile by creating an arrangement of various esthetic elements. This article reviews the various principles that govern the art of smile design. The literature search was done using PubMed and Medline. This article provides the reader with the basic knowledge needed to bring out a functional, stable smile. PMID:21217950
Archimedes' Principle in General Coordinates
ERIC Educational Resources Information Center
Ridgely, Charles T.
2010-01-01
Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…
Risk Analysis and Uncertainty: Implications for Counselling
ERIC Educational Resources Information Center
Hassenzahl, David
2004-01-01
Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…
Regarding Uncertainty in Teachers and Teaching
ERIC Educational Resources Information Center
Helsing, Deborah
2007-01-01
The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective…
Numerical approach for quantification of epistemic uncertainty
NASA Astrophysics Data System (ADS)
Jakeman, John; Eldred, Michael; Xiu, Dongbin
2010-06-01
In the field of uncertainty quantification, uncertainty in the governing equations may assume two forms: aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty can be characterised by known probability distributions whilst epistemic uncertainty arises from a lack of knowledge of probabilistic information. While extensive research efforts have been devoted to the numerical treatment of aleatory uncertainty, little attention has been given to the quantification of epistemic uncertainty. In this paper, we propose a numerical framework for quantification of epistemic uncertainty. The proposed methodology does not require any probabilistic information on uncertain input parameters. The method only necessitates an estimate of the range of the uncertain variables that encapsulates the true range of the input variables with overwhelming probability. To quantify the epistemic uncertainty, we solve an encapsulation problem, which is a solution to the original governing equations defined on the estimated range of the input variables. We discuss solution strategies for solving the encapsulation problem and the sufficient conditions under which the numerical solution can serve as a good estimator for capturing the effects of the epistemic uncertainty. In the case where probability distributions of the epistemic variables become known a posteriori, we can use the information to post-process the solution and evaluate solution statistics. Convergence results are also established for such cases, along with strategies for dealing with mixed aleatory and epistemic uncertainty. Several numerical examples are presented to demonstrate the procedure and properties of the proposed methodology.
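The encapsulation idea can be illustrated on a toy governing equation whose epistemic parameter is known only to lie in an interval: solve across the estimated range, report output bounds, and post-process the same solves if a distribution later becomes known. All equations and numbers below are assumptions for the sketch, not the paper's examples:

```python
import math

def decay_solution(lam, u0=1.0, T=1.0):
    """Exact solution of the toy governing equation u' = -lam * u at time T."""
    return u0 * math.exp(-lam * T)

# epistemic input: only the bounding interval [0.5, 2.0] is known
lam_lo, lam_hi = 0.5, 2.0
grid = [lam_lo + (lam_hi - lam_lo) * i / 50 for i in range(51)]

solutions = [decay_solution(lam) for lam in grid]   # the "encapsulation" solves
bounds = (min(solutions), max(solutions))           # epistemic output bounds

# if a distribution becomes known a posteriori (here: uniform on the interval),
# the stored solves are reused to evaluate solution statistics
weights = [1.0 / len(grid)] * len(grid)
mean_u = sum(w * u for w, u in zip(weights, solutions))
assert bounds[0] <= mean_u <= bounds[1]
```

The key property mirrored here is that no probabilistic assumption enters the forward solves; probability is applied only in post-processing.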
Errors and Uncertainty in Physics Measurement.
ERIC Educational Resources Information Center
Blasiak, Wladyslaw
1983-01-01
Classifies errors as either systematic or blunder and uncertainties as either systematic or random. Discusses use of error/uncertainty analysis in direct/indirect measurement, describing the process of planning experiments to ensure lowest possible uncertainty. Also considers appropriate level of error analysis for high school physics students'…
Innovative surgery and the precautionary principle.
Meyerson, Denise
2013-12-01
Surgical innovation involves practices, such as new devices, technologies, procedures, or applications, which are novel and untested. Although innovative practices are believed to offer an improvement on the standard surgical approach, they may prove to be inefficacious or even dangerous. This article considers how surgeons considering innovation should reason in the conditions of uncertainty that characterize innovative surgery. What attitude to the unknown risks of innovative surgery should they take? The answer to this question involves value judgments about the acceptability of risk taking when satisfactory scientific information is not available. This question has been confronted in legal contexts, where risk aversion in the form of the precautionary principle has become increasingly influential as a regulatory response to innovative technologies that pose uncertain future hazards. This article considers whether it is appropriate to apply a precautionary approach when making decisions about innovative surgery.
Principle or constructive relativity
NASA Astrophysics Data System (ADS)
Frisch, Mathias
Appealing to Albert Einstein's distinction between principle and constructive theories, Harvey Brown has argued for an interpretation of the theory of relativity as a dynamic and constructive theory. Brown's view has been challenged by Michel Janssen and in this paper I investigate their dispute. I argue that their disagreement appears larger than it actually is due to the two frameworks used by Brown and Janssen to express their respective views: Brown's appeal to Einstein's principle-constructive distinction and Janssen's framing of the disagreement as one over the question whether relativity provides a kinematic or a dynamic constraint. I appeal to a distinction between types of theories drawn by H. A. Lorentz two decades before Einstein's distinction to argue that Einstein's distinction represents a false dichotomy. I argue further that the disagreement concerning the kinematics-dynamics distinction is a disagreement about labels but not about substance. There remains a genuine disagreement over the explanatory role of spacetime geometry and here I agree with Brown arguing that Janssen sees a pressing need for an explanation of Lorentz invariance where no further explanation is needed.
Principle of relative locality
Amelino-Camelia, Giovanni; Freidel, Laurent; Smolin, Lee; Kowalski-Glikman, Jerzy
2011-10-15
We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.
NASA Astrophysics Data System (ADS)
Ozbek, M. M.; Pinder, G. F.
2006-12-01
There is a growing need in hydrologic and environmental modeling and management to segregate uncertainty, whether it occurs in input parameters or in possible alternative models, into aleatory uncertainty (i.e., irreducible or stochastic) and epistemic uncertainty (i.e., reducible or due to lack of knowledge). While aleatory uncertainty has long been treated as the only source of uncertainty in the hydrologic community, the notion of epistemic uncertainty is relatively new, and it can arise for several reasons, including 1) the field and laboratory methods used in the measurement of parameters, 2) the techniques used to interpolate measured values at selected locations, and, more importantly, 3) subjective expert opinion interpreting data available to augment existing prior parametric information. A natural framework to quantify epistemic uncertainty has been fuzzy set theory. In this paper, we use the extension principle of fuzzy set theory to simulate groundwater flow and transport with fuzzy model parameters. Our novel implementation of the principle involves two major steps: 1) a tessellation of the parameter space that results in simplexes over which the state variable is approximated by means of trial functions, followed by 2) the optimization of degrees of membership for the state variable in each simplex, where the trial functions and the fuzzy parameter membership functions are used as the constraints of the optimization algorithm. We compare our approach to other known approaches to using the extension principle to address groundwater flow and transport in the saturated zone, and highlight features of our approach that apply to any physically based model with fuzzy parameter input.
Medical Need, Equality, and Uncertainty.
Horne, L Chad
2016-10-01
Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. PMID:27196999
Farkas, Zsuzsa; Slate, Andrew; Whitaker, Thomas B; Suszter, Gabriella; Ambrus, Árpád
2015-05-13
The uncertainty of pesticide residue levels in crops due to sampling, estimated for 106 individual crops and 24 crop groups from residue data obtained in supervised trials, was adjusted with a factor of 1.3 to accommodate the larger variability of residues under normal field conditions. Further adjustment may be necessary in the case of mixed lots. The combined uncertainty of residue data, including the contribution of sampling, is used to calculate an action limit that should not be exceeded when compliance with maximum residue limits is certified as part of premarketing self-control programs. In contrast, for testing the compliance of marketed commodities, the residues measured in composite samples should be greater than or equal to a decision limit calculated only from the combined uncertainty of the laboratory phase of the residue determination. Options for minimizing the combined uncertainty of measured residues are discussed. The principles described are also applicable to other chemical contaminants. PMID:25658668
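One plausible reading of the action-limit and decision-limit arithmetic is sketched below. The formulas and every number are assumptions made for illustration; the paper's actual expressions may differ:

```python
mrl = 0.5        # mg/kg, hypothetical maximum residue limit
k = 2.0          # coverage factor (~95 % confidence)
u_total = 0.25   # relative combined uncertainty including sampling (assumed)
u_lab = 0.15     # relative uncertainty of the laboratory phase only (assumed)

# premarket self-control: the measured residue should stay below this,
# so that even allowing for uncertainty the lot is below the MRL
action_limit = mrl * (1.0 - k * u_total)     # 0.25 mg/kg

# market surveillance: non-compliance is declared only if the measured
# residue exceeds the MRL by more than the laboratory uncertainty
decision_limit = mrl * (1.0 + k * u_lab)     # 0.65 mg/kg

assert action_limit < mrl < decision_limit
```

The asymmetry reflects who bears the risk: the producer must stay safely below the MRL, while the enforcement laboratory must be confident a violation is real.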
Variational principles of irreversible processes
NASA Astrophysics Data System (ADS)
Ichiyanagi, Masakazu
1994-07-01
This article reviews developments of variational principles in the study of irreversible processes over the past three decades or so. The variational principles we consider here are related to entropy production. The purpose of this article is to show that we can formulate a variational principle which relates the transport coefficients to the microscopic dynamics of fluctuations. The quantum variational principle restricts the nonequilibrium density matrix to a class conforming to the requirement demanded by the second law of thermodynamics. There are various kinds of variational principles corresponding to the different stages of a macroscopic system; three such stages are known: the dynamical, kinetic, and thermodynamic stages. The relationships among these variational principles are discussed from the point of view of the contraction of information about irrelevant components. Nakano's variational principle has a close similarity to the Lippmann-Schwinger theory of scattering, in which incoming and outgoing disturbances have to be considered in pairs. It is also shown that a variational principle of Onsager's type can be reformulated in the form of Hamilton's principle if the generalization of Hamilton's principle proposed by Djukic and Vujanovic is used. A variational principle in the diagrammatic method, which utilizes the generalized Ward-Takahashi relations, is also reviewed.
Performance of Trajectory Models with Wind Uncertainty
NASA Technical Reports Server (NTRS)
Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.
2009-01-01
Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread among the various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
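A minimal sketch of the time-lagged-ensemble idea: successive forecast cycles valid at the same future time serve as ensemble members, and their spread proxies the wind uncertainty, which then maps into a time-of-arrival uncertainty. All forecast values, the airspeed, and the distance are invented for illustration:

```python
import statistics

# along-track wind forecasts (m/s) from four successive model cycles,
# all valid at the same future time: a time-lagged ensemble
lagged_forecasts = [22.1, 23.4, 20.8, 24.0]
wind_mean = statistics.mean(lagged_forecasts)
wind_spread = statistics.stdev(lagged_forecasts)   # spread as uncertainty proxy

# translate wind uncertainty into trajectory (time-of-arrival) uncertainty
tas = 230.0        # true airspeed, m/s (illustrative)
distance = 500e3   # along-track distance, m
eta = distance / (tas + wind_mean)
eta_lo = distance / (tas + wind_mean + wind_spread)   # stronger tailwind
eta_hi = distance / (tas + wind_mean - wind_spread)   # weaker tailwind
eta_uncertainty = (eta_hi - eta_lo) / 2.0

assert eta_lo < eta < eta_hi
```

Because the members are real forecasts of specific weather scenarios, the spread is situation-dependent, which is the regional-variation property the abstract highlights.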
Leito, Signe; Mölder, Kadi; Künnapas, Allan; Herodes, Koit; Leito, Ivo
2006-07-14
An ISO GUM measurement uncertainty estimation procedure was developed for a liquid-chromatographic drug quality control method: the assay of simvastatin in a drug formulation. In quantifying the uncertainty components, several practical approaches are presented for including difficult-to-estimate uncertainty sources (such as the uncertainty due to peak integration and the uncertainty due to nonlinearity of the calibration curve). A detailed analysis of the contributions of the various uncertainty sources was carried out. The results were calculated based on different definitions of the measurand, and it was demonstrated that an unequivocal definition of the measurand is essential in order to get a rigorous uncertainty estimate. Two different calibration methods, single-point (1P) and five-point (5P), were used and the obtained uncertainties and uncertainty budgets were compared. Results calculated using 1P and 5P calibrations agree very well. The uncertainty estimate for 1P is only slightly larger than with 5P calibration. PMID:16756985
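The root-sum-of-squares combination and budget breakdown behind an ISO GUM estimate can be sketched as follows. The component names and values are assumed for the sketch and are not the paper's figures:

```python
import math

# illustrative relative standard uncertainty components for an HPLC assay
components = {
    "repeatability":            0.0040,
    "peak_integration":         0.0030,
    "calibration_nonlinearity": 0.0025,
    "standard_purity":          0.0020,
}

# combined standard uncertainty: root sum of squares (uncorrelated inputs)
u_c = math.sqrt(sum(u * u for u in components.values()))
U = 2.0 * u_c   # expanded uncertainty, coverage factor k = 2 (~95 %)

# uncertainty budget: fractional contribution of each source to u_c**2
budget = {name: (u / u_c) ** 2 for name, u in components.items()}
assert abs(sum(budget.values()) - 1.0) < 1e-12
```

A budget like this is what makes the comparison of 1P and 5P calibration meaningful: each scheme changes individual component values, and the quadrature sum shows how much that matters overall.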
Haimes, Yacov Y
2012-09-01
This article is grounded on the premise that the complex process of risk assessment, management, and communication, when applied to systems of systems, should be guided by universal systems-based principles. It is written from the perspective of systems engineering with the hope and expectation that the principles introduced here will be supplemented and complemented by principles from the perspectives of other disciplines. Indeed, there is no claim that the following 10 guiding principles constitute a complete set; rather, the intent is to initiate a discussion on this important subject that will incrementally lead us to a more complete set of guiding principles. The 10 principles are as follows: First Principle: Holism is the common denominator that bridges risk analysis and systems engineering. Second Principle: The process of risk modeling, assessment, management, and communication must be systemic and integrated. Third Principle: Models and state variables are central to quantitative risk analysis. Fourth Principle: Multiple models are required to represent the essence of the multiple perspectives of complex systems of systems. Fifth Principle: Meta-modeling and subsystems integration must be derived from the intrinsic states of the system of systems. Sixth Principle: Multiple conflicting and competing objectives are inherent in risk management. Seventh Principle: Risk analysis must account for epistemic and aleatory uncertainties. Eighth Principle: Risk analysis must account for risks of low probability with extreme consequences. Ninth Principle: The time frame is central to quantitative risk analysis. Tenth Principle: Risk analysis must be holistic, adaptive, incremental, and sustainable, and it must be supported with appropriate data collection, metrics with which to measure efficacious progress, and criteria on the basis of which to act. The relevance and efficacy of each guiding principle is demonstrated by applying it to the U.S. Federal Aviation
Principles of vestibular pharmacotherapy.
Chabbert, C
2016-01-01
Ideally, vestibular pharmacotherapy is intended, through specific and targeted molecular actions, to significantly alleviate vertigo symptoms, to protect or repair the vestibular sensory network under pathologic conditions, and to promote vestibular compensation, with the eventual aim of improving the patient's quality of life. In fact, in order to achieve this aim, considerable progress still needs to be made. The lack of information on the etiology of vestibular disorders and the pharmacologic targets to modulate, as well as the technical challenge of targeting a drug to its effective site are some of the main issues yet to be overcome. In this review, my intention is to provide an account of the therapeutic principles that have shaped current vestibular pharmacotherapy and to further explore crucial questions that must be taken into consideration in order to develop targeted and specific pharmacologic therapies for each type and stage of vestibular disorders. PMID:27638072
NASA Astrophysics Data System (ADS)
Barbour, Julian
The definitive ideas that led to the creation of general relativity crystallized in Einstein's thinking during 1912 while he was in Prague. At the centenary meeting held there to mark the breakthrough, I was asked to talk about earlier great work of relevance to dynamics done at Prague, above all by Kepler and Mach. The main topics covered in this chapter are: some little known but basic facts about the planetary motions; the conceptual framework and most important discoveries of Ptolemy and Copernicus; the complete change of concepts that Kepler introduced and their role in his discoveries; the significance of them in Newton's work; Mach's realization that Kepler's conceptual revolution needed further development to free Newton's conceptual world of the last vestiges of the purely geometrical Ptolemaic world view; and the precise formulation of Mach's principle required to place GR correctly in the line of conceptual and technical evolution that began with the ancient Greek astronomers.
System level electrochemical principles
NASA Technical Reports Server (NTRS)
Thaller, L. H.
1985-01-01
The traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has adopted the increased use of electronics and of electrochemical couples that minimize the difficulties associated with the corrective measures needed to reduce cell-to-cell capacity dispersion. Actively cooled bipolar concepts are described which represent some attractive alternative system concepts. They are projected to have higher energy densities and lower volumes than current concepts. They should be easier to scale from one capacity to another and have a closer cell-to-cell capacity balance. These newer storage system concepts are easier to manage since they are designed to be a fully integrated battery. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.
Neuronavigation. Principles. Surgical technique.
Ivanov, Marcel; Ciurea, Alexandru Vlad
2009-01-01
Neuronavigation and stereotaxy are techniques designed to help neurosurgeons precisely localize different intracerebral pathological processes by using a set of preoperative images (CT, MRI, fMRI, PET, SPECT, etc.). The development of computer-assisted surgery became possible only after significant technological progress, especially in the areas of informatics and imaging. The main indications of neuronavigation are the targeting of small and deep intracerebral lesions and choosing the best way to treat them, in order to preserve neurological function. Stereotaxy also allows lesioning or stimulation of the basal ganglia for the treatment of movement disorders. These techniques can bring an important amount of comfort both to the patient and to the neurosurgeon. Neuronavigation was introduced in Romania around 2003, in four neurosurgical centers. We present our five years' experience in neuronavigation and describe the main principles and surgical techniques.
Dynamical principles in neuroscience
Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I.
2006-10-15
Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics, including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?
Fault Management Guiding Principles
NASA Technical Reports Server (NTRS)
Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan
2011-01-01
Regardless of the mission type (deep space or low Earth orbit, robotic or human spaceflight), Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the maturity of FM as an engineering discipline, which lags behind that of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles range from considerations for integrating FM into the project and SE organizational structure, to the relationship between FM designs and mission risk, to the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.
NASA Astrophysics Data System (ADS)
Kheiri, R.
2016-09-01
As an undergraduate exercise, an article (2012 Am. J. Phys. 80 780–14) evaluated quantum and classical uncertainties for dimensionless variables of position and momentum in three potentials: infinite well, bouncing ball, and harmonic oscillator. While the original quantum uncertainty products depend on ℏ and the number of states (n), a dimensionless approach makes the comparison between quantum uncertainty and classical dispersion possible by excluding ℏ. The question is then whether the uncertainty still depends on the quantum number n. The above-mentioned article contains a contrast: on the one hand, the dimensionless quantum uncertainty of the potential box approaches classical dispersion only in the limit of large quantum numbers (n → ∞), consistent with the correspondence principle. On the other hand, similar evaluations for the bouncing ball and harmonic oscillator potentials are equal to their classical counterparts independent of n. This equality may hide the quantum features of low energy levels. In the current study, we change the potential intervals so as to make them symmetric for the linear potential and non-symmetric for the quadratic potential. As a result, we show that the dimensionless quantum uncertainty of these potentials in the new intervals is expressed in terms of the quantum number n; in other words, the uncertainty requires the correspondence principle in order to approach the classical limit. It can therefore be concluded that dimensionless analysis, as a useful pedagogical method, does not in general remove the quantum feature of the n-dependence of the uncertainty. Moreover, our numerical calculations include higher powers of the position for the potentials.
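The n-dependence claimed for the infinite well can be checked numerically. The sketch below (an illustration assuming the standard textbook eigenfunctions, not code from the paper) integrates |ψₙ|² to obtain the dimensionless position uncertainty Δx/L, which approaches the classical uniform-distribution value 1/√12 only as n grows:

```python
import numpy as np

def dimensionless_position_uncertainty(n, num_points=20001):
    """Return Delta(x)/L for the n-th stationary state of an infinite well.

    Analytically, (Delta x / L)^2 = 1/12 - 1/(2 n^2 pi^2): the quantum
    value depends on n and approaches the classical uniform-distribution
    result 1/sqrt(12) only as n -> infinity.
    """
    x = np.linspace(0.0, 1.0, num_points)      # dimensionless position x/L
    prob = 2.0 * np.sin(n * np.pi * x) ** 2    # |psi_n(x)|^2 in units of 1/L
    dx = x[1] - x[0]
    mean = np.sum(x * prob) * dx               # <x/L> (prob vanishes at ends)
    mean_sq = np.sum(x**2 * prob) * dx         # <(x/L)^2>
    return np.sqrt(mean_sq - mean**2)

for n in (1, 2, 10, 100):
    print(n, dimensionless_position_uncertainty(n))
print("classical:", 1.0 / np.sqrt(12.0))
```

Running this shows the ground state well below the classical dispersion, converging toward it as n increases, in line with the correspondence principle.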
Uncertainty in streamflow records - a comparison of multiple estimation methods
NASA Astrophysics Data System (ADS)
Kiang, Julie; Gazoorian, Chris; Mason, Robert; Le Coz, Jerome; Renard, Benjamin; Mansanarez, Valentin; McMillan, Hilary; Westerberg, Ida; Petersen-Øverleir, Asgeir; Reitan, Trond; Sikorska, Anna; Siebert, Jan; Coxon, Gemma; Freer, Jim; Belleville, Arnaud; Hauet, Alexandre
2016-04-01
Stage-discharge rating curves are used to relate streamflow discharge to continuously measured river stage readings in order to create a continuous record of streamflow discharge. The stage-discharge relationship is estimated and refined using discrete streamflow gaugings over time, during which both the discharge and stage are measured. The resulting rating curve has uncertainty due to multiple factors including the curve-fitting process, assumptions on the form of the model used, the changeable nature of natural channels, and the approaches used to extrapolate the rating equation beyond available observations. A number of different methods have been proposed for estimating rating curve uncertainty, differing in mathematical rigour, in the assumptions made about the component errors, and in the information required to implement the method at any given site. This study compares several methods that range from simple LOWESS fits to more complicated Bayesian methods that consider hydraulic principles directly. We evaluate these different methods when applied to a single gauging station using the same information (channel characteristics, hydrographs, and streamflow gaugings). We quantify the resultant spread of the stage-discharge curves and compare the level of uncertainty attributed to the streamflow record by the different methods.
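A minimal illustration of where rating-curve uncertainty comes from: a plain power-law least-squares fit on synthetic gaugings, far simpler than any of the methods compared in the study (the parameter values, noise level, and known cease-to-flow stage h0 are all invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stage-discharge gaugings around a "true" rating Q = a*(h - h0)^b.
a_true, b_true, h0 = 5.0, 1.6, 0.2
stage = np.linspace(0.5, 3.0, 25)
discharge = a_true * (stage - h0) ** b_true * rng.lognormal(0.0, 0.05, stage.size)

# Fit log Q = log a + b log(h - h0) by least squares (h0 assumed known here).
X = np.column_stack([np.ones_like(stage), np.log(stage - h0)])
coef, *_ = np.linalg.lstsq(X, np.log(discharge), rcond=None)
log_a, b = coef
resid = np.log(discharge) - X @ coef
sigma = resid.std(ddof=2)  # log-space scatter -> multiplicative uncertainty

print("a =", np.exp(log_a), "b =", b, "spread factor ~", np.exp(2 * sigma))
```

Even in this idealized setting the fitted parameters and the residual spread carry uncertainty into every discharge derived from the curve; the methods compared in the study differ chiefly in how rigorously they characterize that spread.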
Toward an uncertainty budget for measuring nanoparticles by AFM
NASA Astrophysics Data System (ADS)
Delvallée, A.; Feltin, N.; Ducourtieux, S.; Trabelsi, M.; Hochepied, J. F.
2016-02-01
This article reports on the evaluation of an uncertainty budget associated with the measurement of the mean diameter of a nanoparticle (NP) population by Atomic Force Microscopy. The measurement principle consists in measuring the heights of a population of spherical-like NPs to determine the mean diameter and the size distribution. This method assumes that the NPs are well dispersed on the substrate and sufficiently isolated to avoid measurement errors due to agglomeration. Since the measurement is directly affected by substrate roughness, the NPs were deposited on a mica sheet presenting very low roughness. A complete metrological characterization of the instrument has been carried out and the main error sources have been evaluated. The measuring method was tested on a population of SiO2 NPs. Homemade software was used to build the height distribution histogram, taking into account only isolated NPs. Finally, the uncertainty budget including the main components has been established for the mean diameter measurement of this NP population. The most important components of this budget are the calibration process along the Z-axis, the influence of scanning speed, and the vertical noise level.
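The way such component uncertainties combine can be sketched with the standard root-sum-of-squares rule from the GUM; the component values below are purely illustrative, not those of the study:

```python
import math

# Hypothetical component standard uncertainties (nm) for a mean-diameter
# measurement; the names and values are illustrative, not the study's budget.
components = {
    "z_calibration": 0.8,
    "scan_speed": 0.4,
    "vertical_noise": 0.3,
    "substrate_roughness": 0.2,
}

# Uncorrelated components combine in quadrature (root sum of squares);
# a coverage factor k = 2 then gives roughly 95 % confidence (GUM convention).
u_c = math.sqrt(sum(u**2 for u in components.values()))
U = 2.0 * u_c
print(f"combined u_c = {u_c:.3f} nm, expanded U (k=2) = {U:.3f} nm")
```

Note how the largest component (here the Z-axis calibration) dominates the combined value, which is why the study singles it out.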
Forest management under uncertainty for multiple bird population objectives
Moore, C.T.; Plummer, W.T.; Conroy, M.J.; Ralph, C. John; Rich, Terrell D.
2005-01-01
We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
Scientific basis for the Precautionary Principle
Vineis, Paolo . E-mail: p.vineis@imperial.ac.uk
2005-09-01
The Precautionary Principle is based on two general criteria: (a) appropriate public action should be taken in response to limited, but plausible and credible, evidence of likely and substantial harm; (b) the burden of proof is shifted from demonstrating the presence of risk to demonstrating the absence of risk. Not much has been written about the scientific basis of the precautionary principle, apart from the uncertainty that characterizes epidemiologic research on chronic disease and the use of surrogate evidence when human evidence cannot be provided. This paper proposes that a new scientific paradigm, based on the theory of evolution, is emerging; this might offer stronger support for the need for precaution in the regulation of environmental risks. Environmental hazards do not consist only of direct attacks on the integrity of DNA or other macromolecules. They can consist of changes that take place in utero and condition disease risks many years later. Environmental exposures can also act as 'stressors', inducing hypermutability (the mutator phenotype) as an adaptive response. Finally, environmental changes should be evaluated against the background of a not-so-easily modifiable genetic make-up, inherited from a period in which humans were mainly hunter-gatherers and had dietary habits very different from current ones.
On the Impact of Uncertainty in Initial Conditions of Hydrologic Models on Prediction
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.
2015-12-01
Determining the initial conditions for predictive models remains a challenge due to the uncertainty in measurement/identification of the state variables at the scale of interest. However, the characterization of uncertainty in initial conditions has arguably attracted less attention compared with other sources of uncertainty in hydrologic modelling (e.g., parameter, data, and structural uncertainty). This is perhaps because it is commonly believed that: (1) hydrologic systems (relatively rapidly) forget their initial conditions over time, and (2) other sources of uncertainty (e.g., in data) are dominant. This presentation revisits the basic principles of the theory of nonlinear dynamical systems in the context of hydrologic systems. Through simple example case studies, we demonstrate how and under what circumstances different hydrologic processes represent a range of attracting limit sets in their evolution trajectory in state space over time, including fixed points, limit cycles (periodic behaviour), torus (quasi-periodic behaviour), and strange attractors (chaotic behaviour). Furthermore, the propagation (or dissipation) of uncertainty in initial conditions of several hydrologic models through time, under any of the possible attracting limit sets, is investigated. This study highlights that there are definite situations in hydrology where uncertainty in initial conditions remains of significance. The results and insights gained have important implications for hydrologic modelling under non-stationarity in climate and environment.
Sousa, Marcelo R; Frind, Emil O; Rudolph, David L
2013-05-01
Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible.
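The precautionary merge of scenario-specific capture probability plumes amounts to taking, cell by cell, the highest capture probability assigned by any scenario. A toy sketch (the array values and the 50 % delineation threshold are invented for illustration):

```python
import numpy as np

# Two hypothetical capture-probability plumes on the same grid, from two
# recharge scenarios that calibrate equally well; values are illustrative.
plume_a = np.array([[0.9, 0.5, 0.1],
                    [0.7, 0.3, 0.0]])
plume_b = np.array([[0.6, 0.8, 0.4],
                    [0.2, 0.6, 0.1]])

# Precautionary merge: protect each cell at the *highest* capture
# probability any plausible scenario assigns to it.
merged = np.maximum(plume_a, plume_b)
protection_zone = merged >= 0.5   # e.g. delineate where P(capture) >= 50 %
print(merged)
print(int(protection_zone.sum()), "cells in the protection zone")
```

Because the per-cell maximum can only grow as scenarios are added, the merged zone is conservative by construction, which is the point of applying the precautionary principle here.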
Application of fuzzy system theory in addressing the presence of uncertainties
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.
2015-02-03
In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Accounting for uncertainty is necessary to prevent material failure in engineering. There are three types of uncertainty: stochastic, epistemic, and error uncertainty. In this paper, epistemic uncertainty is considered; it arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are scarce. Fuzzy system theory comprises a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping, which establishes the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle. In the final stage, defuzzification is implemented, an important process that converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
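The extension-principle integration of fuzzy inputs can be sketched via α-cuts: each α-level of a fuzzy input is an interval, and the image of that interval under the response function gives the corresponding α-cut of the fuzzy output. A minimal illustration with an invented triangular input and x² standing in for a finite-element response quantity:

```python
import numpy as np

def tri_alpha_cut(a, m, b, alpha):
    """alpha-cut [lo, hi] of a triangular fuzzy number (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

def extend(f, cut, alphas):
    """Zadeh's extension principle for a continuous f via alpha-cuts:
    sample f over each input cut and take the min/max of the image."""
    out = []
    for alpha in alphas:
        lo, hi = cut(alpha)
        xs = np.linspace(lo, hi, 201)
        ys = f(xs)
        out.append((alpha, ys.min(), ys.max()))
    return out

# Hypothetical fuzzy input: a load factor "about 2" -> triangular (1.5, 2.0, 2.5);
# f(x) = x**2 stands in for a finite-element output of interest.
alphas = np.linspace(0.0, 1.0, 5)
cuts = extend(lambda x: x**2, lambda al: tri_alpha_cut(1.5, 2.0, 2.5, al), alphas)
for alpha, lo, hi in cuts:
    print(f"alpha={alpha:.2f}: [{lo:.3f}, {hi:.3f}]")
```

The α = 0 cut gives the widest (most conservative) output interval, while the α = 1 cut collapses to the crisp result; defuzzification would then reduce the stacked intervals to a single crisp output.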
Genetics and psychiatry: a proposal for the application of the precautionary principle.
Porteri, Corinna
2013-08-01
The paper suggests an application of the precautionary principle to the use of genetics in psychiatry, focusing on scientific uncertainty. Different levels of uncertainty are taken into consideration, from the acknowledgement that the genetic paradigm is only one of the possible ways to explain psychiatric disorders, via the difficulties related to the diagnostic path and genetic methods, to the value of the results of studies carried out in this field. Considering those uncertainties, some measures for the use of genetics in psychiatry are suggested. Some of those measures are related to the conceptual limits of the genetic paradigm; others are related to present knowledge and should be re-evaluated.
Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma
2009-10-01
Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
Uncertainty in gridded CO2 emissions estimates
NASA Astrophysics Data System (ADS)
Hogue, Susannah; Marland, Eric; Andres, Robert J.; Marland, Gregg; Woodard, Dawn
2016-05-01
We are interested in the spatial distribution of fossil-fuel-related emissions of CO2 for both geochemical and geopolitical reasons, but it is important to understand the uncertainty that exists in spatially explicit emissions estimates. Working from one of the widely used gridded data sets of CO2 emissions, we examine the elements of uncertainty, focusing on gridded data for the United States at the scale of 1° latitude by 1° longitude. Uncertainty is introduced in the magnitude of total United States emissions, the magnitude and location of large point sources, the magnitude and distribution of non-point sources, and from the use of proxy data to characterize emissions. For the United States, we develop estimates of the contribution of each component of uncertainty. At 1° resolution, in most grid cells, the largest contribution to uncertainty comes from how well the distribution of the proxy (in this case population density) represents the distribution of emissions. In other grid cells, the magnitude and location of large point sources make the major contribution to uncertainty. Uncertainty in population density can be important where a large gradient in population density occurs near a grid cell boundary. Uncertainty is strongly scale-dependent with uncertainty increasing as grid size decreases. Uncertainty for our data set with 1° grid cells for the United States is typically on the order of ±150%, but this is perhaps not excessive in a data set where emissions per grid cell vary over 8 orders of magnitude.
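How proxy and total-magnitude uncertainties combine in a gridded product can be sketched with a toy Monte Carlo (a 4 × 4 grid with invented numbers, not the data set analyzed in the abstract):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 4x4 "country": distribute a national total over grid cells by a
# population proxy, then propagate uncertainty in the total and in the
# proxy by Monte Carlo. All magnitudes are illustrative.
national_total = 100.0
population = rng.uniform(1, 10, (4, 4))

def gridded(total, pop):
    return total * pop / pop.sum()

draws = []
for _ in range(2000):
    total = national_total * rng.normal(1.0, 0.05)      # ~5 % total uncertainty
    pop = population * rng.lognormal(0.0, 0.2, (4, 4))  # proxy-quality noise
    draws.append(gridded(total, pop))
draws = np.array(draws)

rel_sd = draws.std(axis=0) / draws.mean(axis=0)
print("per-cell relative uncertainty: %.0f%%-%.0f%%"
      % (100 * rel_sd.min(), 100 * rel_sd.max()))
```

Even in this toy, the proxy noise dominates the per-cell uncertainty while the national total stays comparatively well constrained, mirroring the abstract's finding that proxy quality is the leading contributor in most cells.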
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
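The propagation of individual measurement uncertainties through a defining functional expression is, to first order, a root sum of squares over sensitivity coefficients. A generic sketch for uncorrelated inputs (the dynamic-pressure example and its numbers are illustrative, not from the paper):

```python
import math

def propagate(f, values, sigmas, eps=1e-6):
    """First-order (root-sum-of-squares) propagation of uncorrelated
    standard uncertainties through f, with partial derivatives taken
    by central finite differences."""
    var = 0.0
    for i, (v, s) in enumerate(zip(values, sigmas)):
        up = list(values); up[i] = v + eps
        dn = list(values); dn[i] = v - eps
        dfdx = (f(*up) - f(*dn)) / (2 * eps)   # sensitivity coefficient
        var += (dfdx * s) ** 2
    return math.sqrt(var)

# Example: dynamic pressure q = 0.5 * rho * V**2 (illustrative quantities).
q = lambda rho, V: 0.5 * rho * V**2
u_q = propagate(q, [1.225, 50.0], [0.01, 0.5])
print("q =", q(1.225, 50.0), "+/-", u_q)
```

The same pattern extends to correlated inputs by adding covariance cross terms, which is where the paper's treatment of correlated precision error enters.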
Induction of models under uncertainty
NASA Technical Reports Server (NTRS)
Cheeseman, Peter
1986-01-01
This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; particularly, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one aimed at keeping the theory structure as simple as possible, while still reflecting any structure discovered in the data.
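The core decision between alternative hypotheses is a sequential application of Bayes' theorem. A toy sketch with two invented class models of a binary feature (the probabilities are illustrative, not from the paper):

```python
# Bayes' theorem deciding between two hypothetical class models of a
# binary feature: H1 says P(feature) = 0.8, H2 says P(feature) = 0.3.
def bayes_update(prior_h1, likelihood_h1, likelihood_h2):
    num = prior_h1 * likelihood_h1
    return num / (num + (1 - prior_h1) * likelihood_h2)

p_h1 = 0.5  # a simple, uncommitted prior over the two theories
observations = [1, 1, 0, 1, 1]          # illustrative gathered data
for obs in observations:
    l1 = 0.8 if obs else 0.2
    l2 = 0.3 if obs else 0.7
    p_h1 = bayes_update(p_h1, l1, l2)
print("P(H1 | data) =", round(p_h1, 4))
```

The resulting posterior is itself probabilistic, as the abstract notes: no observation ever drives belief to exactly 0 or 1, so the induced class definitions have no sharp membership criterion.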
Measuring the uncertainty of coupling
NASA Astrophysics Data System (ADS)
Zhao, Xiaojun; Shang, Pengjian
2015-06-01
A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.
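Coupling entropy is permutation-based; the ordinal-pattern machinery it builds on can be sketched with plain permutation entropy (a simplified relative of the proposed measure, not the paper's estimator):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized Shannon entropy of ordinal patterns of the given order:
    the building block of permutation-based measures such as coupling
    entropy (this sketch is symmetric, unlike the paper's measure)."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))

print(permutation_entropy(list(range(50))))    # monotone: a single pattern
print(permutation_entropy([0, 2, 1, 3] * 20))  # periodic: four patterns recur
```

An asymmetric association measure such as coupling entropy compares how the ordinal structure of one series constrains that of the other in each direction, which is what lets it identify the driving versus responding variable.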
Dopamine, uncertainty and TD learning
Niv, Yael; Duff, Michael O; Dayan, Peter
2005-01-01
Substantial evidence suggests that the phasic activities of dopaminergic neurons in the primate midbrain represent a temporal difference (TD) error in predictions of future reward, with increases above and decreases below baseline consequent on positive and negative prediction errors, respectively. However, dopamine cells have very low baseline activity, which implies that the representation of these two sorts of error is asymmetric. We explore the implications of this seemingly innocuous asymmetry for the interpretation of dopaminergic firing patterns in experiments with probabilistic rewards which bring about persistent prediction errors. In particular, we show that when averaging the non-stationary prediction errors across trials, a ramping in the activity of the dopamine neurons should be apparent, whose magnitude is dependent on the learning rate. This exact phenomenon was observed in a recent experiment, though being interpreted there in antipodal terms as a within-trial encoding of uncertainty. PMID:15953384
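The effect of the asymmetry can be sketched in a toy TD(0) simulation: with probabilistic reward the signed prediction errors average to zero, but compressing negative errors (the 1/6 scale below is an invented stand-in for the low firing baseline) leaves a positive trial-averaged signal:

```python
import random

random.seed(0)

# TD(0) learning of the value of a cue that pays reward 1 with p = 0.5.
# Signed errors average to ~0 once learned, but a low-baseline neuron
# that compresses negative errors yields a positive trial-averaged response.
alpha, v = 0.1, 0.0
represented = []
for _ in range(5000):
    r = 1.0 if random.random() < 0.5 else 0.0
    delta = r - v                     # TD prediction error
    v += alpha * delta                # value update
    represented.append(delta if delta > 0 else delta / 6.0)

print("learned value ~", round(v, 2))
print("mean represented error:", sum(represented) / len(represented))
```

The positive mean of the represented errors, despite a near-zero mean of the signed errors, is the averaging artifact the authors argue was misread as a within-trial uncertainty code.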
Uncertainties drive arsenic rule delay
Pontius, F.W.
1995-04-01
The US Environmental Protection Agency (USEPA) is under court order to sign a proposed rule for arsenic by Nov. 30, 1995. The agency recently announced that it will not meet this deadline, citing the need to gather additional information. Development of a National Interim Primary Drinking Water Regulation for arsenic has been delayed several times over the past 10 years because of uncertainties regarding health issues and costs associated with compliance. The early history of development of the arsenic rule has been reviewed. Only recent developments are reviewed here. The current maximum contaminant level (MCL) for arsenic in drinking water is 0.05 mg/L. This MCL was set in 1975, based on the 1962 US Public Health Standards. The current Safe Drinking Water Act (SDWA) requires that the revised arsenic MCL be set as close to the MCL goal (MCLG) as is feasible using best technology, treatment techniques, or other means and taking cost into consideration.
Archimedes' principle in general coordinates
NASA Astrophysics Data System (ADS)
Ridgely, Charles T.
2010-05-01
Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is applied in Schwarzschild coordinates and in rotating coordinates. Using Schwarzschild coordinates for the case of a spherical mass suspended within a perfect fluid leads to the familiar expression of Archimedes' principle. Using rotating coordinates produces an expression for a centrifugal buoyancy force that agrees with accepted theory. It is then argued that Archimedes' principle ought to be applicable to non-gravitational phenomena, as well. Conservation of the energy-momentum tensor is then applied to electromagnetic phenomena. It is shown that a charged body submerged in a charged medium experiences a buoyancy force in accordance with an electromagnetic analogue of Archimedes' principle.
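In the flat-space, uniform-field limit the general-coordinate derivation reduces to the familiar statement (the symbols below are the usual textbook ones, not the paper's notation):

```latex
% Buoyant force = weight of displaced fluid (flat-space limit):
F_b = \rho_{\mathrm{fluid}}\, V\, g
% Net force on a fully submerged body of density \rho_{\mathrm{body}}:
F_{\mathrm{net}} = \left(\rho_{\mathrm{body}} - \rho_{\mathrm{fluid}}\right) V g
```

The Schwarzschild-coordinate calculation in the paper recovers exactly this expression in the weak-field limit, while rotating coordinates replace g with the centrifugal acceleration.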
Concepts and Practice of Verification, Validation, and Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Oberkampf, W. L.
2014-12-01
Verification and validation (V&V) are the primary means to assess the numerical and physics modeling accuracy, respectively, in computational simulation. Code verification assesses the reliability of the software coding and the numerical algorithms used in obtaining a solution, while solution verification addresses numerical error estimation of the computational solution of a mathematical model for a specified set of initial and boundary conditions. Validation assesses the accuracy of the mathematical model as compared to experimentally measured response quantities of the system being modeled. As these experimental data are typically available only for simplified subsystems or components of the system, model validation commonly provides limited ability to assess model accuracy directly. Uncertainty quantification (UQ), specifically in regard to predictive capability of a mathematical model, attempts to characterize and estimate the total uncertainty for conditions where no experimental data are available. Specific sources of uncertainty that can impact the total predictive uncertainty are: the assumptions and approximations in the formulation of the mathematical model, the error incurred in the numerical solution of the discretized model, the information available for stochastic input data for the system, and the extrapolation of the mathematical model to conditions where no experimental data are available. This presentation will briefly discuss the principles and practices of VVUQ from the perspective of computational modeling and simulation, as well as the difficult issue of estimating predictive capability. Contrasts will be drawn between weak and strong code verification testing, and model validation as opposed to model calibration. Closing remarks will address what needs to be done to improve the value of information generated by computational simulation for improved decision-making.
A bayesian foundation for individual learning under uncertainty.
Mathys, Christoph; Daunizeau, Jean; Friston, Karl J; Stephan, Klaas E
2011-01-01
Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory.
Solving navigational uncertainty using grid cells on robots.
Milford, Michael J; Wiles, Janet; Wyeth, Gordon F
2010-01-01
To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our
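The maintenance of multiple pose estimates described above can be sketched as a minimal particle-style filter. This is not RatSLAM itself; it is an illustration, under assumed numbers, of how competing location hypotheses coexist and how repeated ambiguous observations resolve the correct one over time.

```python
# Hedged sketch: weighted pose hypotheses under perceptual ambiguity.

def update_hypotheses(hypotheses, observation_likelihood):
    """Reweight pose hypotheses by how well each explains the observation,
    then renormalize so the weights stay a probability distribution."""
    weighted = [(pose, w * observation_likelihood(pose))
                for pose, w in hypotheses]
    total = sum(w for _, w in weighted)
    return [(pose, w / total) for pose, w in weighted]

# Two competing location hypotheses; the cue is ambiguous but slightly
# favours pose "A", so repeated observations resolve the ambiguity.
hyps = [("A", 0.5), ("B", 0.5)]
likelihood = lambda pose: 0.6 if pose == "A" else 0.4
for _ in range(5):
    hyps = update_hypotheses(hyps, likelihood)
best = max(hyps, key=lambda pw: pw[1])[0]
```

No single observation uniquely identifies the pose, yet the population of weights converges, which mirrors the filtering role the paper attributes to conjunctive grid cells.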
Magnetism: Principles and Applications
NASA Astrophysics Data System (ADS)
Craik, Derek J.
2003-09-01
If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.
Bateman's principle and immunity.
Rolff, Jens
2002-01-01
The immunocompetence handicap hypothesis (ICHH) of Folstad and Karter has inspired a large number of studies that have tried to understand the causal basis of parasite-mediated sexual selection. Even though this hypothesis is based on the double function of testosterone, a hormone restricted to vertebrates, studies of invertebrates have tended to provide central support for specific predictions of the ICHH. I propose an alternative hypothesis that explains many of the findings without relying on testosterone or other biochemical feedback loops. This alternative is based on Bateman's principle, that males gain fitness by increasing their mating success whilst females increase fitness through longevity because their reproductive effort is much higher. Consequently, I predict that females should invest more in immunity than males. The extent of this dimorphism is determined by the mating system and the genetic correlation between males and females in immune traits. In support of my arguments, I mainly use studies on insects that share innate immunity with vertebrates and have the advantage that they are easier to study. PMID:11958720
Principles of alternative gerontology
Bilinski, Tomasz; Bylak, Aneta; Zadrag-Tecza, Renata
2016-01-01
Surveys of taxonomic groups of animals have shown that, contrary to the opinion of most gerontologists, aging is not a genuine trait. The process of aging is not universal and its mechanisms have not been widely conserved among species. All life forms are subject to extrinsic and intrinsic destructive forces. Destructive effects of stochastic events are visible only when allowed by the specific life program of an organism. Effective life programs of immortality and high longevity eliminate the impact of unavoidable damage. Organisms that are capable of agametic reproduction are biologically immortal. Mortality of an organism is clearly associated with terminal specialisation in sexual reproduction. A longevity phenotype unaccompanied by symptoms of senescence has been observed in those groups of animals that continue to increase their body size after reaching sexual maturity, a result of the enormous regeneration abilities of both of the above-mentioned groups. Senescence is observed when: (i) an organism switches off, as part of its developmental program, the expression of existing growth and regeneration programs, as in the case of imago formation in insect development; or (ii) particular programs of growth and regeneration of progenitors are irreversibly lost, either partially or in their entirety, as in mammals and birds. "We can't solve problems by using the same kind of thinking we used when we created them." (Ascribed to Albert Einstein) PMID:27017907
Great Lakes Literacy Principles
NASA Astrophysics Data System (ADS)
Fortner, Rosanne W.; Manzo, Lyndsey
2011-03-01
Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.
Principles of electroanatomic mapping.
Bhakta, Deepak; Miller, John M
2008-01-01
Electrophysiologic testing and radiofrequency ablation have evolved as curative measures for a variety of rhythm disturbances. As experience in this field has grown, ablation is progressively being used to address more complex rhythm disturbances. Paralleling this trend are technological advancements to facilitate these efforts, including electroanatomic mapping (EAM). At present, several different EAM systems utilizing various technologies are available to facilitate mapping and ablation. Use of these systems has been shown to reduce fluoroscopic exposure and radiation dose, with less significant effects on procedural duration and success rates. Among the data provided by EAM are chamber reconstruction, tagging of important anatomic landmarks and ablation lesions, display of diagnostic and mapping catheters without using fluoroscopy, activation mapping, and voltage (or scar) mapping. Several EAM systems have specialized features, such as enhanced ability to map non-sustained or hemodynamically unstable arrhythmias, ability to display diagnostic as well as mapping catheter positions, and wide compatibility with a variety of catheters. Each EAM system has its strengths and weaknesses, and the system chosen must depend upon what data are required for procedural success (activation mapping, substrate mapping, cardiac geometry), the anticipated arrhythmia, the compatibility of the system with adjunctive tools (i.e., diagnostic and ablation catheters), and the operator's familiarity with the selected system. While EAM systems can offer significant assistance during an EP procedure, their incorrect or inappropriate application can substantially hamper mapping efforts and procedural success; they should not replace careful interpretation of data and strict adherence to electrophysiologic principles.
The Principle of General Tovariance
NASA Astrophysics Data System (ADS)
Heunen, C.; Landsman, N. P.; Spitters, B.
2008-06-01
We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's interpretation of quantum mechanics, i.e. his doctrine of classical concepts and his principle of complementarity. An appropriate mathematical language for combining these ideas is topos theory, a framework earlier proposed for physics by Isham and collaborators. Our principle of general tovariance states that any mathematical structure appearing in the laws of physics must be definable in an arbitrary topos (with natural numbers object) and must be preserved under so-called geometric morphisms. This principle identifies geometric logic as the mathematical language of physics and restricts the constructions and theorems to those valid in intuitionism: neither Aristotle's principle of the excluded third nor Zermelo's Axiom of Choice may be invoked. Subsequently, our equivalence principle states that any algebra of observables (initially defined in the topos Sets) is empirically equivalent to a commutative one in some other topos.
Evacuation decision-making: process and uncertainty
Mileti, D.; Sorensen, J.; Bogard, W.
1985-09-01
The purpose was to describe the process of evacuation decision-making, identify and document uncertainties in that process, and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified, concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions, and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are, first, that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a "precautionary" evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs.
Uncertainty in tsunami sediment transport modeling
Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.
2016-01-01
Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.
Incorporating Forecast Uncertainty in Utility Control Center
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2014-07-09
Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).
Capturing the uncertainty in adversary attack simulations.
Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce
2008-09-01
This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of PI (the probability of interruption), thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays, and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated-variable dependence in the equation for PI.
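The numerical sampling idea described above can be sketched as a Monte Carlo estimate: detection probabilities and delays are drawn from assumed distributions, and the spread of the resulting PI values expresses the aleatory uncertainty. The two-layer model, all distributions, and the response time below are illustrative assumptions, not the paper's actual security model.

```python
# Hedged sketch: Monte Carlo sampling of PI (probability of interruption)
# under aleatory uncertainty in detection probabilities and time delays.
import random

def sample_PI(n_samples=10_000, seed=1):
    rng = random.Random(seed)
    pis = []
    for _ in range(n_samples):
        # Aleatory draws: detection probability and delay for two layers.
        p_detect = [rng.uniform(0.6, 0.9), rng.uniform(0.5, 0.8)]
        delay = [rng.gauss(30, 5), rng.gauss(20, 4)]  # seconds
        response_time = 45.0
        # Interruption requires detection at some layer while enough
        # delay remains for the response force to arrive.
        pi, miss, remaining = 0.0, 1.0, sum(delay)
        for p, d in zip(p_detect, delay):
            if remaining > response_time:
                pi += miss * p          # first detection at this layer
            miss *= (1 - p)
            remaining -= d
        pis.append(min(pi, 1.0))
    mean = sum(pis) / len(pis)
    return mean, pis

mean_PI, samples = sample_PI()
```

Drawing all random inputs per sample rather than per use is what handles the repeated-variable dependence the abstract mentions: a delay that appears twice in the PI expression gets the same sampled value both times.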
Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip
2015-04-15
Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
Bayesian Uncertainty Analyses Via Deterministic Model
NASA Astrophysics Data System (ADS)
Krzysztofowicz, R.
2001-05-01
Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
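The Bayesian Processor of Output (BPO) described above can be illustrated in its simplest form. Assuming a Gaussian prior for the predictand and a Gaussian error model for the deterministic output (an assumption made here for tractability, not stated in the abstract), the posterior follows in closed form; all numbers below are illustrative.

```python
# Hedged sketch: a Gaussian-Gaussian Bayesian Processor of Output. The
# deterministic model's output is treated as a noisy estimate of the
# predictand, and the posterior combines it with the prior.

def bpo_posterior(prior_mean, prior_var, model_output, model_error_var):
    """Posterior mean and variance of the predictand given one model output."""
    k = prior_var / (prior_var + model_error_var)   # Kalman-style gain
    post_mean = prior_mean + k * (model_output - prior_mean)
    post_var = (1 - k) * prior_var                  # always below prior_var
    return post_mean, post_var

# Climatological prior N(10, 9); the model outputs 14 with error variance 3.
m, v = bpo_posterior(10.0, 9.0, 14.0, 3.0)
```

The posterior variance quantifies the total uncertainty that remains after conditioning on the model output; an accurate model (small error variance) pulls the posterior strongly toward its estimate.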
Visual Semiotics & Uncertainty Visualization: An Empirical Study.
MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M
2012-12-01
This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.
Collaborative framework for PIV uncertainty quantification: the experimental database
NASA Astrophysics Data System (ADS)
Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio
2015-07-01
The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for
Gröning, Manfred
2011-10-15
The calibration of all δ(2)H and δ(18)O measurements on the VSMOW/SLAP scale should be performed consistently, based on similar principles, independent of the instrumentation used. The basic principles of a comprehensive calibration strategy are discussed taking water as example. The most common raw data corrections for memory and drift effects are described. Those corrections result in a considerable improvement in data consistency, especially in laboratories analyzing samples of quite variable isotopic composition (e.g. doubly labelled water). The need for a reliable uncertainty assessment for all measurements is discussed and an easy implementation method proposed. A versatile evaluation method based on Excel macros and spreadsheets is presented. It corrects measured raw data for memory and drift effects, performs the calibration and calculates the combined standard uncertainty for each measurement. It allows the easy implementation of the discussed principles in any user laboratory. Following these principles will improve the comparability of data among laboratories. PMID:21913248
Neural coding of uncertainty and probability.
Ma, Wei Ji; Jazayeri, Mehrdad
2014-01-01
Organisms must act in the face of sensory, motor, and reward uncertainty stemming from a pandemonium of stochasticity and missing information. In many tasks, organisms can make better decisions if they have at their disposal a representation of the uncertainty associated with task-relevant variables. We formalize this problem using Bayesian decision theory and review recent behavioral and neural evidence that the brain may use knowledge of uncertainty, confidence, and probability. PMID:25032495
Assessing uncertainty in stormwater quality modelling.
Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha
2016-10-15
Designing effective stormwater pollution mitigation strategies is a challenge in urban stormwater management. This is primarily due to the limited reliability of catchment scale stormwater quality modelling tools. As such, assessing the uncertainty associated with the information generated by stormwater quality models is important for informed decision making. Quantitative assessment of build-up and wash-off process uncertainty, which arises from the variability associated with these processes, is a major concern as typical uncertainty assessment approaches do not adequately account for process uncertainty. The research study undertaken found that the variability of build-up and wash-off processes for different particle size ranges leads to processes uncertainty. After variability and resulting process uncertainties are accurately characterised, they can be incorporated into catchment stormwater quality predictions. Accounting of process uncertainty influences the uncertainty limits associated with predicted stormwater quality. The impact of build-up process uncertainty on stormwater quality predictions is greater than that of wash-off process uncertainty. Accordingly, decision making should facilitate the designing of mitigation strategies which specifically addresses variations in load and composition of pollutants accumulated during dry weather periods. Moreover, the study outcomes found that the influence of process uncertainty is different for stormwater quality predictions corresponding to storm events with different intensity, duration and runoff volume generated. These storm events were also found to be significantly different in terms of the Runoff-Catchment Area ratio. As such, the selection of storm events in the context of designing stormwater pollution mitigation strategies needs to take into consideration not only the storm event characteristics, but also the influence of process uncertainty on stormwater quality predictions. PMID:27423532
Whitepaper on Uncertainty Quantification for MPACT
Williams, Mark L.
2015-12-17
The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However, MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.
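The stochastic-sampling UQ approach mentioned above can be sketched generically: perturb uncertain inputs, rerun the deterministic calculation, and summarize the output spread. The toy model below is a stand-in for a real MPACT calculation, and all distributions are illustrative assumptions.

```python
# Hedged sketch: stochastic-sampling uncertainty quantification around a
# deterministic model. The "cross sections" and the model are toys.
import random

def deterministic_model(sigma_a, sigma_f):
    """Toy stand-in for a high-fidelity calculation: an eigenvalue-like
    ratio of neutron production to absorption."""
    return 2.4 * sigma_f / sigma_a

def sample_uq(n=5000, seed=7):
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        # Assumed input uncertainties: 2% and 3% relative standard deviation.
        sigma_a = rng.gauss(1.00, 0.02)
        sigma_f = rng.gauss(0.42, 0.42 * 0.03)
        outputs.append(deterministic_model(sigma_a, sigma_f))
    mean = sum(outputs) / n
    var = sum((x - mean) ** 2 for x in outputs) / (n - 1)
    return mean, var ** 0.5

mean_k, std_k = sample_uq()
```

The appeal of sampling-based UQ for a code like MPACT is that the deterministic solver is treated as a black box; the cost is one full calculation per sample, which is why sample sizes are a key design choice.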
[The beginning of the first principles: the anthropic principle].
González de Posada, Francisco
2004-01-01
The now-classical Anthropic Principle is placed both in the historical perspective of the traditional problem of "the place of man in the Universe" and at the confluence of several scientific "border" issues, some of which, due to their problematic nature, are also subjects of philosophical analysis. On the one hand, the scientific uses of the Principle, related to the initial and constitutional conditions of "our Universe", are enumerated, as they are supposedly necessary for the appearance and subsequent development of Life, up to Man. On the other, an organized collection of the principles of today's Physics is synthetically exhibited. The object of this work is to determine the intrinsic scientific nature of the Anthropic Principle, and the role it plays in the global frame of the principles of Physics (Astrophysics, Astrobiology and Cosmology).
Some Aspects of uncertainty in computational fluid dynamics results
NASA Technical Reports Server (NTRS)
Mehta, U. B.
1991-01-01
Uncertainties are inherent in computational fluid dynamics (CFD). These uncertainties need to be systematically addressed and managed. Sources of these uncertainties are discussed. Some recommendations are made for the quantification of CFD uncertainties. A practical method of uncertainty analysis is based on sensitivity analysis. When CFD is used to design fluid dynamic systems, sensitivity-uncertainty analysis is essential.
Uncertainty component evaluation in conventional microbiological qualitative measurements.
Ka, Charlotte Chan Tak
2011-01-01
To couple method performance and QA in microbiological testing, uncertainty profiles have been developed according to relevant LODs and their confidence intervals. The percentage probability of failure is proposed to express this uncertainty. The variance of the analysis is divided into four categories: uncertainty originating from the sample, uncertainty originating from variations in procedure, uncertainty originating from the measurement system, and uncertainty originating in repeatability/reproducibility.
One-parameter class of uncertainty relations based on entropy power.
Jizba, Petr; Ma, Yue; Hayes, Anthony; Dunningham, Jacob A
2016-06-01
We use the concept of entropy power to derive a one-parameter class of information-theoretic uncertainty relations for pairs of conjugate observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order statistics uncertainty relations, which allows one in principle to determine the shape of the underlying information-distribution function by measuring the relevant entropy powers. We illustrate the capability of this class by discussing two examples: superpositions of vacuum and squeezed states and the Cauchy-type heavy-tailed wave function. PMID:27415188
NASA Astrophysics Data System (ADS)
Latimer, D. C.
2007-06-01
In Phys. Rev. A 70, 032104 (2004), M. Montesinos and G. F. Torres del Castillo consider various symplectic structures on the classical phase-space of the two-dimensional isotropic harmonic oscillator. Using Dirac’s quantization condition, the authors investigate how these alternative symplectic forms affect this system’s quantization. They claim that these symplectic structures result in mutually inequivalent quantum theories. In fact, we show here that there exists a unitary map between the two representation spaces so that the various quantizations are equivalent.
Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.
ERIC Educational Resources Information Center
Wassermann, Selma
2001-01-01
Argues that reliance on the outcomes of quantitative standardized tests to assess student performance is a misplaced quest for certainty in an uncertain world. Reviews and lauds a Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill. (PKP)
NASA Technical Reports Server (NTRS)
Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro
1993-01-01
A review of current efforts to approach and surpass the fundamental limit in the sensitivity of Weber-type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences with respect to the framework of squeezed states in quantum optics are discussed.
Heisenberg's uncertainty principle for simultaneous measurement of positive-operator-valued measures
NASA Astrophysics Data System (ADS)
Miyadera, Takayuki; Imai, Hideki
2008-11-01
A limitation on the simultaneous measurement of two arbitrary positive-operator-valued measures is discussed. In general, simultaneous measurement of two noncommutative observables is only approximately possible. Following Werner's formulation, we introduce a distance between observables to quantify the accuracy of measurement. We derive an inequality that relates the achievable accuracy to the noncommutativity of the two observables. As a byproduct, a necessary condition for two positive-operator-valued measures to be simultaneously measurable is obtained.
Does a String-Particle Dualism Indicate the Uncertainty Principle's Philosophical Dichotomy?
NASA Astrophysics Data System (ADS)
Mc Leod, David; Mc Leod, Roger
2007-04-01
String theory may allow resonances of neutrino-wave-strings to account for all experimentally detected phenomena. Particle theory logically, and physically, provides an alternate, contradictory dualism. Is it contradictory to symbolically and simultaneously state that λp = h, but that the product of position and momentum must be greater than, or equal to, the same (scaled) Planck's constant? Our previous electron and positron models require "membrane" vibrations of string-linked neutrinos, in closed loops, to behave like traveling waves, TWs, intermittently metamorphosing into alternately ascending and descending standing waves, SWs, between the nodes, which advance sequentially through 360 degrees. Accumulated time passages as TWs detail the required "loop currents" supplying magnetic moments. The remaining time partitions into the SWs' alternately ascending and descending phases: the physical basis of the experimentally established 3D modes of these "particles." Waves seem to indicate that point mass cannot be required to exist instantaneously at one point; Mott's and Sneddon's Wave Mechanics says that a constant, [mass], is present. String-like resonances may also account for homeopathy's efficacy, dark matter, and constellations' "stick-figure projections," as indicated by some traditional cultures, all possibly involving neutrino strings. To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2007.NES07.C2.5
Responding to uncertainty in nursing practice.
Thompson, C; Dowding, D
2001-10-01
Uncertainty is a fact of life for practising clinicians and cannot be avoided. This paper outlines the model of uncertainty presented by Katz (1988, Cambridge University Press, Cambridge, UK. pp. 544-565) and examines the descriptive and normative power of three broad theoretical and strategic approaches to dealing with uncertainty: rationality, bounded rationality and intuition. It concludes that nursing research and development (R&D) must acknowledge uncertainty more fully in its R&D agenda and that good-quality evaluation studies which directly compare intuitive with rational-analytical approaches for given clinical problems should be a dominant feature of future R&D.
Modeling uncertainty: quicksand for water temperature modeling
Bartholow, John M.
2003-01-01
Uncertainty has been a hot topic relative to science generally, and to modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.
Harper, F.T.; Young, M.L.; Miller, L.A.
1995-01-01
Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from the experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.
2012-04-15
Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors in the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution that is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of the dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric with respect to variations of the coefficients. It was evaluated on patient data deformed according to a breathing model, where the ground truth of the deformation, and hence the actual dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. The method was tested for dose mapping, but it may be applied to other mapping tasks as well.
Principles of ecosystem sustainability
Chapin, F.S. III; Torn, M.S.; Tateno, Masaki
1996-12-01
Many natural ecosystems are self-sustaining, maintaining a characteristic mosaic of vegetation types for hundreds to thousands of years. In this article we present a new framework for defining the conditions that sustain natural ecosystems and apply these principles to the sustainability of managed ecosystems. A sustainable ecosystem is one that, over the normal cycle of disturbance events, maintains its characteristic diversity of major functional groups, productivity, and rates of biogeochemical cycling. These traits are determined by a set of four "interactive controls" (climate, soil resource supply, major functional groups of organisms, and disturbance regime) that both govern and respond to ecosystem processes. Ecosystems cannot be sustained unless the interactive controls oscillate within stable bounds. This occurs when negative feedbacks constrain changes in these controls. For example, negative feedbacks associated with food availability and predation often constrain changes in the population size of a species. Linkages among ecosystems in a landscape can contribute to sustainability by creating or extending the feedback network beyond a single patch. The sustainability of managed systems can be increased by maintaining interactive controls so that they form negative feedbacks within ecosystems and by using laws and regulations to create negative feedbacks between ecosystems and human activities, such as between ocean ecosystems and marine fisheries. Degraded ecosystems can be restored through practices that enhance positive feedbacks to bring the ecosystem to a state where the interactive controls are commensurate with the desired ecosystem characteristics. The possible combinations of interactive controls that govern ecosystem traits are limited by the environment, constraining the extent to which ecosystems can be managed sustainably for human purposes. 111 refs., 3 figs., 2 tabs.
Zepp, Fred
2016-01-01
immunotherapies to tackle diseases such as cancer, Alzheimer's disease, and autoimmune disease. This chapter gives an overview of the key considerations and processes involved in vaccine development. It also describes the basic principles of normal immune responses and their function in defense against infectious agents by vaccination.
Hydrotectonics; principles and relevance
Kopf, R.W.
1982-01-01
Hydrotectonics combines the principles of hydraulics and rock mechanics. The hypothesis assumes that: (1) no faults are truly planar, (2) opposing noncongruent wavy wallrock surfaces form chambers and bottlenecks along the fault, and (3) most thrusting occurs beneath the water table. These physical constraints permit the following dynamics. Shear displacement accompanying faulting must constantly change the volume of each chamber. Addition of ground water liquefies dry fault breccia to a heavy incompressible viscous muddy breccia I call fault slurry. When the volume of a chamber along a thrust fault decreases faster than its fault slurry can escape laterally, overpressurized slurry is hydraulically injected into the base of near-vertical fractures in the otherwise impervious overriding plate. Breccia pipes commonly form where such fissures intersect. Alternating decrease and increase in volume of the chamber subjects this injection slurry to reversible surges that not only raft and abrade huge clasts sporadically spalled from the walls of the conduit but also act as a forceful hydraulic ram which periodically widens the conduit and extends its top. If the pipe perforates a petroleum reservoir, leaking hydrocarbons float to its top. Sudden faulting may generate a powerful water hammer that can be amplified at some distal narrow ends of the anastomosing plumbing system, where the shock may produce shatter cones. If vented on the Earth's surface, the muddy breccia, now called extrusion slurry, forms a mud volcano. This hypothesis suggests that many highly disturbed features presently attributed to such catastrophic processes as subsurface explosions or meteorite impacts are due to the rheology of tectonic slurry in an intermittently reactivated pressure-relief tube rooted in a powerful reciprocating hydrotectonic pump activated by a long-lived deep-seated thrust fault.
ERIC Educational Resources Information Center
Ouellette, John
2004-01-01
Soccer coaches must understand the principles of play if they want to succeed. The principles of play are the rules of action that support the basic objectives of soccer and the foundation of a soccer coaching strategy. They serve as a set of permanent criteria that coaches can use to evaluate the efforts of their team. In this article, the author…
Ideario Educativo (Principles of Education).
ERIC Educational Resources Information Center
Consejo Nacional Tecnico de la Educacion (Mexico).
This document is an English-language abstract (approximately 1,500 words) which discusses an overall educational policy for Mexico based on Constitutional principles and those of humanism. The basic principles that should guide Mexican education as seen by the National Technical Council for Education are the following: (1) love of country; (2)…
Multimedia Principle in Teaching Lessons
ERIC Educational Resources Information Center
Kari Jabbour, Khayrazad
2012-01-01
Multimedia learning principle occurs when we create mental representations from combining text and relevant graphics into lessons. This article discusses the learning advantages that result from adding multimedia learning principle into instructions; and how to select graphics that support learning. There is a balance that instructional designers…
Meaty Principles for Environmental Educators.
ERIC Educational Resources Information Center
Rockcastle, V. N.
1985-01-01
Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)
Hamilton's principle in stochastic mechanics
NASA Astrophysics Data System (ADS)
Pavon, Michele
1995-12-01
In this paper we establish three variational principles that provide new foundations for Nelson's stochastic mechanics in the case of nonrelativistic particles without spin. The resulting variational picture is much richer and of a different nature with respect to the one previously considered in the literature. We first develop two stochastic variational principles whose Hamilton-Jacobi-like equations are precisely the two coupled partial differential equations that are obtained from the Schrödinger equation (Madelung equations). The two problems are zero-sum, noncooperative, stochastic differential games that are familiar in the control theory literature. They are solved here by means of a new, absolutely elementary method based on Lagrange functionals. For both games the saddle-point equilibrium solution is given by Nelson's process, and the optimal controls for the two competing players are precisely Nelson's current velocity v and osmotic velocity u, respectively. The first variational principle includes as special cases both the Guerra-Morato variational principle [Phys. Rev. D 27, 1774 (1983)] and Schrödinger's original variational derivation of the time-independent equation. It also reduces to the classical least action principle when the intensity of the underlying noise tends to zero. It appears as a saddle-point action principle. In the second variational principle the action is simply the difference between the initial and final configurational entropy. It is therefore a saddle-point entropy production principle. From the variational principles it follows, in particular, that both v(x,t) and u(x,t) are gradients of appropriate principal functions. In the variational principles, the role of the background noise has the intuitive meaning of attempting to contrast the more classical mechanical features of the system by trying to maximize the action in the first principle and by trying to increase the entropy in the second. Combining the two variational
Quantum principles and free particles. [evaluation of partitions
NASA Technical Reports Server (NTRS)
1976-01-01
The quantum principles that establish the energy levels and degeneracies needed to evaluate the partition functions are explored. The uncertainty principle is associated with the dual wave-particle nature of the model used to describe quantized gas particles. The Schroedinger wave equation is presented as a generalization of Maxwell's wave equation; the former applies to all particles while the Maxwell equation applies to the special case of photon particles. The size of the quantum cell in phase space and the representation of momentum as a space derivative operator follow from the uncertainty principle. A consequence of this is that steady-state problems that are space-time dependent for the classical model become only space dependent for the quantum model and are often easier to solve. The partition function is derived for quantized free particles and, at normal conditions, the result is the same as that given by the classical phase integral. The quantum corrections that occur at very low temperatures or high densities are derived. These corrections for the Einstein-Bose gas qualitatively describe the condensation effects that occur in liquid helium, but are unimportant for most practical purposes otherwise. However, the corrections for the Fermi-Dirac gas are important because they quantitatively describe the behavior of high-density conduction electron gases in metals and explain the zero point energy and low specific heat exhibited in this case.
Precautionary principle in international law.
Saladin, C
2000-01-01
The deregulatory nature of trade rules frequently brings them into conflict with the precautionary principle. These rules dominate debate over the content and legal status of the precautionary principle at the international level. The World Trade Organization (WTO), because of its power in settling disputes, is a key player. Many States are concerned to define the precautionary principle consistent with WTO rules, which generally means defining it as simply a component of risk analysis. At the same time, many States, especially environmental and public health policymakers, see the principle as the legal basis for preserving domestic and public health measures in the face of deregulatory pressures from the WTO. The precautionary principle has begun to acquire greater content and to move into the operative articles of legally binding international agreements. It is important to continue this trend.
Uncertainty reasoning in expert systems
NASA Technical Reports Server (NTRS)
Kreinovich, Vladik
1993-01-01
Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
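The three design choices enumerated in the abstract (a representation for words like 'small' and 'big', operations for 'and' and 'or', and a method to turn fuzzy recommendations into a precise control) can be illustrated with a toy sketch. The membership functions, thresholds, and braking rule below are hypothetical, and min is used for 'and' as one common choice among the alternatives the abstract says must be compared:

```python
def mu_big(velocity):
    # Membership of "velocity is big": a ramp from 0 at v=10 to 1 at v=30.
    return max(0.0, min(1.0, (velocity - 10.0) / 20.0))

def mu_small(distance):
    # Membership of "distance is small": a ramp from 1 at d=0 to 0 at d=5.
    return max(0.0, min(1.0, (5.0 - distance) / 5.0))

def brake_command(velocity, distance):
    # Rule: IF velocity is big AND distance is small THEN brake hard.
    # 'and' represented as min (one common t-norm choice).
    firing = min(mu_big(velocity), mu_small(distance))
    # Defuzzification here is trivial: the firing strength directly
    # scales the braking effort between 0 (none) and 1 (full).
    return firing

print(brake_command(30.0, 1.0))   # fast and close: strong braking (0.8)
print(brake_command(5.0, 10.0))   # slow and far: no braking (0.0)
```

Different choices of membership shapes or t-norms (e.g. product instead of min) would change the control surface, which is exactly the optimization problem the abstract describes.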
Communicating Storm Surge Forecast Uncertainty
NASA Astrophysics Data System (ADS)
Troutman, J. A.; Rhome, J.
2015-12-01
When it comes to tropical cyclones, storm surge is often the greatest threat to life and property along the coastal United States. The coastal population density has dramatically increased over the past 20 years, putting more people at risk. Informing emergency managers, decision-makers and the public about the potential for wind-driven storm surge, however, has been extremely difficult. Recently, the Storm Surge Unit at the National Hurricane Center in Miami, Florida has developed a prototype experimental storm surge watch/warning graphic to help communicate this threat more effectively by identifying areas most at risk for life-threatening storm surge. This prototype is the initial step in the transition toward a NWS storm surge watch/warning system and highlights the inundation levels that have a 10% chance of being exceeded. The guidance for this product is the Probabilistic Hurricane Storm Surge (P-Surge) model, which predicts the probability of various storm surge heights by statistically evaluating numerous SLOSH model simulations. Questions remain, however, whether exceedance values in addition to the 10% may be of equal importance to forecasters. P-Surge data from Hurricane Arthur (2014) is used to ascertain the practicality of incorporating other exceedance data into storm surge forecasts. Extracting forecast uncertainty information by analyzing P-Surge exceedances overlaid with track and wind intensity forecasts proves to be beneficial for forecasters and decision support.
Sensitivity and Uncertainty Analysis Shell
1999-04-20
SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some process for which input is uncertain, and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which inputs to the process model are to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample, and the user-supplied process model analyses the sample. The SUNS post-processor displays statistical results from any existing file that contains sampled input and output values.
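The loose-coupling workflow described above (sample the uncertain inputs, run the process model on each sample, post-process the outputs) can be sketched in a few lines. The process model and the input distributions below are hypothetical stand-ins, not part of SUNS itself:

```python
import random
import statistics

def process_model(x, y):
    # Hypothetical user-supplied process model of two uncertain inputs.
    return x ** 2 + 3.0 * y

def monte_carlo(n_samples=10000, seed=42):
    # Generate the statistical sample (the role SUNS plays), run the
    # process model on each draw, and summarize the output.
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        x = rng.gauss(1.0, 0.1)      # assumed Normal(1.0, 0.1) input
        y = rng.uniform(0.0, 2.0)    # assumed Uniform(0, 2) input
        outputs.append(process_model(x, y))
    return statistics.mean(outputs), statistics.stdev(outputs)

mean, sd = monte_carlo()
print(f"output mean = {mean:.3f}, output std dev = {sd:.3f}")
```

With these assumed distributions the analytic answer is a mean near 4.01 and a standard deviation near 1.74, so the sketch also shows how a Monte Carlo estimate can be sanity-checked against closed-form moments when they exist.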
Strong majorization entropic uncertainty relations
NASA Astrophysics Data System (ADS)
Rudnicki, Łukasz; Puchała, Zbigniew; Życzkowski, Karol
2014-05-01
We analyze entropic uncertainty relations in a finite-dimensional Hilbert space and derive several strong bounds for the sum of two entropies obtained in projective measurements with respect to any two orthogonal bases. We improve the recent bounds by Coles and Piani [P. Coles and M. Piani, Phys. Rev. A 89, 022112 (2014), 10.1103/PhysRevA.89.022112], which are known to be stronger than the well-known result of Maassen and Uffink [H. Maassen and J. B. M. Uffink, Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103]. Furthermore, we find a bound based on majorization techniques, which also happens to be stronger than the recent results involving the largest singular values of submatrices of the unitary matrix connecting both bases. The first set of bounds gives better results for unitary matrices close to the Fourier matrix, while the second one provides a significant improvement in the opposite sectors. Some results derived admit generalization to arbitrary mixed states, so that corresponding bounds are increased by the von Neumann entropy of the measured state. The majorization approach is finally extended to the case of several measurements.
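The Maassen-Uffink bound cited above, H(p) + H(q) >= -2 ln c where c is the largest overlap between vectors of the two bases, can be checked numerically for a random state and a random pair of orthonormal bases. The dimension, seed, and use of a QR decomposition to draw a random unitary are arbitrary illustrative choices:

```python
import numpy as np

def shannon(p):
    # Shannon entropy in nats, ignoring numerically zero probabilities.
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
d = 4

# Random unitary connecting the two measurement bases (QR of a complex
# Gaussian matrix); its columns are the second basis in the first basis.
z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
u, _ = np.linalg.qr(z)

# Random pure state.
psi = rng.normal(size=d) + 1j * rng.normal(size=d)
psi /= np.linalg.norm(psi)

p = np.abs(psi) ** 2                # outcome probabilities, first basis
q = np.abs(u.conj().T @ psi) ** 2   # outcome probabilities, second basis

# Maassen-Uffink: H(p) + H(q) >= -2 ln(max_ij |<a_i|b_j>|).
mu_bound = -2.0 * np.log(np.max(np.abs(u)))
print(shannon(p) + shannon(q), ">=", mu_bound)
```

Repeating this with many random states never violates the bound, while bases close to mutually unbiased (overlaps near 1/sqrt(d)) push the bound toward its maximum value ln d.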
NASA Astrophysics Data System (ADS)
Bellac, Michel Le
2014-11-01
At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle subjected to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces into mechanics a quantity characteristic of optics, the velocity of light. However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception
Andersen, Jens E T; Mikolajczak, Maria; Wojtachnio-Zawada, Katarzyna Olga; Nicolajsen, Henrik Vigan
2012-11-01
A principle of quality assurance for ion chromatography (IC) is presented. Since the majority of scientists and customers are interested in the determination of the true amount of analyte in real samples, the focus of attention should be directed towards the concept of accuracy rather than towards precision. By exploiting the principle of pooled calibrations and the retention of all outliers, it was possible to obtain full correspondence between calibration uncertainty and repetition uncertainty, which for the first time evidences statistical control in experiments with ion chromatography. Bromide anions were analysed and the results were subjected to quality assurance (QA). It was found that the limit of quantification (LOQ) was significantly underestimated, by up to a factor of 30, with respect to the determination of unknown concentrations. The concepts of lower limit of analysis (LLA) and upper limit of analysis (ULA) were found to provide more acceptable limits for reliable analysis with a limited number of repetitions. An excellent correspondence was found between calibration uncertainty and repetition uncertainty. These findings comply with earlier investigations of method validation, where it was found that the principle of pooled calibrations provides a more realistic picture of analytical performance, with the drawback, however, that generally higher levels of uncertainty must be accepted compared with contemporary literature values. The implications for analytical chemistry in general, and for method validation in particular, are discussed.
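The comparison at the heart of this abstract can be sketched numerically. The following is a minimal illustration (not the authors' code, and using entirely synthetic data with assumed slope, intercept, and noise values): several calibration runs are pooled into a single least-squares fit with all points retained, the calibration uncertainty is derived from the pooled residuals, and it is compared with the repetition uncertainty of replicate determinations of an unknown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical calibration runs pooled together:
# concentration (mg/L) vs. detector response (peak area, arbitrary units).
conc = np.tile([0.5, 1.0, 2.0, 4.0, 8.0], 3)
area = 12.0 * conc + 1.5 + rng.normal(0.0, 0.8, conc.size)  # synthetic data

# Pooled calibration: one regression over all runs, outliers retained.
slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
s_y = np.sqrt(np.sum(resid**2) / (conc.size - 2))  # residual std deviation
s_cal = s_y / slope  # calibration uncertainty in concentration units

# Repetition uncertainty: std deviation of replicate determinations
# of an unknown sample (here simulated at 3.0 mg/L).
replicate_areas = 12.0 * 3.0 + 1.5 + rng.normal(0.0, 0.8, 10)
replicate_conc = (replicate_areas - intercept) / slope
s_rep = replicate_conc.std(ddof=1)

print(f"calibration uncertainty: {s_cal:.3f} mg/L")
print(f"repetition uncertainty:  {s_rep:.3f} mg/L")
```

Under statistical control, as the abstract reports, the two uncertainty estimates should agree; a large gap between them would suggest that the calibration understates the uncertainty of real determinations.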
Spiritual uncertainty: exemplars of 2 hospice patients.
Stephenson, Pamela Shockey
2014-01-01
Spirituality is important to persons approaching the end of life. The ambiguous nature of dying and spirituality creates many opportunities for uncertainty. This article presents 2 exemplars from hospice patients about the different ways that spiritual uncertainty affected their dying experience. PMID:24919092
Worry, Intolerance of Uncertainty, and Statistics Anxiety
ERIC Educational Resources Information Center
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…