Note: This page contains sample records for the topic uncertainty principle from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: November 12, 2013.
1

Heisenberg's uncertainty principle  

Microsoft Academic Search

Heisenberg's uncertainty principle is usually taken to express a limitation of operational possibilities imposed by quantum mechanics. Here we demonstrate that the full content of this principle also includes its positive role as a condition ensuring that mutually exclusive experimental options can be reconciled if an appropriate trade-off is accepted. The uncertainty principle is shown to appear in three manifestations,

Paul Busch; Teiko Heinonen; Pekka Lahti

2007-01-01

2

Uncertainty Principles in Hilbert Spaces  

Microsoft Academic Search

In this article we provide several generalizations of inequalities bounding the commutator of two linear operators acting on a Hilbert space which relate to the Heisenberg uncertainty principle and time/frequency analysis of periodic functions. We develop conditions that ensure these inequalities are sharp and apply our results to concrete examples of importance in the literature.
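
As background for the kind of inequality being generalized here, the standard commutator bound (Robertson form) can be written in LaTeX as

    \sigma_A \, \sigma_B \;\ge\; \tfrac{1}{2}\,\bigl| \langle \psi, [A,B]\,\psi \rangle \bigr|

for self-adjoint operators A, B and a unit vector \psi in the Hilbert space; the special case [x,p] = i\hbar recovers \Delta x\,\Delta p \ge \hbar/2. The sharpened inequalities of the article refine bounds of this type in the time/frequency setting.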

Say Song Goh; Charles A. Micchelli

2002-01-01

3

The uncertainty principle: A mathematical survey  

Microsoft Academic Search

We survey various mathematical aspects of the uncertainty principle, including Heisenberg's inequality and its variants, local uncertainty inequalities, logarithmic uncertainty inequalities, results relating to Wigner distributions, qualitative uncertainty principles, theorems on approximate concentration, and decompositions of phase space.

Gerald B. Folland; Alladi Sitaram

1997-01-01

4

Uncertainty principle in human visual perception  

NASA Astrophysics Data System (ADS)

The orthodox data concerning contrast sensitivity estimation for sine-wave gratings were formally analyzed. Our analysis yielded a threshold energy value ΔE, an energetic equivalent of a quantum of perception, given by ΔE = αΔLΔX², where α is a proportionality coefficient, ΔL is the threshold luminance, and ΔX is the half-period of the grating. The value of ΔE is constant for a given mean luminance L of the grating and for the middle spatial frequency region. Thus an 'exchange' between the luminance threshold ΔL and the spatial resolution term ΔX² takes place: an increase in one is accompanied by a decrease in the other. We treat this phenomenon as an uncertainty principle in human visual perception and verify its correctness for other spatial frequencies. By taking into account threshold wavelength (Δλ) and time (Δt), the uncertainty principle may be extended to a wider class of visual perception problems, including the recognition of color and flickering objects. We therefore suggest that the uncertainty principle proposed above is one of the cornerstones of the evolution of cognitive systems.
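
A minimal numerical sketch of the luminance/resolution trade-off implied by ΔE = αΔLΔX²; the values of α, ΔE and the half-periods below are placeholders for illustration, not data from the study:

    # Illustrative sketch of the threshold-energy invariant dE = alpha * dL * dX**2.
    # alpha, dE and the half-periods are assumed placeholder values, not data from the study.
    alpha = 1.0        # proportionality coefficient (arbitrary units)
    delta_E = 0.05     # assumed constant "quantum of perception" at fixed mean luminance

    for delta_X in (0.5, 1.0, 2.0):                   # half-period of the grating
        delta_L = delta_E / (alpha * delta_X ** 2)    # threshold luminance implied by the invariant
        print(f"half-period {delta_X:3.1f} -> threshold luminance {delta_L:.4f}")
    # A coarser grating (larger delta_X) comes with a lower luminance threshold and vice
    # versa: the 'exchange' between delta_L and delta_X**2 described in the abstract.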

Trifonov, Mikhael I.; Ugolev, Dmitry A.

1994-05-01

5

Black hole thermodynamics with generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

We apply the generalized uncertainty principle (GUP) to the thermodynamics of a small black hole. Here we have a black hole system with a UV cutoff. It is shown that the minimal length induced by the GUP interrupts the Gross-Perry-Yaffe phase transition for a small black hole. In order to see whether the black hole remnant undergoes a transition to a large black hole, we introduce a black hole in a cavity (IR system). However, we fail to show the phase transition of the remnant to the large black hole.
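
A hedged sketch of the standard heuristic behind GUP-corrected black hole temperatures, which underlies results of this kind; the quadratic GUP form, the identification Δx ≈ 2M and the calibration used below are common assumptions in the literature, not necessarily this paper's conventions:

    import math

    # Heuristic GUP-corrected Hawking temperature in Planck units (hbar = c = G = k_B = l_p = 1).
    # Assumptions (not from this paper): quadratic GUP  dx*dp >= (1/2)*(1 + beta*dp**2),
    # position uncertainty dx ~ 2M (the horizon radius), and the calibration T = dp/(2*pi),
    # chosen so that beta -> 0 reproduces the Hawking temperature 1/(8*pi*M).

    def hawking_T(M):
        return 1.0 / (8.0 * math.pi * M)               # standard Hawking temperature

    def gup_T(M, beta):
        dx = 2.0 * M                                   # horizon-scale position uncertainty
        if dx * dx < beta:
            raise ValueError("below the minimal-length remnant mass")
        dp = (dx - math.sqrt(dx * dx - beta)) / beta   # smaller root saturating the GUP
        return dp / (2.0 * math.pi)

    for M in (1.0, 2.0, 10.0):
        print(f"M = {M:5.1f}:  T_Hawking = {hawking_T(M):.5f},  T_GUP = {gup_T(M, beta=0.5):.5f}")
    # As M decreases toward sqrt(beta)/2 the GUP temperature stays finite, which is the
    # heuristic origin of the black-hole remnant behaviour discussed in the abstract.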

Myung, Yun Soo; Kim, Yong-Wan; Park, Young-Jai

2007-02-01

6

Illinois PER Interactive Examples: Uncertainty Principle I  

NSDL National Science Digital Library

This interactive homework problem shows some particles passing through a single slit of known width. After the particles pass through the slit they spread out over a range of angles. The user is asked to use the Heisenberg uncertainty principle to determine the minimum range of angles. The problem is accompanied by a Socratic-dialog "help" sequence designed to encourage critical thinking as users do a guided conceptual analysis before attempting the mathematics. Immediate feedback is provided for both correct and incorrect responses. This item is part of a larger collection of interactive problems developed by the Illinois Physics Education Research Group.
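
A minimal worked estimate of the kind the problem asks for, using Δy·Δp_y ≥ ħ/2 with the slit width as Δy; the particle type, speed and slit width below are assumed for illustration and are not taken from the problem itself:

    import math

    hbar = 1.054_571_8e-34     # J s
    m_e  = 9.109_383_7e-31     # kg -- the particles are assumed to be electrons here

    def min_half_angle(slit_width_m, speed_m_s):
        """Minimum angular half-spread from dy * dp_y >= hbar/2, taking dy = slit width."""
        p = m_e * speed_m_s                    # forward momentum (non-relativistic)
        dp_y = hbar / (2.0 * slit_width_m)     # minimum transverse momentum uncertainty
        return math.atan(dp_y / p)             # half-angle of the spread, in radians

    theta = min_half_angle(slit_width_m=1e-9, speed_m_s=1e6)
    print(f"minimum half-angle ~ {math.degrees(theta):.2f} deg "
          f"(full angular range ~ {2 * math.degrees(theta):.2f} deg)")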

Gladding, Gary

2008-07-20

7

Open Timelike Curves Violate Heisenberg's Uncertainty Principle  

NASA Astrophysics Data System (ADS)

Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg’s uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.

Pienaar, J. L.; Ralph, T. C.; Myers, C. R.

2013-02-01

8

Single-Slit Diffraction and the Uncertainty Principle  

NASA Astrophysics Data System (ADS)

This short article emphasizes that a consideration of diffraction phenomena is an excellent way to illustrate the uncertainty principle. Or, it could also be claimed that diffraction has a simple quantum mechanical interpretation based on the uncertainty principle and the Fourier transform between position and momentum space.

Rioux, Frank

2005-08-01

9

An Entropic Uncertainty Principle for Positive Operator Valued Measures  

NASA Astrophysics Data System (ADS)

Extending a recent result by Frank and Lieb, we show an entropic uncertainty principle for mixed states in a Hilbert space, relatively to pairs of positive operator valued measures that are independent in some sense. This yields spatial-spectral uncertainty principles and log-Sobolev inequalities for invariant operators on homogeneous spaces, which are sharp in the compact case.

Rumin, Michel

2012-06-01

10

Gauss Linking Number and Electromagnetic Uncertainty Principle  

Microsoft Academic Search

It is shown that there is a precise sense in which the Heisenberg uncertainty between fluxes of electric and magnetic fields through finite surfaces is given by (one-half ħ times) the Gauss linking number of the loops that bound these surfaces. To regularize the relevant operators, one is naturally led to assign a framing to each loop. The uncertainty between the

Abhay Ashtekar; Alejandro Corichi

1997-01-01

11

DISTRIBUTION FUNCTIONS IN LIGHT OF THE UNCERTAINTY PRINCIPLE  

Microsoft Academic Search

Here, we have considered the Husimi and Wigner distribution functions. We have, in particular, shown how the uncertainty principle works in the two cases of simple and damped harmonic oscillators when either of these two distributions is used. The conclusion shows that the Husimi distribution function remains non-negative throughout phase space but does not always satisfy the uncertainty

S. NASIRI

12

Risks, scientific uncertainty and the approach of applying precautionary principle.  

PubMed

The paper clarifies the nature and aspects of risk and scientific uncertainty and elaborates an approach for applying the precautionary principle to handle risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both in situations where an international treaty has adopted the precautionary principle and in situations where no international treaty adopts the principle or enumerates the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the content of measures to cope with potential risk while avoiding excessive measures. PMID:19705643

Lo, Chang-fa

2009-03-01

13

The Generalized Uncertainty Principle and the Friedmann equations  

NASA Astrophysics Data System (ADS)

The Generalized Uncertainty Principle (GUP) affects dynamics at the Planck scale, so the known equations of physics are expected to be modified in that very-high-energy regime. Recently, Ali et al. (Phys. Lett. B 678:497, 2009) proposed a new GUP with a term linear in the Planck length. In this article, the proposed GUP is expressed in a more general form and its effect on the Friedmann equations of the FRW universe is studied. Along the way, the known entropy-area relation acquires new correction terms, the leading-order term being proportional to √(Area).
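
For reference, the linear GUP of Ali et al. referred to here is commonly quoted in the following form; this is a sketch from the broader GUP literature, and sign and factor conventions vary, so treat the coefficients as assumptions rather than this paper's definitions:

    [x_i, p_j] = i\hbar \left[ \delta_{ij}
        - \alpha \left( p\,\delta_{ij} + \frac{p_i p_j}{p} \right)
        + \alpha^2 \left( p^2 \delta_{ij} + 3\, p_i p_j \right) \right],
    \qquad \alpha = \frac{\alpha_0\,\ell_{\mathrm{Pl}}}{\hbar},

so the leading correction is linear in the Planck length, consistent with the linear term mentioned in the abstract.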

Majumder, Barun

2011-12-01

14

The Uncertainty Principle, Virtual Particles and Real Forces  

ERIC Educational Resources Information Center

This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…

Jones, Goronwy Tudor

2002-01-01

15

Generalized uncertainty principle and the Ramsauer-Townsend effect  

NASA Astrophysics Data System (ADS)

The scattering cross section of electrons in noble gas atoms exhibits a minimum value at electron energies of approximately 1 eV. This is the Ramsauer-Townsend effect. In this letter, we study the Ramsauer-Townsend effect in the framework of the Generalized Uncertainty Principle.

Vahedi, Javad; Nozari, Kourosh; Pedram, Pouria

2012-07-01

16

Gauge theories under incorporation of a generalized uncertainty principle  

SciTech Connect

An extension of gauge theories is considered under the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be an important consideration.

Kober, Martin [Frankfurt Institute for Advanced Studies (FIAS), Institut fuer Theoretische Physik, Johann Wolfgang Goethe-Universitaet, Ruth-Moufang-Strasse 1, 60438 Frankfurt am Main (Germany)

2010-10-15

17

Gauge theories under incorporation of a generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

An extension of gauge theories is considered under the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be an important consideration.

Kober, Martin

2010-10-01

18

Quantum imaging, quantum lithography and the uncertainty principle  

NASA Astrophysics Data System (ADS)

One of the most surprising consequences of quantum mechanics is the entanglement of two or more distant particles. The ghost image experiment demonstrated the astonishing nonlocal behavior of an entangled photon pair. Even though we still have questions regarding fundamental issues of entangled quantum systems, quantum entanglement has started to play important roles in practical applications. Quantum lithography is one of the hot topics. We have recently demonstrated a proof-of-principle quantum lithography experiment. The experimental results have beaten the classical diffraction limit by a factor of two. This is a quantum mechanical two-photon phenomenon but not a violation of the uncertainty principle.

Shih, Y.

2003-05-01

19

Quantum black hole in the generalized uncertainty principle framework  

SciTech Connect

In this paper we study the effects of the generalized uncertainty principle (GUP) on the canonical quantum gravity of black holes. Through the use of a modified partition function that incorporates the effects of the GUP, we obtain the thermodynamical properties of the Schwarzschild black hole. We also calculate the Hawking temperature and entropy for the modification of the Schwarzschild black hole in the presence of the GUP.

Bina, A.; Moslehi, A. [Department of Physics, Faculty of Science, Arak University, Arak 879 (Iran, Islamic Republic of); Jalalzadeh, S. [Department of Physics, Shahid Beheshti University G.C., Evin, Tehran 19839 (Iran, Islamic Republic of); Research Institute for Astronomy and Astrophysics of Maragha (RIAAM) Maragha, Iran, P. O. Box: 55134-441 (Iran, Islamic Republic of)

2010-01-15

20

Nuclear resonance photon scattering and the uncertainty principle  

NASA Astrophysics Data System (ADS)

Published data on nuclear resonance photon scattering (NRPS), in the MeV range, from metallic targets were analysed to get the mean-square zero-point linear momenta of the scattering atoms. It is shown that when these are combined with known experimental values of at 0 K, the theoretical lower limit of the Uncertainty Principle may be approached to within a few percent.

Moreh, Raymond

1994-06-01

21

Effects of the modified uncertainty principle on the inflation parameters  

NASA Astrophysics Data System (ADS)

In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [7] on the inflationary dynamics of the early universe in both the standard and Randall-Sundrum type II scenarios. We find that the quantum gravitational effect increases the amplitude of the density fluctuation, which is oscillatory in nature, with an increase in the tensor-to-scalar ratio.

Majumder, Barun

2012-03-01

22

Nondivergent classical response functions from uncertainty principle: quasiperiodic systems.  

PubMed

Time-divergence in linear and nonlinear classical response functions can be removed by taking a phase-space average within the quantized uncertainty volume O(ℏⁿ) around the microcanonical energy surface. For a quasiperiodic system, the replacement of the microcanonical distribution density in the classical response function with the quantized uniform distribution density results in agreement of quantum and classical expressions through Heisenberg's correspondence principle: each matrix element ⟨u|α(t)|v⟩ corresponds to the (u−v)th Fourier component of α(t) evaluated along the classical trajectory with mean action (J_u + J_v)/2. Numerical calculations for one- and two-dimensional systems show good agreement between quantum and classical results. The generalization to the case of N degrees of freedom is made. Thus, phase-space averaging within the quantized uncertainty volume provides a useful way to establish the classical-quantum correspondence for the linear and nonlinear response functions of a quasiperiodic system. PMID:15638574
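
Written out, the Heisenberg correspondence invoked above reads (standard textbook form, restated here for readability rather than as the paper's derivation):

    \langle u \,|\, \hat{\alpha}(t) \,|\, v \rangle \;\longleftrightarrow\; \alpha_{u-v}\bigl(\bar{J}, t\bigr),
    \qquad \bar{J} = \frac{J_u + J_v}{2},

where α_{u−v} is the (u−v)-th Fourier component of the classical α(t) evaluated along the trajectory with mean action \bar{J}.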

Kryvohuz, Maksym; Cao, Jianshu

2005-01-01

23

Generalized Uncertainty Principle in the Presence of Extra Dimensions  

NASA Astrophysics Data System (ADS)

We argue that in the generalized uncertainty principle (GUP) model, the parameter β₀, whose square root multiplied by the Planck length l_p approximates the minimum measurable distance, varies with the energy scale. Since a minimal measurable length and extra dimensions are both suggested by quantum gravity theories, we investigate models based on the GUP and one extra dimension, compactified with a finite radius. We obtain an inspiring relation. This relation is also consistent with the predictions at the Planck scale and at the usual quantum mechanics scale. We also estimate the range of applicability of the GUP model. It turns out that the minimum measurable length is exactly the compactification radius of the extra dimension.

Mu, Ben-Rong; Wu, Hou-Wen; Yang, Hai-Tang

2011-09-01

24

Entropy bound with generalized uncertainty principle in general dimensions  

NASA Astrophysics Data System (ADS)

In this letter, the entropy bound for local quantum field theories (LQFT) is studied in a class of models of the generalized uncertainty principle (GUP) which predicts a minimal length as a reflection of quantum gravity effects. Both bosonic and fermionic fields confined in a ball B^d of arbitrary spatial dimension d ≥ 4 are investigated. It is found that the GUP leads to the same scaling A^{(d-3)/(d-2)} correction to the entropy bound for bosons and fermions, although the coefficients of this correction are different for each case. Based on our calculation, we conclude that the GUP effects can become manifest at short distance scales. Some further implications and speculations of our results are also discussed.

Wang, W.; Huang, D.

2012-07-01

25

Restatement of the Heisenberg Uncertainty Principle for the Condition of Superposition.  

National Technical Information Service (NTIS)

This report presents a discussion of the Heisenberg uncertainty principle for the condition of superposition. In that changed form, the principle becomes causal and rigidly quantized, but applies to unperceived reality (probability), consistent with Bohr'...

T. E. Bearden

1975-01-01

26

Corrections to the Cardy-Verlinde formula from the generalized uncertainty principle  

SciTech Connect

In this Letter, we compute the corrections to the Cardy-Verlinde formula of the d-dimensional Schwarzschild black hole. These corrections stem from the generalized uncertainty principle. Then we show one can take into account the generalized uncertainty principle corrections of the Cardy-Verlinde entropy formula by just redefining the Virasoro operator L₀ and the central charge c.

Setare, M.R. [Physics Department, Institute for Studies in Theoretical Physics and Mathematics (IPM), P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)

2004-10-15

27

Black hole entropy and the modified uncertainty principle: A heuristic analysis  

Microsoft Academic Search

Recently Ali et al. (2009) proposed a Generalized Uncertainty Principle (or GUP) with a linear term in momentum (accompanied by the Planck length). Inspired by this idea, here we calculate the quantum corrected value of the entropy of a Schwarzschild black hole and of a Reissner–Nordström black hole with double horizon by utilizing the proposed generalized uncertainty principle. We find that the leading order

Barun Majumder

2011-01-01

28

Verification of the Uncertainty Principle by Using Diffraction of Light Waves  

ERIC Educational Resources Information Center

We described a simple idea for experimental verification of the uncertainty principle for light waves. We used a single-slit diffraction of a laser beam for measuring the angular width of zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We will assume that the uncertainty in position is the slit width. For…
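
A small numerical sketch of the check described: take the slit width as Δx, estimate Δk from the angular half-width of the zero-order maximum, and compare Δx·Δk with the bound. The wavelength and slit width below are assumed for illustration, not the authors' measured values:

    import math

    # Illustrative check of dx * dk >= 1/2 for single-slit diffraction of a laser beam.
    # Assumed values; not the measurements reported in the article.
    wavelength = 633e-9        # m  (He-Ne laser, assumed)
    slit_width = 50e-6         # m, taken as the position uncertainty dx

    theta = math.asin(wavelength / slit_width)   # half-width of zero-order maximum: sin(theta) = lambda/a
    k = 2.0 * math.pi / wavelength
    dk = k * math.sin(theta)                     # transverse wave-number spread, = 2*pi/slit_width

    print("dx * dk =", slit_width * dk)          # = 2*pi, comfortably above the bound of 1/2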

Nikolic, D.; Nesic, Lj

2011-01-01

29

SOURCE ASSESSMENT: ANALYSIS OF UNCERTAINTY--PRINCIPLES AND APPLICATIONS  

EPA Science Inventory

This report provides the results of a study that was conducted to analyze the uncertainties involved in the calculation of the decision parameters used in the Source Assessment Program and to determine the effect of these uncertainties on the decision-making procedure. A general ...

30

Quantum-memory-assisted entropic uncertainty principle, teleportation, and entanglement witness in structured reservoirs  

NASA Astrophysics Data System (ADS)

We relate the principle of quantum-memory-assisted entropic uncertainty to quantum teleportation and show geometrically that any two-qubit state which lowers the upper bound of this uncertainty relation is useful for teleportation. We also explore the efficiency of this entropic uncertainty principle in witnessing entanglement in a general class of bosonic structured reservoirs. The entanglement regions witnessed by different estimates are determined, which may have no relation to the explicit form of the spectral density of the reservoir for certain specially chosen sets of initial states.
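
For context, the quantum-memory-assisted entropic uncertainty relation used in this line of work is usually quoted in the form derived by Berta et al.; it is reproduced below as background, with generic notation that is not necessarily this paper's:

    S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
    \qquad c = \max_{x,z} \bigl| \langle \psi_x | \phi_z \rangle \bigr|^2,

where X and Z are observables measured on system A, B is the quantum memory, and S(·|·) is the conditional von Neumann entropy. Lowering the right-hand side, for example through a negative S(A|B) due to entanglement, is what makes a two-qubit state useful for the teleportation and entanglement-witnessing tasks discussed above.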

Hu, Ming-Liang; Fan, Heng

2012-09-01

31

Stochastic maximum principle for optimal control under uncertainty  

Microsoft Academic Search

Optimal control problems involve the difficult task of determining time-varying profiles through dynamic optimization. Such problems become even more complex in practical situations where handling time-dependent uncertainties becomes an important issue. Approaches to stochastic optimal control problems have been reported in the finance literature and are based on real option theory, combining Ito's Lemma and the dynamic programming formulation. This paper describes a new approach

Vicente Rico-ramírez; Urmila M. Diwekar

2004-01-01

32

Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics studentsâ depictions  

NSDL National Science Digital Library

Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize a picture of students' descriptions of these key quantum concepts. Data for this study were obtained from a semistructured in-depth interview conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students' depictions of the concept wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students' depictions of the concept uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum mechanics uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction that cannot be explained within the framework of classical physics for depicting the wavelike properties of quantum entities. Despite inhospitable conceptions of the uncertainty principle and wave- and particlelike properties of quantum entities in our investigation, this paper's findings are highly consistent with those reported in previous studies. New findings and some implications for instruction and the curricula are discussed.

Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

2012-05-21

33

Removing the big bang singularity: the role of the generalized uncertainty principle in quantum gravity  

NASA Astrophysics Data System (ADS)

The possibility of avoiding the big bang singularity by means of a generalized uncertainty principle is investigated. In relation with this matter, the statistical mechanics of a free-particle system obeying the generalized uncertainty principle is studied and it is shown that the entropy of the system has a finite value in the infinite temperature limit. It is then argued that negative temperatures and negative pressures are possible in this system. Finally, it is shown that this model can remove the big bang singularity.

Rashidi, Reza

2013-01-01

34

A Pseudo-Quantum Triad: Schrödinger's Equation, the Uncertainty Principle, and the Heisenberg Group  

NASA Astrophysics Data System (ADS)

We show that the paradigmatic quantum triad "Schrödinger equation-Uncertainty principle-Heisenberg group" emerges mathematically from classical mechanics. In the case of the Schrödinger equation, this is done by extending the metaplectic representation of linear Hamiltonian flows to arbitrary flows; for the Heisenberg group this follows from a careful analysis of the notion of phase of a Lagrangian manifold, and for the uncertainty principle it suffices to use tools from multivariate statistics together with the theory of John's minimum volume ellipsoid. Thus, the mathematical structure needed to make quantum mechanics emerge already exists in classical mechanics.

de Gosson, Maurice A.

2012-05-01

35

Satellite Test of the Equivalence Principle Uncertainty Analysis  

NASA Astrophysics Data System (ADS)

STEP, the Satellite Test of the Equivalence Principle, is intended to test the apparent equivalence of gravitational and inertial mass to 1 part in 10^18 (Worden et al. in Adv. Space Res. 25(6):1205-1208, 2000). This will be an increase of more than five orders of magnitude over ground-based experiments and lunar laser ranging observations (Su et al. in Phys. Rev. D 50:3614-3636, 1994; Williams et al. in Phys. Rev. D 53:6730-6739, 1996; Schlamminger et al. in Phys. Rev. Lett. 100:041101, 2008). It is essential to have a comprehensive and consistent model of the possible error sources in an experiment of this nature to be able to understand and set requirements, and to evaluate design trade-offs. In the following pages we describe existing software for such an error model and the application of this software to the STEP experiment. In particular we address several issues, including charge and patch effect forces, where our understanding has improved since the launch of GP-B owing to the availability of GP-B data and preliminary analysis results (Everitt et al. in Space Sci. Rev., 2009, this issue; Silbergleit et al. in Space Sci. Rev., 2009, this issue; Keiser et al. in Space Sci. Rev., 2009, this issue; Heifetz et al. in Space Sci. Rev., 2009, this issue; Muhlfelder et al. in Space Sci. Rev., 2009, this issue).

Worden, Paul; Mester, John

2009-12-01

36

Rotation and Mixing in Massive Stars: Principles and Uncertainties  

NASA Astrophysics Data System (ADS)

The main instabilities induced by rotation in stellar interiors are described. We derive from first principles the general equation describing the transport of angular momentum. The case of the transport of the chemical species is also discussed. As long as the mass loss rates are not too important, meridional currents, by advecting angular momentum from the inner regions to the outer layers, accelerate the stellar surface during the Main Sequence phase. A 9 M⊙ stellar model at solar metallicity with an equatorial velocity at the beginning of the core H-burning phase equal to 340 km s⁻¹ reaches the break-up limit during the MS phase. The model with an initial velocity of 290 km s⁻¹ approaches this limit without reaching it. The models with 290 km s⁻¹ and 340 km s⁻¹ predict enhancements of the N/C ratio at the end of the MS phase equal to 2.8 and 3.2 times the initial value, respectively.

Meynet, G.; Maeder, A.

2005-11-01

37

Principles and applications of measurement and uncertainty analysis in research and calibration  

SciTech Connect

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
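
A minimal sketch of the kind of combination such a standard describes, with a systematic (bias) component and a random (precision) component combined by root-sum-square into an overall interval; the sample data, coverage factor and RSS model below are common-practice assumptions for illustration, not text from ANSI/ASME PTC 19.1-1985:

    import math
    import statistics

    # Toy measurement-uncertainty combination, root-sum-square style (illustrative only).
    readings = [10.12, 10.09, 10.15, 10.11, 10.13]   # repeated observations (assumed data)
    bias_limit = 0.05                                # estimated systematic (bias) limit
    t_95 = 2.776                                     # Student t for 4 degrees of freedom, ~95%

    mean = statistics.mean(readings)
    s_mean = statistics.stdev(readings) / math.sqrt(len(readings))  # precision index of the mean

    U = math.sqrt(bias_limit**2 + (t_95 * s_mean)**2)   # combined uncertainty (RSS model)
    print(f"result = {mean:.3f} +/- {U:.3f}  (interval expected to contain the true value)")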

Wells, C.V.

1992-11-01

38

Principles and applications of measurement and uncertainty analysis in research and calibration  

SciTech Connect

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

Wells, C.V.

1992-11-01

39

Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions  

ERIC Educational Resources Information Center

Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

2011-01-01

40

A heuristic analysis of black hole thermodynamics with generalized uncertainty principle  

Microsoft Academic Search

In the standard viewpoint, the temperature of a stationary black hole is proportional to its surface gravity. This is a semiclassical result and the quantum gravity effects are not taken into consideration. This research explores a unified expression for the black hole temperatures in the sense of a generalized uncertainty principle (GUP). Our argument is based on a heuristic analysis

Li Xiang; X. Q. Wen

2009-01-01

41

Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions  

ERIC Educational Resources Information Center

Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

2011-01-01

42

A generalized uncertainty principle and sparse representation in pairs of bases  

Microsoft Academic Search

An elementary proof of a basic uncertainty principle concerning pairs of representations of vectors in different orthonormal bases is provided. The result, slightly stronger than stated before, has a direct impact on the uniqueness property of the sparse representation of such vectors using pairs of orthonormal bases as overcomplete dictionaries. The main contribution in this paper is the improvement of

Michael Elad; Alfred M. Bruckstein

2002-01-01

43

Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom  

ERIC Educational Resources Information Center

In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…

Harbola, Varun

2011-01-01

44

Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities  

NASA Astrophysics Data System (ADS)

We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by some approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely, implying the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in the construction of the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach, the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, the Wigner second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without the GUP is not negligible.

Tawfik, A.

2013-07-01

45

Trajectory interpretation of the uncertainty principle in 1D systems using complex Bohmian mechanics  

NASA Astrophysics Data System (ADS)

Complex Bohmian mechanics is introduced to investigate the validity of a trajectory interpretation of the uncertainty principles ΔqΔp ≥ ℏ/2 and ΔEΔt ≥ ℏ/2 by replacing probability mean values with time-averaged mean values. It is found that the ℏ/2 factor in the uncertainty relation ΔEΔt ≥ ℏ/2 stems from a quantum potential whose time-averaged mean value, taken along any closed trajectory with period T = 2π/ω, is proved to be an integer multiple of ℏω/2 for one-dimensional systems.

Yang, Ciann-Dong

2008-10-01

46

Quantum dynamics of the Taub universe in a generalized uncertainty principle framework  

NASA Astrophysics Data System (ADS)

The implications of a generalized uncertainty principle for the Taub cosmological model are investigated. The model is studied in the Arnowitt-Deser-Misner reduction of the dynamics, and therefore a time variable is singled out. Such a variable is quantized in a canonical way, while the only physical degree of freedom of the system (related to the universe anisotropy) is quantized by means of a modified Heisenberg algebra. The analysis is performed at both the classical and quantum level. In particular, at the quantum level, the motion of wave packets is investigated. The two main results obtained are as follows: (i) The classical singularity is probabilistically suppressed; the universe exhibits a stationary behavior and the probability amplitude is peaked in a determinate region. (ii) The generalized uncertainty principle wave packets provide the right behavior in the establishment of a quasi-isotropic configuration for the universe.

Battisti, Marco Valerio; Montani, Giovanni

2008-01-01

47

The uncertainty principle and entangled correlations in quantum key distribution protocols  

NASA Astrophysics Data System (ADS)

Considerations of non-locality and correlation measures provide insights into Quantum Mechanics. Nonphysical states are shown to exceed the limits of QM in both respects and yet conform to relativity's 'no-signaling' constraint. Recent work has shown that the Uncertainty Principle limits non-locality, distinguishing models that exceed those of QM. Accordingly, the Uncertainty Principle is shown to limit correlation strength independently of non-locality, extending the interpretation of the prior work, and to underlie the security of Quantum Key Distribution. The established Ekert protocol [6] is compared with more secure variations, in particular H. Yuen's Keyed Communication in Quantum Noise (KCQ) [7] and a new Time-Gating protocol which minimizes authentication and susceptibility to active eavesdropping.

Erdmann, Reinhard; Hughes, David; Michalak, Richard; Cook, Paul; Malowicki, John

2013-05-01

48

The Big-Bang singularity in the framework of a Generalized Uncertainty Principle  

Microsoft Academic Search

We analyze the quantum dynamics of the Friedmann–Robertson–Walker Universe in the context of a Generalized Uncertainty Principle. Since the dynamics of the isotropic Universe resembles that of a one-dimensional particle, we quantize it with the commutation relations associated with an extended formulation of the Heisenberg algebra. The evolution of the system is described in terms of a massless scalar field taken as

Marco Valerio Battisti; Giovanni Montani

2007-01-01

49

Generalized uncertainty principles and localization of a particle in discrete space  

NASA Astrophysics Data System (ADS)

Generalized uncertainty principles are able to serve as useful descriptions of some of the phenomenology of quantum gravity effects, providing an intuitive grasp on nontrivial space-time structures such as a fundamental discreteness of space, a universal band limit or an irreducible extendedness of elementary particles. In this article, uncertainty relations for single-particle quantum mechanics are derived by a moment expansion of states for quantum systems with a discrete coordinate and, correspondingly, a periodic momentum. Corrections to standard uncertainty relations are found, with some similarities but also key differences to what is often assumed in this context. The relations provided can be applied to discrete models of matter or space-time, including loop quantum cosmology.

Bojowald, Martin; Kempf, Achim

2012-10-01

50

Satisfaction of the uncertainty principle in cancer clinical trials: retrospective cohort analysis  

PubMed Central

Objective To assess whether publicly funded adult cancer trials satisfy the uncertainty principle, which states that physicians should enrol a patient in a trial only if they are substantially uncertain which of the treatments in the trial is most appropriate for the patient. This principle is violated if trials systematically favour either the experimental or the standard treatment. Design Retrospective cohort study of completed cancer trials, with randomisation as the unit of analysis. Setting Two cooperative research groups in the United States. Studies included 93 phase III randomised trials (103 randomisations) that completed recruitment of patients between 1981 and 1995. Main outcome measures Whether the randomisation favoured the experimental treatment, the standard treatment, or neither treatment; effect size (outcome of the experimental treatment compared with outcome of the standard treatment) for each randomisation. Results Three randomisations (3%) favoured the standard treatment, 70 (68%) found no significant difference between treatments, and 30 (29%) favoured the experimental treatment. The average effect size was 1.20 (95% confidence interval 1.13 to 1.28), reflecting a slight advantage for the experimental treatment. Conclusions In cooperative group trials in adults with cancer, there is a measurable average improvement in disease control associated with assignment to the experimental rather than the standard arm. However, the heterogeneity of outcomes and the small magnitude of the advantage suggest that, as a group, these trials satisfy the uncertainty principle.

Joffe, Steven; Harrington, David P; George, Stephen L; Emanuel, Ezekiel J; Budzinski, Lindsay A; Weeks, Jane C

2004-01-01

51

The Symplectic Camel and the Uncertainty Principle: The Tip of an Iceberg?  

NASA Astrophysics Data System (ADS)

We show that the strong form of Heisenberg’s inequalities due to Robertson and Schrödinger can be formally derived using only classical considerations. This is achieved using a statistical tool known as the “minimum volume ellipsoid” together with the notion of symplectic capacity, which we view as a topological measure of uncertainty invariant under Hamiltonian dynamics. This invariant provides a right measurement tool to define what “quantum scale” is. We take the opportunity to discuss the principle of the symplectic camel, which is at the origin of the definition of symplectic capacities, and which provides an interesting link between classical and quantum physics.

de Gosson, Maurice A.

2009-02-01

52

Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?  

NASA Astrophysics Data System (ADS)

In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [5] on simple quantum mechanical systems and examine their thermodynamic properties. We assume that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with those found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic quantities are exactly the same as the polymer results, but the length scale considered has a theoretically different origin. Hence we express the need for further study to investigate whether these two approaches are conceptually connected at the fundamental level.

Majumder, Barun; Sen, Sourav

2012-10-01

53

Generalized uncertainty principle in f(R) gravity for a charged black hole  

NASA Astrophysics Data System (ADS)

Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

Said, Jackson Levi; Adami, Kristian Zarb

2011-02-01

54

Generalized uncertainty principle in f(R) gravity for a charged black hole  

SciTech Connect

Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

Said, Jackson Levi [Physics Department, University of Malta, Msida (Malta); Adami, Kristian Zarb [Physics Department, University of Malta, Msida (Malta); Physics Department, University of Oxford, Oxford (United Kingdom)

2011-02-15

55

Before and beyond the precautionary principle: Epistemology of uncertainty in science and law  

SciTech Connect

The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science as certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

Tallacchini, Mariachiara [Bioethics, Faculty of Biotechnology, University of Milan, Via Celoria 10, 20100 Milan (Italy) and Science Technology and Law, Law Faculty, University of Piacenza, Via Emilia Parmense 84, 29100 Piacenza (Italy)]. E-mail: mariachiara.tallacchini@unimi.it

2005-09-01

56

Space-time uncertainty principle and conformal symmetry in D-particle dynamics  

NASA Astrophysics Data System (ADS)

Motivated by the space-time uncertainty principle, we establish a conformal symmetry in the dynamics of D-particles. The conformal symmetry, combined with the supersymmetric non-renormalization theorem, uniquely determines the classical form of the effective action for a probe D-particle in the background of a heavy D-particle source, previously constructed by Becker-Becker-Polchinski-Tseytlin. Our results strengthen the conjecture proposed by Maldacena on the correspondence, in the case of D-particles, between the supergravity and the supersymmetric Yang-Mills matrix models in the large-N limit, the latter being the boundary conformal field theory of the former in the classical D-particle background in the near-horizon limit.

Jevicki, Antal; Yoneya, Tamiaki

1998-12-01

57

Minimal length uncertainty principle and the trans-Planckian problem of black hole physics  

NASA Astrophysics Data System (ADS)

The minimal length uncertainty principle of Kempf, Mangano and Mann (KMM), as derived from a mutilated quantum commutator between coordinate and momentum, is applied to describe the modes and wave packets of Hawking particles evaporated from a black hole. The trans-Planckian problem is successfully confronted in that the Hawking particle no longer hugs the horizon at arbitrarily close distances. Rather, the mode of Schwarzschild frequency ω deviates from the conventional trajectory when the coordinate r is given by |r − 2M| ≅ β_H ω/2π in units of the nonlocal distance legislated into the uncertainty relation. Wave packets straddle the horizon and spread out to fill the whole nonlocal region. The charge carried by the packet (in the sense of the amount of "stuff" carried by the Klein-Gordon field) is not conserved in the nonlocal region and rapidly decreases to zero as time decreases. Read in the forward temporal direction, the nonlocal region thus is the seat of production of the Hawking particle and its partner. The KMM model was inspired by string theory, for which the mutilated commutator has been proposed to describe an effective theory of high momentum scattering of zero mass modes. It is here interpreted in terms of dissipation which gives rise to the Hawking particle into a reservoir of other modes (of as yet unknown origin). On this basis it is conjectured that the Bekenstein-Hawking entropy finds its origin in the fluctuations of fields extending over the nonlocal region.

Brout, R.; Gabriel, Cl.; Lubo, M.; Spindel, Ph.

1999-02-01

58

Casimir effect in minimal length theories based on a generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

We study the corrections to the Casimir effect in the classical geometry of two parallel metallic plates, separated by a distance a, due to the presence of a minimal length (ℏ√β) arising from quantum mechanical models based on a generalized uncertainty principle (GUP). The approach for the quantization of the electromagnetic field is based on projecting onto the maximally localized states of a few specific GUP models and was previously developed to study the Casimir-Polder effect. For each model we compute the lowest order correction in the minimal length to the Casimir energy and find that it scales with the fifth power of the distance between the plates, a⁻⁵, as opposed to the well known QED result, which scales as a⁻³; contrary to previous claims, we find that it is always attractive. The various GUP models can in principle be differentiated by the strength of the correction to the Casimir energy, as every model is characterized by a specific multiplicative numerical constant.
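
A short numerical illustration of the scaling contrast described above (the standard a⁻³ Casimir energy per unit area versus a hypothetical a⁻⁵ minimal-length correction); the correction prefactor C is a placeholder, since the paper's model-dependent coefficients are not reproduced here:

    import math

    hbar = 1.054_571_8e-34   # J s
    c    = 2.997_924_58e8    # m / s

    def casimir_energy_per_area(a):
        """Standard QED Casimir energy per unit area for parallel plates, ~ a**-3."""
        return -math.pi**2 * hbar * c / (720.0 * a**3)

    def gup_correction_per_area(a, C=1e-52):
        """Illustrative minimal-length correction scaling as a**-5; C is a placeholder
        prefactor (J m^3), NOT a value taken from the paper."""
        return -C / a**5

    for a in (1e-7, 1e-6):
        E0 = casimir_energy_per_area(a)
        dE = gup_correction_per_area(a)
        print(f"a = {a:.0e} m:  E0/A = {E0:.3e} J/m^2,  correction/A = {dE:.3e} J/m^2")
    # The a**-5 term falls off faster, so the minimal-length correction only becomes
    # relevant at very small plate separations, as the abstract notes.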

Frassino, A. M.; Panella, O.

2012-02-01

59

Uncertainty Analysis in Software Reliability Modeling by Bayesian Analysis with Maximum-Entropy Principle  

Microsoft Academic Search

In software reliability modeling, the parameters of the model are typically estimated from the test data of the corresponding component. However, the widely used point estimators are subject to random variations in the data, resulting in uncertainties in these estimated parameters. Ignoring the parameter uncertainty can result in grossly underestimating the uncertainty in the total system reliability. This paper attempts

Yuan-shun Dai; Min Xie; Quan Long; Ng Szu-hui

2007-01-01

60

An enquiry concerning the principles of cultural norms and values: The impact of uncertainty and mortality salience on reactions to violations and bolstering of cultural worldviews  

Microsoft Academic Search

This enquiry concerning the principles of cultural norms and values focuses on the impact of mortality and uncertainty salience on people’s reactions to events that violate or bolster their cultural norms and values. Five experiments show that both mortality and uncertainty salience influence people’s reactions to violations and bolstering of their cultural worldviews, yielding evidence for both terror and uncertainty

Kees van den Bos; P. Marijn Poortvliet; Marjolein Maas; Joost Miedema; Ernst-Jan van den Ham

2005-01-01

61

The optimisation approach of ALARA in nuclear practice: an early application of the precautionary principle. Scientific uncertainty versus legal uncertainty.  

PubMed

The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other view thinks the effects are underestimated. Both views have good scientific arguments in their favour. The nuclear field, both industry and medicine, has had to deal with this controversy for many decades. One can argue that the optimisation approach of keeping effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because these effects are stochastic, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues will be dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law, and what lessons can be learned from the field of radiation protection? 2. What does ALARA have to contribute to the discussion of the Precautionary Principle and vice versa, in particular as far as legal sanctions and liability are concerned? It will be shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law. PMID:16304938

Lierman, S; Veuchelen, L

2005-01-01

62

Using uncertainty principle to find the ground-state energy of the helium and a helium-like Hookean atom  

NASA Astrophysics Data System (ADS)

In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron cloud. Our calculation also shows how the Coulomb interaction between electrons affects their distribution. This leads to a physical picture of how electrons are located with respect to each other in these atoms. Finally, we also obtain through our calculations a general formula for the estimate of ground-state energy and radius of two electron atoms and ions with atomic number Z.
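
A rough numerical sketch in the spirit of the method: take p ~ ħ/r for each electron from the uncertainty principle and minimize a simple two-electron trial energy over the radius. The trial functional (including the crude choice r₁₂ ≈ r) is an assumed toy model for illustration, not the authors' expression:

    # Crude uncertainty-principle + variational estimate for helium, in atomic units
    # (hbar = m_e = e = 1, energies in hartree). The trial energy below is an assumed
    # toy model: p ~ 1/r for each electron and an electron-electron separation ~ r.
    Z = 2.0

    def trial_energy(r):
        kinetic   = 2.0 * (1.0 / (2.0 * r * r))   # two electrons, p ~ 1/r
        e_nucleus = -2.0 * Z / r                  # electron-nucleus attraction
        e_e       = 1.0 / r                       # electron-electron repulsion, r12 ~ r
        return kinetic + e_nucleus + e_e

    # Simple scan for the minimizing radius
    rs = [0.01 * i for i in range(1, 500)]
    r_best = min(rs, key=trial_energy)
    print(f"r_min ~ {r_best:.2f} a0, E_min ~ {trial_energy(r_best):.2f} hartree")
    # Gives roughly r ~ 0.67 a0 and E ~ -2.25 hartree, compared with the experimental
    # ground-state energy of about -2.90 hartree; the full variational treatment in
    # the paper refines this considerably.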

Harbola, Varun

2011-11-01

63

Learning from the law to address uncertainty in the precautionary principle  

Microsoft Academic Search

Environmentalists have advocated the Precautionary Principle (PP) to help guide public and private decisions about the environment. By contrast, industry and its spokesmen have opposed this. There is not one principle, but many that have been recommended for this purpose. Despite the attractiveness of a core idea in all versions of the principle—that decision-makers should take some precautionary steps to

Carl F. Cranor

2001-01-01

64

Before and beyond the precautionary principle: Epistemology of uncertainty in science and law  

Microsoft Academic Search

The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that “[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental

Mariachiara Tallacchini; Mariachiara

2005-01-01

65

Traversal-Time Distribution and the Uncertainty Principle in Quantum Tunneling  

NASA Astrophysics Data System (ADS)

We derive the exact distribution of times spent in the classically forbidden region by a particle as it tunnels through a square barrier potential. We find that the integrated distribution has a simple interpretation. If one restricts the time spent in the barrier to be ≤ τ, then an effective uncertainty ΔV is introduced in the barrier height, such that ΔVτ/ℏ ≈ 1. This uncertainty leads to unusual interference effects in the distribution. These interference effects explain why the average speed of a particle traversing a very opaque barrier may exceed the speed of light.
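A back-of-envelope illustration of the quoted trade-off ΔVτ/ℏ ≈ 1 (purely numerical; the paper itself derives the full traversal-time distribution):

    # Restricting the barrier traversal time to tau implies an effective
    # barrier-height uncertainty of roughly hbar / tau. Illustrative only.
    HBAR_EV_S = 6.582e-16  # reduced Planck constant in eV*s

    for tau_fs in (0.1, 1.0, 10.0):
        tau = tau_fs * 1e-15                 # femtoseconds -> seconds
        delta_V_eV = HBAR_EV_S / tau         # effective barrier-height uncertainty
        print(f"tau = {tau_fs:5.1f} fs  ->  Delta V ~ {delta_V_eV * 1e3:7.2f} meV")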

Fertig, H. A.

1990-11-01

66

MANAGING UNCERTAINTY IN ENGINEERING DESIGN USING IMPRECISE PROBABILITIES AND PRINCIPLES OF INFORMATION ECONOMICS  

Microsoft Academic Search

The engineering design community recognizes that an essential part of the design process is decision-making. Each decision consists of two main phases—problem formulation and problem solution. Existing literature focuses on problem solution using precisely known probabilities. Problem formulation has received considerably less attention. The objective of this thesis is to investigate methods for managing uncertainty during the formulation

JASON AUGHENBAUGH

67

The precautionary principle and international conflict over domestic regulation: mitigating uncertainty and improving adaptive capacity.  

PubMed

Disputes over invocation of precaution in the presence of uncertainty are building. This essay finds: (1) analysis of past WTO panel decisions and current EU-US regulatory conflicts suggests that appeals to scientific risk assessment will not resolve emerging conflicts; (2) Bayesian updating strategies, with commitments to modify policies as information emerges, may ameliorate conflicts over precaution in environmental and security affairs. PMID:16304935

Oye, K A

2005-01-01

68

A Study on the New Agegraphic Dark Energy Model in the Framework of Generalized Uncertainty Principle for Different Scale Factors  

NASA Astrophysics Data System (ADS)

In this paper, we consider the New Agegraphic Dark Energy (NADE) model interacting with pressureless Dark Matter (DM) in the framework of the generalized uncertainty principle. We consider different expressions of the scale factor a(t) pertaining to the emergent, the intermediate and the logamediate scenarios of the universe. We have derived the expressions for various cosmological parameters in all three cases and plotted the equation of state (EoS) parameter ω_D and the squared speed of sound v_s^2 to check the stability of the model in each case. We have observed that for the emergent and intermediate cases the EoS parameter has a quintom-like behavior, and in the logamediate case it has quintessence-like behavior. The negative squared speed of sound in all three cases indicates that the model is classically unstable for each choice of scale factor.

Pasqua, Antonio; Chattopadhyay, Surajit

2013-09-01

69

Some consequences of the generalized uncertainty principle induced ultraviolet wave-vector cutoff in one-dimensional quantum mechanics  

NASA Astrophysics Data System (ADS)

A projection method is proposed to treat the one-dimensional Schrödinger equation for a single particle when the generalized uncertainty principle (GUP) generates an ultraviolet (UV) wave-vector cutoff. The existence of a unique coordinate representation called the naive one is derived from the one-parameter family of discrete coordinate representations. In this bandlimited quantum mechanics a continuous potential is reconstructed from discrete sampled values observed by means of a particle in maximally localized states. It is shown that bandlimitation modifies the speed of the center and the spreading time of a Gaussian wave packet moving in free space. Indication is found that GUP accompanied by bandlimitation may cause departures of the low-lying energy levels of a particle in a box from those in ordinary quantum mechanics to be much less suppressed than commonly thought when GUP without bandlimitation is at work.

Sailer, K.; Péli, Z.; Nagy, S.

2013-04-01

70

Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators  

NASA Astrophysics Data System (ADS)

We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory.

Marchiolli, M. A.; Mendonça, P. E. M. F.

2013-09-01

71

Decision Making Under Uncertainty.  

National Technical Information Service (NTIS)

This report introduces concepts, principles, and approaches for addressing uncertainty in decision making. The sources of uncertainty in decision making are discussed, emphasizing the distinction between uncertainty and risk, and the characterization of u...

B. K. Harper K. N. Mitchell M. T. Schultz T. S. Bridges

2010-01-01

72

Adding a strategic edge to human factors/ergonomics: Principles for the management of uncertainty as cornerstones for system design.  

PubMed

It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation human factors/ergonomics based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore. PMID:23622735

Grote, Gudela

2013-04-23

73

The Scientific Principle of Uncertainty  

Microsoft Academic Search

"IF the actual history of science had been different, and if the scientific doctrines most familiar to us had been those which must be expressed in this [statistical] way, it is possible that we might have considered the existence of a certain kind of contingency as self-evident truth, and treated the doctrine of philosophical necessity as a mere sophism."

Joseph Larmor

1930-01-01

74

Reducing uncertainty in European supply chains  

Microsoft Academic Search

In this paper, we show that reducing supply chain uncertainty increases responsiveness and thereby benefits bottom-line performance as assessed via total cycle time reduction. We term this effect the uncertainty reduction principle. To enable uncertainty reduction we use the uncertainty circle to focus on the sources to be eliminated. We also show that these sources of uncertainty can

Paul Childerhouse; Denis R. Towill

2004-01-01

75

Risk Management Principles for Nanotechnology  

Microsoft Academic Search

Risk management of nanotechnology is challenged by the enormous uncertainties about the risks, benefits, properties, and future direction of nanotechnology applications. Because of these uncertainties, traditional risk management principles such as acceptable risk, cost–benefit analysis, and feasibility are unworkable, as is the newest risk management principle, the precautionary principle. Yet, simply waiting for these uncertainties to be resolved before undertaking

Gary E. Marchant; Douglas J. Sylvester; Kenneth W. Abbott

2008-01-01

76

On the relativity and uncertainty of distance, time, and energy measurements by man. (1) Derivation of the Weber psychophysical law from the Heisenberg uncertainty principle applied to a superconductive biological detector. (2) The reverse derivation. (3) A human theory of relativity.  

PubMed

The Weber psychophysical law, which describes much experimental data on perception by man, is derived from the Heisenberg uncertainty principle on the assumption that human perception occurs by energy detection by superconductive microregions within man. This suggests that psychophysical perception by man might be considered merely a special case of physical measurement in general. The reverse derivation, i.e., derivation of the Heisenberg principle from the Weber law, may be of even greater interest. It suggests that physical measurements could be regarded as relative to the perceptions by the detectors within man. Thus one may develop a "human" theory of relativity that could have the advantage of eliminating hidden assumptions by forcing physical theories to conform more completely to the measurements made by man rather than to concepts that might not accurately describe nature. PMID:7330097

Cope, F W

1981-01-01

77

Dimensions of the Precautionary Principle  

Microsoft Academic Search

This essay attempts to provide an analytical apparatus which may be used for finding an authoritative formulation of the Precautionary Principle. Several formulations of the Precautionary Principle are examined. Four dimensions of the principle are identified: (1) the threat dimension, (2) the uncertainty dimension, (3) the action dimension, and (4) the command dimension. It is argued that the Precautionary Principle

Per Sandin

1999-01-01

78

Reflections on uncertainty in risk assessment and risk management by the Society of Environmental Toxicology and Chemistry (SETAC) precautionary principle workgroup.  

PubMed

Quantitative uncertainty assessments and the distribution of risk are under scrutiny, and significant criticism has been made of null hypothesis testing when careful consideration of Type I (false positive) and Type II (false negative) error rates has not been taken into account. An alternative method, equivalence testing, is discussed, yielding more transparency and potentially more precaution in quantifiable uncertainty assessments. With thousands of chemicals needing regulation in the near future and low public trust in the regulatory process, decision models with transparency and learning processes are required to manage this task. Adaptive, iterative, and learning decision-making tools and processes can help decision makers evaluate the significance of Type I or Type II errors on decision alternatives and can reduce the risk of committing Type III errors (accurate answers to the wrong questions). Simplistic cost-benefit based decision-making tools do not incorporate the complex interconnectedness characterizing environmental risks, nor do they enhance learning or participation, or include social values and ambiguity. Hence, better decision-making tools are required, and MIRA is an attempt to include some of these critical aspects. PMID:16304937

Sanderson, H; Stahl, C H; Irwin, R; Rogers, M D

2005-01-01

79

Measurement uncertainty.  

PubMed

The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined componentwise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can often be easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning can, if needed, be given to this uncertainty in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between the concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. "Without a measureless and perpetual uncertainty, the drama of human life would be destroyed." Winston Churchill. PMID:18573808
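A minimal sketch of the componentwise combination alluded to here, assuming the usual quadrature rule with a bias term treated as an uncertainty component and a coverage factor k; names and numbers are illustrative, not from the paper:

    # Combine a random (Type A) component, a Type B component, and a bias
    # treated as an additional uncertainty component in quadrature, then expand
    # with a coverage factor k. All names and values are illustrative.
    import math

    def combined_uncertainty(u_random, u_typeB, u_bias, k=2.0):
        u_c = math.sqrt(u_random**2 + u_typeB**2 + u_bias**2)  # combined standard uncertainty
        return u_c, k * u_c                                    # (u_c, expanded uncertainty U)

    u_c, U = combined_uncertainty(u_random=0.8, u_typeB=0.5, u_bias=0.3)
    print(f"u_c = {u_c:.2f}, U (k=2, ~95% coverage) = {U:.2f}")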

Bartley, David; Lidén, Göran

2008-06-23

80

Retina as Reciprocal Spatial Fourier Transform Space Implies "Wave-transformation" Functions, String Theory, the Inappropriate Uncertainty Principle, and Predicts "Quarked" Protons.

NASA Astrophysics Data System (ADS)

Vision, via transform space: "Nature behaves in a reciprocal way"; also, Rect x pressure-input sense-reports as Sinc p, indicating the brain interprets reciprocal "p" space as object space. Use Mott's and Sneddon's Wave Mechanics and Its Applications. Wave transformation functions are strings of positron, electron, proton, and neutron; uncertainty is a semantic artifact. Neutrino-string de Broglie-Schrödinger wave-function models for electron, positron, suggest three-quark models for protons, neutrons. Variably vibrating neutrino-quills of this model, with appropriate mass-energy, can be a vertical proton string, quills leftward; thread string circumferentially, forming three interlinked circles with "overpasses". Diameters are 2:1:2, center circle has quills radially outward; call it a down quark, charge -1/3, charge 2/3 for outward quills, the up quarks of outer circles. String overlap summations are nodes; nodes also far left and right. Strong nuclear forces may be -px. "Dislodging" positron with neutrino switches quark-circle configuration to 1:2:1, "downers" outside. Unstable neutron charge is 0. Atoms build. With scale factors, retinal/vision's, and quantum mechanics', spatial Fourier transforms/inverses are equivalent.

Mc Leod, Roger David; Mc Leod, David M.

2007-10-01

81

Laser triangulation: fundamental uncertainty in distance measurement.  

PubMed

We discuss the uncertainty limit in distance sensing by laser triangulation. The uncertainty in distance measurement of laser triangulation sensors and other coherent sensors is limited by speckle noise. Speckle arises because of the coherent illumination in combination with rough surfaces. A minimum limit on the distance uncertainty is derived through speckle statistics. This uncertainty is a function of wavelength, observation aperture, and speckle contrast in the spot image. Surprisingly, it is the same distance uncertainty that we obtained from a single-photon experiment and from Heisenberg's uncertainty principle. Experiments confirm the theory. An uncertainty principle connecting lateral resolution and distance uncertainty is introduced. Design criteria for a sensor with minimum distance uncertainty are determined: small temporal coherence, small spatial coherence, and a large observation aperture. PMID:20862156
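A rough numerical sketch of a speckle-limited distance uncertainty. The formula used, δz ≈ Cλ/(2π sin(u_obs) sin(θ)), is an assumed form of the speckle limit discussed in this line of work; treat the prefactor and parameter names as illustrative:

    # Speckle-limited distance uncertainty for a triangulation sensor, using an
    # assumed form dz ~ C * lambda / (2*pi*sin(u_obs)*sin(theta)). Illustrative only.
    import math

    def speckle_limited_dz(wavelength_m, speckle_contrast, aperture_angle_rad, triangulation_angle_rad):
        return (speckle_contrast * wavelength_m) / (
            2.0 * math.pi * math.sin(aperture_angle_rad) * math.sin(triangulation_angle_rad)
        )

    dz = speckle_limited_dz(wavelength_m=670e-9, speckle_contrast=1.0,
                            aperture_angle_rad=math.radians(2.0),
                            triangulation_angle_rad=math.radians(30.0))
    print(f"speckle-limited distance uncertainty ~ {dz * 1e6:.1f} micrometres")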

Dorsch, R G; Häusler, G; Herrmann, J M

1994-03-01

82

UNIFIED THEORY'S NEW PRINCIPLE OF NULL ACTION  

Microsoft Academic Search

Quantum Theory's Uncertainty Principle violates conservation of energy & momentum and invokes matter's unrealistic creation from, and dissolution into, nothing. In Unified Theory action is an evolute of sharmon with Planck constant h as its quantum. The inviolable conservations of energy and momentum ordain conservation of action, invalidation of the Uncertainty Principle and introduction of the new Principle of Null

Rati Ram Sharma

83

Uncertainties in Long-Term Geologic Offset Rates of Faults: General Principles Illustrated With Data From California and Other Western States  

NASA Astrophysics Data System (ADS)

Because the slip rates of seismic faults are highly variable, a better target for statistical estimation is the long-term offset rate, which can be defined as the rate of one component of the slip which would be measured between any two times when fault-plane shear tractions are equal. The probability density function for the sum of elastic offset plus fault slip offset since a particular geologic event includes uncertainties associated with changes in elastic strain between that event and the present, which are estimated from the sizes of historic earthquake offsets on other faults of similar type. The probability density function for the age of a particular geologic event may be non-Gaussian, especially if it is determined from cross-cutting relations, or from radiocarbon or cosmogenic-nuclide ages containing inheritance. Two alternate convolution formulas relating the distributions for offset and age give the probability density function for long-term offset rate; these are computed for most published cases of dated offset features along active faults in California and other western states. After defining a probabilistic measure of disagreement between two long-term offset rate distributions measured on the same fault section, I investigate how disagreement varies with geologic time (difference in age of the offset features) and with publication type (primary, secondary, or tertiary). Patterns of disagreement suggest that at least 4.3% of offset rates in primary literature are incorrect (due to failure to span the whole fault, undetected complex initial shapes of offset features, or faulty correlation in space or in geologic time) or unrepresentative (due to variations in offset rate along the trace). Tertiary (third-hand) literature sources have a higher error rate of 14.5%. In the western United States, it appears that rates from offset features as old as 3 Ma can be averaged without introducing age-dependent bias. Offsets of older features can and should be used as well, but it is necessary to make allowance for the increased risk, rising rapidly to 48%, that they are inapplicable to neotectonics. Based on these results, best-estimate combined probability density functions are computed for the long-term offset rates of all active faults in California and other conterminous western states, and described in tables using several scalar measures. Of 849 active and potentially-active faults in the conterminous western United States, only 48 are "well-constrained" (having combined probability density functions for long-term offset rate in which the width of the 95%-confidence range is smaller than the median). It appears to require about 4 offset features to give an even chance of achieving a well-constrained combined rate, and at least 7 offset features to guarantee it.
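An illustrative Monte Carlo stand-in for the offset/age convolution described above (the paper uses analytic convolution formulas; the distributions and numbers below are hypothetical):

    # Sample an offset distribution and a (possibly non-Gaussian) age
    # distribution, then form their ratio to approximate the probability
    # density of the long-term offset rate. Everything here is hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    offset_m = rng.normal(loc=120.0, scale=15.0, size=n)         # offset since the dated event, metres
    age_ka = rng.lognormal(mean=np.log(40.0), sigma=0.2, size=n)  # age of the event, thousand years

    rate_mm_per_yr = offset_m / age_ka                            # m/ka is numerically equal to mm/yr
    lo, med, hi = np.percentile(rate_mm_per_yr, [2.5, 50.0, 97.5])
    print(f"long-term offset rate: median {med:.2f} mm/yr, 95% range [{lo:.2f}, {hi:.2f}]")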

Bird, P.

2006-12-01

84

The Uncertainty of Fluxes  

Microsoft Academic Search

In the ordinary quantum Maxwell theory of a free electromagnetic field, formulated on a curved 3-manifold, we observe that magnetic and electric fluxes cannot be simultaneously measured. This uncertainty principle reflects torsion: fluxes modulo torsion can be simultaneously measured. We also develop the Hamilton theory of self-dual fields, noting that they are quantized by Pontrjagin self-dual cohomology theories and that

Daniel S. Freed; Gregory W. Moore; Graeme Segal

2007-01-01

85

Uncertainty, information, and time-frequency distributions  

Microsoft Academic Search

The well-known uncertainty principle is often invoked in signal processing. It is also often considered to have the same implications in signal analysis as does the uncertainty principle in quantum mechanics. The uncertainty principle is often incorrectly interpreted to mean that one cannot locate the time-frequency coordinates of a signal with arbitrarily good precision, since, in quantum

W. J. Williams; M. L. Brown; A. O. Hero

1991-01-01

86

Evaluation of uncertainty visualization techniques for information fusion  

Microsoft Academic Search

This paper highlights the importance of uncertainty visualization in information fusion, reviews general methods of representing uncertainty, and presents perceptual and cognitive principles from Tufte, Chambers and Bertin as well as user experiments documented in the literature. Examples of uncertainty representations in information fusion are analyzed using these general theories. These principles can be used in future theoretical evaluations of

Maria Riveiro

2007-01-01

87

Uncertainty principles and ideal atomic decomposition  

Microsoft Academic Search

Suppose a discrete-time signal S(t), 0⩽t

David L. Donoho; Xiaoming Huo

2001-01-01

88

Uncertainties in estimating measurement uncertainties.  

National Technical Information Service (NTIS)

All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by ...

J. P. Clark A. H. Shull

1994-01-01

89

Measuring Uncertainty  

NSDL National Science Digital Library

This article, authored by P.G. Moore for the Royal Statistical Society's website, provides well-defined exercises to assess the probabilities of decision-making and the degree of uncertainty. The author states the focus of the article as: "When analyzing situations which involve decisions to be made as between alternative courses of action under conditions of uncertainty, decision makers and their advisers are often called upon to assess judgmental probability distributions of quantities whose true values are unknown to them. How can this judgment be taught?" Moore provides five different exercises and external references for those interested in further study of the topic.

Moore, P. G.

2009-04-08

90

Teaching Uncertainties  

ERIC Educational Resources Information Center

The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

Duerdoth, Ian

2009-01-01

91

Teaching Uncertainties  

ERIC Educational Resources Information Center

The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

Duerdoth, Ian

2009-01-01

92

Bernoulli's Principle  

ERIC Educational Resources Information Center

Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

Hewitt, Paul G.

2004-01-01

93

Entropic uncertainty relations under the relativistic motion  

NASA Astrophysics Data System (ADS)

The uncertainty principle bounds our ability to simultaneously predict two incompatible observables of a quantum particle. Assisted by a quantum memory to store the particle, this uncertainty could be reduced and quantified by a new Entropic Uncertainty Relation (EUR). In this Letter, we explore how the relativistic motion of the system would affect the EUR in two sample scenarios. First, we show that the Unruh effect of an accelerating particle would surely increase the uncertainty if the system and particle are entangled initially. On the other hand, the entanglement could be generated from nonuniform motion once the Unruh decoherence is prevented by utilizing a cavity. We show that, in an uncertainty game between an inertial cavity and a nonuniformly accelerated one, the uncertainty evolves periodically with respect to the duration of the acceleration segment. Therefore, with properly chosen cavity parameters, the uncertainty bound could be protected. Implications of our results for gravitation are also discussed.

Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

2013-10-01

94

Uncertainty analysis  

SciTech Connect

An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
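A minimal numpy sketch of the adjoint idea for a linear model Ax = b with a scalar response R = cᵀx (a generic illustration, not the report's exact procedure): one adjoint solve yields the sensitivities to every entry of b and A.

    # Adjoint sensitivities for A x = b with response R = c^T x: solving the
    # single adjoint system A^T lam = c gives dR/db = lam and dR/dA_ij = -lam_i x_j,
    # without perturbing each input in turn. Illustrative matrices only.
    import numpy as np

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    b = np.array([1.0, 2.0, 3.0])
    c = np.array([1.0, 0.0, 1.0])          # response R = x[0] + x[2]

    x = np.linalg.solve(A, b)              # primal solve
    lam = np.linalg.solve(A.T, c)          # single adjoint solve
    dR_db = lam                            # sensitivity of R to each entry of b
    dR_dA = -np.outer(lam, x)              # sensitivity of R to each entry of A (unused below)

    # finite-difference check on one component of b
    eps = 1e-6
    b_pert = b.copy(); b_pert[1] += eps
    R_pert = c @ np.linalg.solve(A, b_pert)
    print(dR_db[1], (R_pert - c @ x) / eps)   # the two numbers should agree closely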

Thomas, R.E.

1982-03-01

95

Uncertainty in the classroom---teaching quantum physics  

Microsoft Academic Search

The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how it can be used to elucidate many topics in

K. E. Johansson; D. Milstead

2008-01-01

96

Uncertainty in the Classroom--Teaching Quantum Physics  

ERIC Educational Resources Information Center

The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing…

Johansson, K. E.; Milstead, D.

2008-01-01

97

Entropic uncertainty relations in multidimensional position and momentum spaces  

SciTech Connect

Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

Huang Yichen [Department of Physics, University of California, Berkeley, Berkeley, California 94720 (United States)

2011-05-15

98

Communication and Uncertainty Management.  

ERIC Educational Resources Information Center

Suggests the fundamental challenge for refining theories of communication and uncertainty is to abandon the assumption that uncertainty will produce anxiety. Outlines and extends a theory of uncertainty management and reviews current theory and research. Concludes that people want to reduce uncertainty because it is threatening, but uncertainty

Brashers, Dale E.

2001-01-01

99

Pascal's Principle  

NSDL National Science Digital Library

This site from HyperPhysics provides a description of Pascal's Principle, which explains how pressure is transmitted in an enclosed fluid. Drawings and sample calculations are provided. Examples illustrating the principle include a hydraulic press and an automobile hydraulic lift.

Nave, Carl R.

2011-11-28

100

The physical origins of the uncertainty theorem  

NASA Astrophysics Data System (ADS)

The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

Giese, Albrecht

2013-10-01

101

The uncertainties in estimating measurement uncertainties  

SciTech Connect

All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty; and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from the calibration standards matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurement will be reviewed in this report and recommendations made for improving measurement uncertainties.

Clark, J.P.; Shull, A.H.

1994-07-01

102

Mechanics of Uncertainty: Managing Uncertainty in Mechanics  

Microsoft Academic Search

Uncertainty is ubiquitous in the natural, engineered, and social environments. Devising rationales for explaining it, strategies for its integration into scientific determinism and mitigating its consequences has been an active arena of rational endeavor where many scientific concepts have taken turns at fame and infamy. Far from being a static concept, uncertainty is the complement of knowledge, and as

Roger G. Ghanem

103

Uncertainty in artificial intelligence  

SciTech Connect

Dealing with uncertainty is central to Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

Kanal, L.N.; Lemmer, J.F.

1986-01-01

104

Bernoulli's Principle  

NSDL National Science Digital Library

Bernoulli's principle relates the pressure of a fluid to its elevation and its speed. Bernoulli's equation can be used to approximate these parameters in water, air or any fluid that has very low viscosity. Students learn about the relationships between the components of the Bernoulli equation through real-life engineering examples and practice problems.

Integrated Teaching And Learning Program And Laboratory

105

Bernoulli's Principle  

NSDL National Science Digital Library

In this lab, students will use a little background information about Bernoulli's principle to figure out how the spinning of a moving ball affects its trajectory. The activity is inquiry-based in that students will be discovering this relationship on their own.

Horton, Michael

2009-05-30

106

Uncertainty in audiometer calibration  

NASA Astrophysics Data System (ADS)

The objective of this work is to present a metrology study necessary for the accreditation of audiometer calibration procedures at the National Brazilian Institute of Metrology Standardization and Industrial Quality—INMETRO. A model for the calculation of measurement uncertainty was developed. Metrological aspects relating to audiometer calibration, traceability and measurement uncertainty were quantified through comparison between results obtained at the Industrial Noise Laboratory—LARI of the Federal University of Santa Catarina—UFSC and the Laboratory of Electric/acoustics—LAETA of INMETRO. Similar metrological performance of the measurement system used in both laboratories was obtained, indicating that the interlaboratory results are compatible with the expected values. The uncertainty calculation was based on the documents: EA-4/02 Expression of the Uncertainty of Measurement in Calibration (European Co-operation for Accreditation 1999 EA-4/02 p 79) and Guide to the Expression of Uncertainty in Measurement (International Organization for Standardization 1993 1st edn, corrected and reprinted in 1995, Geneva, Switzerland). Some sources of uncertainty were calculated theoretically (uncertainty type B) and other sources were measured experimentally (uncertainty type A). The global value of uncertainty calculated for the sound pressure levels (SPLs) is similar to that given by other calibration institutions. The results of uncertainty related to measurements of SPL were compared with the maximum uncertainties Umax given in the standard IEC 60645-1: 2001 (International Electrotechnical Commission 2001 IEC 60645-1 Electroacoustics—Audiological Equipment—Part 1:—Pure-Tone Audiometers).

Aurélio Pedroso, Marcos; Gerges, Samir N. Y.; Gonçalves, Armando A., Jr.

2004-02-01

107

Uncertainty and Cognitive Control  

PubMed Central

A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

2011-01-01

108

Psychosomatic Principles  

PubMed Central

There are four lines of development that might be called psychosomatic principles. The first represents the work initiated by Claude Bernard, Cannon, and others, in neurophysiology and endocrinology in relationship to stress. The second is the application of psychoanalytic formulations to the understanding of illness. The third is in the development of the social sciences, particularly anthropology, social psychology and sociology with respect to the emotional life of man, and, fourth, there is an increased application of epidemiological techniques to the understanding and incidence of disease and its causes. These principles can be applied to the concepts of comprehensive medicine and they bid fair to be unifying and helpful in its study. This means that future practitioners, as well as those working in the field of psychosomatic medicine, are going to have to have a much more precise knowledge of the influence of emotions on bodily processes.

Cleghorn, R. A.

1965-01-01

109

Uncertainty quantification for Markov chain models  

NASA Astrophysics Data System (ADS)

Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
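A toy Monte Carlo stand-in for uncertain transition matrices (the paper constructs the probability measure from the maximum entropy principle; the Dirichlet sampling, the 3-state disease chain and all numbers below are our own illustrative assumptions):

    # Propagate transition-matrix uncertainty through a toy 3-state (S, I, R)
    # disease-spread chain. Each row is drawn from a Dirichlet distribution
    # centred on a nominal matrix; the quantity of interest is the infected
    # fraction after 30 steps. Purely illustrative, not the paper's construction.
    import numpy as np

    rng = np.random.default_rng(1)
    nominal = np.array([[0.90, 0.10, 0.00],    # S -> S, I, R
                        [0.00, 0.70, 0.30],    # I -> S, I, R
                        [0.00, 0.00, 1.00]])   # R is absorbing
    concentration = 50.0                       # larger -> rows closer to nominal
    p0 = np.array([0.99, 0.01, 0.00])          # initial state distribution

    qoi = []
    for _ in range(5000):
        P = np.vstack([rng.dirichlet(concentration * row + 1e-6) for row in nominal])
        p = p0.copy()
        for _ in range(30):
            p = p @ P
        qoi.append(p[1])                       # infected fraction after 30 steps

    qoi = np.array(qoi)
    print(f"infected fraction at t=30: mean {qoi.mean():.4f}, std {qoi.std():.4f}")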

Meidani, Hadi; Ghanem, Roger

2012-12-01

110

Generalized entropic uncertainty relations  

Microsoft Academic Search

A new class of uncertainty relations is derived for pairs of observables in a finite-dimensional Hilbert space which do not have any common eigenvector. This class contains an "entropic" uncertainty relation which improves a previous result of Deutsch and confirms a recent conjecture by Kraus. Some comments are made on the extension of these relations to the case where the

Hans Maassen; J. B. M. Uffink

1988-01-01

111

The Uncertainty Relation for Quantum Propositions  

NASA Astrophysics Data System (ADS)

Logical propositions with the fuzzy modality "Probably" are shown to obey an uncertainty principle very similar to that of Quantum Optics. In the case of such propositions, the partial truth values are in fact probabilities. The corresponding assertions in the metalanguage, have complex assertion degrees which can be interpreted as probability amplitudes. In the logical case, the uncertainty relation is about the assertion degree, which plays the role of the phase, and the total number of atomic propositions, which plays the role of the number of modes. In analogy with coherent states in quantum physics, we define "quantum coherent propositions" those which minimize the above logical uncertainty relation. Finally, we show that there is only one kind of compound quantum-coherent propositions: the "cat state" propositions.

Zizzi, Paola

2013-01-01

112

The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment  

ERIC Educational Resources Information Center

An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…

Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea

2010-01-01

113

Quantum imaging, quantum lithography and the uncertainty principle  

Microsoft Academic Search

One of the most surprising consequences of quantum mechanics is the entanglement of two or more distant particles. Even though we still have questions in regard to fundamental issues of the entangled quantum systems, quantum entanglement has started to play important roles in practical applications. Quantum imaging is one of the hot topics. Quantum imaging has many interesting features

Y. Shih

2003-01-01

114

Quantum imaging, quantum lithography and the uncertainty principle  

Microsoft Academic Search

One of the most surprising consequences of quantum mechanics is the entanglement of two or more distant particles. The ghost image experiment demonstrated the astonishing nonlocal behavior of an entangled photon pair. Even though we still have questions in regard to fundamental issues of the entangled quantum systems, quantum entanglement has started to play important roles in practical applications. Quantum

Y. Shih

2003-01-01

115

The Effects of Discounted Cost on the Uncertainty Threshold Principle.  

National Technical Information Service (NTIS)

The optimal stochastic control of a linear system with purely random parameters and with respect to a discounted quadratic index of performance is considered. It is shown that if a function involving the parameter variances and the discount factor exceeds...

R. Ku M. Athans P. Varaiya

1977-01-01

116

The Precautionary Principle and Nanotechnology Risk Assessments  

Microsoft Academic Search

The great uncertainty over the potentially negative consequences of nanotechnology research demands that regulators and entities playing a role in implementing this relatively new field of technology assess which risks are acceptable, and which must be curtailed. A possible method of addressing these risks is embodied in the Precautionary Principle, a strategy of pursuing preventative measures against negative externalities when

Jamie W. Coslett

2005-01-01

117

Information Theoretic Quantification of Diagnostic Uncertainty  

PubMed Central

Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes’ rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians’ deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians’ application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
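A minimal sketch of the two ingredients discussed here: Bayes' rule updating of a pre-test probability by sensitivity and specificity, and binary entropy as one information-theoretic measure of the remaining diagnostic uncertainty (numbers are illustrative):

    # Update a pre-test probability with a test's sensitivity and specificity,
    # then report the binary (Shannon) entropy of the pre- and post-test
    # probabilities as a measure of diagnostic uncertainty. Illustrative numbers.
    import math

    def post_test_probability(pre, sensitivity, specificity, positive=True):
        if positive:
            num = sensitivity * pre
            den = num + (1.0 - specificity) * (1.0 - pre)
        else:
            num = (1.0 - sensitivity) * pre
            den = num + specificity * (1.0 - pre)
        return num / den

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

    pre = 0.10                                  # pre-test probability of disease
    post = post_test_probability(pre, sensitivity=0.90, specificity=0.80, positive=True)
    print(f"post-test probability = {post:.3f}")
    print(f"uncertainty (bits): pre-test {binary_entropy(pre):.3f} -> post-test {binary_entropy(post):.3f}")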

Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

2012-01-01

118

Strategy for Uncertainty Visualization Design.  

National Technical Information Service (NTIS)

Visualizing uncertainty can be a challenging endeavour. In an attempt to minimize the challenges, this paper defines a systematic approach to designing a visual representation of uncertainty called the Uncertainty Visualization Development Strategy (UVDS)...

A. S. Lapinski

2009-01-01

119

Minimum Uncertainty and Entanglement  

NASA Astrophysics Data System (ADS)

We address the question: does a system A being entangled with another system B put any constraints on the Heisenberg uncertainty relation (or the Schrödinger-Robertson inequality)? We find that equality in the uncertainty relation cannot be reached for any two noncommuting observables, for finite-dimensional Hilbert spaces, if the Schmidt rank of the entangled state is maximal. One consequence is that the lower bound of the uncertainty relation can never be attained for any two observables for qubits if the state is entangled. For infinite-dimensional Hilbert spaces too, we show that there is a class of physically interesting entangled states for which no two noncommuting observables can attain the minimum uncertainty equality.

Hari Dass, N. D.; Qureshi, Tabish; Sheel, Aditi

2013-06-01

120

Analysis of Infiltration Uncertainty  

SciTech Connect

In a total-system performance assessment (TSPA), uncertainty in the performance measure (e.g., radiation dose) is estimated by first estimating the uncertainty in the input variables and then propagating that uncertainty through the model system by means of Monte Carlo simulation. This paper discusses uncertainty in surface infiltration, which is one of the input variables needed for performance assessments of the Yucca Mountain site. Infiltration has been represented in recent TSPA simulations by using three discrete infiltration maps (i.e., spatial distributions of infiltration) for each climate state in the calculation of unsaturated-zone flow and transport. A detailed uncertainty analysis of infiltration was carried out for two purposes: to better quantify the possible range of infiltration, and to determine what probability weights should be assigned to the three infiltration cases in a TSPA simulation. The remainder of this paper presents the approach and methodology for the uncertainty analysis, along with a discussion of the results.
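A toy sketch of how discrete infiltration cases with probability weights can enter a Monte Carlo propagation of this kind (the infiltration values, weights and downstream model below are hypothetical placeholders, not values from the analysis):

    # Fold three weighted infiltration cases into a Monte Carlo propagation.
    # All values and the downstream "dose" model are hypothetical placeholders.
    import numpy as np

    rng = np.random.default_rng(42)
    infiltration_cases_mm_yr = np.array([4.0, 12.0, 36.0])   # low / mean / high maps (hypothetical)
    case_weights = np.array([0.3, 0.5, 0.2])                 # probability weights (hypothetical)

    n = 100_000
    case = rng.choice(3, size=n, p=case_weights)             # pick an infiltration case per realization
    infiltration = infiltration_cases_mm_yr[case]
    other_factor = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # stand-in for other uncertain inputs

    dose_proxy = 1e-3 * infiltration * other_factor          # placeholder performance measure
    print(f"dose proxy: mean {dose_proxy.mean():.3e}, 95th percentile {np.percentile(dose_proxy, 95):.3e}")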

MCCURLEY,RONALD D.; HO,CLIFFORD K.; WILSON,MICHAEL L.; HEVESI,JOSEPH A.

2000-10-30

121

Uncertainty: Medicine's Frequent Companion  

MedlinePLUS

... no means rare. The Elusive Gold Standard: The "gold standard" is a concept commonly embraced by doctors ... one or the other. The biopsy is the gold standard, and there is generally little uncertainty about ...

122

Aggregating and Communicating Uncertainty.  

National Technical Information Service (NTIS)

Strategic intelligence products contain a substantial element of uncertainty which must be communicated to the consumer if they are to be used effectively. As a social science, estimative intelligence uses methods which are largely global or intuitive in ...

J. M. Morris R. J. D'Amore

1980-01-01

123

Analysis of Infiltration Uncertainty.  

National Technical Information Service (NTIS)

The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site...

R. McCurley

2003-01-01

124

Uncertainty and Variability Analysis  

Microsoft Academic Search

Statistical methods are an inseparable component of all modeling studies. Most environmental processes we have covered in this book do not lend themselves to a deterministic mode of analysis because of the inherent uncertainties involved in the parameter values that describe the physical system analyzed. These uncertainties may arise from the randomness of the natural processes, a lack of data

Mustafa M. Aral

125

Principlism and moral dilemmas: a new principle  

PubMed Central

Moral conflicts occur in theories that involve more than one principle. I examine basic ways of dealing with moral dilemmas in medical ethics and in ethics generally, and propose a different approach based on a principle I call the "mutuality principle". It is offered as an addition to Tom Beauchamp and James Childress' principlism. The principle calls for the mutual enhancement of basic moral values. After explaining the principle and its strengths, I test it by way of an examination of three responses—in the recent Festschrift for Dr Raanon Gillon—to a case involving parental refusal of a blood transfusion. The strongest response is the one that comes closest to the requirements of the mutuality principle but yet falls short. I argue that the mutuality principle provides an explicit future orientation in principlism and gives it greater moral coherence.

DeMarco, J

2005-01-01

126

Role of the precautionary principle in water recycling  

Microsoft Academic Search

In an engineering context the precautionary principle is often perceived as an excuse to do nothing or a substantial barrier to technical progress. The precautionary principle requires that remedial measures be taken in situations of scientific uncertainty where evidence of harm cannot be proven but potential damage to human or environmental health is significant. In this paper the scope of

A. I. Schäfer; S. Beder

2006-01-01

127

The Precautionary Principle and the Prevention of Marine Pollution  

Microsoft Academic Search

This paper argues that the environmental changes witnessed in the past decade call for a new approach to environmental management; an approach based not on the principle of the assimilative capacity of the environment but on the precautionary principle, and the emerging preventive environmental paradigm. Uncertainties in scientific knowledge and complexities in ecological systems have presented specific failures of the

Tim Jackson; Peter J. Taylor

1992-01-01

128

Classification images with uncertainty  

PubMed Central

Classification image and other similar noise-driven linear methods have found increasingly wider applications in revealing psychophysical receptive field structures or perceptual templates. These techniques are relatively easy to deploy, and the results are simple to interpret. However, being a linear technique, the utility of the classification-image method is believed to be limited. Uncertainty about the target stimuli on the part of an observer will result in a classification image that is the superposition of all possible templates for all the possible signals. In the context of a well-established uncertainty model, which pools the outputs of a large set of linear frontends with a max operator, we show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not be combined, as is conventionally done. We show that when the signal contrast is high, the subimages from the error trials contain a clear high-contrast image that is negatively correlated with the perceptual template associated with the presented signal, relatively unaffected by uncertainty. The subimages also contain a “haze” that is of a much lower contrast and is positively correlated with the superposition of all the templates associated with the erroneous response. In the case of spatial uncertainty, we show that the spatial extent of the uncertainty can be estimated from the classification subimages. We link intrinsic uncertainty to invariance and suggest that this signal-clamped classification-image method will find general applications in uncovering the underlying representations of high-level neural and psychophysical mechanisms.

Tjan, Bosco S.; Nandy, Anirvan S.

2009-01-01

129

Measurement Uncertainty Estimation in Amperometric Sensors: A Tutorial Review  

PubMed Central

This tutorial focuses on measurement uncertainty estimation in amperometric sensors (both for liquid and gas-phase measurements). The main uncertainty sources are reviewed and their contributions are discussed with relation to the principles of operation of the sensors, measurement conditions and properties of the measured samples. The discussion is illustrated by case studies based on the two major approaches for uncertainty evaluation–the ISO GUM modeling approach and the Nordtest approach. This tutorial is expected to be of interest to workers in different fields of science who use measurements with amperometric sensors and need to evaluate the uncertainty of the obtained results but are new to the concept of measurement uncertainty. The tutorial is also expected to be educative in order to make measurement results more accurate.

Helm, Irja; Jalukse, Lauri; Leito, Ivo

2010-01-01

130

The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making under Dynamic Uncertainty.  

National Technical Information Service (NTIS)

This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exi...

M. Athans R. Ku S. B. Gershwin

1976-01-01

131

Site uncertainty, allocation uncertainty, and superfund liability valuation  

Microsoft Academic Search

The amount and timing of a firm's ultimate financial obligation for contingent liabilities is uncertain and subject to the outcome of future events. We decompose uncertainty about Superfund contingent liabilities into two sources: (1) uncertainty regarding site clean-up cost (site uncertainty); and (2) uncertainty regarding allocation of total site-clean-up cost across multiple parties associated with the site (allocation uncertainty). We

Katherine Campbell; Stephan E. Sefcik; Naomi S. Soderstrom

1998-01-01

132

Uncertainty relation for photons.  

PubMed

The uncertainty relation for the photons in three dimensions that overcomes the difficulties caused by the nonexistence of the photon position operator is derived in quantum electrodynamics. The photon energy density plays the role of the probability density in configuration space. It is shown that the measure of the spatial extension based on the energy distribution in space leads to an inequality that is a natural counterpart of the standard Heisenberg relation. The equation satisfied by the photon wave function in momentum space which saturates the uncertainty relations has the form of the Schrödinger equation in coordinate space in the presence of electric and magnetic charges. PMID:22540772

Bialynicki-Birula, Iwo; Bialynicka-Birula, Zofia

2012-04-03

133

The legacy of uncertainty  

NASA Astrophysics Data System (ADS)

An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

Brown, Laurie M.

1993-03-01

134

Simple Resonance Hierarchy for Surmounting Quantum Uncertainty  

SciTech Connect

For a hundred years, violating or surmounting the quantum uncertainty principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of quantum theory, cast in a string-theoretic higher-dimensional (HD) form of a Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, M_{4±}C_4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac-effect RF-pulsed laser-oscillated vacuum energy resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer transactional Calabi-Yau mirror-symmetric spacetime backcloth.

Amoroso, Richard L. [Noetic Advanced Studies Institute, Oakland, CA 94610-1422 (United States)

2010-12-22

135

Uncertainties in repository modeling  

SciTech Connect

The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given the uncertainties in dating, calibration, and modeling.

Wilson, J.R.

1996-12-31

136

Separability conditions from the Landau-Pollak uncertainty relation  

SciTech Connect

We obtain a collection of necessary (sufficient) conditions for a bipartite system of qubits to be separable (entangled), which are based on the Landau-Pollak formulation of the uncertainty principle. These conditions are tested and compared with previously stated criteria by applying them to states whose separability limits are already known. Our results are also extended to multipartite and higher-dimensional systems.

Vicente, Julio I. de [Departamento de Matematicas, Universidad Carlos III de Madrid, Avenida de la Universidad 30, 28911 Leganes, Madrid (Spain); Sanchez-Ruiz, Jorge [Departamento de Matematicas, Universidad Carlos III de Madrid, Avenida de la Universidad 30, 28911 Leganes, Madrid (Spain); Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071 Granada (Spain)

2005-05-15

137

Stochastic modeling of uncertainties in computational structural dynamics—Recent theoretical advances  

NASA Astrophysics Data System (ADS)

This paper deals with a short overview on stochastic modeling of uncertainties. We introduce the types of uncertainties, the variability of real systems, the types of probabilistic approaches, the representations for the stochastic models of uncertainties, the construction of the stochastic models using the maximum entropy principle, the propagation of uncertainties, the methods to solve the stochastic dynamical equations, the identification of the prior and the posterior stochastic models, the robust updating of the computational models and the robust design with uncertain computational models. We present recent theoretical advances in this field concerning the parametric and the nonparametric probabilistic approaches of uncertainties in computational structural dynamics for the construction of the prior stochastic models of both the uncertainties on the computational model parameters and on the modeling uncertainties, and for their identification with experimental data. We also present the construction of the posterior stochastic model of uncertainties using the Bayesian method when experimental data are available.
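
As a reminder of the construction invoked above, the generic maximum entropy problem under moment constraints and its exponential-family solution are sketched below in LaTeX; this is the textbook statement of the principle, not the specific parametric or nonparametric constructions developed in the paper.

    \max_{p}\; -\int p(x)\,\ln p(x)\,dx
    \quad\text{subject to}\quad
    \int p(x)\,dx = 1, \qquad \int g_i(x)\,p(x)\,dx = c_i,
    \qquad\Longrightarrow\qquad
    p(x) = \exp\!\Big(-\lambda_0 - \sum_i \lambda_i\, g_i(x)\Big),

with the Lagrange multipliers \lambda_i fixed by the constraints.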

Soize, C.

2013-05-01

138

An exploration of the uncertainty relation satisfied by BP network learning ability and generalization ability  

Microsoft Academic Search

This paper analyses the intrinsic relationship between the BP network learning ability and generalization ability and other influencing factors when overfitting occurs, and introduces the multiple correlation coefficient to describe the complexity of samples; it follows the calculation uncertainty principle and the minimum principle of neural network structural design, provides an analogy of the general uncertainty relation in the

Zuoyong Li; Lihong Peng

2004-01-01

139

Generalized Quantization Principle in Canonical Quantum Gravity and Application to Quantum Cosmology  

Microsoft Academic Search

In this paper is considered a generalized quantization principle for the gravitational field in canonical quantum gravity, especially with respect to quantum geometrodynamics. This assumption can be interpreted as a transfer from the generalized uncertainty principle in quantum mechanics, which is postulated as generalization of the Heisenberg algebra to introduce a minimal length, to a corresponding quantization principle concerning the

Martin Kober

2011-01-01

140

Precautionary principles: general definitions and specific applications to genetically modified organisms  

Microsoft Academic Search

Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find specific principles unacceptably vague or see them as clearly

Ragnar E. Löfstedt; Baruch Fischhoff; Ilya R. Fischhoff

2002-01-01

141

Information-theoretic measure of uncertainty due to quantum and thermal fluctuations  

Microsoft Academic Search

We study an information-theoretic measure of uncertainty for quantum systems. It is the Shannon information I of the phase-space probability distribution ⟨z|ρ|z⟩/π, where |z⟩ are coherent states and ρ is the density matrix. As shown by Lieb, I ≥ 1, and this bound represents a strengthened version of the uncertainty principle. For a harmonic oscillator in a thermal state, I coincides with
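
For orientation, the bound quoted above is Lieb's proof of the Wehrl conjecture; with the conventional normalization of the Husimi distribution (an assumption about the notation, since the abstract's formula did not survive extraction), it reads

    Q(z) = \frac{\langle z|\rho|z\rangle}{\pi}, \qquad
    I = -\int d^{2}z \; Q(z)\,\ln Q(z) \;\ge\; 1 .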

Arlen Anderson; Jonathan J. Halliwell

1993-01-01

142

Uncertainty measures and uncertainty relations for angle observables  

Microsoft Academic Search

Uncertainty measures must not depend on the choice of origin of the measurement scale; it is therefore argued that quantum-mechanical uncertainty relations, too, should remain invariant under changes of origin. These points have often been neglected in dealing with angle observables. Known measures of location and uncertainty for angles are surveyed. The angle variance angv{φ} is defined and discussed.

Ernst Breitenberger

1985-01-01

143

Position-momentum uncertainty relations based on moments of arbitrary order  

SciTech Connect

The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
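
For reference, the celebrated Heisenberg formulation that the moment-based inequality generalizes reads, for each Cartesian pair in d dimensions (standard conventions assumed),

    \Delta x_i\,\Delta p_i \;\ge\; \frac{\hbar}{2}, \qquad
    \Delta x_i = \sqrt{\langle x_i^{2}\rangle - \langle x_i\rangle^{2}}, \quad
    \Delta p_i = \sqrt{\langle p_i^{2}\rangle - \langle p_i\rangle^{2}} .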

Zozor, Steeve [Laboratoire Grenoblois d'Image, Parole, Signal et Automatique (GIPSA-Lab, CNRS), 961 rue de la Houille Blanche, F-38402 Saint Martin d'Heres (France); Portesi, Mariela [Instituto de Fisica La Plata (CONICET), and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina); Sanchez-Moreno, Pablo [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, E-18071 Granada (Spain); Departamento de Matematica Aplicada, Universidad de Granada, E-18071 Granada (Spain); Dehesa, Jesus S. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, E-18071 Granada (Spain); Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, E-18071 Granada (Spain)

2011-05-15

144

PARTICIPATION UNDER UNCERTAINTY  

Microsoft Academic Search

This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an

Moses A. Boudourides

145

Inherent uncertainty of air kerma  

NASA Astrophysics Data System (ADS)

Air kerma is defined in air, the constituents and the amounts of which are determined experimentally with some uncertainties. The inherent uncertainty of air kerma in a photon beam stemming from the uncertainties of atomic weights and composition data in air was evaluated as a function of photon energy. The uncertainty was 1 part in 10^4 for photon beams with energies in the range 1 keV to 20 MeV.

Yi, Chul-Young

2013-04-01

146

Intuitions, principles and consequences  

PubMed Central

Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences. Key Words: Intuitions • principles • consequences • utilitarianism

Shaw, A

2001-01-01

147

Practice uncertainty: changing perceptions.  

PubMed

OBJECTIVES: Review factors leading to practice change; describe actions that can be taken to reduce negative consequences of practice uncertainty. Practice uncertainty occurs when health care providers feel uncomfortable in response to unfamiliar or challenging patient care situations. Practice uncertainty is inevitable in health care, and there are many contextual factors that can lead to either good or bad outcomes for patients and health care providers. Practice uncertainty is not a well-established concept in the literature, perhaps because of the predominant empirical paradigm and the high value placed on certainty within current health care culture. This study was conducted to explore practice uncertainty and bring this topic into the foreground as a first step toward practice evolution. A shift in the perception of practice uncertainty may change the way in which practitioners experience this phenomenon. This process must start with nursing educators recognizing and acknowledging this phenomenon when it occurs. J Contin Educ Nurs 2013;44(10):439-444. PMID:23875604

Vaid, Patrycja R; Ewashen, Carol; Green, Theresa

2013-07-23

148

Earnings Uncertainty and Precautionary Saving  

Microsoft Academic Search

We test for the presence of precautionary saving using a self-reported measure of earnings uncertainty drawn from the 1989 Italian Survey of Household Income and Wealth. The effect of uncertainty on saving and wealth accumulations is consistent with the theory of precautionary saving and with decreasing prudence. Earnings uncertainty, however, explains only a small fraction of saving and asset accumulation.

Luigi Guiso; Tullio Jappelli; Daniele Terlizzese

1992-01-01

149

Multidisciplinary Structural Optimization Considering Uncertainty  

Microsoft Academic Search

This paper proposes structural optimization considering uncertainty. Loading and inclination uncertainties are considered in this project. The methodology of structural optimization considering uncertainty results in structural designs which perform adequately when exposed to a range of uncertain loading and inclination conditions. The performance metrics used in this study are structural assembly time and maximum stress. The improvement of these metrics

William Nadir

150

Collective uncertainty entanglement test.  

PubMed

For a given pure state of a composite quantum system we analyze the product of its projections onto a set of locally orthogonal separable pure states. We derive a bound for this product analogous to the entropic uncertainty relations. For bipartite systems the bound is saturated for maximally entangled states, and it allows us to construct a family of entanglement measures, which we shall call collectibility. As these quantities are experimentally accessible, the approach advocated contributes to the task of experimental quantification of quantum entanglement, while for a three-qubit system it is capable of identifying genuine three-party entanglement. PMID:22107275

Rudnicki, Łukasz; Horodecki, Paweł; Życzkowski, Karol

2011-10-03

151

Missing Principle of War.  

National Technical Information Service (NTIS)

The nine principles of war have guided American military doctrine since 1921. However, these principles since their formulation eighty years ago, have not accounted for the human dimension of war. Major military operations and campaigns are not won or los...

B. E. Kulifay

2000-01-01

152

Principles of Political Ecology.  

National Technical Information Service (NTIS)

Presented are the concepts and principles of political ecology, i.e., the application of ecological principles to public affairs. The public issues of direct concern in political ecology are environmental quality; energy, materials, and other natural reso...

R. L. Shelton

1976-01-01

153

Chemical Principles Exemplified  

ERIC Educational Resources Information Center

Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

Plumb, Robert C.

1973-01-01

154

Pauli's Exclusion Principle  

NASA Astrophysics Data System (ADS)

1. The exclusion principle: a philosophical overview; 2. The origins of the exclusion principle: an extremely natural prescriptive rule; 3. From the old quantum theory to the new quantum theory: reconsidering Kuhn's incommensurability; 4. How Pauli's rule became the exclusion principle: from the Fermi-Dirac statistics to the spin-statistics theorem; 5. The exclusion principle opens up new avenues: from the eightfold way to quantum chromodynamics.

Massimi, Michela

2012-10-01

155

Towards richer process principles  

Microsoft Academic Search

Today's and tomorrow's complex, interdependent, dynamic systems require richer process principles than the simplistic principles in the Agile Manifesto or in simplistic sequential waterfall or Vee models. The resulting principles should capitalize on the strengths of these while avoiding their weaknesses.

Barry Boehm

2011-01-01

156

Chemical Principles Exemplified  

ERIC Educational Resources Information Center

This is the first of a new series of brief ancedotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

Plumb, Robert C.

1970-01-01

157

ONLINE COLLABORATION PRINCIPLES  

Microsoft Academic Search

This paper uses the community of inquiry model to describe the principles of collaboration. The principles describe social and cognitive presence issues associated with the three functions of teaching presence—design, facilitation and direction. Guidelines are discussed for each of the principles.

D. R. Garrison

158

Deriving Acquisition Principles from Tutoring Principles  

Microsoft Academic Search

This paper describes our analysis of the literature on tutorial dialogues and presents a compilation of useful principles that students and teachers typically follow in making tutoring interactions successful. The compilation is done in the context of making use of those principles in building knowledge acquisition interfaces, since acquisition interfaces can be seen as students acquiring knowledge from the user.

Jihie Kim; Yolanda Gil

2002-01-01

159

Quantum maximum entropy principle for fractional exclusion statistics.  

PubMed

Using the Wigner representation, compatibly with the uncertainty principle, we formulate a quantum maximum entropy principle for fractional exclusion statistics. By considering anyonic systems satisfying fractional exclusion statistics, all the results available in the literature are generalized in terms of both the kind of statistics and a nonlocal description for excluson gases. Gradient quantum corrections are explicitly given at different levels of degeneracy, and classical results are recovered when ℏ → 0. PMID:23383879

Trovato, M; Reggiani, L

2013-01-09

160

Managing Uncertainty in Data and Models: UncertWeb  

NASA Astrophysics Data System (ADS)

There is an increasing recognition that issues of quality, error and uncertainty are central concepts to both scientific progress and practical decision making. Recent moves towards evidence driven policy and complex, uncertain scientific investigations into climate change and its likely impacts have heightened the awareness that uncertainty is critical in linking our observations and models to reality. The most natural, principled framework is provided by Bayesian approaches, which recognise a variety of sources of uncertainty such as aleatory (variability), epistemic (lack of knowledge) and possibly ontological (lack of agreed definitions). Most current information models used in the geosciences do not fully support the communication of uncertain results, although some do provide limited support for quality information in metadata. With the UncertWeb project (http://www.uncertweb.org), involving statisticians, geospatial and application scientists and informaticians we are developing a framework for representing and communicating uncertainty in observational data and models which builds on existing standards such as the Observations and Measurements conceptual model, and related Open Geospatial Consortium and ISO standards to allow the communication and propagation of uncertainty in chains of model services. A key component is the description of uncertainties in observational data, based on a revised version of UncertML, a conceptual model and encoding for representing uncertain quantities. In this talk we will describe how we envisage using UncertML with existing standards to describe the uncertainty in observational data and how this uncertainty information can then be propagated through subsequent analysis. We will highlight some of the tools which we are developing within UncertWeb to support the management of uncertainty in web based geoscientific applications.
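
A minimal sketch of the core idea (attaching a distribution to an observed quantity and propagating it by sampling); the field names below are hypothetical illustrations inspired by the UncertML concept, not the actual UncertML schema or the UncertWeb service interfaces.

    import random
    import statistics

    # Hypothetical representation of an uncertain observation: a named
    # quantity carrying a distribution instead of a single number.
    uncertain_temperature = {
        "quantity": "air_temperature",
        "units": "degC",
        "distribution": {"type": "Normal", "mean": 14.2, "stddev": 0.6},
    }

    def sample(obs, n=10_000):
        """Draw Monte Carlo realisations from the attached distribution."""
        d = obs["distribution"]
        return [random.gauss(d["mean"], d["stddev"]) for _ in range(n)]

    # Propagate through a downstream model (here a toy linear bias correction)
    # by pushing each realisation through the model chain.
    realisations = [1.02 * t - 0.3 for t in sample(uncertain_temperature)]
    print(statistics.mean(realisations), statistics.stdev(realisations))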

Nativi, S.; Cornford, D.; Pebesma, E. J.

2010-12-01

161

Uncertainty in Seismic Hazard Assessment  

NASA Astrophysics Data System (ADS)

Uncertainty is a part of our life, and society has to deal with it, even though it is sometimes difficult to estimate. This is particularly true in seismic hazard assessment for large events, such as the mega-tsunami in Southeast Asia and the great New Madrid earthquakes in the central United States. There are two types of uncertainty in seismic hazard assessment: temporal and spatial. Temporal uncertainty describes distribution of the events in time and is estimated from the historical records, while spatial uncertainty describes distribution of physical measurements generated at a specific point by the events and is estimated from the measurements at the point. These uncertainties are of different characteristics and generally considered separately in hazard assessment. For example, temporal uncertainty (i.e., the probability of exceedance in a period) is considered separately from spatial uncertainty (a confidence level of physical measurement) in flood hazard assessment. Although estimating spatial uncertainty in seismic hazard assessment is difficult because there are not enough physical measurements (i.e., ground motions), it can be supplemented by numerical modeling. For example, the ground motion uncertainty or tsunami uncertainty at a point of interest has been estimated from numerical modeling. Estimating temporal uncertainty is particularly difficult, especially for large earthquakes, because there are not enough instrumental, historical, and geological records. Therefore, the temporal and spatial uncertainties in seismic hazard assessment are of different characteristics and should be determined separately. Probabilistic seismic hazard analysis (PSHA), the most widely used method to assess seismic hazard for various aspects of public and financial policy, uses spatial uncertainty (ground motion uncertainty) to extrapolate temporal uncertainty (ground motion occurrence), however. This extrapolation, or so-called ergodic assumption, is caused by a mathematical error in hazard calculation of PSHA: incorrectly equating the conditional exceedance probability of the ground-motion attenuation relationship (a function) to the exceedance probability of the ground-motion uncertainty (a variable). An alternative approach has been developed to correct the error and to determine temporal and spatial uncertainties separately.
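
For context on the "probability of exceedance in a period" discussed above, the conventional Poissonian hazard arithmetic is sketched below; the recurrence rate and exposure window are illustrative numbers, and this is the standard calculation, not the alternative approach the author proposes.

    import math

    # Illustrative inputs: mean annual rate of events exceeding a given
    # ground-motion level, and an exposure window in years.
    annual_rate = 1.0 / 500.0   # one exceedance per 500 years on average
    exposure_years = 50.0

    # Temporal uncertainty under a Poisson occurrence model: probability of
    # at least one exceedance during the exposure window.
    p_exceedance = 1.0 - math.exp(-annual_rate * exposure_years)
    print(f"P(exceedance in {exposure_years:.0f} yr) = {p_exceedance:.3f}")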

Wang, Z.

2006-12-01

162

Living with uncertainty  

SciTech Connect

In the electric utility industry, only one thing can be guaranteed with absolute certainty: one lives and works with many unknowns. Thus, the industry has embraced probability methods to varying degrees over the last 25 years. These techniques aid decision makers in planning, operations, and maintenance by quantifying uncertainty. Examples include power system reliability, production costing simulation, and assessment of environmental factors. A series of brainstorming sessions was conducted by the Application of Probability Methods (APM) Subcommittee of the IEEE Power Engineering Society to identify research and development needs and to ask the question, 'Where should we go from here?' The subcommittee examined areas of need in data development, applications, and methods for decision making. The purpose of this article is to share the findings and the thoughts of APM members with a broader audience and to invite comments and participation.

Rau, N.; Fong, C.C.; Grigg, C.H.; Silverstein, B.

1994-11-01

163

Parameter uncertainty for ASP models  

SciTech Connect

The steps involved in incorporating parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are comprised of more than one component which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
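
A minimal sketch of propagating lognormal parameter uncertainty through a simple event model; the basic-event medians, error factors, and the two-branch model are hypothetical stand-ins, not the NRC ASP models themselves.

    import math
    import random

    random.seed(0)

    def sample_lognormal(median, error_factor):
        """Sample a lognormal whose sigma is derived from the 95th-percentile
        error factor: EF = exp(1.645 * sigma)."""
        sigma = math.log(error_factor) / 1.645
        return random.lognormvariate(math.log(median), sigma)

    N = 20_000
    top_event = []
    for _ in range(N):
        p1 = sample_lognormal(1e-3, 5.0)    # hypothetical basic event 1
        p2 = sample_lognormal(5e-4, 10.0)   # hypothetical basic event 2
        # Toy model: top event occurs if either basic event occurs (OR gate).
        top_event.append(1.0 - (1.0 - p1) * (1.0 - p2))

    top_event.sort()
    print(f"mean = {sum(top_event) / N:.2e}")
    print(f"5th percentile  = {top_event[int(0.05 * N)]:.2e}")
    print(f"95th percentile = {top_event[int(0.95 * N)]:.2e}")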

Knudsen, J.K.; Smith, C.L.

1995-10-01

164

Participatory Development Principles and Practice: Reflections of a Western Development Worker.  

ERIC Educational Resources Information Center

Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

Keough, Noel

1998-01-01

165

Uncertainty and Surprise: An Introduction  

NASA Astrophysics Data System (ADS)

Much of the traditional scientific and applied scientific work in the social and natural sciences has been built on the supposition that the unknowability of situations is the result of a lack of information. This has led to an emphasis on uncertainty reduction through ever-increasing information seeking and processing, including better measurement and observational instrumentation. Pending uncertainty reduction through better information, efforts are devoted to uncertainty management and hierarchies of controls. A central goal has been the avoidance of surprise.

McDaniel, Reuben R.; Driebe, Dean J.

166

Uncertainty quantification in molecular dynamics  

NASA Astrophysics Data System (ADS)

This dissertation focuses on uncertainty quantification (UQ) in molecular dynamics (MD) simulations. The application of UQ to molecular dynamics is motivated by the broad uncertainty characterizing MD potential functions and by the complexity of the MD setting, where even small uncertainties can be amplified to yield large uncertainties in the model predictions. Two fundamental, distinct sources of uncertainty are investigated in this work, namely parametric uncertainty and intrinsic noise. Intrinsic noise is inherently present in the MD setting, due to fluctuations originating from thermal effects. Averaging methods can be exploited to reduce the fluctuations, but due to finite sampling, this effect cannot be completely filtered, thus yielding a residual uncertainty in the MD predictions. Parametric uncertainty, on the contrary, is introduced in the form of uncertain potential parameters, geometry, and/or boundary conditions. We address the UQ problem in both its main components, namely the forward propagation, which aims at characterizing how uncertainty in model parameters affects selected observables, and the inverse problem, which involves the estimation of target model parameters based on a set of observations. The dissertation highlights the challenges arising when parametric uncertainty and intrinsic noise combine to yield non-deterministic, noisy MD predictions of target macroscale observables. Two key probabilistic UQ methods, namely Polynomial Chaos (PC) expansions and Bayesian inference, are exploited to develop a framework that enables one to isolate the impact of parametric uncertainty on the MD predictions and, at the same time, properly quantify the effect of the intrinsic noise. Systematic applications to a suite of problems of increasing complexity lead to the observation that an uncertain PC representation built via Bayesian regression is the most suitable model for the representation of uncertain MD predictions of target observables in the presence of intrinsic noise and parametric uncertainty. The dissertation is organized in successive, self-contained problems of increasing complexity aimed at investigating the target UQ challenge in a progressive fashion.
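
A minimal sketch of the forward-propagation side of the problem described above, assuming a toy observable: parametric uncertainty enters through an uncertain potential parameter, intrinsic noise through finite-sample thermal fluctuations, and the two variance contributions are separated with the law of total variance. The toy observable and noise model are illustrative, not an actual MD simulation or the dissertation's PC/Bayesian framework.

    import random
    import statistics

    random.seed(1)

    def md_observable(epsilon, n_steps=200):
        """Toy stand-in for an MD-averaged observable: depends on an uncertain
        potential parameter epsilon and carries intrinsic (thermal) noise that
        shrinks with the number of averaged steps."""
        true_value = 2.0 * epsilon + 0.1 * epsilon**2
        noise = random.gauss(0.0, 0.5 / n_steps**0.5)
        return true_value + noise

    outer, inner = 200, 50
    cond_means, cond_vars = [], []
    for _ in range(outer):                     # loop over parametric uncertainty
        epsilon = random.gauss(1.0, 0.05)      # uncertain potential parameter
        replicas = [md_observable(epsilon) for _ in range(inner)]  # intrinsic noise
        cond_means.append(statistics.mean(replicas))
        cond_vars.append(statistics.variance(replicas))

    # Law of total variance (estimated): Var(Q) ~ Var(E[Q|eps]) + E[Var(Q|eps)]
    print(f"parametric contribution:      {statistics.variance(cond_means):.5f}")
    print(f"intrinsic-noise contribution: {statistics.mean(cond_vars):.5f}")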

Rizzi, Francesco

167

Higher-order uncertainty relations  

NASA Astrophysics Data System (ADS)

Using the non-negativity of Gram determinants of arbitrary order, we derive higher-order uncertainty relations for the symmetric uncertainty matrices of corresponding order n > 2 to n Hermitean operators (n = 2 is the usual case). The special cases of third-order and fourth-order uncertainty relations are considered in detail. The obtained third-order uncertainty relations are applied to the Lie groups SU(1,1) with three Hermitean basis operators (K1, K2, K0) and SU(2) with three Hermitean basis operators (J1, J2, J3) where, in particular, the group-coherent states of Perelomov type and of Barut-Girardello type for SU(1,1) and the spin or atomic coherent states for SU(2) are investigated. The uncertainty relations for the determinant of the third-order uncertainty matrix are satisfied with the equality sign for coherent states and this determinant becomes vanishing for the Perelomov type of coherent states for SU(1,1) and SU(2). As an example of the application of fourth-order uncertainty relations, we consider the canonical operators (Q1, P1, Q2, P2) of two boson modes and the corresponding uncertainty matrix formed by the operators of the corresponding mean deviations, taking into account the correlations between the two modes. In two mathematical appendices, we prove the non-negativity of the determinant of correlation matrices of arbitrary order and clarify the principal structure of higher-order uncertainty relations.

Wünsche, A.

2006-07-01

168

A simple experimental checking of Heisenberg's uncertainty relations  

NASA Astrophysics Data System (ADS)

We show that the quantum mechanical interpretation of the diffraction of light on a slit, when a wave function is assigned to a photon, can be used for a direct experimental study of Heisenberg's position-momentum and equivalent position-wave vector uncertainty relation for the photon. Results of an experimental test of the position-wave vector uncertainty relation, where the wavelength is used as the input parameter, are given and they very well confirm our approach. The same experimental results can also be used for a test of the position-momentum uncertainty relation when the momentum p0 of a photon is known as the input parameter. We show that a measurement of p0, independent of knowledge of the value of Planck's constant, is possible. Using that value of p0, a test of the position-momentum uncertainty relation could be regarded as a method for a direct measurement of Planck's constant. This is discussed, since the diffraction pattern is also well described by classical electrodynamics in the considered experimental conditions. This approach to testing Heisenberg's uncertainty relations is very simple and consequently suitable as a quantitative exercise in undergraduate experimental courses, as well as a visual and attractive demonstration of Heisenberg's uncertainty principle in courses of quantum mechanics.
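
A short numerical illustration of the diffraction-based check described above, assuming the usual single-slit picture: the slit width sets the position uncertainty and the angle of the first diffraction minimum sets the transverse momentum spread. The wavelength and slit width are illustrative values, not those used in the paper.

    import math

    h = 6.626e-34          # Planck's constant, J s
    wavelength = 632.8e-9  # illustrative He-Ne laser wavelength, m
    slit_width = 100e-6    # illustrative slit width, m

    # Position uncertainty taken as the slit width.
    delta_x = slit_width

    # First diffraction minimum at sin(theta) = lambda / a; the transverse
    # momentum spread is estimated as p0 * sin(theta) with p0 = h / lambda.
    p0 = h / wavelength
    delta_p = p0 * (wavelength / slit_width)   # equals h / slit_width

    hbar = h / (2 * math.pi)
    print(f"dx * dp = {delta_x * delta_p:.2e} J s")   # comes out ~ h
    print(f"hbar/2  = {hbar / 2:.2e} J s")            # comfortably smaller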

Pašić, S.; Gamulin, O.; Tocilj, Z.

169

Analysis of Infiltration Uncertainty  

SciTech Connect

The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the ''Total System Performance Assessment-License Application Methods and Approach'' (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein, and it is based on the use of the models developed in or for ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the simulated multi-rectangular region approximating the repository footprint, shown in Figure 1-1. (For brevity, these maps will be referred to as the analog maps, and the corresponding averaged net infiltration values as the analog values.)

R. McCurley

2003-10-27

170

Uncertainty analysis in RECCAP  

NASA Astrophysics Data System (ADS)

The Global Carbon Project RECCAP exercise aims to produce regional analyses of net carbon fluxes between the atmosphere and the land and ocean carbon systems. The project aims to synthesise multiple source of information from modelling, inversions and inventory studies. A careful analysis of uncertainty is essential, both for the final synthesis and for assuring consistency in the process of combining disparate inputs. A unifying approach is to treat the overall analysis as a process of statistical estimation. The broadest-scale grouping of approaches is `top-down' vs. `bottom-up' techniques, but each of these needs to be further partitioned. Top-down approaches generally take the form of inversions, using measurements of carbon dioxide concentrations to either deduce surface concentrations or deduce parameters in spatially-explicit process-based models. These two types of inversion will have somewhat different statistical characteristics, but each will achieve only limited spatial resolution due to the ill-conditioned nature of the inversion. Bottom-up techniques aim to resolve great spatial detail. They comprise both census-type studies (mainly for anthropogenic emissions) and modelling studies with remotely-sensed data to provide spatially and temporally explicit forcing or constraints. Again, these two types of approach are likely to have quite different statistical characteristics. An important issue in combining information is consistency between definitions used for the disparate components. Cases where there is significant potential for ambiguity include wildfire and delayed responses to land-use change. A particular concern is the potential for `double counting' when combining bottom-up estimates with the results of inversion techniques that have incorporated Bayesian constraints using the same data as is used in the bottom-up estimates. The communication of distribution of uncertainty in one time and two space dimensions poses particular challenges. Temporal variability can be usefully characterised in terms of long-term trends, seasonal cycles and irregular variability. Additional choices need to be made concerning the frequency ranges that define each of these components. Spatial resolution remains problematic with the diffuse boundaries of top-down approaches failing to match the sharp boundaries from bottom-up techniques.

Enting, I. G.

2010-12-01

171

Uncertainty of Thermal Diffusivity Measurements by Laser Flash Method  

Microsoft Academic Search

The laser-pulse method is a well-established nonsteady-state measurement technique for measuring the thermal diffusivity, a, of solid homogeneous isotropic opaque materials. BNM-LNE has developed its own bench based on the principle of this method in which the thermal diffusivity is identified according to the “partial time moments method.” Uncertainties of thermal diffusivity by means of this method have been calculated
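
For orientation, the classical half-rise-time relation used in flash-method analysis (Parker's formula) and a crude uncertainty budget are sketched below with illustrative sample values; note that the work described above identifies the diffusivity via the partial time moments method rather than this simple formula.

    import math

    # Classical Parker relation for the laser-flash method:
    # a = 0.1388 * L**2 / t_half, with L the sample thickness and t_half the
    # time for the rear face to reach half of its maximum temperature rise.
    L = 2.0e-3        # illustrative sample thickness, m
    t_half = 0.050    # illustrative half-rise time, s
    a = 0.1388 * L**2 / t_half
    print(f"thermal diffusivity a = {a:.2e} m^2/s")

    # For uncorrelated inputs the relative uncertainties combine as
    # u_a/a = sqrt((2 * u_L/L)**2 + (u_t/t_half)**2).
    u_L_rel, u_t_rel = 0.005, 0.01     # illustrative relative uncertainties
    u_a_rel = math.sqrt((2 * u_L_rel)**2 + u_t_rel**2)
    print(f"relative uncertainty of a ~ {100 * u_a_rel:.1f} %")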

B. Hay; J. R. Filtz; J. Hameury; L. Rongione

2005-01-01

172

Airport congestion management under uncertainty  

Microsoft Academic Search

This paper is about single airports and airport networks. Linear and non-linear model specifications are applied to analyze the relative welfare effects of slots and congestion pricing under uncertainty. Uncertainty refers to passenger benefits and congestion costs. I show that, from a welfare perspective, congestion pricing is the right choice for single airports in a linear context, but that slots

Achim I. Czerny

2010-01-01

173

Uncertainty in Soft Constraint Problems  

Microsoft Academic Search

Preferences and uncertainty occur in many real-life problems. The theory of possibility is one way of dealing with uncertainty, which allows for easy integration with fuzzy preferences. In this paper we consider an existing technique to perform such an integration and, while following the same basic idea, we propose to generalize it to other classes of preferences and to probabilistic

Maria Silvia Pini; Francesca Rossi; Kristen Brent Venable

2005-01-01

174

Uncertainty in soft constraint problem  

Microsoft Academic Search

Preferences and uncertainty occur in many real-life problems. The theory of possibility is one way of dealing with uncertainty, which allows for easy integration with fuzzy preferences. In this paper we consider an existing technique to perform such an integration and, while following the same basic idea, we propose to generalize it to other classes of preferences and to probabilistic

Maria Silvia Pini; Francesca Rossi; K. Brent Venable

2005-01-01

175

Disjunctive Temporal Planning with Uncertainty  

Microsoft Academic Search

Driven by planning problems with both disjunctive constraints and contingency, we define the Disjunctive Temporal Problem with Uncertainty (DTPU), an extension of the DTP that includes contingent events. Generalizing existing work on Simple Temporal Problems with Uncertainty, we divide the time-points into controllable and uncontrollable classes, and propose varying notions of controllability to replace the notion of consistency.

Kristen Brent Venable; Neil Yorke-smith

2005-01-01

176

Uncertainty in Bipolar Preference Problems  

Microsoft Academic Search

Preferences and uncertainty are common in many real-life problems. In this paper, we focus on bipolar preferences and on uncertainty modelled via uncontrollable variables. However, some information is provided for such variables, in the form of possibility distributions over their domains. To tackle such problems, we eliminate the uncertain part of the problem, making sure that some desirable properties

Stefano Bistarelli; Maria Silvia Pini; Francesca Rossi; Kristen Brent Venable

2007-01-01

177

Uncertainty in bipolar preference problems  

Microsoft Academic Search

Preferences and uncertainty are common in many real-life problems. In this article, we focus on bipolar preferences and uncertainty modelled via uncontrollable variables, and we assume that uncontrollable variables are specified by possibility distributions over their domains. To tackle such problems, we concentrate on uncertain bipolar problems with totally ordered preferences, and we eliminate the uncertain part of the problem,

Stefano Bistarelli; Maria Silvia Pini; Francesca Rossi; Kristen Brent Venable

2011-01-01

178

Quantification of Emission Factor Uncertainty  

EPA Science Inventory

Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

179

Uncertainty determinants of corporate liquidity  

Microsoft Academic Search

This paper investigates the link between the optimal level of non-financial firms' liquid assets and uncertainty. We develop a partial equilibrium model of precautionary demand for liquid assets showing that firms alter their liquidity ratio in response to changes in either macroeconomic or idiosyncratic uncertainty. We test this hypothesis using a panel of non-financial US firms drawn from the COMPUSTAT

Christopher F. Baum; Mustafa Caglayan; Andreas Stephan; Oleksandr Talavera

2008-01-01

180

Environmental prices, uncertainty, and learning  

Microsoft Academic Search

There is an increasing demand for putting a shadow price on the environment to guide public policy and incentivize private behaviour. In practice, setting that price can be extremely difficult as uncertainties abound. There is often uncertainty not just about individual parameters but about the structure of the problem and how to model it. A further complication is the second-best

Simon Dietz; Samuel Fankhauser

2010-01-01

181

Accounting Anomalies and Information Uncertainty  

Microsoft Academic Search

We examine whether rational investor responses to information uncertainty explain properties of and returns to accounting-based trading anomalies. We proxy for information uncertainty with two measures of earnings quality: the standard deviation of the residuals from a Dechow and Dichev (2002) model relating accruals to cash flows, and the absolute value of performance-adjusted abnormal accruals from a modified Jones (1991)

Jennifer Francis; Ryan LaFond; Per Olsson; Katherine Schipper

2003-01-01

182

Mystery Boxes: Uncertainty and Collaboration  

NSDL National Science Digital Library

This lesson teaches students that scientific knowledge is fundamentally uncertain. Students manipulate sealed mystery boxes and attempt to determine the inner structure of the boxes which contain a moving ball and a fixed barrier or two. The nature and sources of uncertainty inherent in the process of problem-solving are experienced. The uncertainty of the conclusions is reduced by student collaboration.

Beard, Jean

183

Housing Uncertainty and Childhood Impatience  

ERIC Educational Resources Information Center

The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

2011-01-01

184

Aspects of modeling uncertainty and prediction  

SciTech Connect

Probabilistic assessment of variability in model prediction considers input uncertainty and structural uncertainty. For input uncertainty, understanding of practical origins of probabilistic treatments as well as restrictions and limitations of methodology is much more developed than for structural uncertainty. There is a simple basis for structural uncertainty that parallels that for input uncertainty. Although methodologies for assessing structural uncertainty for models in general are very limited, more options are available for submodels.

McKay, M.D.

1993-12-31

185

Uncertainties in Arctic precipitation  

NASA Astrophysics Data System (ADS)

Precipitation is an essential and highly variable component of the freshwater budget, and solid precipitation, in particular, has a major impact on the local and global climate. The impacts of snow on the surface energy balance are tremendous, as snow has a higher albedo than any other naturally occurring surface condition. Documenting the instrumentally observed precipitation climate records presents its own challenges since the stations themselves undergo many changes in the course of their operation. Though it is crucial to accurately measure precipitation as a means to predict change in future water budgets, estimates of long-term precipitation are riddled with measurement biases. Some of the challenges facing reliable measurement of solid precipitation include missing data, gage change, discontinued stations, trace precipitation, blizzards, wetting losses when emptying the gage, and evaporation between the time of event and the time of measurement. Rain measurements likewise face uncertainties such as splashing of rain out of the gage, evaporation, and extreme events, though the magnitude of these impacts on overall measurement is less than that faced by solid precipitation. In all, biases can be so significant that they present major problems for the use of precipitation data in climate studies.

Majhi, Ipshita; Alexeev, Vladimir; Cherry, Jessica; Groisman, Pasha; Cohen, Judah

2013-04-01

186

Robust SAR ATR by hedging against uncertainty  

NASA Astrophysics Data System (ADS)

For the past two years in this conference, we have described techniques for robust identification of motionless ground targets using single-frame Synthetic Aperture Radar (SAR) data. By robust identification, we mean the problem of determining target ID despite the existence of confounding statistically uncharacterizable signature variations. Such variations can be caused by effects such as mud, dents, attachment of nonstandard equipment, nonstandard attachment of standard equipment, turret articulations, etc. When faced with such variations, optimal approaches can often behave badly, e.g., by mis-identifying a target type with high confidence. A basic element of our approach has been to hedge against unknowable uncertainties in the sensor likelihood function by specifying a random error bar (random interval) for each value of the likelihood function corresponding to any given value of the input data. In this paper, we will summarize our recent results. This will include a description of the fuzzy maximum a posteriori (MAP) estimator. The fuzzy MAP estimate is essentially the set of conventional MAP estimates that are plausible, given the assumed uncertainty in the problem. Despite its name, the fuzzy MAP is derived rigorously from first probabilistic principles based on random interval theory.
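
A minimal sketch of the hedging idea described above, i.e. a set-valued analogue of MAP identification under interval-bounded likelihoods, assuming a toy three-class target-ID problem; the class names, interval values, and decision rule are illustrative simplifications, not the authors' fuzzy MAP estimator derived from random interval theory.

    # Toy target-ID problem: three candidate classes with a prior and an
    # interval-valued likelihood [lo, hi] for the observed signature, the
    # interval reflecting uncharacterizable signature variations.
    priors = {"T72": 0.40, "BMP2": 0.35, "ZSU23": 0.25}
    likelihoods = {"T72": (0.10, 0.60), "BMP2": (0.20, 0.45), "ZSU23": (0.05, 0.15)}

    def plausible_map(priors, intervals):
        """A class is plausible if some likelihood choice inside the intervals
        makes it the MAP estimate: its best case beats every rival's worst case."""
        plausible = set()
        for c in priors:
            best_c = priors[c] * intervals[c][1]
            rivals_worst = max(priors[r] * intervals[r][0] for r in priors if r != c)
            if best_c >= rivals_worst:
                plausible.add(c)
        return plausible

    print(plausible_map(priors, likelihoods))   # e.g. {'T72', 'BMP2'}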

Hoffman, John R.; Mahler, Ronald P.; Ravichandran, Ravi B.; Huff, Melvyn; Musick, Stanton

2002-07-01

187

The anthropic principle  

NASA Astrophysics Data System (ADS)

The anthropic principle states that the fact of existence of intelligent beings may be a valid explanation of why the universe and laws of physics are as they are. The origin and some of the deeper implications of the principle are investigated. The discussion involves considerations of physics and metaphysics, unified schemes and holism, the nature of physical explanation, realism and idealism, and symmetry.

Rosen, Joe

1985-04-01

188

The genetic difference principle.  

PubMed

In the newly emerging debates about genetics and justice three distinct principles have begun to emerge concerning what the distributive aim of genetic interventions should be. These principles are: genetic equality, a genetic decent minimum, and the genetic difference principle. In this paper, I examine the rationale of each of these principles and argue that genetic equality and a genetic decent minimum are ill-equipped to tackle what I call the currency problem and the problem of weight. The genetic difference principle is the most promising of the three principles and I develop this principle so that it takes seriously the concerns of just health care and distributive justice in general. Given the strains on public funds for other important social programmes, the costs of pursuing genetic interventions and the nature of genetic interventions, I conclude that a more lax interpretation of the genetic difference principle is appropriate. This interpretation stipulates that genetic inequalities should be arranged so that they are to the greatest reasonable benefit of the least advantaged. Such a proposal is consistent with prioritarianism and provides some practical guidance for non-ideal societies--that is, societies that do not have the endless amount of resources needed to satisfy every requirement of justice. PMID:15186680

Farrelly, Colin

2004-01-01

189

Principles of Paleogeography  

Microsoft Academic Search

The broad general principles of paleogeography, which I would cite as most fundamental, are as follows: 1. Ocean basins are permanent hollows of the earth's surface and have occupied their present sites since an early date in the development of geographic features. This principle does not exclude notable changes in the positions of their margins, which on the whole have

Bailey Willis

1910-01-01

190

Principles of Sports Nutrition  

Microsoft Academic Search

LEARNING OUTCOME: To measure if nutrition course work enhances athletes' knowledge of sports nutrition principles. The objectives of this study were to determine if nutrition course work enhances athletes' knowledge of sports nutrition principles and to identify athletes' resources for nutrition information. Questionnaires were distributed to 40 athletes at a state university. Seventeen completed surveys were returned and included members from

L. D. Tartamella; D. S. Kemler

1996-01-01

191

Principlism and communitarianism.  

PubMed

The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The more serious problem the author finds to be its blocking function. Discussing the four scenarios the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help. PMID:14519838

Callahan, D

2003-10-01

192

First Comes Love, Then Comes Google: An Investigation of Uncertainty Reduction Strategies and Self-Disclosure in Online Dating  

Microsoft Academic Search

This study investigates relationships between privacy concerns, uncertainty reduction behaviors, and self-disclosure among online dating participants, drawing on uncertainty reduction theory and the warranting principle. The authors propose a conceptual model integrating privacy concerns, self-efficacy, and Internet experience with uncertainty reduction strategies and amount of self-disclosure and then test this model on a nationwide sample of online dating participants (

Jennifer L. Gibbs; Nicole B. Ellison; Chih-Hui Lai

2011-01-01

193

Experimental Uncertainty Estimation and Statistics for Data Having Interval Uncertainty.  

National Technical Information Service (NTIS)

This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimat...

J. Hajagos; L. Ginzburg; S. Ferson; V. Kreinovich; W. Oberkampf

2007-01-01

194

Ontological Uncertainty and Semantic Uncertainty in Global Network Organizations.  

National Technical Information Service (NTIS)

In the literature, management of uncertainty is argued to be a central feature of effective project management. Global network organizations can involve people with different genders, personality types, cultures, first languages, social concerns, and/or w...

S. Fox

2008-01-01

195

Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms  

ERIC Educational Resources Information Center

Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

2002-01-01

196

Erring with high-level nuclear waste disposal: a case study of the precautionary principle  

Microsoft Academic Search

Increasingly, the “Precautionary Principle” is being discussed as a basis for decision-making to protect environmental and human health where there are risks of serious or irreversible damage but where gaps in knowledge and uncertainties prevent conclusive demonstration of either the existence of the risks or their levels. Many analyses of the precautionary principle focus on the abstract or philosophical

John Lemons

2001-01-01

197

PIV uncertainty quantification by image matching  

NASA Astrophysics Data System (ADS)

A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.
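
A minimal sketch of the disparity-vector statistics described above, assuming matched particle-image positions within one interrogation window are already available; the disparities are synthetic, and the full matching step (cross-correlation analysis and particle pairing) is not shown.

    import random
    import statistics

    random.seed(2)

    # Synthetic residual disparity values (pixels) for matched particle pairs
    # in one interrogation window: a small systematic offset plus random scatter.
    true_bias, scatter, n_pairs = 0.12, 0.30, 40
    disparities_x = [true_bias + random.gauss(0.0, scatter) for _ in range(n_pairs)]

    # Mean disparity -> estimate of the systematic (bias) error component.
    bias_estimate = statistics.mean(disparities_x)

    # Dispersion of the disparity, scaled by sqrt(N) -> random-error estimate
    # for the window-averaged displacement.
    random_error = statistics.stdev(disparities_x) / n_pairs**0.5

    print(f"systematic error estimate: {bias_estimate:.3f} px")
    print(f"random error estimate:     {random_error:.3f} px")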

Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

2013-04-01

198

Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation  

SciTech Connect

This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.

Smith, D.L., E-mail: Donald.L.Smith@anl.gov [Argonne National Laboratory, 1710 Avenida Del Mundo 1506, Coronado, CA 92118 (United States); Otuka, N. [Nuclear Data Section, International Atomic Energy Agency, Wagramerstrasse 5, A-1400 Wien (Austria)

2012-12-15

199

The Economics of Uncertainty II.  

National Technical Information Service (NTIS)

The paper contains a general discussion of economic decisions under certainty and under uncertainty. It is demonstrated that decision rules can be derived from underlying preference orderings, which in some cases can be represented by utility functions. (...

K. Borch

1965-01-01

200

Improved Event Location Uncertainty Estimates.  

National Technical Information Service (NTIS)

The objective of this project was to develop methodologies that improve location uncertainties in the presence of correlated, systematic model errors and non-Gaussian measurement errors. We have developed a methodology based on copula theory to obtain rob...

H. Israelsson; I. Bondar; K. McLaughlin

2008-01-01

201

Evaluation of Process Inventory Uncertainties.  

National Technical Information Service (NTIS)

This paper discusses the determination of some of the process inventory uncertainties in the Fast Flux Test Facility (FFTF) process line at the Los Alamos Scientific Laboratory (LASL) Plutonium Processing Facility (TA-55). A brief description of the FFTF ...

N. J. Roberts

1980-01-01

202

Uncertainty in Climate Change Modeling  

NSDL National Science Digital Library

Learn why trout are important indicators in Wisconsin’s changing climate, and why the uncertainty of global climate models complicates predictions about their survival, in this video produced by the Wisconsin Educational Communications Board.

Ecb, Wisconsin

2010-11-30

203

Number-phase uncertainty relations  

Microsoft Academic Search

The minimization problem of finding the number-phase minimum uncertainty states (MUS) is considered and its solutions are found either numerically or, under some special conditions, analytically. The phase uncertainty measure is based on the Bandilla-Paul dispersion. The problem is treated (i) in a finite-dimensional Hilbert space and (ii) for a countably infinite-dimensional Hilbert space (i.e. the standard quantum harmonic oscillator),

T. Opatrny

1995-01-01

204

Finalization of rule removes uncertainty.  

PubMed

Although the deadline for compliance with the Health Insurance Portability and Accountability Act Privacy Rule is only a few months away, some hospitals and other covered entities have essentially placed HIPAA compliance efforts on hold for the past several months because of the perceived uncertainty surrounding the rule's requirements. A new final rule was released on Aug. 9, removing this uncertainty and maintaining the April 14, 2003, deadline. PMID:12402650

Wachler, Andrew B; Fehn, Amy K

205

Measurement uncertainty in analytical chemistry  

Microsoft Academic Search

It is now becoming recognised in the measurement community that as well as reporting the measured value it is also essential to give its uncertainty. Without a knowledge of the uncertainty, it is impossible for the users of the result to know what confidence can be placed in it and it is also impossible to assess the comparability of different

A. Williams

1996-01-01

206

Structural model uncertainty in stochastic simulation  

SciTech Connect

Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

McKay, M.D.; Morrison, J.D. [Los Alamos National Lab., NM (United States). Technology and Safety Assessment Div.

1997-09-01

207

A typology for visualizing uncertainty  

NASA Astrophysics Data System (ADS)

Information analysts must rapidly assess information to determine its usefulness in supporting and informing decision makers. In addition to assessing the content, the analyst must be confident about the quality and veracity of the information. Visualizations can concisely represent vast quantities of information, thus aiding the analyst to examine larger quantities of material; however, visualization programs are challenged to incorporate a notion of confidence or certainty because the factors that influence the certainty or uncertainty of information vary with the type of information and the type of decisions being made. For example, the assessment of potentially subjective human-reported data leads to a large set of uncertainty concerns in fields such as national security, law enforcement (witness reports), and even scientific analysis where data is collected from a variety of individual observers. What's needed is a formal model or framework for describing uncertainty as it relates to information analysis, to provide a consistent basis for constructing visualizations of uncertainty. This paper proposes an expanded typology for uncertainty, drawing from past frameworks targeted at scientific computing. The typology provides general categories for analytic uncertainty, a framework for creating task-specific refinements to those categories, and examples drawn from the national security field.

Thomson, Judi; Hetzler, Elizabeth; MacEachren, Alan; Gahegan, Mark; Pavel, Misha

2005-03-01

208

Uncertainty in measurements by counting  

NASA Astrophysics Data System (ADS)

Counting is at the base of many high-level measurements, such as, for example, frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Countings also play a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As concerns measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
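
Although the model proposed in the paper is more general, a minimal sketch of the most common assumption, namely treating the count as a Poisson random variable so that a count of n carries a standard uncertainty of sqrt(n), looks as follows; the Poisson model and the live-time parameter are illustrative assumptions, not the authors' framework.

```python
import math

def counting_rate_with_uncertainty(n_counts, t_live=1.0):
    """Rate and standard uncertainty for a measurement by counting, assuming the
    count is Poisson distributed (u(n) = sqrt(n)); illustrative model only."""
    rate = n_counts / t_live
    u_rate = math.sqrt(n_counts) / t_live
    return rate, u_rate

# 400 decays observed in 60 s -> rate 6.67 s^-1 with standard uncertainty 0.33 s^-1
print(counting_rate_with_uncertainty(400, t_live=60.0))
```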

Bich, Walter; Pennecchi, Francesca

2012-02-01

209

The equivalence principle  

SciTech Connect

For the example of the motion of an accelerated charge it is shown that an inertial frame of reference in which there is a homogeneous static gravitational field with strength g is physically inequivalent to a uniformly accelerated frame of reference moving with acceleration -g with respect to the intertial frame of reference. It follows from this that the equivalence principle does not hold in the usual formulation. The widely held opinion that such a principle is the basis of the general theory of relativity is not entirely correct. Einstein`s theory of gravitation is based on an equivalence principle of a deeper content, which takes the form that the metric field g{sub uv} of a Riemannian space is declared to be a gravitational field. Such is the {open_quotes}natural formulation of the equivalence principle{close_quotes} at which Einstein subsequently arrived.

Logunov, A.A.; Mestvirishvili, M.A.; Chugreev, Yu.V.

1994-10-01

210

Principles of Radio Navigation.  

National Technical Information Service (NTIS)

The textbook for radio engineering higher institutions of learning and departments explains the operating principles of radio navigational aids used to navigate flying vehicles. General questions concerned with the accuracy of navigational fixes obtained ...

O. V. Belavin

1970-01-01

211

Chemical Principles Exemplified  

ERIC Educational Resources Information Center

Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

Plumb, Robert C.

1972-01-01

212

Chemical Principles Exemplified  

ERIC Educational Resources Information Center

Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

Plumb, Robert C.

1972-01-01

213

A generalized KKMF principle  

NASA Astrophysics Data System (ADS)

We present in this paper a generalized version of the celebrated Knaster-Kuratowski-Mazurkiewicz-Fan's principle on the intersection of a family of closed sets subject to a classical geometric condition and a weakened compactness condition. The fixed point formulation of this generalized principle extends the Browder-Fan fixed point theorem to set-valued maps of non-compact convex subsets of topological vector spaces.

Ben-El-Mechaiekh, Hichem; Chebbi, Souhail; Florenzano, Monique

2005-09-01

214

Buoyancy: Archimedes Principle  

NSDL National Science Digital Library

This site describes buoyancy (the difference between the upward and downward forces acting on the bottom and the top of an object) and the Archimedes Principle, which states that the buoyant force on a submerged object is equal to the weight of the fluid that is displaced by it. It consists of text descriptions of these principles, using the examples of metal cubes suspended in water and hot air balloons in the atmosphere. Mathematical word problems are included.
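
Such word problems boil down to comparing the weight of the displaced fluid with the weight of the object. A small worked sketch (the mass, volume, and density values are invented for illustration):

```python
G = 9.81            # gravitational acceleration, m/s^2
RHO_WATER = 1000.0  # density of water, kg/m^3

def buoyant_force(volume_m3, rho_fluid=RHO_WATER):
    """Archimedes' principle: buoyant force = weight of the displaced fluid."""
    return rho_fluid * volume_m3 * G

def net_vertical_force(mass_kg, volume_m3, rho_fluid=RHO_WATER):
    """Positive -> the object tends to rise; negative -> it sinks."""
    return buoyant_force(volume_m3, rho_fluid) - mass_kg * G

# A fully submerged 1-litre metal cube of mass 7.8 kg in water:
print(net_vertical_force(7.8, 0.001))   # about -66.7 N, so the cube sinks
```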

215

Harmonic oscillator with minimal length, minimal momentum, and maximal momentum uncertainties in SUSYQM framework  

NASA Astrophysics Data System (ADS)

We consider a Generalized Uncertainty Principle (GUP) framework which predicts a maximal uncertainty in momentum and minimal uncertainties both in position and momentum. We apply supersymmetric quantum mechanics method and the shape invariance condition to obtain the exact harmonic oscillator eigenvalues in this GUP context. We find the supersymmetric partner Hamiltonians and show that the harmonic oscillator belongs to a hierarchy of Hamiltonians with a shift in momentum representation and different masses and frequencies. We also study the effect of a uniform electric field on the harmonic oscillator energy spectrum in this setup.

Asghari, M.; Pedram, P.; Nozari, K.

2013-10-01

216

Uncertainty quantification in reacting flow.  

SciTech Connect

Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains nominally an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.
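
As a bare-bones illustration of forward PC propagation (a one-dimensional toy, not the speaker's codes; the exponential "rate" model and the Gaussian input are assumptions), one can project a model output onto probabilists' Hermite polynomials by Gauss quadrature and read the mean and variance off the chaos coefficients:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def model(k):
    """Toy output quantity: an Arrhenius-like rate exp(k) with uncertain exponent k."""
    return np.exp(k)

def pce_moments(mu, sigma, order=6, nquad=20):
    """1-D non-intrusive polynomial chaos: expand model(mu + sigma*xi), xi ~ N(0,1),
    in probabilists' Hermite polynomials He_k and return (mean, variance)."""
    x, w = He.hermegauss(nquad)       # Gauss nodes/weights for weight exp(-xi^2/2)
    w = w / sqrt(2.0 * pi)            # normalize so the weights integrate the N(0,1) pdf
    f = model(mu + sigma * x)
    coeffs = [np.sum(w * f * He.hermeval(x, [0.0] * k + [1.0])) / factorial(k)
              for k in range(order + 1)]
    mean = coeffs[0]
    var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
    return mean, var

# Uncertain input k ~ N(1.0, 0.2^2); compare against the exact lognormal moments.
print(pce_moments(1.0, 0.2))
print(np.exp(1.0 + 0.2**2 / 2), (np.exp(0.2**2) - 1) * np.exp(2 * 1.0 + 0.2**2))
```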

Najm, Habib N.

2010-05-01

217

Uncertainty quantification in reacting flow.  

SciTech Connect

Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains nominally an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.

Marzouk, Youssef M. (MIT, Cambridge, MA); Debusschere, Bert J.; Najm, Habib N.; Berry, Robert Bruce

2010-06-01

218

Communicating Uncertainties on Climate Change  

NASA Astrophysics Data System (ADS)

The term uncertainty in common language is confusing, since in one of its most usual senses it refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion through media to a wide public. From its scientific basis to the impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Faced with this diversity of approaches, communicating uncertainties on climate change is thus a great challenge. It is further complicated by the diversity of the targets of communication on climate change, from stakeholders and policy makers to the wider public. We will present the process chosen by the IPCC to communicate uncertainties in its assessment reports, taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples illustrating how to avoid the above-mentioned ambiguity when dealing with this kind of communication.

Planton, S.

2009-09-01

219

Measurement control: Principles and practice as applied to nondestructive assay  

SciTech Connect

This paper discusses the principles and practice of measurement control for nondestructive assay (NDA) instruments. The NDA is not always blessed with the highly controlled samples that are assumed in the analytical laboratory. This adversely affects the use and applicability of the historical error information from instrument stability checks to estimate measurement uncertainties for the broad range of sample characteristics presented to most NDA instruments. This paper emphasizes the methods used to perform instrument stability checks and discusses the resulting uncertainty information that can be derived from these measurements. 4 refs., 2 figs., 2 tabs.

Sampson, T.E.

1991-01-01

220

An Informational Characterization of Schrödinger's Uncertainty Relations  

Microsoft Academic Search

Heisenberg's uncertainty relations employ commutators of observables to set fundamental limits on quantum measurement. The information concerning incompatibility (non-commutativity) of observables is well included but that concerning correlation is missing. Schrödinger's uncertainty relations remedy this defect by supplementing the correlation in terms of anti-commutators. However, both Heisenberg's uncertainty relations and Schrödinger's uncertainty relations are expressed in terms of variances, which
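
For reference, the relation in question strengthens the Heisenberg bound for two observables A and B by an anti-commutator (covariance) term; in standard notation,

```latex
\sigma_A^2\,\sigma_B^2 \;\ge\;
\left(\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\right)^{2}
+\left(\tfrac{1}{2i}\langle[A,B]\rangle\right)^{2}
```

Dropping the first term on the right recovers the familiar Heisenberg-Robertson form.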

Shunlong Luo; Zhengmin Zhang

2004-01-01

221

Is uncertainty bad for you? It depends ….  

PubMed

Parents and educators should be more concerned about uncertainty in educational aspirations than uncertainty regarding career choice among adolescents. Moreover, the impact of uncertainty on young people's attainment varies by socio-historical context, the timing of uncertainty, the available resources, and individual characteristics of the adolescents themselves. PMID:23097364

Schoon, Ingrid; Gutman, Leslie Morrison; Sabates, Ricardo

2012-01-01

222

Framework for managing uncertainty in property projects  

Microsoft Academic Search

A primary task of property development (or real estate development, RED) is making assessments and managing risks and uncertainties. Property managers cope with a wide range of uncertainties, particularly in the early project phases. Although the existing literature addresses the management of calculated risks, the management of uncertainties is underexposed. A framework and method are presented for uncertainty management, both

Isabelle M. M. J. Reymen; Geert P. M. R. Dewulf; Sjoerd B. Blokpoel

2008-01-01

223

Bayesian diagnosis and prognosis using instrument uncertainty  

Microsoft Academic Search

How can diagnosis and prognosis systems be improved in the presence of uncertainty in test results? How can these uncertainties be identified and modeled? Can diagnosis be improved as a result of understanding these uncertainties? These questions represent the core problems to be explored in this paper. Specifically, we explore the process by which instrument uncertainty can be used to

John W. Sheppard; Mark A. Kaufman

2005-01-01

224

Linear Programming Problems for Generalized Uncertainty  

ERIC Educational Resources Information Center

Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

Thipwiwatpotjana, Phantipa

2010-01-01

225

Climate negotiations under scientific uncertainty  

PubMed Central

How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk.

Barrett, Scott; Dannenberg, Astrid

2012-01-01

226

Uncertainties in hydrocarbon charge prediction  

NASA Astrophysics Data System (ADS)

Computer simulations allow the prediction of hydrocarbon volumes, composition and charge timing in undrilled petroleum prospects. Whereas different models may give different hydrocarbon charge predictions, it has now become evident that a dominant cause of erroneous predictions is the poor quality of input data. The main culprit for prediction errors is the uncertainty in the initial hydrogen index (H/C) of the source rock. A 10% uncertainty in the H/C may lead to 50% error in the predicted hydrocarbon volumes, and associated gas-oil ratio. Similarly, uncertainties in the maximum burial temperature and the kinetics of hydrocarbon generation may lead to 20-50% error. Despite this, charge modelling can have great value for the ranking of prospects in the same area with comparable geological histories.

Visser, W.; Bell, A.

227

Estimating uncertainty in resolution tests  

NASA Astrophysics Data System (ADS)

Resolution testing of imaging optical equipment is still commonly performed using the USAF 1951 target. The limiting resolution is normally calculated from the group and element that can just be resolved by an observer. Although resolution testing has limitations, its appeal lies in the fact that it is a quick test with low complexity. Resolution uncertainty can serve as a diagnostic tool, aid in understanding observer variability, and assist in planning experiments. It may also be necessary to satisfy a customer requirement or international standard. This paper derives theoretical results for estimating resolution and calculating its uncertainty, based on observer measurements, while taking the target spatial-frequency quantization into account. We show that estimating the resolution by simply averaging the target spatial frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the results derived can be incorporated into a larger uncertainty analysis.
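
The quantization the authors refer to comes from the discrete spatial frequencies of the target: group G, element E corresponds to 2^(G + (E-1)/6) line pairs per millimetre. The sketch below (observer readings are invented for illustration) computes those frequencies and the naive average that the paper identifies as a biased estimator; the improved estimator derived in the paper is not reproduced here.

```python
def usaf1951_frequency(group, element):
    """Spatial frequency (line pairs per mm) of a USAF 1951 target element."""
    return 2.0 ** (group + (element - 1) / 6.0)

# Limiting resolution read by several observers as (group, element) pairs
# (values invented for illustration).
readings = [(4, 3), (4, 4), (4, 3), (4, 5)]
frequencies = [usaf1951_frequency(g, e) for g, e in readings]

# Simple average over the quantized frequencies: the biased estimate the paper improves on.
naive_estimate = sum(frequencies) / len(frequencies)
print(frequencies, naive_estimate)
```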

Goncalves, Duarte P.; Griffith, Derek J.

2006-05-01

228

Basic Principles of Chromatography  

NASA Astrophysics Data System (ADS)

Chromatography has a great impact on all areas of analysis and, therefore, on the progress of science in general. Chromatography differs from other methods of separation in that a wide variety of materials, equipment, and techniques can be used. [Readers are referred to references (1-19) for general and specific information on chromatography.]. This chapter will focus on the principles of chromatography, mainly liquid chromatography (LC). Detailed principles and applications of gas chromatography (GC) will be discussed in Chap. 29. In view of its widespread use and applications, high-performance liquid chromatography (HPLC) will be discussed in a separate chapter (Chap. 28). The general principles of extraction are first described as a basis for understanding chromatography.

Ismail, Baraem; Nielsen, S. Suzanne

229

CO2 transport uncertainties from the uncertainties in meteorological fields  

NASA Astrophysics Data System (ADS)

Inference of surface CO2 fluxes from atmospheric CO2 observations requires information about large-scale transport and turbulent mixing in the atmosphere, so transport errors and the statistics of the transport errors have significant impact on surface CO2 flux estimation. In this paper, we assimilate raw meteorological observations every 6 hours into a general circulation model with a prognostic carbon cycle (CAM3.5) using the Local Ensemble Transform Kalman Filter (LETKF) to produce an ensemble of meteorological analyses that represent the best approximation to the atmospheric circulation and its uncertainty. We quantify CO2 transport uncertainties resulting from the uncertainties in meteorological fields by running CO2 ensemble forecasts within the LETKF-CAM3.5 system forced by prescribed surface fluxes. We show that CO2 transport uncertainties are largest over the tropical land and the areas with large fossil fuel emissions, and are between 1.2 and 3.5 ppm at the surface and between 0.8 and 1.8 ppm in the column-integrated CO2 (with OCO-2-like averaging kernel) over these regions. We further show that the current practice of using a single meteorological field to transport CO2 has weaker vertical mixing and stronger CO2 vertical gradient when compared to the mean of the ensemble CO2 forecasts initialized by the ensemble meteorological fields, especially over land areas. The magnitude of the difference at the surface can be up to 1.5 ppm.
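
The transport uncertainty quoted above is essentially the spread of an ensemble of CO2 forecasts driven by the ensemble of meteorological analyses. A schematic of that bookkeeping is shown below; the array sizes and values are synthetic placeholders, not output of the LETKF-CAM3.5 system.

```python
import numpy as np

# Hypothetical ensemble of column-integrated CO2 forecasts (ppm),
# shape = (members, lat, lon); values are synthetic.
rng = np.random.default_rng(1)
ensemble = 395.0 + rng.normal(0.0, 1.0, size=(32, 10, 20))

ens_mean = ensemble.mean(axis=0)            # best estimate of transported CO2
ens_spread = ensemble.std(axis=0, ddof=1)   # transport uncertainty from meteorology

# Difference between one deterministic member and the ensemble mean, analogous to
# comparing a single-meteorology transport run with the mean of the ensemble runs.
single_vs_mean = ensemble[0] - ens_mean
print(float(ens_spread.mean()), float(np.abs(single_vs_mean).max()))
```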

Liu, Junjie; Fung, Inez; Kalnay, Eugenia; Kang, Ji-Sun

2011-06-01

230

RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY  

Microsoft Academic Search

It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and

S. Salaymeh; W. Ashley; R. Jeffcoat

2010-01-01

231

Neutrino oscillations and uncertainty relations  

NASA Astrophysics Data System (ADS)

We show that coherent flavor neutrino states are produced (and detected) due to the momentum-coordinate Heisenberg uncertainty relation. The Mandelstam-Tamm time-energy uncertainty relation requires non-stationary neutrino states for oscillations to happen and determines the time interval (propagation length) which is necessary for that. We compare different approaches to neutrino oscillations which are based on different physical assumptions but lead to the same expression for the neutrino transition probability in standard neutrino oscillation experiments. We show that a Mössbauer neutrino experiment could allow us to distinguish different approaches and we present arguments in favor of the 163Ho-163Dy system for such an experiment.
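
For orientation (standard textbook expressions, quoted here as context rather than as the paper's derivation), the Mandelstam-Tamm relation and the usual two-neutrino oscillation scales read

```latex
\Delta E\,\Delta t \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \simeq \frac{\Delta m^{2}}{2E},
\qquad
L^{\mathrm{osc}} = \frac{4\pi E}{\Delta m^{2}}
\quad (\hbar = c = 1 \text{ in the last two expressions})
```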

Bilenky, S. M.; von Feilitzsch, F.; Potzel, W.

2011-11-01

232

Geographic Uncertainty in Environmental Security  

NASA Astrophysics Data System (ADS)

This volume contains 17 papers presented at the NATO Advanced Research Workshop on Fuzziness and Uncertainty held in Kiev, Ukraine, 28 June to 1 July 2006. Eleven of the papers deal with fuzzy set concepts, while the other six (papers 5, 7, 13, 14, 15, and 16) are not fuzzy. A reader with no prior exposure to fuzzy set theory would benefit from having an introductory text at hand, but the papers are accessible to a wide audience. In general, the papers deal with broad issues of classification and uncertainty in geographic information.

Ahlquist, Jon

2008-06-01

233

Plastic variational principle based on the least work consumption principle  

Microsoft Academic Search

Plastic variational principles are the foundation for solving boundary-value problems of plastic mechanics with the variational method (or energy method) and the finite element method. The most convenient way of establishing different kinds of variational principles is to set up the extreme principle related to the studied problem. Based on a general new extreme principle, the least work consumption principle, the variational

Song-hua Tang; Ying-she Luo; Zhu-bao Zhou; Zhi-chao Wang

2008-01-01

234

Principles of "Metrological Statistics"  

NASA Astrophysics Data System (ADS)

That portion of applied statistics which is used for the treatment of experimental data is essentially founded on unbiased estimators - notwithstanding the fact that any real measurement process implies so-called unknown systematic errors. Until now, those perturbations giving rise to unknown deviations from the true values have been formally randomized, i.e. they are treated as if they were of random origin. Biased estimators thus again become fictitiously unbiased. However, the associated procedures suffer from two disadvantages: they are too complicated and, in the author's opinion, they generate uncertainties which are not sufficiently reliable. The submitted paper attempts to establish "Metrological Statistics" which, being based on biased estimators, is much simpler to handle and which, in particular, yields safe estimates for measurement uncertainties.

Grabe, M.

1987-01-01

235

Uncertainty of height information in coherence scanning interferometry  

NASA Astrophysics Data System (ADS)

Coherence scanning interferometry (CSI) with a broadband light source (also known as white light interferometry) is, besides the confocal technique, one of the most popular optical principles for measuring surface topography. Compared to coherent interferometry, the broadband light source leads, in theory, to unambiguous phase information. The paper describes the properties of the correlogram in the spatial and in the frequency domain. All deviations from the ideal correlogram are expressed by an additional phase term. The uncertainty of the height information is discussed for both the frequency domain analysis (FDA) proposed by de Groot and the Hilbert transform. For the frequency domain analysis, the uncertainty is quantified by the Cramér-Rao bound. The second part of the paper deals with the phase evaluation of the correlogram, which is necessary to achieve high vertical resolution. Because the envelope function is often distorted, phase jumps lead to ambiguous height information. In particular, this effect can be observed when measuring rough surfaces.

Seewig, J.; Böttner, T.; Broschart, D.

2011-05-01

236

Principles of Biomedical Ethics  

PubMed Central

In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making.

Athar, Shahid

2012-01-01

237

Principles of dataspace systems  

Microsoft Academic Search

The most acute information management challenges today stem from organizations relying on a large number of diverse, inter-related data sources, but having no means of managing them in a convenient, integrated, or principled fashion. These challenges arise in enterprise and government data management, digital libraries, "smart" homes and personal information management. We have proposed dataspaces as a

Alon Y. Halevy; Michael J. Franklin; David Maier

2006-01-01

238

Basic principles of stability  

Microsoft Academic Search

An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Use of statistical tools such as least

William Egan; Timothy Schofield

2009-01-01

239

Principles of Object Perception  

Microsoft Academic Search

Research on human infants has begun to shed light on early-developing processes for segmenting perceptual arrays into objects. Infants appear to perceive objects by analyzing three-dimensional surface arrangements and motions. Their perception does not accord with a general tendency to maximize figural goodness or to attend to nonaccidental geometric relations in visual arrays. Object perception does accord with principles

Elizabeth S. Spelke

1990-01-01

240

Principles of Corrosion  

Microsoft Academic Search

The principles of corrosion have been reviewed from two aspects: thermodynamics and kinetics. Thermodynamic considerations yield the relative corrosion tendencies of the elements. Three types of corrosion cells have also been mentioned: galvanic, concentration and simple anodic. Kinetics attempts to explain and predict the actual corrosion behavior of materials. The relative corrosion tendencies, the metallurgical aspects and the environmental conditions must

Stephen C. Kolesar

1974-01-01

241

The Principles of Flight  

NSDL National Science Digital Library

This section of the Pilot's Web website contains introductory information on various flight related physical principles. Topics include Newton's laws of motion and force, airfoils, lift and drag, forces acting on an airplane, speed, flight maneuvers, the effects of roll, and more. Each topic contains good illustrations, descriptions, and equations.

2003-10-10

242

Principles of remote attestation  

Microsoft Academic Search

Remote attestation is the activity of making a claim about properties of a target by supplying evidence to an appraiser over a network. We identify five central principles to guide development of attestation systems. We argue that (i) attestation must be able to deliver temporally fresh evidence; (ii) comprehensive information about the target should be accessible; (iii)

George Coker; Joshua D. Guttman; Peter Loscocco; Amy L. Herzog; Jonathan Millen; Brian O’Hanlon; John D. Ramsdell; Ariel Segall; Justin Sheehy; Brian T. Sniffen

2011-01-01

243

Principles of nuclear geology  

Microsoft Academic Search

This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology in radiogenic heat and thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes, such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate

Aswathanarayana

1985-01-01

244

Matters of Principle.  

ERIC Educational Resources Information Center

This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

Martz, Carlton

1999-01-01

245

Laboratory Safety Principles  

NSDL National Science Digital Library

This workshop covers major principles and regulations pertinent to working in laboratories with hazardous materials. It is divided into 45 minute segments dealing with: Radioactive Materials (Staiger); Toxic, Reactive, Carcinogenic, and Teratogenic Chemicals (Carlson); Infectious Agents (Laver); and Fire Safety Concepts and Physical Hazards (Arnston).

Jerry Staiger; Keith Carlson; Jim Laver; Ray Arntson (University of Minnesota)

2008-04-11

246

Fermat's Principle Revisited.  

ERIC Educational Resources Information Center

A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

Kamat, R. V.

1991-01-01

247

Congress probes climate change uncertainties  

Microsoft Academic Search

Policymakers are demanding information about climate change faster than it can be turned out by scientists. This conflict between politics and science was debated at a recent congressional hearing on priorities in global change research. On October 8 and 10, panels of scientists that included AGU president-elect Ralph J. Cicerone of the University of California attempted to identify scientific uncertainties

Lynn Teo Simarski

1991-01-01

248

Induction of models under uncertainty  

Microsoft Academic Search

This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the

Peter Cheeseman

1986-01-01

249

Getting to grips with uncertainty  

Microsoft Academic Search

Challenges the belief shown by many McKinsey managers that accurate predictions of the future follow rigorous analysis using the many tools available. Describes four levels of uncertainty that require differing approaches: clear-enough future; alternate futures; range of futures; and true ambiguity. Emphasizes that real options create learning through the thought processes that the method demands and by virtue of the

T Kippenberger

1998-01-01

250

The legal status of uncertainty  

Microsoft Academic Search

Civil protection authorities attach great importance to scientific assessment through the widespread use of mathematical models that have been implemented in order to prevent and mitigate the effects of natural hazards. These models, however, are far from deterministic; moreover, the uncertainty that characterizes them plays an important role in the scheme of prevention of natural hazards. We

L. Ferraris; D. Miozzo

2009-01-01

251

Uncertainties in Fault Tree Analysis  

Microsoft Academic Search

Fault tree analysis is one kind of probabilistic safety analysis method. After constructing a fault tree, many basic events which can happen theoretically have never occurred so far or have occurred so infrequently that reasonable data for them are not available. However, the use of fuzzy probability can describe the failure probability and its uncertainty for each basic event,

Yue-Lung Cheng

252

Uncertainty in air quality modeling  

Microsoft Academic Search

Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions. The group agreed that air

Douglas G. Fox

1984-01-01

253

Aeroelastic analysis considering structural uncertainty  

Microsoft Academic Search

Uncertainties in aeroelastic analysis are investigated. In aeroelastic analysis, we usually find the divergence speed or flutter speed by using deterministic equations of motion for the elastic wings of aircraft. If any parameter in these equations is sensitive to the critical speed, we should treat it carefully, since inaccuracy is inevitable in the production process. Tragic failure may occur if

T. Ueda

2005-01-01

254

Consumption inequality and income uncertainty  

Microsoft Academic Search

This paper places the debate over using consumption or income in studies of inequality growth in a formal intertemporal setting. It highlights the importance of permanent and transitory income uncertainty in the evaluation of growth in consumption inequality. We derive conditions under which the growth of variances and covariances of income and consumption can be used to separately identify the

Richard Blundell; Ian Preston

1998-01-01

255

The Economics of Uncertainty, I.  

National Technical Information Service (NTIS)

The paper is an introduction to a broad survey of the uncertainty element in economics. The paper leads up to a game-theoretical formulation of the problem, stressing the interdependence of the actions taken by all decision makers in the system. (Author)

K. Borch

1965-01-01

256

Evaluation of process inventory uncertainties  

Microsoft Academic Search

This paper discusses the determination of some of the process inventory uncertainties in the Fast Flux Test Facility (FFTF) process line at the Los Alamos Scientific Laboratory (LASL) Plutonium Processing Facility (TA-55). A brief description of the FFTF process is given, along with a more detailed look at the peroxide precipitation and re-dissolution (PR) process. Emphasis is placed on the

1980-01-01

257

Identity Uncertainty and Citation Matching  

Microsoft Academic Search

Identity uncertainty is a pervasive problem in real-world data analysis. It arises whenever objects are not labeled with unique identifiers or when those identifiers may not be perceived perfectly. In such cases, two observations may or may not correspond to the same object. In this paper, we consider the problem in the context of citation matching—the problem of

Hanna Pasula; Bhaskara Marthi; Brian Milch; Stuart J. Russell; Ilya Shpitser

2002-01-01

258

Exploring Uncertainty with Projectile Launchers  

ERIC Educational Resources Information Center

The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

Orzel, Chad; Reich, Gary; Marr, Jonathan

2012-01-01

259

Robot motion planning with uncertainty  

Microsoft Academic Search

Modelling robot motion planning with uncertainty in a Bayesian framework leads to a computationally intractable stochastic control problem. We seek hypotheses that can justify a separate implementation of control, localization and planning. In the end, we reduce the stochastic control problem to path-planning in the extended space of poses × covariances; the transitions between states are modeled through the use of

Andrea Censi; Daniele Calisi; Alessandro De Luca; Giuseppe Oriolo

260

The precautionary principle in environmental science.  

PubMed Central

Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy.

Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M

2001-01-01

261

Structural Damage Assessment under Uncertainty  

NASA Astrophysics Data System (ADS)

Structural damage assessment has applications in the majority of engineering structures and mechanical systems ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring systems are to ascertain the condition of a structure and to provide an evaluation of changes as a function of time as well as providing an early-warning of an unsafe condition. There are many structural health monitoring and assessment techniques developed for research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of vibration-based monitoring systems is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed to be used for continuous monitoring of structures. The robust damage detection approach developed is based on the ensemble of dimensional reduction algorithms for improved damage-sensitive feature extraction. For damage localization, the determination of an experimental structural model was performed based on output-only modal analysis. An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are isolated based on the integration of sensitivity analysis and statistical sampling, which minimizes the occurrence of false-damage indication due to uncertainty. To perform diagnostic decision-making under uncertainty, an evidential reasoning approach for damage assessment is developed for addressing the possible imprecision in the damage localization results. The newly developed damage detection and localization techniques are applied and validated through both vibration test data from literature and in-house laboratory experiments.

Lopez Martinez, Israel

262

Hybrid computed torque controlled motor-toggle servomechanism using fuzzy neural network uncertainty observer  

Microsoft Academic Search

The dynamic response of a hybrid computed torque controlled toggle mechanism, which is driven by a permanent magnet (PM) synchronous servo motor, is studied in this paper. First, based on the principle of computed torque control, a position controller is developed for the motor–toggle servomechanism. Moreover, to relax the requirement of the lumped uncertainty in the design of a computed

Faa-jeng Lin; Rong-jong Wai

2002-01-01

263

The Principle of Maximum Conformality  

SciTech Connect

A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q which is of order of a typical momentum transfer Q in the process, and then vary the scale over the range Q/2 to 2Q. This procedure is clearly problematic since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and thus is a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale-setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. The elimination of the renormalization scheme ambiguity using the PMC will not only increase the precision of QCD tests, but it will also increase the sensitivity of colliders to new physics beyond the Standard Model.

Brodsky, Stanley J; /SLAC; Giustino, Di; /SLAC

2011-04-05

264

Experimental uncertainty estimation and statistics for data having interval uncertainty.  

SciTech Connect

This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
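
The simplest of the statistics the report discusses is the sample mean, whose bounds follow directly from the interval endpoints; a minimal sketch (the data are invented) is shown below. Variance, percentiles and the other quantities mentioned require the more careful algorithms described in the report, since their tight bounds are not obtained by endpoint substitution alone.

```python
def interval_mean(intervals):
    """Bounds on the sample mean of interval-valued data.

    intervals : iterable of (lower, upper) pairs, one per measurement
    Returns (mean_lower, mean_upper); whatever the true values inside the
    intervals are, the sample mean lies in this range.
    """
    lowers = [lo for lo, hi in intervals]
    uppers = [hi for lo, hi in intervals]
    n = len(lowers)
    return sum(lowers) / n, sum(uppers) / n

# Four measurements reported as intervals (invented values):
data = [(9.8, 10.4), (10.1, 10.3), (9.7, 10.1), (10.0, 10.6)]
print(interval_mean(data))   # (9.9, 10.35)
```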

Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

2007-05-01

265

Attention, Uncertainty, and Free-Energy  

PubMed Central

We suggested recently that attention can be understood as inferring the level of uncertainty or precision during hierarchical perception. In this paper, we try to substantiate this claim using neuronal simulations of directed spatial attention and biased competition. These simulations assume that neuronal activity encodes a probabilistic representation of the world that optimizes free-energy in a Bayesian fashion. Because free-energy bounds surprise or the (negative) log-evidence for internal models of the world, this optimization can be regarded as evidence accumulation or (generalized) predictive coding. Crucially, both predictions about the state of the world generating sensory data and the precision of those data have to be optimized. Here, we show that if the precision depends on the states, one can explain many aspects of attention. We illustrate this in the context of the Posner paradigm, using the simulations to generate both psychophysical and electrophysiological responses. These simulated responses are consistent with attentional bias or gating, competition for attentional resources, attentional capture and associated speed-accuracy trade-offs. Furthermore, if we present both attended and non-attended stimuli simultaneously, biased competition for neuronal representation emerges as a principled and straightforward property of Bayes-optimal perception.

Feldman, Harriet; Friston, Karl J.

2010-01-01

266

Attention, uncertainty, and free-energy.  

PubMed

We suggested recently that attention can be understood as inferring the level of uncertainty or precision during hierarchical perception. In this paper, we try to substantiate this claim using neuronal simulations of directed spatial attention and biased competition. These simulations assume that neuronal activity encodes a probabilistic representation of the world that optimizes free-energy in a Bayesian fashion. Because free-energy bounds surprise or the (negative) log-evidence for internal models of the world, this optimization can be regarded as evidence accumulation or (generalized) predictive coding. Crucially, both predictions about the state of the world generating sensory data and the precision of those data have to be optimized. Here, we show that if the precision depends on the states, one can explain many aspects of attention. We illustrate this in the context of the Posner paradigm, using the simulations to generate both psychophysical and electrophysiological responses. These simulated responses are consistent with attentional bias or gating, competition for attentional resources, attentional capture and associated speed-accuracy trade-offs. Furthermore, if we present both attended and non-attended stimuli simultaneously, biased competition for neuronal representation emerges as a principled and straightforward property of Bayes-optimal perception. PMID:21160551

Feldman, Harriet; Friston, Karl J

2010-12-02

267

Principles of Semiconductor Devices  

NSDL National Science Digital Library

Home page of an online and interactive textbook, Principles of Semiconductor Devices, written by Bart J. Van Zeghbroeck, Ph.D., Professor in the Department of Electrical and Computer Engineering at the University of Colorado at Boulder. The goal of this text is to provide the basic principles of common semiconductor devices, with a special focus on Metal-Oxide-Semiconductor Field-Effect-Transistors (MOSFETs). A browser environment was chosen so that text, figures and equations can be linked for easy reference. A table of contents, a glossary, active figures and some study aids are integrated with the text with the intention to provide a more effective reference and learning environment. Chapter titles include: Semiconductor Fundamentals, Metal-Semiconductor Junctions, p-n Junctions, Bipolar Transistors, MOS Capacitors, and MOSFET.

Van Zeghbroeck, Bart J.

2011-06-13

268

Basic Principles of Ultrasound  

NSDL National Science Digital Library

Created by a team of medical professionals and health-care specialists, the main Echo Web site contains a wide range of resources dealing primarily with diagnostic ultrasounds, sonography, and the field of echocardiography. One of the most helpful of these resources is the Basic Principles of Ultrasound online course, which is available here at no cost. The course itself is divided into six different sections, along with a bibliography and FAQ area. Visitors can use the online course to learn about the basic principles of ultrasound, the basic science behind related devices and instruments, and the ways to use these devices safely. Instructors might also do well to use this website in conjunction with lectures on the subject, or as a way to give students an additional resource to consult at their leisure.

2004-01-01

269

White coat principles.  

PubMed

The White Coat Ceremony, which many dental schools use to mark the transition to patient care, is an opportunity to reflect on the values of dental practice. Eight principles are offered for consideration: 1) patient care is the point of practice; 2) the doctor-patient relationship is essential; 3) discuss options and possibilities; 4) mistakes will be made; 5) tell the truth; 6) be assertive; 7) consult; and 8) manage your stress and your life. PMID:15948496

Peltier, Bruce N

2004-01-01

270

Principles of lake sedimentology  

SciTech Connect

This book presents a comprehensive outline of the basic sedimentological principles for lakes, and focuses on environmental aspects and matters related to lake management and control, that is, on lake ecology rather than lake geology. This is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents (abridged): Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.

Janasson, L.

1983-01-01

271

PHYSICAL PRINCIPLES OF MAMMOGRAPHY  

Microsoft Academic Search

An outline is given of the underlying physical principles that govern the selection and use of systems for X-ray mammography. Particular attention is paid to screen-film mammography as some aspects of digital mammography are considered in another lecture. The size and composition of the compressed female breast and of calcifications are described and the magnitude of photon interaction processes in

DAVID R. DANCE

272

Principles of nuclear geology  

SciTech Connect

This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology to radiogenic heat and the thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are also addressed.

Aswathanarayana, U.

1985-01-01

273

Principles of Geriatric Surgery  

Microsoft Academic Search

The world population is aging and the conditions that require cardiothoracic surgery – atherosclerosis, lung and esophageal cancer, degenerative valve disease, dysrhythmia, and others – increase in incidence with increasing age. What do we know about surgery in the elderly that will help us improve our care of these conditions? Six general principles are useful for teaching purposes. These include

Mark R. Katlic

274

Policy Uncertainty and Precautionary Savings  

Microsoft Academic Search

This paper uses German micro data and a quasi-natural experiment to provide new evidence on the empirical importance of precautionary savings. Our quasi-natural experiment draws on a sharp increase in uncertainty (as reported in a survey of German citizens) observed in the run-up to the 1998 general election. Our estimates are obtained from a diff-in-diff estimator and thus overcome the

Francesco Giavazzi; Michael McMahon

2009-01-01

275

Uncertainty in the Semantic Web  

Microsoft Academic Search

During the recent decade, significant research activities have been directed towards the Semantic Web as a future substitute of the Web. Many experts predict that the next huge step forward in Web information technology will be achieved by adding semantics to Web data. An important role in research towards the Semantic Web is played by formalisms and technologies for handling uncertainty

Thomas Lukasiewicz

2009-01-01

276

Term Premiums and Inflation Uncertainty  

Microsoft Academic Search

Abstract This paper provides cross-country empirical evidence on term premia, inflation uncertainty, and their relationship. It has three components. First, I construct a panel of zero-coupon nominal government bond yields spanning ten countries and eighteen years. From these, I construct forward rates and decompose these into expected future short-term interest rates and term premiums, using both statistical methods (an affine term

Jonathan H. Wright

277

Coal blending optimization under uncertainty  

Microsoft Academic Search

Coal blending is one of several options available for reducing sulfur emissions from coal-fired power plants. However, decisions about coal blending must deal with uncertainty and variability in coal properties, and with the effect of off-design coal characteristics on power plant performance and cost. To deal with these issues, a multi-objective chance-constrained optimization model is developed for an illustrative coal

Jhih-Shyang Shih; H. Christopher Frey

1995-01-01

278

Reducing Parameter Uncertainty by Noise?  

NASA Astrophysics Data System (ADS)

Within a given climate model framework, the uncertainty on the state of the climate system appears as model parameter uncertainty, hence, is linked to parameter estimation. In this contribution, the possible existence of bifurcations is explicitly taken into account. Parameter estimation based on time-averages (a standard method in the climate modeling community) is compared to a scheme suggested in [1], based on spectral properties of a stochastically forced system. The work of [1], dealing with the Stommel model and a potential breakdown of the Thermohaline Circulation, is generalized to more complex models - either with respect to spatial resolution or processes. Particular attention is paid to systems displaying different types of bifurcations. Finally, given the above dynamical setting, it is discussed in which way uncertainty is formalized best. The classical as well as the Bayesian probability approach are considered, the latter method being often important in cases of under-constrained complex models. The prospects of non-additive measures are sketched which can improve on key weaknesses of the Bayesian approach. Reference: [1] Thomas Kleinen, Hermann Held, Gerhard Petschel-Held, The potential role of spectral properties in detecting thresholds in the Earth System: Application to the Thermohaline Circulation, accepted for publication in Ocean Dynamics.

Held, H.; Kleinen, T.

2003-04-01

279

Fuzzy-algebra uncertainty assessment  

SciTech Connect

A significant number of analytical problems (for example, abnormal-environment safety analysis) depend on data that are partly or mostly subjective. Since fuzzy algebra depends on subjective operands, we have been investigating its applicability to these forms of assessment, particularly for portraying uncertainty in the results of PRA (probabilistic risk analysis) and in risk-analysis-aided decision-making. Since analysis results can be a major contributor to a safety-measure decision process, risk management depends on relating uncertainty to only known (not assumed) information. The uncertainties due to abnormal environments are even more challenging than those in normal-environment safety assessments, and therefore require an even more judicious approach. Fuzzy algebra matches these requirements well. One of the most useful aspects of this work is that we have shown the potential for significant differences (especially in perceived margin relative to a decision threshold) between fuzzy assessment and probabilistic assessment based on subtle factors inherent in the choice of probability distribution models. We have also shown the relation of fuzzy algebra assessment to "bounds" analysis, as well as a description of how analyses can migrate from bounds analysis to fuzzy-algebra analysis, and to probabilistic analysis as information about the process to be analyzed is obtained. Instructive examples are used to illustrate the points.
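As an aside on how subjective fuzzy operands can be propagated, here is a minimal sketch using triangular fuzzy numbers; the quantities and values are invented for illustration and are not taken from the report.

    # A triangular fuzzy number is written as (low, mode, high). Component-wise
    # addition propagates the support like an interval ("bounds") analysis,
    # while the mode carries the most-plausible value.
    def tfn_add(a, b):
        return tuple(x + y for x, y in zip(a, b))

    leak_severity = (0.1, 0.3, 0.6)   # subjective operand (invented)
    impact_factor = (0.0, 0.1, 0.4)   # subjective operand (invented)
    print(tfn_add(leak_severity, impact_factor))   # (0.1, 0.4, 1.0)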

Cooper, J.A. [Sandia National Labs., Albuquerque, NM (United States); Cooper, D.K. [Naval Research Lab., Washington, DC (United States)

1994-12-01

280

Charpy Machine Verification: Limits and Uncertainty.  

National Technical Information Service (NTIS)

We clarify some issues pertaining to uncertainty statements and the ASTM E 23 limits used in the Charpy machine verification program. We explain some of the distributional subtleties associated with uncertainty and ultimately relate these to the ASTM limi...

C. M. Wang; C. N. McCowan; J. D. Splett

2008-01-01

281

Application of uncertainty and variability in LCA  

Microsoft Academic Search

As yet, the application of an uncertainty and variability analysis is not common practice in LCAs. A proper analysis will be facilitated when it is clear which types of uncertainties and variabilities exist in LCAs and which tools are available to deal with them. Therefore, a framework is developed to classify types of uncertainty and variability in LCAs. Uncertainty is

Mark A. J. Huijbregts

1998-01-01

282

Guiding Principles for Diabetes Care.  

National Technical Information Service (NTIS)

The National Diabetes Education Program (NDEP) has developed these Guiding Principles for Diabetes Care to help the health care team manage the disease effectively. The principles outline seven essential components of quality diabetes care that form the b...

2004-01-01

283

Administrative Process. Principles of Organization.  

National Technical Information Service (NTIS)

The report is a training outline for a presentation for students which includes lecture, class discussion, and film. The purpose is to introduce the students to the basic principles of organization, and to relate these principles to housing management and...

1973-01-01

284

Systems-based guiding principles for risk modeling, planning, assessment, management, and communication.  

PubMed

This article is grounded on the premise that the complex process of risk assessment, management, and communication, when applied to systems of systems, should be guided by universal systems-based principles. It is written from the perspective of systems engineering with the hope and expectation that the principles introduced here will be supplemented and complemented by principles from the perspectives of other disciplines. Indeed, there is no claim that the following 10 guiding principles constitute a complete set; rather, the intent is to initiate a discussion on this important subject that will incrementally lead us to a more complete set of guiding principles. The 10 principles are as follows: First Principle: Holism is the common denominator that bridges risk analysis and systems engineering. Second Principle: The process of risk modeling, assessment, management, and communication must be systemic and integrated. Third Principle: Models and state variables are central to quantitative risk analysis. Fourth Principle: Multiple models are required to represent the essence of the multiple perspectives of complex systems of systems. Fifth Principle: Meta-modeling and subsystems integration must be derived from the intrinsic states of the system of systems. Sixth Principle: Multiple conflicting and competing objectives are inherent in risk management. Seventh Principle: Risk analysis must account for epistemic and aleatory uncertainties. Eighth Principle: Risk analysis must account for risks of low probability with extreme consequences. Ninth Principle: The time frame is central to quantitative risk analysis. Tenth Principle: Risk analysis must be holistic, adaptive, incremental, and sustainable, and it must be supported with appropriate data collection, metrics with which to measure efficacious progress, and criteria on the basis of which to act. The relevance and efficacy of each guiding principle is demonstrated by applying it to the U.S. Federal Aviation Administration complex Next Generation (NextGen) system of systems. PMID:22548671

Haimes, Yacov Y

2012-05-01

285

Uncertainty Quantification in Climate Modeling  

NASA Astrophysics Data System (ADS)

We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis requires a large number of training runs, as well as an output parameterization with respect to a fast-growing spectral basis set. To alleviate this issue, we adopt the Bayesian view of compressive sensing, well-known in the image recognition community. The technique efficiently finds a sparse representation of the model output with respect to a large number of input variables, effectively obtaining a reduced order surrogate model for the input-output relationship. The methodology is preceded by a sampling strategy that takes into account input parameter constraints by an initial mapping of the constrained domain to a hypercube via the Rosenblatt transformation, which preserves probabilities. Furthermore, a sparse quadrature sampling, specifically tailored for the reduced basis, is employed in the unconstrained domain to obtain accurate representations. The work is supported by the U.S. Department of Energy's CSSEF (Climate Science for a Sustainable Energy Future) program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
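To illustrate the general flavor of a polynomial chaos surrogate in one dimension (a toy example, not the CLM workflow described above; the model function and sample sizes are invented):

    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    def expensive_model(xi):                  # stand-in for a costly model run
        return np.sin(xi) + 0.1 * xi**2

    rng = np.random.default_rng(0)
    xi_train = rng.standard_normal(20)        # sparse set of training runs
    y_train = expensive_model(xi_train)

    deg = 4                                   # truncation order of the expansion
    coef, *_ = np.linalg.lstsq(hermevander(xi_train, deg), y_train, rcond=None)

    xi_new = rng.standard_normal(100_000)     # cheap surrogate evaluations
    y_pce = hermevander(xi_new, deg) @ coef
    print(y_pce.mean(), y_pce.std())          # propagated output statistics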

Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

2011-12-01

286

Confronting uncertainty in peatland ecohydrology  

NASA Astrophysics Data System (ADS)

Background and Rationale: Peatlands are heavily water-controlled systems; long-term peat accumulation relies on slow organic matter decay in cool, saturated soil conditions. This interdependence of ecological, hydrological and biogeochemical processes makes peatlands prime examples of ecohydrological systems. Peatland ecohydrology exhibits a number of facets of complexity in the form of multiple mutual interdependencies between physical and biological processes and structures. Uncertainty as to the underlying mechanisms that control complex systems arises from a wide variety of sources; in this paper we explore three types of uncertainty in reference to peatland ecohydrology. 1) Parameterization. Analysis of complex systems such as peatlands lends itself naturally to a simulation modelling approach. An obvious source of uncertainty under a modelling approach is that of parameterization. A central theme in modelling studies is often that of sensitivity analysis: parameters to which model behavior is sensitive must be understood with high fidelity; in less sensitive areas of a model a greater level of uncertainty may be tolerated. Using a simple peatland water-budget model we demonstrate the importance of separating uncertainty from sensitivity. Using a Monte Carlo approach to analyze the model's behavior we identify those parameters that are both uncertain and to which the model's behavior is sensitive, and which therefore exhibit the most pressing need for further research. 2) Model structure. A more subtle form of uncertainty surrounds the assumed algorithmic structure of a model. We analyze the behavior of a simple ecohydrological model of long-term peatland development. By sequentially switching different feedbacks on and off we demonstrate that the level of complexity represented in the model is of central importance to the model's behavior, distinct from parameterization. 3) Spatial heterogeneity. We examine the role of horizontal spatial heterogeneity by extending the 1-D model used in section (2) to include a horizontal dimension. The spatially-explicit model simulates the growth of a domed bog over 5,000 years using the same equations, algorithmic structures and parameter values as the one-dimensional model. However, the behavior of the two models' two state variables (peat thickness, central water-table depth) is substantially different. The inclusion of spatial heterogeneity therefore not only leads to the prediction of spatial structures that simply cannot be represented in 1-D models, but also exerts an independent effect on state variables. This finding adds weight to the argument that spatial interactions play a non-trivial role in governing the behaviour of ecohydrological systems, and that failure to take account of spatial heterogeneity may fundamentally undermine models of ecohydrological systems. Synthesis: We demonstrate how exploring and confronting sources of uncertainty in peatland ecohydrology may be used to reduce the complexity of these and other systems, and to identify clearly the most urgent priorities for future observational research.
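As an illustration of the screening idea in point (1), separating how uncertain a parameter is from how sensitive the output is to it, here is a minimal Monte Carlo sketch with an invented toy model; the parameter names, priors and model are not from the study.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    hydraulic_k = rng.uniform(0.01, 1.0, n)    # wide prior: poorly known
    porosity    = rng.uniform(0.35, 0.45, n)   # narrow prior: well constrained
    water_table = hydraulic_k**0.5 * (1.0 - porosity)   # toy model output

    def rank_corr(x, y):
        rx = np.argsort(np.argsort(x))
        ry = np.argsort(np.argsort(y))
        return np.corrcoef(rx, ry)[0, 1]

    # Parameters that are both uncertain (wide prior) and influential (high rank
    # correlation with the output) are the priority for further research.
    print(rank_corr(hydraulic_k, water_table), rank_corr(porosity, water_table))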

Morris, P. J.; Waddington, J. M.; Baird, A. J.; Belyea, L. R.

2011-12-01

287

Glyphs for Visualizing Uncertainty in Vector Fields  

Microsoft Academic Search

Environmental data have inherent uncertainty which is often ignored in visualization. Meteorological stations and doppler radars, including their time series averages, have a wealth of uncertainty information that traditional vector visualization methods such as meteorological wind barbs and arrow glyphs simply ignore. We have developed a new vector glyph to visualize uncertainty in winds and ocean currents. Our approach is

Craig M. Wittenbrink; Alex Pang; Suresh K. Lodha

1996-01-01

288

DAM SAFETY RISK ASSESSMENT WITH UNCERTAINTY ANALYSIS  

Microsoft Academic Search

The use of uncertainty analysis in conjunction with risk assessment provides enhanced information for decision makers. Uncertainties in risk analysis inputs are propagated through the risk analysis and evaluation steps of risk assessment to obtain estimates of the level of confidence in the risk assessment outcomes. This paper presents a framework for uncertainty analysis in dam safety risk assessment, including

Sanjay S. Chauhan; David S. Bowles

2003-01-01

289

Anxious Uncertainty and Reactive Approach Motivation (RAM)  

Microsoft Academic Search

In 4 experiments anxious uncertainty threats caused reactive approach motivation (RAM). In Studies 1 and 2, academic anxious uncertainty threats caused RAM as assessed by behavioral neuroscience and implicit measures of approach motivation. In Study 3 the effect of a relational anxious uncertainty threat on approach-motivated personal projects in participants' everyday lives was mediated by the idealism of those projects.

Ian McGregor; Kyle Nash; Nikki Mann; Curtis E. Phills

2010-01-01

290

Information-theoretic approach to uncertainty importance  

Microsoft Academic Search

A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory in which entropy is a measure of uncertainty of a probability density function. We define the relative uncertainty importance between two events as the
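For reference only (standard background, not the paper's specific importance measure, which is cut off in this record), the differential entropy of a probability density p is

    H(X) = -\int p(x)\,\ln p(x)\,dx ,

and it decreases as the distribution becomes more concentrated, which is what makes it usable as a measure of uncertainty.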

C. K. Park; R. A. Bari

1985-01-01

291

The Stock Market: Risk vs. Uncertainty.  

ERIC Educational Resources Information Center

This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty

Griffitts, Dawn

2002-01-01

292

10 CFR 436.24 - Uncertainty analyses.  

Code of Federal Regulations, 2010 CFR

...2009-01-01 2009-01-01 false Uncertainty analyses. 436.24 Section 436...Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items...Federal agencies may examine the impact of uncertainty on the calculation of life...

2009-01-01

293

Regarding Uncertainty in Teachers and Teaching  

ERIC Educational Resources Information Center

The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to…

Helsing, Deborah

2007-01-01

294

10 CFR 436.24 - Uncertainty analyses.  

Code of Federal Regulations, 2010 CFR

...2010-01-01 2010-01-01 false Uncertainty analyses. 436.24 Section 436...Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items...Federal agencies may examine the impact of uncertainty on the calculation of life...

2010-01-01

295

Experimentation and uncertainty analysis for engineers  

Microsoft Academic Search

The application of uncertainty analysis (UA) methods to experimental programs is discussed in an introduction for advanced undergraduate and graduate students of engineering and the physical sciences. Chapters are devoted to experimental errors and uncertainty; statistical considerations in measurement uncertainties; general UA methods for experiment planning; detailed UA methods for experiment design; problems due to variable but deterministic bias errors,

H. W. Coleman; W. G. Jr. Steele

1989-01-01

296

CONSENSUS ON GLOBAL GOVERNANCE PRINCIPLES?  

Microsoft Academic Search

Is there a consensus on governance principles beyond national borders? There seems to be a converging trend towards widely accepted Global Governance Principles, as expressed in the OECD Principles for example. This paper argues that formal and informal governance mechanisms should be integrated. The latter focus on relationship-based mechanisms enabling access to scarce resources - so typical in Asia - whereas

Peter Verhezen; Paul Morse

2009-01-01

297

The Sidetracking Meta-Principle  

Microsoft Academic Search

The sidetracking principle is nothing but an instance of the well-known principle of procrastination, advising postponement of the problematic until the inevitable has been dealt with, in the hope that the problematic will either be no longer an issue or becomes less problematic. The aim of this paper is to show how the sidetracking principle as a method and technique

Luís Moniz Pereira; José Júlio Alferes; Carlos Damásio

298

Archimedes' Principle in General Coordinates  

ERIC Educational Resources Information Center

Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

Ridgely, Charles T.

2010-01-01

299

Complex Correspondence Principle  

SciTech Connect

Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

Bender, Carl M.; Meisinger, Peter N. [Department of Physics, Washington University, St. Louis, Missouri 63130 (United States); Hook, Daniel W. [Theoretical Physics, Imperial College London, London SW7 2AZ (United Kingdom); Wang Qinghai [Department of Physics, National University of Singapore, Singapore 117542 (Singapore)

2010-02-12

300

Principles of smile design  

PubMed Central

An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result not depend on looks alone. Our ultimate goal as clinicians is to achieve a pleasing composition in the smile by creating an arrangement of various esthetic elements. This article reviews the various principles that govern the art of smile design. The literature search was done using PubMed and Medline. This article provides the reader with the basic knowledge needed to bring out a functional, stable smile.

Bhuvaneswaran, Mohan

2010-01-01

301

Principles of Seismology  

NASA Astrophysics Data System (ADS)

Principles of Seismology is targeted for upper-level undergraduate and beginning graduate students in seismology. It is quite good as an intermediate-level theoretical seismology text, and it may also serve as a general reference book for seismologists. The author intended to make this book “student-friendly,” and to a large degree he has accomplished this purpose. Compared to other recent seismology texts at a similar level, this text explains the fundamental aspects of theoretical seismology more thoroughly, and the traditional core topics are well presented.

Wu, Francis T.

302

Remote Sensing Principles  

NSDL National Science Digital Library

This introduction to Earth observation includes definitions of several terms, examples taken from real situations, and questions, answers, and exercises. A simple example of traditional chorological mapping methods is used to show some fundamental principles of satellite images. Histogram, pixel and classification are introduced. There are discussions about remote sensing, the history of Earth observation, and geostationary and solar synchronous orbits. In addition, the basic physical concepts underlying remote sensing are explained, with the help of some relatively simple viewgraphs. This site is also available in German, French, Italian and Spanish.

303

Theory Comparison: Uncertainty Reduction, Problematic Integration, Uncertainty Management, and Other Curious Constructs.  

ERIC Educational Resources Information Center

Compares three theories examining the role of communication in producing and coping with subjective uncertainty. Notes that uncertainty reduction theory offers axioms and derived theorems that describe communicative and noncommunicative causes and consequences of uncertainty. Compares meanings of "uncertainty" in the three theories as well as the…

Bradac, James J.

2001-01-01

304

Uncertainty assessment in probabilistic risk assessment  

SciTech Connect

This paper focuses on our proposal for the different roles that data and expert opinion play in uncertainty analysis. Parameters for which reliable data exist are estimated by classical statistical techniques. Their uncertainty bounds are statistical confidence limits. Uncertainty about data-free parameters is expressed as a range, or set, of plausible values, with no probabilistic connotations. For parameters with both data and opinion sources, conditional confidence limits can be used to assess both total uncertainty, and the separate contributions of data-based and data-free uncertainties.

Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

1985-01-01

305

Differentiate climate change uncertainty from other uncertainty sources for water quality modeling with Bayesian framework  

NASA Astrophysics Data System (ADS)

Prediction of water quality under future climate changes is always associated with significant uncertainty resulting from the use of climate models and stochastic weather generators. The future-related uncertainty is usually mixed with intrinsic uncertainty sources arising from model structure and parameterization, which are also present when modeling past and current events. For an effective water quality management policy, the uncertainty sources have to be differentiated and quantified separately. This work applies the Bayesian framework in two steps to quantify climate change uncertainty as input uncertainty and parameter uncertainty, respectively. The HYPE model (Hydrological Prediction for the Environment) from SMHI is applied to simulate the nutrient (N, P) sources in a 100 km2 agricultural low-land catchment in Germany, Weida. The results show that climate change shifts the uncertainty space in terms of the probability density function (PDF), and a large portion of future uncertainty is not covered by current uncertainty.

Jiang, S.; Liu, M.; Rode, M.

2011-12-01

306

Principles of Safety Pharmacology  

PubMed Central

Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculations, and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice and the burden of proof, and to highlight areas of intensive activity (such as testing for drug-induced rare event liability, and the challenge of testing the safety of so-called biologics (antibodies, gene therapy and so on)).

Pugsley, M K; Authier, S; Curtis, M J

2008-01-01

307

Principle of relative locality  

SciTech Connect

We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

Amelino-Camelia, Giovanni [Dipartimento di Fisica, Universita 'La Sapienza', and Sez. Roma1 INFN, P. le A. Moro 2, 00185 Roma (Italy); Freidel, Laurent; Smolin, Lee [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, Ontario N2J 2Y5 (Canada); Kowalski-Glikman, Jerzy [Institute for Theoretical Physics, University of Wroclaw, Pl. Maxa Borna 9, 50-204 Wroclaw (Poland)

2011-10-15

308

Great Lakes Literacy Principles  

NASA Astrophysics Data System (ADS)

Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

Fortner, Rosanne W.; Manzo, Lyndsey

2011-03-01

309

Congress probes climate change uncertainties  

NASA Astrophysics Data System (ADS)

Policymakers are demanding information about climate change faster than it can be turned out by scientists. This conflict between politics and science was debated at a recent congressional hearing on priorities in global change research. On October 8 and 10, panels of scientists that included AGU president-elect Ralph J. Cicerone of the University of California attempted to identify scientific uncertainties in global warming research before the House Science Committee's Subcommittee on Science.“Decisionmakers provided with incomplete information are left with the problem of choosing among options where the consequences of a wrong choice could be disastrous,” said subcommittee chair Rick Boucher (D-Va.).

Simarski, Lynn Teo

310

Basics of Estimating Measurement Uncertainty  

PubMed Central

Summary: All measurements are imperfect and have many potential sources of variation. An estimate of measurement uncertainty (MU) provides an interval of values within which the true value is believed to lie with a stated probability, and is therefore a quantitative indication of the reliability of a measurement. MU estimates are essential for assessing whether methods are suitable for clinical use and for comparison of results of a similar type. MU estimates can help identify method limitations and opportunities for improvement.
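As standard background for such MU estimates (the GUM law of propagation of uncertainty, not a formula specific to this article), for a measurand y = f(x_1, ..., x_N) with uncorrelated inputs the combined and expanded uncertainties are

    u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i), \qquad U = k\,u_c(y),

where k is the coverage factor (commonly k = 2 for roughly 95% coverage).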

White, Graham H

2008-01-01

311

Aspects of universally valid Heisenberg uncertainty relation  

NASA Astrophysics Data System (ADS)

A numerical illustration of a universally valid Heisenberg uncertainty relation, which was proposed recently, is presented by using the experimental data on spin-measurements by J. Erhart et al. [Nat. Phys. 8, 185 (2012)]. This uncertainty relation is closely related to a modified form of the Arthurs-Kelly uncertainty relation, which is also tested by the spin-measurements. The universally valid Heisenberg uncertainty relation always holds, but both the modified Arthurs-Kelly uncertainty relation and the Heisenberg error-disturbance relation proposed by Ozawa, which was analyzed in the original experiment, fail in the present context of spin-measurements, and the cause of their failure is identified with the assumptions of unbiased measurement and disturbance. It is also shown that all the universally valid uncertainty relations are derived from Robertson's relation and thus the essence of the uncertainty relation is exhausted by Robertson's relation, as is widely accepted.
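For reference, Robertson's relation mentioned above has the standard form

    \sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|, \qquad [A,B] = AB - BA,

which reduces to the familiar \sigma_x\,\sigma_p \ge \hbar/2 for position and momentum, where [x,p] = i\hbar.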

Fujikawa, Kazuo; Umetsu, Koichiro

2013-01-01

312

The Precautionary Principle in EU and US Chemicals Policy: A Comparison of Industrial Chemicals Legislation  

Microsoft Academic Search

In this chapter, the precautionary principle will be considered as the starting point for decision-making on chemicals in cases of scientific uncertainty. The principle will serve as the reference point for an analysis and a comparison of chemicals policies and, in particular, of legislation for industrial chemicals in the European Union and the United States of America. In the second

Mikael Karlsson

313

Scientific basis for the Precautionary Principle  

SciTech Connect

The Precautionary Principle is based on two general criteria: (a) appropriate public action should be taken in response to limited, but plausible and credible, evidence of likely and substantial harm; (b) the burden of proof is shifted from demonstrating the presence of risk to demonstrating the absence of risk. Not much has been written about the scientific basis of the precautionary principle, apart from the uncertainty that characterizes epidemiologic research on chronic disease, and the use of surrogate evidence when human evidence cannot be provided. It is proposed in this paper that a new scientific paradigm, based on the theory of evolution, is emerging; this might offer stronger support to the need for precaution in the regulation of environmental risks. Environmental hazards do not consist only in direct attacks on the integrity of DNA or other macromolecules. They can consist in changes that take place already in utero, and that condition disease risks many years later. Also, environmental exposures can act as 'stressors', inducing hypermutability (the mutator phenotype) as an adaptive response. Finally, environmental changes should be evaluated against a background of a not-so-easily modifiable genetic make-up, inherited from a period in which humans were mainly hunter-gatherers and had dietary habits very different from the current ones.

Vineis, Paolo [Imperial College London, St. Mary's Campus, Norfolk Place, W2 1PG London (United Kingdom)]. E-mail: p.vineis@imperial.ac.uk

2005-09-01

314

The legal status of uncertainty  

NASA Astrophysics Data System (ADS)

Authorities of civil protection attach great importance to scientific assessment through the widespread use of mathematical models implemented to prevent and mitigate the effects of natural hazards. These models, however, are far from deterministic; moreover, the uncertainty that characterizes them plays an important role in the scheme of prevention of natural hazards. We are, in fact, presently experiencing a detrimental increase in legal actions taken against the authorities of civil protection who, relying on the forecasts of mathematical models, fail to protect the population. It is our profound concern that civilians have been granted the right to be protected, by any means and to the same extent, from natural hazards and from the fallacious behaviour of those who should guarantee individual safety. At the same time, however, a dangerous overcriminalization could have a negative impact on the Civil Protection system, inducing a defensive behaviour which is costly and ineffective. A few case studies are presented in which the role of uncertainty in numerical predictions is made evident and discussed. Scientists thus need to help policymakers agree on sound procedures that recognize the real level of unpredictability. Hence, we suggest the creation of an international and interdisciplinary committee, with the aim of having politics, jurisprudence and science communicate, to find common solutions to a common problem.

Ferraris, L.; Miozzo, D.

2009-09-01

315

Improved water δ2H and δ18O calibration and calculation of measurement uncertainty using a simple software tool.

PubMed

The calibration of all δ(2)H and δ(18)O measurements on the VSMOW/SLAP scale should be performed consistently, based on similar principles, independent of the instrumentation used. The basic principles of a comprehensive calibration strategy are discussed taking water as example. The most common raw data corrections for memory and drift effects are described. Those corrections result in a considerable improvement in data consistency, especially in laboratories analyzing samples of quite variable isotopic composition (e.g. doubly labelled water). The need for a reliable uncertainty assessment for all measurements is discussed and an easy implementation method proposed. A versatile evaluation method based on Excel macros and spreadsheets is presented. It corrects measured raw data for memory and drift effects, performs the calibration and calculates the combined standard uncertainty for each measurement. It allows the easy implementation of the discussed principles in any user laboratory. Following these principles will improve the comparability of data among laboratories. PMID:21913248
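A minimal sketch of the two-point normalization step onto the VSMOW/SLAP scale is given below; this is the generic procedure rather than the Excel macros described in the record, and all raw numbers are invented.

    import numpy as np

    # Two reference waters measured alongside the samples anchor a linear
    # normalization of memory- and drift-corrected raw delta-2H values onto
    # the VSMOW/SLAP scale.
    ref_measured = np.array([-2.1, -430.4])   # raw instrument values (invented)
    ref_assigned = np.array([0.0, -427.5])    # assigned values of VSMOW and SLAP
    slope, intercept = np.polyfit(ref_measured, ref_assigned, 1)

    samples_raw = np.array([-55.3, -120.8])   # corrected raw sample values (invented)
    print(slope * samples_raw + intercept)    # calibrated delta-2H values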

Gröning, Manfred

2011-10-15

316

BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance  

NASA Astrophysics Data System (ADS)

Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker

Lira, Ignacio

2003-08-01

317

Novel Mathematical and Computational Techniques for Robust Uncertainty Quantification.  

National Technical Information Service (NTIS)

Uncertainty quantification refers to a broad set of techniques for understanding the impact of uncertainties in complicated mechanical and physical systems. In this context 'uncertainty' can take on many meanings. Aleatoric uncertainty refers to inherent ...

D. Gottlieb; J. Hesthaven; P. Dupuis

2011-01-01

318

Principles to minimize scars.  

PubMed

Principles to minimize scars include attention to a multitude of intrinsic and extrinsic patient factors preoperatively, operatively, and postoperatively. Preoperatively the goal is to maximize the treatment of patient specific comorbidities and limit the usage of medications that can have negative effects on healing. Operatively, the focus is on proper incisional planning, meticulous surgical technique and hemostasis, judicious use of prophylactic antibiotics, and focus on tensionless closures. Postoperatively, we must maximize the healing environment by keeping the wound well hydrated and closely monitoring and intervening early in high-risk wounds. We also have the responsibility to provide evidence-based recommendations, to the best of our ability, regarding the myriad of over-the-counter products. The field of scar prevention is ever changing and the new frontier focuses on controlling the microenvironment of wounds and altering signaling molecules to promote near scarless healing. PMID:23027213

Gantwerker, Eric Alan; Hom, David B

2012-10-01

319

The Anthropic Principle  

NASA Astrophysics Data System (ADS)

The questions that were purely in the realms of philosophy are now beginning to be answered by science. The second Venice Conference on Cosmology and Philosophy explores the anthropic principle which states that the Universe has the conditions we observe because we are here. Out of all possible universes we can only experience the restricted class that permits observers. This realization has profound implications for cosmology, philosophy and theology; all of which are explored in this book by thirteen contributors who gathered to discuss and share their theories within the context of science. The result is a unique collection of papers of great value to professional astronomers and philosophers interested in the role of observers in the Universe.

Bertola, F.; Curi, U.

1993-07-01

320

Dynamical principles in neuroscience  

SciTech Connect

Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I. [Institute for Nonlinear Science, University of California, San Diego, 9500 Gilman Drive 0402, La Jolla, California 92093-0402 (United States) and GNB, Departamento de Ingenieria Informatica, Universidad Autonoma de Madrid, 28049 Madrid, Spain and Institute for Nonlinear Science, University of California, San Diego, 9500 Gilman Drive 0402, La Jolla, California 92093-0402 (United States) and Institute for Nonlinear Science, University of California, San Diego, 9500 Gilman Drive 0402, La Jolla, California 92093-0402 (United States); Department of Physics and Marine Physical Laboratory, Scripps Institution of Oceanography and Institute for Nonlinear Science, University of California, San Diego, 9500 Gilman Drive 0402, La Jolla, California 92093-0402 (United States)

2006-10-15

321

Optimal invasive species management under multiple uncertainties.  

PubMed

The management programs for invasive species have been proposed and implemented in many regions of the world. However, practitioners and scientists have not reached a consensus on how to control them yet. One reason is the presence of various uncertainties associated with the management. To give some guidance on this issue, we characterize the optimal strategy by developing a dynamic model of invasive species management under uncertainties. In particular, focusing on (i) growth uncertainty and (ii) measurement uncertainty, we identify how these uncertainties affect optimal strategies and value functions. Our results suggest that a rise in growth uncertainty causes the optimal strategy to involve more restrained removals and the corresponding value function to shift up. Furthermore, we also find that a rise in measurement uncertainty affects optimal policies in a highly complex manner, but their corresponding value functions generally shift down as measurement uncertainty rises. Overall, a rise in growth uncertainty can be beneficial, while a rise in measurement uncertainty brings about an adverse effect, which implies the potential gain of precisely identifying the current stock size of invasive species. PMID:21704642

Kotani, Koji; Kakinaka, Makoto; Matsuda, Hiroyuki

2011-06-17

322

Theoretical uncertainty in baryon oscillations  

SciTech Connect

We discuss the systematic uncertainties in the recovery of dark energy properties from the use of baryon acoustic oscillations as a standard ruler. We demonstrate that while unknown relativistic components in the universe prior to recombination would alter the sound speed, the inferences for dark energy from low-redshift surveys are unchanged so long as the microwave background anisotropies can measure the redshift of matter-radiation equality, which they can do to sufficient accuracy. The mismeasurement of the radiation and matter densities themselves (as opposed to their ratio) would manifest as an incorrect prediction for the Hubble constant at low-redshift. In addition, these anomalies do produce subtle but detectable features in the microwave anisotropies.

Eisenstein, Daniel [Steward Observatory, University of Arizona, Tucson, Arizona 85721 (United States); White, Martin [Departments of Physics and Astronomy, University of California, Berkeley, California 94720 (United States)

2004-11-15

323

Environmental load uncertainties for offshore structures  

SciTech Connect

A methodology for assessing the effect of different sources of uncertainty on the calculation of load effect on offshore structures is presented. A consistent classification of uncertainties was adopted and used as a basis to develop models to estimate the effect of different uncertainties on specified design loads. It is shown that distribution parameter uncertainties arising from limitations on the quantity of statistical data are not likely to have a significant effect on design loads. By contrast, model uncertainties can greatly increase the design loads, and the increase is sensitive to the probabilistic models used to describe model error. The methodology and results can be used by design engineers to take model uncertainties into account in estimating specified loads. They also form the basis for developing and calibrating a new information-sensitive code format.

Nessim, M.A.; Hong, H.P. [Centre for Engineering Research Inc., Edmonton, Alberta (Canada); Jordaan, I.J. [Memorial Univ. of Newfoundland, St. John's, Newfoundland (Canada). Faculty of Engineering and Applied Science

1995-11-01

324

Bayesian Uncertainty Analyses Via Deterministic Model  

NASA Astrophysics Data System (ADS)

Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
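A minimal Gaussian illustration of conditioning the predictand on a deterministic model's output follows; it is a conjugate normal-normal toy in the spirit of the BPO, not the full formulation, and all parameter values are invented.

    # Prior for predictand W ~ N(m, s2); past (output, observation) pairs
    # suggest the model output X given W behaves like N(a*W + b, v2).
    # Observing today's output x then yields a Gaussian posterior for W.
    m, s2 = 10.0, 4.0          # prior mean and variance (invented)
    a, b, v2 = 1.0, 0.5, 1.0   # likelihood parameters (invented)
    x = 12.3                   # today's deterministic model output (invented)

    post_var = 1.0 / (1.0 / s2 + a * a / v2)
    post_mean = post_var * (m / s2 + a * (x - b) / v2)
    print(post_mean, post_var)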

Krzysztofowicz, R.

2001-05-01

325

Certainties and uncertainties of orbitally tuned timescales  

NASA Astrophysics Data System (ADS)

High precision timescales are often based on an integrated stratigraphical approach using several dating techniques. Because many records show an imprint of orbital climate forcing, these imprints can be used to obtain high resolution stratigraphies. However, uncertainties are hardly ever assigned to these timescales, mainly because they are not straightforward to quantify. We discuss different sources of uncertainty (the uncertainty of the astronomical parameters used for the tuning procedure, climatic response time to orbital forcing, non-constant sedimentation rates) and address the assignment of realistic ages and (un)certainties for chronologies based on orbital tuning. Besides discussing uncertainties, it is important to note that most cyclostratigraphic studies are based on an integrated stratigraphic approach and are not solely based on orbital tuning.

Zeeden, Christian; Rivera, Tiffany; Lourens, Lucas; Hilgen, Frederik

2013-04-01

326

Uncertainty relations and entanglement in fermion systems  

Microsoft Academic Search

The violation of uncertainty relations is used as a signature of entanglement for both pure and mixed states of two identical fermions. In the case of fermions with a four-dimensional single-particle Hilbert space we obtain several different types of uncertainty-related entanglement criteria based on local uncertainty relations, on the sum of variances of projectors, and on various entropic measures. Within

C. Zander; A. R. Plastino

2010-01-01

327

Geostatistical modelling of uncertainty in soil science  

Microsoft Academic Search

This paper addresses the issue of modelling the uncertainty about the value of continuous soil attributes, at any particular unsampled location (local uncertainty) as well as jointly over several locations (multiple-point or spatial uncertainty). Two approaches are presented: kriging-based and simulation-based techniques that can be implemented within parametric (e.g. multi-Gaussian) or non-parametric (indicator) frameworks. As expected in theory and

P. Goovaerts

2001-01-01

328

Uncertainty analysis for soil-terrain models  

Microsoft Academic Search

The aim of the study was to examine how robust soil-terrain models are to uncertainty in the source elevation data. The study site was a 74 ha agricultural field in Australia. A global positioning system was used to measure elevation and the uncertainty of the measurement, thereby allowing maps of elevation and its uncertainty to be created. Monte Carlo simulation with a

Thomas F. A. Bishop; Budiman Minasny; Alex B. Mcbratney

2006-01-01

329

Importance of scientific uncertainty in decision making  

Microsoft Academic Search

Uncertainty in environmental decision making should not be thought of as a problem that is best ignored. In fact, as is illustrated in a simple example, we often informally make use of awareness of uncertainty by hedging decisions away from large losses. This hedging can be made explicit and formalized using the methods of decision analysis. While scientific uncertainty is

Kenneth H. Reckhow

1994-01-01

330

Uncertainty Analysis in the Estimation of Exposure  

Microsoft Academic Search

This article addresses the ubiquitous nature of uncertainty in industrial hygiene-related risk assessment and defines two basic types of uncertainty: natural variability and a basic lack of knowledge. A relatively simple physical-chemical modeling example is provided as an illustration in which uncertainty and sensitivity are described using two methods, a conventional technique and a readily available and user-friendly computer simulation

Michael A. Jayjock

1997-01-01

331

Quantifying reliability uncertainty : a proof of concept  

Microsoft Academic Search

This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative

Kathleen V. Diegert; Michael A. Dvorack; James T. Ringland; Michael Joseph Mundt; Aparna Huzurbazar; John F. Lorio; Quinn Fatherley; Christine Anderson-Cook; Alyson G. Wilson; Rena M. Zurn

2009-01-01

332

Uncertainty in integrated coastal zone management  

Microsoft Academic Search

Uncertainty plays a major role in Integrated Coastal Zone Management (ICZM). A large part of this uncertainty is connected to our lack of knowledge of the integrated functioning of the coastal system and to the increasing need to act in a pro-active way. Increasingly, coastal managers are forced to take decisions based on information which is surrounded by uncertainties. Different

Henriëtte S. Otter

2000-01-01

333

The Le Châtelier Principle: How Much a Principle?  

Microsoft Academic Search

The Principle of Le Châtelier is analyzed for the case of reactions taking place with real gases. The Principle, as usually stated in textbooks, is true only for ideal gas mixtures, and may be violated when the nonideal behavior of a gas is taken into consideration.

Jaime Wisniak

1999-01-01

334

Move over LOCF: Principled Methods for Handling Missing Data in Sleep Disorder Trials  

Microsoft Academic Search

Missing data, e.g. patient attrition, are endemic in sleep disorder clinical trials. Common approaches for dealing with this situation include “complete-case analysis” and last observation carried forward (LOCF). Although these methods are simple to implement, they are deeply flawed in that they may introduce bias and underestimate uncertainty, leading to erroneous conclusions. There are alternative principled approaches, however, that are

Maren K. Olsen; Karen M. Stechuchak; Jack D. Edinger; Christi S. Ulmer; Robert F. Woolson

335

Updated uncertainty budgets for NIST thermocouple calibrations  

NASA Astrophysics Data System (ADS)

We have recently updated the uncertainty budgets for calibrations in the NIST Thermocouple Calibration Laboratory. The purpose for the updates has been to 1) revise the estimated values of the relevant uncertainty elements to reflect the current calibration facilities and methods, 2) provide uncertainty budgets for every standard calibration service offered, and 3) make the uncertainty budgets more understandable to customers by expressing all uncertainties in units of temperature (°C) rather than emf. We have updated the uncertainty budgets for fixed-point calibrations of type S, R, and B thermocouples and comparison calibrations of type R and S thermocouples using a type S reference standard. In addition, we have constructed new uncertainty budgets for comparison calibrations of type B thermocouples using a type B reference standard as well as using both a type S and type B reference standard (for calibration over a larger range). We have updated the uncertainty budgets for comparison calibrations of base-metal thermocouples using a type S reference standard and alternately using a standard platinum resistance thermometer reference standard. Finally, we have constructed new uncertainty budgets for comparison tests of noble-metal and base-metal thermoelements using a type S reference standard. A description of these updates is presented in this paper.

Meyer, C. W.; Garrity, K. M.

2013-09-01
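
A hedged sketch of the arithmetic behind an uncertainty budget expressed in temperature units: independent components are combined in quadrature into a combined standard uncertainty and then expanded with a coverage factor. The component names and values below are invented for illustration and are not the laboratory's actual budget entries.

```python
import math

# Illustrative combination of an uncertainty budget in temperature units (deg C).
# Component names and values are made up for the example; a real budget would
# use the laboratory's own evaluated Type A and Type B components.
components_degC = {
    "reference standard": 0.10,
    "emf measurement":    0.05,
    "inhomogeneity":      0.12,
    "furnace uniformity": 0.08,
}
u_c = math.sqrt(sum(u**2 for u in components_degC.values()))  # combined standard uncertainty
U = 2.0 * u_c                                                 # expanded uncertainty, k = 2
print(f"u_c = {u_c:.3f} deg C, U (k=2) = {U:.3f} deg C")
```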

336

Sources of uncertainty in pesticide fate modelling.  

PubMed

There is worldwide interest in the application of probabilistic approaches to pesticide fate models to account for uncertainty in exposure assessments. The first steps in conducting a probabilistic analysis of any system are: (i) to identify where the uncertainties come from; and (ii) to pinpoint those uncertainties that are likely to affect most of the predictions made. This article aims at addressing those two points within the context of exposure assessment for pesticides through a review of the different sources of uncertainty in pesticide fate modelling. The extensive listing of sources of uncertainty clearly demonstrates that pesticide fate modelling is laced with uncertainty. More importantly, the review suggests that the probabilistic approaches, which are typically being deployed to account for uncertainty in the pesticide fate modelling, such as Monte Carlo modelling, ignore a number of key sources of uncertainty, which are likely to have a significant effect on the prediction of environmental concentrations for pesticides (e.g. model error, modeller subjectivity). Future research should concentrate on quantifying the impact these uncertainties have on exposure assessments and on developing procedures that enable their integration within probabilistic assessments. PMID:14630412

Dubus, Igor G; Brown, Colin D; Beulke, Sabine

2003-12-30

337

Vedic principles of therapy.  

PubMed

This paper introduces Vedic principles of therapy as a holistic integration of healing and human development. The most integrative aspect is a "consciousness-based" approach in which the bottom line of the mind is consciousness itself, accessed by transcending mental activity to its simplest ground state. This directly contrasts with "unconscious-based" approaches, such as analytic, humanistic, and cognitive-behavioral approaches, which hold that the basis of the conscious mind is the unconscious. Although not presented as a specific therapeutic approach, interventions associated with this Vedic approach have extensive support in the applied research literature. A brief review of experimental research toward a general model of mind, and of cutting-edge developments in quantum physics toward nonlocal mind, shows a convergence on the ancient Vedic model of mind. Comparisons with contemporary therapies further show that the simplicity, subtlety, and holistic nature of the Vedic approach represent a significant advance over approaches which have overlooked the fundamental ground state of the mind. PMID:22225931

Boyer, R W

338

Uncertainty Analysis of CROPGRO-Cotton Model  

NASA Astrophysics Data System (ADS)

Applications of crop simulation models have become an inherent part of research and decision-making processes. Because many decisions rely solely on results obtained from simulation models, consideration of model uncertainties along with model accuracy has become increasingly important. The newly developed CROPGRO-Cotton model is a complex simulation model that has been heavily parameterized. The values of those parameters were obtained from the literature, which also carries uncertainties. The true uncertainty associated with important model parameters was not known. The objective of this study was to estimate the uncertainties associated with model parameters and the associated uncertainties in model outputs. The uncertainty assessment was carried out using the widely accepted Generalized Likelihood Uncertainty Estimation (GLUE) technique. Data for this analysis were collected from four different experiments at three geographic locations. The primary results show that the uncertainty in model input parameters was narrowed down significantly from the prior knowledge of the selected parameters. The expected means of the parameters obtained from their posterior distributions were not considerably different from their prior means and default values in the model; importantly, however, the coefficients of variation of those parameters were reduced considerably. Maximum likelihood estimates of the selected parameters improved the model performance. The fit of the model to measured LAI and biomass components was reasonably good, with R-squared values for total above-ground biomass for all four sites ranging between 0.86 and 0.98. The approximate reduction of uncertainties in input parameters ranged between 25% and 85%, and the corresponding reductions in model output uncertainties ranged between 62% and 76%. Most of the measurements were covered within the 95% confidence interval estimated from the 2.5% and 97.5% quantiles of cumulative distributions of model outputs generated from the posterior distribution of model parameters. The study demonstrated efficient estimation of uncertainties in model inputs and outputs using the widely accepted GLUE methodology.

Pathak, T. B.; Jones, J. W.; Fraisse, C.; Wright, D.; Hoogenboom, G.; Judge, J.

2009-12-01
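
The GLUE procedure described above can be sketched generically: sample parameters from their priors, score each set with a likelihood measure, retain the "behavioural" sets, and form weighted prediction bounds. The toy logistic "biomass" curve, priors, likelihood measure and threshold below are stand-ins, not the CROPGRO-Cotton model or the study's actual choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# GLUE sketch with a toy model in place of CROPGRO-Cotton (not reproduced here).
def toy_model(p, t):
    growth, cap = p
    return cap / (1.0 + np.exp(-growth * (t - 50.0)))    # stand-in "biomass" curve

t_obs = np.linspace(0, 100, 11)
obs = toy_model((0.12, 8.0), t_obs) + rng.normal(0, 0.3, t_obs.size)

n = 5000
params = np.column_stack([rng.uniform(0.05, 0.25, n),     # prior for "growth"
                          rng.uniform(5.0, 12.0, n)])     # prior for "cap"
sims = np.array([toy_model(p, t_obs) for p in params])
sse = ((sims - obs) ** 2).sum(axis=1)
like = 1.0 / sse                                          # simple likelihood measure
keep = like > np.quantile(like, 0.9)                      # "behavioural" parameter sets

sims_b, w = sims[keep], like[keep] / like[keep].sum()
lo, hi = [], []
for j in range(t_obs.size):                               # weighted 2.5% / 97.5% bounds
    idx = np.argsort(sims_b[:, j])
    s, cw = sims_b[idx, j], np.cumsum(w[idx])
    lo.append(np.interp(0.025, cw, s))
    hi.append(np.interp(0.975, cw, s))
print(np.round(lo, 2), np.round(hi, 2))
```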

339

Basic principles of celestial navigation  

Microsoft Academic Search

Celestial navigation is a technique for determining one's geographic position by the observation of identified stars, identified planets, the Sun, and the Moon. This subject has a multitude of refinements which, although valuable to a professional navigator, tend to obscure the basic principles. I describe these principles, give an analytical solution of the classical two-star-sight problem without any dependence on

James A. van Allen

2004-01-01

340

A variational principle of hydromechanics  

Microsoft Academic Search

1. The purpose of this paper is to reveal the part played in variational principles of hydromechanics by a certain group of infinitesimal transformations of the fields of density and velocity. This group, called here the hydromechanical variation and originating from some elementary requirement concerning variation of matter, seems to be essential in variational principles of hydromechanics, not only in

S. Drobot; A. Rybarski

1958-01-01

341

Binding Principles in Down syndrome  

Microsoft Academic Search

In an experiment designed to tap into knowledge of Binding in individuals with Down syndrome (DS), it was found that subjects had specific difficulties assigning appropriate interpretation to reflexives, traditionally claimed to be governed by Principle A of standard Binding Theory, as opposed to pronouns, constrained by Principle B in the same framework. This pattern, not previously evidenced in the

ALEXANDRA PEROVIC

2001-01-01

342

Principles of canonical action research  

Microsoft Academic Search

Despite the growing prominence of canonical action research (CAR) in the information systems discipline, a paucity of methodological guidance continues to hamper those conducting and evaluating such studies. This article elicits a set of five principles and associated criteria to help assure both the rigor and the relevance of CAR in information systems. The first principle relates to the

Robert M. Davison; Maris G. Martinsons; Ned Kock

2004-01-01

343

Data Analysis: Fundamental Counting Principle  

NSDL National Science Digital Library

This lesson plan presents an activity where students use charts and tree diagrams to show the possible outcomes of probability experiments and the likelihood of each event. In the plan the teacher guides the class to understand and apply the fundamental counting principle. Two independent worksheets provide students with more practice creating sample spaces and applying the fundamental counting principle.

2012-01-01

344

Some principles for youth learning  

Microsoft Academic Search

This paper proposes some principles for youth learning developed from a major research project. Specific parts of the project have been published in other literature, and this paper summarises key findings before proposing a set of principles to support their learning. The findings of the research about youth learners and how they learn were analysed in the context of adult

Sarojni Choy; Brian Delahaye

345

Principles for Participatory Action Research  

Microsoft Academic Search

For serious practitioners of participatory action research, it is helpful to identify its principles. This paper outlines some principles of participatory action research in Australia that have been derived from theory and practice in both Western and cross-cultural contexts. Participatory action research is identified with critical social theory and is exemplified with two perspectives from participatory action research in Northern

Robin McTaggart

1991-01-01

346

Principles of Play for Soccer  

ERIC Educational Resources Information Center

Soccer coaches must understand the principles of play if they want to succeed. The principles of play are the rules of action that support the basic objectives of soccer and the foundation of a soccer coaching strategy. They serve as a set of permanent criteria that coaches can use to evaluate the efforts of their team. In this article, the author…

Ouellette, John

2004-01-01

347

PRINCIPLES AND PROCEDURES ON FISCAL  

Microsoft Academic Search

Fiscal science appears in most analytical situations, and its principles are reiterated by specialists in the field in various specialized works. The two components of taxation, the theoretical tax system and the practical procedures relating to tax, are marked by frequent references to and invocations of the underlying principles of taxation. This paper attempts a return to fiscal equity

Morar Ioan Dan

2011-01-01

348

Principles, Practices, and Social Movements  

Microsoft Academic Search

Consider two current controversies in American law and politics: the first is whether the expansion of copyright, trademark, and other forms of intellectual property conflicts with the free speech principle; the second is whether government collection and use of racial data (in the census or in law enforcement) violates the antidiscrimination principle. What do these controversies have in common? Both

Jack M Balkin; Reva B Siegel

2006-01-01

349

Investment and institutional uncertainty: A comparative study of different uncertainty measures  

Microsoft Academic Search

Investment and Institutional Uncertainty: A Comparative Study of Different Uncertainty Measures. — There is ample empirical evidence of a negative relationship between aspects of institutional uncertainty and investment. Most studies, however, do not allow a comparison between different dimensions of such uncertainty because they focus on specific indicators, particular regions or different periods. The paper concludes with an evaluation of

Aymo Brunetti; Beatrice Weder

1998-01-01

350

The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes  

Microsoft Academic Search

A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

Pierluigi Fortini; Roberto Onofrio; Alessandro Rioli

1993-01-01

351

Herbivore and pathogen damage on grassland and woodland plants: a test of the herbivore uncertainty principle  

Microsoft Academic Search

Researchers can alter the behaviour and ecology of their study organisms by conducting such seemingly benign activities as non-destructive measurements and observations. In plant communities, researcher visitation and measurement of plants may increase herbivore damage in some plant species while decreasing it in others. Simply measuring plants could change their competitive ability by altering the amount of herbivore damage that

Stefan A. Schnitzer; Peter B. Reich; Belle Bergner; Walter P. Carson

2002-01-01

352

Physical Principles of Evolution  

NASA Astrophysics Data System (ADS)

Theoretical biology is incomplete without a comprehensive theory of evolution, since evolution is at the core of biological thought. Evolution is visualized as a migration process in genotype or sequence space that is either an adaptive walk driven by some fitness gradient or a random walk in the absence of (sufficiently large) fitness differences. The Darwinian concept of natural selection consisting in the interplay of variation and selection is based on a dichotomy: All variations occur on genotypes whereas selection operates on phenotypes, and relations between genotypes and phenotypes, as encapsulated in a mapping from genotype space into phenotype space, are central to an understanding of evolution. Fitness is conceived as a function of the phenotype, represented by a second mapping from phenotype space into nonnegative real numbers. In the biology of organisms, genotype-phenotype maps are enormously complex and relevant information on them is exceedingly scarce. The situation is better in the case of viruses but so far only one example of a genotype-phenotype map, the mapping of RNA sequences into RNA secondary structures, has been investigated in sufficient detail. It provides direct information on RNA selection in vitro and test-tube evolution, and it is a basis for testing in silico evolution on a realistic fitness landscape. Most of the modeling efforts in theoretical and mathematical biology today are done by means of differential equations but stochastic effects are of undeniably great importance for evolution. Population sizes are much smaller than the numbers of genotypes constituting sequence space. Every mutant, after all, has to begin with a single copy. Evolution can be modeled by a chemical master equation, which (in principle) can be approximated by a stochastic differential equation. In addition, simulation tools are available that compute trajectories for master equations. The accessible population sizes in the range of 10^7 ≤ N ≤ 10^8 molecules are commonly too small for problems in chemistry but sufficient for biology.

Schuster, Peter
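
Since the abstract mentions simulation tools that compute trajectories of a chemical master equation, a minimal Gillespie-type sketch for a two-genotype replication-mutation process is shown below. The rates, mutation probability and population sizes are invented, and the model is far simpler than anything discussed in the record above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Gillespie-type trajectory of a chemical master equation for a minimal
# two-genotype replication model with mutation (all values are invented;
# this only sketches the kind of stochastic simulation mentioned above).
f = np.array([1.0, 1.2])        # replication rates of genotypes 0 and 1
mu = 0.01                       # mutation probability per replication
n = np.array([50, 1])           # initial copy numbers
t, t_end = 0.0, 5.0

while t < t_end and n.sum() > 0:
    rates = f * n               # propensity of a replication event per genotype
    total = rates.sum()
    t += rng.exponential(1.0 / total)                       # time to next event
    parent = 0 if rng.uniform() < rates[0] / total else 1   # which genotype replicates
    child = parent if rng.uniform() > mu else 1 - parent    # possible mutation
    n[child] += 1
print(f"t = {t:.2f}, copy numbers = {n}")
```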

353

Principles of animal extrapolation  

SciTech Connect

Animal Extrapolation presents a comprehensive examination of the scientific issues involved in extrapolating results of animal experiments to human response. This text attempts to present a comprehensive synthesis and analysis of the host of biomedical and toxicological studies of interspecies extrapolation. Calabrese's work not only presents the conceptual basis of interspecies extrapolation, but also illustrates how these principles may be better used in selection of animal experimentation models and in the interpretation of animal experimental results. The book's theme centers around four types of extrapolation: (1) from the average animal model to the average human; (2) from small animals to large ones; (3) from the high-risk animal to the high-risk human; and (4) from high doses of exposure to lower, more realistic, doses. Calabrese attacks the issues of interspecies extrapolation by dealing individually with the factors which contribute to interspecies variability: differences in absorption, intestinal flora, tissue distribution, metabolism, repair mechanisms, and excretion. From this foundation, Calabrese then discusses the heterogeneity of these same factors in the human population in an attempt to evaluate the representativeness of various animal models in light of interindividual variations. In addition to discussing the question of suitable animal models for specific high-risk groups and specific toxicological endpoints, the author also examines extrapolation questions related to the use of short-term tests to predict long-term human carcinogenicity and birth defects. The book is comprehensive in scope and specific in detail; for those environmental health professionals seeking to understand the toxicological models which underlie health risk assessments, Animal Extrapolation is a valuable information source.

Calabrese, E.J.

1991-01-01

354

Individuation, Counting, and Statistical Inference: The Role of Frequency and Whole-Object Representations in Judgment Under Uncertainty  

Microsoft Academic Search

Evolutionary approaches to judgment under uncertainty have led to new data showing that untutored subjects reliably produce judgments that conform to many principles of probability theory when (a) they are asked to compute a frequency instead of the probability of a single event and (b) the relevant information is expressed as frequencies. But are the frequency-computation systems implicated in these

Gary L. Brase; Leda Cosmides; John Tooby

1998-01-01

355

Funding the Unfundable: Mechanisms for Managing Uncertainty in Decisions on the Introduction of New and Innovative Technologies into Healthcare Systems  

Microsoft Academic Search

As tensions between payers, responsible for ensuring prudent and principled use of scarce resources, and both providers and patients, who legitimately want access to technologies from which they could benefit, continue to mount, interest in approaches to managing the uncertainty surrounding the introduction of new health technologies has heightened. The purpose of this project was to compile an inventory of

Tania Stafinski; Christopher J. McCabe; Devidas Menon

2010-01-01

356

Environmental uncertainty and vertical integration in a small business network : The case of Natural Valley Farms Inc  

Microsoft Academic Search

Purpose – The paper intends to identify and explain key managerial principles for vertical integration in the cattle industry during a key period of environmental uncertainty. Design/methodology/approach – Following Yin's advice on using case studies for exploratory theory development, this study builds on existing theories of vertical integration through a case study that explores potential prospects for cattle producers in

Sylvain Charlebois; Ronald D. Camp II

2007-01-01

357

Individuation, counting, and statistical inference: The role of frequency and whole-object representations in judgment under uncertainty  

Microsoft Academic Search

Evolutionary approaches to judgment under uncertainty have led to new data showing that untutored subjects reliably produce judgments that conform to many principles of probability theory when (a) they are asked to compute a frequency instead of the probability of a single event, and (b) the relevant information is expressed as frequencies. But are the frequency-computation systems implicated in

Gary L. Brase; Leda Cosmides; John Tooby

1998-01-01

358

Relational Uncertainty Predicting Appraisals of Face Threat in Courtship: Integrating Uncertainty Reduction Theory and Politeness Theory  

Microsoft Academic Search

This article combines uncertainty reduction theory and politeness theory to deduce hypotheses about how relational uncertainty may predict appraisals of face threat. Participants were 273 individuals who (a) reported on relational uncertainty; (b) simulated leaving a voice mail message for their dating partner to address the assigned goal of catching up, sharing information, giving comfort, or receiving comfort; and (c)

Leanne K. Knobloch; Kristen L. Satterlee; Stephen M. DiDomenico

2010-01-01

359

Robustness with respect to delay uncertainties of a predictor-observer based discrete-time controller  

Microsoft Academic Search

This paper focuses on the delay-dependent stability problem of a discrete-time prediction scheme to stabilize possible unstable continuous-time systems. The delay-dependent stability condition is expressed in terms of LMIs. The separation principle between the proposed predictor and a state observer is also proved. The closed-loop system is shown to be robust with respect to uncertainties in the knowledge on the

P. Garcia; P. Castillo; R. Lozano; P. Albertos

2006-01-01

360

Free Motion of a Dirac Particle with a Minimum Uncertainty in Position  

NASA Astrophysics Data System (ADS)

In this paper, we present a covariant, relativistic noncommutative algebra which includes two small deformation parameters. Using this algebra, we obtain a generalized uncertainty principle which predicts a minimal observable length in measurement of space-time distances. Then, we introduce a new representation for coordinate and momentum operators which leads to a generalized Dirac equation. The solutions of the generalized Dirac equation for a free particle will be explicitly obtained. We also obtain the modified fermionic propagator for a free Dirac particle.

Shokrollahi, Arman

2012-08-01
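
For context, the textbook one-parameter form of a generalized uncertainty principle with a minimal length is recalled below; the paper's two-parameter covariant algebra is not reproduced here, so this should be read only as the standard special case.

```latex
% Common one-parameter GUP with a minimal length (textbook special case;
% the paper's two-parameter covariant algebra is not shown here).
\[
  [\hat{x},\hat{p}] = i\hbar\left(1+\beta \hat{p}^{2}\right)
  \;\Longrightarrow\;
  \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta(\Delta p)^{2}+\beta\langle\hat{p}\rangle^{2}\right),
\]
\[
  \Delta x_{\min} = \hbar\sqrt{\beta}
  \quad\text{(attained at } \langle\hat{p}\rangle = 0,\ \Delta p = 1/\sqrt{\beta}\text{)}.
\]
```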

361

Uncertainty and Learning in Pharmaceutical Demand  

Microsoft Academic Search

Exploiting a rich panel data set on anti-ulcer drug prescriptions, we measure the effects of uncertainty and learning in the demand for pharmaceutical drugs. We estimate a dynamic matching model of demand under uncertainty in which patients learn from prescription experience about the effectiveness of alternative drugs. Unlike previous models, we allow drugs to have distinct symptomatic and curative effects,

Gregory S. Crawford; Matthew Shum

2005-01-01

362

UNCERTAINTY AND LEARNING IN PHARMACEUTICAL DEMAND  

Microsoft Academic Search

Exploiting a rich panel data set on anti-ulcer drug prescriptions, we measure the effects of uncertainty and learning in the demand for pharmaceutical drugs. We estimate a dynamic matching model of demand under uncertainty in which patients learn from prescription experience about the effectiveness of alternative drugs. Unlike previous models, we allow drugs to have distinct symptomatic and curative

GREGORY S. CRAWFORD; MATTHEW SHUM

2000-01-01

363

Uncertainties in Astrophysical Capture Rate Calculations  

NASA Astrophysics Data System (ADS)

Astrophysical capture rates can be calculated within the framework of Hauser-Feshbach theory for nuclei for which there is no experimental data. However, there remain large uncertainties in the calculations. We investigate these uncertainties and demonstrate their effect on nucleosynthesis simulations.

Bertolli, Michael

2012-10-01

364

Human capital and growth under political uncertainty  

Microsoft Academic Search

In this paper we show how political uncertainty may impede economic growth by reducing public investment in the formation of human capital, and how this negative effect of political uncertainty can be offset by a government contract. We present a model of growth with accumulation of human capital and government investment in education. We show that in a country with

Nigar Hashimzade; George Davis

2006-01-01

365

Critical analysis of uncertainties during particle filtration.  

PubMed

Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of microspheres suspension have the greatest influence on the overall CSU in filter permeability which excellently agrees with results obtained from Monte Carlo simulations. Two model parameters "maximum critical retention concentration" and "minimum injection velocity" and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces experimental "critical retention concentration vs velocity"-data better than the second model which contains the total electrostatic force whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data. PMID:23020418

Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

2012-09-01
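
The comparison described above, between the law of propagation of uncertainties and Monte Carlo simulation, can be sketched for a Darcy-law permeability k = q*mu*L/(A*dp). All input values and standard uncertainties below are illustrative and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# First-order propagation vs. Monte Carlo for permeability from Darcy's law,
# k = q*mu*L / (A*dp). Values and standard uncertainties are illustrative only.
q,  u_q  = 2.0e-8, 4.0e-10   # volumetric flow rate, m^3/s
mu, u_mu = 1.0e-3, 2.0e-5    # dynamic viscosity, Pa*s
L,  u_L  = 0.02,   1.0e-4    # filter length, m
A,  u_A  = 3.0e-4, 2.0e-6    # cross-sectional area, m^2
dp, u_dp = 5.0e4,  2.0e2     # pressure drop, Pa

k = q * mu * L / (A * dp)
# For a product/quotient of independent inputs, relative variances add
rel = [(u_q/q)**2, (u_mu/mu)**2, (u_L/L)**2, (u_A/A)**2, (u_dp/dp)**2]
u_k = k * np.sqrt(sum(rel))

# Monte Carlo check with independent Gaussian inputs
N = 200_000
samples = (rng.normal(q, u_q, N) * rng.normal(mu, u_mu, N) * rng.normal(L, u_L, N)
           / (rng.normal(A, u_A, N) * rng.normal(dp, u_dp, N)))
print(f"propagation: {u_k:.3e}  Monte Carlo: {samples.std(ddof=1):.3e}")
```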

366

Time Dependent Uncertainties and Optimal Control  

Microsoft Academic Search

Optimal control problems involve the difficult task of determining time-varying profiles through dynamic optimization. Such problems become even more complex in practical situations where handling time dependent uncertainties becomes an important issue. This work presents a new approach based on real option theory to the solution of optimal control problems under uncertainty. First, using the fundamentals provided by real option

Vicente Rico-Ramirez; Salvador Hernandez-Castro; Urmila M. Diwekar

2004-01-01

367

Production uncertainty and trade policy commitment  

Microsoft Academic Search

Agricultural markets are characterized by production and marketing lags. Uncertainty is also an inherent feature of agricultural markets. This paper investigates if two policy active importers will choose to commit to their import levels or keep the flexibility to revise their ex-ante import levels once production decisions are made and the uncertainty is resolved. This is the constant dilemma faced

Jean-Philippe Gervais

2001-01-01

368

Optimal linear filtering under parameter uncertainty  

Microsoft Academic Search

This paper addresses the problem of designing a guaranteed minimum error variance robust filter for convex bounded parameter uncertainty in the state, output, and input matrices. The design procedure is valid for linear filters that are obtained from the minimization of an upper bound of the error variance holding for all admissible parameter uncertainty. The results provided generalize the ones

Jose C. Geromel

1999-01-01

369

Methods of Dealing with Uncertainty: Panel Presentation.  

ERIC Educational Resources Information Center

Rising energy costs, changing tax bases, increasing numbers of non-traditional students, and ever changing educational technology point to the fact that community college administrators will have to accept uncertainty as a normal planning component. Rather than ignoring uncertainty, a tendency that was evidenced in the reluctance of…

Pearce, Frank C.

370

Identifying uncertainties in Arctic climate change projections  

NASA Astrophysics Data System (ADS)

Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate. We also find that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

2013-06-01

371

MODELING TECHNICAL TRADE BARRIERS UNDER UNCERTAINTY  

Microsoft Academic Search

As traditional forms of agricultural protection continue to decline, agricultural interests will likely seek alternative protection in the form of technical barriers. A flexible framework for theoretically and empirically analyzing technical barriers under various sources of uncertainty is derived. Attention is focused on uncertainty arising from the variation in the product attribute levels, a source not yet considered by the

Alan P. Ker

2000-01-01

372

Pay-performance sensitivity and production uncertainty  

Microsoft Academic Search

The pay-performance sensitivity of linear incentive contracts can increase with increasing production uncertainty, depending upon the timing and nature of this uncertainty. This provides a possible explanation for the failure of empirical tests to yield convincing support for a negative relationship.

Ján Zábojník

1996-01-01

373

Uncertainty identification by the maximum likelihood method  

Microsoft Academic Search

To incorporate uncertainty in structural analysis, a knowledge of the uncertainty in the model parameters is required. This paper describes efficient techniques to identify and quantify variability in the parameters from experimental data by maximising the likelihood of the measurements, using the well-established Monte Carlo or perturbation methods for the likelihood computation. These techniques are validated numerically and experimentally on

José R. Fonseca; Michael I. Friswell; John E. Mottershead; Arthur W. Lees

2005-01-01
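
A minimal sketch of identifying parameter variability by maximising the likelihood of measurements, assuming a Gaussian description of the scatter. The "measured frequencies" are invented, and the optimisation is far simpler than the Monte Carlo and perturbation-based likelihood evaluation used in the paper above.

```python
import numpy as np
from scipy.optimize import minimize

# Toy illustration of identifying parameter variability by maximum likelihood:
# measured first natural frequencies (Hz) of nominally identical structures are
# assumed Gaussian with unknown mean and spread. Data values are invented.
f_meas = np.array([51.2, 49.8, 50.5, 52.1, 48.9, 50.9, 51.6, 49.4])

def neg_log_lik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                 # keep sigma positive
    return 0.5 * np.sum(((f_meas - mu) / sigma) ** 2) + f_meas.size * log_sigma

res = minimize(neg_log_lik, x0=[50.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"identified mean {mu_hat:.2f} Hz, spread {sigma_hat:.2f} Hz")
```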

374

Return policy in product reuse under uncertainty  

Microsoft Academic Search

One complicating factor in a reverse logistics activity is the uncertainty in the volume of the reverse product flow coupled with uncertain demand. These uncertainties are creating a problem for the reuse businesses because, in order to have a profitable business, their plants need some minimum number of used products to operate efficiently. Several studies have indicated that there is

Samar K. Mukhopadhyay; Robert Setaputra

2011-01-01

375

Programmatic methods for addressing contaminated volume uncertainties  

Microsoft Academic Search

Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly

C. R. Rieman; H. L. Spector; L. A. Durham; R. L. Johnson

2007-01-01

376

Programmatic methods for addressing contaminated volume uncertainties  

Microsoft Academic Search

Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the preremedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly

L. A. DURHAM; R. L. JOHNSON; C. R. RIEMAN; H. L. SPECTOR

2007-01-01

377

Uncertainty and Engagement with Learning Games  

ERIC Educational Resources Information Center

Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.…

Howard-Jones, Paul A.; Demetriou, Skevi

2009-01-01

378

The Economic Implications of Carbon Cycle Uncertainty  

SciTech Connect

This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon-dioxide concentrations. We find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, equivalent to a change in concentration target of up to 100 ppmv.

Smith, Steven J.; Edmonds, James A.

2006-10-17

379

Observations on the worst case uncertainty  

NASA Astrophysics Data System (ADS)

The paper discusses the computation of the worst case uncertainty (WCU) in common measurement problems. The usefulness of computing the WCU in addition to the standard uncertainty is illustrated. A set of equations to compute the WCU in almost all practical situations is presented. The application of the equations to real-world cases is shown.

Fabbiano, Laura; Giaquinto, Nicola; Savino, Mario; Vacca, Gaetano

2013-09-01

380

Uncertainty Propagation in an Ecosystem Nutrient Budget.  

EPA Science Inventory

New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of fr...
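
A hedged sketch of propagating standard errors through a budget of the kind described above: independent terms add, and their standard errors combine in quadrature. The term names and numbers below are hypothetical and are not the actual Gulf of Mexico budget.

```python
import math

# Propagating standard errors through a simple budget (net = inputs - outputs),
# assuming independent terms. Terms and values are invented for illustration.
terms = {                      # (estimate, standard error), arbitrary units
    "river load":        (+120.0, 15.0),
    "atmospheric dep.":  (+ 30.0,  6.0),
    "benthic flux":      (+ 18.0,  9.0),
    "export to shelf":   (- 95.0, 20.0),
    "denitrification":   (- 40.0, 12.0),
}
net = sum(v for v, _ in terms.values())
se  = math.sqrt(sum(s**2 for _, s in terms.values()))   # errors add in quadrature
print(f"net budget term = {net:.1f} +/- {se:.1f}")
```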

381

Critical analysis of uncertainties during particle filtration  

NASA Astrophysics Data System (ADS)

Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of microspheres suspension have the greatest influence on the overall CSU in filter permeability which excellently agrees with results obtained from Monte Carlo simulations. Two model parameters ``maximum critical retention concentration'' and ``minimum injection velocity'' and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces experimental ``critical retention concentration vs velocity''-data better than the second model which contains the total electrostatic force whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data.

Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

2012-09-01

382

Nonclassicality in phase-number uncertainty relations  

SciTech Connect

We show that there are nonclassical states with lesser joint fluctuations of phase and number than any classical state. This is rather paradoxical since one would expect classical coherent states to be always of minimum uncertainty. The same result is obtained when we replace phase by a phase-dependent field quadrature. Number and phase uncertainties are assessed using variance and Holevo relation.

Matia-Hernando, Paloma; Luis, Alfredo [Departamento de Optica, Facultad de Ciencias Fisicas, Universidad Complutense, 28040 Madrid (Spain)

2011-12-15

383

Uncertainty and Engagement with Learning Games  

ERIC Educational Resources Information Center

Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.…

Howard-Jones, Paul A.; Demetriou, Skevi

2009-01-01

384

Uncertainty and Decisions in Medical Informatics 1  

Microsoft Academic Search

This paper presents a tutorial introduction to the handling of uncertainty and decision-making in medical reasoning systems. It focuses on the central role of uncertainty in all of medicine and identifies the major themes that arise in research papers. It then reviews simple Bayesian formulations of the problem and pursues their generalization to the Bayesian network methods that are

Peter Szolovits

1995-01-01

385

UNCERTAINTY MODELING VIA FREQUENCY DOMAIN MODEL VALIDATION  

Microsoft Academic Search

The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much attention in the literature. The influence of the level of accuracy of the uncertainty model on closed loop performance has received

Martin R. Waszak

1999-01-01

386

The Marketing Mix Decision Under Uncertainty  

Microsoft Academic Search

This paper develops a marketing mix model under uncertainty using the Capital Asset Pricing Model (CAPM) valuation framework. The model is general because it treats price, advertising and personal selling simultaneously, and allows for general patterns of uncertainty. Because the manager often lacks precise quantitative information about the sales response function, the analysis focuses on the qualitative properties of the

Harsharanjeet S. Jagpal; Ivan E. Brick

1982-01-01

387

Assessing uncertainties in urban drainage models  

NASA Astrophysics Data System (ADS)

The current state of knowledge regarding uncertainties in urban drainage models is poor. This is in part due to the lack of clarity in the way model uncertainty analyses are conducted and how the results are presented and used. There is a need for a common terminology and a conceptual framework for describing and estimating uncertainties in urban drainage models. Practical tools for the assessment of model uncertainties for a range of urban drainage models are also required to be developed. This paper, produced by the International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, is a contribution to the development of a harmonised framework for defining and assessing uncertainties in the field of urban drainage modelling. The sources of uncertainties in urban drainage models and their links are initially mapped out. This is followed by an evaluation of each source, including a discussion of its definition and an evaluation of methods that could be used to assess its overall importance. Finally, an approach for a Global Assessment of Modelling Uncertainties (GAMU) is proposed, which presents a new framework for mapping and quantifying sources of uncertainty in urban drainage models.

Deletic, A.; Dotto, C. B. S.; McCarthy, D. T.; Kleidorfer, M.; Freni, G.; Mannina, G.; Uhl, M.; Henrichs, M.; Fletcher, T. D.; Rauch, W.; Bertrand-Krajewski, J. L.; Tait, S.

388

EDITORIAL: Squeezed states and uncertainty relations  

NASA Astrophysics Data System (ADS)

This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9-13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México. This series of meetings began at the University of Maryland, College Park, USA, in March 1991. The second and third workshops were organized by the Lebedev Physical Institute in Moscow, Russia, in 1992 and by the University of Maryland Baltimore County, USA, in 1993, respectively. Afterwards, it was decided that the workshop series should be held every two years. Thus the fourth meeting took place at the University of Shanxi in China and was supported by the International Union of Pure and Applied Physics (IUPAP). The next three meetings in 1997, 1999 and 2001 were held in Lake Balatonfüred, Hungary, in Naples, Italy, and in Boston, USA, respectively. All of them were sponsored by IUPAP. The ninth workshop will take place in Besançon, France, in 2005. The conference has now become one of the major international meetings on quantum optics and the foundations of quantum mechanics, where most of the active research groups throughout the world present their new results. Accordingly, this conference has been able to align itself to the current trend in quantum optics and quantum mechanics. The Puebla meeting covered most extensively the following areas: quantum measurements, quantum computing and information theory, trapped atoms and degenerate gases, and the generation and characterization of quantum states of light. The meeting also covered squeeze-like transformations in areas other than quantum optics, such as atomic physics, nuclear physics, statistical physics and relativity, as well as optical devices. There were many new participants at this meeting, particularly from Latin American countries including, of course, Mexico. There were many talks on the subjects traditionally covered in this conference series, including quantum fluctuations, different forms of squeezing, various kinds of nonclassical states of light, and distinct representations of the quantum superposition principle, such as even and odd coherent states. The entanglement phenomenon, frequently in the form of the EPR paradox, is responsible for the main advantages of quantum engineering compared with classical methods. Even though entanglement has been known since the early days of quantum mechanics, its properties, such as the most appropriate entanglement measures, are still under current investigation. The phenomena of dissipation and decoherence of the initial pure states are very important because fast decoherence can destroy all the advantages of quantum processes in teleportation, quantum computing and image processing. Due to this, methods of controlling the decoherence, such as by the use of different kinds of nonlinearities and deformations, are also under study. From the very beginning of quantum mechanics, the uncertainty relations were basic inequalities distinguishing the classical and quantum worlds. Among the theoretical methods for quantum optics and quantum mechanics, this conference covered phase space and group representations, such as the Wigner and probability distribution functions, which provide an alternative approach to the Schrödinger or Heisenberg picture. 
Different forms of probability representations of quantum states are important tools to be applied in studying various quantum phenomena, such as quantum interference, decoherence and quantum tomography. They have been established also as a very useful tool in all branches of classical optics. From the mathematical point of view, it is well known that the coherent and squeezed states are representations of the Lorentz group. It was noted throughout the conference that another form of the Lorentz group, namely, the 2 x 2 representation of the SL(2,c) group, is becoming

Jauregue-Renaud, Rocio; Kim, Young S.; Man'ko, Margarita A.; Moya-Cessa, Hector

2004-06-01

389

Uncertainty analysis for geologic disposal of radioactive waste  

NASA Astrophysics Data System (ADS)

The incorporation and representation of uncertainty in the analysis of the consequences and risk associated with the geologic disposal of high-level radioactive waste are discussed. Such uncertainty has three primary components: process modeling uncertainty, model input data uncertainty, and scenario uncertainty. The following topics are considered in connection with the preceding components: propagation of uncertainty in the modeling of a disposal site, sampling of input data for models, and uncertainty associated with model output.

Cranwell, R. N.; Helton, J. C.

390

Ideas underlying quantification of margins and uncertainties (QMU): a white paper.  

SciTech Connect

This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

2006-09-01

391

Uncertainty estimation for Bayesian reconstructions from low-count spect data  

SciTech Connect

Bayesian analysis is especially useful to apply to low-count medical imaging data, such as gated cardiac SPECT, because it allows one to solve the nonlinear, ill-posed, inverse problems associated with such data. One advantage of the Bayesian approach is that it quantifies the uncertainty in estimated parameters through the posterior probability. We compare various approaches to exploring the uncertainty in Bayesian reconstructions from SPECT data including: the standard estimation of the covariance of an estimator using a frequentist approach; a new technique called the "hard truth" in which one applies "forces" to the parameters and observes their displacements; and Markov-chain Monte Carlo sampling of the posterior probability distribution, which in principle provides a complete uncertainty characterization.

Cunningham, G.S.; Hanson, K.M.

1996-12-31
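
The Markov-chain Monte Carlo exploration of a posterior mentioned above can be illustrated with a one-parameter Poisson example, a stand-in for the SPECT reconstruction problem rather than the authors' model; the counts, prior and proposal width are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Metropolis sampling of the posterior for a single emission-rate parameter
# given low Poisson counts (a toy stand-in, not the SPECT model itself).
counts = np.array([3, 1, 4, 2, 0, 3])          # invented low-count data

def log_post(lam):
    if lam <= 0:
        return -np.inf
    # Poisson likelihood times exponential(mean 10) prior, up to a constant
    return np.sum(counts * np.log(lam) - lam) - lam / 10.0

lam, samples = 2.0, []
for _ in range(20_000):
    prop = lam + rng.normal(0, 0.5)            # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop                             # accept
    samples.append(lam)
samples = np.array(samples[5_000:])            # discard burn-in
print(f"posterior mean {samples.mean():.2f}, 95% interval "
      f"[{np.quantile(samples, 0.025):.2f}, {np.quantile(samples, 0.975):.2f}]")
```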

392

Deterministic sensitivity and uncertainty analysis for large-scale computer models  

SciTech Connect

This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab.

Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

1988-01-01
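
A sketch of derivative-based (deterministic) uncertainty propagation in the spirit of the abstract: first-order sensitivities of a result are combined with parameter variances. Finite differences stand in for the GRESS/ADGEN computer-calculus derivatives, and the toy model and numbers are assumptions, not the PRESTO-II model.

```python
import numpy as np

# Deterministic uncertainty propagation: first-order sensitivities d(result)/d(p_i)
# (here by central finite differences, standing in for the derivative systems
# mentioned above) combined with parameter variances.
def model(p):
    k1, k2, q = p                       # toy response; not the PRESTO-II model
    return q * k1 / (k1 + k2)

p0    = np.array([0.30, 0.10, 5.0])     # nominal parameter values (illustrative)
sigma = np.array([0.03, 0.02, 0.5])     # parameter standard deviations

grad = np.empty_like(p0)
for i in range(p0.size):
    h = 1e-6 * max(abs(p0[i]), 1.0)
    dp = np.zeros_like(p0); dp[i] = h
    grad[i] = (model(p0 + dp) - model(p0 - dp)) / (2.0 * h)

var_y = np.sum((grad * sigma) ** 2)     # independent parameters assumed
print(f"result {model(p0):.3f} +/- {np.sqrt(var_y):.3f} (1 sigma)")
```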

393

Uncertainty quantification and granular thermodynamics  

NASA Astrophysics Data System (ADS)

To objectively assess a DEM model for granular flow, the DEM model must be able to randomly sample possible flow histories. Objective assessment is then based on comparing a sample of flows from the model with a sample of experimental flows by means of statistical inference. The practical implementation of this method of model assessment yields a model for the relationships among small collections of macroscopic flow variables. This model of relationships is analogous to models from classical thermodynamics, but is a fully stochastic summarization of bulk flow properties for a physical system that is not at equilibrium. It strongly distinguishes the microscale disorder within the flow from the between-experiment unpredictability of the bulk properties of the flow. It clearly identifies the sources of bulk property uncertainties, and allows ignorance of the detailed dynamics of the flow to be incorporated into the modelling process. This approach to constructing granular thermodynamic models will be compared with approaches to model development arising from modern statistical physics and with the development of the first thermodynamic models.

Picka, Jeffrey D.

2013-06-01

394

Optic eikonal, Fermat's principle and the least action principle  

NASA Astrophysics Data System (ADS)

A generalized refractive index in the form of optic eikonal is defined through comparing frame definitions of left-handed and right-handed sets and indicates the sign of the refractive index covered by the quadratic form of the eikonal equation. Fermat’s principle is generalized, and the general refractive law is derived directly. Under this definition, the comparison between Fermat’s principle and the least action principle is made through employing path integral and analogizing L. de Broglie’s theory.

Tan, Kangbo; Liang, Changhong; Shi, Xiaowei

2008-12-01

395

Optimality principles for the visual code  

NASA Astrophysics Data System (ADS)

One way to try to make sense of the complexities of our visual system is to hypothesize that evolution has developed nearly optimal solutions to the problems organisms face in the environment. In this thesis, we study two such principles of optimality for the visual code. In the first half of this dissertation, we consider the principle of decorrelation. Influential theories assert that the center-surround receptive fields of retinal neurons remove spatial correlations present in the visual world. It has been proposed that this decorrelation serves to maximize information transmission to the brain by avoiding transfer of redundant information through optic nerve fibers of limited capacity. While these theories successfully account for several aspects of visual perception, the notion that the outputs of the retina are less correlated than its inputs has never been directly tested at the site of the putative information bottleneck, the optic nerve. We presented visual stimuli with naturalistic image correlations to the salamander retina while recording responses of many retinal ganglion cells using a microelectrode array. The output signals of ganglion cells are indeed decorrelated compared to the visual input, but the receptive fields are only partly responsible. Much of the decorrelation is due to the nonlinear processing by neurons rather than the linear receptive fields. This form of decorrelation dramatically limits information transmission. Instead of improving coding efficiency we show that the nonlinearity is well suited to enable a combinatorial code or to signal robust stimulus features. In the second half of this dissertation, we develop an ideal observer model for the task of discriminating between two small stimuli which move along an unknown retinal trajectory induced by fixational eye movements. The ideal observer is provided with the responses of a model retina and guesses the stimulus identity based on the maximum likelihood rule, which involves sums over all random walk trajectories. These sums can be implemented in a biologically plausible way. The necessary ingredients are: neurons modeled as a cascade of a linear filter followed by a static nonlinearity, a recurrent network with additive and multiplicative interactions between neurons, and divisive global inhibition. This architecture implements Bayesian inference by representing likelihoods as neural activity which can then diffuse through the recurrent network and modulate the influence of later information. We also develop approximation methods for characterizing the performance of the ideal observer. We find that the effect of positional uncertainty is essentially to slow the acquisition of signal. The time scaling is related to the size of the uncertainty region, which is in turn related to both the signal strength and the statistics of the fixational eye movements. These results imply that localization cues should determine the slope of the performance curve in time.

Pitkow, Xaq

396

Dealing with uncertainties - communication between disciplines  

NASA Astrophysics Data System (ADS)

Climate adaptation research inevitably involves uncertainty issues - whether people are building a model, using climate scenarios, or evaluating policy processes. However, do they know which uncertainties are relevant in their field of work? And which uncertainties exist in the data from other disciplines that they use (e.g. climate data, land use, hydrological data), and how do they propagate? From experiences in Dutch research programmes on climate change we know that disciplines often deal differently with uncertainties. This complicates communication between disciplines and also with the various users of data and information on climate change and its impacts. In October 2012 an autumn school was organized within the Knowledge for Climate Research Programme in the Netherlands, with dealing with and communicating about uncertainties as its central theme: in climate and socio-economic scenarios, in impact models and in the decision-making process. The lectures and discussions contributed to the development of a common frame of reference (CFR) for dealing with uncertainties. The common frame contains the following: 1. Common definitions (typology of uncertainties, robustness); 2. Common understanding (why do we consider it important to take uncertainties into account) and aspects on which we disagree (how far should scientists go in communication?); 3. Documents that are considered important by all participants; 4. Do's and don'ts in dealing with uncertainties and communicating about uncertainties (e.g. know your audience, check how your figures are interpreted); 5. Recommendations for further actions (e.g. need for a platform to exchange experiences). The CFR is meant to help researchers in climate adaptation to work together and communicate jointly about climate change (better interaction between disciplines). It is also meant to help researchers explain to others (e.g. decision makers) why and when researchers agree, when and why they disagree, and on what exactly. The presentation will cover some results of this autumn school.

Overbeek, Bernadet; Bessembinder, Janette

2013-04-01

397

Principles of antimicrobial therapy.  

PubMed

We have discussed important factors involved in choosing appropriate antimicrobial regimens for the treatment of bacterial meningitis and brain abscess to illustrate common themes relevant to the treatment of these diseases. We have limited this review to these conditions for two main reasons: (1) the principles involved in optimal antimicrobial therapy for these diseases likely apply to other CNS infections, such as viral and fungal diseases; and (2) little pharmacological information is currently available for other types of CNS infections. Many of the studies addressing the relevant pharmacological and microbiological aspects of antimicrobial therapy for CNS infections have been performed in experimental animal models and, as a result, the information derived from these studies may be different when examined in appropriate human studies. Our current understanding of appropriate antimicrobial therapy for CNS infections may be summarized as follows: 1. Choose bactericidal antimicrobials that effectively cross the BBB to achieve CSF concentrations well above the MBC (≥10-fold) for the suspected bacterial pathogen(s). 2. Take into consideration the relevant PD parameters governing the bactericidal activity of the antimicrobials used to treat bacterial meningitis, such as t > MBC or AUC/MBC. 3. Tailor the antimicrobial regimen based on microbiological information, once available. However, with respect to brain abscess therapy, keep in mind that anaerobes are commonly involved, but difficult to culture, and consider including antianaerobic therapy even if the bacterial cultures do not grow anaerobes. 4. Treat bacterial meningitis caused by nonmeningococcal pathogens for 7-10 days, but monitor clinical progress to determine whether the patient should continue on a more prolonged antimicrobial course. Meningococcal meningitis may be treated with 3-4 days of effective antimicrobial therapy, again with the caveat that the patient's clinical course should dictate the duration of therapy. 5. Treat brain abscess, preferably after aspiration/drainage, for at least 6 weeks with intravenous antimicrobials, basing the decision to end therapy on the clinical response (e.g., improved symptoms, lack of new neurological findings) and radiographic changes (e.g., reduction in cavity size). PMID:20109672

Tessier, Jeffrey M; Scheld, W Michael

2010-01-19

398

Using Interpolation to Estimate System Uncertainty in Gene Expression Experiments  

PubMed Central

The widespread use of high-throughput experimental assays designed to measure the entire complement of a cell's genes or gene products has led to vast stores of data that are extremely plentiful in terms of the number of items they can measure in a single sample, yet often sparse in the number of samples per experiment due to their high cost. This often leads to datasets where the number of treatment levels or time points sampled is limited, or where there are very small numbers of technical and/or biological replicates. Here we introduce a novel algorithm to quantify the uncertainty in the unmeasured intervals between biological measurements taken across a set of quantitative treatments. The algorithm provides a probabilistic distribution of possible gene expression values within unmeasured intervals, based on a plausible biological constraint. We show how quantification of this uncertainty can be used to guide researchers in further data collection by identifying which samples would likely add the most information to the system under study. Although the context for developing the algorithm was gene expression measurements taken over a time series, the approach can be readily applied to any set of quantitative systems biology measurements taken following quantitative (i.e. non-categorical) treatments. In principle, the method could also be applied to combinations of treatments, in which case it could greatly simplify the task of exploring the large combinatorial space of future possible measurements.

Falin, Lee J.; Tyler, Brett M.

2011-01-01

399

The legal status of Uncertainty  

NASA Astrophysics Data System (ADS)

An exponential improvement of numerical weather prediction (NWP) models was observed during the last decade (Lynch, 2008). Civil Protection (CP) systems have exploited meteorological services in order to redeploy their actions towards the prediction and prevention of events rather than towards an exclusively response-oriented mechanism1. Nevertheless, experience tells us that NWP models, even if assisted by real-time observations, are far from being deterministic. Complications frequently emerge in medium- to long-range forecasts, which are subject to sudden modifications. On the other hand, short-term forecasts, if seen through the lens of criminal trials2, are likewise scarcely reliable (Molini et al., 2009). One particular episode related to a wrong forecast in the Italian panorama has deeply frightened CP operators: the NWP model in force missed a meteorological adversity which, in fact, caused deaths and dealt severe damage in the province of Vibo Valentia (2006). This event turned into a much-discussed trial, lasting over three years, brought against those who held the legal position of guardianship within the CP. A first set of data is now available showing that, in concomitance with the Vibo Valentia trial, the number of alerts issued rose almost threefold. We support the hypothesis that the incipient overcriminalization (Husak, 2008) of CPs is currently increasing the number of false alerts, with the consequent effect of weakening alert perception and response by the citizenry (Brezntiz, 1984). The common misunderstanding of this issue, i.e. the inherent uncertainty in weather predictions, mainly by prosecutors and judges, and more generally by those who deal with law and justice, is creating the basis for defensive behaviour3 within CPs. This paper thus intends to analyse the social and legal relevance of uncertainty in the process of issuing meteo-hydrological alerts by CPs. Footnotes: 1 The Italian Civil Protection has been working in this direction since 1992 (L. 225/92). An example of this effort is clearly given by the Prime Minister Decree (DPCM 20/12/2001 "Linee guida relative ai piani regionali per la programmazione delle attivita' di previsione, prevenzione e lotta attiva contro gli incendi boschivi - Guidelines for regional plans for the planning of prediction, prevention and forest fires fighting activities") that, already in 2001, emphasized that "the most appropriate approach to pursue the preservation of forests is to promote and encourage prediction and prevention activities rather than giving priority to the emergency-phase focused on fire-fighting". 2 Supreme Court of the United States, In re Winship (No. 778), argued: 20 January 1970, decided: 31 March 1970: proof beyond a reasonable doubt, which is required by the Due Process Clause in criminal trials, is among the "essentials of due process and fair treatment". 3 In Kessler and McClellan (1996): "Defensive medicine is a potentially serious social problem: if fear of liability drives health care providers to administer treatments that do not have worthwhile medical benefits, then the current liability system may generate inefficiencies much larger than the costs of compensating malpractice claimants".

Altamura, M.; Ferraris, L.; Miozzo, D.; Musso, L.; Siccardi, F.

2011-03-01

400

Uncertainty and global climate change research  

SciTech Connect

The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessments built on decision-analytic techniques are key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of the timeliness and relevance of developing and applying decision-analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

1994-06-01

401

Uncertainty quantification approaches for advanced reactor analyses.  

SciTech Connect

The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
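For orientation, the 95/95 requirement mentioned above is commonly met in practice with nonparametric (Wilks-type) tolerance limits; this is a standard result quoted here for context, not necessarily the methodology the report recommends. With coverage $\gamma$ and confidence $\beta$, using the largest of $n$ code runs as the bound,

$$1 - \gamma^{\,n} \ge \beta \quad\Longrightarrow\quad n \ge \frac{\ln(1-\beta)}{\ln\gamma},$$

so for $\gamma = \beta = 0.95$ the smallest number of runs is $n = 59$.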

Briggs, L. L.; Nuclear Engineering Division

2009-03-24

402

Climate change, uncertainty, and natural resource management  

USGS Publications Warehouse

Climate change and its associated uncertainties are of concern to natural resource managers. Although aspects of climate change may be novel (e.g., system change and nonstationarity), natural resource managers have long dealt with uncertainties and have developed corresponding approaches to decision-making. Adaptive resource management is an application of structured decision-making for recurrent decision problems with uncertainty, focusing on management objectives, and the reduction of uncertainty over time. We identified 4 types of uncertainty that characterize problems in natural resource management. We examined ways in which climate change is expected to exacerbate these uncertainties, as well as potential approaches to dealing with them. As a case study, we examined North American waterfowl harvest management and considered problems anticipated to result from climate change and potential solutions. Despite challenges expected to accompany the use of adaptive resource management to address problems associated with climate change, we conclude that adaptive resource management approaches will be the methods of choice for managers trying to deal with the uncertainties of climate change. © 2010 The Wildlife Society.

Nichols, J. D.; Koneff, M. D.; Heglund, P. J.; Knutson, M. G.; Seamans, M. E.; Lyons, J. E.; Morton, J. M.; Jones, M. T.; Boomer, G. S.; Williams, B. K.

2011-01-01

403

Design Principles for Learning Systems.  

National Technical Information Service (NTIS)

The paper reviews the major aspects of human learning, and discusses several learning mechanisms and some of the design principles for learning systems. A comparison is made between human learning and machine learning. Some aspects of human learning are c...

J. T. Tou

1965-01-01

404

Get Provoked: Applying Tilden's Principles.  

ERIC Educational Resources Information Center

This address given to the Division of Interpretation, Yellowstone National Park, Interpretive Training, June 1993, examines successes and failures in interpretive programs for adults and children in light of Tilden's principles. (LZ)

Shively, Carol A.

1995-01-01

405

Questionnaire Design Principles - Applied Research  

Cancer.gov

406

Statistical Uncertainty Analysis Applied to Criticality Calculation  

SciTech Connect

In this paper, we present an uncertainty methodology, based on a statistical approach, for assessing uncertainties in criticality predictions made with the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method based on the probability density function of each nuclide composition. The automatic passing of the stochastic input to MCNP and the repeated criticality calculations are made possible by a Python script that links MCNP and our Latin hypercube sampling code.
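A minimal sketch of the sampling step is shown below, using the Latin hypercube generator in SciPy rather than the authors' own sampling code; the nuclide list, means, and standard deviations are hypothetical, and writing the MCNP input deck and launching the run are only indicated by a commented placeholder.

import numpy as np
from scipy.stats import norm, qmc

# Hypothetical isotopic composition uncertainties (atom fractions): mean and 1-sigma.
nuclides = ["U235", "U238", "O16"]
mean = np.array([0.045, 0.955, 2.000])
std = np.array([0.0005, 0.0005, 0.0010])

sampler = qmc.LatinHypercube(d=len(nuclides), seed=42)
u = sampler.random(n=100)                        # 100 stratified samples in (0, 1)^3
compositions = norm.ppf(u, loc=mean, scale=std)  # map to the assumed normal marginals

for i, comp in enumerate(compositions):
    # Placeholder: write an MCNP input deck with this composition and queue the run.
    # write_mcnp_input(f"case_{i:03d}.inp", dict(zip(nuclides, comp)))
    print(i, dict(zip(nuclides, np.round(comp, 5))))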

Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W. [Centre for Nuclear Informatics Development, National Nuclear Energy Agency of Indonesia (Indonesia)

2010-06-22

407

Uncertainty on Multi-objective Optimization Problems  

NASA Astrophysics Data System (ADS)

In general, the parameters in multi-objective optimization are assumed to be deterministic, with no uncertainty. However, uncertainty in the parameters can affect both the variable and objective spaces. The corresponding Pareto optimal fronts, resulting from the disturbed problem, define a cloud of curves. In this work, the main objective is to study the resulting cloud of curves in order to identify regions of more robustness and, therefore, to assist the decision making process. Preliminary results, for a very limited set of problems, show that the resulting cloud of curves exhibits regions of less variation, which are, therefore, more robust to parameter uncertainty.

Costa, Lino; Santo, Isabel A. C. P. Espírito; Oliveira, Pedro

2011-09-01

408

Uncertainty Quantification on Prompt Fission Neutrons Spectra  

NASA Astrophysics Data System (ADS)

Uncertainties in the evaluated prompt fission neutron spectra present in ENDF/B-VII.0 are assessed in the framework of the Los Alamos model. The methodology used to quantify the uncertainties on an evaluated spectrum is introduced. We also briefly review the Los Alamos model and single out the parameters that have the largest influence on the calculated results. Using a Kalman filter, experimental data and uncertainties are introduced to constrain the model parameters and construct an evaluated covariance matrix for the prompt neutron spectrum. Preliminary results are shown in the case of neutron-induced fission of 235U from thermal up to 15 MeV incident energies.
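The Kalman-filter step mentioned here has the standard linear (generalized least squares) form below; the notation is generic and does not reproduce the specific Los Alamos model parametrization. With prior parameters $\theta_0$ and covariance $P_0$, data $y$ with covariance $V$, model prediction $f(\theta)$ and sensitivities $C = \partial f/\partial\theta$,

$$\theta_1 = \theta_0 + P_0 C^{T}\,(C P_0 C^{T} + V)^{-1}\bigl(y - f(\theta_0)\bigr), \qquad P_1 = P_0 - P_0 C^{T}\,(C P_0 C^{T} + V)^{-1} C P_0,$$

and an evaluated spectrum covariance then follows by propagating $P_1$ through the spectrum sensitivities, $\mathrm{Cov}(\chi) \approx S\,P_1\,S^{T}$ with $S = \partial\chi/\partial\theta$.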

Talou, P.; Madland, D. G.; Kawano, T.

2008-12-01

409

Two basic Uncertainty Relations in Quantum Mechanics  

SciTech Connect

In the present article, we discuss two types of uncertainty relations in Quantum Mechanics: multiplicative and additive inequalities for two canonical observables. The multiplicative uncertainty relation was discovered by Heisenberg. A few years later (1930), Erwin Schroedinger generalized it and made it more precise than the original. The additive uncertainty relation is based on the three independent statistical moments in Quantum Mechanics: Cov(q,p), Var(q) and Var(p). We discuss the existing symmetry of both types of relations and the applicability of the additive form for estimating the total error.
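For reference, the two multiplicative relations referred to above are the standard results

$$\mathrm{Var}(q)\,\mathrm{Var}(p) \ge \frac{\hbar^{2}}{4} \quad\text{(Heisenberg)}, \qquad \mathrm{Var}(q)\,\mathrm{Var}(p) - \mathrm{Cov}^{2}(q,p) \ge \frac{\hbar^{2}}{4} \quad\text{(Schroedinger)},$$

with $\mathrm{Cov}(q,p) = \tfrac{1}{2}\langle \hat q\hat p + \hat p\hat q\rangle - \langle\hat q\rangle\langle\hat p\rangle$; the additive relation built from Var(q), Var(p) and Cov(q,p) that the author analyzes is not reproduced here.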

Angelow, Andrey [Institute of Solid State Physics, Bulgarian Academy of Sciences, 72 Tzarigradsko chaussee, 1784 Sofia (Bulgaria)

2011-04-07

410

Principles of Scanning Probe Microscopy  

NSDL National Science Digital Library

This site offers a beautifully illustrated introduction to the principles of scanning probe microscopy. The text is interspersed with links to additional information, much of it from the Interface Physics Group at Leiden University. There are several animations included and links to visual galleries which illustrate both principles and utilization of scanning probe microscopy. An additional "links" page takes the user to sites of research groups involved in ongoing developmental work in surface science.

Frenken, Joost

2011-04-07

411

Equivalence Principle and Gravitational Redshift  

SciTech Connect

We investigate leading order deviations from general relativity that violate the Einstein equivalence principle in the gravitational standard model extension. We show that redshift experiments based on matter waves and clock comparisons are equivalent to one another. Consideration of torsion balance tests, along with matter-wave, microwave, optical, and Moessbauer clock tests, yields comprehensive limits on spin-independent Einstein equivalence principle-violating standard model extension terms at the 10⁻⁶ level.

Hohensee, Michael A.; Chu, Steven; Mueller, Holger [Department of Physics, University of California, Berkeley, California 94720 (United States); Peters, Achim [Institut fuer Physik, Humboldt-Universitaet zu Berlin, Newtonstrasse 15, 12489 Berlin (Germany)

2011-04-15

412

Experimental Basis for Robust On-orbit Uncertainty Estimates for CLARREO InfraRed Sensors  

NASA Astrophysics Data System (ADS)

As defined by the National Research Council Decadal Survey of 2006, the CLimate Absolute Radiance and REfractivity Observatory (CLARREO) satisfies the need for “a long-term global benchmark record of critical climate variables that are accurate over very long time periods, can be tested for systematic errors by future generations, are unaffected by interruption, and are pinned to international standards.” These observational requirements— testing for systematic errors, accuracy over indefinite time, and linkage to internationally recognized measurement standards—are achievable through an appeal to the concept of SI traceability. That is, measurements are made such that they are linked through an unbroken chain of comparisons, where each comparison has a stated and credible uncertainty, back to the definitions of the International System (SI) Units. While the concept of SI traceability is a straightforward one, achieving credible estimates of uncertainty, particularly in the case of complex sensors deployed in orbit, poses a significant challenge. Recently, a set of principles has been proposed to guide the development of sensors that realize fully the benefits of SI traceability. The application of these principles to the spectral infrared sensor that is part of the CLARREO mission is discussed. These principles include, but are not limited to: basing the sensor calibration on a reproducible physical property of matter, devising experimental tests for known sources of measurement bias (or systematic uncertainty), and providing independent system-level checks for the end-to-end radiometric performance of the sensor. The application of these principles to the infrared sensor leads to the following conclusions. To obtain the lowest uncertainty (or highest accuracy), the calibration should be traceable to the definition of the Kelvin—that is, the triple point of water. Realization of a Kelvin-based calibration is achieved through the use of calibration blackbodies. It is necessary to implement experimental tests for changes in the optical and thermodynamic properties of the blackbodies, in addition to implementing tests for radiometric uncertainties arising from polarization, stray light, and detector chain nonlinearities, and for sensitivity to influencing parameters from the local sensor environment and the target radiation. The implication of these conclusions is that a multi-institutional effort to design and test the sensor is necessary to achieve the transparency required to bolster the credibility of the observational results and their associated uncertainties. Practical options for pursuing this effort will be explored.

Dykema, J. A.; Revercomb, H. E.; Anderson, J.

2009-12-01

413

Developmental principles: fact or fiction.  

PubMed

While still at school, most of us are deeply impressed by the underlying principles that so beautifully explain why the chemical elements are ordered as they are in the periodic table, and may wonder, with the theoretician Brian Goodwin, "whether there might be equally powerful principles that account for the awe-inspiring diversity of body forms in the living realm". We have considered the arguments for developmental principles, conclude that they do exist and have specifically identified features that may generate principles associated with Hox patterning of the main body axis in bilaterian metazoa in general and in the vertebrates in particular. We wonder whether this exercise serves any purpose. The features we discuss were already known to us as parts of developmental mechanisms and defining developmental principles (how, and at which level?) adds no insight. We also see little profit in the proposal by Goodwin that there are principles outside the emerging genetic mechanisms that need to be taken into account. The emerging developmental genetic hierarchies already reveal a wealth of interesting phenomena, whatever we choose to call them. PMID:22489210

Durston, A J

2012-02-15

414

32 CFR 203.6 - Cost principles.  

Code of Federal Regulations, 2013 CFR

... 2013-07-01 false Cost principles. 203.6 ...PARTICIPATION (TAPP) IN DEFENSE ENVIRONMENTAL RESTORATION ACTIVITIES § 203.6 Cost principles. (a) Non-profit...contractors must comply with the cost principles in OMB...

2013-07-01

415

Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty  

SciTech Connect

Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the author has developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenney, PA and commercial industrial sites.
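The core update, a spatial-continuity prior conditioned on sparse point data to give a reconstruction with quantified uncertainty, can be illustrated with a toy one-dimensional example; the exponential covariance model and all numbers below are invented stand-ins and are not the patented Data Fusion Workstation formulation.

import numpy as np

# Sketch of the Bayesian-update idea: a spatial-continuity prior (a simple exponential
# covariance standing in for the Markov random field) is conditioned on two hydraulic
# conductivity measurements, shrinking the uncertainty near the data points.
x = np.linspace(0.0, 100.0, 101)                 # 1-D transect, metres
prior_mean = np.full_like(x, -4.0)               # prior mean of log10 K (hypothetical)
corr_len, prior_sd, noise_sd = 20.0, 0.8, 0.1

def cov(a, b):
    return prior_sd**2 * np.exp(-np.abs(a[:, None] - b[None, :]) / corr_len)

x_obs = np.array([20.0, 65.0])
y_obs = np.array([-3.2, -4.6])                   # measured log10 K (hypothetical)

K_oo = cov(x_obs, x_obs) + noise_sd**2 * np.eye(len(x_obs))
K_xo = cov(x, x_obs)
w = np.linalg.solve(K_oo, y_obs - np.interp(x_obs, x, prior_mean))
post_mean = prior_mean + K_xo @ w                                   # reconstruction
post_var = prior_sd**2 - np.einsum("ij,ji->i", K_xo, np.linalg.solve(K_oo, K_xo.T))
print(post_mean[::20], np.sqrt(post_var)[::20])  # uncertainty shrinks near the data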

Porter, D.W.

1996-04-01

416

Quantifying Uncertainty in Scenarios for Integrated Assessment of Climate Change.  

National Technical Information Service (NTIS)

The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore see...

2006-01-01

417

Representation of Analysis Results Involving Aleatory and Epistemic Uncertainty.  

National Technical Information Service (NTIS)

Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic...

C. J. Sallaberry J. C. Helton J. D. Johnson W. L. Oberkampf

2008-01-01

418

Uncertainty Modeling via Frequency Domain Model Validation.  

National Technical Information Service (NTIS)

The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much att...

M. R. Waszak D. Andrisani

1999-01-01

419

Information-theoretic approach to uncertainty importance  

SciTech Connect

A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory in which entropy is a measure of uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure is comprised of the median (central tendency) and of the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results.
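To see how the measure combines the median and the error factor, take a log-normal distribution with median $m$ and log-standard deviation $\sigma$; its differential entropy is the standard result

$$h = \ln m + \tfrac{1}{2}\ln\bigl(2\pi e\,\sigma^{2}\bigr), \qquad e^{h} = m\,\sigma\sqrt{2\pi e},$$

and with the usual PRA convention $\mathrm{EF} = x_{95}/x_{50} = e^{1.645\,\sigma}$ (an assumption on my part), the relative importance of two events becomes $e^{h_1}/e^{h_2} = (m_1 \ln \mathrm{EF}_1)/(m_2 \ln \mathrm{EF}_2)$, i.e. a combination of the central tendency and the logarithm of the error factor, as stated above.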

Park, C.K.; Bari, R.A.

1985-01-01

420

Uncertainties in the North Korean Nuclear Threat.  

National Technical Information Service (NTIS)

North Korea is a master at denying the United States information on its sensitive military capabilities. The resulting lack of information on the North Korean nuclear weapon threat makes that highly uncertain. This briefing addresses those uncertainties, ...

B. W. Bennett

2010-01-01

421

Uncertainty Assessment in Life Cycle Cost Analysis.  

National Technical Information Service (NTIS)

Several of the leading probabilistic methods for measuring uncertainties in construction project life cycle costs were studied to evaluate their applicability for military construction projects. The confidence index and statistical testing approaches are ...

1985-01-01

422

MOS UNCERTAINTY ESTIMATES IN AN ENSEMBLE FRAMEWORK  

Microsoft Academic Search

It is being increasingly recognized that the uncertainty in weather forecasts should be quantified and furnished to users along with the single-value forecasts usually provided. Probabilistic forecasts of ...

Bob Glahn; Matthew Peroutka; Jerry Wiedenfeld; John Wagner; Bryan Jackson; BRYAN SCHUKNECHT

2008-01-01

423

Impact of Uncertainty on Terror Forecasting.  

National Technical Information Service (NTIS)

Intelligence analysts and military planners need accurate forecasting techniques for predicting future terror events. Terror forecasts must consider historical events, up-to-date geospatial features, terrorist behavior, and uncertainty and error in the in...

G. S. Schmidt J. Goffeney R. Willis

2007-01-01

424

QUANTIFICATION OF UNCERTAINTY IN COMPUTATIONAL FLUID DYNAMICS  

Microsoft Academic Search

This review covers Verification, Validation, Confirmation and related subjects for computational fluid dynamics (CFD), including error taxonomies, error estimation and banding, convergence rates, surrogate estimators, nonlinear dynamics, and error estimation for grid adaptation vs Quantification of Uncertainty.

P. J. Roache

1997-01-01

425

Quantifying reliability uncertainty : a proof of concept.  

SciTech Connect

This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
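As a flavor of the Bayesian side of such an analysis, restricted to go/no-go data only, the following minimal sketch propagates component posteriors to a system posterior by Monte Carlo; the component names, test counts, and series/parallel structure are invented, and the paper's actual methods also incorporate variables data.

import numpy as np

rng = np.random.default_rng(1)
# go/no-go test data: (successes, trials) for each component (hypothetical numbers)
data = {"c1": (29, 30), "c2": (48, 50), "c3": (19, 20)}
a0, b0 = 1.0, 1.0          # uniform Beta prior on each component reliability
n_draws = 100_000
post = {name: rng.beta(a0 + s, b0 + (n - s), n_draws) for name, (s, n) in data.items()}

# Assumed structure for illustration: c1 in series with the parallel pair (c2, c3).
parallel = 1.0 - (1.0 - post["c2"]) * (1.0 - post["c3"])
system = post["c1"] * parallel

lo, med, hi = np.percentile(system, [5, 50, 95])
print(f"system reliability: median {med:.3f}, 90% interval ({lo:.3f}, {hi:.3f})")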

Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F.; Fatherley, Quinn (Los Alamos National Laboratory, Los Alamos, NM); Anderson-Cook, Christine (Los Alamos National Laboratory, Los Alamos, NM); Wilson, Alyson G. (Los Alamos National Laboratory, Los Alamos, NM); Zurn, Rena M.

2009-10-01

426

Utilizing general information theories for uncertainty quantification  

SciTech Connect

Uncertainties enter into a complex problem from many sources: variability, errors, and lack of knowledge. A fundamental question arises in how to characterize the various kinds of uncertainty and then combine them within a problem such as the verification and validation of a structural dynamics computer model, reliability of a dynamic system, or a complex decision problem. Because uncertainties are of different types (e.g., random noise, numerical error, vagueness of classification), it is difficult to quantify all of them within the constructs of a single mathematical theory, such as probability theory. Because different kinds of uncertainty occur within a complex modeling problem, linkages between these mathematical theories are necessary. A brief overview of some of these theories and their constituents under the label of Generalized Information Theory (GIT) is presented, and a brief decision example illustrates the importance of linking at least two such theories.

Booker, J. M. (Jane M.)

2002-01-01

427

Performance and robustness analysis for structured uncertainty  

Microsoft Academic Search

This paper introduces a nonconservative measure of performance for linear feedback systems in the face of structured uncertainty. This measure is based on a new matrix function, which we call the Structured Singular Value.

John C. Doyle; Joseph E. Wall; Gunter Stein

1982-01-01

428

Uncertainty Quantification Techniques of SCALE/TSUNAMI  

SciTech Connect

The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for gaps in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
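The propagation step described above is the familiar first-order "sandwich" rule, quoted here in standard notation for orientation:

$$\left(\frac{\Delta k}{k}\right)^{2} \approx S\,C_{\alpha\alpha}\,S^{T}, \qquad S_{i} = \frac{\alpha_{i}}{k}\,\frac{\partial k}{\partial \alpha_{i}},$$

where $S$ collects the relative sensitivity coefficients of the response $k$ to the cross sections $\alpha_i$ and $C_{\alpha\alpha}$ is the relative cross-section covariance matrix.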

Rearden, Bradley T [ORNL; Mueller, Don [ORNL

2011-01-01

429

An integrated approach for addressing uncertainty in the delineation of groundwater management areas.  

PubMed

Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible. PMID:23507137
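A minimal sketch of the merging idea is shown below; the grid values are random stand-ins for scenario capture-probability plumes, and the per-cell maximum is one plausible reading of the precautionary merge (the paper's exact rule may differ).

import numpy as np

# Hypothetical capture-probability plumes for three recharge scenarios on the same grid,
# each cell holding P(capture) in [0, 1]. The precautionary merge here keeps the highest
# capture probability found in any scenario.
rng = np.random.default_rng(3)
plumes = [rng.random((4, 5)) for _ in range(3)]   # stand-ins for scenario model output

merged = np.maximum.reduce(plumes)                # precautionary (worst-case) plume
protection_area = merged >= 0.5                   # e.g. delineate where P(capture) >= 50%
print(protection_area)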

Sousa, Marcelo R; Frind, Emil O; Rudolph, David L

2013-02-16

430

The ends of uncertainty: Air quality science and planning in Central California  

SciTech Connect

Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990's. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.

Fine, James

2003-09-01

431

Uncertainty Regarding Waste Handling in Everyday Life  

Microsoft Academic Search

According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories—people easily discriminate between certain categories (e.g., materials such as plastic and paper) but not between others (e.g., packaging and “non-packaging”). Thus a frequent cause

Greger Henriksson; Lynn Åkesson; Susanne Ewert

2010-01-01

432

Visualisation of Information Uncertainty: Progress and Challenges  

Microsoft Academic Search

Information uncertainty which is inherent in many real world applications brings more complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2)

Binh Pham; Alex Streit; Ross Brown

2009-01-01

433

How LCA studies deal with uncertainty  

Microsoft Academic Search

In recent years many workers have examined the implications of various sources of uncertainty for the reliability of Life Cycle Assessment (LCA). Indeed, the International Standardization Organization (ISO) has recognised the relevance of this work by including several cautionary statements in the ISO 14040 series of standards. However, in practice, there is a risk that the significance of these uncertainties

Stuart Ross; David Evans; Michael Webber

2002-01-01

434

The illness uncertainty concept: A review  

Microsoft Academic Search

Illness uncertainty is present for both acute and chronic illnesses and has been described in the literature as a cognitive stressor, a sense of loss of control, and a perceptual state of doubt that changes over time. Illness uncertainty is associated with poor adjustment, but often needs to be appraised as a threat to have its deleterious effect. In pain

Lisa Johnson Wright; Niloofar Afari; Alex Zautra

2009-01-01

435

Reducing Uncertainty With Seismic Measurements While Drilling  

Microsoft Academic Search

This paper discusses both seismic checkshot data inversion and seismic waveform look-ahead imaging while drilling. We investigate the estimation under real-time data uncertainties of interval velocity profiles calculated from checkshot measurements acquired while drilling. It is found that real-time checkshots may suffer from downhole time-picking errors in addition to an unpredictable clock drift uncertainty. We developed a method to account

Hugues Djikpesse; Phil Armstrong; Rogelio Rufino; Andy Hawthorn

2010-01-01

436

Propagating nu -Interaction Uncertainties via Event Reweighting  

NASA Astrophysics Data System (ADS)

We present an event reweighting scheme for propagating neutrino cross-section and intranuclear hadron transport model uncertainties, which has been developed for the GENIE-based (C. Andreopoulos et al., arXiv:0905.2517 [hep-ph]) neutrino physics simulations. We discuss the motivations, implementation and validation of the scheme and show an example application where it is used to evaluate the associated systematic uncertainties for neutral current π0 production.
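The reweighting idea can be illustrated with a toy example: events generated under a nominal cross-section parameter are reused to predict the result for a shifted parameter by attaching per-event weights w = σ(E; θ')/σ(E; θ). The cross-section shape and parameter shift below are invented and are not GENIE's actual models.

import numpy as np

rng = np.random.default_rng(7)
energies = rng.uniform(0.5, 5.0, 10_000)              # GeV, nominal event sample

def sigma(E, norm=1.0, slope=0.2):
    return norm * (1.0 + slope * E)                   # hypothetical cross-section shape

for shift, label in [(+0.05, "+1 sigma"), (-0.05, "-1 sigma")]:
    # Per-event weight: shifted-parameter cross section over nominal cross section.
    w = sigma(energies, slope=0.2 + shift) / sigma(energies, slope=0.2)
    nominal, _ = np.histogram(energies, bins=10, range=(0.5, 5.0))
    shifted, _ = np.histogram(energies, bins=10, range=(0.5, 5.0), weights=w)
    band = (shifted - nominal) / nominal              # fractional systematic variation
    print(label, np.round(band, 3))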

Dobson, J.; Andreopoulos, C.

2009-09-01

437

Uncertainties of Nutrigenomics and Their Ethical Meaning  

Microsoft Academic Search

Again and again utopian hopes are connected with the life sciences (no hunger, health for everyone; life without diseases, longevity), but simultaneously serious research shows uncertain, incoherent, and ambivalent results. It is unrealistic to expect that these uncertainties will disappear. We start by providing a not exhaustive list of five different types of uncertainties end-users of nutrigenomics have to cope

Michiel Korthals; Rixt Komduur

2010-01-01

438

Model uncertainty, performance persistence and flows  

Microsoft Academic Search

Model uncertainty makes it difficult to draw clear inference about mutual fund performance persistence. I propose a new performance measure, Bayesian model averaged (BMA) alpha, which explicitly accounts for model uncertainty. Using BMA alphas, I find evidence of performance persistence in a large sample of US funds. There is a positive and asymmetric relation between flows and past BMA alphas,

Yee Cheng Loon

2011-01-01

439

Uncertainty of Pyrometers in a Casting Facility  

SciTech Connect

This work has established uncertainty limits for the EUO filament pyrometers, digital pyrometers, two-color automatic pyrometers, and the standards used to certify these instruments (Table 1). If symmetrical limits are used, filament pyrometers calibrated in Production have certification uncertainties of not more than ±20.5 °C traceable to NIST over the certification period. Uncertainties of these pyrometers were roughly ±14.7 °C before introduction of the working standard that allowed certification in the field. Digital pyrometers addressed in this report have symmetrical uncertainties of not more than ±12.7 °C or ±18.1 °C when certified on a Y-12 Standards Laboratory strip lamp or in a production area tube furnace, respectively. Uncertainty estimates for automatic two-color pyrometers certified in Production are ±16.7 °C. Additional uncertainty and bias are introduced when measuring production melt temperatures. A -19.4 °C bias was measured in a large 1987 data set, which is believed to be caused primarily by use of Pyrex™ windows (not present in the current configuration) and window fogging. Large variability (2σ = 28.6 °C) exists in the first 10 min of the hold period. This variability is attributed to emissivity variation across the melt and reflection from hot surfaces. For runs with hold periods extending to 20 min, the uncertainty approaches the calibration uncertainty of the pyrometers. When certifying pyrometers on a strip lamp at the Y-12 Standards Laboratory, it is important to limit ambient temperature variation (23±4 °C), to order calibration points from high to low temperatures, to allow 6 min for the lamp to reach thermal equilibrium (12 min for certifications below 1200 °C) to minimize pyrometer bias, and to calibrate the pyrometer if the error exceeds vendor specifications. A procedure has been written to assure conformance.

Mee, D.K.; Elkins, J.E.; Fleenor, R.M.; Morrision, J.M.; Sherrill, M.W.; Seiber, L.E.

2001-12-07

440

Road map for measurement uncertainty evaluation  

Microsoft Academic Search

Various methods and tools are now available to evaluate measurement uncertainty. These new methods comply with the concepts and recommendations of the Guide to the expression of uncertainty in measurement (GUM). In this paper, the authors introduce several alternatives for laboratories, notably those based on intra-laboratory and interlaboratory approaches. The intra-laboratory approaches will include the “modelling approach” (application of the procedure

Michèle Désenfant; Marc Priel

2006-01-01

441

Uncertainty evaluation in algorithms with conditional statement  

Microsoft Academic Search

This paper proposes a methodology for the uncertainty evaluation of digital signal-processing algorithms including conditional instructions (if-then-else statements). In agreement with the suggestion of the ISO guide for the expression of uncertainty in measurement, a probabilistic approach is proposed. The application to different algorithms characterized by significant decision phases is approached theoretically, verifying the results also by suitable simulation. The

Giovanni Betta; Consolatina Liguori; Antonio Pietrosanto

2004-01-01

442

UNCERTAINTY QUANTIFICATION FOR MODULAR AND HIERARCHICAL MODELS  

Microsoft Academic Search

We propose a modular/hierarchical uncertainty quantification framework based on a recently developed methodology using concentration-of-measure inequalities for probability-of-failure upper bound calculations. In this framework, the relations between the variables of the underlying input-output model are represented by directed acyclic graphs, and the bounded uncertainty in the input variables is propagated to the output variable (performance measure) in an inductive manner

L. J. LUCAS; M. ORTIZ; H. OWHADI; U. TOPCU

443

Declarative representation of uncertainty in mathematical models.  

PubMed

An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML has provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form. PMID:22802941

Miller, Andrew K; Britten, Randall D; Nielsen, Poul M F

2012-07-03

444

Feature uncertainty activates anterior cingulate cortex.  

PubMed

In visual discrimination tasks, the relevant feature to discriminate is defined before stimulus presentation. In feature uncertainty tasks, a cue about the relevant feature is provided after stimulus offset. We used (15)O-butanol positron emission tomography (PET) in order to investigate brain activation during a feature uncertainty task. There was greater activity during the feature uncertainty task, compared with stimulus detection and discrimination of orientation and spatial frequency, in the lateral and medial prefrontal cortex, the cuneus, superior temporal and inferior parietal cortex, cortical motor areas, and the cerebellum. The most robust and consistent activation was observed in the dorsal anterior cingulate cortex (Brodmann area 32; x = 0, y = 16, z = 40). The insula, located near the claustrum (x = -38, y = 8, z = 4), was activated during the discrimination tasks compared with the feature uncertainty condition. These results suggest that the dorsal anterior cingulate cortex is important in feature uncertainty conditions, which include divided attention, expectancy under uncertainty, and cognitive monitoring. PMID:14689507

Kéri, Szabolcs; Decety, Jean; Roland, Per E; Gulyás, Balázs

2004-01-01

445

Sampling uncertainty in satellite rainfall estimates  

NASA Astrophysics Data System (ADS)

Accurate estimates of global precipitation patterns are essential for a better understanding of the hydrological cycle. Satellite observations allow for large-scale estimates of rainfall intensities. Uncertainties in current satellite-based rainfall estimates are due to uncertainties in the retrieval process as well as the different temporal and spatial sampling patterns of the observation systems. The focus of this study is on analyzing the sampling-associated uncertainty for thirteen low Earth orbiting satellites carrying microwave instruments suitable for rainfall measurement. Satellites were grouped by the types of microwave sensors, where NOAA satellites with cross-track sounders and DMSP satellites with conical scanners form the core of the constellations. The effect of three-hourly geostationary measurements on the sampling uncertainty was evaluated as well. The precise orbital model SGP4 was used to generate a realistic satellite overpass database in which orbital shifts are taken into account. Using the overpass database, we resampled rain gauge time series to simulate satellite rainfall estimates free of retrieval and calibration errors. We look at two regions, Germany and Benin, areas with different precipitation regimes. Our analysis shows that the sampling uncertainty for all available satellites may differ by up to 100% across latitudes and precipitation regimes. However, the performance of the various satellite groups is similar, with greater differences at higher latitudes. The addition of three-hourly geostationary observations reduces the sampling uncertainty, but only to a limited extent.
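A stripped-down version of the resampling experiment is sketched below; the rain series is synthetic and the overpass schedules are idealized, whereas the study itself uses SGP4-derived overpasses and real gauge data.

import numpy as np

# Quantify sampling error by averaging a synthetic hourly "gauge" rain series only at
# assumed overpass times and comparing with the full monthly mean.
rng = np.random.default_rng(11)
hours = 24 * 30
rain = rng.gamma(shape=0.1, scale=2.0, size=hours)    # intermittent, skewed rain rates

overpasses = {
    "2/day polar orbiter": np.arange(0, hours, 12),
    "plus 3-hourly geostationary": np.arange(0, hours, 3),
}
truth = rain.mean()
for name, idx in overpasses.items():
    est = rain[idx].mean()
    print(f"{name}: relative sampling error {100 * (est - truth) / truth:+.1f}%")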

Itkin, M.; Loew, A.

2012-04-01

446

About uncertainties in practical salinity calculations  

NASA Astrophysics Data System (ADS)

In the current state of the art, salinity is a quantity computed from conductivity ratio measurements, with temperature and pressure known at the time of the measurement, and using the Practical Salinity Scale algorithm of 1978 (PSS-78). This calculation gives practical salinity values S. The uncertainty expected in PSS-78 values is ±0.002, but no details have ever been given on the method used to work out this uncertainty, nor on the error sources to include in this calculation. Following a guide published by the Bureau International des Poids et Mesures (BIPM), and using two independent methods, this paper assesses the uncertainties of salinity values obtained from a laboratory salinometer and from Conductivity-Temperature-Depth (CTD) measurements after laboratory calibration of a conductivity cell. The results show that the contribution of the fits of the PSS-78 relations is sometimes as significant as that of the instrument. This is particularly the case with CTD measurements, where correlations between variables contribute mainly to decreasing the uncertainty of S, even when the expanded uncertainties of the conductivity cell calibrations are mostly on the order of 0.002 mS cm-1. The relations given here, obtained with the normalized GUM method, allow a real analysis of the uncertainty sources, and they can be used in a more general way with instruments having different specifications.
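The GUM propagation referred to here, applied to $S = f(R_t, t, p)$, is the standard law of propagation of uncertainty:

$$u_c^{2}(S) = \sum_{i}\left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i) + 2\sum_{i<j}\frac{\partial f}{\partial x_i}\frac{\partial f}{\partial x_j}\,u(x_i,x_j), \qquad (x_1,x_2,x_3) = (R_t, t, p),$$

where the covariance terms $u(x_i,x_j)$ are what the abstract points to when noting that correlations between variables mainly decrease the uncertainty of S in CTD measurements.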

Le Menn, M.

2011-10-01

447

Conditional first-order second-moment method and its application to the quantification of uncertainty in groundwater modeling  

NASA Astrophysics Data System (ADS)

Decision making in water resources management usually requires the quantification of uncertainties. Monte Carlo techniques are suited for this analysis but imply a huge computational effort. An alternative and computationally efficient approach is the first-order second-moment (FOSM) method which directly propagates parameter uncertainty into the result. We apply the FOSM method to both the groundwater flow and solute transport equations. It is shown how conditioning on the basis of measured heads and/or concentrations yields the "principle of interdependent uncertainty" that correlates the uncertainties of feasible hydraulic conductivities and recharge rates. The method is used to compute the uncertainty of steady state heads and of steady state solute concentrations. It is illustrated by an application to the Palla Road Aquifer in semiarid Botswana, for which the quantification of the uncertainty range of groundwater recharge is of prime interest. The uncertainty bounds obtained by the FOSM method correspond well with the results obtained by the Monte Carlo method. The FOSM method, however, is much more advantageous with respect to computational efficiency. It is shown that at the planned abstraction rate the probability of exceeding the natural replenishment of the Palla Road Aquifer by overpumping is 30%.
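
The FOSM idea can be summarised as propagating the parameter covariance through the model Jacobian, Cov(h) ≈ J Cov(p) J^T. The Python sketch below does this for an invented two-parameter toy model and checks the result against brute-force Monte Carlo; none of the numbers refer to the Palla Road Aquifer.

    # First-order second-moment (FOSM) propagation for a toy groundwater model.
    # 'heads' is a made-up stand-in for a flow model h(log10 K, recharge).
    import numpy as np

    def heads(p):
        logK, rech = p
        # Two observation points of hydraulic head [m] for a fictitious aquifer.
        return np.array([100.0 + 8.0 * rech / 10 ** logK,
                         95.0 + 5.0 * rech / 10 ** logK])

    p_mean = np.array([-3.0, 1e-4])              # log10 K [m/s], recharge [m/d]
    p_cov = np.diag([0.25 ** 2, (3e-5) ** 2])    # assumed prior parameter covariance

    # Jacobian dh/dp by central differences.
    J = np.empty((2, 2))
    for j in range(2):
        dp = np.zeros(2)
        dp[j] = 1e-6 * max(abs(p_mean[j]), 1e-6)
        J[:, j] = (heads(p_mean + dp) - heads(p_mean - dp)) / (2 * dp[j])

    cov_h_fosm = J @ p_cov @ J.T                 # FOSM output covariance

    # Brute-force Monte Carlo check of the first-order result.
    rng = np.random.default_rng(1)
    samples = rng.multivariate_normal(p_mean, p_cov, size=20000)
    cov_h_mc = np.cov(np.array([heads(p) for p in samples]).T)

    print("FOSM std of heads:", np.sqrt(np.diag(cov_h_fosm)))
    print("MC   std of heads:", np.sqrt(np.diag(cov_h_mc)))

Because the toy model is nonlinear in log10 K, the two estimates differ slightly, which mirrors the trade-off between FOSM efficiency and Monte Carlo accuracy discussed in the abstract.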

Kunstmann, Harald; Kinzelbach, Wolfgang; Siegfried, Tobias

2002-04-01

448

Comparison of the effect of hazard and response/fragility uncertainties on core melt probability uncertainty  

SciTech Connect

This report proposes a method for comparing the effects of the uncertainty in probabilistic risk analysis (PRA) input parameters on the uncertainty in the predicted risks. The proposed method is applied to compare the effect of uncertainties in the descriptions of (1) the seismic hazard at a nuclear power plant site and (2) random variations in plant subsystem responses and component fragility on the uncertainty in the predicted probability of core melt. The PRA used is that developed by the Seismic Safety Margins Research Program.

Mensing, R.W.

1985-01-01

449

Understanding Climate Uncertainty with an Ocean Focus  

NASA Astrophysics Data System (ADS)

Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but also in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and a method to quantify the reliability of that simulation. If quantitative uncertainty questions about the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described, and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P. and D. Stephenson, Climate Change: A Changing Climate for Prediction, 2007, Science 317 (5835), 207, DOI: 10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations, Climatic Change, 81, 247-264. Smith, L., 2002, What might we learn from climate forecasts? Proc. Nat’l Academy of Sciences, Vol. 99, suppl. 1, 2487-2492, doi:10.1073/pnas.012580599.
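
A designed parameter ensemble of the kind referred to above is often built from a space-filling design such as a Latin hypercube. The sketch below shows one hedged way to generate such a design with SciPy; the parameter names and ranges are invented placeholders, not CCSM ocean/ice settings.

    # Sketch of a designed parameter ensemble for perturbed ocean-model runs.
    # Parameter names and ranges are illustrative placeholders only.
    from scipy.stats import qmc

    params = ["background_vert_diffusivity", "gm_thickness_diffusivity", "drag_coeff"]
    lower = [1e-5, 300.0, 1e-3]     # illustrative lower bounds
    upper = [1e-4, 1200.0, 3e-3]    # illustrative upper bounds

    sampler = qmc.LatinHypercube(d=len(params), seed=42)
    design = qmc.scale(sampler.random(n=16), lower, upper)   # 16 ensemble members

    for k, member in enumerate(design):
        settings = dict(zip(params, member))
        print(f"run {k:02d}:", settings)   # would be written into each run's configuration

Each row of the design corresponds to one ensemble member, and the resulting spread of model outputs is what a SACCO-style analysis would then summarise.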

Tokmakian, R. T.

2009-12-01

450

Mechanical Principles of Biological Nanocomposites  

NASA Astrophysics Data System (ADS)

Biological nanocomposites, such as bone, tooth, shell, and wood, exhibit exceptional mechanical properties. Much recent effort has been directed at exploring the basic mechanical principles behind the microstructures of these natural materials to provide guidelines for the development of novel man-made nanocomposites. This article reviews some of the recent studies on mechanical properties of biological nanocomposites, including their stiffness, strength, toughness, interface properties, and elastic stability. The discussion is focused on the mechanical principles of biological nanocomposites, including the generic nanostructure of hard-mineral crystals embedded in a soft protein matrix, the flaw-tolerant design of the hard phase, the role of the soft matrix, the hybrid interface between protein and mineral, and the structural hierarchy. The review concludes with some discussion of and outlook on the development of biomimicking synthetic materials guided by the principles found in biological nanocomposites.

Ji, Baohua; Gao, Huajian

2010-08-01

451

Principles of Virus Structural Organization  

PubMed Central

Viruses, the molecular nanomachines infecting hosts ranging from prokaryotes to eukaryotes, come in different sizes, shapes, and symmetries. Questions such as what principles govern their structural organization, what factors guide their assembly, and how these viruses integrate multifarious functions into one unique structure have enamored researchers for years. In the last five decades, following Caspar and Klug's elegant conceptualization of how viruses are constructed, high-resolution structural studies using X-ray crystallography and, more recently, cryo-EM techniques have provided a wealth of information on the structures of a variety of viruses. These studies have significantly furthered our understanding of the principles that underlie structural organization in viruses. Such an understanding has practical impact in providing a rational basis for the design and development of antiviral strategies. In this chapter, we review the principles underlying capsid formation in a variety of viruses, emphasizing recent developments along with some historical perspective.

Prasad, B.V. Venkataram; Schmid, Michael F

2013-01-01

452

Astronomical principles of satellite positioning  

NASA Astrophysics Data System (ADS)

The Global Navigation Satellite Systems (GNSS) dominate the positioning technologies at the beginning of this millennium. The new concept, already common in all user segments, refers to radio-navigation systems providing highly precise time and position information, continuously and globally, regardless of weather conditions. In the present paper, a comparison between the principles of celestial navigation and satellite navigation is presented, offering additional reasoning to conclude that the roots of the global satellite navigation systems are connected to the classical principles of celestial navigation.

Dragusan, Adrian; Lupu, Sergiu; Cojocaru, Stelian

2008-09-01

453

RPL Dosimetry: Principles and Applications  

SciTech Connect

The principle of radio-photoluminescence (RPL) is applied in the glass dosimeter, one of the most effective solid-state dosimeters. Silver-activated phosphate glass irradiated with ionizing radiation emits luminescence when exposed to UV light; this phenomenon is called RPL. The most characteristic features of glass dosimeters are data accumulation and the absence of fading. The basic principle of RPL is described, and then its application to the glass dosimeter is explained. Finally, some applications of RPL are introduced.

Yamamoto, Takayoshi [Radioisotope Research Center, Osaka University, 2-4, Yamadaoka, Suita, Osaka 565-0871 (Japan) and Oarai Research Center, Chiyoda Technol Corporation, 3681, Narita-cho, Oarai-machi, Higashi-ibaraki-gun, Ibaraki-ken, 311-1313 (Japan)]

2011-05-05

454

Bayes and the Simplicity Principle in Perception  

ERIC Educational Resources Information Center

Discussions of the foundations of perceptual inference have often centered on 2 governing principles, the likelihood principle and the simplicity principle. Historically, these principles have usually been seen as opposed, but contemporary statistical (e.g., Bayesian) theory tends to see them as consistent, because for a variety of reasons simpler…

Feldman, Jacob

2009-01-01

455

Uncertainty visualisation in the Model Web  

NASA Astrophysics Data System (ADS)

Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, moreover, uncertainty information is included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services as envisaged in the Model Web. We present an interactive web tool for the visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Previously conducted usability studies indicated that a differentiation between expert (in statistics or mapping) and non-expert users is useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps offer non-experts a simpler visualisation and a first overview by separating value and uncertainty maps. The multidimensional approach allows experts a more complex exploration of the data by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, with sliders to interactively move through time, space and uncertainty (thresholds).
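
As a rough stand-in for the "adjacent maps" mode (the actual client is built on OpenLayers and UncertML, not Python), the following matplotlib sketch draws a synthetic value map and its uncertainty map side by side.

    # Stand-in for the "adjacent maps" mode: value map and uncertainty map side by side.
    # Synthetic raster data; not the UncertWeb client itself.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(3)
    x, y = np.meshgrid(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
    value = np.sin(x) * np.cos(y)                    # e.g. mean of a predicted field
    uncertainty = 0.1 + 0.2 * rng.random(x.shape)    # e.g. prediction standard deviation

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4), sharex=True, sharey=True)
    m1 = ax1.pcolormesh(x, y, value, cmap="viridis")
    ax1.set_title("predicted value")
    fig.colorbar(m1, ax=ax1)
    m2 = ax2.pcolormesh(x, y, uncertainty, cmap="magma")
    ax2.set_title("standard deviation")
    fig.colorbar(m2, ax=ax2)
    plt.show()

Separating the two panels keeps each map simple for non-expert users, which is the design rationale described in the abstract.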

Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

2012-04-01

456

Quantifying uncertainty in chemical systems modeling.  

SciTech Connect

This study compares two techniques for uncertainty quantification in chemistry computations, one based on sensitivity analysis and error propagation, and the other on stochastic analysis using polynomial chaos techniques. The two constructions are studied in the context of H2-O2 ignition under supercritical-water conditions. They are compared in terms of their prediction of uncertainty in species concentrations and the sensitivity of selected species concentrations to given parameters. The formulation is extended to one-dimensional reacting-flow simulations. The computations are used to study sensitivities to both reaction rate pre-exponentials and enthalpies, and to examine how this information must be evaluated in light of known, inherent parametric uncertainties in simulation parameters. The results indicate that polynomial chaos methods provide similar first-order information to conventional sensitivity analysis, while preserving higher-order information that is needed for accurate uncertainty quantification and for assigning confidence intervals on sensitivity coefficients. These higher-order effects can be significant, as the analysis reveals substantial uncertainties in the sensitivity coefficients themselves.
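
The contrast between the two techniques can be illustrated on a toy problem: the Python sketch below propagates a Gaussian parameter uncertainty through an invented Arrhenius-like response, once with first-order sensitivity analysis and once with a one-dimensional Hermite polynomial chaos expansion built by Gauss-Hermite quadrature. The model and numbers are illustrative only, not the H2-O2 mechanism of the study.

    # Toy comparison of first-order sensitivity propagation with a 1-D polynomial
    # chaos expansion. The "model" is an invented nonlinear response.
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval
    from math import factorial, sqrt, pi

    def model(theta):
        # e.g. a log ignition delay responding nonlinearly to a rate parameter
        return np.exp(-0.8 * theta) + 0.3 * theta ** 2

    mu, sigma = 1.0, 0.3          # uncertain parameter theta ~ N(mu, sigma^2)

    # First-order sensitivity / error propagation.
    h = 1e-5
    dfdtheta = (model(mu + h) - model(mu - h)) / (2 * h)
    var_linear = (dfdtheta * sigma) ** 2

    # Non-intrusive Hermite polynomial chaos with theta = mu + sigma * xi, xi ~ N(0,1).
    order, nquad = 4, 12
    nodes, weights = hermegauss(nquad)           # quadrature for weight exp(-xi^2/2)
    weights = weights / sqrt(2 * pi)             # normalise to a standard normal
    fvals = model(mu + sigma * nodes)

    coeffs = np.array([
        np.sum(weights * fvals * hermeval(nodes, np.eye(order + 1)[k])) / factorial(k)
        for k in range(order + 1)
    ])
    var_pc = np.sum(coeffs[1:] ** 2 * np.array([factorial(k) for k in range(1, order + 1)]))

    print(f"first-order variance:      {var_linear:.5f}")
    print(f"PC (order {order}) variance: {var_pc:.5f}")

The gap between the two variance estimates comes from the higher-order chaos coefficients, which is exactly the higher-order information the abstract says conventional sensitivity analysis discards.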

Reagan, Matthew T.; Knio, Omar M. (The Johns Hopkins University, Baltimore, MD); Najm, Habib N.; Ghanem, Roger Georges (The Johns Hopkins University, Baltimore, MD); Pebay, Philippe Pierre

2004-09-01

457

Uncertainty in climate change and drought  

USGS Publications Warehouse

A series of projections of climate change were applied to a watershed model of the Delaware River basin to identify sources of uncertainty in predicting effects of climate change on drought in the basin as defined by New York City reservoir contents. The watershed model is a calibrated, monthly time-step water-balance model that incorporates the operation of reservoirs and diversion canals, and accounts for all inflows to and outflows from the basin at several key nodes. The model assesses the effects of projected climate change on reservoir contents by calculating the frequency with which the basin enters drought conditions under a range of climate-change conditions. Two primary sources of uncertainty that affect predictions of drought frequency in the Delaware River basin were considered: (1) uncertainty in the amount of change in mean air temperature and precipitation, and (2) uncertainty in the effects of natural climate variability on future temperature and precipitation. Model results indicate that changes in drought frequency in the Delaware River basin are highly sensitive to changes in mean precipitation; therefore, the uncertainty associated with predictions of future precipitation has a large effect on the prediction of future drought frequency in the basin.
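
A toy Monte Carlo sketch of the two uncertainty sources is given below; the water-balance surrogate, thresholds, and ranges are invented for illustration and do not represent the Delaware River basin model.

    # Toy illustration of the two uncertainty sources considered for drought frequency:
    # (1) uncertain changes in mean T and P, (2) natural year-to-year variability.
    import numpy as np

    rng = np.random.default_rng(7)
    n_scenarios, n_years = 500, 50

    # (1) scenario uncertainty: sampled changes in mean precipitation and temperature
    dP = rng.uniform(-0.15, 0.15, n_scenarios)   # fractional precipitation change
    dT = rng.uniform(0.5, 3.0, n_scenarios)      # warming [degC]

    drought_freq = np.empty(n_scenarios)
    for i in range(n_scenarios):
        # (2) natural variability: annual precipitation about the shifted mean [mm]
        precip = rng.normal(1100 * (1 + dP[i]), 150, n_years)
        # crude surrogate for reservoir inflow: warming raises evapotranspiration
        inflow = precip - 500 - 25 * dT[i]
        drought_freq[i] = np.mean(inflow < 450)  # fraction of "drought" years

    print(f"drought frequency: median {np.median(drought_freq):.2f}, "
          f"5-95% range {np.percentile(drought_freq, [5, 95])}")

In this toy setup, as in the study, the spread in drought frequency is dominated by the assumed range of mean precipitation change.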

McCabe, Gregory J., Jr.; Wolock, David M.; Tasker, Gary D.; Ayers, Mark A.

1991-01-01

458

Dissociating Uncertainty Responses and Reinforcement Signals in the Comparative Study of Uncertainty Monitoring  

ERIC Educational Resources Information Center

Although researchers are exploring animals' capacity for monitoring their states of uncertainty, the use of some paradigms allows the criticism that animals map avoidance responses to error-causing stimuli not because of uncertainty monitored but because of feedback signals and stimulus aversion. The authors addressed this criticism with an…

Smith, J. David; Redford, Joshua S.; Beran, Michael J.; Washburn, David A.

2006-01-01

460

Astronomical principles of satellite positioning  

Microsoft Academic Search

The Global Navigation Satellite Systems (GNSS) dominate the positioning technologies at the beginning of this millennium. The new concept, already common in all user segments, refers to radio-navigation systems providing highly precise time and position information, continuously and globally, regardless of weather conditions. In the present paper, a comparison between the principles of celestial navigation and satellite navigation is

Adrian Dragusan; Sergiu Lupu; Stelian Cojocaru

2008-01-01

461

The ecological principle in medicine  

Microsoft Academic Search

The ecological principle demands information of a total quality. It is not satisfied with a chemical explanation alone, nor one at the biological level alone, nor yet one at the purely psychological level––each must give its fraction to the total picture. Just what the intricate equating at these roughly indicated levels may be constitutes the work of the conscientious physician.

S. E. Jelliffe

1937-01-01

462

Reversing the Balance Wheel Principle  

ERIC Educational Resources Information Center

The paper discusses funding principles and policies of higher education during the recession period. The role of state appropriations for the viability of public higher education institutions is emphasized. State funding affecting institutional behaviour is another issue raised. The paper analyzes the possibility of expanding state funding for…

Orkodashvili, Mariam

2008-01-01

463

Demonstrating Fermat's Principle in Optics  

ERIC Educational Resources Information Center

We demonstrate Fermat's principle in optics by a simple experiment using reflection from an arbitrarily shaped one-dimensional reflector. We investigated a range of possible light paths from a lamp to a fixed slit by reflection in a curved reflector and showed by direct measurement that the paths along which light is concentrated have either…

Paleiov, Orr; Pupko, Ofir; Lipson, S. G.

2011-01-01

464

Vacuum Technology: Principles and Applications

Microsoft Academic Search

This work is devoted to the principles and applications of vacuum technology. The classification and properties of vacuum are discussed. Various pumping mechanisms as well as three basic flow regimes, namely viscous, intermediate and molecular, are briefly presented. Gas-surface interaction concepts, including physisorption and chemisorption states with their distinctive characters, as well as the desorption phenomenon, are considered. Two types of surface reaction

A. Z. Moshfegh

2004-01-01

465

EduMedia: Superposition principle+-  

NSDL National Science Digital Library

This applet illustrates the principle of superposition as applied to two opposite electric charges. It has learning objectives and instructions on how to use the animation. These Flash images are a shareware version that times out after 3 minutes; reloading the page restarts the animations. This animation is part of a large collection of similar animations.
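
The superposition being animated can also be reproduced in a few lines: the sketch below sums the fields of a positive and a negative point charge on a grid and draws the field lines (a matplotlib stand-in for the applet; charge values and positions are arbitrary).

    # Superposition of the electric fields of two opposite point charges.
    import numpy as np
    import matplotlib.pyplot as plt

    k = 8.99e9                                              # Coulomb constant [N m^2/C^2]
    charges = [(+1e-9, (-0.5, 0.0)), (-1e-9, (0.5, 0.0))]   # (q [C], position [m])

    x, y = np.meshgrid(np.linspace(-1.5, 1.5, 60), np.linspace(-1.0, 1.0, 40))
    Ex, Ey = np.zeros_like(x), np.zeros_like(y)
    for q, (x0, y0) in charges:
        dx, dy = x - x0, y - y0
        r3 = (dx ** 2 + dy ** 2) ** 1.5 + 1e-12  # avoid division by zero at the charge
        Ex += k * q * dx / r3                    # superposition: the fields simply add
        Ey += k * q * dy / r3

    plt.streamplot(x, y, Ex, Ey, density=1.2)
    plt.scatter([-0.5, 0.5], [0, 0], c=["red", "blue"])
    plt.title("Field of a +/- charge pair by superposition")
    plt.show()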

2007-09-04

466

Aesthetic Principles for Instructional Design  

ERIC Educational Resources Information Center

This article offers principles that contribute to developing the aesthetics of instructional design. Rather than describing merely the surface qualities of things and events, the concept of aesthetics as applied here pertains to heightened, integral experience. Aesthetic experiences are those that are immersive, infused with meaning, and felt as…

Parrish, Patrick E.

2009-01-01

467

Local Taxation: Principles and Scope  

Microsoft Academic Search

The paper discusses principles of local taxation such as accountability, benefit-tax link, non-distortion, regional equity, long-term efficiency, reliability and stability of tax bases, tax sharing as implicit insurance, and administrative simplicity. Not all of the criteria for local taxation are consistent with each other, nor can they all be realized at the same time.

Paul Bernd Spahn

1995-01-01

468

On the Dirichlet's Box Principle  

ERIC Educational Resources Information Center

In this note, we focus on several applications of Dirichlet's box principle in Discrete Mathematics and number theory lessons. In addition, the main result is an innovative game on a triangular board developed by the authors. The game has been used in teaching and learning mathematics in Discrete Mathematics and some high schools in…
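
For readers unfamiliar with the box (pigeonhole) principle, a classic application is that among any n+1 integers, two must share a residue modulo n, so their difference is divisible by n. The short Python sketch below (not from the article) demonstrates this.

    # Classic application of the box principle: with n+1 integers and only n
    # residue classes mod n, two integers must land in the same class.
    import random

    def pair_with_difference_divisible_by(nums, n):
        seen = {}                      # residue -> first number with that residue
        for a in nums:
            r = a % n
            if r in seen:
                return seen[r], a      # the pair guaranteed by the box principle
            seen[r] = a

    n = 10
    nums = random.sample(range(1000), n + 1)
    a, b = pair_with_difference_divisible_by(nums, n)
    print(nums, "->", a, b, "differ by a multiple of", n, (b - a) % n == 0)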

Poon, Kin-Keung; Shiu, Wai-Chee

2008-01-01

469

Principles and Guidelines for Transfer  

ERIC Educational Resources Information Center

Transfer relationships in British Columbia (BC) are governed by statements which were adopted by the Council in 1993 after consultation with the institutions of the BC Transfer System. Principles and guidelines in this document are based on those formulated by the British Columbia Post-Secondary Coordinating Committee and approved by university…

British Columbia Council on Admissions and Transfer, 2003

2003-01-01

470

Management Principles for Nonproliferation Organizations  

Microsoft Academic Search

This paper identifies business models and six management principles that can be applied by a nonproliferation organization to maximize the value and effectiveness of its products. The organizations responsible for reducing the nuclear proliferation threat have experienced a substantial growth in responsibility and visibility since the September 11 attacks. Since then, the international community has witnessed revelations of clandestine nuclear

Sarah L. Frazar; Gretchen Hund

2012-01-01

471

Principles underlying the design of…

Microsoft Academic Search

BACKGROUND: Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a

Anna J Wilson; Stanislas Dehaene; Philippe Pinel; Susannah K Revkin; Laurent Cohen; David Cohen

2006-01-01

472

Interpersonal psychotherapy: principles and applications  

Microsoft Academic Search

This article briefly describes the fundamental principles and some of the clinical applications of interpersonal psychotherapy (IPT), a time-limited, empirically validated treatment for mood disorders. IPT has been tested with general success in a series of clinical trials for mood and, increasingly, non-mood disorders; as both an acute and maintenance treatment; and in differing treatment formats. As a result of

JOHN C. MARKOWITZ; MYRNA M. WEISSMAN

2004-01-01

473

Principles of Critical Discourse Analysis  

Microsoft Academic Search

This paper discusses some principles of critical discourse analysis, such as the explicit sociopolitical stance of discourse analysts, and a focus on dominance relations by elite groups and institutions as they are being enacted, legitimated or otherwise reproduced by text and talk. One of the crucial elements of this analysis of the relations between power and discourse is the patterns

Teun A. van Dijk

1993-01-01

474

Nursing Principles & Skills. Teacher Edition.  

ERIC Educational Resources Information Center

This curriculum guide contains 14 units for a course on nursing principles and skills needed by practical nurses. The 14 units of instruction cover the following: (1) using medical terminology; (2) practicing safety procedures; (3) using the nursing process for care planning; (4) using infection control techniques; (5) preparing a patient…

Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

475

The principles of geometrical geodesy  

NASA Astrophysics Data System (ADS)

The principles of spheroidal geodesy and triangulation are presented. Coordinate systems which are used in geodesy are described, the direct and inverse problems of geodesy are examined, and the solution of geodesy problems by means of ellipsoid chords is considered. A translation into Russian of Schmehl's (1927) paper concerning the general triaxial ellipsoid of the earth is presented. Questions of selenodesy are also considered.

Zagrebin, D. V.

476

From probabilities to Hamilton's principle  

NASA Astrophysics Data System (ADS)

It is shown that Hamilton's principle of classical mechanics can be derived from the relativistic invariance of the generalized spacetime Fisher information corresponding to the statistical description of measurements of the coordinates and of the motion in space by means of the probability density and the probability density current.
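
For reference, the two standard ingredients the abstract combines are the Fisher information of a spatial probability density and Hamilton's stationary-action principle; the paper's "generalized spacetime" form may differ in detail. In conventional notation:

    I[\rho] = \int \frac{1}{\rho(x,t)} \left( \frac{\partial \rho(x,t)}{\partial x} \right)^{2} \, dx ,
    \qquad
    \delta S = \delta \int_{t_1}^{t_2} L(q, \dot{q}, t) \, dt = 0 .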

Kapsa, V.; Skála, L.

2009-08-01

477

Electronic Structure Principles and Aromaticity  

ERIC Educational Resources Information Center

The relationship between aromaticity and stability in molecules on the basis of quantities such as hardness and electrophilicity is explored. The findings reveal that aromatic molecules are less energetic, harder, less polarizable, and less electrophilic as compared to antiaromatic molecules, as expected from the electronic structure principles.

Chattaraj, P. K.; Sarkar, U.; Roy, D. R.

2007-01-01

478

On the Einstein equivalence principle  

SciTech Connect

The Einstein equivalence principle, the cornerstone of our present-day understanding of gravity, is used to explore a deeper connection between the deflection of starlight by a spinning object and the Lense-Thirring dragging of inertial frames. It is also noted that experiment has not established that the gravitomagnetic couplings to currents of particle rest-mass energy, to currents of electromagnetic energy, and to currents of all other types of energy are identical, as predicted by the Einstein equivalence principle. A detailed analysis is given of how the atomic physics experiments originated by Hughes and by Drever can constrain such possible violations of the Einstein equivalence principle. Atomic clocks are also important tools used to test local Lorentz invariance and hence one important aspect of the Einstein equivalence principle. The sensitivity of atomic clocks to preferred-frame effects is studied here for the first time, and the behavior of the hydrogen-maser clocks of the Gravity Probe A experiment is analyzed to illustrate use of the techniques involved.

Gabriel, M.D.

1989-01-01

479

Principles for Teaching Problem Solving  

NSDL National Science Digital Library

This 14-page monograph addresses the need to teach problem solving and other higher order thinking skills. After summarizing research and positions of various organizations, it defines several models and describes cognitive and attitudinal components of problem solving and the types of knowledge that are required. The authors provide a list of principles for teaching problem solving and include a list of references.

Kirkley, Rob F.

2003-01-01

480

Physical principles of heat pipes  

Microsoft Academic Search

Heat pipes are used whenever high rates of heat transfer or the control or conversion of heat flows are required. This book covers the physical principles of operation of heat pipes and choice of working fluid related to temperature range. The authors demonstrate how performance is limited by capillary pumping action in the wick together with impedance to liquid and

M. N. Ivanovskii; V. P. Sorokin; I. V. Yagodkin

1982-01-01

482

Interpreting the unknown: uncertainty and the management of transboundary groundwater  

Microsoft Academic Search

This paper shows how uncertainty undermines collaborative transboundary groundwater management. Focusing on the Santa Cruz Aquifer, spanning the United States–Mexico border between Arizona and Sonora, the authors describe the uncertainties within the aquifer using interviews and hydrologic studies. We discuss how data requirements and ambiguous interpretations exacerbate these uncertainties, and explain how each country's water-management culture combines with this uncertainty

Anita Milman; Isha Ray

2011-01-01

483

Quantifying uncertainty in LCA-modelling of waste management systems  

SciTech Connect

Highlights: ► Uncertainty in LCA-modelling of waste management is significant. ► Model, scenario and parameter uncertainties contribute. ► Sequential procedure for quantifying uncertainty is proposed. ► Application of procedure is illustrated by a case-study. - Abstract: Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but