These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Economic uncertainty principle? Alexander Harin  

E-print Network

This preliminary paper presents a qualitative description of the economic principle of (hidden, latent) uncertainty. Mathematical expressions of the principle's problems are reviewed.

Paris-Sud XI, Université de

2

Gerbes and Heisenberg's Uncertainty Principle  

E-print Network

We prove that a gerbe with a connection can be defined on classical phase space, taking the U(1)-valued phase of certain Feynman path integrals as Čech 2-cocycles. A quantisation condition on the corresponding 3-form field strength is proved to be equivalent to Heisenberg's uncertainty principle.

J. M. Isidro

2005-12-19

3

Uncertainty Principles for Compact Groups  

E-print Network

We establish an operator-theoretic uncertainty principle over arbitrary compact groups, generalizing several previous results. As a consequence, we show that if f is in L^2(G), then the product of the measures of the supports of f and its Fourier transform f̂ is at least 1; here, the dual measure is given by the sum, over all irreducible representations V, of d_V rank(f̂(V)). For finite groups, our principle implies the following: if P and R are projection operators on the group algebra C[G] such that P commutes with projection onto each group element, and R commutes with left multiplication, then the squared operator norm of PR is at most rank(P)rank(R)/|G|.

Gorjan Alagic; Alexander Russell

2006-08-28
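
In the finite abelian special case this support bound is essentially the classical Donoho-Stark inequality |supp f| · |supp f̂| ≥ |G|, which is easy to check numerically (a minimal sketch in Python, assuming the cyclic group Z_n and the unnormalized DFT; the indicator of a subgroup attains equality):

    import numpy as np

    # Donoho-Stark uncertainty on Z_n: for any nonzero f,
    # |supp(f)| * |supp(f_hat)| >= n.
    n = 12
    f = np.zeros(n)
    f[::4] = 1.0                      # indicator of the subgroup {0, 4, 8}
    f_hat = np.fft.fft(f)

    supp_f = np.count_nonzero(np.abs(f) > 1e-12)
    supp_fhat = np.count_nonzero(np.abs(f_hat) > 1e-12)
    print(supp_f, supp_fhat, supp_f * supp_fhat >= n)   # 3 4 True (equality: 3*4 = 12)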

4

Quantum Mechanics and the Generalized Uncertainty Principle  

E-print Network

The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.

Jang Young Bang; Micheal S. Berger

2006-10-11

5

Disturbance, the uncertainty principle and quantum optics  

NASA Technical Reports Server (NTRS)

It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.

Martens, Hans; Demuynck, Willem M.

1993-01-01
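
As a rough numerical companion to the photon number and phase example above (a sketch; it uses the heuristic relation Δn·Δφ ≳ 1/2 with the textbook coherent-state values Δn = √n̄ and Δφ ≈ 1/(2√n̄), which is far cruder than the joint-measurement analysis in the paper):

    import math

    # Heuristic number-phase uncertainty for a coherent state with mean
    # photon number n_bar: the product sits at the heuristic bound 1/2.
    for n_bar in (1.0, 100.0, 1e6):
        delta_n = math.sqrt(n_bar)                   # Poissonian number spread
        delta_phi = 1.0 / (2.0 * math.sqrt(n_bar))   # phase spread, large n_bar
        print(f"n_bar={n_bar:g}  dn*dphi={delta_n * delta_phi:.2f}")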

6

Schrodinger equation from an exact uncertainty principle  

Microsoft Academic Search

An exact uncertainty principle, formulated as the assumption that a classical ensemble is subject to random momentum fluctuations of a strength which is determined by, and scales inversely with, the uncertainty in position, leads from the classical equations of motion to the Schrodinger equation. Thus there is an exact formulation of the uncertainty principle which precisely captures the essence of what…

Michael J. W. Hall; Marcel Reginatto

2001-01-01

7

Quantum wells and the generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

The finite and infinite square wells are potentials typically discussed in undergraduate quantum mechanics courses. In this paper, we discuss these potentials in the light of the recent studies of the modification of the Heisenberg uncertainty principle into a generalized uncertainty principle (GUP) as a consequence of attempts to formulate a quantum theory of gravity. The fundamental concepts of the minimal length scale and the GUP are discussed and the modified energy eigenvalues and transmission coefficient are derived.

Blado, Gardo; Owens, Constance; Meyers, Vincent

2014-11-01

8

Naturalistic Misunderstanding of the Heisenberg Uncertainty Principle.  

ERIC Educational Resources Information Center

The Heisenberg Uncertainty Principle, which concerns the effect of observation upon what is observed, is proper to the field of quantum physics, but has been mistakenly adopted and wrongly applied in the realm of naturalistic observation. Discusses the misuse of the principle in the current literature on naturalistic research. (DM)

McKerrow, K. Kelly; McKerrow, Joan E.

1991-01-01

9

Generalized Uncertainty Principle: Approaches and Applications  

E-print Network

We review some highlights from string theory, black hole physics and doubly special relativity, together with thought experiments suggested to probe the shortest distances and/or the maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analysed and compared. They entered the literature as the Generalized Uncertainty Principle (GUP); assuming a modified dispersion relation, they allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of gravitational laws, the Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of a composite system. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. The concern about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

Abdel Nasser Tawfik; Abdel Magied Diab

2014-09-29

10

Generalized Uncertainty Principle: Approaches and Applications  

E-print Network

We review highlights from string theory, black hole physics and doubly special relativity, and some "thought" experiments which were suggested to probe the shortest distance and/or the maximum momentum at the Planck scale. The models designed to implement the minimal length scale and/or the maximum momentum in different physical systems, which entered the literature as the Generalized Uncertainty Principle (GUP), are analysed and compared. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Furthermore, assuming a modified dispersion relation allows for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of the gravitational laws, the Friedmann equations, minimal time measurement and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. Another one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall and the law of reciprocal action, and on the kinetic energy of a composite system. The concern about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

Abdel Nasser Tawfik; Abdel Magied Diab

2014-09-29

11

The uncertainty principle and quantum chaos  

NASA Technical Reports Server (NTRS)

The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and differing time scales. The ultimate origin of this universal quantum stability is the fundamental uncertainty principle, which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

Chirikov, Boris V.

1993-01-01

12

Illinois PER Interactive Examples: Uncertainty Principle I  

NSDL National Science Digital Library

This interactive homework problem shows some particles passing through a single slit of known width. After the particles pass through the slit they spread out over a range of angles. The user is asked to use the Heisenberg uncertainty principle to determine the minimum range of angles. The problem is accompanied by a Socratic-dialog "help" sequence designed to encourage critical thinking as users do a guided conceptual analysis before attempting the mathematics. Immediate feedback is provided for both correct and incorrect responses. This item is part of a larger collection of interactive problems developed by the Illinois Physics Education Research Group.

Gladding, Gary

2008-07-20
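
The estimate the problem asks for can be sketched in a few lines (illustrative Python with hypothetical beam parameters, not the values used in the actual Illinois problem): bound the transverse momentum spread with Heisenberg's relation and divide by the forward momentum to get the minimum angular range.

    import math

    hbar = 1.054571817e-34      # J*s
    h = 2 * math.pi * hbar

    lam = 0.1e-9                # assumed de Broglie wavelength (m)
    a = 10e-9                   # assumed slit width (m)

    p = h / lam                 # forward momentum of each particle
    dp_y = hbar / (2 * a)       # Heisenberg: minimum transverse spread, delta_y = a
    theta_min = dp_y / p        # small-angle estimate of the half-spread (rad)
    print(f"minimum angular half-spread ~ {theta_min:.1e} rad")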

13

Black Holes and the Generalized Uncertainty Principle  

NASA Astrophysics Data System (ADS)

We propose a new way in which black holes connect macrophysics and microphysics. The Generalized Uncertainty Principle suggests corrections to the Uncertainty Principle as the energy increases towards the Planck value. It also provides a natural transition between the expressions for the Compton wavelength below the Planck mass and the black hole event horizon size above it. This suggests corrections to the event horizon size as the black hole mass falls towards the Planck value, leading to the concept of a Generalized Event Horizon. Extrapolating this expression below the Planck mass suggests the existence of a new kind of black hole, whose size is of order its Compton wavelength. Recently it has been found that such a black hole solution is permitted by loop quantum gravity, its unusual properties deriving from the fact that it is hidden behind the throat of a wormhole. This has important implications for the formation and evaporation of black holes in the early Universe, especially if there are extra spatial dimensions.

Carr, B. J.

2013-12-01
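
The crossover the abstract appeals to is simple arithmetic (a sketch; order-unity factors are dropped, as is usual in these arguments): the Compton wavelength falls as 1/M while the Schwarzschild radius grows as M, and the two meet near the Planck mass.

    import math

    G = 6.674e-11        # m^3 kg^-1 s^-2
    c = 2.998e8          # m/s
    hbar = 1.055e-34     # J*s

    m_pl = math.sqrt(hbar * c / G)       # Planck mass, ~2.2e-8 kg

    for M in (0.1 * m_pl, m_pl, 10 * m_pl):
        compton = hbar / (M * c)         # quantum "size" of mass M
        schwarz = 2 * G * M / c**2       # horizon size of a black hole of mass M
        print(f"M/M_Pl={M/m_pl:5.1f}  lambda_C={compton:.2e} m  r_S={schwarz:.2e} m")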

14

Uncertainty Principle Consequences at Thermal Equilibrium  

E-print Network

Contrary to the conventional wisdom that deviations from standard thermodynamics originate from the strong coupling to the bath, it is shown that these deviations are intimately linked to the power spectrum of the thermal bath. Specifically, it is shown that the lower bound of the dispersion of the total energy of the system, imposed by the uncertainty principle, is dominated by the bath power spectrum and therefore, quantum mechanics inhibits the system thermal-equilibrium-state from being described by the canonical Boltzmann's distribution. This is in sharp contrast to the classical case, for which the thermal equilibrium distribution of a system interacting via central forces with pairwise-self-interacting environment, irrespective of the interaction strength, is shown to be exactly characterized by the canonical Boltzmann distribution. As a consequence of this analysis, we define an effective coupling to the environment that depends on all energy scales in the system and reservoir interaction…

Pachon, Leonardo A; Zueco, David; Brumer, Paul

2014-01-01

15

Heisenberg's Uncertainty Principle and Interpretive Research in Science Education.  

ERIC Educational Resources Information Center

Heisenberg's uncertainty principle and the derivative notions of indeterminacy, uncertainty, precision, and observer-observed interaction are discussed and their applications to social science research examined. Implications are drawn for research in science education. (PR)

Roth, Wolff-Michael

1993-01-01

16

Uncertainty Principle Consequences at Thermal Equilibrium  

E-print Network

Contrary to the conventional wisdom that deviations from standard thermodynamics originate from the strong coupling to the bath, it is shown that these deviations are intimately linked to the power spectrum of the thermal bath. Specifically, it is shown that the lower bound of the dispersion of the total energy of the system, imposed by the uncertainty principle, is dominated by the bath power spectrum and therefore, quantum mechanics inhibits the system thermal-equilibrium-state from being described by the canonical Boltzmann's distribution. This is in sharp contrast to the classical case, for which the thermal equilibrium distribution of a system interacting via central forces with pairwise-self-interacting environment, irrespective of the interaction strength, is shown to be exactly characterized by the canonical Boltzmann distribution. As a consequence of this analysis, we define an effective coupling to the environment that depends on all energy scales in the system and reservoir interaction. Sample computations in regimes predicted by this effective coupling are demonstrated. For example, for the case of strong effective coupling, deviations from standard thermodynamics are present and, for the case of weak effective coupling, quantum features such as stationary entanglement are possible at high temperatures.

Leonardo A. Pachon; Johan F. Triana; David Zueco; Paul Brumer

2014-01-07

17

Heisenberg Uncertainty Principle for the q-Bessel Fourier transform  

E-print Network

We give a further variant of Heisenberg's uncertainty principle: using an entropy argument, we derive an uncertainty inequality for the q-Bessel Fourier transform (also called the q-Hankel transform).

Paris-Sud XI, Université de

18

The Uncertainty Principle, Virtual Particles and Real Forces  

ERIC Educational Resources Information Center

This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert forces on one another.

Jones, Goronwy Tudor

2002-01-01

19

Single-Slit Diffraction and the Uncertainty Principle  

ERIC Educational Resources Information Center

A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.

Rioux, Frank

2005-01-01
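
A minimal numerical version of this Fourier-transform view (a sketch in arbitrary units): the far-field amplitude is the transform of the aperture function, so narrowing the slit necessarily broadens the diffraction pattern, which is the uncertainty principle in action.

    import numpy as np

    # Far-field (Fraunhofer) amplitude ~ Fourier transform of the aperture.
    N = 4096
    x = np.linspace(-1.0, 1.0, N)              # coordinate across the aperture
    for width in (0.05, 0.2):                  # two slit widths (arbitrary units)
        aperture = (np.abs(x) < width / 2).astype(float)
        intensity = np.abs(np.fft.fftshift(np.fft.fft(aperture))) ** 2
        # Count bins in the central lobe above half maximum: the lobe
        # narrows as the slit widens (position spread vs momentum spread).
        fwhm_bins = int((intensity > 0.5 * intensity.max()).sum())
        print(f"slit width {width}: central-lobe FWHM ~ {fwhm_bins} bins")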

20

Wave-particle Duality and the Uncertainty Principle Frank Rioux  

E-print Network

Nick Herbert, author of Quantum Reality, has proposed the name quon for "any entity that exhibits both wave and particle" aspects, such as the electron, neutron and photon. The wave-particle duality of quons is captured succinctly by the de Broglie relation.

Rioux, Frank

21

Stringy coherent states inspired by generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

Coherent states with the fractional revival property, which explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of the generalized harmonic oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which otherwise would be of purely academic interest. The effective phase space is non-canonical (or non-commutative in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg uncertainty principle. The fractional revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics are sub-Poissonian. The correspondence principle is deformed in an interesting way. Our computational scheme is very simple, as it requires only first-order corrected energy values and undeformed basis states.

Ghosh, Subir; Roy, Pinaki

2012-05-01

22

Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle  

NASA Astrophysics Data System (ADS)

The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple linear filter models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.

Oppenheim, Jacob N.; Magnasco, Marcelo O.

2013-01-01
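
The 1/(4π) bound is easy to verify numerically, and a Gaussian pulse saturates it (a sketch; this is exactly why beating the limit perceptually implies nonlinear processing rather than any violation of Fourier analysis):

    import numpy as np

    # Check delta_t * delta_f >= 1/(4*pi); a Gaussian attains equality.
    fs = 10_000.0
    t = np.arange(-1.0, 1.0, 1.0 / fs)
    sig = np.exp(-t**2 / (2 * 0.01**2))        # Gaussian, sigma_t = 10 ms

    def rms_width(axis, density):
        density = density / density.sum()
        mean = (axis * density).sum()
        return np.sqrt(((axis - mean) ** 2 * density).sum())

    dt = rms_width(t, sig**2)                          # temporal extent
    freqs = np.fft.fftfreq(t.size, 1.0 / fs)
    df = rms_width(freqs, np.abs(np.fft.fft(sig))**2)  # spectral extent
    print(dt * df, 1.0 / (4 * np.pi))                  # both ~0.0796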

23

Moments of the Wigner Distribution and a Generalized Uncertainty Principle  

E-print Network

The nonnegativity of the density operator of a state is faithfully coded in its Wigner distribution, and this places constraints on the moments of the Wigner distribution. These constraints are presented in a canonically invariant form which is both concise and explicit. Since the conventional uncertainty principle is such a constraint on the first and second moments, our result constitutes a generalization of the same to all orders. Possible application in quantum state reconstruction using optical homodyne tomography is noted.

R. Simon; N. Mukunda

1997-08-22

24

Uncertainty principle of genetic information in a living cell  

PubMed Central

Background Formal description of a cell's genetic information should provide the number of DNA molecules in that cell and their complete nucleotide sequences. We pose the formal problem: can the genome sequence forming the genotype of a given living cell be known with absolute certainty so that the cell's behaviour (phenotype) can be correlated to that genetic information? To answer this question, we propose a series of thought experiments. Results We show that the genome sequence of any actual living cell cannot physically be known with absolute certainty, independently of the method used. There is an associated uncertainty, in terms of base pairs, equal to or greater than μs (where μ is the mutation rate of the cell type and s is the cell's genome size). Conclusion This finding establishes an "uncertainty principle" in genetics for the first time, and its analogy with the Heisenberg uncertainty principle in physics is discussed. The genetic information that makes living cells work is thus better represented by a probabilistic model rather than as a completely defined object. PMID:16197549

Strippoli, Pierluigi; Canaider, Silvia; Noferini, Francesco; D'Addabbo, Pietro; Vitale, Lorenza; Facchin, Federica; Lenzi, Luca; Casadei, Raffaella; Carinci, Paolo; Zannotti, Maria; Frabetti, Flavia

2005-01-01
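
The bound is a one-line computation once values are chosen (a sketch with illustrative numbers, not values taken from the paper):

    # Minimum genome-sequence uncertainty >= mu * s (base pairs), per the abstract.
    mu = 1e-9     # assumed mutations per base pair per cell division
    s = 6.4e9     # approximate diploid human genome size in base pairs
    print(f"uncertainty >= {mu * s:.1f} bp per division")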

25

Generalized Uncertainty Principle and Recent Cosmic Inflation Observations  

E-print Network

The recent Background Imaging of Cosmic Extragalactic Polarization (BICEP2) observations are believed to be evidence for cosmic inflation. BICEP2 provided the first direct evidence for inflation, determined its energy scale and gave hints of quantum gravitational processes. The tensor-to-scalar fluctuation ratio $r$, which is the canonical measurement of gravitational waves, was estimated as $r=0.2_{-0.05}^{+0.07}$. Apparently, this value agrees well with the upper bound value corresponding to PLANCK, $r\leq 0.012$, and to the WMAP9 experiment, $r=0.2$. It is believed that the existence of a minimal length is one of the greatest predictions leading to modifications of the Heisenberg uncertainty principle, i.e. a GUP, at the Planck scale. In the present work, we investigate the possibility of interpreting the recent BICEP2 observations through quantum gravity or the GUP. We estimate the slow-roll parameters and the tensorial and scalar density fluctuations, which are characterized by the scalar field $\phi$. Taking into account the background (matter and radiation) energy density, $\phi$ is assumed to interact with gravity and with itself. We first review the Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe and then suggest a modification of the Friedmann equation due to the GUP. By using a single potential for a chaotic inflation model, various inflationary parameters are estimated and compared with the PLANCK and BICEP2 observations. While the GUP is conjectured to break down the expansion of the early Universe (Hubble parameter and scale factor), two inflation potentials based on a certain minimal supersymmetric extension of the standard model result in $r$ and a spectral index matching well with the observations. Corresponding to the BICEP2 observations, our estimate for $r$ depends on the inflation potential and the scalar field; a power-law inflation potential does not fit the observations.

Abdel Nasser Tawfik; Abdel Magied Diab

2014-10-29

26

Quantum covariance, quantum Fisher information and the uncertainty principle  

E-print Network

In this paper the relation between quantum covariances and quantum Fisher informations is studied. This study is applied to generalize a recently proved uncertainty relation based on quantum Fisher information. The proof given here considerably simplifies the previously proposed proofs and leads to more general inequalities.

Paolo Gibilisco; Fumio Hiai; Denes Petz

2007-12-07

27

Path Integral for Dirac oscillator with generalized uncertainty principle  

SciTech Connect

The propagator for the Dirac oscillator in (1+1) dimensions, with the deformed commutation relation of the Heisenberg principle, is calculated using the path integral in quadri-momentum representation. As the mass is related to momentum, we adapt the space-time transformation method to evaluate quantum corrections; the latter depend on the point-discretization interval.

Benzair, H. [Laboratoire LRPPS, Université de Kasdi Merbah-Ouargla, BP 511, Route Ghardaïa, 30000 Ouargla (Algeria); Laboratoire de Physique Théorique, Université de Jijel, BP98 Ouled Aissa, 18000 Jijel (Algeria)]; Boudjedaa, T. [Laboratoire de Physique Théorique, Université de Jijel, BP98 Ouled Aissa, 18000 Jijel (Algeria)]; Merad, M. [Laboratoire (L.S.D.C), Université de Oum El Bouaghi, 04000 Oum El Bouaghi (Algeria)]

2012-12-15

28

Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics students depictions  

NSDL National Science Digital Library

Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize a picture of students' descriptions of these key quantum concepts. Data for this study were obtained from a semistructured in-depth interview conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students' depictions of the concept wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students' depictions of the concept uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum mechanics uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, a few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction that cannot be explained within the framework of classical physics for depicting the wavelike properties of quantum entities. Despite the inhospitable conceptions of the uncertainty principle and wave- and particlelike properties of quantum entities in our investigation, this paper's findings are highly consistent with those reported in previous studies. New findings and some implications for instruction and the curricula are discussed.

Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

2012-05-21

29

Generalized uncertainty principle and Gaussian wave packets in discrete quantum phase space  

NASA Astrophysics Data System (ADS)

We construct the generalized uncertainty principle and the minimum uncertainty states using a one-dimensional quantum mechanical model which involves discrete coordinate space. To this end, we compactify momentum space, which results in the discrete coordinate space. We find that it involves the usual Heisenberg uncertainty principle with modification terms suppressed by various powers of the momentum operator, together with additional terms indexed by an integer n. Next, we extend our result to quantum mechanics with a discrete phase space which results from compactifying both coordinate and momentum spaces. Further, we investigate the time evolution of minimum uncertainty state wave packets in the discrete quantum phase space. We find that minimum wave packets exhibit revival dynamics due to the discreteness of phase space.

Bang, Jang Young

30

Removing the Big Bang Singularity: The role of the generalized uncertainty principle in quantum gravity  

E-print Network

The possibility of avoiding the big bang singularity by means of a generalized uncertainty principle is investigated. In relation with this matter, the statistical mechanics of a free-particle system obeying the generalized uncertainty principle is studied and it is shown that the entropy of the system has a finite value in the infinite temperature limit. It is then argued that negative temperatures and negative pressures are possible in this system. Finally, it is shown that this model can remove the big bang singularity.

Reza Rashidi

2012-09-11

31

The entropy of the noncommutative acoustic black hole based on generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

In this paper we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole based on the generalized uncertainty principle. In our results we obtain an area entropy and a correction term associated with the noncommutative acoustic black hole when the parameter λ introduced in the generalized uncertainty principle takes a specific value. However, in this method there is no need to introduce an ultraviolet cut-off, and divergences are eliminated. Moreover, the small mass approximation is not necessary in the original brick-wall model.

Anacleto, M. A.; Brito, F. A.; Passos, E.; Santos, W. P.

2014-10-01

32

Double Special Relativity with a Minimum Speed and the Uncertainty Principle  

NASA Astrophysics Data System (ADS)

The present work aims to search for an implementation of a new symmetry in spacetime by introducing the idea of an invariant minimum speed scale (V). Such a lowest limit V, being unattainable by the particles, represents a fundamental and preferred reference frame connected to a universal background field (a vacuum energy) that breaks Lorentz symmetry. So there emerges a new principle of symmetry in the spacetime at the subatomic level for very low energies close to the background frame (v ≈ V), providing a fundamental understanding for the uncertainty principle, i.e. the uncertainty relations should emerge from the spacetime with an invariant minimum speed.

Nassif, Cláudio

33

Zero-point energies, the uncertainty principle and positivity of the quantum Brownian density operator  

E-print Network

High temperature and white noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white noise approximation is avoided, it is shown that if the zero point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle.

Allan Tameshtit

2012-04-09

34

The entropy of the noncommutative acoustic black hole based on generalized uncertainty principle  

E-print Network

In this paper we investigate the statistical entropy of a 3-dimensional rotating acoustic black hole based on the generalized uncertainty principle. In our results we obtain an area entropy and a correction term associated with the noncommutative acoustic black hole when the parameter λ introduced in the generalized uncertainty principle takes a specific value. However, in this method there is no need to introduce an ultraviolet cut-off, and divergences are eliminated. Moreover, the small mass approximation is not necessary in the original brick-wall model.

M. A. Anacleto; F. A. Brito; E. Passos; W. P. Santos

2014-05-08

35

Principles and applications of measurement and uncertainty analysis in research and calibration  

SciTech Connect

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

Wells, C.V.

1992-11-01
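
A compact sketch of the style of calculation the standard codifies (assuming the additive and root-sum-square interval models associated with ANSI/ASME PTC 19.1-1985; the numbers are illustrative):

    import math

    # Combine elemental uncertainties for a measurement result.
    bias_limits = [0.10, 0.05]    # elemental systematic (bias) limits, same units
    random_std = 0.08             # standard deviation of repeated readings
    n = 20                        # number of readings

    B = math.sqrt(sum(b**2 for b in bias_limits))   # combined bias limit
    S = random_std / math.sqrt(n)                   # standard deviation of the mean
    t95 = 2.0                                       # ~Student's t for large samples

    U_add = B + t95 * S                             # additive (U_ADD) model
    U_rss = math.sqrt(B**2 + (t95 * S)**2)          # root-sum-square (U_RSS) model
    print(f"B={B:.3f}  S={S:.3f}  U_ADD={U_add:.3f}  U_RSS={U_rss:.3f}")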

36

Principles and applications of measurement and uncertainty analysis in research and calibration  

SciTech Connect

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

Wells, C.V.

1992-11-01

37

Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions  

ERIC Educational Resources Information Center

Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics.

Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

2011-01-01

38

The Precautionary Principle: Scientific Uncertainty and Omitted Research in the Context of GMO Use and Release  

Microsoft Academic Search

Commercialization of genetically modified organisms (GMOs) has sparked profound controversies concerning adequate approaches to risk regulation. Scientific uncertainty and ambiguity, omitted research areas, and lack of basic knowledge crucial to risk assessments have become apparent. The objective of this article is to discuss the policy and practical implementation of the Precautionary Principle. A major conclusion is that the void in scientific…

Anne Ingeborg Myhr; Terje Traavik

2002-01-01

39

Experimental investigation of the uncertainty principle in the presence of quantum memory  

E-print Network

Heisenberg's uncertainty principle provides a fundamental limitation on an observer's ability to simultaneously predict the outcome when one of two measurements is performed on a quantum system. However, if the observer has access to a particle (stored in a quantum memory) which is entangled with the system, his uncertainty is generally reduced. This effect has recently been quantified by Berta et al. [Nature Physics 6, 659 (2010)] in a new, more general uncertainty relation, formulated in terms of entropies. Using entangled photon pairs, an optical delay line serving as a quantum memory and fast, active feed-forward we experimentally probe the validity of this new relation. The behaviour we find agrees with the predictions of quantum theory and satisfies the new uncertainty relation. In particular, we find lower uncertainties about the measurement outcomes than would be possible without the entangled particle. This shows not only that the reduction in uncertainty enabled by entanglement can be significant in practice, but also demonstrates the use of the inequality to witness entanglement.

Robert Prevedel; Deny R. Hamel; Roger Colbeck; Kent Fisher; Kevin J. Resch

2010-12-01

40

Thermodynamics of (2+1)-dimensional acoustic black hole based on the generalized uncertainty principle  

E-print Network

We study thermodynamic quantities of an acoustic black hole and its thermodynamic stability in a cavity based on the generalized uncertainty principle. It can be shown that there is a minimal black hole which can be a stable remnant after black hole evaporation. Moreover, the behavior of the free energy shows that the large black hole is stable too. Therefore, the acoustic black hole can decay into the remnant or the large black hole.

Wontae Kim; Edwin J. Son; Myungseok Yoon

2008-01-09

41

Self-calibrating quantum random number generator based on the uncertainty principle  

E-print Network

We present an efficient method to self-calibrate a quantum random number generator (RNG). The method is based on measurements of the system in two mutually unbiased bases. Thanks to the uncertainty principle applied to min- and max-entropies, a lower bound on the achievable true randomness can be evaluated. We tested our method with two different RNGs: a single-photon RNG and a photon-pair RNG. Our result may have important implications for practical quantum random number generators.

G. Vallone; D. Marangon; M. Tomasin; P. Villoresi

2014-01-30
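
A toy version of the certification idea (a sketch; it assumes the standard entropic uncertainty relation H_min + H_max ≥ log2(1/c) with overlap c = 1/d for mutually unbiased bases, and uses the Renyi-1/2 entropy of the observed check-basis statistics as a stand-in for the smooth max-entropy that a real analysis, such as the paper's, would use):

    import math

    d = 2          # qubit device
    q = 0.95       # assumed probability of the expected outcome in the check basis

    # Renyi-1/2 (max-) entropy of the binary check-basis distribution.
    h_max = 2 * math.log2(math.sqrt(q) + math.sqrt(1 - q))

    # Entropic uncertainty relation: H_min >= log2(d) - H_max.
    h_min_bound = math.log2(d) - h_max
    print(f"certified randomness >= {h_min_bound:.3f} bits per measurement")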

42

Geodesics, Mass and the Uncertainty Principle in a Warped de Sitter Space-time  

E-print Network

We present the explicit solution to the geodesic equations in the warped de Sitter space-time proposed by Randall-Sundrum. We find that a test particle moves in the bulk and is not restricted to a 3-brane (to be taken as our universe). On the 3-brane, the test particle moves with uniform velocity, giving the appearance that it is not subject to a force. But computing the particle's energy using the energy-momentum tensor yields a time-dependent energy that suggests a time-dependent mass. Thus, the extra force, which is the effect of the warped extra dimension on the particle's motion on the 3-brane, does not change the velocity but the mass of the particle. The particle's motion in the bulk also results in a time-dependent modification of the Heisenberg uncertainty principle as viewed on the 3-brane. These two results show that the classical physics along the extra dimension results in the time-dependence of particle masses and of the uncertainty principle. If the particle masses are time-independent and Heisenberg's uncertainty principle is to remain unchanged, then there must be a non-gravitational force that restricts all particles to the 3-brane. Finally, we note that although these time-dependent corrections on the 3-brane can be removed classically, quantum mechanical corrections along the extra dimension will restore the problem.

Jose A. Magpantay

2011-08-03

43

The uncertainty principle enables non-classical dynamics in an interferometer.  

PubMed

The quantum uncertainty principle stipulates that when one observable is predictable there must be some other observables that are unpredictable. The principle is viewed as holding the key to many quantum phenomena and understanding it deeper is of great interest in the study of the foundations of quantum theory. Here we show that apart from being restrictive, the principle also plays a positive role as the enabler of non-classical dynamics in an interferometer. First we note that instantaneous action at a distance should not be possible. We show that for general probabilistic theories this heavily curtails the non-classical dynamics. We prove that there is a trade-off with the uncertainty principle that allows theories to evade this restriction. On one extreme, non-classical theories with maximal certainty have their non-classical dynamics absolutely restricted to only the identity operation. On the other extreme, quantum theory minimizes certainty in return for maximal non-classical dynamics. PMID:25105741

Dahlsten, Oscar C O; Garner, Andrew J P; Vedral, Vlatko

2014-01-01

44

Interpretation of the Cosmological Constant Problem within the Framework of Generalized Uncertainty Principle  

E-print Network

We propose an improved exponential Generalized Uncertainty Principle (GUP) by introducing a positive integer $n$ called the suppressing index. Due to the UV/IR mixing brought by the GUP, the states with momenta smaller than the critical momentum ($P < P_{\rm Crit}$) are suppressed and thus have no contributions to the energy density of the vacuum. By considering the contributions just from the states with momenta larger than the critical momentum ($P > P_{\rm Crit}$) and choosing a suitable suppressing index, $n \sim 10^{123}$, we calculate the cosmological constant consistent with the experimentally observed value.

Yan-Gang Miao; Ying-Jie Zhao

2013-12-15

45

Generalized uncertainty principle in f(R) gravity for a charged black hole  

SciTech Connect

Using f(R) gravity in the Palatini formalism, the metric for a charged spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this, the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini-black holes.

Said, Jackson Levi [Physics Department, University of Malta, Msida (Malta); Adami, Kristian Zarb [Physics Department, University of Malta, Msida (Malta); Physics Department, University of Oxford, Oxford (United Kingdom)

2011-02-15

46

Corrected Black Hole's Thermodynamics and Tunneling Radiation with Generalized Uncertainty Principle and Modified Dispersion Relation  

NASA Astrophysics Data System (ADS)

Although the tunneling approach is well established for black hole radiation, much work has been done to support the extension of this approach to more general situations. In this article the Parikh-Kraus-Wilczek tunneling proposal of black hole radiation is considered. The black hole thermodynamics and tunneling radiation are studied based on both the generalized uncertainty principle and the modified dispersion relation analysis. It is shown that the entropy, temperature and the original Parikh-Kraus-Wilczek tunneling radiation calculation receive new corrections. It has been shown that the results of these two alternative approaches are identical if one uses suitable expansion coefficients.

Dehghani, M.

2011-02-01

47

Geodesics, Mass and the Uncertainty Principle in a Warped de Sitter Space-time  

E-print Network

We present the explicit solution to the geodesic equations in the warped de Sitter space-time proposed by Randall-Sundrum. We find that a test particle moves in the bulk and is not restricted to a 3-brane (to be taken as our universe). On the 3-brane, the test particle moves with uniform velocity, giving the appearance that it is not subject to a force. But computing the particle's energy using the energy-momentum tensor yields a time-dependent energy that suggests a time-dependent mass. Thus, the extra force, which is the effect of the warped extra dimension on the particle's motion on the 3-brane, does not change the velocity but the mass of the particle. The particle's motion in the bulk also results in a time-dependent modification of the Heisenberg uncertainty principle as viewed on the 3-brane. These two results show that the classical physics along the extra dimension results in the time-dependence of particle masses and of the uncertainty principle. If the particle masses are time-independent and Heisenberg's uncertainty principle is to remain unchanged, then there must be a non-gravitational force that restricts all particles to the 3-brane…

Magpantay, Jose A

2011-01-01

48

Casimir effect in minimal length theories based on a generalized uncertainty principle  

NASA Astrophysics Data System (ADS)

We study the corrections to the Casimir effect in the classical geometry of two parallel metallic plates, separated by a distance a, due to the presence of a minimal length arising from quantum mechanical models based on a generalized uncertainty principle (GUP). The approach for the quantization of the electromagnetic field is based on projecting onto the maximally localized states of a few specific GUP models and was previously developed to study the Casimir-Polder effect. For each model we compute the lowest order correction in the minimal length to the Casimir energy and find that it scales with the fifth power of the distance between the plates, $a^{-5}$, as opposed to the well-known QED result, which scales as $a^{-3}$; contrary to previous claims, we find that it is always attractive. The various GUP models can in principle be differentiated by the strength of the correction to the Casimir energy, as every model is characterized by a specific multiplicative numerical constant.

Frassino, A. M.; Panella, O.

2012-02-01
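
For scale, the uncorrected parallel-plate result that the paper perturbs around is a one-liner (a sketch; the GUP correction itself depends on the model-specific minimal length, so only its $a^{-5}$ scaling is noted in a comment):

    import math

    hbar = 1.055e-34    # J*s
    c = 2.998e8         # m/s
    a = 1e-6            # plate separation (m)

    # Standard QED Casimir energy per unit area: E/A = -pi^2 hbar c / (720 a^3).
    E_per_area = -math.pi**2 * hbar * c / (720 * a**3)
    print(f"E/A at a = 1 um: {E_per_area:.2e} J/m^2")
    # The GUP correction scales as a^-5, i.e. it is suppressed relative to
    # this leading term by a factor of order (minimal length / a)^2.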

49

Relative locality and relative Co-locality as extensions of the Generalized Uncertainty Principle  

E-print Network

We interpret Relative locality as the variation of the ultraviolet (UV) cut-off with respect to the observer's position relative to an event, in agreement with an extended version of the Generalized Uncertainty Principle (GUP). As a consequence there is a natural red-shift effect for the events when they are observed at a given distance x. We then introduce the concept of Relative Co-locality as the variation of the infrared (IR) cut-off with respect to the observer's momentum relative to the event. As a consequence, there is a natural blue-shift effect for the events when the observer has a given momentum p with respect to them. Both effects are dual to each other inside the formalism of quantum groups $SU(n)_q$ symmetric Heisenberg algebras and their q-Bargmann Fock representations. When Relative locality and Co-locality are introduced, the q-deformation parameter takes the form $q \approx 1+\sqrt{\frac{\vert p\vert \vert x\vert}{r_\Lambda m_{pl}c}}$, with the Relative Co-locality defined as $\Delta P \approx$ …

Arraut, Ivan

2013-01-01

50

A Dark Energy Model with Generalized Uncertainty Principle in the Emergent, Intermediate and Logamediate Scenarios of the Universe  

E-print Network

This work is motivated by the work of Kim et al (2008), which considered the equation of state parameter for the new agegraphic dark energy based on the generalized uncertainty principle coexisting with dark matter without interaction. In this work, we have considered the same dark energy interacting with dark matter in the emergent, intermediate and logamediate scenarios of the universe. Also, we have investigated the statefinder, kerk and lerk parameters in all three scenarios under this interaction. The energy density and pressure for the new agegraphic dark energy based on the generalized uncertainty principle have been calculated and their behaviors have been investigated. The evolution of the equation of state parameter has been analyzed in the interacting and non-interacting situations in all three scenarios. The graphical analysis shows that the dark energy behaves like the quintessence era for logamediate expansion and the phantom era for emergent and intermediate expansions of the universe.

Rahul Ghosh; Surajit Chattopadhyay; Ujjal Debnath

2011-05-23

51

On a generalized uncertainty principle, coherent states, and the moment map  

NASA Astrophysics Data System (ADS)

It is shown that, under general circumstances, symplectic G-orbits in a hamiltonian manifold acted on (symplectically) by a Lie group G provide critical points for the norm squared of the moment map. This fact yields a "variational" interpretation of the symplectic orbits appearing in the projective space attached to an irreducible representation of a compact simple Lie group (according to work of Kostant and Sternberg and of Giavarini and Onofri), where the previous function is also related to the invariant uncertainty considered by Delbourgo and Perelomov. A notion of generalized canonical conjugate variables (in the Kähler case) is also presented and used in the framework of a Kähler geometric interpretation of the Heisenberg uncertainty relations (building on the analysis given by Cirelli, Manià and Pizzocchero and by Provost and Vallee); it is proved, in particular, that the generalized coherent states of Rawnsley minimize the uncertainty relations for any pair of generalized canonically conjugate variables.

Spera, Mauro

1993-09-01

52

Hawking Effect with Generalized Uncertainty Principle and Modified Dispersion Relations in de Sitter-Schwarzschild Black Hole  

NASA Astrophysics Data System (ADS)

Although the tunneling approach is well established for black hole radiation, much work has been done to support the extension of this approach to more general settings. In this letter the Parikh-Kraus-Wilczek tunneling proposal of black hole tunneling radiation has been considered. The de Sitter-Schwarzschild black hole thermodynamics has been studied based on both the generalized uncertainty principle and the modified dispersion relations analysis. It is shown that the entropy, temperature and the original Parikh-Kraus-Wilczek calculations of the black hole tunneling probability receive new corrections. The results of these two alternative approaches are identical if one uses suitable expansion coefficients.

Dehghani, M.

2011-02-01

53

A Quantum Mechanical Interpretation of Single-slit Diffraction, Or, Using Diffraction Phenomena to Illustrate the Uncertainty Principle

E-print Network

See: Quantum interference with slits, Thomas Marcella, European Journal of Physics 23, 615-621 (2002). See also: Calculating diffraction patterns, F. Rioux, European Journal of Physics 24, N1-N3; The Chemical Educator 9, 12-16 (2004); Single-slit Diffraction and the Uncertainty Principle, F. Rioux, Journal…

Rioux, Frank

54

Theoretical formulation of finite-dimensional discrete phase spaces: II. On the uncertainty principle for Schwinger unitary operators  

SciTech Connect

We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar-Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener-Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar-Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. -- Highlights: Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators. Determination of new restrictions upon the selective process of signals and wavelet bases. Demonstration of looser bounds interpolating between the tightest bound and the Massar-Spindel inequality. Construction of finite ground states properly describing the tightest bound. Establishment of an important connection with the discrete Weyl function.

Marchiolli, M.A., E-mail: marcelo_march@bol.com.br [Avenida General Osório 414, Centro, 14.870-100 Jaboticabal, SP (Brazil)]; Mendonça, P.E.M.F., E-mail: pmendonca@gmail.com [Academia da Força Aérea, C.P. 970, 13.643-970 Pirassununga, SP (Brazil)]

2013-09-15

55

Reverse-reconciliation continuous-variable quantum key distribution based on the uncertainty principle  

NASA Astrophysics Data System (ADS)

A big challenge in continuous-variable quantum key distribution is to prove security against arbitrary coherent attacks including realistic assumptions such as finite-size effects. Recently, such a proof has been presented in [Phys. Rev. Lett. 109, 100502 (2012), 10.1103/PhysRevLett.109.100502] for a two-mode squeezed state protocol based on a novel uncertainty relation with quantum memories. But the transmission distances were fairly limited due to a direct reconciliation protocol. We prove here security against coherent attacks of a reverse-reconciliation protocol under similar assumptions but allowing distances of over 16 km for experimentally feasible parameters. We further clarify the limitations when using the uncertainty relation with quantum memories in security proofs of continuous-variable quantum key distribution.

Furrer, Fabian

2014-10-01

56

Quantum discord and classical correlation can tighten the uncertainty principle in the presence of quantum memory  

E-print Network

Uncertainty relations capture the essence of the inevitable randomness associated with the outcomes of two incompatible quantum measurements. Recently, Berta et al. have shown that the lower bound on the uncertainties of the measurement outcomes depends on the correlations between the observed system and an observer who possesses a quantum memory. If the system is maximally entangled with its memory, the outcomes of two incompatible measurements made on the system can be predicted precisely. Here, we obtain a new uncertainty relation that tightens the lower bound of Berta et al., by incorporating an additional term that depends on the quantum discord and the classical correlations of the joint state of the observed system and the quantum memory. We discuss several examples of states for which our new lower bound is tighter than the bound of Berta et al. On the application side, we discuss the relevance of our new inequality for the security of quantum key distribution and show that it can be used to provide bounds on the distillable common randomness and the entanglement of formation of bipartite quantum states.

Arun Kumar Pati; Mark M. Wilde; A. R. Usha Devi; A. K. Rajagopal; Sudha

2012-04-17
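
A minimal numerical check of the extreme case mentioned in the abstract (a sketch, assuming qubit X and Z measurements, for which the Berta et al. bound reads H(X|B) + H(Z|B) ≥ log2(1/c) + S(A|B) with log2(1/c) = 1; for a maximally entangled memory S(A|B) = -1 and the bound collapses to zero, permitting perfect prediction):

    import numpy as np

    def entropy_bits(rho):
        """Von Neumann entropy in bits, ignoring zero eigenvalues."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return float(-(evals * np.log2(evals)).sum())

    # |phi+> = (|00> + |11>)/sqrt(2): system A maximally entangled with memory B.
    phi = np.zeros(4)
    phi[0] = phi[3] = 1 / np.sqrt(2)
    rho_ab = np.outer(phi, phi)
    rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)  # trace out A

    S_cond = entropy_bits(rho_ab) - entropy_bits(rho_b)   # S(A|B) = -1
    print(f"S(A|B) = {S_cond:.2f}; uncertainty bound = {1.0 + S_cond:.2f} bits")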

57

Our Electron Model Vindicates Schrödinger's Incomplete Results and Requires Restatement of Heisenberg's Uncertainty Principle

NASA Astrophysics Data System (ADS)

The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy (physiology of hearing, 1961) performed Rect x pressure inputs that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves sign sense of EMF vectors, and is hard wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

McLeod, David; McLeod, Roger

2008-04-01

58

On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical  

NASA Astrophysics Data System (ADS)

The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), the Short-Time Fourier Transform (STFT) or the Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. Owing to the action of Heisenberg's uncertainty principle, the SFT exhibits weak spatial localization, which manifests itself in the following: firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (a window), is used for this purpose. A narrow window is chosen to localize the signal in time, and, according to Heisenberg's uncertainty principle, this results in a significant uncertainty in frequency. If one chooses a wide window then, according to the same principle, the time uncertainty increases. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to establish the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at the current moment of time: it is possible to speak only about a range of frequencies. Besides, it is impossible to specify precisely the time moment of the presence of this or that frequency: it is possible to speak only about a time frame. It is this feature that imposes major constraints on the applicability of the STFT. In spite of the fact that the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, there is a possibility to analyze any signal using an alternative approach: multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis. Thanks to it, low frequencies can be shown in more detail with respect to frequency, and high frequencies in more detail with respect to time. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3-D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the wavelet transform for calculating the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
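
The window trade-off described above is easy to reproduce numerically. The following minimal Python sketch (illustrative only; it is not the authors' geodetic pipeline, and the sampling rate and signal are invented for the demo) computes the STFT of a two-tone signal with a narrow and a wide window and prints the resulting resolutions:

    # Window-width trade-off of the STFT on a synthetic two-tone signal.
    import numpy as np
    from scipy.signal import stft

    fs = 1000.0                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1 / fs)
    # 50 Hz tone in the first second, 200 Hz tone in the second
    x = np.where(t < 1.0, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 200 * t))

    for nperseg in (64, 1024):                    # narrow vs. wide window
        f, tau, Zxx = stft(x, fs=fs, nperseg=nperseg)
        # frequency step shrinks as the window widens; time step grows
        print(f"window={nperseg:5d}  df={f[1] - f[0]:7.3f} Hz  dt={tau[1] - tau[0]:6.4f} s")

A narrow window (64 samples) localizes the tone switch sharply in time but smears it over many frequency bins; the wide window (1024 samples) does the opposite, exactly as Heisenberg's uncertainty principle dictates.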

Mazurova, Elena; Lapshin, Aleksey

2013-04-01

59

Adding a strategic edge to human factors/ergonomics: principles for the management of uncertainty as cornerstones for system design.  

PubMed

It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation, human factors/ergonomics-based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management-of-uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore. PMID:23622735

Grote, Gudela

2014-01-01

60

Adopting the Uncertainty Principle for the Entropy Estimation of Black Holes, de Sitter Space and Rindler Space  

E-print Network

By a simple physical consideration and the uncertainty principle, we derive that the temperature is proportional to the surface gravity and the entropy is proportional to the surface area of the black hole. We apply the same consideration to de Sitter space and estimate the temperature and entropy of the space, then deduce that the entropy is proportional to the boundary surface area. By the same consideration, we estimate the temperature and entropy in the uniformly accelerated system (Rindler coordinates). Cases in higher dimensions are also considered.

Tetsuya Hara; Keita Sakai; Daigo Kajiura

2006-08-14

61

Pauli effects in uncertainty relations  

NASA Astrophysics Data System (ADS)

In this Letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

Toranzo, I. V.; Sánchez-Moreno, P.; Esquivel, R. O.; Dehesa, J. S.

2014-10-01

62

Two new kinds of uncertainty relations  

NASA Technical Reports Server (NTRS)

We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.

Uffink, Jos

1994-01-01

63

Uncertainty in Computational Aerodynamics  

NASA Technical Reports Server (NTRS)

An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

2003-01-01

64

Is the precautionary principle unscientific?  

Microsoft Academic Search

The precautionary principle holds that we should not allow scientific uncertainty to prevent us from taking precautionary measures in response to potential threats that are irreversible and potentially disastrous. Critics of the principle claim that it deters progress and development, is excessively risk-averse and is unscientific. This paper argues that the principle can be scientific provided that (1) the threats

David B. Resnik

2003-01-01

65

Comparison of Classical and Quantum Mechanical Uncertainties.  

ERIC Educational Resources Information Center

Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)
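
As a concrete point of comparison (a standard textbook evaluation, stated here for orientation rather than quoted from the article), the quantum ground state of a box of width $L$ gives

    $\Delta x = L\sqrt{\frac{1}{12} - \frac{1}{2\pi^2}}$ and $\Delta p = \frac{\pi\hbar}{L}$, hence $\Delta x\,\Delta p = \frac{\hbar}{2}\sqrt{\frac{\pi^2}{3} - 2} \approx 0.568\,\hbar > \frac{\hbar}{2}$,

comfortably above the Heisenberg minimum, whereas the corresponding classical ensemble product contains no $\hbar$ scale at all.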

Peslak, John, Jr.

1979-01-01

66

Teaching uncertainties  

NASA Astrophysics Data System (ADS)

The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of uncertainty based on the idea of degree of belief. This enables them to move on more quickly to the important problems actually met in laboratory work, namely the estimation of uncertainties and their propagation. We also consider the mean and the weighted mean of several results and the method of least squares.

Duerdoth, Ian

2009-03-01

67

Position-momentum uncertainty products  

NASA Astrophysics Data System (ADS)

We point out two interesting features of the position-momentum uncertainty product $U = \Delta x\,\Delta p$. We show that two special (non-differentiable) eigenstates of the Schrödinger operator with the Dirac delta potential $V(x) = -V_0\delta(x)$, $V_0 > 0$, also satisfy Heisenberg's uncertainty principle by yielding $U > \frac{\hbar}{2}$. One of these eigenstates is a zero-energy and zero-curvature bound state.
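
For the ordinary bound state of the delta well, the product can be evaluated in closed form (a standard calculation sketched here for orientation; the zero-energy, zero-curvature eigenstate discussed in the paper is a separate case). With $\psi(x) = \sqrt{\kappa}\,e^{-\kappa|x|}$ and $\kappa = mV_0/\hbar^2$,

    $\Delta x = \frac{1}{\sqrt{2}\,\kappa}$, $\Delta p = \hbar\kappa$, hence $U = \Delta x\,\Delta p = \frac{\hbar}{\sqrt{2}} \approx 0.707\,\hbar > \frac{\hbar}{2}$,

so even this non-differentiable state respects the Heisenberg bound.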

Ahmed, Zafar; Yadav, Indresh

2014-07-01

68

Measuring Uncertainty  

NSDL National Science Digital Library

This article, authored by P.G. Moore for the Royal Statistical Society's website, provides well-defined exercises to assess the probabilities of decision-making and the degree of uncertainty. The author states the focus of the article as: "When analyzing situations which involve decisions to be made as between alternative courses of action under conditions of uncertainty, decision makers and their advisers are often called upon to assess judgmental probability distributions of quantities whose true values are unknown to them. How can this judgment be taught?" Moore provides five different exercises and even external references for those interested in further study of the topic.

Moore, P. G.

2009-04-08

69

Bernoulli's Principle  

ERIC Educational Resources Information Center

Some teachers have difficulty understanding Bernoulli's principle, particularly when the principle is applied to aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it altogether.

Hewitt, Paul G.

2004-01-01

70

Teaching Uncertainties  

ERIC Educational Resources Information Center

The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of

Duerdoth, Ian

2009-01-01

71

Bernoulli's Principle  

NSDL National Science Digital Library

Many physics teachers have an unclear understanding of Bernoulli's principle, particularly when the principle is applied to aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it altogether. The following simplified treatment of the principle ignores most of the complexities of aerodynamics and hopefully will encourage teachers to bring Bernoulli back into the classroom.

Hewitt, Paul G.

2004-09-01

72

Entropic uncertainty assisted by temporal memory  

E-print Network

The uncertainty principle brings out intrinsic quantum bounds on the precision of measuring non-commuting observables. Statistical outcomes in the measurement of incompatible observables reveal a trade-off on the sum of corresponding entropies. The Maassen-Uffink entropic uncertainty relation (Phys. Rev. Lett. 60, 1103 (1988)) constrains the sum of entropies associated with incompatible measurements. The entropic uncertainty principle in the presence of quantum memory (Nature Phys. 6, 659 (2010)) brought about a fascinating twist by showing that quantum side information, enabled by entanglement, helps in beating the uncertainty of non-commuting observables. Here we explore the interplay between temporal correlations and uncertainty. We show that, with the assistance of prior quantum temporal information obtained by sequential observations on the same quantum system at different times, the uncertainty bound on the entropies is reduced.

H. S. Karthik; A. R. Usha Devi; J. Prabhu Tej; A. K. Rajagopal

2013-10-18

73

Generalized Entropic Uncertainty Relations with Tsallis' Entropy  

NASA Technical Reports Server (NTRS)

A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
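
For orientation (definition only; the specific generalized measure constructed in the paper is not reproduced here), the Tsallis q-entropy that replaces the Shannon entropy in such formulations is

    $S_q(p) = \frac{1 - \sum_i p_i^{\,q}}{q - 1}$, for $q > 0$, $q \neq 1$,

which recovers the Shannon entropy in the limit $q \to 1$; the generalized uncertainty measures then bound suitable combinations of $S_q$ for the two incompatible observables, here phase and photon number.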

Portesi, M.; Plastino, A.

1996-01-01

74

Mach's Principle

NSDL National Science Digital Library

This page, from Kyoto University, provides a discussion of Mach's Principle, a concept that played an important role in forming Einstein's theory of general relativity. Excerpts from Mach's original text are examined and discussed for his ideas that are closely related to this principle. The general ambiguity of Mach's Principle, and Einstein's interpretations of it, are also presented.

Uchii, Soshichi

2007-10-10

75

Uncertainty analysis  

SciTech Connect

An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
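
For readers unfamiliar with the sampling scheme being compared against the adjoint method, here is a minimal Latin hypercube sampler in Python (an illustrative sketch, not the report's code; the function name is invented):

    # Minimal Latin hypercube sampler: one stratum per sample in each
    # dimension, with the strata shuffled independently per dimension.
    import numpy as np

    def latin_hypercube(n_samples, n_dims, seed=0):
        rng = np.random.default_rng(seed)
        # one uniform point inside each of the n_samples strata, per dimension
        u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
        for j in range(n_dims):
            u[:, j] = rng.permutation(u[:, j])  # decouple the dimensions
        return u  # points in [0, 1)^n_dims, exactly one per stratum per dimension

    print(latin_hypercube(10, 3))

The one-point-per-stratum property is what gives the method its space-filling behavior; as the abstract notes, recovering per-parameter uncertainty information from such samples still requires supplementary rank transformations and regression.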

Thomas, R.E.

1982-03-01

76

Uncertainty in the Classroom--Teaching Quantum Physics  

ERIC Educational Resources Information Center

The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how

Johansson, K. E.; Milstead, D.

2008-01-01

77

Assessor Training Measurement Uncertainty  

E-print Network

Slides from the 2009 NVLAP assessor training course on measurement uncertainty for calibration and testing laboratories.

78

Buridan's Principle  

NASA Astrophysics Data System (ADS)

Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

Lamport, Leslie

2012-08-01

79

Pascal's Principle  

NSDL National Science Digital Library

This site from HyperPhysics provides a description of Pascal's Principle, which explains how pressure is transmitted in an enclosed fluid. Drawings and sample calculations are provided. Examples illustrating the principle include a hydraulic press and an automobile hydraulic lift.

Nave, Carl R.

2011-11-28

80

The physical origins of the uncertainty theorem  

NASA Astrophysics Data System (ADS)

The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

Giese, Albrecht

2013-10-01

81

The uncertainties in estimating measurement uncertainties  

SciTech Connect

All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus an uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties.

Clark, J.P.; Shull, A.H.

1994-07-01

82

Rocket Principles  

NSDL National Science Digital Library

On this site from the NASA Glenn Research Center Learning Technologies Project, the science and history of rocketry is explained. Visitors will find out how rocket principles illustrate Newton's Laws of Motion. There is a second page of this site, Practical Rocketry, which discusses the workings of rockets, including propellants, engine thrust control, stability and control systems, and mass.

2008-07-29

83

Bernoulli's Principle  

NSDL National Science Digital Library

Bernoulli's principle relates the pressure of a fluid to its elevation and its speed. Bernoulli's equation can be used to approximate these parameters in water, air or any fluid that has very low viscosity. Students learn about the relationships between the components of the Bernoulli equation through real-life engineering examples and practice problems.

Integrated Teaching And Learning Program And Laboratory

84

Fine-grained uncertainty relation under the relativistic motion  

E-print Network

One of the most important features of quantum theory is the uncertainty principle. Among various uncertainty relations, the profound Fine-Grained Uncertainty Relation (FGUR) is used to distinguish the uncertainty inherent in obtaining any combination of outcomes for different measurements. In this paper, we explore this uncertainty relation in the relativistic regime. For an observer undergoing uniform acceleration, immersed in an Unruh thermal bath, we show that the uncertainty bound depends on the acceleration parameter and the choice of Unruh modes. Dramatically, we find that measurements in Mutually Unbiased Bases (MUBs), which share the same uncertainty bound in an inertial frame, can be distinguished from each other for a noninertial observer. On the other hand, once Unruh decoherence is prevented by utilizing a cavity, entanglement can be generated from nonuniform motion. We show that, for an observer restricted to a single rigid cavity, the uncertainty exhibits a periodic evolution with respect to the duration of the acceleration, and the uncertainty bounds can be degraded by the entanglement generated during particular epochs. With properly chosen cavity parameters, the uncertainty bounds can be protected. Otherwise, measurements in different MUBs can be distinguished due to the relativistic motion of the cavity. Implications of our results for gravitation and thermodynamics are discussed.

Feng, Jun; Gould, Mark D; Fan, Heng

2014-01-01

85

Radar principles  

NASA Technical Reports Server (NTRS)

Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.

Sato, Toru

1989-01-01

86

Hydrological model uncertainty assessment in southern Africa  

NASA Astrophysics Data System (ADS)

The importance of hydrological uncertainty analysis has been emphasized in recent years and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures used in the southern Africa region. The region is characterized by a paucity of accurate data and limited human resources, but the need for informed development decisions is critical to social and economic development. One of the main sources of uncertainty is related to the estimation of the parameters of hydrological models. This paper proposes a framework for establishing parameter values, exploring parameter inter-dependencies and setting parameter uncertainty bounds for a monthly time-step rainfall-runoff model (Pitman model) that is widely used in the region. The method is based on well-documented principles of sensitivity and uncertainty analysis, but recognizes the limitations that exist within the region (data scarcity and accuracy, model user attitudes, etc.). Four example applications taken from different climate and physiographic regions of South Africa illustrate that the methods are appropriate for generating behavioural stream flow simulations which include parameter uncertainty. The parameters that dominate the model response and their degree of uncertainty vary between regions. Some of the results suggest that the uncertainty bounds will be too wide for effective water resources decision making. Further work is required to reduce some of the subjectivity in the methods and to investigate other approaches for constraining the uncertainty. The paper recognizes that probability estimates of uncertainty and methods to include input climate data uncertainties need to be incorporated into the framework in the future.
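
As a schematic illustration of the sampling workflow described above (a hedged sketch: the study uses the monthly Pitman rainfall-runoff model, whereas run_model, the parameter bounds and the behavioural test here are invented stand-ins):

    # Sample parameters within bounds and retain the behavioural sets.
    import numpy as np

    rng = np.random.default_rng(1)
    bounds = {"ZMAX": (400.0, 1200.0), "ST": (100.0, 600.0)}  # illustrative names/ranges

    def run_model(params):
        # stand-in for a rainfall-runoff simulation returning one flow statistic
        return 0.01 * params["ZMAX"] + 0.02 * params["ST"]

    observed, tolerance = 14.0, 3.0
    behavioural = []
    for _ in range(5000):
        p = {k: rng.uniform(*b) for k, b in bounds.items()}
        if abs(run_model(p) - observed) < tolerance:  # behavioural test
            behavioural.append(p)
    print(f"{len(behavioural)} behavioural parameter sets retained")

The spread of the retained parameter sets then defines empirical uncertainty bounds, and the ensemble of their simulated flows gives the behavioural stream flow envelope.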

Hughes, D. A.; Kapangaziwiri, E.; Sawunyama, T.

2010-06-01

87

Universal uncertainty relations.  

PubMed

Uncertainty relations are a distinctive characteristic of quantum theory that impose intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring noncommuting observables. However, there is no fundamental reason for using entropies as quantifiers; any functional relation that characterizes the uncertainty of the measurement outcomes defines an uncertainty relation. Starting from a very reasonable assumption of invariance under mere relabeling of the measurement outcomes, we show that Schur-concave functions are the most general uncertainty quantifiers. We then discover a fine-grained uncertainty relation that is given in terms of the majorization order between two probability vectors, significantly extending a majorization-based uncertainty relation first introduced in M. H. Partovi, Phys. Rev. A 84, 052117 (2011). Such a vector-type uncertainty relation generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary uncertainty quantifiers. Our relation is therefore universal and captures the essence of uncertainty in quantum theory. PMID:24476234
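
Schematically (notation as commonly presented, not quoted from the paper), the vector-type relation takes the form

    $p(A) \otimes p(B) \prec \omega$,

where $p(A)$ and $p(B)$ are the outcome distributions of the two measurements and $\omega$ is a state-independent probability vector; since $x \prec y$ implies $F(x) \ge F(y)$ for any Schur-concave $F$, every such $F$ yields a scalar uncertainty relation $F(p(A) \otimes p(B)) \ge F(\omega)$.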

Friedland, Shmuel; Gheorghiu, Vlad; Gour, Gilad

2013-12-01

88

Universal Uncertainty Relations  

NASA Astrophysics Data System (ADS)

Uncertainty relations are a distinctive characteristic of quantum theory that impose intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring noncommuting observables. However, there is no fundamental reason for using entropies as quantifiers; any functional relation that characterizes the uncertainty of the measurement outcomes defines an uncertainty relation. Starting from a very reasonable assumption of invariance under mere relabeling of the measurement outcomes, we show that Schur-concave functions are the most general uncertainty quantifiers. We then discover a fine-grained uncertainty relation that is given in terms of the majorization order between two probability vectors, significantly extending a majorization-based uncertainty relation first introduced in M. H. Partovi, Phys. Rev. A 84, 052117 (2011). Such a vector-type uncertainty relation generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary uncertainty quantifiers. Our relation is therefore universal and captures the essence of uncertainty in quantum theory.

Friedland, Shmuel; Gheorghiu, Vlad; Gour, Gilad

2013-12-01

89

Uncertainties in Gapped Graphene  

E-print Network

Motivated by graphene-based quantum computers, we examine the time dependence of the position-momentum and position-velocity uncertainties in monolayer gapped graphene. The effect of the energy gap on the uncertainties is shown to appear via the Compton-like wavelength $\lambda_c$. The uncertainties in graphene are mainly contributed by two phenomena, spreading and zitterbewegung. While the former determines the uncertainties at long times, the latter imposes a highly oscillatory behavior on the uncertainties at short times. The uncertainties in graphene are compared with the corresponding values for the usual free Hamiltonian $\hat{H}_{free} = (p_1^2 + p_2^2) / 2 M$. It is shown that the uncertainties can be kept under control within quantum mechanical law if one can choose the gap parameter $\lambda_c$ freely.

Eylee Jung; Kwang S. Kim; DaeKil Park

2011-07-27

90

Data uncertainty traced to SI units. Results reported in the International System of Units  

Microsoft Academic Search

Remote sensor data reported as being traceable to the Système International d'Unités (SI) implies that certain principles are followed to evaluate the uncertainty with which the data are reported. Unless these principles are followed to evaluate the uncertainty, remote sensor results will continue to be misinterpreted frequently. The demand for higher-accuracy remote sensor data mandates improvements in the way data

D. B. Pollock; T. L. Murdock; R. U. Datla; A. Thompson

2003-01-01

91

Physics and Operational Research: measure of uncertainty via Nonlinear Programming  

NASA Astrophysics Data System (ADS)

Physics and Operational Research present an interdisciplinary interaction in problems such as arise in Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming) subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case of study, in a novel way of quantifying uncertainty.
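
A minimal instance of this kind of NLP formulation (an illustration of the method's flavor, not one of the author's case studies): estimate the harmonic oscillator ground-state energy by minimizing the energy subject to the equality uncertainty relation,

    minimize $E(\Delta x, \Delta p) = \frac{(\Delta p)^2}{2m} + \frac{1}{2} m \omega^2 (\Delta x)^2$ subject to $\Delta x\,\Delta p = \frac{\hbar}{2}$.

The Lagrangian (Kuhn-Tucker) stationarity conditions give $(\Delta x)^2 = \hbar/(2m\omega)$ and hence $E_{\min} = \hbar\omega/2$, which happens to be the exact quantum ground-state energy.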

Davizon-Castillo, Yasser A.

2008-03-01

92

Heisenberg Uncertainty Relation Revisited  

E-print Network

It is shown that all the known uncertainty relations are secondary consequences of Robertson's relation. The basic idea is to use the Heisenberg picture so that the time development of quantum mechanical operators incorporates the effects of the measurement interaction. A suitable use of triangle inequalities then gives rise to various forms of uncertainty relations. The assumptions of unbiased measurement and unbiased disturbance are important to simplify the resulting uncertainty relations and to give the familiar uncertainty relations, such as a naive Heisenberg error-disturbance relation. These simplified uncertainty relations are, however, valid only conditionally. Quite independently of uncertainty relations, it is shown that the notion of precise measurement is incompatible with the assumptions of unbiased measurement and unbiased disturbance. We can thus naturally understand the failure of the naive Heisenberg error-disturbance relation, as was demonstrated by the recent spin measurement of J. Erhart et al.

Kazuo Fujikawa

2013-12-29

93

MANAGING UNCERTAINTY IN ENGINEERING DESIGN USING IMPRECISE PROBABILITIES AND PRINCIPLES  

E-print Network


94

Phase-space noncommutative formulation of Ozawa's uncertainty principle  

NASA Astrophysics Data System (ADS)

Ozawa's measurement-disturbance relation is generalized to a phase-space noncommutative extension of quantum mechanics. It is shown that the measurement-disturbance relations have additional terms for backaction evading quadrature amplifiers and for noiseless quadrature transducers. Several distinctive features appear as a consequence of the noncommutative extension: measurement interactions which are noiseless, and observables which are undisturbed by a measurement, or of independent intervention in ordinary quantum mechanics, may acquire noise, become disturbed by the measurement, or no longer be an independent intervention in noncommutative quantum mechanics. It is also found that there can be states which violate Ozawa's universal noise-disturbance trade-off relation, but verify its noncommutative deformation.
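
For reference, the commutative baseline being deformed is Ozawa's universal error-disturbance relation,

    $\epsilon(A)\,\eta(B) + \epsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \ge \frac{1}{2}\,|\langle[\hat{A},\hat{B}]\rangle|$,

where $\epsilon(A)$ is the root-mean-square error of the $A$ measurement, $\eta(B)$ the disturbance it causes on $B$, and $\sigma$ denotes standard deviations in the pre-measurement state; the noncommutative extension studied in the paper adds further terms to this trade-off.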

Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

2014-08-01

95

The Precautionary Principle in Environmental Science  

Microsoft Academic Search

Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of

David Kriebel; Joel Tickner; Paul Epstein; John Lemons; Richard Levins; Edward L. Loechler; Margaret Quinn; Ruthann Rudel; Ted Schettler; Michael Stoto

96

Information-Disturbance theorem and Uncertainty Relation  

E-print Network

It has been shown that the Information-Disturbance theorem can play an important role in security proofs of quantum cryptography. The theorem is interesting in itself since it can be regarded as an information-theoretic version of the uncertainty principle. Until now, however, it has been able to treat only restricted situations. In this paper, the restriction on the source is abandoned, and a general information-disturbance theorem is obtained. The theorem relates the information gain by Eve to the information gain by Bob.

Takayuki Miyadera; Hideki Imai

2007-07-31

97

MOUSE UNCERTAINTY ANALYSIS SYSTEM  

EPA Science Inventory

The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

98

Equivalence principles and electromagnetism  

NASA Technical Reports Server (NTRS)

The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

Ni, W.-T.

1977-01-01

99

Archimedes' Principle, Pascal's Law and Bernoulli's Principle  

NSDL National Science Digital Library

Students are introduced to Pascal's law, Archimedes' principle and Bernoulli's principle. Fundamental definitions, equations, practice problems and engineering applications are supplied. A PowerPoint® presentation, practice problems and grading rubric are provided.

National Science Foundation GK-12 and Research Experience for Teachers (RET) Programs,

100

Role of the precautionary principle in water recycling  

Microsoft Academic Search

In an engineering context the precautionary principle is often perceived as an excuse to do nothing or a substantial barrier to technical progress. The precautionary principle requires that remedial measures be taken in situations of scientific uncertainty where evidence of harm cannot be proven but potential damage to human or environmental health is significant. In this paper the scope of

A. I. Schäfer; S. Beder

2006-01-01

101

Applying the precautionary principle to the issue of impacts by pet cats on urban wildlife  

Microsoft Academic Search

Despite evidence that pet cats prey on urban wildlife and may transmit disease, there is uncertainty over whether they cause declines in wildlife populations. The uncertainty fosters disagreement about whether and how pet cats should be managed, and hampers the implementation of regulations. We suggest that the precautionary principle could be used in this context. The principle mandates action to

Michael C. Calver; Jacky Grayson; Maggie Lilith; Christopher R. Dickman

2011-01-01

102

Fine-grained uncertainty relation under the relativistic motion  

E-print Network

One of the most important features of quantum theory is the uncertainty principle. Among various uncertainty relations, the profound Fine-Grained Uncertainty Relation (FGUR) is used to distinguish the uncertainty inherent in obtaining any combination of outcomes for different measurements. In this paper, we explore this uncertainty relation in the relativistic regime. For an observer undergoing uniform acceleration, immersed in an Unruh thermal bath, we show that the uncertainty bound depends on the acceleration parameter and the choice of Unruh modes. Dramatically, we find that measurements in Mutually Unbiased Bases (MUBs), which share the same uncertainty bound in an inertial frame, can be distinguished from each other for a noninertial observer. On the other hand, once Unruh decoherence is prevented by utilizing a cavity, entanglement can be generated from nonuniform motion. We show that, for an observer restricted to a single rigid cavity, the uncertainty exhibits a periodic evolution with respect to the duration of the acceleration, and the uncertainty bounds can be degraded by the entanglement generated during particular epochs. With properly chosen cavity parameters, the uncertainty bounds can be protected. Otherwise, measurements in different MUBs can be distinguished due to the relativistic motion of the cavity. Implications of our results for gravitation and thermodynamics are discussed.

Jun Feng; Yao-Zhong Zhang; Mark D. Gould; Heng Fan

2014-03-03

103

Computation of LFT Uncertainty Bounds with Repeated Parametric Uncertainties  

NASA Technical Reports Server (NTRS)

A new methodology in which linear fractional transformation uncertainty bounds are directly constructed for use in robust control design and analysis is proposed. Existence conditions for model validating solutions with or without repeated scalar uncertainty are given. The approach is based on minimax formulation to deal with multiple non-repeated structured uncertainty components subject to fixed levels of repeated scalar uncertainties. Input directional dependence and variations with different experiments are addressed by maximizing uncertainty levels over multiple experimental data sets. Preliminary results show that reasonable uncertainty bounds on structured non-repeated uncertainties can be identified directly from measurement data by assuming reasonable levels of repeated scalar uncertainties.

Lim, K. B.; Giesy, D. P.

1997-01-01

104

Evaluating prediction uncertainty  

SciTech Connect

The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
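
A crude version of such an importance indicator (a binning-based sketch of Var[E(Y|Xi)]/Var[Y]; the report's estimator uses replicated Latin hypercube sampling and variance ratios, which are not reproduced here):

    # Importance of input x for output y as Var[E(y|x)] / Var[y],
    # estimated by stratifying x into quantile bins.
    import numpy as np

    def importance(x, y, bins=20):
        edges = np.quantile(x, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
        counts = np.bincount(idx, minlength=bins)
        means = np.array([y[idx == b].mean() if counts[b] else 0.0 for b in range(bins)])
        between = np.sum(counts * (means - y.mean()) ** 2) / len(y)
        return between / y.var()

    rng = np.random.default_rng(2)
    x1, x2 = rng.random(10_000), rng.random(10_000)
    y = 5 * x1 + 0.5 * x2 + rng.normal(0, 0.1, 10_000)  # x1 dominates y
    print(importance(x1, y), importance(x2, y))         # roughly 0.98 vs 0.01

An indicator near one flags the input as a dominant cause of prediction uncertainty; note that, as the abstract stresses, this kind of comparison does not assume a linear relation between model output and inputs.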

McKay, M.D. [Los Alamos National Lab., NM (United States)

1995-03-01

105

Uncertainty and Climate Change  

Microsoft Academic Search

Uncertainty is pervasive in analysis of climate change. How should economists allow for this? And how have they allowed for it? This paper reviews both of these questions. Copyright Kluwer Academic Publishers 2002

Geoffrey M. Heal; Bengt Kriström

2002-01-01

106

Communicating scientific uncertainty.  

PubMed

All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

Fischhoff, Baruch; Davis, Alex L

2014-09-16

107

Communicating scientific uncertainty  

PubMed Central

All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

Fischhoff, Baruch; Davis, Alex L.

2014-01-01

108

Living With Radical Uncertainty. The Exemplary case of Folding Protein  

E-print Network

Laplace's demon still makes a strong impact on contemporary science, in spite of the fact that the results of mathematical logic, the advent of quantum physics and, more recently, complexity science have pointed out the crucial role of uncertainty in descriptions of the World. We focus here on the typical problem of protein folding as an example of uncertainty and radical emergence, and as a guide to the "simple" principles for studying complex systems.

Ignazio Licata

2010-04-21

109

Principles and Methods Chromatography  

E-print Network

A catalogue of chromatography handbooks: Affinity Chromatography Principles and Methods (18-1022-29); Antibody Purification (1142-75); Protein Purification Handbook (18-1132-29); Ion Exchange Chromatography Principles and Methods (18-1114-21); Hydrophobic Interaction Chromatography Principles and Methods.

Lebendiker, Mario

110

Uncertainty propagation for quality assurance in Reinforcement Learning  

Microsoft Academic Search

In this paper we address the reliability of policies derived by Reinforcement Learning on a limited amount of observations. This can be done in a principled manner by taking into account the derived Q-function's uncertainty, which stems from the uncertainty of the estimators used for the MDP's transition probabilities and the reward function. We apply uncertainty propagation in parallel to the Bellman iteration and achieve confidence intervals for the Q-function. In a second step we change the Bellman operator so as to achieve a
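
A tabular sketch of the idea (a first-order variance propagation in the spirit of the abstract, with invented sizes and estimator variances; it is not the authors' exact scheme):

    # Propagate estimator variances of P and R through the Bellman iteration
    # to obtain per-state-action uncertainties of Q (first-order sketch).
    import numpy as np

    nS, nA, gamma = 3, 2, 0.9
    rng = np.random.default_rng(3)
    P = rng.dirichlet(np.ones(nS), size=(nS, nA))  # estimated transition model
    R = rng.random((nS, nA))                       # estimated rewards
    var_P = 0.01 * np.ones((nS, nA, nS))           # assumed estimator variances
    var_R = 0.05 * np.ones((nS, nA))

    Q, var_Q = np.zeros((nS, nA)), np.zeros((nS, nA))
    for _ in range(200):
        a_star = Q.argmax(axis=1)                  # greedy action per state
        V = Q[np.arange(nS), a_star]
        var_V = var_Q[np.arange(nS), a_star]
        Q = R + gamma * P @ V
        # Var[Q] ~ Var[R] + gamma^2 * (P^2 @ Var[V] + Var[P] @ V^2)
        var_Q = var_R + gamma**2 * (P**2 @ var_V + var_P @ V**2)

    print(Q - 2 * np.sqrt(var_Q))  # pessimistic Q-values for quality assurance

A guaranteed-performance policy can then be selected on such a lower confidence bound rather than on the point estimate of Q.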

Daniel Schneegass; Steffen Udluft; Thomas Martinetz

2008-01-01

111

Picture independent quantum action principle  

SciTech Connect

The Schwinger action principle for quantum mechanics is extended into a picture independent form. This displays the quantum connection. Time variations are formulated as variations of a time variable and included into the kinematical variations. Kets and bras represent experimental operations. Experimental operations at different times cannot be identified. The ket and the bra spaces are fiber bundles over time. The same applies to the classical configuration space. For the classical action principle the action can be varied by changing the path or the classical variables. The latter variation of classical functions corresponds to kinematical variations of quantum variables. The picture independent formulation represents time evolution by a connection. A standard experiment is represented by a ket, a connection and a bra. For particular start and end times of experiments, the action and the contraction into a transition amplitude are elements of a new tensor space of quantum correspondents of path functionals. The classical correspondent of the transition amplitude is the probability for a specified state to evolve along a particular path segment. The elements of the dual tensor space represent standard experiments or superpositions thereof. The kinematical variations of the quantum variables are commuting numbers. Variations that include the effect of Poincare or gauge transformations have different commutator properties. The Schwinger action principle is derived from the Feynman path integral formulation. The limitations from the time-energy uncertainty relation might be accommodated by superposing experiments that differ in their start- and end-times. In its picture independent form the action principle can be applied to all superpositions of standard experiments. This may involve superpositions of different connections. The extension of the superposition principle to connections allows representation of a quantum field by a part of the connection.

Mantke, W.J.

1992-01-01

112

Computation of LFT uncertainty bounds with repeated parametric uncertainties  

Microsoft Academic Search

A methodology in which linear fractional transformation uncertainty bounds are directly constructed for use in robust control design and analysis is proposed. Existence conditions for model validating solutions with or without repeated scalar uncertainty are given. The approach is based on minimax formulation to deal with multiple non-repeated structured uncertainty components subject to fixed levels of repeated scalar uncertainties. Input

K. B. Lim; D. P. Giesyt

1998-01-01

113

Visualization of Uncertainty  

NASA Astrophysics Data System (ADS)

The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by that density field. The problem thus devolves into a minimization. Computing such a spatial decomposition is O(N²), and it can be computed iteratively, making it possible to update the tessellation easily over time and to speed up the computation. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartogram familiar to the information visualization community in the depiction of quantities such as voting results per state. Furthermore, one can dispense with the mesh or edges entirely, replacing them with symbols or glyphs at the generating points (effectively the centers of the polygons). This methodology readily admits rigorous statistical analysis using standard components found in R and is thus entirely compatible with the visualization packages we use (VisIt and/or ParaView), the language we use (Python) and the UVCDAT environment that provides the programmer and analyst workbench. We will demonstrate the power and effectiveness of this methodology in climate studies. We will further argue that our method of defining (or predicting) values in a region has many advantages over the traditional visualization notion of value at a point.
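
One cheap way to approximate such an uncertainty-equalizing tessellation with off-the-shelf tools (a weighted k-means, i.e. centroidal-Voronoi-style, approximation; this is not the authors' weighted Voronoi algorithm):

    # Cells shrink where the uncertainty density is high.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    pts = rng.random((20_000, 2))                  # 2-D sample locations
    # invented uncertainty density: a hot spot near (0.7, 0.3)
    w = np.exp(-40 * ((pts[:, 0] - 0.7) ** 2 + (pts[:, 1] - 0.3) ** 2)) + 0.05

    km = KMeans(n_clusters=64, n_init=4, random_state=0).fit(pts, sample_weight=w)
    # the cluster centers act as cell generators; cells inside the hot spot
    # come out smaller, i.e. concentrated uncertainty reads as small cells
    per_cell = np.bincount(km.labels_, weights=w, minlength=64)
    print(per_cell.std() / per_cell.mean())        # rough equalization check

The equalization is only approximate, since k-means minimizes weighted squared distance rather than exactly equalizing cell content, but it reproduces the qualitative reading rule: the smaller the cell, the higher the local uncertainty.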

Jones, P. W.; Strelitz, R. A.

2012-12-01

114

Uncertainties in risk tolerability  

SciTech Connect

The management of risk is now recognized as central to the effective and efficient operation of industry and commerce and is widely practiced. Risk Management has economic, political and human dimensions, which in all cases involve pivotal judgments relating to the acceptability or (as appropriate) tolerability of the criteria which underpin the executive decisions and actions in the risk management process. How robust are the techniques used to arrive at such judgments? And how can existing variations in tolerability criteria be explained or justified? The developing methodologies contain many uncertainties (for example, selection of failure cases from a range of possibilities; failure possibilities in each case; scale of modeling and consequence uncertainties; model validation; parameter values of the models used; uncertainties in enhancing and mitigating factors). How far do these uncertainties affect the validity of risk management decisions? And how sensitive are these decisions to aspects of uncertainty? How far do the influences affecting public perception of the type, nature and magnitude of any risks affect the nature of risk management? (For example, issues such as voluntary vs involuntary exposure; natural vs man-made risks, perceptions of personal control, familiarity, perceptions of benefit or disbenefit, the nature of the hazard, the nature of the threat, the special vulnerability of sensitive groups, public perceptions of comparators, reversibility of effects, all may be felt to influence significantly the decision making process.) Expression and communication of risk (particularly methods of calculating and expressing societal risk) may compound any problems.

Cassidy, K.

1995-12-31

115

Site uncertainty, allocation uncertainty, and superfund liability valuation  

Microsoft Academic Search

The amount and timing of a firm's ultimate financial obligation for contingent liabilities is uncertain and subject to the outcome of future events. We decompose uncertainty about Superfund contingent liabilities into two sources: (1) uncertainty regarding site clean-up cost (site uncertainty); and (2) uncertainty regarding allocation of total site-clean-up cost across multiple parties associated with the site (allocation uncertainty). We

Katherine Campbell; Stephan E. Sefcik; Naomi S. Soderstrom

1998-01-01

116

Uncertainty and Labor Contract Durations  

Microsoft Academic Search

This paper provides an empirical investigation into the relationship between ex ante U.S. labor contract durations and uncertainty over the period 1970 to 1995. We construct measures of inflation uncertainty as well as aggregate nominal and real uncertainty. The results not only corroborate previous findings of an inverse relationship between contract duration and inflation uncertainty, but document that this relationship

Robert Rich; Joseph Tracy

2000-01-01

117

Mobility Reduces Uncertainty in MANETs  

Microsoft Academic Search

Evaluating and quantifying trust stimulates collaboration in mobile ad hoc networks (MANETs). Many existing reputation systems sharply divide the trust value into right or wrong, thus ignoring another core dimension of trust: uncertainty. As uncertainty deeply impacts a node's anticipation of others' behavior and decisions during interaction, we include uncertainty in the reputation system. Specifically, we use an uncertainty

Feng Li; Jie Wu

2007-01-01

118

Uncertainty and calibration analysis  

SciTech Connect

All measurements contain some deviation from the true value which is being measured. In the common vernacular this deviation between the true value and the measured value is called an inaccuracy, an error, or a mistake. Since all measurements contain errors, it is necessary to accept that there is a limit to how accurate a measurement can be. The uncertainty interval, combined with the confidence level, is one measure of the accuracy of a measurement or value. Without a statement of uncertainty (or a similar parameter) it is not possible to evaluate whether the accuracy of the measurement, or data, is appropriate. The preparation of technical reports, calibration evaluations, and design calculations should consider the accuracy of the measurements and data being used. There are many methods to accomplish this. This report provides a consistent method for the handling of measurement tolerances, calibration evaluations and uncertainty calculations. The SRS Quality Assurance (QA) Program requires that the uncertainty of technical data and instrument calibrations be acknowledged and estimated. The QA Program makes some specific technical requirements related to the subject but does not provide a philosophy or method on how uncertainty should be estimated. This report was prepared to provide a technical basis to support the calculation of uncertainties and the calibration of measurement and test equipment for any activity within the Experimental Thermal-Hydraulics (ETH) Group. The methods proposed in this report provide a graded approach for estimating the uncertainty of measurements, data, and calibrations. The method is based on the national consensus standard, ANSI/ASME PTC 19.1.

Coutts, D.A.

1991-03-01

119

A Certain Uncertainty  

NASA Astrophysics Data System (ADS)

1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

Silverman, Mark P.

2014-07-01

120

Asymmetric Uncertainty Expression for High Gradient Aerodynamics  

NASA Technical Reports Server (NTRS)

When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
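
One simple way to realize such a dispersion within asymmetric bounds (a split-normal draw, offered as a hedged sketch; the paper's exact dispersion scheme may differ):

    # Disperse a nominal value within asymmetric uncertainty bounds using a
    # split-normal: the side is chosen with probability proportional to its
    # width, which keeps the density continuous at the nominal value.
    import numpy as np

    def sample_asymmetric(nominal, sigma_minus, sigma_plus, size, rng):
        side = rng.random(size) < sigma_plus / (sigma_minus + sigma_plus)
        dev = np.abs(rng.normal(0.0, 1.0, size))  # half-normal deviate
        return nominal + np.where(side, sigma_plus * dev, -sigma_minus * dev)

    rng = np.random.default_rng(5)
    # e.g. a force coefficient near a shock with -0.05/+0.15 bounds (invented)
    samples = sample_asymmetric(1.20, 0.05, 0.15, 100_000, rng)
    print(samples.mean(), np.percentile(samples, [16, 84]))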

Pinier, Jeremy T

2012-01-01

121

[Stereotactic body radiation therapy: uncertainties and margins].  

PubMed

The principles governing stereotactic body radiation therapy are tight margins and large dose gradients around targets. Every step of treatment preparation and delivery must be evaluated before applying this technique in the clinic. Uncertainties remain in each of these steps: delineation, prescription with the biological equivalent dose, treatment planning, patient set-up taking into account movements, and the machine's accuracy. The calculation of margins to take uncertainties into account differs from conventional radiotherapy because of the delivery of few fractions and large dose gradients around the target. The quest for high accuracy is complicated by the difficulty of reaching it and the lack of consensus regarding the prescription. Many dose/number-of-fractions schemes are described in clinical studies, and there are differences in the way the delivered doses are described. While waiting for the ICRU report dedicated to this technique, it seems desirable to use the quantities proposed in ICRU Report 83 (IMRT) to report the dose distribution. PMID:25023588

Lacornerie, T; Marchesi, V; Reynaert, N

2014-01-01

122

Simple Resonance Hierarchy for Surmounting Quantum Uncertainty  

SciTech Connect

For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, M_4 ± C_4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

Amoroso, Richard L. [Noetic Advanced Studies Institute, Oakland, CA 94610-1422 (United States)

2010-12-22

123

Uncertainty Analysis Economic Evaluations  

E-print Network

Uncertainty Analysis in Economic Evaluations. Oras Ibrahim, Master Thesis, June 2005, Free University. The decision to invest is based on the evaluation of the project profitability. But how certain is the calculated profitability? What if the costs overrun during implementation of the project? What

Bhulai, Sandjai

124

Coping with Uncertainty.  

ERIC Educational Resources Information Center

Draws conclusions on the scientific uncertainty surrounding most chemical use regulatory decisions, examining the evolution of law and science, benefit analysis, and improving information. Suggests: (1) rapid development of knowledge of chemical risks and (2) a regulatory system which is flexible to new scientific knowledge. (DH)

Wargo, John

1985-01-01

125

Chemical Principles Exemplified

ERIC Educational Resources Information Center

Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

Plumb, Robert C.

1973-01-01

126

Pauli's Exclusion Principle  

NASA Astrophysics Data System (ADS)

1. The exclusion principle: a philosophical overview; 2. The origins of the exclusion principle: an extremely natural prescriptive rule; 3. From the old quantum theory to the new quantum theory: reconsidering Kuhn's incommensurability; 4. How Pauli's rule became the exclusion principle: from the Fermi-Dirac statistics to the spin-statistics theorem; 5. The exclusion principle opens up new avenues: from the eightfold way to quantum chromodynamics.

Massimi, Michela

2012-10-01

127

Chemical Principles Exemplified  

ERIC Educational Resources Information Center

This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that

Plumb, Robert C.

1970-01-01

128

Reconsidering Archimedes' Principle  

Microsoft Academic Search

Archimedes' principle as stated originally by Archimedes and in modern texts can lead to an incorrect prediction if the submerged object is in contact with a solid surface. In this paper we look experimentally at a submerged object and show that though the theoretical explanations of the principle are valid, the statement of the principle needs clarification.

Jeffrey Bierman; Eric Kincanon

2003-01-01

129

How to calculate lead concentration and concentration uncertainty in XRF in vivo bone lead analysis.  

PubMed

The authors provide a substantial correction for calculating estimates of lead concentration and uncertainty for in vivo X-ray fluorescent bone analysis with a Cd-109 source. Based on general principles, they provide mathematical techniques for the propagation of uncertainties in XRF analysis. They give additional considerations for lowering the detection limit and improving spectral data quality. PMID:11761103

Kondrashov, V S; Rothenberg, S J

2001-12-01
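
The propagation step the abstract refers to can be illustrated generically. The sketch below assumes a simple model, concentration = net counts / calibration factor, with independent inputs; it is not the authors' corrected formulas, and all names and numbers are hypothetical.

    import math

    def concentration_uncertainty(counts, counts_u, calib, calib_u):
        """First-order propagation for C = counts / calib: relative
        variances of independent inputs add in quadrature."""
        c = counts / calib
        rel_u = math.sqrt((counts_u / counts) ** 2 + (calib_u / calib) ** 2)
        return c, c * rel_u

    # Illustrative numbers: net peak counts and a calibration factor
    # (counts per ppm), each with its own standard uncertainty.
    c, u_c = concentration_uncertainty(5200.0, 140.0, 260.0, 6.5)
    print(f"C = {c:.1f} ppm +/- {u_c:.1f} ppm")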

130

Deriving Acquisition Principles from Tutoring Principles  

Microsoft Academic Search

This paper describes our analysis of the literature on tutorial dialogues and presents a compilation of useful principles that students and teachers typically follow in making tutoring interactions successful. The compilation is done in the context of making use of those principles in building knowledge acquisition interfaces, since acquisition interfaces can be seen as students acquiring knowledge from the user.

Jihie Kim; Yolanda Gil

2002-01-01

131

PARTICIPATION UNDER UNCERTAINTY  

Microsoft Academic Search

This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate various theories of social complexity are for policy studies. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an

Moses A. Boudourides

132

Calibration Under Uncertainty.  

SciTech Connect

This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.

Swiler, Laura Painton; Trucano, Timothy Guy

2005-03-01
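
For contrast with CUU, the deterministic formulation the report critiques can be sketched in a few lines: calibration as plain least squares, with the model treated as the "true" representation and no model-error term. Data and model here are synthetic stand-ins.

    import numpy as np

    def model(theta, x):
        """Stand-in 'computer model': a line with slope and intercept."""
        return theta[0] * x + theta[1]

    x = np.linspace(0.0, 1.0, 20)
    y_obs = 2.0 * x + 0.5 + np.random.normal(0.0, 0.05, x.size)  # noisy data

    # Deterministic calibration: closed-form linear least squares via
    # the design matrix. CUU would instead place a posterior over theta.
    A = np.column_stack([x, np.ones_like(x)])
    theta_hat, *_ = np.linalg.lstsq(A, y_obs, rcond=None)
    print("calibrated parameters:", theta_hat)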

133

Essays on pricing under uncertainty  

E-print Network

This dissertation analyzes pricing under uncertainty focusing on the U.S. airline industry. It sets out to test theories of price dispersion driven by uncertainty in the demand by taking advantage of very detailed information about the dynamics...

Escobari Urday, Diego Alfonso

2008-10-10

134

Essays on uncertainty in economics  

E-print Network

This thesis consists of four essays about "uncertainty" and how markets deal with it. Uncertainty is about subjective beliefs, and thus it often comes with heterogeneous beliefs that may be present temporarily or even ...

Simsek, Alp

2010-01-01

135

Visualizing Uncertainty About the Future  

Microsoft Academic Search

We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in

David Spiegelhalter; Mike Pearson; Ian Short

2011-01-01

136

Calculating efficiencies and their uncertainties  

SciTech Connect

The commonly used methods for the calculation of the statistical uncertainties in cut efficiencies ("Poisson" and "binomial" errors) are both defective, as is seen in extreme cases. A method for the calculation of uncertainties based upon Bayes' Theorem is presented; this method has no problem with extreme cases. A program for the calculation of such uncertainties is also available.

Paterno, Marc; /Fermilab

2004-12-01
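
The standard Bayes'-theorem treatment of a cut efficiency, assuming a flat prior, gives a Beta posterior that remains well behaved in the extreme cases mentioned above. The sketch below follows that standard construction and is not necessarily identical to the program the abstract describes.

    from scipy.stats import beta

    def efficiency_interval(k, n, cl=0.683):
        """Flat prior + binomial likelihood => posterior Beta(k+1, n-k+1).
        Returns the posterior mean and a central credible interval; works
        even for k = 0 or k = n, where the naive error formulas fail."""
        post = beta(k + 1, n - k + 1)
        lo, hi = post.ppf((1 - cl) / 2), post.ppf((1 + cl) / 2)
        return post.mean(), lo, hi

    print(efficiency_interval(0, 10))    # extreme case: nothing passes the cut
    print(efficiency_interval(95, 100))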

137

Quantum Mechanics and the Principle of Least Radix Economy  

E-print Network

It is shown that action naturally leads to the definition of a base (radix) in which physical quantities are most efficiently expressed. This leads to the formulation of a new variational method which includes and generalizes the least action principle, unifying classical and quantum physics. The Schrödinger and Klein-Gordon equations and the Heisenberg uncertainty relationships are derived from this principle, and the breaking of the commutativity of spacetime geometry is elucidated.

Garcia-Morales, Vladimir

2014-01-01

138

The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.  

PubMed

The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof based on the "cause-effect" model has been produced. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges: the case of "mad cow" disease, and the case of the production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health. PMID:14585517

Antonopoulou, Lila; van Meurs, Philip

2003-11-01

139

Satellite altitude determination uncertainties  

NASA Technical Reports Server (NTRS)

Satellite altitude determination uncertainties will be discussed from the standpoint of the GEOS-C satellite and from the longer-range viewpoint afforded by the Geopause concept. Attention is focused on methods for short-arc tracking which are essentially geometric in nature. One method uses combinations of lasers and collocated cameras. The other method relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

Siry, J. W.

1972-01-01

140

Picturing Data With Uncertainty  

NASA Technical Reports Server (NTRS)

NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary for multimodal distributions. The uncertainty of the multi-valued data can also be interpreted from the number of peaks and the widths of the peaks, as shown by the PDF walls.

Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

2004-01-01
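
The per-cell computation described above can be sketched with a Gaussian kernel density estimate plus a count of local maxima as the "roughness". The data, grid, and kernel choice below are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.stats import gaussian_kde

    def pdf_and_roughness(samples, grid):
        """Smooth density estimate at one grid cell, plus 'roughness'
        (the number of peaks, i.e. local maxima of the estimated PDF)."""
        pdf = gaussian_kde(samples)(grid)
        peaks = np.sum((pdf[1:-1] > pdf[:-2]) & (pdf[1:-1] > pdf[2:]))
        return pdf, int(peaks)

    # Illustrative bimodal cell: two possible outcomes for one variable.
    rng = np.random.default_rng(0)
    cell = np.concatenate([rng.normal(280, 2, 500), rng.normal(290, 2, 500)])
    pdf, roughness = pdf_and_roughness(cell, np.linspace(270, 300, 200))
    print("roughness (number of peaks):", roughness)  # expect 2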

141

Implementing the Precautionary Principle: Incorporting Science, Technology, Fairness, and Accountability in Environmental, Health and Safety Decisions  

E-print Network

The precautionary principle is in sharp political focus today because (1) the nature of scientific uncertainty is changing and (2) there is increasing pressure to base governmental action on allegedly more "rational" ...

Ashford, Nicholas

2005-01-01

142

Probabilistic Mass Growth Uncertainties  

NASA Technical Reports Server (NTRS)

Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of a NASA on-going effort to publish mass growth datasheet for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper will also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

Plumer, Eric; Elliott, Darren

2013-01-01

143

Teaching Quantum Uncertainty

NASA Astrophysics Data System (ADS)

An earlier paper2 introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields.2 Both the Schroedinger field, or "matter field," and the EM field are made of "quanta": spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly, in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

Hobson, Art

2011-10-01

144

The maintenance of uncertainty  

NASA Astrophysics Data System (ADS)

Introduction; Preliminaries; State-space dynamics; Linearized dynamics of infinitesimal uncertainties; Instantaneous infinitesimal dynamics; Finite-time evolution of infinitesimal uncertainties; Lyapunov exponents and predictability; The Baker's apprentice map; Infinitesimals and predictability; Dimensions; The Grassberger-Procaccia algorithm; Towards a better estimate from Takens' estimators; Space-time-separation diagrams; Intrinsic limits to the analysis of geometry; Takens' theorem; The method of delays; Noise; Prediction, prophecy, and pontification; Introduction; Simulations, models and physics; Ground rules; Data-based models: dynamic reconstructions; Analogue prediction; Local prediction; Global prediction; Accountable forecasts of chaotic systems; Evaluating ensemble forecasts; The annulus; Prophecies; Aids for more reliable nonlinear analysis; Significant results: surrogate data, synthetic data and self-deception; Surrogate data and the bootstrap; Surrogate predictors: Is my model any good?; Hints for the evaluation of new techniques; Avoiding simple straw men; Feasibility tests for the identification of chaos; On detecting "tiny" data sets; Building models consistent with the observations; Cost functions; ε-shadowing: Is my model any good? (reprise); Casting infinitely long shadows (out-of-sample); Distinguishing model error and system sensitivity; Forecast error and model sensitivity; Accountability; Residual predictability; Deterministic or stochastic dynamics?; Using ensembles to distinguish the expectation from the expected; Numerical Weather Prediction; Probabilistic prediction with a deterministic model; The analysis; Constructing and interpreting ensembles; The outlook(s) for today; Conclusion; Summary

Smith, L. A.

145

Uncertainty Quantification in Lattice QCD Calculations for Nuclear Physics  

E-print Network

The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

Silas R. Beane; William Detmold; Kostas Orginos; Martin J. Savage

2014-10-11

147

Hamilton's Principle for Beginners  

ERIC Educational Resources Information Center

I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a

Brun, J. L.

2007-01-01

148

Assessment Principles and Tools  

PubMed Central

The goal of ophthalmology residency training is to produce competent ophthalmologists. Competence can only be determined by appropriately assessing resident performance. There are accepted guiding principles that should be applied to competence assessment methods. These principles are enumerated herein and ophthalmology-specific assessment tools that are available are described. PMID:24791100

Golnik, Karl C.

2014-01-01

149

Identifying Product Scaling Principles  

E-print Network

List of figures (excerpt): Figure 33: Original rear housing with axles and differential simplified to a solid rear axle design with incorporated brake and pulley, grasped through the application of the "simplify system" principle (Ezgo, 2011); Figure 34: Potential suspension and steering configurations derived by applying the "change method" principle; Figure 35: Golf cart brake and parking brake pedal...

Perez, Angel 1986-

2011-06-02

150

Principles of Paleogeography  

Microsoft Academic Search

The broad general principles of paleogeography, which I would cite as most fundamental, are as follows: 1. Ocean basins are permanent hollows of the earth's surface and have occupied their present sites since an early date in the development of geographic features. This principle does not exclude notable changes in the positions of their margins, which on the whole have

Bailey Willis

1910-01-01

151

The genetic difference principle.  

PubMed

In the newly emerging debates about genetics and justice three distinct principles have begun to emerge concerning what the distributive aim of genetic interventions should be. These principles are: genetic equality, a genetic decent minimum, and the genetic difference principle. In this paper, I examine the rationale of each of these principles and argue that genetic equality and a genetic decent minimum are ill-equipped to tackle what I call the currency problem and the problem of weight. The genetic difference principle is the most promising of the three principles and I develop this principle so that it takes seriously the concerns of just health care and distributive justice in general. Given the strains on public funds for other important social programmes, the costs of pursuing genetic interventions and the nature of genetic interventions, I conclude that a more lax interpretation of the genetic difference principle is appropriate. This interpretation stipulates that genetic inequalities should be arranged so that they are to the greatest reasonable benefit of the least advantaged. Such a proposal is consistent with prioritarianism and provides some practical guidance for non-ideal societies--that is, societies that do not have the endless amount of resources needed to satisfy every requirement of justice. PMID:15186680

Farrelly, Colin

2004-01-01

152

Buoyancy: Archimedes Principle  

NSDL National Science Digital Library

This applied mathematics lesson describes the mathematical principles behind buoyancy in aerostatic machines. In it, students are given an introduction to the forces at work in buoyancy, including Archimedes' Principle, and are asked to solve problems relating to volume, density, weight, and buoyancy of objects in particular environments.

Hodanbosi, Carol

1996-08-01

153

Pauli Exclusion Principle  

NSDL National Science Digital Library

This tutorial provides instruction on Pauli's exclusion principle, formulated by physicist Wolfgang Pauli in 1925, which states that no two electrons in an atom can have identical quantum numbers. Topics include a mathematical statement of the principle, descriptions of some of its applications, and its role in ionic and covalent bonding, nuclear shell structure, and nuclear binding energy.

Nave, Rod

154

Uncertainty in Seismic Hazard Assessment  

NASA Astrophysics Data System (ADS)

Uncertainty is a part of our life, and society has to deal with it, even though it is sometimes difficult to estimate. This is particularly true in seismic hazard assessment for large events, such as the mega-tsunami in Southeast Asia and the great New Madrid earthquakes in the central United States. There are two types of uncertainty in seismic hazard assessment: temporal and spatial. Temporal uncertainty describes distribution of the events in time and is estimated from the historical records, while spatial uncertainty describes distribution of physical measurements generated at a specific point by the events and is estimated from the measurements at the point. These uncertainties are of different characteristics and generally considered separately in hazard assessment. For example, temporal uncertainty (i.e., the probability of exceedance in a period) is considered separately from spatial uncertainty (a confidence level of physical measurement) in flood hazard assessment. Although estimating spatial uncertainty in seismic hazard assessment is difficult because there are not enough physical measurements (i.e., ground motions), it can be supplemented by numerical modeling. For example, the ground motion uncertainty or tsunami uncertainty at a point of interest has been estimated from numerical modeling. Estimating temporal uncertainty is particularly difficult, especially for large earthquakes, because there are not enough instrumental, historical, and geological records. Therefore, the temporal and spatial uncertainties in seismic hazard assessment are of different characteristics and should be determined separately. Probabilistic seismic hazard analysis (PSHA), the most widely used method to assess seismic hazard for various aspects of public and financial policy, uses spatial uncertainty (ground motion uncertainty) to extrapolate temporal uncertainty (ground motion occurrence), however. This extrapolation, or so-called ergodic assumption, is caused by a mathematical error in hazard calculation of PSHA: incorrectly equating the conditional exceedance probability of the ground-motion attenuation relationship (a function) to the exceedance probability of the ground-motion uncertainty (a variable). An alternative approach has been developed to correct the error and to determine temporal and spatial uncertainties separately.

Wang, Z.

2006-12-01
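
As a reference point for the temporal-uncertainty discussion, the standard Poisson model gives the probability of at least one event during an exposure window; the rate and horizon below are illustrative, and PSHA itself involves much more than this single step.

    import math

    def poisson_exceedance(rate_per_year, t_years):
        """Probability of at least one event in t years, assuming event
        occurrence is a Poisson process with a constant annual rate."""
        return 1.0 - math.exp(-rate_per_year * t_years)

    # Illustrative: a 500-year mean recurrence interval over a 50-year
    # design horizon.
    print(f"{poisson_exceedance(1.0 / 500.0, 50.0):.3f}")  # ~0.095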

155

Weak Equivalence Principle Test on a Sounding Rocket  

Microsoft Academic Search

SR-POEM, our principle of equivalence measurement on a sounding rocket, will compare the free fall rate of two substances, yielding an uncertainty of 10^-16 in the estimate of η. During the past two years, the design concept has matured and we have been working on the required technology, including a laser gauge that is self aligning and able to reach

James D. Phillips; Bijunath R. Patla; Eugeniu M. Popescu; Emanuele Rocco; Rajesh Thapa; Robert D. Reasenberg; Enrico C. Lorenzini

2010-01-01

156

Uncertainty quantification in molecular dynamics  

NASA Astrophysics Data System (ADS)

This dissertation focuses on uncertainty quantification (UQ) in molecular dynamics (MD) simulations. The application of UQ to molecular dynamics is motivated by the broad uncertainty characterizing MD potential functions and by the complexity of the MD setting, where even small uncertainties can be amplified to yield large uncertainties in the model predictions. Two fundamental, distinct sources of uncertainty are investigated in this work, namely parametric uncertainty and intrinsic noise. Intrinsic noise is inherently present in the MD setting, due to fluctuations originating from thermal effects. Averaging methods can be exploited to reduce the fluctuations, but due to finite sampling, this effect cannot be completely filtered, thus yielding a residual uncertainty in the MD predictions. Parametric uncertainty, on the contrary, is introduced in the form of uncertain potential parameters, geometry, and/or boundary conditions. We address the UQ problem in both its main components, namely the forward propagation, which aims at characterizing how uncertainty in model parameters affects selected observables, and the inverse problem, which involves the estimation of target model parameters based on a set of observations. The dissertation highlights the challenges arising when parametric uncertainty and intrinsic noise combine to yield non-deterministic, noisy MD predictions of target macroscale observables. Two key probabilistic UQ methods, namely Polynomial Chaos (PC) expansions and Bayesian inference, are exploited to develop a framework that enables one to isolate the impact of parametric uncertainty on the MD predictions and, at the same time, properly quantify the effect of the intrinsic noise. Systematic applications to a suite of problems of increasing complexity lead to the observation that an uncertain PC representation built via Bayesian regression is the most suitable model for the representation of uncertain MD predictions of target observables in the presence of intrinsic noise and parametric uncertainty. The dissertation is organized in successive, self-contained problems of increasing complexity aimed at investigating the target UQ challenge in a progressive fashion.

Rizzi, Francesco
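
A toy sketch of the forward-propagation problem the dissertation describes, using plain Monte Carlo rather than its polynomial chaos/Bayesian machinery: an uncertain parameter is pushed through a stand-in observable whose evaluation also carries intrinsic noise. The observable, parameter, and noise model are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    def observable(epsilon, noise_scale=0.02):
        """Stand-in for an MD-predicted macroscale quantity; the additive
        term mimics intrinsic noise from finite-time averaging."""
        return 3.5 * epsilon ** 0.5 + rng.normal(0.0, noise_scale)

    eps_samples = rng.normal(1.0, 0.1, 2000)   # parametric uncertainty
    outputs = np.array([observable(e) for e in eps_samples])
    print(f"mean = {outputs.mean():.3f}, std = {outputs.std():.3f}")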

157

Editorial: The Principle of Personhood: The Field's Transcendent Principle  

Microsoft Academic Search

Some of the names of these principles that come immediately to mind are such principles as person involvement, growth orientation, hope, self-determination and choice. My next problem was that I could not perfectly recall the definitions of the several principles that I could remember! If I could not remember all these important principles and their definitions, then how could these principles

William A. Anthony

158

Multi-band pyrometer uncertainty analysis and improvement  

NASA Astrophysics Data System (ADS)

According to the energy ratio of multi-band radiation from the measured surface, the 'true' temperature can be calculated by a multi-band pyrometer. A multi-band pyrometer has many advantages: it is hardly affected by the emissivity of the measured surface or by environmental radiation, and it has a higher signal-to-noise ratio and higher temperature measurement accuracy. This paper introduces the principle of a multi-band pyrometer, and the uncertainty of the measurement result is evaluated using the Monte Carlo Method (MCM). The result shows that the accuracy of the effective wavelength is the largest source of uncertainty, and the other main source is the reference temperature. When an ordinary continuous-temperature blackbody furnace is used to provide the reference temperature and calibrate the effective wavelength, the corresponding uncertainty components are 2.17 K and 2.48 K, respectively. The combined standard uncertainty is 3.30 K. A new calibration method is introduced: the effective wavelength is calibrated by a monochromator, and the reference temperature is provided by a fixed-point blackbody furnace. The uncertainty components are decreased to 0.73 K and 0.12 K, respectively, and the measurement uncertainty is decreased to 0.74 K. The temperature measurement accuracy is thus enhanced.

Yang, Yongjun; Zhang, Xuecong; Cai, Jing; Wang, Zhongyu

2011-05-01
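
The MCM procedure can be illustrated on a simplified two-band (ratio) pyrometer equation under the Wien approximation: sample the uncertain inputs, push them through the measurement model, and read off the spread of the output. Wavelengths, noise levels, and uncertainty magnitudes below are illustrative, not the authors' values.

    import numpy as np

    C2 = 1.4388e-2  # second radiation constant, m*K
    rng = np.random.default_rng(2)

    def two_color_temperature(ratio, lam1, lam2):
        """Wien-approximation two-band (ratio) pyrometer equation."""
        return C2 * (1.0 / lam2 - 1.0 / lam1) / (
            np.log(ratio) - 5.0 * np.log(lam2 / lam1))

    # Nominal operating point (illustrative numbers).
    lam1_0, lam2_0, T_true = 0.80e-6, 0.90e-6, 1500.0
    ratio_0 = np.exp(5.0 * np.log(lam2_0 / lam1_0)
                     - C2 / T_true * (1.0 / lam1_0 - 1.0 / lam2_0))

    N = 200_000
    lam1 = rng.normal(lam1_0, 0.5e-9, N)        # effective-wavelength uncertainty
    lam2 = rng.normal(lam2_0, 0.5e-9, N)
    ratio = ratio_0 * rng.normal(1.0, 1e-3, N)  # signal-ratio noise

    T = two_color_temperature(ratio, lam1, lam2)
    print(f"T = {T.mean():.1f} K, u(T) = {T.std():.2f} K")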

160

Analysis of Infiltration Uncertainty  

SciTech Connect

The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the "Total System Performance Assessment-License Application Methods and Approach" (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein; it is based on the use of the models developed in or for "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the simulated multi-rectangular region approximating the repository footprint, shown in Figure 1-1. (For brevity, these maps will be referred to as the analog maps, and the corresponding averaged net infiltration values as the analog values.)

R. McCurley

2003-10-27

161

How Uncertainty Bounds the Shape Index of Simple Cells  

PubMed Central

We propose a theoretical motivation to quantify actual physiological features, such as the shape index distributions measured by Jones and Palmer in cats and by Ringach in macaque monkeys. We will adopt the uncertainty principle associated with the task of detection of position and orientation as the main tool to provide quantitative bounds on the family of simple cells concretely implemented in primary visual cortex. Mathematics Subject Classification (2010): 62P10, 43A32, 81R15. PMID:24742044

2014-01-01

162

Pandemic influenza: certain uncertainties  

PubMed Central

For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, wave patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

Morens, David M.; Taubenberger, Jeffery K.

2011-01-01

163

Uncertainties in Arctic Precipitation  

NASA Astrophysics Data System (ADS)

Arctic precipitation is riddled with measurement biases, and addressing the problem is imperative. Our study focuses on comparing various datasets, analyzing their biases for the region of Siberia, and the caution that is needed when using them. Five sources of data were used, ranging from NOAA's product (raw, and with Bogdanova's correction) and Yang's correction technique to two reanalysis products (ERA-Interim and NCEP). The reanalysis datasets performed better for some months in comparison to Yang's product, which tends to overestimate precipitation, and the raw dataset, which tends to underestimate. The sources of bias vary from topography to wind to missing data. The final three products chosen show higher biases during the winter and spring seasons. Emphasis on equations which incorporate blizzards, blowing snow and higher wind speeds is necessary for regions which are influenced by any or all of these factors; Bogdanova's correction technique is the most robust of all the datasets analyzed and gives the most reasonable results. One of our future goals is to analyze the impact of precipitation uncertainties on water budget analysis for the Siberian rivers.

Majhi, I.; Alexeev, V. A.; Cherry, J. E.; Cohen, J. L.; Groisman, P. Y.

2012-12-01

164

Uncertainties in Arctic precipitation  

NASA Astrophysics Data System (ADS)

Precipitation is an essential and highly variable component of the freshwater budget, and solid precipitation in particular, has a major impact on the local and global climate. The impacts of snow on the surface energy balance are tremendous, as snow has a higher albedo than any other naturally occurring surface condition. Documenting the instrumentally observed precipitation climate records presents its own challenges since the stations themselves undergo many changes in the course of their operation. Though it is crucial to accurately measure precipitation as a means to predict change in future water budgets, estimates of long-term precipitation are riddled with measurement biases. Some of the challenges facing reliable measurement of solid precipitation include missing data, gage change, discontinued stations, trace precipitation, blizzards, wetting losses when emptying the gage, and evaporation between the time of event and the time of measurement. Rain measurements likewise face uncertainties such as splashing of rain out of the gage, evaporation, and extreme events, though the magnitude of these impacts on overall measurement is less than that faced by solid precipitation. In all, biases can be so significant that they present major problems for the use of precipitation data in climate studies.

Majhi, Ipshita; Alexeev, Vladimir; Cherry, Jessica; Groisman, Pasha; Cohen, Judah

2013-04-01

165

Conceptual Framework of the Precautionary Principle Approach in the Decision-making process  

Microsoft Academic Search

This article reviews the precautionary principle as an approach to addressing decisions with far-reaching environmental consequences under scientific uncertainty. The precautionary principle is intended to assist with structuring environmentally risky decisions toward sustainable development. It responds to the lack of scientific evidence of a potential environmental problem. There is currently no framework to assist with the process indicating the

Clare D'Souza; Mehdi Taghian; Australia Rajiv Khosla

166

Chemical Principles Exemplified  

ERIC Educational Resources Information Center

Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

Plumb, Robert C.

1972-01-01

167

Buoyancy and Archimedes Principle  

NSDL National Science Digital Library

Buoyancy is based on Archimedes' Principle, which states that the buoyant force acting upward on an object completely or partially immersed in a fluid equals the weight of the fluid displaced by the ...
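
A minimal numeric illustration of the stated principle (values are hypothetical):

    RHO_WATER = 1000.0  # kg/m^3
    G = 9.81            # m/s^2

    def buoyant_force(displaced_volume_m3, fluid_density=RHO_WATER):
        """Archimedes' principle: the buoyant force equals the weight
        of the displaced fluid."""
        return fluid_density * displaced_volume_m3 * G

    # A fully submerged 2-litre object in water displaces 0.002 m^3.
    print(f"{buoyant_force(0.002):.2f} N")  # ~19.6 N, directed upward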

168

Archimedes' Principle in Action  

ERIC Educational Resources Information Center

The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

Kires, Marian

2007-01-01

169

Uncertainty and Labor Contract Durations  

Microsoft Academic Search

This paper provides an empirical investigation into the relationship between uncertainty and ex ante U.S. labor contract durations over the period 1970 to 1995. Using a structural identification of aggregate demand and aggregate supply shocks, we find that desired contract durations are shorter during periods of heightened nominal or real uncertainty. This evidence supports the view that labor contract durations

Robert Rich; Joseph Tracy

2004-01-01

170

Hydrology, society, change and uncertainty  

NASA Astrophysics Data System (ADS)

Heraclitus, who declared that "panta rhei" (everything flows), also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial. Also, decision making under certainty is mostly trivial. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes the society apprehensive about the future, insecure and credulous to a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is compounded by a misconception in the scientific community confusing science with uncertainty elimination. However, recognizing that uncertainty is inevitable and tightly connected with change will help to appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

Koutsoyiannis, Demetris

2014-05-01

171

Housing Uncertainty and Childhood Impatience  

ERIC Educational Resources Information Center

The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that

Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

2011-01-01

172

Mystery Boxes: Uncertainty and Collaboration  

NSDL National Science Digital Library

This lesson teaches students that scientific knowledge is fundamentally uncertain. Students manipulate sealed mystery boxes and attempt to determine the inner structure of the boxes which contain a moving ball and a fixed barrier or two. The nature and sources of uncertainty inherent in the process of problem-solving are experienced. The uncertainty of the conclusions is reduced by student collaboration.

Beard, Jean

173

Principles of Forecasting Project  

NSDL National Science Digital Library

Directed by J. Scott Armstrong at the Wharton School of the University of Pennsylvania, the Principles of Forecasting Project seeks to "develop a comprehensive and structured review of the state of knowledge in the field of forecasting" in order to aid future research. The project will lead to a book entitled Principles of Forecasting: A Handbook for Researchers and Practitioners, and sample chapters, contact information, updates, and links to forecasting resources add value to this expanding compilation.

174

Archimedes' Principle and Applications Objectives  

E-print Network

Lab 9 Archimedes' Principle and Applications Objectives: Upon successful completion of this exercise you will have ... 1. ... utilized Archimedes' principle to determine the density and specific gravity of a variety of substances. 2. ... utilized Archimedes' principle to determine the density

Yu, Jaehoon

175

First Comes Love, Then Comes Google: An Investigation of Uncertainty Reduction Strategies and Self-Disclosure in Online Dating  

Microsoft Academic Search

This study investigates relationships between privacy concerns, uncertainty reduction behaviors, and self-disclosure among online dating participants, drawing on uncertainty reduction theory and the warranting principle. The authors propose a conceptual model integrating privacy concerns, self-efficacy, and Internet experience with uncertainty reduction strategies and amount of self-disclosure and then test this model on a nationwide sample of online dating participants (

Jennifer L. Gibbs; Nicole B. Ellison; Chih-Hui Lai

2011-01-01

176

The Bayesian brain: phantom percepts resolve sensory uncertainty.  

PubMed

Phantom perceptions arise almost universally in people who sustain sensory deafferentation, and in multiple sensory domains. The question arises why the brain creates these false percepts in the absence of an external stimulus. The model proposed answers this question by stating that our brain works in a Bayesian way, and that its main function is to reduce environmental uncertainty, based on the free-energy principle, which has been proposed as a universal principle governing adaptive brain function and structure. The Bayesian brain can be conceptualized as a probability machine that constantly makes predictions about the world and then updates them based on what it receives from the senses. The free-energy principle states that the brain must minimize its Shannonian free-energy, i.e. must reduce by the process of perception its uncertainty (its prediction errors) about its environment. As completely predictable stimuli do not reduce uncertainty, they are not worthy of conscious processing. Unpredictable things on the other hand are not to be ignored, because it is crucial to experience them to update our understanding of the environment. Deafferentation leads to topographically restricted prediction errors based on temporal or spatial incongruity. This leads to an increase in topographically restricted uncertainty, which should be adaptively addressed by plastic repair mechanisms in the respective sensory cortex or via (para)hippocampal involvement. Neuroanatomically, filling in as a compensation for missing information also activates the anterior cingulate and insula, areas also involved in salience, stress and essential for stimulus detection. Associated with sensory cortex hyperactivity and decreased inhibition or map plasticity, this will result in the perception of the false information created by the deafferented sensory areas, as a way to reduce the increased topographically restricted uncertainty associated with the deafferentation. In conclusion, the Bayesian updating of knowledge via active sensory exploration of the environment, driven by the Shannonian free-energy principle, provides an explanation for the generation of phantom percepts, as a way to reduce uncertainty, to make sense of the world. PMID:22516669

De Ridder, Dirk; Vanneste, Sven; Freeman, Walter

2014-07-01

177

Applying the four principles  

PubMed Central

Gillon is correct that the four principles provide a sound and useful way of analysing moral dilemmas. As he observes, the approach using these principles does not provide a unique solution to dilemmas. This can be illustrated by alternatives to Gillon's own analysis of the four case scenarios. In the first scenario, a different set of factual assumptions could yield a different conclusion about what is required by the principle of beneficence. In the second scenario, although Gillon's conclusion is correct, what is open to question is his claim that what society regards as the child's best interest determines what really is in the child's best interest. The third scenario shows how it may be reasonable for the principle of beneficence to take precedence over autonomy in certain circumstances, yet like the first scenario, the ethical conclusion relies on a set of empirical assumptions and predictions of what is likely to occur. The fourth scenario illustrates how one can draw different conclusions based on the importance given to the precautionary principle. PMID:14519836

Macklin, R

2003-01-01

178

Uncertainties in Atmospheric Neutrino Fluxes  

E-print Network

An evaluation of the principal uncertainties in the computation of neutrino fluxes produced in cosmic ray showers in the atmosphere is presented. The neutrino flux predictions are needed for comparison with experiment to perform neutrino oscillation studies. The paper concentrates on the main limitations which are due to hadron production uncertainties. It also treats primary cosmic ray flux uncertainties, which are at a lower level. The absolute neutrino fluxes are found to have errors of around 15% in the neutrino energy region important for contained events underground. Large cancellations of these errors occur when ratios of fluxes are considered.

G. D. Barr; T. K. Gaisser; S. Robbins; T. Stanev

2006-11-08

179

Teaching/learning principles  

NASA Technical Reports Server (NTRS)

The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

Hankins, D. B.; Wake, W. H.

1981-01-01

180

Non-scalar uncertainty: Uncertainty in dynamic systems  

NASA Technical Reports Server (NTRS)

The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology. Applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful for describing the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found to be wiser to get an approximate solution to an accurate model than to get a precise solution to a model constrained by simplifying assumptions. Precision has a very heavy cost in present physical models, but this formalism allows a trade between uncertainty and simplicity. It was found that modeling reality sometimes requires that state transition probabilities be manipulated as non-scalar quantities, finding at the end that there is always a transformation to get back to scalar probability.

Martinez, Salvador Gutierrez

1992-01-01

181

Policy Uncertainty and Household Savings  

E-print Network

Using German microdata and a quasi-natural experiment, we provide evidence on how households respond to an increase in uncertainty. We find that household saving increases significantly following the increase in political ...

Giavazzi, Francesco

182

Predicting System Performance with Uncertainty  

E-print Network

inexpensive way. We propose using Gaussian Processes for system performance predictions and explain the types of uncertainties included. As an example, we use a Gaussian Process to predict chilled water use and compare the results with Neural Network...

Yan, B.; Malkawi, A.

2012-01-01
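
The kind of prediction-with-uncertainty described above can be sketched with scikit-learn's Gaussian Process regressor on synthetic chilled-water data; the kernel choice and all numbers are illustrative assumptions, not the authors' model.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Synthetic data: chilled-water use vs. outdoor temperature (deg C).
    rng = np.random.default_rng(3)
    X = rng.uniform(10, 35, 40).reshape(-1, 1)
    y = 50 + 4.0 * (X.ravel() - 10) + rng.normal(0, 5, 40)

    gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(25.0),
                                  normalize_y=True).fit(X, y)
    mean, std = gp.predict(np.array([[28.0]]), return_std=True)
    print(f"predicted use: {mean[0]:.1f} +/- {2 * std[0]:.1f} (2-sigma)")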

183

Critical loads - assessment of uncertainty.  

National Technical Information Service (NTIS)

The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical mode...

A. Barkman

1998-01-01

184

Uncertainty in Climate Change Modeling  

NSDL National Science Digital Library

Learn why trout are important indicators in Wisconsin’s changing climate, and why the uncertainty of global climate models complicates predictions about their survival, in this video produced by the Wisconsin Educational Communications Board.

Ecb, Wisconsin

2010-11-30

185

Uncertainty in emissions projections for climate models  

E-print Network

Future global climate projections are subject to large uncertainties. Major sources of this uncertainty are projections of anthropogenic emissions. We evaluate the uncertainty in future anthropogenic emissions using a ...

Webster, Mort David; Babiker, Mustafa H.M.; Mayer, Monika; Reilly, John M.; Harnisch, Jochen; Hyman, Robert C.; Sarofim, Marcus C.; Wang, Chien

186

Uncertainty-induced quantum nonlocality  

E-print Network

Based on the skew information, we present a quantity, uncertainty-induced quantum nonlocality (UIN) to measure the quantum correlation. It can be considered as the updated version of the original measurement-induced nonlocality (MIN) preserving the good computability but eliminating the non-contractivity problem. For 2 x d-dimensional state, it is shown that UIN can be given by a closed form. In addition, we also investigate the maximal uncertainty-induced nonlocality.
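For reference, the skew information on which UIN is built is the Wigner-Yanase skew information; its standard definition (from the general literature, not quoted from this record) is:

```latex
% Wigner-Yanase skew information of a state rho with respect to an
% observable K; UIN is built by optimizing quantities of this form.
I(\rho, K) = -\tfrac{1}{2}\,\mathrm{Tr}\!\left( \left[ \sqrt{\rho},\, K \right]^{2} \right)
```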

Shao-xiong Wu; Jun Zhang; Chang-shui Yu; He-shan Song

2013-11-30

187

Uncertainty Quantification in Fluid Flow  

Microsoft Academic Search

This chapter addresses the topic of uncertainty quantification in fluid flow computations. The relevance and utility of this pursuit are discussed, outlining highlights of available methodologies. Particular attention is focused on spectral polynomial chaos methods for uncertainty quantification that have seen significant development over the past two decades. The fundamental structure of these methods is presented, along with associated challenges.
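As a hedged sketch of the structure these spectral methods share, the basic polynomial chaos expansion can be written as follows (standard notation from the literature, not reproduced from this chapter):

```latex
% A model output u, viewed as a function of random inputs xi, is expanded
% in polynomials Psi_k orthogonal under the input probability measure:
u(\xi) \approx \sum_{k=0}^{P} u_k \, \Psi_k(\xi), \qquad
u_k = \frac{\langle u \, \Psi_k \rangle}{\langle \Psi_k^{2} \rangle},
\qquad \langle \Psi_j \Psi_k \rangle = \delta_{jk} \langle \Psi_k^{2} \rangle .
```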

Habib N. Najm

188

Haplotype uncertainty in association studies.  

PubMed

Inferring haplotypes from genotype data is commonly undertaken in population genetic association studies. Within such studies the importance of accounting for uncertainty in the inference of haplotypes is well recognised. We investigate the effectiveness of correcting for uncertainty using simple methods based on the output provided by the PHASE haplotype inference methodology. In case-control analyses investigating non-Hodgkin lymphoma and haplotypes associated with immune regulation we find little effect of making adjustment for uncertainty in inferred haplotypes. Using simulation we introduce a higher degree of haplotype uncertainty than was present in our study data. The simulation represents two genetic loci, physically close on a chromosome, forming haplotypes. Considering a range of allele frequencies, degrees of linkage between the loci, and frequency of missing genotype data, we detail the characteristics of genetic regions which may be susceptible to the influence of haplotype uncertainty. Within our evaluation we find that bias is avoided by considering haplotype probabilities or using multiple imputation, provided that for each of these methods haplotypes are inferred separately for case and control populations; furthermore using multiple imputation provides the facility to incorporate haplotype uncertainty in the estimation of confidence intervals. We discuss the implications of our findings within the context of the complexity of haplotype inference for larger marker rich regions as would typically be encountered in genetic analyses. PMID:17323369
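Where the abstract notes that multiple imputation lets haplotype uncertainty flow into confidence intervals, the standard pooling step is Rubin's rules; a compact statement from the general multiple-imputation literature (not a formula from this paper):

```latex
% Pooling m imputed analyses with estimates \hat{\theta}_j and
% within-imputation variances W_j:
\bar{\theta} = \frac{1}{m} \sum_{j=1}^{m} \hat{\theta}_j, \qquad
T = \frac{1}{m} \sum_{j=1}^{m} W_j
  + \left( 1 + \frac{1}{m} \right) \frac{1}{m-1}
    \sum_{j=1}^{m} \left( \hat{\theta}_j - \bar{\theta} \right)^{2},
% where the total variance T is used to form confidence intervals.
```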

Mensah, F K; Gilthorpe, M S; Davies, C F; Keen, L J; Adamson, P J; Roman, E; Morgan, G J; Bidwell, J L; Law, G R

2007-05-01

189

An Improved Inertia Principle  

E-print Network

We show that for isolated relativistic systems with spin the conservation of total angular momentum implies that, instead of the center of mass, it is a modified center of mass and spin which behaves inertially. This requires a change in the statement of the Principle of Inertia.

Rodrigo Medina; J. Stephany

2014-04-06

190

An Improved Inertia Principle  

E-print Network

We show that for isolated relativistic systems with spin the conservation of total angular momentum implies that, instead of the center of mass, it is a modified center of mass and spin which behaves inertially. This requires a change in the statement of the Principle of Inertia.

Medina, Rodrigo

2014-01-01

191

Principles of interactional psychotherapy  

Microsoft Academic Search

Describes the principles of interactional psychotherapy which have evolved from 30 yrs of practice in treating the emotionally ill. Disturbed people should be thought of as having problems that involve experiences of futility and lack of meaning in life, rather than as having a mental disease. This approach acknowledges the importance of parental influences in emotional problems but emphasizes that

Benjamin B. Wolman

1975-01-01

192

Laboratory Safety Principles  

NSDL National Science Digital Library

This workshop covers major principles and regulations pertinent to working in laboratories with hazardous materials. It is divided into 45 minute segments dealing with: Radioactive Materials (Staiger); Toxic, Reactive, Carcinogenic, and Teratogenic Chemicals (Carlson); Infectious Agents (Laver); and Fire Safety Concepts and Physical Hazards (Arnston).

Jerry Staiger, Keith Carlson, Jim Laver, Ray Arntson (University of Minnesota)

2008-04-11

193

First Principles of Instruction.  

ERIC Educational Resources Information Center

Examines instructional design theories and elaborates principles about when learning is promoted, i.e., when learners are engaged in solving real-world problems, when existing knowledge is activated as a foundation for new knowledge, and when new knowledge is demonstrated to the learner, applied by the learner, and integrated into the learner's

Merrill, M. David

2002-01-01

194

The Proposal Design Principles  

E-print Network

The Proposal: Design Principles. ... satisfactory connections with Worcester Place, Walton Street and the broader community. Architects consider Exeter College's Walton Street Quad as an urban design project. With its very long ... existing ground floor plan showing extent of retained

Flynn, E. Victor

195

Principles of Software Testing  

E-print Network

Seven Principles of Software Testing, by Bertrand Meyer (ETH Zürich and Eiffel Software). While everyone ... agile methods ... has brought tests to the center stage, but testing is about producing failures ... of a failure and usually of a fault (I follow the IEEE standard terminology: an unsatisfactory program ...).

Meyer, Bertrand

196

Principles of Object Perception  

Microsoft Academic Search

Research on human infants has begun to shed light on early-developing processes for segmenting perceptual arrays into objects. Infants appear to perceive objects by analyzing three-dimensional surface arrangements and motions. Their perception does not accord with a general tendency to maximize figural goodness or to attend to nonaccidental geometric relations in visual arrays. Object perception does accord with principles

Elizabeth S. Spelke

1990-01-01

197

Basic Comfort Heating Principles.  

ERIC Educational Resources Information Center

The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background

Dempster, Chalmer T.

198

Principles of Physical Cosmology  

E-print Network

Principles of Physical Cosmology, by P.J.E. Peebles. Jim Peebles, Albert Einstein Professor of Science ... in observational and theoretical cosmology, motivating new three-dimensional surveys of up to a million galaxies. In this rapidly developing discipline, Peebles's books, particularly Physical Cosmology (1971) and The Large

Landweber, Laura

199

Practical Doping Principles  

SciTech Connect

Theoretical investigations of doping of several wide-gap materials suggest a number of rather general, practical "doping principles" that may help guide experimental strategies of overcoming doping bottlenecks. This paper will be published as a journal article in the future.

Zunger, A.

2003-05-01

200

BASIC ELECTRICAL CONNECTION PRINCIPLES  

E-print Network

BURNDY Reference: Basic Electrical Connection Principles. Basic factors: the basic factors which ... involving aluminum conductors than those encountered in copper-to-copper connections. Creep (cold flow): ... the conductor as compared to copper, since its creep rate is many times that of copper. Effect of creep: Figure ...

Johnson, Eric E.

201

PRINCIPLES OF WATER FILTRATION  

EPA Science Inventory

This paper reviews principles involved in the processes commonly used to filter drinking water for public water systems. The most common approach is to chemically pretreat water and filter it through a deep (2-1/2 to 3 ft) bed of granular media (coal or sand or combinations of th...

202

Cosmic rays and tests of fundamental principles  

E-print Network

It is now widely acknowledged that cosmic rays experiments can test possible new physics directly generated at the Planck scale or at some other fundamental scale. By studying particle properties at energies far beyond the reach of any man-made accelerator, they can yield unique checks of basic principles. A well-known example is provided by possible tests of special relativity at the highest cosmic-ray energies. But other essential ingredients of standard theories can in principle be tested: quantum mechanics, uncertainty principle, energy and momentum conservation, effective space-time dimensions, hamiltonian and lagrangian formalisms, postulates of cosmology, vacuum dynamics and particle propagation, quark and gluon confinement, elementariness of particles... Standard particle physics or string-like patterns may have a composite origin able to manifest itself through specific cosmic-ray signatures. Ultra-high energy cosmic rays, but also cosmic rays at lower energies, are probes of both "conventional" and new Physics. Status, prospects, new ideas, and open questions in the field are discussed. The Post Scriptum shows that several basic features of modern cosmology naturally appear in a SU(2) spinorial description of space-time without any need for matter, relativity or standard gravitation. New possible effects related to the spinorial space-time structure can also be foreseen. Similarly, the existence of spin-1/2 particles can be naturally related to physics beyond Planck scale and to a possible pre-Big Bang era.

Luis Gonzalez-Mestres

2010-11-22

203

The precautionary principle in environmental science.  

PubMed Central

Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy. PMID:11673114

Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M

2001-01-01

204

A weak equivalence principle test on a suborbital rocket  

E-print Network

We describe a Galilean test of the weak equivalence principle, to be conducted during the free fall portion of a sounding rocket flight. The test of a single pair of substances is aimed at a measurement uncertainty of sigma(eta) < 10^-16 after averaging the results of eight separate drops. The weak equivalence principle measurement is made with a set of four laser gauges that are expected to achieve 0.1 pm Hz^-1/2. The discovery of a violation (eta not equal to 0) would have profound implications for physics, astrophysics, and cosmology.
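The stated target assumes the usual averaging of independent drops; a one-line reminder of that scaling (elementary error propagation, not a detail from the abstract):

```latex
% For N independent drops with common single-drop uncertainty
% sigma_eta(1), the uncertainty of the averaged result is
\sigma_{\eta}(N) = \frac{\sigma_{\eta}(1)}{\sqrt{N}},
% so N = 8 improves on a single drop by a factor of sqrt(8), about 2.8.
```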

Robert D. Reasenberg; James D. Phillips

2010-01-26

205

Fine-Grained Lower Limit of Entropic Uncertainty in the Presence of Quantum Memory  

NASA Astrophysics Data System (ADS)

The limitation on obtaining precise outcomes of measurements performed on two noncommuting observables of a particle as set by the uncertainty principle in its entropic form can be reduced in the presence of quantum memory. We derive a new entropic uncertainty relation based on fine graining, which leads to an ultimate limit on the precision achievable in measurements performed on two incompatible observables in the presence of quantum memory. We show that our derived uncertainty relation tightens the lower bound set by entropic uncertainty for members of the class of two-qubit states with maximally mixed marginals, while accounting for the recent experimental results using maximally entangled pure states and mixed Bell-diagonal states. An implication of our uncertainty relation on the security of quantum key generation protocols is pointed out.
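For context, the quantum-memory-assisted entropic uncertainty relation that the paper tightens is usually written in the following form (standard in the literature, not quoted from this record):

```latex
% Measurements X and Z on system A, with quantum memory B and overlap
% c = max_{x,z} |<x|z>|^2 between the measurement eigenbases:
S(X|B) + S(Z|B) \geq \log_2 \frac{1}{c} + S(A|B).
```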

Pramanik, T.; Chowdhury, P.; Majumdar, A. S.

2013-01-01

206

Data uncertainty in face recognition.  

PubMed

The image of a face varies with the illumination, pose, and facial expression, thus we say that a single face image is of high uncertainty for representing the face. In this sense, a face image is just an observation and it should not be considered as the absolutely accurate representation of the face. As more face images from the same person provide more observations of the face, more face images may be useful for reducing the uncertainty of the representation of the face and improving the accuracy of face recognition. However, in a real world face recognition system, a subject usually has only a limited number of available face images and thus there is high uncertainty. In this paper, we attempt to improve the face recognition accuracy by reducing the uncertainty. First, we reduce the uncertainty of the face representation by synthesizing the virtual training samples. Then, we select useful training samples that are similar to the test sample from the set of all the original and synthesized virtual training samples. Moreover, we state a theorem that determines the upper bound of the number of useful training samples. Finally, we devise a representation approach based on the selected useful training samples to perform face recognition. Experimental results on five widely used face databases demonstrate that our proposed approach can not only obtain a high face recognition accuracy, but also has a lower computational complexity than the other state-of-the-art approaches. PMID:25222733
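A minimal sketch of the two steps the abstract describes, selecting training samples similar to the test sample and classifying by per-class reconstruction residual. This is an illustrative reading, not the authors' exact algorithm; the data and the parameter k are synthetic assumptions.

```python
import numpy as np

def classify(test, train, labels, k=10):
    # Step 1: keep the k training samples most similar to the test sample.
    d = np.linalg.norm(train - test, axis=1)
    idx = np.argsort(d)[:k]
    A, lab = train[idx].T, labels[idx]            # A: (features, k)
    # Step 2: least-squares coefficients so that test ~= A @ c.
    c, *_ = np.linalg.lstsq(A, test, rcond=None)
    # Classify by the class whose selected samples reconstruct test best.
    residual = {cls: np.linalg.norm(test - A[:, lab == cls] @ c[lab == cls])
                for cls in np.unique(lab)}
    return min(residual, key=residual.get)

rng = np.random.default_rng(0)
train = rng.normal(size=(50, 64))                 # 50 samples, 64-dim features
labels = np.repeat(np.arange(10), 5)              # 10 subjects, 5 images each
test = train[3] + 0.1 * rng.normal(size=64)       # noisy view of subject 0
print(classify(test, train, labels))              # expected: 0
```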

Xu, Yong; Fang, Xiaozhao; Li, Xuelong; Yang, Jiang; You, Jane; Liu, Hong; Teng, Shaohua

2014-10-01

207

The Principle of Maximum Conformality  

SciTech Connect

A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling {alpha}{sub s}({mu}{sup 2}). It is common practice to guess a physical scale {mu} = Q which is of order of a typical momentum transfer Q in the process, and then vary the scale over a range Q/2 and 2Q. This procedure is clearly problematic since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading-order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and thus is a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale-setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the PQCD cross section. The elimination of the renormalization scheme ambiguity using the PMC will not only increase the precision of QCD tests, but will also increase the sensitivity of colliders to new physics beyond the Standard Model.
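The "guess and vary" convention criticized above is often summarized as a scale-variation envelope; one common way of writing it (standard practice, not notation from this paper):

```latex
% Conventional estimate of the renormalization-scale uncertainty of a
% fixed-order prediction sigma, obtained by varying mu around Q:
\Delta\sigma_{\mu} = \max_{\mu \,\in\, [Q/2,\; 2Q]} \left| \sigma(\mu) - \sigma(Q) \right|
```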

Brodsky, Stanley J; /SLAC; Giustino, Di; /SLAC

2011-04-05

208

Basic Principles of Ultrasound  

NSDL National Science Digital Library

Created by a team of medical professionals and health-care specialists, the main Echo Web site contains a wide range of resources dealing primarily with diagnostic ultrasounds, sonography, and the field of echocardiography. One of the most helpful of these resources is the Basic Principles of Ultrasound online course, which is available here at no cost. The course itself is divided into six different sections, along with a bibliography and FAQ area. Visitors can use the online course to learn about the basic principles of ultrasound, the basic science behind related devices and instruments, and the ways to use these devices safely. Instructors might also do well to use this website in conjunction with lectures on the subject, or as a way to give students an additional resource to consult at their leisure.

2004-01-01

209

Principles of Semiconductor Devices  

NSDL National Science Digital Library

Home page of an online and interactive textbook, Principles of Semiconductor Devices, written by Bart J. Van Zeghbroeck, Ph.D., Professor in the Department of Electrical and Computer Engineering at the University of Colorado at Boulder. The goal of this text is to provide the basic principles of common semiconductor devices, with a special focus on Metal-Oxide-Semiconductor Field-Effect-Transistors (MOSFETs). A browser environment was chosen so that text, figures and equations can be linked for easy reference. A table of contents, a glossary, active figures and some study aids are integrated with the text with the intention to provide a more effective reference and learning environment. Chapter titles include: Semiconductor Fundamentals, Metal-Semiconductor Junctions, p-n Junctions, Bipolar Transistors, MOS Capacitors, and MOSFET.

Van Zeghbroeck, Bart J.

2011-06-13

210

The Shakespearean Principle Revisited  

PubMed Central

"Let every eye negotiate for itself and trust no agent." That line is from William Shakespeare's Much Ado About Nothing.1 To me, it is a fundamental doctrine of patient care, and I have named it the Shakespearean Principle.2 It stimulates skepticism,3 promotes doubt,4 improves communication, fosters proper decision-making, and protects against a malady that currently plagues our profession: herd mentality.5 This editorial shows what can happen when doctors violate the Shakespearean Principle. The story is real and tells of a woman whose doctor unintentionally killed her. To ensure anonymity, the time and place of the tragedy, as well as the players involved, have been changed. PMID:22412219

Fred, Herbert L.

2012-01-01

211

Principles of gravitational biology  

NASA Technical Reports Server (NTRS)

Physical principles of gravitation are enumerated, including gravitational and inertial forces, weight and mass, weightlessness, size and scale effects, scale limits of gravitational effects, and gravity as biogenic factor. Statocysts, otolithic organs of vertebrates, gravity reception in plants, and clinostat studies for gravitation orientation are reviewed. Chronic acceleration is also studied, as well as physiology of hyper and hypodynamic fields. Responses of animals to a decreased acceleration field are examined, considering postural changes, work capacity, growth, and physiologic deadaptation.

Smith, A. H.

1975-01-01

212

Principles of lake sedimentology  

SciTech Connect

This book presents a comprehensive outline of basic sedimentological principles for lakes, and focuses on environmental aspects and matters related to lake management and control, i.e. on lake ecology rather than lake geology. This is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents abridged: Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.

Håkanson, L.; Jansson, M.

1983-01-01

213

Principles of nuclear geology  

SciTech Connect

This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology in radiogenic heat and thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes, such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are focussed on.

Aswathanarayana, U.

1985-01-01

214

Principles of Medical Management  

Microsoft Academic Search

In the past, patients rarely questioned the therapeutic decisions made by their physicians. Physicians, in turn, were guided by the principle of doing unto others as you would have done unto you, and were largely limited by their clinical experiences. Today, evidence-based medicine dictates most treatment decisions, and the role of the physician in educating patients about available therapeutic options

Tarek Mekhail; Rony Abou-Jawde; Maurie Markman

215

A biomechanical inactivation principle  

Microsoft Academic Search

This paper develops the mathematical side of a theory of inactivations in human biomechanics. This theory has been validated by practical experiments, including zero-gravity experiments. The theory mostly relies on Pontryagin's maximum principle on the one hand and on transversality theory on the other. It turns out that the periods of silence in the activation of muscles that are

Jean-Paul Gauthier; Bastien Berret; Frédéric Jean

2010-01-01

216

White coat principles.  

PubMed

The White Coat Ceremony, which many dental schools use to mark the transition to patient care, is an opportunity to reflect on the values of dental practice. Eight principles are offered for consideration: 1) patient care is the point of practice; 2) the doctor-patient relationship is essential; 3) discuss options and possibilities; 4) mistakes will be made; 5) tell the truth; 6) be assertive; 7) consult; and 8) manage your stress and your life. PMID:15948496

Peltier, Bruce N

2004-01-01

217

Climate negotiations under scientific uncertainty  

PubMed Central

How does uncertainty about dangerous climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners' dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners' dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

Barrett, Scott; Dannenberg, Astrid

2012-01-01

218

Linear Programming Problems for Generalized Uncertainty  

ERIC Educational Resources Information Center

Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle

Thipwiwatpotjana, Phantipa

2010-01-01

219

A procedure for assessing uncertainty in models  

Microsoft Academic Search

This paper discusses uncertainty in the output calculation of a model due to uncertainty in inputs values. Uncertainty in input values, characterized by suitable probability distributions, propagates through the model to a probability distribution of an output. Our objective in studying uncertainty is to identify a subset of inputs as being important in the sense that fixing them greatly reduces
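A minimal sketch of the importance criterion just described, on a toy model: an input is important if fixing it removes most of the output variance. The model and all numbers are synthetic assumptions, not material from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
model = lambda x1, x2: 5 * x1 + 0.5 * x2 ** 2     # toy model, two inputs

x1 = rng.normal(size=100_000)
x2 = rng.normal(size=100_000)
var_total = model(x1, x2).var()

# Average output variance that remains when x1 is fixed, over many values.
remaining = np.mean([model(v, x2).var() for v in rng.normal(size=50)])
print(f"fixing x1 removes {1 - remaining / var_total:.1%} of the variance")
```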

M. D. McKay; R. J. Beckman

1993-01-01

220

Uncertainty analysis of transient population dynamics  

Microsoft Academic Search

Two types of demographic analyses, perturbation analysis and uncertainty analysis, can be conducted to gain insights about matrix population models and guide population management. Perturbation analysis studies how the perturbation of demographic parameters (survival, growth, and reproduction parameters) may affect the population projection, while uncertainty analysis evaluates how much uncertainty there is in population dynamic predictions and where the uncertainty

Chonggang Xu; George Z. Gertner

2009-01-01

221

The visualization of spatial uncertainty  

SciTech Connect

Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools. 3-D visualization tools allow them to create very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir will be used to demonstrate that such animation is possible and to show that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.
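A minimal sketch of the "incrementally different realizations" idea on unconditional Gaussian fields. Conditioning to well data and geophysics, which the paper requires, is not shown; the AR(1) blending is just one simple way to make successive frames nearly identical, and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
shape, n_frames, rho = (50, 50), 20, 0.9   # rho: frame-to-frame correlation

frames = [rng.normal(size=shape)]
for _ in range(n_frames - 1):
    fresh = rng.normal(size=shape)
    # AR(1) blend: keeps each frame marginally N(0, 1) while changing it
    # only slightly, so the sequence plays as a smooth animation.
    frames.append(rho * frames[-1] + np.sqrt(1 - rho ** 2) * fresh)
```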

Srivastava, R.M. [FSS International, Vancouver, British Columbia (Canada)]

1994-12-31

222

Weak Equivalence Principle Test on a Sounding Rocket  

Microsoft Academic Search

SR-POEM, our principle of equivalence measurement on a sounding rocket, will compare the free fall rate of two substances, yielding an uncertainty of 10^-16 in the estimate of eta. During the past two years, the design concept has matured and we have been working on the required technology, including a laser gauge that is self aligning and able to reach

James D. Phillips; Bijunath R. Patla; Eugeniu M. Popescu; Emanuele Rocco; Rajesh Thapa; Robert D. Reasenberg; Enrico C. Lorenzini

2010-01-01

223

CO2 transport uncertainties from the uncertainties in meteorological fields  

NASA Astrophysics Data System (ADS)

Inference of surface CO2 fluxes from atmospheric CO2 observations requires information about large-scale transport and turbulent mixing in the atmosphere, so transport errors and the statistics of the transport errors have significant impact on surface CO2 flux estimation. In this paper, we assimilate raw meteorological observations every 6 hours into a general circulation model with a prognostic carbon cycle (CAM3.5) using the Local Ensemble Transform Kalman Filter (LETKF) to produce an ensemble of meteorological analyses that represent the best approximation to the atmospheric circulation and its uncertainty. We quantify CO2 transport uncertainties resulting from the uncertainties in meteorological fields by running CO2 ensemble forecasts within the LETKF-CAM3.5 system forced by prescribed surface fluxes. We show that CO2 transport uncertainties are largest over the tropical land and the areas with large fossil fuel emissions, and are between 1.2 and 3.5 ppm at the surface and between 0.8 and 1.8 ppm in the column-integrated CO2 (with OCO-2-like averaging kernel) over these regions. We further show that the current practice of using a single meteorological field to transport CO2 has weaker vertical mixing and stronger CO2 vertical gradient when compared to the mean of the ensemble CO2 forecasts initialized by the ensemble meteorological fields, especially over land areas. The magnitude of the difference at the surface can be up to 1.5 ppm.
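A minimal sketch of how such transport uncertainty can be summarized from an ensemble of CO2 forecasts; the synthetic array below merely stands in for LETKF-CAM3.5 output, and the dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, nlat, nlon = 64, 96, 144
co2 = 390.0 + rng.normal(0.0, 1.5, size=(n_ens, nlat, nlon))  # ppm, synthetic

ens_mean = co2.mean(axis=0)
spread = co2.std(axis=0, ddof=1)   # per-gridpoint transport uncertainty (ppm)
single = co2[0]                    # the single-meteorology practice
print(spread.mean(), np.abs(single - ens_mean).max())
```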

Liu, Junjie; Fung, Inez; Kalnay, Eugenia; Kang, Ji-Sun

2011-06-01

224

Awe, uncertainty, and agency detection.  

PubMed

Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4)-two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events. PMID:24247728

Valdesolo, Piercarlo; Graham, Jesse

2014-01-01

225

Uncertainties in offsite consequence analysis  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

Young, M.L.; Harper, F.T.; Lui, C.H.

1996-03-01

226

Measuring uncertainty by extracting fuzzy rules using rough sets  

NASA Technical Reports Server (NTRS)

Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as candidate solutions to this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly each rule is believed is constructed. From this, the degree to which a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
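A minimal sketch of the rough-set machinery behind "certain" and "possible" rules: the lower and upper approximations of a target set under the indiscernibility relation induced by attribute values. The toy objects and attributes are assumptions.

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """attrs maps object id -> attribute tuple; target is a set of ids."""
    blocks = defaultdict(set)                 # indiscernibility classes
    for o in objects:
        blocks[attrs[o]].add(o)
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:                   # wholly inside -> certain
            lower |= block
        if block & target:                    # overlapping -> possible
            upper |= block
    return lower, upper

objs = [1, 2, 3, 4, 5]
attrs = {1: ('a',), 2: ('a',), 3: ('b',), 4: ('b',), 5: ('c',)}
print(approximations(objs, attrs, {1, 2, 3}))  # ({1, 2}, {1, 2, 3, 4})
```

Certain rules are read off the lower approximation (membership is guaranteed), possible rules off the upper one (membership cannot be excluded).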

Worm, Jeffrey A.

1991-01-01

227

Design rationale: Researching under uncertainty  

Microsoft Academic Search

Rationale research in software development is a challenging area because while there is no shortage of advocates for its value, there is also no shortage of reasons why rationale is unlikely to be captured in practice. Despite more than thirty years of research there still remains much uncertainty: how useful are the potential benefits and how insurmountable are the barriers?

Janet E. Burge

2008-01-01

228

Spatial uncertainty and ecological models  

SciTech Connect

Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

Jager, Yetta [ORNL]; King, Anthony Wayne [ORNL]

2004-07-01

229

Uncertainty Analysis of Reservoir Sedimentation  

Microsoft Academic Search

Significant advances have been made in understanding the importance of the factors involved in reservoir sedimentation. However, predicting the accumulation of sediment in a reservoir is still a complex problem. In estimating reservoir sedimentation and accumulation, a number of uncertainties arise. These are related to quantity of streamflow, sediment load, sediment particle size, and specific weight, trap efficiency, and reservoir

Jose D. Salas; Hyun-Suk Shin

1999-01-01

230

Uncertainties in Fault Tree Analysis  

Microsoft Academic Search

Fault tree analysis is one kind of probabilistic safety analysis method. After a fault tree is constructed, many basic events that can happen in theory turn out never to have occurred so far, or to have occurred so infrequently that reasonable data for them are not available. However, the use of fuzzy probability can describe the failure probability of each basic event and its uncertainty,

Yue-Lung Cheng

231

An Uncertainty Framework for Classification  

Microsoft Academic Search

We define a generalized likelihood function based on uncertainty measures and show that maximizing such a likelihood function for different measures induces different types of classifiers. In the probabilistic framework, we obtain classifiers that optimize the cross-entropy function. In the possibilistic framework, we obtain classifiers that maximize the interclass margin. Furthermore, we show that the support vector machine is a

Loo-nin Teow; Kia-fock Loe

2000-01-01

232

Optimising Yacht Routes under Uncertainty  

Microsoft Academic Search

The planning of routes for sailing vessels is subject to uncertainty from the weather. This is particularly important in yacht racing where the accuracy of a weather prediction can determine the outcome of a race. With a perfect weather forecast it is possible to use the polar tables of a given yacht to compute a route that minimises its arrival

Andy Philpott; Andrew Mason

233

Uncertainty and Technical Communication Patterns  

Microsoft Academic Search

This paper examines the relationship between research and development people's perceptions of uncertainty in their firm's competitive environment and their patterns of technical communication. Measures of both these attributes of six R&D groups, two in each of three industries, are reported and analyzed here. Technical people who saw the world (competitors, suppliers, customers, technology and regulations) outside their firm as

James W. Brown; James M. Utterback

1985-01-01

234

Exploring Uncertainty with Projectile Launchers  

ERIC Educational Resources Information Center

The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between

Orzel, Chad; Reich, Gary; Marr, Jonathan

2012-01-01

235

Entropic uncertainty from effective anticommutators  

NASA Astrophysics Data System (ADS)

We investigate entropic uncertainty relations for two or more binary measurements, for example, spin-1/2 or polarization measurements. We argue that the effective anticommutators of these measurements, i.e., the anticommutators evaluated on the state prior to measuring, are an expedient measure of measurement incompatibility. Based on the knowledge of pairwise effective anticommutators we derive a class of entropic uncertainty relations in terms of conditional Rényi entropies. Our uncertainty relations are formulated in terms of effective measures of incompatibility, which can be certified in a device-independent fashion. Consequently, we discuss potential applications of our findings to device-independent quantum cryptography. Moreover, to investigate the tightness of our analysis we consider the simplest (and very well studied) scenario of two measurements on a qubit. We find that our results outperform the celebrated bound due to Maassen and Uffink [Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103] and provide an analytical expression for the minimum uncertainty which also outperforms some recent bounds based on majorization.
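The "celebrated bound due to Maassen and Uffink" referenced above is usually stated as follows (standard form from the literature):

```latex
% Shannon entropies of two measurements X, Z with eigenbases {|x>}, {|z>}
% and maximal overlap c = max_{x,z} |<x|z>|:
H(X) + H(Z) \geq -2 \log_2 c .
```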

Kaniewski, Jedrzej; Tomamichel, Marco; Wehner, Stephanie

2014-07-01

236

Principles of Public Engagement

E-print Network

Principles of Public Engagement, by Paul Tabbush and Bianca Ambrose-Oji. ... access for recreation or health initiatives, using forests and woodlands as the sites for public events. The citation for this document is: Tabbush, P., and Ambrose-Oji, B., 2011, Principles of Public Engagement. Forest Research.

237

Robustness of computational time reversal imaging in media with elastic constant uncertainties  

E-print Network

Robustness of computational time reversal imaging in media with elastic constant uncertainties. ... attenuative media, satisfying the spatial reciprocity principle. TR can be exploited for imaging purposes ... (accepted 5 November 2009; published online 11 December 2009). In order to image a source or a scatterer

238

Uncertainty, entropy, scaling and hydrological stochastics. 1. Marginal distributional properties of hydrological processes and state scaling  

Microsoft Academic Search

The well-established physical and mathematical principle of maximum entropy (ME), is used to explain the distributional and autocorrelation properties of hydrological processes, including the scaling behaviour both in state and in time. In this context, maximum entropy is interpreted as maximum uncertainty. The conditions used for the maximization of entropy are as simple as possible, i.e. that hydrological processes are

DEMETRIS KOUTSOYIANNIS

2005-01-01

239

Uncertainty, entropy, scaling and hydrological stochastics. 2. Time dependence of hydrological processes and time scaling  

Microsoft Academic Search

The well-established physical and mathematical principle of maximum entropy (ME), is used to explain the distributional and autocorrelation properties of hydrological processes, including the scaling behaviour both in state and in time. In this context, maximum entropy is interpreted as maximum uncertainty. The conditions used for the maximization of entropy are as simple as possible, i.e. that hydrological processes are

DEMETRIS KOUTSOYIANNIS

240

Will hydrologists learn from the world around them?: Empiricism, models, uncertainty and stationarity (Invited)  

Microsoft Academic Search

To honor the passing this year of eminent hydrologists, Dooge, Klemes and Shiklomanov, I offer an irreverent look at the issues of uncertainty and stationarity as the hydrologic industry prepares climate change products. In an AGU keynote, Dooge said that the principle of mass balance was the only hydrologic law. It was not clear how one should apply it. Klemes

U. Lall

2010-01-01

241

Principles of Sociology  

NSDL National Science Digital Library

This activity is used in Principles of Sociology class for undergraduate students. This activity looks at the labor force and factors that affect occupation over time in the United States on a state-by-state basis. This activity uses a customized data set made from combining census information from 1950-1990. It guides students through data manipulation using WebCHIP software found at DataCounts!. To open WebCHIP with the dataset for the activity, please see instructions and links in the exercise documents under teaching materials. For more information on how to use WebCHIP, see the How To section on DataCounts!

Ciabattari, Theresa

242

Complex Correspondence Principle  

SciTech Connect

Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

Bender, Carl M.; Meisinger, Peter N. [Department of Physics, Washington University, St. Louis, Missouri 63130 (United States); Hook, Daniel W. [Theoretical Physics, Imperial College London, London SW7 2AZ (United Kingdom); Wang Qinghai [Department of Physics, National University of Singapore, Singapore 117542 (Singapore)

2010-02-12

243

Remote Sensing Principles  

NSDL National Science Digital Library

This introduction to Earth observation includes definitions of several terms, examples taken from real situations, and questions, answers, and exercises. A simple example of traditional chorological mapping methods is used to show some fundamental principles of satellite images. Histogram, pixel and classification are introduced. There are discussions about remote sensing, the history of Earth observation, and geostationary and solar synchronous orbits. In addition, the basic physical concepts underlying remote sensing are explained, with the help of some relatively simple viewgraphs. This site is also available in German, French, Italian and Spanish.

244

Experimental uncertainty estimation and statistics for data having interval uncertainty.  

SciTech Connect

This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
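A minimal sketch of one of the descriptive statistics discussed: the mean of interval data, computed endpoint-wise. The comment on variance reflects the kind of computability issue the report reviews; the numbers are assumptions.

```python
import numpy as np

lo = np.array([1.0, 2.1, 0.8, 1.7])   # lower endpoints of the measurements
hi = np.array([1.4, 2.5, 1.5, 1.9])   # upper endpoints

# The sample mean of n intervals is itself an interval, obtained
# endpoint-wise; bounds on the variance are much harder to compute
# (NP-hard in general for overlapping intervals).
mean_interval = (lo.mean(), hi.mean())
print(mean_interval)                   # -> (1.4, 1.825)
```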

Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos

2007-05-01

245

Principle of relative locality  

SciTech Connect

We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

Amelino-Camelia, Giovanni [Dipartimento di Fisica, Universita 'La Sapienza', and Sez. Roma1 INFN, P. le A. Moro 2, 00185 Roma (Italy); Freidel, Laurent; Smolin, Lee [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, Ontario N2J 2Y5 (Canada); Kowalski-Glikman, Jerzy [Institute for Theoretical Physics, University of Wroclaw, Pl. Maxa Borna 9, 50-204 Wroclaw (Poland)

2011-10-15

246

Great Lakes Literacy Principles  

NASA Astrophysics Data System (ADS)

Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These Great Lakes Literacy Principles represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

Fortner, Rosanne W.; Manzo, Lyndsey

2011-03-01

247

Revisiting Tversky's diagnosticity principle.  

PubMed

Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

Evers, Ellen R. K.; Lakens, Daniël

2014-01-01

248

Principles of Safety Pharmacology  

PubMed Central

Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice, the burden of proof and to highlight areas of intensive activity (such as testing for drug-induced rare event liability, and the challenge of testing the safety of so-called biologics: antibodies, gene therapy and so on). PMID:18604233

Pugsley, M K; Authier, S; Curtis, M J

2008-01-01

249

Principles for high-quality, high-value testing.  

PubMed

A survey of doctors working in two large NHS hospitals identified over 120 laboratory tests, imaging investigations and investigational procedures that they considered not to be overused. A common suggestion in this survey was that more training was required, and this prompted the development of a list of core principles for high-quality, high-value testing. The list can be used as a framework for training and as a reference source. The core principles are: (1) Base testing practices on the best available evidence. (2) Apply the evidence on test performance with careful judgement. (3) Test efficiently. (4) Consider the value (and affordability) of a test before requesting it. (5) Be aware of the downsides and drivers of overdiagnosis. (6) Confront uncertainties. (7) Be patient-centred in your approach. (8) Consider ethical issues. (9) Be aware of normal cognitive limitations and biases when testing. (10) Follow the 'knowledge journey' when teaching and learning these core principles. PMID:22740357

Power, Michael; Fell, Greg; Wright, Michael

2013-02-01

250

Uncertainty in flood risk mapping  

NASA Astrophysics Data System (ADS)

A flood refers to a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms is studied. In this context, flooding occurs when the water runs above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic, and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the resulting flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can translate into erroneous risk prediction. Therefore, maps have started to be developed that reflect the uncertainty associated with the flood modeling process, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, which depends on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM) used, on the estimated peak flow and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered as a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic. With this methodology a fuzzy number is obtained for the peak flow, which indicates all possible peak flow values and the possibility of their occurrence. To produce the LCM, a supervised soft classifier is used to classify a satellite image and a possibility distribution is assigned to the pixels. These extra data provide additional land cover information at the pixel level and allow the assessment of the classification uncertainty, which is then considered in the identification of the parameter uncertainty used to compute the peak flow. The proposed approach was applied to produce vulnerability and risk maps that integrate uncertainty for the urban area of Leiria, Portugal. A SPOT-4 satellite image and DEMs of the region were used, and the peak flow was computed using the Soil Conservation Service method. HEC-HMS, HEC-RAS, Matlab and ArcGIS software programs were used. The analysis of the results for this case study enables the order of magnitude of the uncertainty in the watershed peak flow to be assessed and the areas most susceptible to flood risk to be identified.
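A minimal sketch of the fuzzy-arithmetic step on a deliberately simplified peak-flow relation: a triangular fuzzy catchment area propagated through the rational formula Q = C*i*A by alpha-cut interval arithmetic. The rational formula stands in for the SCS method used in the paper, and all numbers are hypothetical.

```python
import numpy as np

def tri_alpha_cut(a, b, c, alpha):
    """Interval of the triangular fuzzy number (a, b, c) at level alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

C, i = 0.6, 30.0            # runoff coefficient (-), rain intensity (mm/h)
for alpha in np.linspace(0.0, 1.0, 5):
    A_lo, A_hi = tri_alpha_cut(9.0, 10.0, 12.0, alpha)   # fuzzy area (km^2)
    # Q = C*i*A/3.6 gives m^3/s; Q is monotone in A, so interval endpoints
    # map directly to peak-flow endpoints at each alpha level.
    print(f"alpha={alpha:.2f}: "
          f"Q in [{C*i*A_lo/3.6:.1f}, {C*i*A_hi/3.6:.1f}] m^3/s")
```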

Gonçalves, Luísa M. S.; Fonte, Cidália C.; Gomes, Ricardo

2014-05-01

251

Quantifying uncertainty from material inhomogeneity.  

SciTech Connect

Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 μm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.
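For orientation, the deterministic elastic baseline against which the grain-scale scatter is measured is the classical Kirsch stress concentration for a circular hole (standard elasticity, not a result of this report):

```latex
% Infinite isotropic plate with a circular hole under remote uniaxial
% tension sigma_infty: the hoop stress at the hole edge peaks at
\sigma_{\max} = 3\,\sigma_{\infty},
% so microstructural inhomogeneity appears as scatter about this factor of 3.
```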

Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

2009-09-01

252

Visualizing Uncertainty in Natural Hazards  

Microsoft Academic Search

"Presenting data without error rate is misleading." This is a quote from the O.J. Simpson defense team regarding the presentation of DNA evidence without valid error rate statistics. Taken more generally, this practice is a prevalent shortcoming in the scientific and information visualization communities where data are visualized without any indication of their associated uncertainties. While it is widely acknowledged

Alex Pang

253

Parametric Uncertainty Modeling using LFTs  

Microsoft Academic Search

In this paper a general approach for modelling structured real-valued parametric perturbations is presented. It is based on a decomposition of perturbations into linear fractional transformations (LFTs), and is applicable to rational multi-dimensional (ND) polynomial perturbations of entries in state-space models. Model reduction is used to reduce the size of the uncertainty structure. The procedure will be applied for the
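The building block of this decomposition is the linear fractional transformation. Below is a minimal Python sketch, not taken from the paper, of closing an upper LFT F_u(M, Delta) = M22 + M21 Delta (I - M11 Delta)^{-1} M12 around a real scalar perturbation; the toy matrix M is chosen so that the closed loop reproduces an uncertain gain g0(1 + w*delta) with |delta| <= 1, the simplest case of pulling a real parameter out of a model.

    import numpy as np

    def upper_lft(M, Delta):
        """Close the upper loop of the partitioned matrix M with Delta (k x k)."""
        k = Delta.shape[0]
        M11, M12 = M[:k, :k], M[:k, k:]
        M21, M22 = M[k:, :k], M[k:, k:]
        return M22 + M21 @ Delta @ np.linalg.inv(np.eye(k) - M11 @ Delta) @ M12

    g0, w = 2.0, 0.3                       # nominal gain and relative weight
    M = np.array([[0.0,    1.0],
                  [g0 * w, g0]])           # yields F_u(M, delta) = g0 * (1 + w*delta)

    for delta in (-1.0, 0.0, 1.0):
        print(delta, upper_lft(M, np.array([[delta]]))[0, 0])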

Paul Lambrechts; Jan Terlouw; Samir Bennani; Maarten Steinbuch

1993-01-01

254

Principles of Digital Computing  

NSDL National Science Digital Library

All About Circuits is a website that "provides a series of online textbooks covering electricity and electronics." Written by Tony R. Kuphaldt, the textbooks available here are wonderful resources for students, teachers, and anyone who is interested in learning more about electronics. This specific section, Principles of Digital Computing, is the sixteenth chapter in Volume IV, "Digital." A few of the topics covered in this chapter include: a binary adder; look-up tables; finite state machines; microprocessors; and microprocessor planning. Diagrams and detailed descriptions of concepts are included throughout the chapter to provide users with a comprehensive lesson. Visitors to the site are also encouraged to discuss concepts and topics using the All About Circuits discussion forums (registration with the site is required to post materials).

Kuphaldt, Tony R.

2008-07-29

255

The Quantum Gauge Principle  

E-print Network

We consider the evolution of quantum fields on a classical background space-time, formulated in the language of differential geometry. Time evolution along the worldlines of observers is described by parallel transport operators in an infinite-dimensional vector bundle over the space-time manifold. The time evolution equation and the dynamical equations for the matter fields are invariant under an arbitrary local change of frames along the restriction of the bundle to the worldline of an observer, thus implementing a "quantum gauge principle". We derive dynamical equations for the connection and a complex scalar quantum field based on a gauge field action. In the limit of vanishing curvature of the vector bundle, we recover the standard equation of motion of a scalar field in a curved background space-time.

Dirk Graudenz

1996-04-29

256

Uncertainties in atmospheric neutrino fluxes  

SciTech Connect

An evaluation of the principal uncertainties in the computation of neutrino fluxes produced in cosmic ray showers in the atmosphere is presented. The neutrino flux predictions are needed for comparison with experiment to perform neutrino oscillation studies. The paper concentrates on the main limitations, which are due to hadron production uncertainties. It also treats primary cosmic ray flux uncertainties, which are at a lower level. The absolute neutrino fluxes are found to have errors of around 15% in the neutrino energy region important for contained events underground. Large cancellations of these errors occur when ratios of fluxes are considered; in particular, the $\nu_\mu/\bar{\nu}_\mu$ ratio below $E_\nu = 1$ GeV, the $(\nu_\mu+\bar{\nu}_\mu)/(\nu_e+\bar{\nu}_e)$ ratio below $E_\nu = 10$ GeV and the up/down ratios above $E_\nu = 1$ GeV are at the 1% level. A detailed breakdown of the origin of these errors and cancellations is presented.
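The cancellation mechanism is easy to demonstrate numerically: when the dominant hadron-production error is common to the $\nu_\mu$-like and $\nu_e$-like fluxes, it divides out of their ratio. A minimal Monte Carlo sketch in Python, with illustrative error sizes rather than the paper's actual error budget:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    common = rng.normal(0.0, 0.15, n)                     # ~15% correlated production error
    numu = 2.0 * (1 + common + rng.normal(0, 0.02, n))    # nu_mu-like flux
    nue  = 1.0 * (1 + common + rng.normal(0, 0.02, n))    # nu_e-like flux

    print("relative error of nu_mu flux:", numu.std() / numu.mean())
    ratio = numu / nue
    print("relative error of the ratio :", ratio.std() / ratio.mean())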

Barr, G. D.; Robbins, S.; Gaisser, T. K.; Stanev, T. [Department of Physics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford, OX1 3RH (United Kingdom); Bartol Research Institute and Department of Physics and Astronomy, University of Delaware, Newark, Delaware, 19716 (United States)

2006-11-01

257

Fuzzy-algebra uncertainty assessment  

SciTech Connect

A significant number of analytical problems (for example, abnormal-environment safety analysis) depend on data that are partly or mostly subjective. Since fuzzy algebra depends on subjective operands, we have been investigating its applicability to these forms of assessment, particularly for portraying uncertainty in the results of PRA (probabilistic risk analysis) and in risk-analysis-aided decision-making. Since analysis results can be a major contributor to a safety-measure decision process, risk management depends on relating uncertainty to only known (not assumed) information. The uncertainties due to abnormal environments are even more challenging than those in normal-environment safety assessments, and therefore require an even more judicious approach. Fuzzy algebra matches these requirements well. One of the most useful aspects of this work is that we have shown the potential for significant differences (especially in perceived margin relative to a decision threshold) between fuzzy assessment and probabilistic assessment based on subtle factors inherent in the choice of probability distribution models. We have also shown the relation of fuzzy algebra assessment to "bounds" analysis, as well as a description of how analyses can migrate from bounds analysis to fuzzy-algebra analysis, and to probabilistic analysis as information about the process to be analyzed is obtained. Instructive examples are used to illustrate the points.
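The divergence between fuzzy and probabilistic portrayals of the same subjective judgment can be sketched in a few lines: the support bound of a triangular possibility distribution and an upper percentile of a probability model fitted to the same judgment may fall on opposite sides of a decision threshold. All numbers below are illustrative:

    import numpy as np

    rng = np.random.default_rng(2)
    threshold = 10.0

    # fuzzy view: triangular possibility (a, m, b); the support gives a hard bound
    a, m, b = 4.0, 6.0, 9.5
    fuzzy_margin = threshold - b                  # margin against the support edge

    # probabilistic view: a lognormal model fitted to the same judgment
    sample = rng.lognormal(np.log(m), 0.25, 100_000)
    prob_margin = threshold - np.quantile(sample, 0.99)

    print(f"fuzzy margin (support bound)  : {fuzzy_margin:.2f}")
    print(f"probabilistic margin (99th pc): {prob_margin:.2f}")

Here the fuzzy support keeps the quantity below the threshold while the lognormal tail crosses it, the kind of distribution-model-driven difference in perceived margin the abstract refers to.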

Cooper, J.A. [Sandia National Labs., Albuquerque, NM (United States); Cooper, D.K. [Naval Research Lab., Washington, DC (United States)

1994-12-01

258

Environmental assessments: Uncertainties in implementation  

SciTech Connect

A review of the regulations, guidance, statutes, and case law affecting Environmental Assessment (EA) preparation has identified a number of uncertainties that, if clarified, would facilitate EA preparation and National Environmental Policy Act (NEPA) implementation. Recommendations are made for clarifying the uncertainties regarding EA preparation to help EAs fulfill their intended role in the NEPA process and to thereby facilitate NEPA implementation in general. The CEQ should report annually on the total number of EAs prepared each year in the United States. The current CEQ guidance, as well as requirements for EAs found in other environmental review laws, should be codified into the CEQ regulations. At the same time, existing requirements in the regulations should be collected into a new section dealing exclusively with EAs. The existing regulations should be changed to eliminate the analysis of alternatives in an EA; such an analysis is not usually critical to the decision at hand and is not required by NEPA for EAs resulting in Findings of No Significant Impact (FONSIs). Uncertainties related to identifying significant effects could be addressed by requiring greater public and agency involvement in the EA process before decisions are made. 15 refs.

Hunsaker, D.B. Jr.

1987-01-01

259

Fault Management Guiding Principles  

NASA Technical Reports Server (NTRS)

Regardless of the mission type: deep space or low Earth orbit, robotic or human spaceflight, Fault Management (FM) is a critical aspect of NASA space missions. As the complexity of space missions grows, the complexity of the supporting FM systems increases in turn. Data on recent NASA missions show that development of FM capabilities is a common driver for significant cost overruns late in the project development cycle. Efforts to understand the drivers behind these cost overruns, spearheaded by NASA's Science Mission Directorate (SMD), indicate that they are primarily caused by the growing complexity of FM systems and the lack of maturity of FM as an engineering discipline. NASA can and does develop FM systems that effectively protect mission functionality and assets. The cost growth results from a lack of FM planning and emphasis by project management, as well as from the maturity of FM as an engineering discipline, which lags behind that of other engineering disciplines. As a step towards controlling the cost growth associated with FM development, SMD has commissioned a multi-institution team to develop a practitioner's handbook representing best practices for the end-to-end processes involved in engineering FM systems. While currently concentrating primarily on FM for science missions, the expectation is that this handbook will grow into a NASA-wide handbook, serving as a companion to the NASA Systems Engineering Handbook. This paper presents a snapshot of the principles that have been identified to guide FM development from cradle to grave. The principles cover considerations for integrating FM into the project and SE organizational structure, the relationship between FM designs and mission risk, and the use of the various tools of FM (e.g., redundancy) to meet the FM goal of protecting mission functionality and assets.

Newhouse, Marilyn E.; Friberg, Kenneth H.; Fesq, Lorraine; Barley, Bryan

2011-01-01

260

The nonholonomic variational principle  

NASA Astrophysics Data System (ADS)

A variational principle for mechanical systems and fields subject to nonholonomic constraints is found, providing Chetaev-reduced equations as equations for extremals. Investigating nonholonomic variations of the Chetaev type and their properties, we develop foundations of the calculus of variations on constraint manifolds, modelled as fibred submanifolds in jet bundles. This setting is appropriate to study general first-order 'nonlinear nonintegrable constraints' that locally are given by a system of first-order ordinary or partial differential equations. We obtain an invariant constrained first variation formula and constrained Euler-Lagrange equations both in intrinsic and coordinate forms, and show that the equations are the same as Chetaev equations 'without Lagrange multipliers', introduced recently by other methods. We pay attention to two possible settings: first, when the constrained system arises from an unconstrained Lagrangian system defined in a neighbourhood of the constraint, and second, more generally, when an 'internal' constrained system on the constraint manifold is given. In the latter case a corresponding unconstrained system need not be a Lagrangian, nor even exist. We also study in detail an important particular case: nonholonomic constraints that can be alternatively modelled by means of (co)distributions in the total space of the fibred manifold; in nonholonomic mechanics this happens whenever constraints affine in velocities are considered. It becomes clear that (and why) if the distribution is completely integrable (= the constraints are semiholonomic), the principle of virtual displacements holds and can be used to obtain the constrained first variational formula by a more or less standard procedure, traditionally used when unconstrained or holonomic systems are concerned. If, however, the constraint is nonintegrable, no significant simplifications are available. Among others, some properties of nonholonomic systems are clarified that without a deeper insight seem rather mysterious.

Krupková, Olga

2009-05-01

261

Genetics and psychiatry: a proposal for the application of the precautionary principle.  

PubMed

The paper suggests an application of the precautionary principle to the use of genetics in psychiatry focusing on scientific uncertainty. Different levels of uncertainty are taken into consideration--from the acknowledgement that the genetic paradigm is only one of the possible ways to explain psychiatric disorders, via the difficulties related to the diagnostic path and genetic methods, to the value of the results of studies carried out in this field. Considering those uncertainties, some measures for the use of genetics in psychiatry are suggested. Some of those measures are related to the conceptual limits of the genetic paradigm; others are related to present knowledge and should be re-evaluated. PMID:22460929

Porteri, Corinna

2013-08-01

262

Multifidelity approaches for design under uncertainty

E-print Network

Uncertainties are present in many engineering applications and it is important to account for their effects during engineering design to achieve robust and reliable systems. One approach is to represent uncertainties as ...

Ng, Leo Wai-Tsun

2013-01-01

263

Optimization under uncertainty in radiation therapy  

E-print Network

In the context of patient care for life-threatening illnesses, the presence of uncertainty may compromise the quality of a treatment. In this thesis, we investigate robust approaches to managing uncertainty in radiation ...

Chan, Timothy Ching-Yee

2007-01-01

264

How do households respond to uncertainty shocks?  

Microsoft Academic Search

Economic disruptions generally coincide with heightened uncertainty. In the United States, uncertainty increased sharply with the recent housing market crash, financial crisis, deep recession, and uneven recovery. In July 2010 Congressional testimony, Federal Reserve Chairman Bernanke described conditions as

Edward S. Knotek II; Shujaat Khan

2011-01-01

265

Modeling travel time uncertainty in traffic networks  

E-print Network

Uncertainty in travel time is one of the key factors that could allow us to understand and manage congestion in transportation networks. Models that incorporate uncertainty in travel time need to specify two mechanisms: ...

Chen, Daizhuo

2010-01-01

266

10 CFR 436.24 - Uncertainty analyses.  

Code of Federal Regulations, 2010 CFR

Section 436.24, Energy - Methodology and Procedures for Life Cycle Cost Analyses - Uncertainty analyses. If particular items of cost data...

2010-01-01

267

Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification  

E-print Network

In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise-ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations. In addition, the notion of a valid measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct ...
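One of the simplest correlation-plane SNR metrics of the kind discussed here is the primary peak ratio (PPR): the height of the tallest correlation peak divided by that of the second-tallest. A minimal Python sketch on a synthetic image pair; a real PIV pipeline would add windowing, subpixel peak fitting, and the filtered-correlation variants the abstract mentions:

    import numpy as np

    rng = np.random.default_rng(3)
    a = rng.random((64, 64))
    b = np.roll(a, (3, 5), axis=(0, 1)) + 0.2 * rng.random((64, 64))  # shifted + noise

    # circular cross correlation via the correlation theorem
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real

    peak = np.unravel_index(corr.argmax(), corr.shape)
    masked = corr.copy()
    for dr in (-1, 0, 1):       # exclude the 3x3 neighbourhood of the primary peak
        for dc in (-1, 0, 1):
            masked[(peak[0] + dr) % 64, (peak[1] + dc) % 64] = -np.inf
    ppr = corr[peak] / masked.max()
    print(f"displacement estimate {peak}, primary peak ratio {ppr:.2f}")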

Xue, Zhenyu; Vlachos, Pavlos P

2014-01-01

268

Uncertainty and its propagation in dynamics models  

SciTech Connect

The purpose of this paper is to bring together some characteristics of uncertainty that arise when we deal with dynamic models, and therefore with the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows us to define a "subdynamics" where the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision.
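For reference, the Chapman-Kolmogorov equation on which this formalism rests is, in its standard continuous-state form (notation assumed here, not taken from the paper):

$$p(x_3, t_3 \mid x_1, t_1) = \int p(x_3, t_3 \mid x_2, t_2)\, p(x_2, t_2 \mid x_1, t_1)\, \mathrm{d}x_2, \qquad t_1 < t_2 < t_3,$$

i.e. the propagator over a long interval is obtained by integrating over all intermediate states, which is what makes a reduced "subdynamics" with uncertainty folded into the evolution kernel possible.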

Devooght, J. [Universite Libre de Bruxelles, Brussels (Belgium)

1994-10-01

269

The Stock Market: Risk vs. Uncertainty.  

ERIC Educational Resources Information Center

This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty

Griffitts, Dawn

2002-01-01

270

Assessment of Uncertainty-Infused Scientific Argumentation  

ERIC Educational Resources Information Center

Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty

Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

2014-01-01

271

Regarding Uncertainty in Teachers and Teaching  

ERIC Educational Resources Information Center

The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective

Helsing, Deborah

2007-01-01

272

UK coastal flood risk; understanding the uncertainty  

Microsoft Academic Search

The sensitivity of flood risk mapping to the major sources of future climate uncertainty was investigated by propagating these uncertainties through a LISFLOOD inundation model of a significant flood event on the North Somerset coast, to the west of the UK. The largest source of uncertainty was found to be the effect of the global Mean Sea Level rise range

Matt Lewis; Paul Bates; Kevin Horsburgh; Ros Smith

2010-01-01

273

Moral principles as moral dispositions  

Microsoft Academic Search

What are moral principles? In particular, what are moral principles of the sort that (if they exist) ground moral obligations or, at the very least, particular moral truths? I argue that we can fruitfully conceive of such principles as real, irreducibly dispositional properties of individual persons (agents and patients) that are responsible for and thereby explain the moral properties of (e.g.) agents

Luke Robinson

274

Towards first-principles electrochemistry  

E-print Network

This doctoral dissertation presents a comprehensive computational approach to describe quantum mechanical systems embedded in complex ionic media, primarily focusing on the first-principles representation of catalytic ...

Dabo, Ismaila

2008-01-01

275

Forest management under uncertainty for multiple bird population objectives  

USGS Publications Warehouse

We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
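The belief-updating step described above is a direct application of Bayes' theorem to a discrete model set. A minimal Python sketch, with hypothetical likelihood values standing in for the fit of each alternative forest-bird model to the monitoring data:

    import numpy as np

    weights = np.array([0.25, 0.25, 0.25, 0.25])   # complete initial uncertainty
    likelihood = np.array([0.8, 0.3, 0.1, 0.05])   # hypothetical fit to monitoring data

    posterior = weights * likelihood
    posterior /= posterior.sum()
    print(posterior)   # model influence at the next decision opportunity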

Moore, C. T.; Plummer, W.T.; Conroy, M.J.

2005-01-01

276

The uncertainty of local flow parameters during inundation flow over complex topographies with elevation errors  

NASA Astrophysics Data System (ADS)

Since the topographical data obtained from LiDAR (Light Detection and Ranging) measurements are superior in resolution and accuracy to conventional geospatial data, aerial LiDAR has been widely used for obtaining geospatial information over the last decade. However, digital terrain models made from LiDAR data retain some degree of uncertainty as a result of the measurement principles and the operational limitations of LiDAR surveying. LiDAR cannot precisely measure topographical elements such as ground undulation covered by vegetation, curbstones, etc. Such instrumental and physical uncertainties may impact an estimated result in an inundation flow simulation. Meanwhile, how much and how these topographical uncertainties affect calculated results is not understood. To evaluate the effect of topographical uncertainty on the calculated inundation flow, three representative terrains were prepared that included errors in elevation. Here, the topographical uncertainty that was introduced was generated using a fractal algorithm in order to represent the spatial structure of the elevation uncertainty. Then, inundation flows over model terrains were calculated with an unstructured finite volume flow model that solved the shallow water equations. The sensitivity of the calculated inundation propagation, especially the local flow velocity, to the elevation uncertainty was evaluated. The predictability of inundation flow over complex topography is discussed, as well as its relationship to topographical features.
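A common way to generate elevation-error fields with fractal spatial structure, in the spirit of the algorithm mentioned above, is spectral synthesis of 1/f^beta noise. A minimal Python sketch; the exponent beta, the error standard deviation and the grid size are assumptions, not the paper's settings:

    import numpy as np

    def fractal_error(n, beta=2.0, sigma=0.15, seed=0):
        """Zero-mean error field (m) with a 1/f^beta power-law spectrum."""
        rng = np.random.default_rng(seed)
        k = np.fft.fftfreq(n)
        kx, ky = np.meshgrid(k, k)
        radius = np.hypot(kx, ky)
        radius[0, 0] = 1.0                     # avoid division by zero at DC
        amp = radius ** (-beta / 2.0)
        amp[0, 0] = 0.0                        # drop the mean component
        phase = np.exp(2j * np.pi * rng.random((n, n)))
        # real part only; a fuller version would enforce Hermitian symmetry
        field = np.fft.ifft2(amp * phase).real
        return sigma * field / field.std()     # scale to the target error std

    dem = np.zeros((256, 256))                 # stand-in for a real LiDAR DEM
    realization = dem + fractal_error(256, beta=2.0, sigma=0.15)

Each such realization gives one perturbed terrain over which the inundation model can be rerun, building up the sensitivity statistics the abstract describes.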

Tsubaki, Ryota; Kawahara, Yoshihisa

2013-04-01

277

An integrated approach for addressing uncertainty in the delineation of groundwater management areas  

NASA Astrophysics Data System (ADS)

Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible.
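The precautionary merge of scenario plumes reduces, computationally, to a cell-by-cell maximum: a location is included if any physically realistic scenario implicates it. A minimal Python sketch with random rasters standing in for the per-scenario capture probability plumes:

    import numpy as np

    rng = np.random.default_rng(4)
    plumes = [rng.random((50, 50)) for _ in range(3)]   # stand-ins for model output
    merged = np.maximum.reduce(plumes)                  # precautionary envelope
    management_area = merged >= 0.5                     # example delineation threshold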

Sousa, Marcelo R.; Frind, Emil O.; Rudolph, David L.

2013-05-01

278

General entropy-like uncertainty relations in finite dimensions  

E-print Network

We revisit entropic formulations of the uncertainty principle for an arbitrary pair of positive operator-valued measures (POVMs) $A$ and $B$, acting on a finite dimensional Hilbert space. Salicrú generalized $(h,\phi)$-entropies, including Rényi and Tsallis ones among others, are used as uncertainty measures associated with the probability distributions corresponding to the outcomes of the observables. We obtain a nontrivial lower bound for the sum of generalized entropies for any pair of entropic functionals, which is valid for both pure and mixed states. The bound depends on the overlap triplet $(c_A, c_B, c_{A,B})$, with $c_A$ (resp. $c_B$) being the overlap between the elements of the POVM $A$ (resp. $B$) and $c_{A,B}$ the overlap between the pair of POVMs. Our approach is inspired by that of de Vicente and Sánchez-Ruiz [Phys. Rev. A 77, 042110 (2008)] and consists in a minimization of the entropy sum subject to the Landau-Pollak inequality that links the maximum probabilities of both observables. We solve the constrained optimization problem in a geometrical way and, furthermore, when dealing with Rényi or Tsallis entropic formulations of the uncertainty principle, we overcome the Hölder conjugacy constraint imposed on the entropic indices by the Riesz-Thorin theorem. In the case of nondegenerate observables, we show that for given $c_{A,B} > \frac{1}{\sqrt{2}}$ the bound obtained is optimal, and that, for Rényi entropies, our bound improves on the Deutsch bound, but the Maassen-Uffink bound prevails when $c_{A,B} \leq \frac{1}{2}$. Finally, we illustrate by comparing our bound with previously known results in particular cases of Rényi and Tsallis entropies.
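As a concrete point of comparison, the Maassen-Uffink bound mentioned above reads $H(A) + H(B) \ge -2 \ln c$ for projective measurements with maximal overlap $c$. A minimal numeric check in Python for a qubit measured in the Z and X bases, where $c = 1/\sqrt{2}$ and the bound is $\ln 2$:

    import numpy as np

    def shannon(p):
        """Shannon entropy (natural log) of a probability vector."""
        p = p[p > 1e-12]
        return -(p * np.log(p)).sum()

    theta = 0.3                                     # an arbitrary pure state
    psi = np.array([np.cos(theta), np.sin(theta)])
    z_basis = np.eye(2)
    x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    pA = np.abs(z_basis @ psi) ** 2                 # Z-measurement outcome probabilities
    pB = np.abs(x_basis @ psi) ** 2                 # X-measurement outcome probabilities
    c = max(abs(z_basis[i] @ x_basis[j]) for i in range(2) for j in range(2))

    print(shannon(pA) + shannon(pB), ">=", -2 * np.log(c))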

S. Zozor; G. M. Bosyk; M. Portesi

2013-11-21

279

Bateman's principle and immunity.  

PubMed Central

The immunocompetence handicap hypothesis (ICHH) of Folstad and Karter has inspired a large number of studies that have tried to understand the causal basis of parasite-mediated sexual selection. Even though this hypothesis is based on the double function of testosterone, a hormone restricted to vertebrates, studies of invertebrates have tended to provide central support for specific predictions of the ICHH. I propose an alternative hypothesis that explains many of the findings without relying on testosterone or other biochemical feedback loops. This alternative is based on Bateman's principle, that males gain fitness by increasing their mating success whilst females increase fitness through longevity because their reproductive effort is much higher. Consequently, I predict that females should invest more in immunity than males. The extent of this dimorphism is determined by the mating system and the genetic correlation between males and females in immune traits. In support of my arguments, I mainly use studies on insects that share innate immunity with vertebrates and have the advantage that they are easier to study. PMID:11958720

Rolff, Jens

2002-01-01

280

Uncertainty in cancer risk estimates  

SciTech Connect

Several existing databases compiled by Gold et al. for carcinogenesis bioassays are examined to obtain estimates of reproducibility of cancer rates across experiments, strains, and rodent species. A measure of carcinogenic potency is given by the TD$_{50}$ (daily dose that causes a tumor type in 50% of the exposed animals that otherwise would not develop the tumor in a standard lifetime). The lognormal distribution can be used to model the uncertainty of the estimates of potency (TD$_{50}$) and the ratio of TD$_{50}$'s between two species. The practice of basing cancer risk estimates on the most sensitive rodent species-strain-sex and using interspecies dose scaling based on body surface area appears to overestimate cancer rates for these 20 human carcinogens by about one order of magnitude. For chemicals where the dose-response is nearly linear below experimental doses, cancer risk estimates based on animal data are not necessarily conservative and may range from a factor of 10 too low for human carcinogens up to a factor of 1000 too high for approximately 95% of the chemicals tested. These limits may need to be modified for specific chemicals where additional mechanistic or pharmacokinetic information may suggest alterations or where particularly sensitive subpopulations may be exposed. Supralinearity could lead to anticonservative estimates of cancer risk. Underestimating cancer risk by a specific factor has a much larger impact on the actual number of cancer cases than overestimates of smaller risk by the same factor. This paper does not address the uncertainties in high to low dose extrapolation. If the dose-response is sufficiently nonlinear at low doses to produce cancer risks near zero, then low-dose risk estimates based on linear extrapolation are likely to overestimate risk and the limits of uncertainty cannot be established. 17 refs., 1 fig., 1 tab.

Gaylor, D.W.; Chen, J.J.; Sheehan, D.M. (Food and Drug Administration, Jefferson, AR (United States))

1993-04-01

281

Failure probability under parameter uncertainty.  

PubMed

In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. PMID:21175720
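The paper's central effect is easy to reproduce by simulation: if the control threshold is set at an estimated quantile, the expected failure frequency exceeds the nominal level because the parameter estimates themselves are noisy. A minimal Monte Carlo sketch in Python for a normal log-loss, with an illustrative sample size and level:

    import numpy as np

    rng = np.random.default_rng(5)
    mu, sigma, n, k = 0.0, 1.0, 30, 2.326   # true params; 2.326 = nominal 99% point

    failures, trials = 0, 20_000
    for _ in range(trials):
        data = rng.normal(mu, sigma, n)                    # log-losses observed so far
        threshold = data.mean() + k * data.std(ddof=1)     # estimated 99% quantile
        failures += rng.normal(mu, sigma) > threshold      # next-period exceedance

    print("nominal 1.0%, realized:", 100 * failures / trials, "%")

With n = 30 the realized failure frequency comes out noticeably above 1%, which is the phenomenon the exact location-scale results in the paper quantify.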

Gerrard, R; Tsanakas, A

2011-05-01

282

Conditional Uncertainty in Anthropogenic Global Climate Change  

NASA Astrophysics Data System (ADS)

Although the uncertainty associated with human-induced climate change is less than that in many other human activities, such as economic management and warfare, the uncertainties in the climate system have assumed a disproportionate profile in public debate. Achieving improved public understanding is dependent on consistent use of the various categories of change and their respective uncertainties. Probably the most important distinction to be made is between uncertainties associated with uncertain societal choices and uncertainties associated with the consequences of such choices. For the biogeochemical system, categories of uncertainty are adapted from those used in the study of uncertainty for the REgional Carbon Assessment and Processes (RECCAP) study. These are then extended and applied to the discussion of the combined carbon-climate system. Characterising uncertainties in future change requires a consistent approach to propagating into the future the uncertainties associated with the past and present state of the climate system. Again, previous analysis for the carbon system is extended to the carbon-climate system. The potential category ambiguities that arise from feedbacks between climate and carbon are identified and resolved. A consistent characterisation of the uncertainties in the earth system provides a basis for factoring the overall uncertainty into human and natural contributions.

Enting, I. G.

2012-12-01

283

Principles of Instructed Language Learning  

ERIC Educational Resources Information Center

This article represents an attempt to draw together findings from a range of second language acquisition studies in order to formulate a set of general principles for language pedagogy. These principles address such issues as the nature of second language (L2) competence (as formulaic and rule-based knowledge), the contributions of both focus on

Ellis, Rod

2005-01-01

284

Meaty Principles for Environmental Educators.  

ERIC Educational Resources Information Center

Suggests that educated persons should be exposed to a body of conceptual knowledge which includes basic principles of the biological and physical sciences. Practical examples involving force, sound, light, waves, and density of water are cited. A lesson on animal tracks using principles of force and pressure is also described. (DH)

Rockcastle, V. N.

1985-01-01

285

Kautilya on principles of taxation  

Microsoft Academic Search

Purpose - The purpose of this paper is to present Kautilya's principles of taxation during the fourth century BCE. Design/methodology/approach - Modern tools of economic analysis are used to present Kautilya's principles on income taxation. Findings - Kautilya implicitly suggests a linear income tax. He emphasizes fairness, stability of tax structure, fiscal federalism, avoidance of heavy taxation, ensuring of tax

Balbir S. Sihag

2009-01-01

286

Biology 2250 Principles of Genetics  

E-print Network

Biology 2250 Principles of Genetics. Instructors: Dr. Steven M. Carr - Molecular Genetics; Dr. David J. Innes - Mendelian Genetics. Lab Instructor: Valerie Power. Genetics Laboratory: SN-4110 (Lab organization meeting week of Sept. 13: Groups A & B). Lab Demonstrators

Innes, David J.

287

Design Principles for Children's Technology  

Microsoft Academic Search

Designers of children's technology and software face distinctive challenges. Many design principles used for adult interfaces cannot be applied to children's products because the needs, skills, and expectations of this user population are drastically different than those of adults. In recent years, designers have started developing design principles for children, but this work has not been collected in one place.

Sonia Chiasson; Carl Gutwin

288

The legal status of uncertainty  

NASA Astrophysics Data System (ADS)

Authorities of civil protection attach great importance to scientific assessment through the widespread use of mathematical models that have been implemented in order to prevent and mitigate the effects of natural hazards. These models, however, are far from deterministic; moreover, the uncertainty that characterizes them plays an important role in the scheme of prevention of natural hazards. We are, in fact, presently experiencing a detrimental increase in legal actions taken against the authorities of civil protection who, relying on the forecasts of mathematical models, fail to protect the population. It is our profound concern that civilians have been granted the right to be protected, by any means and to the same extent, both from natural hazards and from the fallacious behaviour of those who should guarantee individual safety. At the same time, however, a dangerous overcriminalization could have a negative impact on the civil protection system, inducing a costly and ineffective defensive behaviour. A few case studies are presented in which the role of uncertainty in numerical predictions is made evident and discussed. Scientists thus need to help policymakers agree on sound procedures that recognize the real level of unpredictability. Hence, we suggest the creation of an international and interdisciplinary committee, with the aim of having politics, jurisprudence and science communicate, to find common solutions to a common problem.

Ferraris, L.; Miozzo, D.

2009-09-01

289

Physical Principles of Evolution  

NASA Astrophysics Data System (ADS)

Theoretical biology is incomplete without a comprehensive theory of evolution, since evolution is at the core of biological thought. Evolution is visualized as a migration process in genotype or sequence space that is either an adaptive walk driven by some fitness gradient or a random walk in the absence of (sufficiently large) fitness differences. The Darwinian concept of natural selection consisting in the interplay of variation and selection is based on a dichotomy: All variations occur on genotypes whereas selection operates on phenotypes, and relations between genotypes and phenotypes, as encapsulated in a mapping from genotype space into phenotype space, are central to an understanding of evolution. Fitness is conceived as a function of the phenotype, represented by a second mapping from phenotype space into nonnegative real numbers. In the biology of organisms, genotype-phenotype maps are enormously complex and relevant information on them is exceedingly scarce. The situation is better in the case of viruses but so far only one example of a genotype-phenotype map, the mapping of RNA sequences into RNA secondary structures, has been investigated in sufficient detail. It provides direct information on RNA selection in vitro and test-tube evolution, and it is a basis for testing in silico evolution on a realistic fitness landscape. Most of the modeling efforts in theoretical and mathematical biology today are done by means of differential equations but stochastic effects are of undeniably great importance for evolution. Population sizes are much smaller than the numbers of genotypes constituting sequence space. Every mutant, after all, has to begin with a single copy. Evolution can be modeled by a chemical master equation, which (in principle) can be approximated by a stochastic differential equation. In addition, simulation tools are available that compute trajectories for master equations. The accessible population sizes in the range of $10^7 \le N \le 10^8$ molecules are commonly too small for problems in chemistry but sufficient for biology.
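The trajectory-level simulation alluded to at the end of the passage is typically done with a Gillespie-type stochastic simulation algorithm. A minimal Python sketch for a single replicating and degrading species; the rates and copy numbers are illustrative, far below the $10^7$-$10^8$ populations quoted above:

    import numpy as np

    rng = np.random.default_rng(6)
    x, t, t_end = 100, 0.0, 10.0        # copy number, current time, horizon
    k_rep, k_deg = 1.0, 0.9             # replication and degradation rate constants

    while t < t_end and x > 0:
        rates = np.array([k_rep * x, k_deg * x])
        total = rates.sum()
        t += rng.exponential(1.0 / total)                 # waiting time to next event
        x += 1 if rng.random() < rates[0] / total else -1 # replicate or degrade
    print(t, x)

Each run is one exact sample path of the underlying master equation; repeating the loop many times builds up the distribution that deterministic differential equations cannot capture.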

Schuster, Peter

290

Principles of animal extrapolation  

SciTech Connect

Animal Extrapolation presents a comprehensive examination of the scientific issues involved in extrapolating results of animal experiments to human response. This text attempts to present a comprehensive synthesis and analysis of the host of biomedical and toxicological studies of interspecies extrapolation. Calabrese's work presents not only the conceptual basis of interspecies extrapolation, but also illustrates how these principles may be better used in the selection of animal experimentation models and in the interpretation of animal experimental results. The book's theme centers on four types of extrapolation: (1) from the average animal model to the average human; (2) from small animals to large ones; (3) from the high-risk animal to the high-risk human; and (4) from high doses of exposure to lower, more realistic, doses. Calabrese attacks the issues of interspecies extrapolation by dealing individually with the factors which contribute to interspecies variability: differences in absorption, intestinal flora, tissue distribution, metabolism, repair mechanisms, and excretion. From this foundation, Calabrese then discusses the heterogeneity of these same factors in the human population in an attempt to evaluate the representativeness of various animal models in light of interindividual variations. In addition to discussing the question of suitable animal models for specific high-risk groups and specific toxicological endpoints, the author also examines extrapolation questions related to the use of short-term tests to predict long-term human carcinogenicity and birth defects. The book is comprehensive in scope and specific in detail; for those environmental health professionals seeking to understand the toxicological models that underlie health risk assessments, Animal Extrapolation is a valuable information source.

Calabrese, E.J.

1991-01-01

291

BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance  

NASA Astrophysics Data System (ADS)

Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable to all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read the GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do, but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker
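The core GUM recipe the review refers to fits in a few lines: combine input standard uncertainties through sensitivity coefficients, then form the expanded uncertainty with a coverage factor. A minimal Python sketch for an assumed measurement model y = a*b (all numbers illustrative):

    import numpy as np

    a, u_a = 5.00, 0.02     # estimate and standard uncertainty of input a
    b, u_b = 2.00, 0.01     # estimate and standard uncertainty of input b
    y = a * b
    c_a, c_b = b, a                              # sensitivities dy/da, dy/db
    u_y = np.hypot(c_a * u_a, c_b * u_b)         # combined standard uncertainty
    U = 2.0 * u_y                                # expanded uncertainty, k = 2 (~95%)
    print(f"y = {y:.3f} +/- {U:.3f} (k=2)")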

Lira, Ignacio

2003-08-01

292

Solving Navigational Uncertainty Using Grid Cells on Robots  

PubMed Central

To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our study to be a starting point for animal experiments that test navigation in perceptually ambiguous environments. PMID:21085643

Milford, Michael J.; Wiles, Janet; Wyeth, Gordon F.

2010-01-01

293

An Inconvenient Principle  

NASA Astrophysics Data System (ADS)

At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle submitted to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces in mechanics a quantity characteristic of optics, the velocity of light. However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".
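The "theorist's version" of the Mach-Zehnder interferometer amounts to linear algebra on two mode amplitudes: a beam-splitter matrix, a phase in one arm, and a second beam splitter. A minimal Python sketch; the symmetric beam-splitter convention below is one common choice, and others move the factor of i around:

    import numpy as np

    BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)     # symmetric 50/50 beam splitter

    def output_probs(phi):
        """Detection probabilities at the two output ports for arm phase phi."""
        P = np.diag([np.exp(1j * phi), 1.0])           # phase in the upper arm
        out = BS @ P @ BS @ np.array([1.0, 0.0])       # photon enters one input port
        return np.abs(out) ** 2

    for phi in (0.0, np.pi / 2, np.pi):
        print(phi, output_probs(phi))

With phi = 0 the photon always exits one port and with phi = pi always the other: precisely the interference behaviour the chapter builds toward.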

Bellac, Michel Le

2014-11-01

294

Uncertainty in Lagrangian pollutant transport simulations due to meteorological uncertainty at mesoscale  

NASA Astrophysics Data System (ADS)

Lagrangian particle dispersion models require meteorological fields as input. Uncertainty in the driving meteorology is one of the major uncertainties in the results. The propagation of uncertainty through the system is not simple, and has not been thoroughly explored. Here, we take an ensemble approach. Six different configurations of the Weather Research and Forecasting (WRF) model drive otherwise identical simulations with FLEXPART for 49 days over eastern North America. The ensemble spreads of wind speed, mixing height, and tracer concentration are presented. Uncertainty of tracer concentrations due solely to meteorological uncertainty is 30-40%. Spatial and temporal averaging reduces the uncertainty marginally. Tracer age uncertainty due solely to meteorological uncertainty is 15-20%. These are lower bounds on the uncertainty, because a number of processes are not accounted for in the analysis.
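The ensemble spread quoted above is essentially the member-to-member relative standard deviation of tracer concentration, computed per grid cell. A minimal Python sketch with random fields standing in for the six WRF-FLEXPART members:

    import numpy as np

    rng = np.random.default_rng(7)
    members = rng.lognormal(0.0, 0.35, size=(6, 40, 40))   # stand-in model output

    rel_spread = members.std(axis=0, ddof=1) / members.mean(axis=0)
    print("median relative spread:", np.median(rel_spread))   # ~30-40% regime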

Angevine, W. M.; Brioude, J.; McKeen, S.; Holloway, J. S.

2014-07-01

295

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G  

SciTech Connect

Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
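The validation step described above, sampling elicited input distributions and propagating them through a Gaussian plume model, can be sketched in a few lines. The ground-level centerline concentration for a ground-level source is C = Q / (pi * u * sigma_y * sigma_z); the dispersion-parameter distributions below are assumptions, not the elicited ones:

    import numpy as np

    rng = np.random.default_rng(8)
    Q, u = 1.0, 5.0                       # source strength (kg/s), wind speed (m/s)

    def plume_centerline(sigma_y, sigma_z):
        """Ground-level centerline concentration for a ground-level source."""
        return Q / (np.pi * u * sigma_y * sigma_z)

    sigma_y = rng.lognormal(np.log(70.0), 0.3, 10_000)   # sampled dispersion (m)
    sigma_z = rng.lognormal(np.log(40.0), 0.4, 10_000)
    conc = plume_centerline(sigma_y, sigma_z)
    print(np.percentile(conc, [5, 50, 95]))              # resulting distribution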

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others

1995-01-01

296

Induction of models under uncertainty  

NASA Technical Reports Server (NTRS)

This paper outlines a procedure for performing induction under uncertainty. This procedure uses a probabilistic representation and uses Bayes' theorem to decide between alternative hypotheses (theories). This procedure is illustrated by a robot with no prior world experience performing induction on data it has gathered about the world. The particular inductive problem is the formation of class descriptions both for the tutored and untutored cases. The resulting class definitions are inherently probabilistic and so do not have any sharply defined membership criterion. This robot example raises some fundamental problems about induction; particularly, it is shown that inductively formed theories are not the best way to make predictions. Another difficulty is the need to provide prior probabilities for the set of possible theories. The main criterion for such priors is a pragmatic one aimed at keeping the theory structure as simple as possible, while still reflecting any structure discovered in the data.
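Deciding between alternative hypotheses with Bayes' theorem, with priors biased toward simpler theories as the passage suggests, reduces to combining log-likelihoods with log-priors. A minimal Python sketch in which both sets of numbers are illustrative:

    import numpy as np

    log_likelihood = np.array([-12.0, -10.5])   # fit of each theory to the data
    log_prior = np.array([-1.0, -4.0])          # simpler theory gets the higher prior

    log_post = log_likelihood + log_prior
    post = np.exp(log_post - log_post.max())    # shift for numerical stability
    post /= post.sum()
    print(post)                                 # posterior degree of belief per theory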

Cheeseman, Peter

1986-01-01

297

Uncertainties drive arsenic rule delay  

SciTech Connect

The US Environmental Protection Agency (USEPA) is under court order to sign a proposed rule for arsenic by Nov. 30, 1995. The agency recently announced that it will not meet this deadline, citing the need to gather additional information. Development of a National Interim Primary Drinking Water Regulation for arsenic has been delayed several times over the past 10 years because of uncertainties regarding health issues and costs associated with compliance. The early history of development of the arsenic rule has been reviewed. Only recent developments are reviewed here. The current maximum contaminant level (MCL) for arsenic in drinking water is 0.05 mg/L. This MCL was set in 1975, based on the 1962 US Public Health Standards. The current Safe Drinking Water Act (SDWA) requires that the revised arsenic MCL be set as close to the MCL goal (MCLG) as is feasible using best technology, treatment techniques, or other means and taking cost into consideration.

Pontius, F.W.

1995-04-01

298

Dirac particles' tunnelling from 5-dimensional rotating black strings influenced by the generalized uncertainty principle  

E-print Network

The standard Hawking formula predicts the complete evaporation of black holes. Taking into account effects of quantum gravity, we investigate fermions' tunnelling from a 5-dimensional rotating black string. The temperature is determined not only by the string, but is also affected by the quantum number of the emitted fermion and by the extra spatial dimension. The quantum correction slows down the increase of the temperature, which naturally leads to a remnant in the evaporation.
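For reference, the quantum-gravity effect invoked in such tunnelling calculations is commonly encoded in a generalized uncertainty relation of the (convention-dependent) form

$$\Delta x\, \Delta p \ \ge\ \frac{\hbar}{2}\left[1 + \beta\, (\Delta p)^2\right],$$

which implies a minimal position uncertainty $\Delta x_{\min} = \hbar\sqrt{\beta}$; corrections of this type are what slow the temperature growth and allow a remnant in analyses like the one above.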

Deyou Chen

2013-12-07

299

Does a String-Particle Dualism Indicate the Uncertainty Principle's Philosophical Dichotomy?  

NASA Astrophysics Data System (ADS)

String theory may allow resonances of neutrino-wave-strings to account for all experimentally detected phenomena. Particle theory logically, and physically, provides an alternate, contradictory dualism. Is it contradictory to symbolically and simultaneously state that $\lambda p = h$, but that the product of position and momentum must be greater than, or equal to, the same (scaled) Planck's constant? Our previous electron and positron models require 'membrane' vibrations of string-linked neutrinos, in closed loops, to behave like traveling waves, Tws, intermittently metamorphosing into alternately ascending and descending standing waves, Sws, between the nodes, which advance sequentially through 360 degrees. Accumulated time passages as Tws detail required "loop currents" supplying magnetic moments. Remaining time partitions into the Sws' alternately ascending and descending phases: the physical basis of the experimentally established 3D modes of these "particles." Waves seem to indicate that point mass cannot be required to exist instantaneously at one point; Mott's and Sneddon's Wave Mechanics says that a constant, [mass], is present. String-like resonances may also account for homeopathy's efficacy, dark matter, and constellations' "stick-figure projections," as indicated by some traditional cultures, all possibly involving neutrino strings. To cite this abstract, use the following reference: http://meetings.aps.org/link/BAPS.2007.NES07.C2.5

Mc Leod, David; Mc Leod, Roger

2007-04-01

300

The uncertainty principle in resonant gravitational wave antennae and quantum non-demolition measurement schemes  

NASA Technical Reports Server (NTRS)

A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

1993-01-01

302

Principles of Pharmacotherapy: I. Pharmacodynamics  

PubMed Central

This paper and the ensuing series present the principles guiding and affecting the ability of drugs to produce therapeutic benefit or untoward harm. The principles of pharmacodynamics and pharmacokinetics, the physiologic basis of adverse drug reactions and suitable antidotal therapy, and the biologic basis of drug allergy, drug-drug interactions, pharmacogenetics, teratology and hematologic reactions to chemicals are explored. These principles serve to guide those administering and using drugs to attain the maximum benefit and least attendant harm from their use. Such is the goal of rational therapeutics. PMID:3046440

Pallasch, Thomas J.

1988-01-01

303

OECD Principles of Corporate Governance  

NSDL National Science Digital Library

The "Organisation for Economic Co-operation and Development Principles of Corporate Governance" sets out a structure for directing and controlling corporate businesses. This document (html or .pdf) consists of five sections detailing the principles: "The rights of shareholders," "The equitable treatment of shareholders," "The role of stakeholders in corporate governance," "Disclosure and transparency," and "The responsibilities of the board," as well as annotations for each of the sections. Be sure to visit the OECD Principles of Corporate Governance Q&A page, linked at the top of the page.

304

Assessment of AERONET-OC LWN uncertainties  

NASA Astrophysics Data System (ADS)

This study presents a detailed analysis of the uncertainties affecting the normalized water-leaving radiance (LWN) from above-water measurements performed within the context of the Ocean Colour component of the Aerosol Robotic Network (AERONET-OC). The analysis, conducted in agreement with the Guide to the Expression of Uncertainty in Measurement (GUM), indicates uncertainties of LWN markedly dependent on the optical properties of seawater for a number of AERONET-OC sites located in different marine regions. Results obtained for the Adriatic Sea site, characterized by a large variety of measurement conditions, confirm previous uncertainties from an independent study indicating median values of relative combined uncertainties of 5% in the blue-green part of the spectrum and of approximately 7% in the red. Additional investigations show that the former uncertainties can be reduced by 1% when restricting the determination of AERONET-OC LWN to measurements performed at low sun zenith angle and low aerosol optical thickness.
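
A minimal sketch of the GUM-style combination such a study performs, assuming independent inputs; the component names and the relative uncertainty values below are illustrative only, not taken from the AERONET-OC analysis:

```python
import math

# Hypothetical relative standard uncertainty components for L_WN (%).
components = {
    "calibration": 2.7,
    "environmental/model corrections": 3.0,
    "sensor noise": 1.5,
}

# GUM: for independent inputs, combine in quadrature (root-sum-square).
u_combined = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined relative uncertainty: {u_combined:.1f}%")  # ~4.3% here
```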

Gergely, Mathias; Zibordi, Giuseppe

2014-02-01

305

Weak Equivalence Principle Test on a Sounding Rocket  

E-print Network

SR-POEM, our principle-of-equivalence measurement on a sounding rocket, will compare the free-fall rates of two substances, yielding an uncertainty of 10^-16 in the estimate of η. During the past two years, the design concept has matured and we have been working on the required technology, including a laser gauge that is self-aligning and able to reach 0.1 pm per root hertz for periods up to 40 s. We describe the status and plans for this project.

James D. Phillips; Bijunath R. Patla; Eugeniu M. Popescu; Emanuele Rocco; Rajesh Thapa; Robert D. Reasenberg; Enrico C. Lorenzini

2010-08-04

306

Neural coding of uncertainty and probability.  

PubMed

Organisms must act in the face of sensory, motor, and reward uncertainty stemming from a pandemonium of stochasticity and missing information. In many tasks, organisms can make better decisions if they have at their disposal a representation of the uncertainty associated with task-relevant variables. We formalize this problem using Bayesian decision theory and review recent behavioral and neural evidence that the brain may use knowledge of uncertainty, confidence, and probability. PMID:25032495

Ma, Wei Ji; Jazayeri, Mehrdad

2014-01-01

307

Geostatistical modelling of uncertainty in soil science  

Microsoft Academic Search

This paper addresses the issue of modelling the uncertainty about the value of continuous soil attributes, at any particular unsampled location (local uncertainty) as well as jointly over several locations (multiple-point or spatial uncertainty). Two approaches are presented: kriging-based and simulation-based techniques that can be implemented within a parametric (e.g. multi-Gaussian) or non-parametric (indicator) framework. As expected in theory and

P. Goovaerts

2001-01-01

308

Reactivity worth determination for rod position uncertainty  

Microsoft Academic Search

This document provides the technical basis for determining the reactivity uncertainty associated with control rod position uncertainty. This report supports resolution of Issue B-235, Technical Specifications Submittal No. 73. The N Reactor Technical Specifications B 3\\/4 2.1, Bases for Horizontal Control Rod Systems, allows an uncertainty of 14 inches in rod tip location. Specifically, this reference states, ``An allowance of

K. N. Schwinkendorf; D. A. Tollefson; D. H. Finfrock

1988-01-01

309

Extrema Principles Of Dissipation In Fluids  

NASA Technical Reports Server (NTRS)

Report discusses application of principle of least action and other variational or extrema principles to dissipation of energy and production of entropy in fluids. Principle of least action applied successfully to dynamics of particles and to quantum mechanics, but not universally accepted that variational principles applicable to thermodynamics and hydrodynamics. Report argues for applicability of some extrema principles to some simple flows.

Horne, W. Clifton; Karamcheti, Krishnamurty

1991-01-01

310

The Precautionary Principle Also Applies to Public Health Actions  

PubMed Central

The precautionary principle asserts that the burden of proof for potentially harmful actions by industry or government rests on the assurance of safety and that when there are threats of serious damage, scientific uncertainty must be resolved in favor of prevention. Yet we in public health are sometimes guilty of not adhering to this principle. Examples of actions with unintended negative consequences include the addition of methyl tert-butyl ether to gasoline in the United States to decrease air pollution, the drilling of tube wells in Bangladesh to avoid surface water microbial contamination, and villagewide parenteral antischistosomiasis therapy in Egypt. Each of these actions had unintended negative consequences. Lessons include the importance of multidisciplinary approaches to public health and the value of risk-benefit analysis, of public health surveillance, and of a functioning tort system, all of which contribute to effective precautionary approaches. PMID:11527755

Goldstein, Bernard D.

2001-01-01

311

Measurement uncertainty analysis in medical physics.  

E-print Network

This research demonstrates that an internationally recognised measurement uncertainty analysis process, the GUM, can be successfully applied within the field of Medical Physics. The research ...

Gregory, Kent

2011-01-01

312

Responding to uncertainty in nursing practice.  

PubMed

Uncertainty is a fact of life for practising clinicians and cannot be avoided. This paper outlines the model of uncertainty presented by Katz (1988, Cambridge University Press, Cambridge, UK. pp. 544-565) and examines the descriptive and normative power of three broad theoretical and strategic approaches to dealing with uncertainty: rationality, bounded rationality and intuition. It concludes that nursing research and development (R&D) must acknowledge uncertainty more fully in its R&D agenda and that good-quality evaluation studies which directly compare intuitive with rational-analytical approaches for given clinical problems should be a dominant feature of future R&D. PMID:11524107

Thompson, C; Dowding, D

2001-10-01

313

Modeling uncertainty: quicksand for water temperature modeling  

USGS Publications Warehouse

Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.

Bartholow, John M.

2003-01-01

314

Methodological principles of modern thermodynamics  

E-print Network

The article describes the basic principles of a theory that unites the thermodynamics of reversible and irreversible processes and extends its methods to processes of transfer and transformation of any form of energy.

V. A. Etkin

2014-01-02

315

Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods  

SciTech Connect

The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst violations of its assumptions, CILTS should be considered as having something like a factor-of-two uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
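
A first-principles sketch of the kind of error analysis described, assuming the idealized steady-state CILTS relation Q = E/C (airflow equals tracer emission rate over mean tracer concentration); all numbers are illustrative, not from the study:

```python
import math

E, u_E = 5.0e-6, 0.05   # tracer emission rate (m^3/h), 5% relative uncertainty
C, u_C = 2.0e-8, 0.10   # sampled mean concentration (volume fraction), 10%
V, u_V = 400.0, 0.05    # house volume (m^3), 5%

Q = E / C               # airflow, m^3/h
ach = Q / V             # air changes per hour

# First-order propagation for a product/quotient: add relative variances.
u_ach = math.sqrt(u_E**2 + u_C**2 + u_V**2)
print(f"airflow = {Q:.0f} m^3/h, ACH = {ach:.2f} 1/h +/- {u_ach*100:.0f}%")
```

With these inputs the combined relative uncertainty is about 12%, consistent in spirit with the 10-15% "ideal circumstances" figure quoted above.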

Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

2013-12-01

316

Developmental principles: fact or fiction.  

PubMed

While still at school, most of us are deeply impressed by the underlying principles that so beautifully explain why the chemical elements are ordered as they are in the periodic table, and may wonder, with the theoretician Brian Goodwin, "whether there might be equally powerful principles that account for the awe-inspiring diversity of body forms in the living realm". We have considered the arguments for developmental principles, conclude that they do exist and have specifically identified features that may generate principles associated with Hox patterning of the main body axis in bilaterian metazoa in general and in the vertebrates in particular. We wonder whether this exercise serves any purpose. The features we discuss were already known to us as parts of developmental mechanisms and defining developmental principles (how, and at which level?) adds no insight. We also see little profit in the proposal by Goodwin that there are principles outside the emerging genetic mechanisms that need to be taken into account. The emerging developmental genetic hierarchies already reveal a wealth of interesting phenomena, whatever we choose to call them. PMID:22489210

Durston, A J

2012-01-01

317

The principle of finiteness - a guideline for physical laws  

NASA Astrophysics Data System (ADS)

I propose a new principle in physics: the principle of finiteness (FP). It stems from the definition of physics as a science that deals with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, FP postulates that the mathematical formulation of legitimate laws in physics should prevent exactly zero or infinite solutions. I propose finiteness as a postulate, as opposed to a statement whose validity has to be corroborated by, or derived theoretically or experimentally from, other facts, theories or principles. Some consequences of FP are discussed, first in general and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The corrected Lorentz transformations include an additional translation term depending on the minimum length epsilon. The relativistic gamma is replaced by a corrected gamma that is finite for v = c. To comply with FP, physical laws should include the relevant extremum finite values in their mathematical formulation. An important prediction of FP is that there is a maximum attainable relativistic mass/energy, which is the same for all subatomic particles, implying a maximum theoretical value for cosmic-ray energy. The Generalized Uncertainty Principle required by Quantum Gravity is actually a necessary consequence of FP at the Planck scale. Therefore, FP may possibly contribute to the axiomatic foundation of Quantum Gravity.

Sternlieb, Abraham

2013-04-01

318

Funding the Unfundable: Mechanisms for Managing Uncertainty in Decisions on the Introduction of New and Innovative Technologies into Healthcare Systems  

Microsoft Academic Search

As tensions between payers, responsible for ensuring prudent and principled use of scarce resources, and both providers and patients, who legitimately want access to technologies from which they could benefit, continue to mount, interest in approaches to managing the uncertainty surrounding the introduction of new health technologies has heightened. The purpose of this project was to compile an inventory of

Tania Stafinski; Christopher J. McCabe; Devidas Menon

2010-01-01

319

Individuation, counting, and statistical inference: The role of frequency and whole-object representations in judgment under uncertainty  

Microsoft Academic Search

Evolutionary approaches to judgment under uncertainty have led to new data showing that untutored subjects reliably produce judgments that conform to many principles of probability theory when (a) they are asked to compute a frequency instead of the probability of a single event, and (b) the relevant information is expressed as frequencies. But are the frequency-computation systems implicated in

Gary L. Brase; Leda Cosmides; John Tooby

1998-01-01

320

Uncertainty Analysis of CROPGRO-Cotton Model  

NASA Astrophysics Data System (ADS)

Applications of crop simulation models have become an inherent part of research and the decision-making process. As many decision-making processes rely solely on the results obtained from simulation models, consideration of model uncertainties along with model accuracy has also become increasingly important. The newly developed CROPGRO-Cotton model is a complex simulation model that has been heavily parameterized. The values of those parameters were obtained from the literature, which also carries uncertainties. The true uncertainty associated with important model parameters was not known. The objective of this study was to estimate the uncertainties associated with model parameters and the corresponding uncertainties in model outputs. The uncertainty assessment was carried out using the widely accepted Generalized Likelihood Uncertainty Estimation (GLUE) technique. Data for this analysis were collected from four different experiments at three geographic locations. Preliminary results show that the uncertainties in model input parameters were narrowed down significantly from the prior knowledge of the selected parameters. The expected means of parameters obtained from their posterior distributions were not considerably different from their prior means and default values in the model; importantly, however, the coefficients of variation of those parameters were reduced considerably. Maximum likelihood estimates of the selected parameters improved the model performance. The fit of the model to measured LAI and biomass components was reasonably good, with R-squared values for total above-ground biomass for all four sites ranging between 0.86 and 0.98. The approximate reduction of uncertainties in input parameters ranged between 25%-85%, and the corresponding reductions in model output uncertainties ranged between 62%-76%. Most of the measurements were covered within the 95% confidence interval estimated from the 2.5% and 97.5% quantiles of the cumulative distributions of model outputs generated from the posterior distributions of model parameters. The study demonstrated efficient prediction of uncertainties in model inputs and outputs using the widely accepted GLUE methodology.
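
A compact sketch of the GLUE procedure the study applies, with a toy one-parameter "model" standing in for CROPGRO-Cotton; the observation, likelihood width, and threshold are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
obs = 3.0                                # an observed quantity, e.g. peak LAI

def model(theta):                        # stand-in for the crop model
    return 2.0 * theta

# 1. Sample parameters from the prior.
theta = rng.uniform(0.5, 3.0, size=10000)
sim = model(theta)

# 2. Informal (Gaussian-type) likelihood of each parameter set.
lik = np.exp(-0.5 * ((sim - obs) / 0.3) ** 2)

# 3. Keep "behavioral" sets above a threshold; weight them by likelihood.
keep = lik > 0.05
w = lik[keep] / lik[keep].sum()

# 4. Posterior summary and a 95% band from the 2.5%/97.5% quantiles
#    (GLUE proper weights the quantiles by likelihood; unweighted for brevity).
post_mean = np.sum(w * theta[keep])
lo, hi = np.quantile(sim[keep], [0.025, 0.975])
print(f"posterior mean theta = {post_mean:.2f}, 95% band: [{lo:.2f}, {hi:.2f}]")
```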

Pathak, T. B.; Jones, J. W.; Fraisse, C.; Wright, D.; Hoogenboom, G.; Judge, J.

2009-12-01

321

Data Communication Principles Reliable Data Transfer  

E-print Network

Data Communication Principles: Switching; Reliable Data Transfer; Data Communication Basics. Mahalingam Ramkumar, Mississippi State University, MS. September 8, 2014. CSE 4153 / 6153.

Ramkumar, Mahalingam

322

Risk communication and the Precautionary Principle.  

PubMed

The perception of risks for environment and health deriving from globalization processes and an uncontrolled use of modern technologies is growing everywhere. The greater the capacity of controlling living conditions, the larger is the possibility of misusing this power. In environmental and occupational health research we tend to reduce the complexity of the observed phenomena in order to facilitate conclusions. In social and political sciences complexity is an essential element of the context, which needs to be continuously considered. The Precautionary Principle is a tool for facing complexity and uncertainty in health risk management. This paper is aimed at demonstrating that this is not only a problem of technical risk assessment. Great attention should also be paid to improve risk communication. Communication between the stakeholders (experts, decision makers, political and social leaders, media, groups of interest and people involved) is possibly the best condition to be successful in health risk management. Nevertheless, this process usually runs up against severe obstacles. These are not only caused by existing conflicts of interest. Differences in values, languages, perceptions, resources to have access to information, and to express one's own point of view are other key aspects. PMID:15212225

Biocca, Marco

2004-01-01

323

Strong majorization entropic uncertainty relations  

NASA Astrophysics Data System (ADS)

We analyze entropic uncertainty relations in a finite-dimensional Hilbert space and derive several strong bounds for the sum of two entropies obtained in projective measurements with respect to any two orthogonal bases. We improve the recent bounds by Coles and Piani [P. Coles and M. Piani, Phys. Rev. A 89, 022112 (2014), 10.1103/PhysRevA.89.022112], which are known to be stronger than the well-known result of Maassen and Uffink [H. Maassen and J. B. M. Uffink, Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103]. Furthermore, we find a bound based on majorization techniques, which also happens to be stronger than the recent results involving the largest singular values of submatrices of the unitary matrix connecting both bases. The first set of bounds gives better results for unitary matrices close to the Fourier matrix, while the second one provides a significant improvement in the opposite sectors. Some results derived admit generalization to arbitrary mixed states, so that corresponding bounds are increased by the von Neumann entropy of the measured state. The majorization approach is finally extended to the case of several measurements.
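
For context, a quick computation of the Maassen-Uffink bound that these results strengthen, H(p) + H(q) >= -2 log2 c, where c is the largest overlap between the two orthonormal bases; the Fourier-matrix case below is an illustrative choice:

```python
import numpy as np

d = 4
# Unitary connecting the two measurement bases: here the Fourier matrix.
idx = np.arange(d)
U = np.exp(2j * np.pi * np.outer(idx, idx) / d) / np.sqrt(d)

# Maassen-Uffink: H(p) + H(q) >= -2*log2(c), with c = max_ij |<a_i|b_j>|.
c = np.abs(U).max()
bound = -2 * np.log2(c)
print(f"c = {c:.3f}, Maassen-Uffink bound = {bound:.3f} bits")  # log2(d) = 2
```

For the Fourier matrix the bound reaches its maximum, log2(d); the majorization bounds discussed above are designed to improve on this in other sectors.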

Rudnicki, Łukasz; Puchała, Zbigniew; Życzkowski, Karol

2014-05-01

324

Uncertainty reasoning in expert systems  

NASA Technical Reports Server (NTRS)

Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
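
A minimal illustration of the three choices listed, using common (not necessarily the paper's optimal) selections: a triangular membership function, the minimum t-norm for 'and', and centroid defuzzification; all membership shapes and inputs are invented for illustration:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

velocity, distance = 80.0, 15.0             # crisp inputs (illustrative units)
mu_big_v = tri(velocity, 50, 100, 150)      # degree of "velocity is big"
mu_small_d = max(0.0, 1.0 - distance / 40)  # degree of "distance is small"

# Choice of 'and': the minimum t-norm (product is another common choice).
rule_strength = min(mu_big_v, mu_small_d)

# Defuzzification: clip the output set "brake hard" and take its centroid.
u = np.linspace(0.0, 1.0, 201)              # braking-level domain
mu_out = np.minimum(tri(u, 0.5, 1.0, 1.5), rule_strength)
brake = float((mu_out * u).sum() / mu_out.sum())
print(f"rule strength = {rule_strength:.2f}, braking command = {brake:.2f}")
```

The paper's point is precisely that these choices are not arbitrary: different t-norms and defuzzifiers yield different controls, and group-theoretic arguments can single out optimal ones.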

Kreinovich, Vladik

1993-01-01

325

Parametric uncertainty or hydrological changes?  

NASA Astrophysics Data System (ADS)

Model calibration is also the hydrologist's way of searching for a physical interpretation of the complex interactions acting within a basin. In practice, it is frequently observed that calibration performed on a given time window may converge to a point in the parameter space that is distant from the calibration obtained for the same basin over a different time window. Is that again parametric uncertainty, or does the trajectory in the parameter space indicate a slow hydrological change in the basin? This paper depicts a possible path for detecting signatures of change in a streamflow time series. In particular, the paper seeks to draw a way to discern the random variability of the calibrated model parameter set over different time windows from the variability induced by changes in boundary conditions and external forcings over time. To this purpose, we refer to a conceptual lumped model for simulating daily streamflow, the EHSM (EcoHydrological Streamflow Model), and to a hypothetical case study. The selected hydrological model requires a total of seven parameters, some of which can be easily related to land use, while others rely on climate variables. The calibration of the EHSM parameters over different time windows, and the analysis of the potential impacts of anthropogenic variation in land use and/or climatic variability on the calibrated parameter set, will support our investigation.

Viola, F.; Noto, L. V.; Pumo, D.

2014-09-01

326

A new robust optimization approach for scheduling under uncertainty: II. Uncertainty with known probability distribution  

Microsoft Academic Search

In this work, we consider the problem of scheduling under uncertainty where the uncertain problem parameters can be described by a known probability distribution function. A novel robust optimization methodology, originally proposed by Lin, Janak, and Floudas [Lin, X., Janak, S. L., & Floudas, C. A. (2004). A new robust optimization approach for scheduling under uncertainty: I. Bounded uncertainty. Computers

Stacy L. Janak; Xiaoxia Lin; Christodoulos A. Floudas

2007-01-01

327

Impact of uncertainty on modeling and testing  

NASA Technical Reports Server (NTRS)

A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
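
As an illustration of the venturi data-reduction uncertainty propagation mentioned in the appendices, a first-order sketch for the ideal venturi equation; the discharge coefficient, area, and uncertainty values are hypothetical, not from the TTB program:

```python
import math

# Ideal venturi mass flow: m_dot = Cd * A * sqrt(2 * rho * dp)
Cd, u_Cd = 0.98, 0.005      # discharge coefficient, relative uncertainty
A = 1.2e-3                  # throat area, m^2 (treated as exact here)
rho, u_rho = 70.0, 0.01     # fluid density, kg/m^3
dp, u_dp = 5.0e5, 0.008     # differential pressure, Pa

m_dot = Cd * A * math.sqrt(2 * rho * dp)

# Relative sensitivity coefficients: 1 for Cd, 1/2 for rho and dp.
u_m = math.sqrt(u_Cd**2 + (0.5 * u_rho)**2 + (0.5 * u_dp)**2)
print(f"m_dot = {m_dot:.2f} kg/s +/- {u_m*100:.2f}%")
```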

Coleman, Hugh W.; Brown, Kendall K.

1995-01-01

328

Numerical Uncertainty Quantification for Radiation Analysis Tools  

NASA Technical Reports Server (NTRS)

Recently, a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time against uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.

Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

2007-01-01

329

Critical analysis of uncertainties during particle filtration.  

PubMed

Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of microspheres suspension have the greatest influence on the overall CSU in filter permeability which excellently agrees with results obtained from Monte Carlo simulations. Two model parameters "maximum critical retention concentration" and "minimum injection velocity" and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces experimental "critical retention concentration vs velocity"-data better than the second model which contains the total electrostatic force whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data. PMID:23020418
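
A sketch of the comparison described here: the law of propagation of uncertainties for the Darcy permeability k = QμL/(AΔp), checked against a Monte Carlo estimate; all mean values and standard uncertainties are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Mean values and standard uncertainties (illustrative).
Q,  u_Q  = 1.0e-8, 2.0e-10   # volumetric flowrate, m^3/s
mu, u_mu = 1.0e-3, 3.0e-5    # dynamic viscosity, Pa*s
L,  u_L  = 0.05,  1.0e-4     # core length, m
A,  u_A  = 1.0e-3, 2.0e-6    # cross-sectional area, m^2
dp, u_dp = 1.0e5,  5.0e2     # pressure drop, Pa

k = Q * mu * L / (A * dp)

# Law of propagation for a product/quotient: sum of relative variances.
rel = np.sqrt((u_Q/Q)**2 + (u_mu/mu)**2 + (u_L/L)**2 + (u_A/A)**2 + (u_dp/dp)**2)

# Monte Carlo check with independent normal inputs.
ks = (rng.normal(Q, u_Q, N) * rng.normal(mu, u_mu, N) * rng.normal(L, u_L, N)
      / (rng.normal(A, u_A, N) * rng.normal(dp, u_dp, N)))
print(f"k = {k:.3e} m^2; analytic CSU = {rel*100:.2f}%, "
      f"MC = {ks.std()/ks.mean()*100:.2f}%")
```

With the viscosity and flowrate terms chosen largest, they dominate the combined uncertainty, mirroring the paper's finding, and the two estimates agree closely.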

Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

2012-09-01

330

Micro-Pulse Lidar Signals: Uncertainty Analysis  

NASA Technical Reports Server (NTRS)

Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse, laser-detector cross-talk, and overlap (poor near-range focusing, at ranges less than 6 km). Accurate correction of both the afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range, the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is uncertainty in the pulse energy. However, if the laser energy is low, then during mid-day, high solar background levels can significantly reduce the signal-to-noise ratio of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.

Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

2002-01-01

331

Polyhedral Approximation of Ellipsoidal Uncertainty Sets via ...  

E-print Network

Feb 21, 2014 ... power of modern integer programming solvers such as CPLEX and Gurobi ... as row-wise uncertainties by projecting the uncertainty set to the space ... estimated or measured values in contrast to coefficients determining the ... For the approximate method, the first value states the number of simplex ...

Andreas Bärmann, Christoph Thurner, Andreas Heidt, Sebastian Pokutta, Alexander Martin

2014-02-21

332

Uncertainty and Engagement with Learning Games  

ERIC Educational Resources Information Center

Uncertainty may be an important component of the motivation provided by learning games, especially when associated with gaming rather than learning. Three studies are reported that explore the influence of gaming uncertainty on engagement with computer-based learning games. In the first study, children (10-11 years) played a simple maths quiz.

Howard-Jones, Paul A.; Demetriou, Skevi

2009-01-01

333

Design optimization of steel structures considering uncertainties  

Microsoft Academic Search

In real world engineering applications the uncertainties of the structural parameters are inherent and the scatter from their nominal ideal values is in most cases unavoidable. These uncertainties play a dominant role in structural performance and the only way to assess this influence is to perform Reliability-Based Design Optimization (RBDO) and Robust Design Optimization (RDO). Compared to the basic deterministic-based

M. Papadrakakis; N. D. Lagaros; V. Plevris

2005-01-01

334

Methods of Dealing with Uncertainty: Panel Presentation.  

ERIC Educational Resources Information Center

Rising energy costs, changing tax bases, increasing numbers of non-traditional students, and ever changing educational technology point to the fact that community college administrators will have to accept uncertainty as a normal planning component. Rather than ignoring uncertainty, a tendency that was evidenced in the reluctance of administrators

Pearce, Frank C.

335

Forces, Uncertainty, and the Gibbs Entropy  

Microsoft Academic Search

A physicist's educated intuition indicates that the presence of some readily analyzed forces in a gas reduces one's uncertainty about the actual state of affairs at the molecular level below the corresponding uncertainty in the absence of such forces. There ought to be a connection with at least one of the entropylike expressions appearing in statistical mechanics. A simple proof

Ralph Baierlein

1968-01-01

336

Uncertainty and Decisions in Medical Informatics 1  

Microsoft Academic Search

This paper presents a tutorial introduction to the handling of uncertainty and decision-making in medical reasoning systems. It focuses on the central role of uncertainty in all of medicine and identifies the major themes that arise in re- search papers. It then reviews simple Bayesian formulations of the problem and pursues their generalization to the Bayesian network methods that are

Peter Szolovits

1995-01-01

337

Uncertainty and Decisions in Medical Informatics1  

E-print Network

Uncertainty and Decisions in Medical Informatics. Peter Szolovits, Ph.D., Laboratory for Computer Science. ... critical decision. This paper surveys historical and contemporary approaches taken by medical informatics. Uncertainty and Decisions in Medical Informatics. Methods of Information in Medicine, 34:111-21, 1995.

Szolovits, Peter

339

The Economic Implications of Carbon Cycle Uncertainty  

SciTech Connect

This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon-dioxide concentrations. We find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, equivalent to a change in the concentration target of up to 100 ppmv.

Smith, Steven J.; Edmonds, James A.

2006-10-17

340

AUTOMATIC PARTICLE IMAGE VELOCIMETRY UNCERTAINTY QUANTIFICATION  

E-print Network

Automatic Particle Image Velocimetry Uncertainty Quantification, by Benjamin H. Timmins. Master of Science thesis, Utah State University, 2011. Particle Image Velocimetry (PIV) measurement error depends on the PIV algorithm used, a wide range of user inputs, flow ...

Smith, Barton L.

341

Microform calibration uncertainties of Rockwell diamond indenters  

SciTech Connect

The Rockwell hardness test is a mechanical testing method for evaluating a property of metal products. National and international comparisons in Rockwell hardness tests show significant differences. Uncertainties in the geometry of the Rockwell diamond indenters are largely responsible for these differences. By using a stylus instrument, with a series of calibration and check standards, and calibration and uncertainty calculation procedures, the authors have calibrated the microform geometric parameters of Rockwell diamond indenters. These calibrations are traceable to fundamental standards. The expanded uncertainties are ±0.3 µm for the least-squares radius; ±0.01° for the cone angle; and ±0.025 for the holder axis alignment calibrations. Under ISO and NIST guidelines for expressing measurement uncertainties, the calibration and uncertainty calculation procedure, error sources, and uncertainty components are described, and the expanded uncertainties are calculated. The instrumentation and calibration procedure also allows the measurement of profile deviation from the least-squares radius and cone flank straightness. The surface roughness and the shape of the spherical tip of the diamond indenter can also be explored and quantified. The calibration approach makes it possible to quantify the uncertainty, uniformity, and reproducibility of Rockwell diamond indenter microform geometry, as well as to unify the Rockwell hardness standards, through fundamental measurements rather than performance comparisons.
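
A sketch of the kind of least-squares radius computation underlying the tip calibration, using the simple algebraic (Kåsa) circle fit on hypothetical stylus profile points; the nominal radius and noise level are illustrative:

```python
import numpy as np

# Hypothetical stylus profile points near a nominal 200 um tip radius.
rng = np.random.default_rng(1)
theta = np.linspace(-0.4, 0.4, 50)
r_true = 200.0                                   # micrometres
x = r_true * np.sin(theta) + rng.normal(0, 0.05, theta.size)
y = r_true * (1 - np.cos(theta)) + rng.normal(0, 0.05, theta.size)

# Kasa fit: x^2 + y^2 = 2*a*x + 2*b*y + c, with c = R^2 - a^2 - b^2.
M = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
a, b, c = np.linalg.lstsq(M, x**2 + y**2, rcond=None)[0]
R = np.sqrt(c + a**2 + b**2)
print(f"least-squares radius = {R:.2f} um")      # close to 200.00
```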

Song, J.F.; Rudder, F.F. Jr.; Vorburger, T.V.; Smith, J.H. [National Inst. of Standards and Technology, Gaithersburg, MD (United States)

1995-09-01

342

Microform calibration uncertainties of Rockwell diamond indenters  

Microsoft Academic Search

The Rockwell hardness test is a mechanical testing method for evaluating a property of metal products. National and international comparisons in Rockwell hardness tests show significant differences. Uncertainties in the geometry of the Rockwell diamond indenters are largely responsible for these differences. By using a stylus instrument, with a series of calibration and check standards, and calibration and uncertainty calculation

J. F. Song; F. F. Jr. Rudder; T. V. Vorburger; J. H. Smith

1995-01-01

343

Reliable water supply system design under uncertainty  

Microsoft Academic Search

Given the natural variability and uncertainties in long-term predictions, reliability is a critical design factor for water supply systems. However, the large scale of the problem and the correlated nature of the involved uncertainties result in models that are often intractable. In this paper, we consider a municipal water supply system over a 15-year planning period with initial infrastructure and

G. Chung; K. Lansey; Güzin Bayraksan

2009-01-01

344

Visualizing Data with Bounded Uncertainty Chris Olston  

E-print Network

In most data-intensive applications, uncertainty is a fact of life. For example, in scientific ... they may draw inaccurate conclusions, potentially leading to costly mistakes. A report by the US Department ... techniques for conveying uncertainty in scientific visualization applications. Many of these techniques can

Olston, Christopher

346

UFLOW: visualizing uncertainty in fluid flow  

Microsoft Academic Search

Uncertainty or errors are introduced in fluid flow data as the data is acquired, transformed and rendered. Although researchers are aware of these uncertainties, little has been done to incorporate them in the existing visualization systems for fluid flow. In the absence of integrated presentation of data and its associated uncertainty, the analysis of the visualization is incomplete at

Suresh K. Lodha; Alex Pang; Robert E. Sheehan; Craig M. Wittenbrink

1996-01-01

347

DO MODEL UNCERTAINTY WITH CORRELATED INPUTS  

EPA Science Inventory

The effect of correlation among the input parameters and variables on the output uncertainty of the Streeter-Phelps water quality model is examined. Three uncertainty analysis techniques are used: sensitivity analysis, first-order error analysis, and Monte Carlo simulation. Modifie...

348

Human capital and growth under political uncertainty  

Microsoft Academic Search

In this paper we show how political uncertainty may impede economic growth by reducing public investment in the formation of human capital, and how this negative effect of political uncertainty can be offset by a government contract. We present a model of growth with accumulation of human capital and government investment in education. We show that in a country with

Nigar Hashimzade; George Davis

2006-01-01

349

UNCERTAINTY AND INSURANCE IN ENDOGENOUS CLIMATE CHANGE  

E-print Network

Uncertainty and Insurance in Endogenous Climate Change. Georg Müller-Fürstenberger, Ingmar Schumacher. January 2009, Cahier n° 2009 ... opportunities; (ii) we can fully characterize and quantify the impact of uncertainty on the agent's decisions

Boyer, Edmond

350

Critical analysis of uncertainties during particle filtration  

NASA Astrophysics Data System (ADS)

Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of microspheres suspension have the greatest influence on the overall CSU in filter permeability which excellently agrees with results obtained from Monte Carlo simulations. Two model parameters "maximum critical retention concentration" and "minimum injection velocity" and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces experimental "critical retention concentration vs velocity"-data better than the second model which contains the total electrostatic force whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data.

Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

2012-09-01

351

Parametric uncertainty modeling for LFT model realization  

Microsoft Academic Search

This paper presents a procedure for parametric uncertainty modeling of the flight dynamics of a small unmanned aerial vehicle for the purpose of LFT model realization. Experimental data measurements were taken to construct the parametric uncertainties in the 6-DOF nonlinear simulation model for the vehicle. A new approach is used to linearize the uncertain nonlinear simulation model developed and a physically

Yew Chai Paw; Gary J. Balas

2008-01-01

352

Uncertainty Propagation in an Ecosystem Nutrient Budget.  

EPA Science Inventory

New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of fr...

353

COMMON KNOWLEDGE, COHERENT UNCERTAINTIES AND CONSENSUS  

E-print Network

Common Knowledge, Coherent Uncertainties and Consensus, by Yakov Ben-Haim. Technical Report ETR-2001, Mechanical Engineering. ... knowledge-functions, common knowledge and consensus. Our main results are that knowledge is constricted

Rimon, Elon

354

Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification  

NASA Astrophysics Data System (ADS)

In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a 'valid' measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an 'outlier' measurement. Finally, the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data and experimental measurements. In this work, U_68.5 uncertainties are estimated at the 68.5% confidence level while U_95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
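
A minimal sketch of one common correlation-plane SNR metric, the primary peak ratio (PPR), computed from an FFT cross-correlation after a minimum-subtraction step like the one mentioned above; the synthetic windows and mask size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two synthetic 32x32 interrogation windows: same pattern, 3 px x-shift.
base = rng.random((40, 40))
a = base[4:36, 4:36]
b = base[4:36, 1:33]          # shifted copy -> known displacement

# FFT-based circular cross-correlation plane.
corr = np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))
corr = np.fft.fftshift(corr)

corr -= corr.min()            # subtract minimum: remove background offset
peak1 = corr.max()

# Mask a neighbourhood around the primary peak, then find the second peak.
i, j = np.unravel_index(corr.argmax(), corr.shape)
masked = corr.copy()
masked[max(i - 3, 0):i + 4, max(j - 3, 0):j + 4] = 0.0
peak2 = masked.max()

print(f"PPR = {peak1 / peak2:.2f}")  # higher PPR -> lower expected uncertainty
```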

Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

2014-11-01

355

Uncertainties in derived temperature-height profiles  

NASA Technical Reports Server (NTRS)

Nomographs were developed for relating uncertainty in temperature T to uncertainty in the observed height profiles of both pressure p and density rho. The relative uncertainty delta T/T is seen to depend not only upon the relative uncertainties delta p/p or delta rho/rho, and to a small extent upon the value of T or H, but primarily upon the sampling-height increment Delta h, the height increment between successive observations of p or rho. For a fixed value of delta p/p, the value of delta T/T varies inversely with Delta h. No limit exists in the fineness of usable height resolution of T which may be derived from densities, while a fine height resolution in pressure-height data leads to temperatures with unacceptably large uncertainties.
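
The inverse dependence on Delta h can be made explicit with the standard hypsometric relation (a sketch of the usual reasoning, not necessarily the report's exact derivation). For two pressure observations p_1, p_2 separated by Delta h,

$$\bar{T} = \frac{g\,\Delta h}{R\,\ln(p_1/p_2)}, \qquad \frac{\delta T}{T} \approx \frac{\sqrt{2}}{\ln(p_1/p_2)}\,\frac{\delta p}{p} \approx \sqrt{2}\,\frac{H}{\Delta h}\,\frac{\delta p}{p},$$

where $H = RT/g$ is the pressure scale height. Halving the sampling-height increment thus doubles the relative temperature uncertainty for fixed $\delta p/p$, while density-derived temperatures carry no such resolution penalty.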

Minzner, R. A.

1974-01-01

356

Habitable Zone Dependence on Stellar Parameter Uncertainties  

E-print Network

An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the dependence of HZ boundary uncertainties on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.

Kane, Stephen R

2014-01-01

357

Habitable Zone Dependence on Stellar Parameter Uncertainties  

NASA Astrophysics Data System (ADS)

An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the dependence of HZ boundary uncertainties on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
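
A sketch of how a stellar parameter uncertainty translates into an HZ boundary uncertainty, using the common scaling d = sqrt(L/S_eff) for the orbital distance of an HZ edge; the luminosity, its uncertainty, and the S_eff value are illustrative, not taken from this paper:

```python
import math

# Illustrative stellar luminosity (solar units) with 1-sigma uncertainty.
L, u_L = 0.60, 0.10

# Effective stellar flux at an HZ edge (illustrative, near-solar value).
S_eff = 1.0
d = math.sqrt(L / S_eff)          # HZ edge distance, AU

# First-order propagation: d ~ L^(1/2)  =>  u_d/d = (1/2) * u_L/L.
u_d = 0.5 * (u_L / L) * d
print(f"HZ edge: {d:.3f} +/- {u_d:.3f} AU")   # about 0.775 +/- 0.065 AU
```

Even this toy case shows why a ~17% luminosity uncertainty leaves the HZ status of planets near an edge genuinely ambiguous.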

Kane, Stephen R.

2014-02-01

358

Minimizing Uncertainty in Coastal Digital Elevation Models  

NASA Astrophysics Data System (ADS)

Digital elevation models (DEMs) have inherent uncertainties in their values that impact the accuracies of coastal inundation studies that utilize them. Sources of DEM uncertainty include: uncertainty of source data, gridding interpolation to fill data gaps, and morphologic change after data collection. These uncertainties are propagated into modeling results such that the modeling of coastal inundation cannot be more accurate than the source DEMs they rely upon. We describe some of the major challenges in building coastal DEMs--those that integrate bathymetry and topography at the coast--and how to recognize errors and minimize model uncertainties. We also discuss procedures for building DEMs, and the efforts of NOAA and USGS to develop high-resolution DEMs of coastal areas impacted by Hurricane Sandy in October 2012.

Eakins, B.; Danielson, J.; McLean, S. J.

2013-12-01

359

Dealing with uncertainties - communication between disciplines  

NASA Astrophysics Data System (ADS)

Climate adaptation research inevitably involves uncertainty issues, whether people are building a model, using climate scenarios, or evaluating policy processes. However, do they know which uncertainties are relevant in their field of work? And which uncertainties exist in the data from other disciplines that they use (e.g. climate data, land use, hydrological data), and how do these propagate? From experiences in Dutch research programmes on climate change in the Netherlands we know that disciplines often deal differently with uncertainties. This complicates communication between disciplines, and also with the various users of data and information on climate change and its impacts. In October 2012 an autumn school was organized within the Knowledge for Climate Research Programme in the Netherlands, with dealing with and communicating about uncertainties (in climate and socio-economic scenarios, in impact models, and in the decision-making process) as its central theme. The lectures and discussions contributed to the development of a common frame of reference (CFR) for dealing with uncertainties. The common frame contains the following: 1. Common definitions (typology of uncertainties, robustness); 2. Common understanding (why do we consider it important to take uncertainties into account) and aspects on which we disagree (how far should scientists go in communication?); 3. Documents that are considered important by all participants; 4. Do's and don'ts in dealing with uncertainties and communicating about uncertainties (e.g. know your audience, check how your figures are interpreted); 5. Recommendations for further actions (e.g. the need for a platform to exchange experiences). The CFR is meant to help researchers in climate adaptation to work together and communicate together on climate change (better interaction between disciplines). It is also meant to help researchers explain to others (e.g. decision makers) why and when researchers agree, and when and why they disagree, and on what exactly. During the presentation some results of this autumn school will be presented.

Overbeek, Bernadet; Bessembinder, Janette

2013-04-01

360

Entropic uncertainty relations and quasi-Hermitian operators  

NASA Astrophysics Data System (ADS)

We discuss a possible treatment of quasi-Hermitian operators from the viewpoint of the uncertainty principle. Here, probabilities are actually determined by the pair containing the square root of a given metric operator and the adopted resolution of the identity. For two such pairs, we derive an inequality between norm-like functionals of the generated probability distributions. Based on Riesz's theorem, this inequality assumes that a certain condition holds for the norms of the square roots of the metric operators and the measured density matrix. The derived inequality between norm-like functionals naturally leads to entropic uncertainty relations in terms of the unified entropies. Entropic bounds of both the state-dependent and state-independent forms are presented. The latter form involves some implicit dependence, since the measured density matrix enters the above condition. The presented entropic bounds are an extension of the previous bounds to the quasi-Hermitian case. The results are discussed within an example of 2 x 2 quasi-Hermitian matrices. This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to quantum physics with non-Hermitian operators.

Rastegin, A. E.

2012-11-01

361

Implementing measurement uncertainty for analytical chemistry: the Eurachem Guide for measurement uncertainty  

NASA Astrophysics Data System (ADS)

The developments that led to the third edition of the Eurachem Guide Quantifying Uncertainty in Analytical Measurement are reviewed. Particular attention is given to the rationale for early use of spreadsheet methods, the incorporation of method performance data in the second edition, and the third edition's provisions on uncertainties near zero and Monte Carlo methods. The development of uncertainty concepts in chemistry is reviewed briefly and some of the challenges found in early implementation of measurement uncertainty in chemistry are recalled. Problems arising from uncertainty evaluation for reference measurements with limited data are discussed.

Ellison, Stephen L. R.

2014-08-01

362

Principles of Virus Structural Organization  

PubMed Central

Viruses, the molecular nanomachines infecting hosts ranging from prokaryotes to eukaryotes, come in different sizes, shapes and symmetries. Questions such as what principles govern their structural organization, what factors guide their assembly, and how these viruses integrate multifarious functions into one unique structure have enamored researchers for years. In the last five decades, following Caspar and Klug's elegant conceptualization of how viruses are constructed, high-resolution structural studies using X-ray crystallography and, more recently, cryo-EM techniques have provided a wealth of information on the structures of a variety of viruses. These studies have significantly furthered our understanding of the principles that underlie structural organization in viruses. Such an understanding has practical impact in providing a rational basis for the design and development of antiviral strategies. In this chapter, we review principles underlying capsid formation in a variety of viruses, emphasizing the recent developments along with some historical perspective. PMID:22297509

Prasad, B.V. Venkataram; Schmid, Michael F

2013-01-01

363

The 4th Thermodynamic Principle?  

SciTech Connect

It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that from the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of the Matter, it is possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

Montero Garcia, Jose de la Luz [Institute for Scientific and Technological Information (IDICT), National Capitol, Havana (Cuba); Novoa Blanco, Jesus Francisco

2007-04-28

364

Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis  

USGS Publications Warehouse

This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and a pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity, and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
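
A minimal sketch of the kind of Bayesian updating of a semivariogram structural parameter described above, assuming a beta prior rescaled to a plausible interval and a Gaussian likelihood; the model form, bounds, and data are invented for illustration, not taken from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical Bayesian update of the semivariogram range parameter:
# beta prior rescaled to an assumed interval, Gaussian likelihood.
a, b = 500.0, 5000.0                       # assumed bounds on the range (m)
grid = np.linspace(a, b, 400)              # discretized parameter values
prior = stats.beta(2.0, 2.0).pdf((grid - a) / (b - a)) / (b - a)

def spherical_gamma(h, rng, sill=1.0, nugget=0.1):
    """Spherical semivariogram model."""
    return np.where(
        h < rng,
        nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
        sill,
    )

h_obs = np.array([250.0, 750.0, 1500.0, 3000.0])   # lag distances (m)
gamma_obs = np.array([0.35, 0.62, 0.85, 1.02])     # empirical semivariogram
sigma = 0.1                                        # assumed observation std

# Gaussian likelihood on the parameter grid, then discrete normalization.
like = np.array([
    np.prod(stats.norm(spherical_gamma(h_obs, r), sigma).pdf(gamma_obs))
    for r in grid
])
post = prior * like
post /= post.sum()
print("posterior mean range (m):", (grid * post).sum())
```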

Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

2008-01-01

365

Principles of Science Principles of Biology Reference Edition  

E-print Network

a mature understanding of scientific concepts. Our unique interactive design turns students into active learners. From the publishers of Nature and Scientific American, the Principles of Biology Reference Edition supports science-related courses and high academic achievement for all learning styles at all levels of education

Cai, Long

366

Green chemistry: principles and practice.  

PubMed

Green Chemistry is a relatively new emerging field that strives to work at the molecular level to achieve sustainability. The field has received widespread interest in the past decade due to its ability to harness chemical innovation to meet environmental and economic goals simultaneously. Green Chemistry has a framework of a cohesive set of Twelve Principles, which have been systematically surveyed in this critical review. This article covers the concepts of design and the scientific philosophy of Green Chemistry with a set of illustrative examples. Future trends in Green Chemistry are discussed with the challenge of using the Principles as a cohesive design system (93 references). PMID:20023854

Anastas, Paul; Eghbali, Nicolas

2010-01-01

367

Doing without the Equivalence Principle  

E-print Network

In Einstein's general relativity, geometry replaces the concept of force in the description of the gravitational interaction. Such an approach rests on the universality of free fall--the weak equivalence principle--and would break down without it. On the other hand, the teleparallel version of general relativity, a gauge theory for the translation group, describes the gravitational interaction by a force similar to the Lorentz force of electromagnetism, a non-universal interaction. It is shown that, similarly to Maxwell's description of electromagnetism, the teleparallel gauge approach provides a consistent theory for gravitation even in the absence of the weak equivalence principle.

R. Aldrovandi; J. G. Pereira; K. H. Vu

2004-10-08

368

Measurement Uncertainty in Visual Sample Plan (VSP)  

SciTech Connect

Uncertainty is naturally inherent in any environmental measurement. Contributions to uncertainty can come from a variety of sources. When measurements are derived from analysis of field samples, uncertainties in the results can be due to large-scale spatial site variations, small-scale local inhomogeneity, sampling methods, sample handling, sample preparation, sub-sampling, and analytical variations. Each of these components of variation can be broken into additional sub-components that all combine to affect the total uncertainty. Visual Sample Plan (VSP) is a tool for determining the required number and placement of samples to ensure that the resulting data can support a sufficiently confident decision. It is important that users of VSP understand how the uncertainty estimates used in VSP represent the various components of variation described above. This paper will show how VSP can be used to explore the relative contributions of sampling and analytical uncertainties to the total uncertainty. Using VSP, one can evaluate whether it is better to reduce sampling variations by obtaining more samples or improving the sampling technique versus conducting replicate analyses or using a more precise analytical technique. The Measurement Quality Objectives (MQO) option will be demonstrated and discussed.
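
A toy numerical sketch of the tradeoff discussed above, assuming the total variance of a site mean decomposes additively into sampling and analytical components; the component values are hypothetical, not VSP defaults.

```python
import numpy as np

# Hypothetical variance components for an environmental measurement (units^2).
var_sampling = 4.0       # large-scale spatial variation + sampling method
var_analytical = 1.0     # sample prep + instrumental analysis

def se_mean(n_samples, n_replicates):
    """Standard error of the site mean when each of n_samples field samples
    receives n_replicates replicate analyses (nested design)."""
    total_var = var_sampling / n_samples \
        + var_analytical / (n_samples * n_replicates)
    return np.sqrt(total_var)

# More field samples versus more replicate analyses:
print("10 samples, 1 analysis each:", round(se_mean(10, 1), 3))
print("10 samples, 3 analyses each:", round(se_mean(10, 3), 3))
print("20 samples, 1 analysis each:", round(se_mean(20, 1), 3))
# When sampling variance dominates, extra field samples help far more than
# replicate analyses -- the kind of comparison VSP lets users explore.
```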

Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.

2003-07-18

369

Theoretical uncertainty of orifice flow measurement  

SciTech Connect

Orifice meters are the most common meters used for fluid flow measurement, especially for measuring hydrocarbons. Meters are rugged, mechanically simple, and well suited for field use under extreme weather conditions. Because of their long history of use and dominance in fluid flow measurement, their designs, installation requirements, and equations for flow rate calculation have been standardized by different organizations in the United States and internationally. These standards provide the guidelines for users to achieve accurate flow measurement and minimize measurement uncertainty. This paper discusses different factors that contribute to measurement inaccuracy and provides guidance for minimizing or eliminating these errors. Many factors which influence the overall measurement uncertainty are associated with the orifice meter application. Major contributors to measurement uncertainty include the predictability of flow profile, fluid properties at flowing condition, precision of the empirical equation for the discharge coefficient, manufacturing tolerances in meter components, and the uncertainty associated with secondary devices monitoring the static line pressure, differential pressure across the orifice plate, flowing temperature, etc. Major factors contributing to the measurement uncertainty for a thin, concentric, square-edged orifice flowmeter are as follows: (a) Tolerances in prediction of coefficient of discharge, (b) Predictability in defining the physical properties of the flowing fluid, (c) Fluid flow condition, (d) Construction tolerances in meter components, (e) Uncertainty of secondary devices/instrumentation, and (f) Data reduction and computation. Different factors under each of the above areas are discussed with precautionary measures and installation procedures to minimize or eliminate measurement uncertainty.
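
A minimal sketch of how such component uncertainties are commonly combined for an orifice meter, following the root-sum-square form used in orifice-metering standards such as ISO 5167; the sensitivity coefficients are quoted from memory and the component values are hypothetical, so consult the standard for authoritative figures.

```python
import numpy as np

# Hypothetical relative standard uncertainties (as fractions) for a
# concentric, square-edged orifice meter.
beta = 0.6            # diameter ratio d/D
u_C = 0.005           # discharge coefficient
u_eps = 0.001         # expansibility factor
u_D = 0.002           # pipe diameter
u_d = 0.001           # orifice bore
u_dp = 0.01           # differential pressure
u_rho = 0.005         # fluid density

b4 = beta ** 4
# Root-sum-square combination with the mass-flow sensitivity coefficients.
u_qm = np.sqrt(
    u_C ** 2
    + u_eps ** 2
    + (2 * b4 / (1 - b4)) ** 2 * u_D ** 2
    + (2 / (1 - b4)) ** 2 * u_d ** 2
    + 0.25 * u_dp ** 2
    + 0.25 * u_rho ** 2
)
print(f"combined relative uncertainty in mass flow: {100 * u_qm:.2f} %")
```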

Husain, Z.D. [Daniel Flow Products, Inc., Houston, TX (United States)

1995-12-01

370

Aeroservoelastic Uncertainty Model Identification from Flight Data  

NASA Technical Reports Server (NTRS)

Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

Brenner, Martin J.

2001-01-01

371

Propagated Uncertainty in Scattering in Humidified Nephelometers  

NASA Astrophysics Data System (ADS)

Atmospheric aerosols exert a cooling effect at the surface by directly scattering and absorbing incident sunlight and indirectly by serving as seeds for cloud droplets. They are highly variable liquid or solid particles suspended in the gas phase whose climate impact is associated with their chemical composition and microphysical properties. One such aerosol property is hygroscopic growth, the increase in aerosol size and scattering with the uptake of water at increasing relative humidity (RH). Particle size is strongly linked to the wavelength of light scattered and absorbed. Defined as the parameter which characterizes the dispersion of the values about the measured quantity [1], uncertainty can effectively place a measured value into perspective. Small uncertainties in instrument sensors can propagate to large errors in the measured hygroscopic growth of aerosols. The uncertainties in the aerosol scattering coefficients and the hygroscopic growth fit parameter were calculated. A considerable contribution to the propagated uncertainties stems from imprecise RH sensors. The RH-dependent uncertainty of aerosol hygroscopic growth has never been reported in the literature; however, an increased uncertainty was calculated in aerosols with lower hygroscopic growth, particularly those in clean and wet conditions. [1] Cook, R. R., Assessment of Uncertainties of Measurement for Calibration & Testing Laboratories. National Association of Testing Authorities, Australia, 2002.
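
A small sketch of how an RH sensor uncertainty propagates into the scattering enhancement factor, assuming the common single-parameter gamma parameterization f(RH) = ((100 - RH_ref)/(100 - RH))^gamma; this parameterization and all numbers are illustrative, not necessarily those used in the study.

```python
import numpy as np

# Gamma parameterization of the scattering hygroscopic growth factor.
def f_rh(rh, gamma, rh_ref=40.0):
    return ((100.0 - rh_ref) / (100.0 - rh)) ** gamma

# First-order propagation: df/dRH = gamma * f / (100 - RH), so the
# propagated uncertainty grows rapidly as RH approaches 100 %.
def u_f(rh, gamma, u_rh):
    return gamma * f_rh(rh, gamma) / (100.0 - rh) * u_rh

u_rh = 2.0   # assumed RH sensor standard uncertainty (%RH)
for rh in (70.0, 85.0, 95.0):
    print(rh, round(f_rh(rh, 0.5), 3), round(u_f(rh, 0.5, u_rh), 3))
```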

Morrow, H. A.; Jefferson, A.; Sherman, J. P.; Andrews, E.; Sheridan, P. J.; Hageman, D.; Ogren, J. A.

2013-12-01

372

Uncertainty quantification approaches for advanced reactor analyses.  

SciTech Connect

The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
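
The 95/95 criterion mentioned above is often met in best-estimate analyses with Wilks' nonparametric tolerance-limit formula; this sketch shows only that sample-size calculation, which is one common approach and not necessarily the report's recommendation.

```python
import math

# Wilks' formula (first order): with n runs of a best-estimate code, the
# largest observed output bounds the 95th percentile with 95 % confidence
# once 1 - coverage**n >= confidence.
def wilks_n(coverage=0.95, confidence=0.95):
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_n())  # -> 59 runs for a one-sided 95/95 statement
```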

Briggs, L. L.; Nuclear Engineering Division

2009-03-24

373

Climate change, uncertainty, and natural resource management  

USGS Publications Warehouse

Climate change and its associated uncertainties are of concern to natural resource managers. Although aspects of climate change may be novel (e.g., system change and nonstationarity), natural resource managers have long dealt with uncertainties and have developed corresponding approaches to decision-making. Adaptive resource management is an application of structured decision-making for recurrent decision problems with uncertainty, focusing on management objectives and the reduction of uncertainty over time. We identified 4 types of uncertainty that characterize problems in natural resource management. We examined ways in which climate change is expected to exacerbate these uncertainties, as well as potential approaches to dealing with them. As a case study, we examined North American waterfowl harvest management and considered problems anticipated to result from climate change and potential solutions. Despite challenges expected to accompany the use of adaptive resource management to address problems associated with climate change, we conclude that adaptive resource management approaches will be the methods of choice for managers trying to deal with the uncertainties of climate change. © 2010 The Wildlife Society.

Nichols, J. D.; Koneff, M. D.; Heglund, P. J.; Knutson, M. G.; Seamans, M. E.; Lyons, J. E.; Morton, J. M.; Jones, M. T.; Boomer, G. S.; Williams, B. K.

2011-01-01

374

Predictive uncertainty in auditory sequence processing.  

PubMed

Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty, a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
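
A minimal sketch of the entropy computation at the heart of this design: the Shannon entropy of a next-note probability distribution, which is high when the melodic context leaves the continuation uncertain. The distributions below are hypothetical stand-ins for the output of a variable-order Markov model.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a next-event probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

# Hypothetical next-note distributions over four candidate continuations:
low_entropy_context = [0.85, 0.05, 0.05, 0.05]    # strongly expected note
high_entropy_context = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty

print(shannon_entropy(low_entropy_context))    # ~0.85 bits
print(shannon_entropy(high_entropy_context))   # 2.0 bits
```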

Hansen, Niels Chr; Pearce, Marcus T

2014-01-01

375

Predictive uncertainty in auditory sequence processing  

PubMed Central

Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty, a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018

Hansen, Niels Chr.; Pearce, Marcus T.

2014-01-01

376

Experimental Basis for Robust On-orbit Uncertainty Estimates for CLARREO InfraRed Sensors  

NASA Astrophysics Data System (ADS)

As defined by the National Research Council Decadal Survey of 2006, the CLimate Absolute Radiance and REfractivity Observatory (CLARREO) satisfies the need for a long-term global benchmark record of critical climate variables that are accurate over very long time periods, can be tested for systematic errors by future generations, are unaffected by interruption, and are pinned to international standards. These observational requirements (testing for systematic errors, accuracy over indefinite time, and linkage to internationally recognized measurement standards) are achievable through an appeal to the concept of SI traceability. That is, measurements are made such that they are linked through an unbroken chain of comparisons, where each comparison has a stated and credible uncertainty, back to the definitions of the International System (SI) Units. While the concept of SI traceability is a straightforward one, achieving credible estimates of uncertainty, particularly in the case of complex sensors deployed in orbit, poses a significant challenge. Recently, a set of principles has been proposed to guide the development of sensors that realize fully the benefits of SI traceability. The application of these principles to the spectral infrared sensor that is part of the CLARREO mission is discussed. These principles include, but are not limited to: basing the sensor calibration on a reproducible physical property of matter, devising experimental tests for known sources of measurement bias (or systematic uncertainty), and providing independent system-level checks for the end-to-end radiometric performance of the sensor. The application of these principles to the infrared sensor leads to the following conclusions. To obtain the lowest uncertainty (or highest accuracy), the calibration should be traceable to the definition of the Kelvin, that is, the triple point of water. Realization of a Kelvin-based calibration is achieved through the use of calibration blackbodies. It is necessary to implement experimental tests for changes in the optical and thermodynamic properties of the blackbodies, in addition to implementing tests for radiometric uncertainties arising from polarization, stray light, and detector chain nonlinearities, and for sensitivity to influencing parameters from the local sensor environment and the target radiation. The implication of these conclusions is that a multi-institutional effort to design and test the sensor is necessary to achieve the transparency required to bolster the credibility of the observational results and their associated uncertainties. Practical options for pursuing this effort will be explored.

Dykema, J. A.; Revercomb, H. E.; Anderson, J.

2009-12-01

377

THE MORAL FOUNDATION OF THE PRECAUTIONARY PRINCIPLE  

Microsoft Academic Search

The Commission's recent interpretation of the Precautionary Principle is used as a starting point for an analysis of the moral foundation of this principle. The Precautionary Principle is shown to have the ethical status of an amendment to a liberal principle to the effect that a state may restrict a person's actions only in order to prevent unacceptable harm to

KARSTEN KLINT JENSEN

2002-01-01

378

Measures of uncertainty in market network analysis  

NASA Astrophysics Data System (ADS)

A general approach to measuring the statistical uncertainty of different filtration techniques for market network analysis is proposed. Two measures of statistical uncertainty are introduced and discussed. One is based on conditional risk for multiple decision statistical procedures, and the other is based on the average fraction of errors. It is shown that for some important cases the second measure is a particular case of the first one. The proposed approach is illustrated by numerical evaluation of statistical uncertainty for popular network structures (minimum spanning tree, planar maximally filtered graph, market graph, maximum cliques and maximum independent sets) in the framework of a Gaussian network model of the stock market.
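
A toy sketch of one filtration technique named above, the minimum spanning tree built from the standard correlation distance d = sqrt(2(1 - rho)); estimating its statistical uncertainty (e.g., by resampling returns and counting edge changes) follows the spirit, though not the exact procedures, of the paper. The data here are synthetic.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
returns = rng.normal(size=(250, 6))          # hypothetical daily returns, 6 stocks
rho = np.corrcoef(returns, rowvar=False)     # correlation matrix
dist = np.sqrt(2.0 * (1.0 - rho))            # correlation distance
np.fill_diagonal(dist, 0.0)

mst = minimum_spanning_tree(dist)            # market network filtration
edges = np.transpose(mst.nonzero())
print("MST edges (stock index pairs):")
print(edges)
```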

Kalyagin, V. A.; Koldanov, A. P.; Koldanov, P. A.; Pardalos, P. M.; Zamaraev, V. A.

2014-11-01

379

Uncertainty in weather and climate prediction  

PubMed Central

Following Lorenz's seminal work on chaos theory in the 1960s, probabilistic approaches to prediction have come to dominate the science of weather and climate forecasting. This paper gives a perspective on Lorenz's work and how it has influenced the ways in which we seek to represent uncertainty in forecasts on all lead times from hours to decades. It looks at how model uncertainty has been represented in probabilistic prediction systems and considers the challenges posed by a changing climate. Finally, the paper considers how the uncertainty in projections of climate change can be addressed to deliver more reliable and confident assessments that support decision-making on adaptation and mitigation. PMID:22042896

Slingo, Julia; Palmer, Tim

2011-01-01

380

Uncertainty: the Curate's egg in financial economics.  

PubMed

Economic theories of uncertainty are unpopular with financial experts. As sociologists, we rightly refuse predictions, but the uncertainties of money are constantly sifted and turned into semi-denial by a financial economics set on somehow beating the future. Picking out 'bits' of the future as 'risk' and 'parts' as 'information' is attractive but socially dangerous, I argue, because money's promises are always uncertain. New studies of uncertainty are reversing sociology's neglect of the unavoidable inability to know the forces that will shape the financial future. PMID:24712756

Pixley, Jocelyn

2014-06-01

381

Uncertainty analysis for Ulysses safety evaluation report  

NASA Technical Reports Server (NTRS)

As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator, the Interagency Nuclear Safety Review Panel (INSRP) performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

Frank, Michael V.

1991-01-01

382

Two basic Uncertainty Relations in Quantum Mechanics  

SciTech Connect

In the present article, we discuss two types of uncertainty relations in Quantum Mechanics: multiplicative and additive inequalities for two canonical observables. The multiplicative uncertainty relation was discovered by Heisenberg. A few years later (1930), Erwin Schroedinger generalized it and made it more precise than the original. The additive uncertainty relation is based on the three independent statistical moments in Quantum Mechanics: Cov(q,p), Var(q) and Var(p). We discuss the existing symmetry of both types of relations and the applicability of the additive form to the estimation of the total error.
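
For orientation, the two multiplicative relations referred to above take the standard forms (the additive relation is specific to this paper, so only its ingredients appear here):

$\mathrm{Var}(q)\,\mathrm{Var}(p) \ge \frac{\hbar^2}{4}$ (Heisenberg), and Schroedinger's refinement

$\mathrm{Var}(q)\,\mathrm{Var}(p) - \mathrm{Cov}(q,p)^2 \ge \frac{\hbar^2}{4},$

where $\mathrm{Cov}(q,p) = \tfrac{1}{2}\langle qp + pq \rangle - \langle q \rangle\langle p \rangle$ is the symmetrized covariance.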

Angelow, Andrey [Institute of Solid State Physics, Bulgarian Academy of Sciences, 72 Tzarigradsko chaussee, 1784 Sofia (Bulgaria)

2011-04-07

383

Demonstrating Fermat's Principle in Optics  

ERIC Educational Resources Information Center

We demonstrate Fermat's principle in optics by a simple experiment using reflection from an arbitrarily shaped one-dimensional reflector. We investigated a range of possible light paths from a lamp to a fixed slit by reflection in a curved reflector and showed by direct measurement that the paths along which light is concentrated have either

Paleiov, Orr; Pupko, Ofir; Lipson, S. G.

2011-01-01

384

Aesthetic Principles for Instructional Design  

ERIC Educational Resources Information Center

This article offers principles that contribute to developing the aesthetics of instructional design. Rather than describing merely the surface qualities of things and events, the concept of aesthetics as applied here pertains to heightened, integral experience. Aesthetic experiences are those that are immersive, infused with meaning, and felt as

Parrish, Patrick E.

2009-01-01

385

CURRICULUM PRINCIPLES FOR PROSPECTIVE TEACHERS.  

ERIC Educational Resources Information Center

TO TEST THE EFFECTIVENESS OF TWO APPROACHES TO TEACHING PROSPECTIVE TEACHERS HOW TO SELECT APPROPRIATE INSTRUCTIONAL OBJECTIVES, 127 SECONDARY EDUCATION STUDENTS WERE PLACED IN TWO SAMPLE GROUPS AND TAUGHT DIFFERENT CURRICULUM PRINCIPLES. GROUP I WAS TAUGHT THE FIVE POINT RATIONALE DEVELOPED BY TYLER WHILE GROUP II USED THE BLOOM CLASSIFICATION OF

BAKER, EVA L.; POPHAM, W. JAMES

386

Principles of Condensed Matter Physics  

Microsoft Academic Search

Although there are many books on solid state physics and condensed matter, I suspect very few cover the same content as that found in Principles of Condensed Matter Physics by Chaikin and Lubensky. The title is rather misleading as it suggests a survey of the important concepts in condensed matter. In spite of this there is much to commend

C C Matthai

2000-01-01

387

Biology 2250 Principles of Genetics  

E-print Network

and random. Ultimate source of genetic variation. Cancer: proto-oncogene -> oncogene -> cancer mutation. Biology 2250 Principles of Genetics. Announcements: Lab 4 Information: B2250 (Innes) webpage. Wed., Thur, or by appointment: 737-4754, dinnes@mun.ca. Mendelian Genetics Topics: -Transmission of DNA

Innes, David J.

388

Principles and practice of sonography  

SciTech Connect

This book is a text of sonographic technique, emphasizing clinical and diagnostic procedures. Ultrasound images and explanatory line drawings are placed side-by-side to facilitate interpretation. This book covers instrumentation and scanning principles, obstetric, gynecologic, abdominal, renal and urologic, pediatric, plus superficial structure sonography.

Fleischer, A.C.; James, A.E.

1987-01-01

389

Principles of Equilibrium Statistical Mechanics  

Microsoft Academic Search

This modern textbook provides a complete survey of the broad field of statistical mechanics. Based on a series of lectures, it adopts a special pedagogical approach. The authors, both excellent lecturers, clearly distinguish between general principles and their applications in solving problems. Analogies between phase transitions in fluids and magnets using continuum and spin models are emphasized, leading to a

Debashish Chowdhury; Dietrich Stauffer

2000-01-01

390

Sociology 110 PRINCIPLES OF SOCIOLOGY  

E-print Network

Sociology 110 PRINCIPLES OF SOCIOLOGY Spring, 2008 Instructor Rob Balch, SS 325 Phone: 243 Sociology is the study of human social behavior. Sociologists are especially interested in groups, from sociological concepts to describe face-to-face interaction, small groups, communities, complex organizations

Vonessen, Nikolaus

391

Principles of Infectious Disease Epidemiology  

Microsoft Academic Search

In this chapter, principles and concepts of modern infectious disease epidemiology are presented. We delineate the role of epidemiology for public health and discuss the characteristics of infectious disease epidemiology. This chapter also includes definitions of important terms used in infectious disease epidemiology.

Alexander Krämer; Manas Akmatov; Mirjam Kretzschmar

392

On the Dirichlet's Box Principle  

ERIC Educational Resources Information Center

In this note, we will focus on several applications of Dirichlet's box principle in Discrete Mathematics lessons and number theory lessons. In addition, the main result is an innovative game on a triangular board developed by the authors. The game has been used in teaching and learning mathematics in Discrete Mathematics and some high schools in
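
For reference, the principle itself in its standard form: if $m$ objects are distributed among $n$ boxes, then some box contains at least $\lceil m/n \rceil$ objects; in particular, placing $n+1$ objects into $n$ boxes forces two objects to share a box.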

Poon, Kin-Keung; Shiu, Wai-Chee

2008-01-01

393

Electronic Structure Principles and Aromaticity  

ERIC Educational Resources Information Center

The relationship between aromaticity and stability in molecules on the basis of quantities such as hardness and electrophilicity is explored. The findings reveal that aromatic molecules are less energetic, harder, less polarizable, and less electrophilic as compared to antiaromatic molecules, as expected from the electronic structure principles.

Chattaraj, P. K.; Sarkar, U.; Roy, D. R.

2007-01-01

394

Principles for Teaching Problem Solving  

NSDL National Science Digital Library

This 14-page monograph addresses the need to teach problem solving and other higher order thinking skills. After summarizing research and positions of various organizations, it defines several models and describes cognitive and attitudinal components of problem solving and the types of knowledge that are required. The authors provide a list of principles for teaching problem solving and include a list of references.

Kirkley, Rob F.

2003-01-01

395

Novel interpretation of synchro principles  

Microsoft Academic Search

This paper deals with a new interpretation of synchro principles. Approximate equations for current and torque in a synchro may be established directly from synchronous machine theory. However, this analysis is built up from fundamental laws in order to establish a theory especially adapted to synchros. The investigation centres on obtaining the basic mathematical relations governing the electro-mechanical operation of a

F. L. N-NAGY; B. D. McNULTY

1969-01-01

396

Variational Principles for Water Waves  

E-print Network

We describe the Hamiltonian structures, including the Poisson brackets and Hamiltonians, for free boundary problems for incompressible fluid flows with vorticity. The Hamiltonian structure is used to obtain variational principles for stationary gravity waves both for irrotational flows as well as flows with vorticity.
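
For the irrotational case mentioned above, the canonical (Zakharov) form of the water-wave Hamiltonian structure, with surface elevation $\eta$ and surface velocity potential $\psi$, reads (a standard form; the paper's generalization to flows with vorticity is more involved):

$\frac{\partial \eta}{\partial t} = \frac{\delta H}{\delta \psi}, \qquad \frac{\partial \psi}{\partial t} = -\frac{\delta H}{\delta \eta},$

where $H[\eta,\psi]$ is the total (kinetic plus gravitational potential) energy of the fluid.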

Boris Kolev; David H. Sattinger

2007-12-01

397

Biology 2250 Principles of Genetics  

E-print Network

1 Biology 2250 Principles of Genetics. Instructors: Dr. Steven M. Carr - Molecular Genetics; Dr. David J. Innes - Mendelian Genetics. Course Web Pages: Dr. Innes: www.mun.ca/biology/dinnes/B2250/B2250. Structure and function of DNA and genes. Innes: Oct. 13 - Nov. 15, Mendelian Genetics; Carr: Nov. 22 - Dec. 1

Innes, David J.

398

Dealing with uncertainty in ecosystem models: The paradox of use for living marine resource management  

NASA Astrophysics Data System (ADS)

To better manage living marine resources (LMRs), it has become clear that ecosystem-based fisheries management (EBFM) is a desired approach, and ecosystem models are one of the key tools for carrying it out. To fully use ecosystem models and have their outputs adopted, there is an increasingly recognized need to address the uncertainty associated with such modeling activities. Here we group the uncertainty affecting ecosystem models into six major factors: natural variability; observation error; inadequate communication among scientists, decision-makers and stakeholders; the structural complexity of the model(s) used; outcome uncertainty; and unclear management objectives. We then describe best practices to address each of these uncertainties as they particularly apply to ecosystem models being used in an LMR management context. We also present case studies to highlight examples of how these best practices have been implemented. Although we acknowledge that this work was compiled by ecosystem modelers in an LMR management context primarily for other ecosystem modelers, the principles and practices described herein are also germane for managers, stakeholders and other natural resource management communities. We conclude by emphasizing not only the need to address uncertainty in ecosystem models, but also the feasibility and benefits of doing so.

Link, J. S.; Ihde, T. F.; Harvey, C. J.; Gaichas, S. K.; Field, J. C.; Brodziak, J. K. T.; Townsend, H. M.; Peterman, R. M.

2012-09-01

399

Uncertainty of Measurement: A Review of the Rules for Calculating Uncertainty Components through Functional Relationships  

PubMed Central

The Evaluation of Measurement Data - Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides general rules for evaluating and expressing uncertainty in measurement. When a measurand, y, is calculated from other measurements through a functional relationship, uncertainties in the input variables will propagate through the calculation to an uncertainty in the output y. The manner in which such uncertainties are propagated through a functional relationship provides much of the mathematical challenge to fully understanding the GUM. The aim of this review is to provide a general overview of the GUM and to show how the calculation of uncertainty in the measurand may be achieved through a functional relationship. That is, starting with the general equation for combining uncertainty components as outlined in the GUM, we show how this general equation can be applied to various functional relationships in order to derive a combined standard uncertainty for the output value of the particular function (the measurand). The GUM equation may be applied to any mathematical form or functional relationship (the starting point for laboratory calculations) and describes the propagation of uncertainty from the input variable(s) to the output value of the function (the end point or outcome of the laboratory calculation). A rule-based approach is suggested with a number of the more common rules tabulated for the routine calculation of measurement uncertainty. PMID:22896744
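
In symbols, the GUM combination referred to above, for a measurand $y = f(x_1,\dots,x_N)$:

$u_c^2(y) \;=\; \sum_{i=1}^{N}\left(\frac{\partial f}{\partial x_i}\right)^{2} u^2(x_i) \;+\; 2\sum_{i=1}^{N-1}\sum_{j=i+1}^{N}\frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\,u(x_i,x_j).$

For example, for the quotient $y = x_1/x_2$ with independent inputs, this yields the familiar rule $\left(u_c(y)/y\right)^2 = \left(u(x_1)/x_1\right)^2 + \left(u(x_2)/x_2\right)^2$.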

Farrance, Ian; Frenkel, Robert

2012-01-01

400

The Principle of Energetic Consistency  

NASA Technical Reports Server (NTRS)

A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of energetic consistency implies that, to precisely the extent that growing modes are important in data assimilation, this term is also important.
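
In symbols (a compact paraphrase of the statement above, with notation assumed here): if the state vector $x$ is expressed in natural energy variables so that the total energy is $E(x)=\|x\|^2$, and $\bar{x}$ and $P$ denote the conditional mean and covariance, then between observations

$\|\bar{x}\|^2 + \mathrm{tr}(P) \;=\; \big\langle \|x\|^2 \big\rangle \;=\; \text{const},$

since energy conservation fixes the expected total energy, and the expectation identity $\langle\|x\|^2\rangle = \|\bar{x}\|^2 + \mathrm{tr}(P)$ splits it between the mean and the total variance.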

Cohn, Stephen E.

2009-01-01

401

10 CFR 436.24 - Uncertainty analyses.  

Code of Federal Regulations, 2011 CFR

... Uncertainty analyses. 436.24 Section 436.24 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses 436.24...

2011-01-01

402

Uncertainty in Mixtures and Cumulative Risk Assessment  

EPA Science Inventory

Uncertainty in Mixtures and Cumulative Risk Assessment JC Lipscomb and GE Rice U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, Cincinnati, Ohio, USA Humans and environmental species are rarely exposed to sing...

403

Methods for Composing Tradeoff Studies under Uncertainty  

E-print Network

subsystem-level tradeoff studies under uncertainty into mathematically valid system-level tradeoff studies and efficiently eliminate inferior alternatives through intelligent sampling. The approaches are based on three key ideas: the use of stochastic...

Bily, Christopher

2012-10-19

404

Uncertainty in climate change policy analysis  

E-print Network

Achieving agreement about whether and how to control greenhouse gas emissions would be difficult enough even if the consequences were fully known. Unfortunately, choices must be made in the face of great uncertainty, about ...

Jacoby, Henry D.; Prinn, Ronald G.

405

Radiometer Design Analysis Based Upon Measurement Uncertainty  

NASA Technical Reports Server (NTRS)

This paper introduces a method for predicting the performance of a radiometer design based on calculating the measurement uncertainty. The variety in radiometer designs and the demand for improved radiometric measurements justify the need for a more general and comprehensive method to assess system performance. Radiometric resolution, or sensitivity, is a figure of merit that has been commonly used to characterize the performance of a radiometer. However, when evaluating the performance of a calibration design for a radiometer, the use of radiometric resolution has limited application. These limitations are overcome by considering instead the measurement uncertainty. A method for calculating measurement uncertainty for a generic radiometer design, including its calibration algorithm, is presented. The result is a generalized technique by which system calibration architectures and design parameters can be studied to optimize instrument performance for given requirements and constraints. Example applications demonstrate the utility of using measurement uncertainty as a figure of merit.
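
A toy sketch of the kind of calculation involved, assuming a simple two-point (hot/cold reference) calibration and first-order propagation; the design, symbols, and numbers are hypothetical, not the paper's specific architecture.

```python
import numpy as np

# Two-point calibrated radiometer: antenna temperature from detector counts.
#   T = T_c + (T_h - T_c) * (v - v_c) / (v_h - v_c)
def t_antenna(v, v_c, v_h, t_c, t_h):
    return t_c + (t_h - t_c) * (v - v_c) / (v_h - v_c)

# First-order propagation over the inputs (assumed independent), with
# numerical partial derivatives.
def u_t_antenna(v, v_c, v_h, t_c, t_h, u):
    """u: dict of standard uncertainties for each input."""
    eps = 1e-6
    x0 = dict(v=v, v_c=v_c, v_h=v_h, t_c=t_c, t_h=t_h)
    var = 0.0
    for name, ux in u.items():
        x = dict(x0)
        x[name] += eps
        dfdx = (t_antenna(**x) - t_antenna(**x0)) / eps
        var += (dfdx * ux) ** 2
    return np.sqrt(var)

u = dict(v=2.0, v_c=2.0, v_h=2.0, t_c=0.5, t_h=1.0)   # hypothetical
print(u_t_antenna(v=5200.0, v_c=4000.0, v_h=8000.0,
                  t_c=80.0, t_h=300.0, u=u))          # combined std (K)
```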

Racette, Paul E.; Lang, Roger H.

2004-01-01

406

Utilizing general information theories for uncertainty quantification  

SciTech Connect

Uncertainties enter into a complex problem from many sources: variability, errors, and lack of knowledge. A fundamental question arises in how to characterize the various kinds of uncertainty and then combine within a problem such as the verification and validation of a structural dynamics computer model, reliability of a dynamic system, or a complex decision problem. Because uncertainties are of different types (e.g., random noise, numerical error, vagueness of classification), it is difficult to quantify all of them within the constructs of a single mathematical theory, such as probability theory. Because different kinds of uncertainty occur within a complex modeling problem, linkages between these mathematical theories are necessary. A brief overview of some of these theories and their constituents under the label of Generalized lnforrnation Theory (GIT) is presented, and a brief decision example illustrates the importance of linking at least two such theories.

Booker, J. M. (Jane M.)

2002-01-01

407

Performance and robustness analysis for structured uncertainty  

Microsoft Academic Search

This paper introduces a nonconservative measure of performance for linear feedback systems in the face of structured uncertainty. This measure is based on a new matrix function, which we call the Structured Singular Value.
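
The definition underlying that measure, for reference, in its standard form:

$\mu_{\boldsymbol{\Delta}}(M) \;=\; \Big(\min\big\{\,\bar{\sigma}(\Delta) \;:\; \Delta\in\boldsymbol{\Delta},\ \det(I - M\Delta)=0 \,\big\}\Big)^{-1},$

with $\mu_{\boldsymbol{\Delta}}(M)=0$ if no $\Delta\in\boldsymbol{\Delta}$ makes $I - M\Delta$ singular; here $\boldsymbol{\Delta}$ is the set of allowed structured perturbations and $\bar{\sigma}$ denotes the largest singular value.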

John C. Doyle; Joseph E. Wall; Gunter Stein

1982-01-01

408

Estimating uncertainties in integrated reservoir studies  

E-print Network

To make sound investment decisions, decision makers need accurate estimates of the uncertainties present in forecasts of reservoir performance. In this work I propose a method, the integrated mismatch method, that incorporates the misfit...

Zhang, Guohong

2004-09-30

409

Analysis of S-Circuit Uncertainty  

E-print Network

The theory of sensori-computational circuits provides a capable framework for the description and optimization of robotic systems, including on-line optimizations. This theory, however, is inadequate in that it does not account for uncertainty in a...

Ahmed, Taahir

2011-08-08

410

Squeezing effect induced by minimal length uncertainty  

E-print Network

In this work, the dynamics of the deformed one-dimensional harmonic oscillator with minimal length uncertainty is examined, and analytical solutions for the time evolution of the position and momentum operators are presented, in which the rough approximation that neglects higher-order terms in the Baker-Hausdorff lemma is avoided. Based on these analytical solutions, the uncertainties for the position and momentum operators are calculated in a coherent state, and an unexpected squeezing effect in both the coordinate and momentum directions is found in comparison with the ordinary harmonic oscillator. Such a squeezing effect is evidently induced by the minimal length uncertainty (gravitational effects). Our results are applied to electrons trapped in strong magnetic fields to examine the degree of the squeezing effect in a real system, which shows that the squeezing induced by the minimal length uncertainty is very small.

Yue-Yue Chen; Xun-Li Feng; C. H. Oh; Zhi-Zhan Xu

2014-05-19

411

Quantifying reliability uncertainty : a proof of concept.  

SciTech Connect

This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
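
A minimal sketch of the Bayesian side for go/no-go data, assuming conjugate beta priors per component and a simple series system; the system structure, priors, and counts are hypothetical, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Go/no-go data per component: (successes, trials); Jeffreys beta(0.5, 0.5) prior.
components = {"A": (48, 50), "B": (29, 30)}   # hypothetical series system

def posterior_draws(successes, trials, n=100_000, a0=0.5, b0=0.5):
    """Monte Carlo draws from the beta posterior for one component."""
    return rng.beta(a0 + successes, b0 + trials - successes, size=n)

# Series system: component reliabilities multiply draw-by-draw.
r_sys = np.ones(100_000)
for s, t in components.values():
    r_sys *= posterior_draws(s, t)

lo, med = np.percentile(r_sys, [5, 50])
print(f"posterior median system reliability: {med:.3f}")
print(f"5th percentile (lower credible bound): {lo:.3f}")
# Note the sensitivity to components with few trials -- e.g. zero observed
# failures in a small sample, as flagged in the paper.
```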

Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F.; Fatherley, Quinn (Los Alamos National Laboratory, Los Alamos, NM); Anderson-Cook, Christine (Los Alamos National Laboratory, Los Alamos, NM); Wilson, Alyson G. (Los Alamos National Laboratory, Los Alamos, NM); Zurn, Rena M.

2009-10-01

412

A note on competitive investment under uncertainty  

E-print Network

This paper clarifies how uncertainty affects irreversible investment in a competitive market equilibrium. With free entry, irreversibility affects the distribution of future prices, and thereby creates an opportunity cost ...

Pindyck, Robert S.

1991-01-01

413

Uncertainty Quantification in ocean state estimation  

E-print Network

Quantifying uncertainty and error bounds is a key outstanding challenge in ocean state estimation and climate research. It is particularly difficult due to the large dimensionality of this nonlinear estimation problem and ...

Kalmikov, Alexander G

2013-01-01

414

QUANTIFICATION OF UNCERTAINTY IN COMPUTATIONAL FLUID DYNAMICS  

Microsoft Academic Search

This review covers Verification, Validation, Confirmation and related subjects for computational fluid dynamics (CFD), including error taxonomies, error estimation and banding, convergence rates, surrogate estimators, nonlinear dynamics, and error estimation for grid adaptation vs Quantification of Uncertainty.

P. J. Roache

1997-01-01

415

Dynamics of the Thermohaline Circulation Under Uncertainty  

E-print Network

The ocean thermohaline circulation under uncertainty is investigated by a random dynamical systems approach. It is shown that the asymptotic dynamics of the thermohaline circulation is described by a random attractor and by a system with finite degrees of freedom.

Wei Wang; Jianhua Sun; Jinqiao Duan

2006-06-30

416

RIMS Workshop Mathematics for Uncertainty and Fuzziness  

E-print Network

Murofushi 9:00-9:30 Kenjiro Yanagi (Yamaguchi University) Non-Hermitian extensions of uncertainty relations; Inequalities for sums of joint entropy based on the strong subadditivity 16:20-16:50 Masahiro Yanagida (Tokyo

417

Error Detection and Recovery for Robot Motion Planning with Uncertainty  

E-print Network

Robots must plan and execute tasks in the presence of uncertainty. Uncertainty arises from sensing errors, control errors, and uncertainty in the geometry of the environment. The last, which is called model error, has ...

Donald, Bruce Randall

1987-07-01

418

Uncertainty Quantification Techniques of SCALE/TSUNAMI  

SciTech Connect

The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for a gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
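
The propagation step described above is conventionally the 'sandwich rule': with $S_R$ the vector of relative sensitivities of a response $R$ to the cross sections $\alpha$, and $C_{\alpha\alpha}$ their relative covariance matrix,

$\left(\frac{\sigma_R}{R}\right)^{2} = S_R^{\top}\, C_{\alpha\alpha}\, S_R ,$

so the response uncertainty follows directly from the sensitivity coefficients and the cross-section covariance data.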

Rearden, Bradley T [ORNL] [ORNL; Mueller, Don [ORNL] [ORNL

2011-01-01

419

The ends of uncertainty: Air quality science and planning in Central California  

SciTech Connect

Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990's. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.

Fine, James

2003-09-01

420

Propagating ν-Interaction Uncertainties via Event Reweighting

NASA Astrophysics Data System (ADS)

We present an event reweighting scheme for propagating neutrino cross-section and intranuclear hadron transport model uncertainties, developed for GENIE-based (C. Andreopoulos et al., arXiv:0905.2517 [hep-ph]) neutrino physics simulations. We discuss the motivations, implementation and validation of the scheme and show an example application where it is used to evaluate the associated systematic uncertainties for neutral-current π0 production.
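
A schematic of the reweighting idea, not GENIE's actual API: each simulated event receives a weight equal to the ratio of the varied to the nominal model probability for that event, so systematic shifts can be evaluated without regenerating events. The model and numbers below are toys.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "events": a kinematic variable x simulated under a nominal model.
x = rng.exponential(scale=1.0, size=100_000)

def model_density(x, lam):
    return lam * np.exp(-lam * x)        # toy cross-section shape

lam_nominal, lam_shifted = 1.0, 1.2      # e.g. a +1 sigma parameter shift

# Event-by-event weights: ratio of shifted to nominal densities.
w = model_density(x, lam_shifted) / model_density(x, lam_nominal)

nominal, _ = np.histogram(x, bins=50, range=(0, 5))
shifted, _ = np.histogram(x, bins=50, range=(0, 5), weights=w)
# The bin-by-bin difference estimates the systematic uncertainty band.
print((shifted - nominal)[:5])
```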

Dobson, J.; Andreopoulos, C.

2009-09-01

421

Uncertainty of Pyrometers in a Casting Facility  

SciTech Connect

This work has established uncertainty limits for the EUO filament pyrometers, digital pyrometers, two-color automatic pyrometers, and the standards used to certify these instruments (Table 1). If symmetrical limits are used, filament pyrometers calibrated in Production have certification uncertainties of not more than ±20.5 °C traceable to NIST over the certification period. Uncertainties of these pyrometers were roughly ±14.7 °C before introduction of the working standard that allowed certification in the field. Digital pyrometers addressed in this report have symmetrical uncertainties of not more than ±12.7 °C or ±18.1 °C when certified on a Y-12 Standards Laboratory strip lamp or in a production area tube furnace, respectively. Uncertainty estimates for automatic two-color pyrometers certified in Production are ±16.7 °C. Additional uncertainty and bias are introduced when measuring production melt temperatures. A -19.4 °C bias was measured in a large 1987 data set, which is believed to be caused primarily by use of Pyrex™ windows (not present in the current configuration) and window fogging. Large variability (2σ = 28.6 °C) exists in the first 10 min of the hold period. This variability is attributed to emissivity variation across the melt and reflection from hot surfaces. For runs with hold periods extending to 20 min, the uncertainty approaches the calibration uncertainty of the pyrometers. When certifying pyrometers on a strip lamp at the Y-12 Standards Laboratory, it is important to limit ambient temperature variation (23±4 °C), to order calibration points from high to low temperatures, to allow 6 min for the lamp to reach thermal equilibrium (12 min for certifications below 1200 °C) to minimize pyrometer bias, and to recalibrate the pyrometer if error exceeds vendor specifications. A procedure has been written to assure conformance.

Mee, D.K.; Elkins, J.E.; Fleenor, R.M.; Morrision, J.M.; Sherrill, M.W.; Seiber, L.E.

2001-12-07

422

Uncertainty relations based on mutually unbiased measurements  

E-print Network

We derive uncertainty relation inequalities for mutually unbiased measurements (MUMs). Based on the calculation of the index of coincidence of the probability distribution given by $d+1$ MUMs on any density operator $\rho$ in $\mathbb{C}^{d}$, both state-dependent and state-independent forms of lower entropic bounds are given. Furthermore, we formulate uncertainty relations for MUMs in terms of Rényi and Tsallis entropies.
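
The entropies appearing in the result, in standard form for a probability distribution $p$:

$R_{\alpha}(p) = \frac{1}{1-\alpha}\,\ln\!\Big(\sum_i p_i^{\alpha}\Big), \qquad T_{q}(p) = \frac{1}{1-q}\Big(\sum_i p_i^{q} - 1\Big), \qquad \alpha, q > 0,\ \alpha, q \neq 1,$

both reducing to the Shannon entropy in the limit $\alpha, q \to 1$.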

Bin Chen; Shao-Ming Fei

2014-07-25

423

Model uncertainty, performance persistence and flows  

Microsoft Academic Search

Model uncertainty makes it difficult to draw clear inference about mutual fund performance persistence. I propose a new performance measure, Bayesian model averaged (BMA) alpha, which explicitly accounts for model uncertainty. Using BMA alphas, I find evidence of performance persistence in a large sample of US funds. There is a positive and asymmetric relation between flows and past BMA alphas,
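
The averaging step implied above, in symbols (notation assumed here, not the paper's): with posterior model probabilities $p(M_k \mid D)$ over candidate asset-pricing models and model-specific alphas $\hat{\alpha}_k$,

$\hat{\alpha}_{\mathrm{BMA}} = \sum_k p(M_k \mid D)\, \hat{\alpha}_k ,$

so the performance estimate no longer conditions on any single model being correct.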

Yee Cheng Loon

2011-01-01

424

Design Optimization of Composite Structures under Uncertainty  

NASA Technical Reports Server (NTRS)

Design optimization under uncertainty is computationally expensive and is also challenging in terms of alternative formulation. The work under the grant focused on developing methods for design against uncertainty that are applicable to composite structural design with emphasis on response surface techniques. Applications included design of stiffened composite plates for improved damage tolerance, the use of response surfaces for fitting weights obtained by structural optimization, and simultaneous design of structure and inspection periods for fail-safe structures.

Haftka, Raphael T.

2003-01-01

425

Assessing MODIS Macrophysical Cloud Property Uncertainties  

NASA Astrophysics Data System (ADS)

Cloud, being multifarious and ephemeral, is difficult to observe and quantify in a systematic way. Even the basic terminology used to describe cloud observations is fraught with ambiguity in the scientific literature. Any observational technique, method, or platform will contain inherent and unavoidable measurement uncertainties. Quantifying these uncertainties in cloud observations is a complex task that requires an understanding of all aspects of the measurement. We will use cloud observations obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) to obtain metrics of the uncertainty of its cloud observations. Our uncertainty analyses will contain two main components: 1) an estimate of bias or uncertainty with respect to active measurements from CALIPSO, and 2) a relative uncertainty within the MODIS cloud climatologies themselves. Our method will link uncertainty to the physical observation and its environmental/scene characteristics. Our aim is to create statistical uncertainties that are based on the cloud observational values, satellite view geometry, surface type, etc., for cloud amount and cloud top pressure. The MODIS instruments on the NASA Terra and Aqua satellites provide observations over a broad spectral range (36 bands between 0.415 and 14.235 micron) and high spatial resolution (250 m for two bands, 500 m for five bands, 1000 m for 29 bands), which the MODIS cloud mask algorithm (MOD35) utilizes to provide clear/cloud determinations over a wide array of surface types, solar illuminations and view geometries. For this study we use the standard MODIS products, MOD03, MOD06 and MOD35, all of which were obtained from the NASA Level 1 and Atmosphere Archive and Distribution System.

Maddux, B. C.; Ackerman, S. A.; Frey, R.; Holz, R.

2013-12-01

426

Instrument fault detection in systems with uncertainties  

Microsoft Academic Search

We demonstrate how to detect instrument faults in non-linear time-varying processes that include uncertainties such as modelling error, parameter ambiguity, and input and output noise. The design of state estimation filters with minimum sensitivity to the uncertainties and maximum sensitivity to the instrument faults is described together with existence conditions for such filters. Simulations based on a non-linear chemical reactor

Watanabe, K.; Himmelblau, D. M.

1982-01-01

427

Principles of silicon surface chemistry from first principles  

SciTech Connect

First principles theoretical studies of dissociative adsorption of H₂, H₂O, SiH₄ and other species on Si(100)-2×1 demonstrate some common principles that permit qualitative understanding of the mechanisms of reactive adsorption on Si. The structures of transition states and the interactions among surface sites can also be understood in terms of correlations between surface structure and local electron density. For example, the transition states for dissociative adsorption involve buckled surface dimers, which present both electrophilic and nucleophilic reaction sites and allow efficient addition across the dimer. A surface Diels-Alder reaction will also be described, in which symmetric addition to an unbuckled surface dimer is allowed by orbital symmetry. The Diels-Alder product establishes novel reactive surface sites that may be useful for subsequent surface modification. This work has been done in collaboration with Sharmila Pai, Robert Konecny and Anita Robinson Brown.

Doren, D.J. [Univ. of Delaware, Newark, DE (United States)]

1996-10-01

428

Uncertainty analysis for wheelchair propulsion dynamics.  

PubMed

Wheelchair propulsion kinetic measurements require the use of custom pushrim force/moment measuring instruments which are not currently commercially available. With the ability to measure pushrim forces and moments has come the development of several dynamic metrics derived for analyzing key aspects of wheelchair propulsion. This paper presents several of the equations used to calculate or derive the primary variables used in the study of wheelchair propulsion biomechanics. The uncertainties for these variables were derived and then numerically calculated for a current version of the SMARTWheel. The results indicate that the SMARTWheel provides data with at most 5 to 10% uncertainty, depending upon the variable concerned, and that during most of the propulsion phase the uncertainty is considerably smaller (approximately 1%). The uncertainty analysis provides a more complete picture of the attainable accuracy of the SMARTWheel and of the degree of confidence with which the data can be recorded. The derivations and results indicate where improvements in the measurement of wheelchair propulsion biomechanical variables are likely to originate. The most efficient approach is to address those variables in the design of the system which make the greatest contribution to the uncertainty. Future research will focus on the point of force application and examination of nonlinear effects. PMID:9184899
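
The flavor of such a derivation can be shown with first-order (GUM-style) propagation through a simple derived variable. The formula F = M/r, tangential force from pushrim moment and radius, is a hypothetical stand-in here, not one of the paper's actual equations:

```python
import math

# Illustrative first-order uncertainty propagation for a derived propulsion
# variable. F = M / r (tangential force from pushrim moment M and radius r)
# is a hypothetical example; the real SMARTWheel equations are more involved.

M, u_M = 25.0, 0.5     # N*m and its standard uncertainty
r, u_r = 0.27, 0.001   # m and its standard uncertainty

F = M / r
# Partial derivatives: dF/dM = 1/r, dF/dr = -M/r^2; combine in quadrature.
u_F = math.sqrt((u_M / r) ** 2 + (M * u_r / r**2) ** 2)
print(f"F = {F:.1f} N, u_F = {u_F:.2f} N ({100 * u_F / F:.1f}%)")
```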

Cooper, R A; Boninger, M L; VanSickle, D P; Robertson, R N; Shimada, S D

1997-06-01

429

Correlated Uncertainties in Radiation Shielding Effectiveness  

NASA Technical Reports Server (NTRS)

The space radiation environment is composed of energetic particles which can deliver harmful doses of radiation that may lead to acute radiation sickness, cancer, and even death for insufficiently shielded crew members. Spacecraft shielding must provide structural integrity and minimize the risk associated with radiation exposure. The risk of radiation exposure induced death (REID) is a measure of the risk of dying from cancer induced by radiation exposure. Uncertainties in the risk projection model, quality factor, and spectral fluence are folded into the calculation of the REID by sampling from probability distribution functions. Consequently, determining optimal shielding materials that reduce the REID in a statistically significant manner has been found to be difficult. In this work, the difference of the REID distributions for different materials is used to study the effect of composition on shielding effectiveness. It is shown that the use of correlated uncertainties allows for the determination of statistically significant differences between materials despite the large uncertainties in the quality factor. This is in contrast to previous methods where uncertainties have been generally treated as uncorrelated. It is concluded that the use of correlated quality factor uncertainties greatly reduces the uncertainty in the assessment of shielding effectiveness for the mitigation of radiation exposure.
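
The benefit of correlated uncertainties can be sketched with common random numbers: if the dominant (quality-factor-like) uncertainty is sampled once and shared by both materials, it largely cancels in the difference of the two REID samples, while independent sampling leaves it in. All numbers below are illustrative, not values from the study:

```python
import numpy as np

# Sketch of why correlated uncertainties sharpen material comparisons: a
# shared "quality factor" Q sampled once per trial (correlated) cancels in
# the difference between materials; fresh samples (uncorrelated) do not.

rng = np.random.default_rng(1)
n = 100_000
Q = rng.lognormal(0.0, 0.5, n)           # shared quality-factor uncertainty
dose_A, dose_B = 1.00, 0.95              # illustrative dose factors

reid_A = dose_A * Q
reid_B_corr = dose_B * Q                              # same Q: correlated
reid_B_unc = dose_B * rng.lognormal(0.0, 0.5, n)      # fresh Q: uncorrelated

print("std of difference, correlated:  ", np.std(reid_A - reid_B_corr))
print("std of difference, uncorrelated:", np.std(reid_A - reid_B_unc))
```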

Werneth, Charles M.; Maung, Khin Maung; Blattnig, Steve R.; Clowdsley, Martha S.; Townsend, Lawrence W.

2013-01-01

430

A procedure for assessing uncertainty in models  

SciTech Connect

This paper discusses uncertainty in the output of a model due to uncertainty in input values. Uncertainty in input values, characterized by suitable probability distributions, propagates through the model to a probability distribution of an output. Our objective in studying uncertainty is to identify a subset of inputs as being important in the sense that fixing them greatly reduces the uncertainty, or variability, in the output. The procedures we propose are demonstrated with an application of the model called the MELCOR Accident Consequence Code System (MACCS), described in Helton et al. (1992). The purpose of MACCS is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. In any particular application of MACCS there are likely to be many possible inputs and outputs of interest. In this paper, attention focuses on a single output and 36 inputs. Our objective is to determine a subset of the 36 model inputs that can be said to be dominant, or important, in the sense that they are the principal contributors to uncertainty in the output.
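
The "fix an input and watch the output variance drop" idea can be sketched on a toy model standing in for MACCS. Fixing the input at its mean is a crude one-point probe; a full analysis would average the conditional variance over many fixed values:

```python
import numpy as np

# Sketch of input-importance screening by fixing one input. The model is a
# toy stand-in for MACCS; importance is read off as the fraction of output
# variance removed when the input is held fixed.

rng = np.random.default_rng(2)
n = 200_000
x1, x2, x3 = rng.normal(0, 1, (3, n))

def model(a, b, c):
    return 3.0 * a + 0.5 * b + 0.1 * c ** 2   # toy output

y = model(x1, x2, x3)
total_var = y.var()

# Variance remaining when x1 is fixed at its mean (a single-point probe;
# a full analysis would average over many fixed values of x1):
y_fixed = model(np.zeros(n), x2, x3)
print("variance reduction from fixing x1:", 1 - y_fixed.var() / total_var)
```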

McKay, M.D.; Beckman, R.J.

1993-11-01

431

Sampling uncertainty in satellite rainfall estimates  

NASA Astrophysics Data System (ADS)

Accurate estimates of global precipitation patterns are essential for a better understanding of the hydrological cycle. Satellite observations allow for large-scale estimates of rainfall intensities. Uncertainties in current satellite-based rainfall estimates are due to uncertainties in the retrieval process as well as to the different temporal and spatial sampling patterns of the observation systems. This study focuses on analyzing the sampling-associated uncertainty for thirteen low-Earth-orbiting satellites carrying microwave instruments suitable for rainfall measurement. Satellites were grouped by the types of microwave sensors, where NOAA satellites with cross-track sounders and DMSP satellites with conical scanners make up the core of the constellations. The effect of three-hourly geostationary measurements on the sampling uncertainty was evaluated as well. A precise orbital model, SGP4, was used to generate a realistic satellite overpass database in which orbital shifts are taken into account. Using the overpass database we resampled rain gauge time series to simulate satellite rainfall estimates free of retrieval and calibration errors. We look at two regions, Germany and Benin, areas with different precipitation regimes. Our analysis shows that the sampling uncertainty for all available satellites may differ by up to 100% for different latitudes and precipitation regimes. However, the performance of the various satellite groups is similar, with greater differences at higher latitudes. The addition of three-hourly geostationary observations reduces the sampling uncertainty, but only to a limited extent.
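
The resampling idea can be sketched with a synthetic gauge record: subsample the full series at assumed revisit intervals and compare the subsampled mean to the full-record mean. A real study would derive overpass times from an orbital model such as SGP4 rather than from fixed intervals:

```python
import numpy as np

# Sketch of sampling-uncertainty estimation by resampling a (here synthetic)
# rain-gauge time series at satellite revisit times. Fixed revisit intervals
# stand in for the SGP4-derived overpass database used in the study.

rng = np.random.default_rng(3)
minutes = 60 * 24 * 90                    # 90 days of 1-min gauge data
rain = rng.gamma(0.05, 2.0, minutes)      # intermittent, skewed "rainfall"

true_mean = rain.mean()
for revisit_h in (3, 6, 12):              # assumed revisit intervals (hours)
    sampled = rain[:: revisit_h * 60]     # one sample per overpass
    err = abs(sampled.mean() - true_mean) / true_mean
    print(f"revisit {revisit_h:2d} h: relative sampling error {err:.1%}")
```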

Itkin, M.; Loew, A.

2012-04-01

432

Incorporating uncertainty in predictive species distribution modelling  

PubMed Central

Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates. PMID:22144387

Beale, Colin M.; Lennon, Jack J.

2012-01-01

433

Estimating uncertainty of inference for validation  

SciTech Connect

We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10¹³–10¹⁴ neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the first in a series of inference uncertainty estimations. While the methods demonstrated are primarily statistical, these do not preclude the use of nonprobabilistic methods for uncertainty characterization. The methods presented permit accurate determinations for validation and eventual prediction. It is a goal that these methods establish a standard against which best practice may evolve for determining degree of validation.

Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

2010-09-30

434

Integrating uncertainties for climate change mitigation  

NASA Astrophysics Data System (ADS)

The target of keeping global average temperature increase to below 2 °C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical, future energy demand, and mitigation technology uncertainties. This provides central information for policy making, since it helps to understand the relationship between mitigation costs and their potential to reduce the risk of exceeding 2 °C, or other temperature limits such as 3 °C or 1.5 °C, under a wide range of scenarios.

Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

2013-04-01

435

Principles for system level electrochemistry  

NASA Technical Reports Server (NTRS)

The higher power and higher voltage levels anticipated for future space missions have required a careful review of the techniques currently in use to preclude battery problems that are related to the dispersion characteristics of the individual cells. Not only are the out-of-balance problems accentuated in these larger systems, but the thermal management considerations also require a greater degree of accurate design. Newer concepts which employ active cooling techniques are being developed which permit higher rates of discharge and tighter packing densities for the electrochemical components. This paper will put forward six semi-independent principles relating to battery systems. These principles will progressively address cell, battery and finally system related aspects of large electrochemical storage systems.

Thaller, L. H.

1986-01-01

436

[Genetics and the precautionary principle].  

PubMed

It is very important to follow the precautionary principle with regard to genetics because of its fast development and its impact on the public imagination. In that regard, gene grafts, GMOs and recombinant drugs are particularly suspected of transmitting human or animal viruses and/or inducing severe allergies. The pharmaceutical industry has reflected extensively on such problems and applies strict rules to prevent them. By contrast, academic research laboratories are handicapped by the problem of fund seeking. At the genetic-disease level, and mainly because of the fast development of predictive medicine, patient and family counselling requires a great deal of reflection and carefulness. Recording the presence of abnormal genes in family histories also constitutes an important problem. We believe that, to be in accordance with the precautionary principle, an important effort of training and information is required. PMID:11077717

Rosa, J

2000-01-01

437

Mechanics, cosmology and Mach's principle  

NASA Astrophysics Data System (ADS)

It is pointed out that recent cosmological findings seem to support the view that the mass/energy distribution of the universe defines the Newtonian inertial frames, as originally suggested by Mach. The background concepts of inertial frame, Newton's second law and fictitious forces are clarified. A precise definition of Mach's principle is suggested. Then, an approximation to general relativity discovered by Einstein, Infeld and Hoffmann is used and it is found that this precise formulation of Mach's principle is realized provided the mass/energy density of the universe has a specific value. This value turns out to be twice the critical density. The implications of this approximate result are put into context.

Essén, Hanno

2013-01-01

438

Equivalence principle for scalar forces.  

PubMed

The equivalence of inertial and gravitational masses is a defining feature of general relativity. Here, we clarify the status of the equivalence principle for interactions mediated by a universally coupled scalar, motivated partly by recent attempts to modify gravity at cosmological distances. Although a universal scalar-matter coupling is not mandatory, once postulated, it is stable against classical and quantum renormalizations in the matter sector. The coupling strength itself is subject to renormalization, of course. The scalar equivalence principle is violated only for objects for which either the graviton self-interaction or the scalar self-interaction is important--the first applies to black holes, while the second type of violation is avoided if the scalar is Galilean symmetric. PMID:21231444

Hui, Lam; Nicolis, Alberto

2010-12-01

439

Artificial intelligence: Principles and applications  

SciTech Connect

The book covers the principles of AI and the main areas of application, and also considers some of the social implications. The applications chapters have a common format structured as follows: definition of the topic; approach with conventional computing techniques; why 'intelligence' would provide a better approach; and how AI techniques would be used, including their limitations. The contents discussed are: Principles of artificial intelligence; AI programming environments; LISP, list processing and pattern matching; AI programming with POP-11; Computer processing of natural language; Speech synthesis and recognition; Computer vision; Artificial intelligence and robotics; The anatomy of expert systems - Forsyth; Machine learning; Memory models of man and machine; Artificial intelligence and cognitive psychology; Breaking out of the Chinese room; Social implications of artificial intelligence; and Index.

Yazdani, M.

1985-01-01

440

Quantification and Propagation of Nuclear Data Uncertainties  

NASA Astrophysics Data System (ADS)

The use of several uncertainty quantification and propagation methodologies is investigated in the context of the prompt fission neutron spectrum (PFNS) uncertainties and their impact on critical reactor assemblies. First, the first-order, linear Kalman filter is used as a nuclear data evaluation and uncertainty quantification tool, combining available PFNS experimental data and a modified version of the Los Alamos (LA) model. The experimental covariance matrices, not generally given in the EXFOR database, are computed using the GMA methodology used by the IAEA to establish more appropriate correlations within each experiment. Then, using systematics relating the LA model parameters across a suite of isotopes, the PFNS for both the uranium and plutonium actinides are evaluated, leading to a new evaluation including cross-isotope correlations. Next, an alternative evaluation approach, the unified Monte Carlo (UMC) method, is studied for the evaluation of the PFNS for the n(0.5 MeV)+Pu-239 fission reaction and compared to the Kalman filter. The UMC approach to nuclear data evaluation is implemented in a variety of ways to test convergence toward the Kalman filter results and to determine the nonlinearities present in the LA model. Ultimately, the UMC approach is shown to be comparable to the Kalman filter for a realistic data evaluation of the PFNS and is capable of capturing the nonlinearities present in the LA model. Next, the impact that the PFNS uncertainties have on important critical assemblies is investigated. Using the PFNS covariance matrices in the ENDF/B-VII.1 nuclear data library, the uncertainties of the effective multiplication factor, leakage, and spectral indices of the Lady Godiva and Jezebel critical assemblies are quantified. Principal component analysis of the PFNS covariance matrices shows that only 2-3 principal components are needed to retain the PFNS uncertainties. Then, using the polynomial chaos expansion (PCE) on the uncertain output quantities, the stochastic collocation method (SCM) is used to compute the PCE coefficients. Compared to the "brute force" Monte Carlo forward propagation method, the PCE-SCM approach is shown to be capable of obtaining the same amount of output-quantity uncertainty information with orders-of-magnitude computational savings. Finally, the uncertainties quantified in the correlated model parameters for the suite of uranium and plutonium actinides are propagated through the Big Ten and Flattop assemblies. In the case of the k-effective uncertainties in the Big Ten assembly, the uncorrelated PFNS uncertainties lead to 17.5% smaller predicted uncertainties compared with the correlated PFNS uncertainties, suggesting that these cross-isotope correlations are important for this application. Last, the unified Monte Carlo + total Monte Carlo (UMC+TMC) method is implemented to propagate uncertainties from the prior LA model parameters through the Flattop critical assemblies. Because cross-experiment correlations are neglected in all of the present evaluation work, UMC+TMC predicts uncertainties in the integral quantities that are smaller by an order of magnitude or more compared to direct sampling from the posterior LA model parameters.
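
The first-order, linear Kalman filter used as an evaluation tool is, in essence, a generalized-least-squares update of prior model parameters by data through a sensitivity matrix. The sketch below uses a generic linear model with made-up numbers, not the Los Alamos PFNS model:

```python
import numpy as np

# Minimal sketch of a first-order, linear Kalman (GLS) data-evaluation step:
# prior parameters with covariance P are updated by observations y_obs with
# covariance R through the sensitivity matrix G. All values are illustrative.

prior = np.array([1.0, 0.5])                        # prior parameter values
P = np.diag([0.2**2, 0.1**2])                       # prior covariance
G = np.array([[1.0, 2.0], [0.5, 1.0], [2.0, 0.0]])  # sensitivities dy/dp
y_obs = np.array([2.1, 1.05, 1.9])                  # experimental data
R = np.diag([0.05**2, 0.05**2, 0.1**2])             # experimental covariance

# Kalman/GLS update:
S = G @ P @ G.T + R
K = P @ G.T @ np.linalg.inv(S)
post = prior + K @ (y_obs - G @ prior)
P_post = P - K @ G @ P

print("posterior parameters:", post)
print("posterior std devs:  ", np.sqrt(np.diag(P_post)))
```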

Rising, Michael E.

441

The Principles of Phototransferred Thermoluminescence  

SciTech Connect

The principles of phototransferred thermoluminescence (PTTL) are described, and some of the basic theoretical ideas underlying this technique are presented. It is demonstrated that the PTTL efficiency depends on the photon energy as well as on the activation energies of the various traps involved in the process. A simple two-traps-one-recombination-center model is capable of predicting a variety of different PTTL behaviors, some of which have already been observed experimentally.

Moscovitch, Marko [Georgetown University Medical Center, Washington, DC (United States)

2011-05-05

442

Discrepancy principle for DSM II  

NASA Astrophysics Data System (ADS)

Let $Ay = f$, where $A$ is a linear operator in a Hilbert space $H$, $y \perp N(A) := \{u : Au = 0\}$, $R(A) := \{h : h = Au,\ u \in D(A)\}$ is not closed, and $\|f_\delta - f\| \le \delta$. Given $f_\delta$, one wants to construct $u_\delta$ such that $\lim_{\delta \to 0} \|u_\delta - y\| = 0$. Two versions of discrepancy principles for the DSM (dynamical systems method) for finding the stopping time and calculating the stable solution $u_\delta$ to the original equation $Ay = f$ are formulated and mathematically justified.
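
The role of the discrepancy principle as a stopping rule can be illustrated on a discretized ill-posed problem. The sketch below substitutes a plain Landweber iteration for the dynamical systems method (a deliberate simplification): the iteration stops the first time the residual $\|Au - f_\delta\|$ falls to $\tau\delta$ with $\tau > 1$:

```python
import numpy as np

# Sketch of discrepancy-principle stopping on an ill-posed problem. A plain
# Landweber iteration stands in for the DSM; the stopping index plays the
# role of the stopping time. All problem data are synthetic.

rng = np.random.default_rng(4)
n = 50
A = np.array([[np.exp(-0.1 * (i - j) ** 2) for j in range(n)]
              for i in range(n)])            # smoothing (ill-posed) operator
y = np.sin(np.linspace(0, 3, n))             # true solution
f = A @ y
delta = 1e-2
e = rng.standard_normal(n)
f_delta = f + delta * e / np.linalg.norm(e)  # noisy data, ||f_delta - f|| = delta

u = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2
tau = 1.5
for k in range(100_000):
    r = A @ u - f_delta
    if np.linalg.norm(r) <= tau * delta:     # discrepancy principle
        break
    u -= step * (A.T @ r)

print(f"stopped at iteration {k}, error ||u - y|| = {np.linalg.norm(u - y):.3f}")
```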

Ramm, A. G.

2008-09-01

443

Principles of Charged Particle Acceleration  

NSDL National Science Digital Library

This learning resources comprise a healthy introduction to charged particle acceleration. The site, by Stanley Humphries, a professor of electrical and computer engineering at University of New Mexico, amounts to an online textbook (.pdf) introducing the theory of charged particle acceleration. The book's fifteen chapters (with bibliography) summarize "the principles underlying all particle accelerators" and provide "a reference collection of equations and material essential to accelerator development and beam applications."

444

Archimedes' principle for Brownian liquid  

E-print Network

We consider a family of hard core objects moving as independent Brownian motions confined to a vessel by reflection. These are subject to gravitational forces modeled by drifts. The stationary distribution for the process has many interesting implications, including an illustration of the Archimedes' principle. The analysis rests on constructing reflecting Brownian motion with drift in a general open connected domain and studying its stationary distribution. In dimension two we utilize known results about sphere packing.
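
One ingredient of the construction, reflecting Brownian motion with a constant downward drift, is easy to simulate; its stationary density on an interval is the exponential profile exp(-2gx/σ²), which is what makes heavier particles settle lower. The single-particle sketch below is only an illustration; the paper's hard-core interacting system is substantially richer:

```python
import numpy as np

# Euler scheme for reflecting Brownian motion with downward drift -g on
# [0, L]. The empirical mean height is compared with the mean of the
# stationary density, proportional to exp(-2 g x / sigma^2), truncated to
# the interval. Single particle only; no hard-core interaction.

rng = np.random.default_rng(8)
g, sigma, dt, L = 1.0, 1.0, 1e-3, 3.0
steps = 1_000_000
dW = sigma * np.sqrt(dt) * rng.standard_normal(steps)

xs = np.empty(steps)
x = 1.0
for i in range(steps):
    x += -g * dt + dW[i]
    if x < 0.0:
        x = -x                    # reflect at the bottom
    elif x > L:
        x = 2.0 * L - x           # reflect at the top
    xs[i] = x

lam = 2 * g / sigma**2            # rate of the stationary exponential profile
theory = 1 / lam - L / (np.exp(lam * L) - 1)   # mean of truncated exponential
print("empirical mean height:", xs[steps // 10:].mean())   # discard burn-in
print("theoretical mean height:", theory)
```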

Krzysztof Burdzy; Zhen-Qing Chen; Soumik Pal

2009-12-30

445

Archimedes' principle for Brownian liquid  

E-print Network

We consider a family of hard core objects moving as independent Brownian motions confined to a vessel by reflection. These are subject to gravitational forces modeled by drifts. The stationary distribution for the process has many interesting implications, including an illustration of the Archimedes' principle. The analysis rests on constructing reflecting Brownian motion with drift in a general open connected domain and studying its stationary distribution. In dimension two we utilize known results about sphere packing.

Burdzy, Krzysztof; Pal, Soumik

2009-01-01

446

Uncertainty relations for multiple measurements with applications  

E-print Network

Uncertainty relations express the fundamental incompatibility of certain observables in quantum mechanics. Far from just being puzzling constraints on our ability to know the state of a quantum system, uncertainty relations are at the heart of why some classically impossible cryptographic primitives become possible when quantum communication is allowed. This thesis is concerned with strong notions of uncertainty relations and their applications in quantum information theory. One operational manifestation of such uncertainty relations is a purely quantum effect referred to as information locking. A locking scheme can be viewed as a cryptographic protocol in which a uniformly random n-bit message is encoded in a quantum system using a classical key of size much smaller than n. Without the key, no measurement of this quantum state can extract more than a negligible amount of information about the message, in which case the message is said to be "locked". Furthermore, knowing the key, it is possible to recover, that is "unlock", the message. We give new efficient constructions of bases satisfying strong uncertainty relations leading to the first explicit construction of an information locking scheme. We also give several other applications of our uncertainty relations both to cryptographic and communication tasks. In addition, we define objects called QC-extractors, that can be seen as strong uncertainty relations that hold against quantum adversaries. We provide several constructions of QC-extractors, and use them to prove the security of cryptographic protocols for two-party computations based on the sole assumption that the parties' storage device is limited in transmitting quantum information. In doing so, we resolve a central question in the so-called noisy-storage model by relating security to the quantum capacity of storage devices.

Omar Fawzi

2012-08-29

447

Variational principles for circle patterns  

NASA Astrophysics Data System (ADS)

A Delaunay cell decomposition of a surface with constant curvature gives rise to a circle pattern, consisting of the circles which are circumscribed to the facets. We treat the problem of whether there exists a Delaunay cell decomposition for a given (topological) cell decomposition and given intersection angles of the circles, whether it is unique, and how it may be constructed. Somewhat more generally, we allow cone-like singularities in the centers and intersection points of the circles. We prove existence and uniqueness theorems for the solution of the circle pattern problem using a variational principle. The functionals (one for the euclidean, one for the hyperbolic case) are convex functions of the radii of the circles. The analogous functional for the spherical case is not convex, hence this case is treated by stereographic projection to the plane. From the existence and uniqueness of circle patterns in the sphere, we derive a strengthened version of Steinitz' theorem on the geometric realizability of abstract polyhedra. We derive the variational principles of Colin de Verdière, Brägger, and Rivin for circle packings and circle patterns from our variational principles; in the case of Brägger's and Rivin's functionals the derivation is direct. Leibon's functional for hyperbolic circle patterns cannot be derived directly from our functionals, but we construct yet another functional from which both Leibon's and our functionals can be derived. We present Java software to compute and visualize circle patterns.

Springborn, Boris A.

2003-12-01

448

Between Same-Sex Marriages and the Large Hadron Collider : Making Sense of the Precautionary Principle  

Microsoft Academic Search

The Precautionary Principle is a guide to coping with scientific uncertainties in the assessment and management of risks. In recent years, it has moved to the forefront of debates in policy and applied ethics, becoming a key normative tool in policy discussions in such diverse areas as medical and scientific research, health and safety regulation, environmental regulation, product development, international

Anton Petrenko; Dan McArthur

2010-01-01

449

Uncertainty visualisation in the Model Web  

NASA Astrophysics Data System (ADS)

Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation, separating value and uncertainty maps, suited to non-experts and to a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, with sliders to interactively move through time, space and uncertainty (thresholds).
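
The "adjacent maps" mode can be sketched outside the browser with two synthetic panels, one for the predicted value and one for its uncertainty. The fields and the use of matplotlib are stand-ins for the OpenLayers-based client:

```python
import numpy as np
import matplotlib.pyplot as plt

# Sketch of the "adjacent maps" mode: one panel for the predicted value and
# one for its uncertainty, drawn from synthetic fields. The UncertWeb client
# does this in the browser with OpenLayers; matplotlib stands in here.

rng = np.random.default_rng(7)
x, y = np.meshgrid(np.linspace(0, 1, 80), np.linspace(0, 1, 80))
value = np.sin(3 * x) * np.cos(3 * y)              # predicted field
uncert = 0.1 + 0.2 * np.hypot(x - 0.5, y - 0.5)    # grows away from centre

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
for ax, data, title in ((ax1, value, "predicted value"),
                        (ax2, uncert, "standard uncertainty")):
    im = ax.imshow(data, origin="lower", extent=(0, 1, 0, 1))
    ax.set_title(title)
    fig.colorbar(im, ax=ax, shrink=0.8)
plt.tight_layout()
plt.show()
```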

Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

2012-04-01

450

DPRESS: Localizing estimates of predictive uncertainty  

PubMed Central

Background The need for a quantitative estimate of the uncertainty of prediction for QSAR models is steadily increasing, in part because such predictions are being widely distributed as tabulated values disconnected from the models used to generate them. Classical statistical theory assumes that the error in the population being modeled is independent and identically distributed (IID), but this is often not actually the case. Such inhomogeneous error (heteroskedasticity) can be addressed by providing an individualized estimate of predictive uncertainty for each particular new object u: the standard error of prediction s_u can be estimated as the non-cross-validated error s_t* for the closest object t* in the training set, adjusted for its separation d from u in the descriptor space relative to the size of the training set. The predictive uncertainty factor for t* is obtained by distributing the internal predictive error sum of squares across objects in the training set based on the distances between them, hence the acronym: Distributed PRedictive Error Sum of Squares (DPRESS). Note that s_t* and the associated uncertainty factor are characteristic of each training set compound contributing to the model of interest. Results The method was applied to partial least-squares models built using 2D (molecular hologram) or 3D (molecular field) descriptors applied to mid-sized training sets (N = 75) drawn from a large (N = 304), well-characterized pool of cyclooxygenase inhibitors. The observed variation in predictive error for the external 229-compound test sets was compared with the uncertainty estimates from DPRESS. Good qualitative and quantitative agreement was seen between the distributions of predictive error observed and those predicted using DPRESS. Inclusion of the distance-dependent term was essential to obtaining good agreement between the estimated uncertainties and the observed distributions of predictive error. The uncertainty estimates derived by DPRESS were conservative even when the training set was biased, but not excessively so. Conclusion DPRESS is a straightforward and powerful way to reliably estimate individual predictive uncertainties for compounds outside the training set based on their distance to the training set and the internal predictive uncertainty associated with the nearest neighbor in that set. It represents a sample-based, a posteriori approach to defining applicability domains in terms of localized uncertainty. PMID:20298517
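
A minimal sketch of the nearest-neighbor logic: take the training error of the closest training compound and inflate it with the distance to that compound. The quadrature inflation rule and the scale constant below are illustrative assumptions, not the published DPRESS weighting:

```python
import numpy as np

# Sketch of the nearest-neighbor idea behind DPRESS: the uncertainty for a
# new compound u is taken from the training error of its closest neighbor
# t*, inflated with the distance d(u, t*). The quadrature rule and 'scale'
# are illustrative assumptions, not the published weighting.

rng = np.random.default_rng(5)
X_train = rng.normal(0, 1, (75, 10))          # synthetic training descriptors
resid = rng.normal(0, 0.4, 75)                # per-compound training errors

def predictive_uncertainty(x_new, scale=0.3):
    d = np.linalg.norm(X_train - x_new, axis=1)
    t_star = d.argmin()
    s_t = abs(resid[t_star])
    return np.sqrt(s_t**2 + (scale * d[t_star])**2)

print(predictive_uncertainty(rng.normal(0, 1, 10)))   # in-domain query
print(predictive_uncertainty(np.full(10, 4.0)))       # far outside the domain
```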

2009-01-01

451

The precautionary principle and emerging biologic risks: lessons from human immunodeficiency virus in blood products.  

PubMed

In times of crisis, such as during the early 1980s when acquired immune deficiency syndrome (AIDS) was first recognized as a threat to the blood supply, it can be difficult to find reliable evidence upon which to base appropriate public health policies. Unreliable evidence produces substantial scientific uncertainty. Yet despite ambiguity and unanswered questions, decisions must be made and policy established to protect people's health. The precautionary principle provides important guidelines for public health policy decision making that are of particular value in times of crisis, such as the emergence of a new pathogen: be open and honest about scientific uncertainty; communicate with the public; and consider immediate, adaptable policy decisions. Ongoing research into the important uncertainties and review of policies in light of the data that emerge are crucial to the development of good public policy. PMID:16631821

Stoto, Michael A

2006-04-01

452

Addressing uncertainty in adaptation planning for agriculture  

PubMed Central

We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Laderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

2013-01-01

453

Spatial Uncertainty Analysis of Ecological Models  

SciTech Connect

The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient, in some situations, to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).

Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.

2000-09-02

454

Minimizing Uncertainty in Cryogenic Surface Figure Measurement  

NASA Technical Reports Server (NTRS)

A new facility at the Goddard Space Flight Center is designed to measure with unusual accuracy the surface figure of mirrors at cryogenic temperatures down to 12 K. The facility is currently configured for spherical mirrors with a radius of curvature (ROC) of 600 mm and apertures of about 150 mm or less. The goals of the current experiment were to 1) obtain the best possible estimate of the test mirror surface figure, S(x,y), at 87 K and 20 K; 2) obtain the best estimate of the cryo-change, Δ(x,y): the change in surface figure between room temperature and the two cryo-temperatures; and 3) determine the uncertainty of these measurements, using the definitions and guidelines of the ISO Guide to the Expression of Uncertainty in Measurement. A silicon mirror was tested, and the cryo-change from room temperature to 20 K was found to be 3.7 nm rms, with a standard uncertainty of 0.23 nm in the rms statistic. Both the cryo-change figure and the uncertainty are among the lowest such figures yet published. This report describes the facilities, experimental methods, and uncertainty analysis of the measurements.
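
The two headline numbers, an rms statistic and its standard uncertainty, can be reproduced in miniature from repeated measurements of a (here synthetic) cryo-change map:

```python
import numpy as np

# Sketch of the two headline statistics: the rms of a cryo-change map and
# the standard uncertainty of that rms, estimated from repeat measurements.
# The maps below are synthetic stand-ins for interferometer data.

rng = np.random.default_rng(6)
true_change = 3.7e-9 * rng.standard_normal((64, 64))    # metres, synthetic
repeats = [true_change + 0.2e-9 * rng.standard_normal((64, 64))
           for _ in range(10)]                          # 10 repeat maps

rms_vals = [np.sqrt(np.mean(m**2)) for m in repeats]
mean_rms = np.mean(rms_vals)
u_rms = np.std(rms_vals, ddof=1) / np.sqrt(len(rms_vals))  # std. uncertainty
print(f"cryo-change rms = {mean_rms*1e9:.2f} nm +/- {u_rms*1e9:.2f} nm")
```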

Blake, Peter; Mink, Ronald G.; Chambers, John; Robinson, F. David; Content, David; Davila, Pamela

2005-01-01

455

Addressing uncertainty in adaptation planning for agriculture.  

PubMed

We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

2013-05-21

456

Uncertainty Assessment of Hypersonic Aerothermodynamics Prediction Capability  

NASA Technical Reports Server (NTRS)

The present paper provides the background of a focused effort to assess uncertainties in predictions of heat flux and pressure in hypersonic flight (airbreathing or atmospheric entry) using state-of-the-art aerothermodynamics codes. The assessment is performed for four mission-relevant problems: (1) shock/turbulent boundary layer interaction on a compression corner, (2) shock/turbulent boundary layer interaction due to an impinging shock, (3) high-mass Mars entry and aerocapture, and (4) high-speed return to Earth. A validation-based uncertainty assessment approach with reliance on subject matter expertise is used. A code verification exercise with code-to-code comparisons and comparisons against well-established correlations is also included in this effort. A thorough review of the literature in search of validation experiments was performed, which identified a scarcity of ground-based validation experiments at hypersonic conditions. In particular, a shortage of usable experimental data at flight-like enthalpies and Reynolds numbers was found. The uncertainty was quantified using metrics that measure the discrepancy between model predictions and experimental data. The discrepancy data are statistically analyzed and investigated for physics-based trends in order to define a meaningful quantified uncertainty. The detailed uncertainty assessment of each mission-relevant problem is found in the four companion papers.

Bose, Deepak; Brown, James L.; Prabhu, Dinesh K.; Gnoffo, Peter; Johnston, Christopher O.; Hollis, Brian

2011-01-01

457

Reactivity worth determination for rod position uncertainty  

SciTech Connect

This document provides the technical basis for determining the reactivity uncertainty associated with control rod position uncertainty. This report supports resolution of Issue B-235, Technical Specifications Submittal No. 73. The N Reactor Technical Specifications B 3/4 2.1, Bases for Horizontal Control Rod Systems, allows an uncertainty of 14 inches in rod tip location. Specifically, this reference states: "An allowance of ±14 inches ± (4.0 percent indicated position) in the indicated rod position is permitted." Calculations of reactivity uncertainties have been performed with two independent reactor physics computer codes, DELPHI and 3DN, to provide estimates of the reactivity change associated with this uncertainty. DELPHI and 3DN have been used extensively for N Reactor neutronics calculations, especially in support of the Flux Flattening Program and the Accelerated Safety Enhancement Program. For this study, the appropriate standard N Reactor model was used in each code with the full Central Zone Flattening (CZF) fuel loading pattern using the standard flux flattening fuel charge loadings. 11 refs., 4 figs., 2 tabs.

Schwinkendorf, K.N.; Tollefson, D.A.; Finfrock, D.H.

1988-03-01

458

Toward a Principle-Based Translator  

E-print Network

A principle-based computational model of natural language translation consists of two components: (1) a module which makes use of a set of principles and parameters to transform the source language into an annotated surface ...

Dorr, Bonnie J.

459

First principles: Systems and their analysis.  

National Technical Information Service (NTIS)

This paper is intended to challenge systems professionals to think about systems -- not at the process level but at the foundational level: first principles. System principles at the concept level, and what one understands about them, determine what one p...

T. W. Woods

1993-01-01

460

Policy Name: Coordinated Timetabling: Principles, Rules & Responsibilities  

E-print Network

Policy Name: Coordinated Timetabling: Principles, Rules & Responsibilities. Originating Office: Vice-President (Students and Enrolment) and University Registrar. Policy: Coordinated Timetabling Policies: i) Principles of Timetabling, ii) Basic Rules of Timetabling and iii) Academic Timetable Responsibilities – Chairs

Carleton University

461

Principles of Procedure in Curriculum Evaluation.  

ERIC Educational Resources Information Center

Suggests evaluation strategies (principles of procedure) for use by curriculum evaluators who work in the "illuminative," "responsive," and "democratic" modes. Principles include independence, disinterest, negotiated access, negotiation of boundaries, negotiation of accounts, publication, confidentiality, and accountability. (DB)

Kemmis, S.; Robottom, I.