While these samples are representative of the content of Science.gov,

they are not comprehensive nor are they the most current set.

We encourage you to perform a real-time search of Science.gov

to obtain the most current and comprehensive results.

Last update: August 15, 2014.

1

Extrapolation, uncertainty factors, and the precautionary principle.

This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards. PMID:21802639

Steel, Daniel

2011-09-01

2

Dilaton cosmology and the modified uncertainty principle

NASA Astrophysics Data System (ADS)

Very recently Ali et al. (2009) proposed a new generalized uncertainty principle (with a linear term in the Planck length) which is consistent with doubly special relativity and string theory. The classical and quantum effects of this generalized uncertainty principle (termed the modified uncertainty principle, or MUP) are investigated on the phase space of a dilatonic cosmological model with an exponential dilaton potential in a flat Friedmann-Robertson-Walker background. Interestingly, as a consequence of MUP, we find that it is possible to obtain late-time acceleration for this model. For the quantum mechanical description in both the commutative and MUP frameworks, we find analytical solutions of the Wheeler-DeWitt equation for the early universe and compare our results. We have used an approximation method in the case of MUP.
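For reference, the linear-in-momentum GUP of Ali et al. is usually quoted (in one dimension; this is a sketch of the standard form, not necessarily the paper's exact conventions) as:

```latex
[x, p] = i\hbar\left(1 - \alpha p + 2\alpha^{2} p^{2}\right),
\qquad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 - 2\alpha\langle p\rangle + 4\alpha^{2}\langle p^{2}\rangle\right),
\qquad
\alpha = \frac{\alpha_{0}\,\ell_{\mathrm{Pl}}}{\hbar},
```

where ℓ_Pl is the Planck length and α₀ a dimensionless constant; the term linear in p is what distinguishes this MUP from the purely quadratic GUP.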

Majumder, Barun

2011-09-01

3

A generalization of the uncertainty threshold principle

A generalized version of the uncertainty threshold principle is presented for discrete-time systems with both jump Markov parameters and independent (white) multiplicative noises. It is shown that this threshold must not be exceeded for the quadratically optimal steady-state control to exist for this class of systems.

E. Yaz

1990-01-01

4

Dilaton cosmology, noncommutativity, and generalized uncertainty principle

NASA Astrophysics Data System (ADS)

The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. I extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.

Vakili, Babak

2008-02-01

5

Black Holes and the Generalized Uncertainty Principle

NASA Astrophysics Data System (ADS)

We propose a new way in which black holes connect macrophysics and microphysics. The Generalized Uncertainty Principle suggests corrections to the Uncertainty Principle as the energy increases towards the Planck value. It also provides a natural transition between the expressions for the Compton wavelength below the Planck mass and the black hole event horizon size above it. This suggests corrections to the event horizon size as the black hole mass falls towards the Planck value, leading to the concept of a Generalized Event Horizon. Extrapolating this expression below the Planck mass suggests the existence of a new kind of black hole, whose size is of order its Compton wavelength. Recently it has been found that such a black hole solution is permitted by loop quantum gravity, its unusual properties deriving from the fact that it is hidden behind the throat of a wormhole. This has important implications for the formation and evaporation of black holes in the early Universe, especially if there are extra spatial dimensions.
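The "natural transition" mentioned above can be sketched dimensionally (a textbook heuristic, not the paper's derivation): the Compton wavelength decreases with mass while the Schwarzschild radius grows with it, and the two scales meet at the Planck mass.

```latex
\lambda_{C} \sim \frac{\hbar}{Mc},
\qquad
r_{S} = \frac{2GM}{c^{2}},
\qquad
\lambda_{C} \sim r_{S}
\;\Longrightarrow\;
M \sim M_{\mathrm{Pl}} = \sqrt{\frac{\hbar c}{G}},
\quad
r \sim \ell_{\mathrm{Pl}} = \sqrt{\frac{\hbar G}{c^{3}}}.
```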

Carr, B. J.

2013-12-01

6

Open timelike curves violate Heisenberg's uncertainty principle.

Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg's uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity. PMID:23432226

Pienaar, J L; Ralph, T C; Myers, C R

2013-02-01

7

Risks, scientific uncertainty and the approach of applying precautionary principle.

The paper intends to clarify the nature and aspects of risk and scientific uncertainty, and to elaborate an approach to applying the precautionary principle for the purpose of handling risk arising from scientific uncertainty. It explains the relations between risks and the application of the precautionary principle at the international and domestic levels. Both in situations where an international treaty has adopted the precautionary principle and in situations where no international treaty adopts the principle or enumerates the conditions for taking measures, the precautionary principle has a role to play. The paper proposes a decision-making tool, containing questions to be asked, to help policymakers apply the principle. It also proposes a "weighing and balancing" procedure to help them decide the contents of a measure to cope with a potential risk and to avoid excessive measures. PMID:19705643

Lo, Chang-fa

2009-03-01

8

Self-completeness and the generalized uncertainty principle

NASA Astrophysics Data System (ADS)

The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.

Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero

2013-11-01

9

The Generalized Uncertainty Principle and the Friedmann equations

NASA Astrophysics Data System (ADS)

The Generalized Uncertainty Principle (or GUP) affects the dynamics at the Planck scale, so the known equations of physics are expected to be modified in that very high energy regime. Very recently, the authors of Ali et al. (Phys. Lett. B 678:497, 2009) proposed a new Generalized Uncertainty Principle with a linear term in the Planck length. In this article, the proposed GUP is expressed in a more general form, and its effect on the Friedmann equations of the FRW universe is studied. Along the way, the known entropy-area relation acquires some new correction terms, the leading-order term being proportional to sqrt{Area}.

Majumder, Barun

2011-12-01

10

Single-Slit Diffraction and the Uncertainty Principle

ERIC Educational Resources Information Center

A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
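The position-momentum Fourier relationship in this abstract can be illustrated numerically. The sketch below uses a Gaussian aperture rather than a hard slit (the second moment of a hard slit's momentum-space intensity diverges), since a Gaussian saturates the bound Δx·Δk ≥ 1/2; all variable names are illustrative, not from the paper.

```python
import numpy as np

# Gaussian "aperture": |psi|^2 has standard deviation sigma in position.
sigma = 1.0
N = 4096
L = 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
psi = np.exp(-x**2 / (4 * sigma**2))

def spread(coord, weight):
    """Weighted standard deviation of coord under the distribution weight."""
    w = weight / weight.sum()
    mean = (w * coord).sum()
    return np.sqrt((w * (coord - mean) ** 2).sum())

dx = spread(x, psi**2)

# Momentum-space amplitude via FFT; k = 2*pi*f are angular wavenumbers.
psi_k = np.fft.fft(psi)
k = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])
dk = spread(k, np.abs(psi_k) ** 2)

print(dx, dk, dx * dk)  # the product sits at the minimum value 1/2
```

Narrowing the aperture (smaller `sigma`) shrinks `dx` and widens `dk` in exact compensation, which is the single-slit/uncertainty connection in miniature.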

Rioux, Frank

2005-01-01

11

The Uncertainty Principle, Virtual Particles and Real Forces

ERIC Educational Resources Information Center

This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…

Jones, Goronwy Tudor

2002-01-01

12

Gauge theories under incorporation of a generalized uncertainty principle

An extension of gauge theories is considered under the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field, as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

Kober, Martin [Frankfurt Institute for Advanced Studies (FIAS), Institut fuer Theoretische Physik, Johann Wolfgang Goethe-Universitaet, Ruth-Moufang-Strasse 1, 60438 Frankfurt am Main (Germany)

2010-10-15

13

Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

NASA Astrophysics Data System (ADS)

The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
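The bound quoted here is the Gabor limit for time and ordinary frequency; written with angular frequency it takes the more familiar form:

```latex
\Delta t\,\Delta f \;\ge\; \frac{1}{4\pi}
\quad\Longleftrightarrow\quad
\Delta t\,\Delta \omega \;\ge\; \frac{1}{2},
\qquad \omega = 2\pi f,
```

with equality only for Gaussian wave packets. The limit constrains any linear time-frequency analysis of the stimulus, which is why subjects exceeding it rules out simple linear-filter models.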

Oppenheim, Jacob N.; Magnasco, Marcelo O.

2013-01-01

14

Born-Jordan quantization and the uncertainty principle

NASA Astrophysics Data System (ADS)

The Weyl correspondence and the related Wigner formalism lie at the core of traditional quantum mechanics. We discuss here an alternative quantization scheme, the idea of which goes back to Born and Jordan, and which has recently been revived in another context, namely time-frequency analysis. We show in particular that the uncertainty principle does not enjoy full symplectic covariance properties in the Born and Jordan scheme, as opposed to what happens in the Weyl quantization.

de Gosson, Maurice A.

2013-11-01

15

On uncertainty principle of the local polynomial Fourier transform

NASA Astrophysics Data System (ADS)

In this article, a comprehensive study on uncertainty principle of the local polynomial Fourier transform (LPFT) is presented. It shows that the uncertainty product of the LPFT of an arbitrary order is related to the parameters of the signal and the window function, in addition to the errors of estimating the polynomial coefficients. Important factors that affect resolutions of signal representation, such as the window width, the length of overlap between signal segments, order mismatch and estimation errors of polynomial coefficients, are discussed. The effects of minimizing computational complexities on signal representation by reducing the order of the transform and the overlap length between signal segments are also examined. In terms of the signal concentration, comparisons among the short-time Fourier transform, the Wigner-Ville distribution and the second order LPFT are presented. The LPFT is shown to be an excellent candidate providing better representations for time-varying signals.

Li, Xiumei; Bi, Guoan; Li, Shenghong

2012-12-01

16

Weak values, ``negative probability,'' and the uncertainty principle

NASA Astrophysics Data System (ADS)

A quantum transition can be seen as a result of interference between various pathways (e.g., Feynman paths), which can be labeled by a variable f. An attempt to determine the value of f without destroying the coherence between the pathways produces a weak value f¯. We show f¯ to be an average obtained with an amplitude distribution which can, in general, take negative values and which, in accordance with the uncertainty principle, need not contain information about the actual range of f contributing to the transition. It is also demonstrated that the moments of such alternating distributions have a number of unusual properties which may lead to misinterpretation of weak-measurement results. We provide a detailed analysis of weak measurements with and without post-selection. Examples include the double-slit diffraction experiment, weak von Neumann and von Neumann-like measurements, the traversal time for an elastic collision, the phase time, and the local angular momentum.

Sokolovski, D.

2007-10-01

17

Robertson-Schrödinger-type formulation of Ozawa's noise-disturbance uncertainty principle

NASA Astrophysics Data System (ADS)

In this work we derive a matrix formulation of a noise-disturbance uncertainty relation, which is akin to the Robertson-Schrödinger uncertainty principle. Our inequality is stronger than Ozawa's uncertainty principle and takes noise-disturbance correlations into account. Moreover, we show that for certain types of measurement interactions it is covariant with respect to linear symplectic transformations of the noise and disturbance operators. Finally, we also study the tightness of our matrix inequality.
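The Robertson-Schrödinger inequality that this matrix formulation is modeled on reads, for observables A and B:

```latex
\sigma_A^{2}\,\sigma_B^{2} \;\ge\;
\left(\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\right)^{2}
+ \left(\tfrac{1}{2i}\langle[A,B]\rangle\right)^{2},
```

where the first term on the right captures the correlations that the plain Robertson bound (the second term alone) discards.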

Bastos, Catarina; Bernardini, Alex E.; Bertolami, Orfeu; Costa Dias, Nuno; Prata, João Nuno

2014-04-01

18

This article explores the use of the precautionary principle in situations of intermingled uncertainty and risk. It analyses how the so-called uncertainty paradox works out by examining the Pfizer case. It reveals regulatory complexities that result from contradictions in precautionary thinking. In conclusion, a plea is made for embedment of uncertainty information, while stressing the need to rethink regulatory reform in the broader sense. PMID:16304932

van Asselt, M B A; Vos, E

2005-01-01

19

A minimal time and time-temperature uncertainty principle

We show that introducing torsion in general relativity, that is, physically, considering the effect of the spin and linking the torsion to defects in spacetime topology, we can have a minimal unit of time. Also an uncertainty relation between time and temperature is suggested. The interesting thing is that with this minimal time we can eliminate the divergence of the

Venzo de Sabbata; C. Sivaram

1992-01-01

20

Uncertainty Principle and the Zero-Point Energy of the Harmonic Oscillator

According to quantum mechanics, an oscillator possesses a definite zero-point energy of vibration, and an attempt has been made to express this result directly in terms of some general principle. It has been found that the result may be deduced from the uncertainty principle, in view of the particular relation between position, momentum and energy in a simple harmonic field.
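The deduction the abstract refers to can be reconstructed in a few lines (the standard textbook argument, not necessarily Newing's exact one). For the oscillator energy,

```latex
E = \frac{\langle p^{2}\rangle}{2m} + \frac{1}{2}m\omega^{2}\langle x^{2}\rangle
\;\ge\; \frac{(\Delta p)^{2}}{2m} + \frac{1}{2}m\omega^{2}(\Delta x)^{2}
\;\ge\; \frac{\hbar^{2}}{8m(\Delta x)^{2}} + \frac{1}{2}m\omega^{2}(\Delta x)^{2},
```

using Δp ≥ ħ/(2Δx). Minimizing the last expression over Δx gives (Δx)² = ħ/(2mω), and substituting back yields E_min = ħω/4 + ħω/4 = ħω/2, the zero-point energy.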

R. A. Newing

1935-01-01

21

NASA Astrophysics Data System (ADS)

A macroscopic object equipped with synchronized clocks is examined. General physical relations are derived directly from Lorentz transformations for the case of one-dimensional motion (along the X axis): an uncertainty relation between the object's x coordinate and the projection of its momentum along the X axis, p_x, and an uncertainty relation between the object's observation time, t, and its energy, E. The uncertainty relations take the form dp_x dx > H and dE dt > H. The quantity H in these relations has the dimensions of action and depends upon the precision of the object's clocks and its mass. It is shown that if the macroscopic object in and of itself performs the function of an ideal physical clock, the uncertainty relations derived in the limiting case take the usual form dp_x dx ≥ h and dE dt ≥ h, where h is the Planck constant.

Matvejev, Oleg V.; Matveev, Vadim N.

2013-09-01

22

NASA Astrophysics Data System (ADS)

Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students’ depictions of the uncertainty principle and the wave-particle duality of quantum events, phenomena that could serve as a foundation in building an understanding of quantum mechanics. A phenomenographic study was carried out to categorize students’ descriptions of these key quantum concepts. Data for this study were obtained from semistructured in-depth interviews conducted with undergraduate physics students (N=25) from Bahir Dar, Ethiopia. The phenomenographic data analysis revealed that it is possible to construct three qualitatively different categories to map students’ depictions of the concept of wave-particle duality, namely, (1) classical description, (2) mixed classical-quantum description, and (3) quasiquantum description. Similarly, it is proposed that students’ depictions of the concept of uncertainty can be described with four different categories of description, which are (1) uncertainty as an extrinsic property of measurement, (2) uncertainty principle as measurement error or uncertainty, (3) uncertainty as measurement disturbance, and (4) uncertainty as a quantum mechanical uncertainty principle. Overall, we found students are more likely to prefer a classical picture of interpretations of quantum mechanics. However, a few students in the quasiquantum category applied typical wave phenomena such as interference and diffraction, which cannot be explained within the framework of classical physics, to depict the wavelike properties of quantum entities. Despite the conceptions of the uncertainty principle and the wave- and particlelike properties of quantum entities found in our investigation, the findings presented in this paper are highly consistent with those reported in previous studies. New findings and some implications for instruction and the curricula are discussed.

Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

2011-12-01

23

This paper pursues previous studies concerning the foundations of a possibility/fuzzy expression of measurement uncertainty. Indeed, a possibility distribution can be identified with a family of probability distributions whose dispersion intervals are included in the level cuts of the possibility distribution. The fuzzy inclusion ordering, dubbed the specificity ordering, constitutes the basis of a maximal specificity principle for uncertainty expression. We

Gilles Mauris

2010-01-01

24

Corrected Equipartition of Energy of Black Holes in the Generalized Uncertainty Principle Framework

NASA Astrophysics Data System (ADS)

The corrected equipartition of energy for the thermal radiation of black holes is studied using the generalized uncertainty principle and ensemble theory; to extend the theory of phase space in the canonical ensemble into curved space-time, Painlevé coordinates are introduced. It is found that there are two terms in the corrected expression for the internal energy. The leading term precisely satisfies the equipartition of energy of the system. The correction term is inversely proportional to the temperature; its coefficient is related to the position parameter in the generalized uncertainty principle, which is of the order of the Planck length.

He, Tang-mei; Wu, Feng-jie; Yuan, Yu-hai

2013-08-01

25

Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

NASA Technical Reports Server (NTRS)

The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

Terazawa, Hidezumi

1996-01-01

26

The principle of possibility maximum specificity as a basis for measurement uncertainty expression

This paper deals with the foundations of a possibility/fuzzy expression of measurement uncertainty. Indeed, the notion of a possibility distribution is clearly identified with a family of probability distributions whose coverage intervals are included in the level cuts of the possibility distribution. Thus the fuzzy inclusion ordering, dubbed the specificity ordering, constitutes the basis of a maximal specificity principle. The latter is

Gilles Mauris

2009-01-01

27

Commercialization of genetically modified organisms (GMOs) has sparked profound controversies concerning adequate approaches to risk regulation. Scientific uncertainty and ambiguity, omitted research areas, and a lack of basic knowledge crucial to risk assessment have become apparent. The objective of this article is to discuss the policy and practical implementation of the Precautionary Principle. A major conclusion is that the void in scientific

Anne Ingeborg Myhr; Terje Traavik

2002-01-01

28

We study the dynamical consequences of Maggiore's unique generalised uncertainty principle (GUP). We find that it leads naturally, and generically, to novel consequences. In the high temperature limit, there is a drastic reduction in the degrees of freedom, of the type found, for example, in strings far above the Hagedorn temperature. In view of this, the present GUP may perhaps

S. Kalyana Rama

2001-01-01

29

Black hole entropy and the modified uncertainty principle: A heuristic analysis

NASA Astrophysics Data System (ADS)

Recently Ali et al. (2009) proposed a Generalized Uncertainty Principle (or GUP) with a linear term in momentum (accompanied by the Planck length). Inspired by this idea, here we calculate the quantum-corrected entropy of a Schwarzschild black hole and of a Reissner-Nordström black hole with a double horizon by utilizing the proposed generalized uncertainty principle. We find that the leading-order correction goes with the square root of the horizon area, contributing positively. We also find that the prefactor of the logarithmic contribution is negative, and its value exactly matches some earlier existing calculations. With the Reissner-Nordström black hole, we see that this model-independent procedure is valid not only for single-horizon spacetimes but also for spacetimes with inner and outer horizons.
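Schematically, the corrections described (a positive square-root leading term and a negative logarithmic prefactor) amount to an entropy of the form below; the coefficients c₁, c₂ > 0 are placeholders, not the paper's computed values:

```latex
S \;\simeq\; \frac{A}{4\ell_{\mathrm{Pl}}^{2}}
\;+\; c_{1}\sqrt{\frac{A}{\ell_{\mathrm{Pl}}^{2}}}
\;-\; c_{2}\,\ln\frac{A}{\ell_{\mathrm{Pl}}^{2}}
\;+\; \cdots,
```

where the √A term is the signature of the linear (in momentum) piece of the GUP, and the purely quadratic GUP would produce the logarithm as the leading correction instead.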

Majumder, Barun

2011-09-01

30

NASA Astrophysics Data System (ADS)

We study the dynamical consequences of Maggiore's unique generalised uncertainty principle (GUP). We find that it leads naturally, and generically, to novel consequences. In the high temperature limit, there is a drastic reduction in the degrees of freedom, of the type found, for example, in strings far above the Hagedorn temperature. In view of this, the present GUP may perhaps be taken as the new version of the Heisenberg uncertainty principle, conjectured by Atick and Witten to be responsible for such reduction. Also, the present GUP leads naturally to varying speed of light and modified dispersion relations. They are likely to have novel implications for cosmology and black hole physics, a few of which we discuss qualitatively.

Kalyana Rama, S.

2001-10-01

31

Microscope and spectroscope results are not limited by Heisenberg's Uncertainty Principle!

NASA Astrophysics Data System (ADS)

A review of many published experimental and theoretical papers demonstrates that the resolving powers of microscopes, spectroscopes and telescopes can be enhanced by orders of magnitude beyond the old classical limits by various advanced techniques, including deconvolution of the CW response function of these instruments. Heisenberg's original analogy of the limited resolution of a microscope, used to support his mathematical uncertainty relation, is no longer justifiable today. Modern techniques of detecting single isolated atoms through fluorescence also override this generalized uncertainty principle. Various nanotechnology techniques are also making atoms observable and their locations precisely measurable. Even the traditional time-frequency uncertainty relation, or bandwidth limit, Δν Δt ≥ 1, can be circumvented while doing spectrometry with short pulses by deriving and deconvolving the pulse-response function of the spectrometer, just as we do for CW input.

Prasad, Narasimha S.; Roychoudhuri, Chandrasekhar

2011-09-01

32

Remnant mass and entropy of black holes and modified uncertainty principle

NASA Astrophysics Data System (ADS)

In this paper, we study the thermodynamics of black holes using a generalized uncertainty principle (GUP) with a correction term linear order in the momentum uncertainty. The mass-temperature relation and heat capacity are calculated from which critical and remnant masses are obtained. The results are exact and are found to be identical. The entropy expression gives the famous area theorem upto leading order corrections from GUP. In particular, the linear order term in GUP leads to a correction to the area theorem. Finally, the area theorem can be expressed in terms of a new variable termed as reduced horizon area only when the calculation is done to the next higher order correction from GUP.

Dutta, Abhijit; Gangopadhyay, Sunandan

2014-06-01

33

Key Rate Available from Mismatched Measurements in the BB84 Protocol and the Uncertainty Principle

NASA Astrophysics Data System (ADS)

We consider the mismatched measurements in the BB84 quantum key distribution protocol, in which measuring bases are different from transmitting bases. We give a lower bound on the amount of a secret key that can be extracted from the mismatched measurements. Our lower bound shows that we can extract a secret key from the mismatched measurements with certain quantum channels, such as the channel over which the Hadamard matrix is applied to each qubit with high probability. Moreover, the entropic uncertainty principle implies that one cannot extract the secret key from both matched measurements and mismatched ones simultaneously, when we use the standard information reconciliation and privacy amplification procedure.
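A standard form of the entropic uncertainty principle invoked here is the Maassen-Uffink relation (given as background; the paper may use a conditional variant with side information):

```latex
H(X) + H(Z) \;\ge\; \log_{2}\frac{1}{c},
\qquad
c = \max_{x,z}\,\lvert\langle x\vert z\rangle\rvert^{2}.
```

For the two conjugate BB84 bases, c = 1/2, so the bound is one bit; this trade-off is what forbids extracting full key material from both matched and mismatched measurements simultaneously.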

Matsumoto, Ryutaroh; Watanabe, Shun

34

Before and beyond the precautionary principle: Epistemology of uncertainty in science and law

The precautionary principle has become, in European regulation of science and technology, a general principle for the protection of the health of human beings, animals, plants, and the environment. It requires that '[w]here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation'. By focusing on situations of scientific uncertainty where data are lacking, insufficient, or inconclusive, the principle introduced a shift from a neutral legal attitude towards science to a bias in favor of safety, and a shift from the paradigm of science certain and objective to the awareness that the legal regulation of science involves decisions about values and interests. Implementation of the precautionary principle is highly variable. A crucial question still needs to be answered regarding the assumption that scientific certainty is a 'normal' characteristic of scientific knowledge. The relationship between technoscience and society has moved into a situation where uncertain knowledge is the rule. From this perspective, a more general framework for a democratic governance of science is needed. In democratic society, science may still have a special authoritative voice, but it cannot be the ultimate word on decisions that only the broader society may make. Therefore, the precautionary model of scientific regulation needs to be informed by an 'extended participatory model' of the relationship between science and society.

Tallacchini, Mariachiara [Bioethics, Faculty of Biotechnology, University of Milan, Via Celoria 10, 20100 Milan (Italy) and Science Technology and Law, Law Faculty, University of Piacenza, Via Emilia Parmense 84, 29100 Piacenza (Italy)]. E-mail: mariachiara.tallacchini@unimi.it

2005-09-01

35

Covariant energy–momentum and an uncertainty principle for general relativity

We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum, in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum.
Highlights:
• We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity.
• Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum.
• Localized energy via the Ricci integral is consistent with the energy localization hypothesis.
• The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving.
• We suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong gravity extreme.

Cooperstock, F.I., E-mail: cooperst@uvic.ca [Department of Physics and Astronomy, University of Victoria, P.O. Box 3055, Victoria, B.C. V8W 3P6 (Canada); Dupre, M.J., E-mail: mdupre@tulane.edu [Department of Mathematics, Tulane University, New Orleans, LA 70118 (United States)

2013-12-15

36

NASA Astrophysics Data System (ADS)

The most important drawback of standard fuzzy arithmetic is the unrealistic accumulation of input uncertainties, which results in divergence of fuzzy outputs. Some currently available methods of simulating fuzzy systems provide results that tend toward implausibly large values after several time steps of the system simulation. In this paper, a new fuzzy arithmetic operator based on the fuzzy extension principle is proposed for simulation and assessment of uncertainty in hydrological systems. The proposed approach implements the concept of fuzzy approximate reasoning and shows acceptable behavior in propagating uncertainty from the parameters and structures of the models to the outputs. To show the efficiency of the proposed fuzzy arithmetic operator in the context of hydrologic modeling, two nonlinear monthly water balance models have been examined and their outputs compared with the results obtained by standard fuzzy arithmetic and the Vertex method. A small humid basin in France and a mid-sized basin in a semiarid region of Iran are the case studies of this research. The lower and upper bounds and the most frequent values of the model parameters, inferred from a sampling-simulation procedure, have been used to define triangular fuzzy membership functions. Three statistical indicators have been used to evaluate the efficiency of the methods, based on the bracketing of observations and the coverage of the uncertainty bounds. The estimated values of these indicators show that both the Vertex method and the proposed method outperform standard fuzzy arithmetic. The proposed method also provides better or roughly equal efficiency compared with the Vertex method over both basins.
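The uncertainty-accumulation problem and the triangular membership functions described above can be sketched with alpha-cut interval arithmetic. The parameter values below are hypothetical, and the operator shown is the standard extension-principle addition the paper criticizes, not the proposed operator:

```python
def tri_alpha_cut(a, m, b, alpha):
    """Alpha-cut interval of a triangular fuzzy number with support [a, b]
    and mode m: the interval of values with membership >= alpha."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def add_intervals(x, y):
    # Standard (extension-principle) fuzzy addition, cut by cut.
    return (x[0] + y[0], x[1] + y[1])

# Hypothetical model parameter as a triangular fuzzy number (1, 2, 3).
p = (1.0, 2.0, 3.0)
acc = tri_alpha_cut(*p, 0.0)          # widest cut = the support [1, 3]
for _ in range(9):                    # accumulate over 10 "time steps"
    acc = add_intervals(acc, tri_alpha_cut(*p, 0.0))
print(acc)  # support width has grown from 2 to 20
```

Each addition widens the support linearly, which is the divergence the proposed operator is designed to avoid.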

Nasseri, M.; Ansari, A.; Zahraie, B.

2014-02-01

37

Completeness, special functions and uncertainty principles over q-linear grids

NASA Astrophysics Data System (ADS)

We derive completeness criteria for sequences of functions of the form f(xλ_n), where λ_n is the nth zero of a suitably chosen entire function. Using these criteria, we construct complete nonorthogonal systems of Fourier-Bessel functions and their q-analogues, as well as other complete sets of q-special functions. We discuss connections with uncertainty principles over q-linear grids, and the completeness of certain sets of q-Bessel functions is used to prove that, if a function f and its q-Hankel transform both vanish at the points {q^-n}, n = 1, 2, ..., with 0 < q < 1, then f must vanish on the whole q-linear grid {q^n : n ∈ Z}.

Abreu, Luís Daniel

2006-11-01

38

MY exuberant friend Prof. Armstrong (NATURE, Feb. 6, p. 195) seems uncertain about many things for which there is good evidence, and to glory in his uncertainty; but there is no merit in uncertainty in itself: it is just as much a sign of crankiness to reject good evidence as it is to accept bad. His attitude prevents his own

Oliver Lodge

1926-01-01

39

This enquiry concerning the principles of cultural norms and values focuses on the impact of mortality and uncertainty salience on people’s reactions to events that violate or bolster their cultural norms and values. Five experiments show that both mortality and uncertainty salience influence people’s reactions to violations and bolstering of their cultural worldviews, yielding evidence for both terror and uncertainty

Kees van den Bos; P. Marijn Poortvliet; Marjolein Maas; Joost Miedema; Ernst-Jan van den Ham

2005-01-01

40

The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other holds that the effects are underestimated. Both views have good scientific arguments in their favour. The nuclear field, in both industry and medicine, has had to deal with this controversy for many decades. One can argue that the optimisation approach of keeping effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because the effects are stochastic, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field, and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues are dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law, and what lessons can be learned from the field of radiation protection? 2. What can ALARA contribute to the discussion of the Precautionary Principle, and vice versa, in particular as far as legal sanctions and liability are concerned? It is shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law. PMID:16304938

Lierman, S; Veuchelen, L

2005-01-01

41

NASA Astrophysics Data System (ADS)

By using the null tetrad and the 't Hooft brick-wall model, the quantum entropies of a Reissner-Nordström black hole due to the Weyl neutrino, electromagnetic, massless Rarita-Schwinger and gravitational fields for the source-free case are investigated from a generalized uncertainty principle. The divergence structure of the entropy is demonstrated. In addition to the usual linearly and logarithmically divergent terms, additional quadratic, cubic, biquadratic and other higher-order divergences exist near the event horizon in the entropy, which depend not only on the black hole characteristics but also on the spin fields and the gravitational interactions. These terms describe the contribution of the quantum fields to the entropy and the effects of the generalized uncertainty principle on it. If the smallest length scale is taken into account, the contribution of the gravitational interactions to the entropy is found to be part of the dominant term and very important, and therefore it cannot be neglected.

Li, Guqiang

2014-06-01

42

A violation of the uncertainty principle implies a violation of the second law of thermodynamics.

Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature. PMID:23575674

Hänggi, Esther; Wehner, Stephanie

2013-01-01

43

NASA Astrophysics Data System (ADS)

Galactic scaling relations between the (surface densities of) the gas mass and the star formation (SF) rate are known to develop substantial scatter or even change form when considered below a certain spatial scale. We quantify how this behaviour should be expected due to the incomplete statistical sampling of independent star-forming regions. Other included limiting factors are the incomplete sampling of SF tracers from the stellar initial mass function and the spatial drift between gas and stars. We present a simple uncertainty principle for SF, which can be used to predict and interpret the failure of galactic SF relations on small spatial scales. This uncertainty principle explains how the scatter of SF relations depends on the spatial scale and predicts a scale-dependent bias of the gas depletion time-scale when centring an aperture on gas or SF tracer peaks. We show how the scatter and bias are sensitive to the physical size and time-scales involved in the SF process (such as its duration or the molecular cloud lifetime), and illustrate how our formalism provides a powerful tool to constrain these largely unknown quantities. Thanks to its general form, the uncertainty principle can also be applied to other astrophysical systems, e.g. addressing the time evolution of star-forming cores, protoplanetary discs or galaxies and their nuclei.
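The scale dependence of the scatter described above can be caricatured with a toy Monte Carlo. The exponential distribution of region rates and the region counts below are assumptions for illustration, not the paper's model: apertures sampling few independent star-forming regions show much larger relative scatter than apertures sampling many.

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_scatter(n_regions, n_trials=20000):
    """Relative scatter of the summed SF rate in an aperture that
    samples n_regions independent star-forming regions."""
    totals = rng.exponential(1.0, size=(n_trials, n_regions)).sum(axis=1)
    return totals.std() / totals.mean()

small_scale = relative_scatter(4)     # few regions per aperture
large_scale = relative_scatter(400)   # many regions per aperture
print(small_scale, large_scale)       # scatter falls roughly as 1/sqrt(N)
```

This is just the statistical-sampling part of the paper's argument; the physical ingredients (tracer lifetimes, gas-star drift) add further scale dependence.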

Kruijssen, J. M. Diederik; Longmore, Steven N.

2014-04-01

44

We introduce a self-consistent theoretical framework associated with the Schwinger unitary operators whose basic mathematical rules embrace a new uncertainty principle that generalizes and strengthens the Massar–Spindel inequality. Among other remarkable virtues, this quantum-algebraic approach exhibits a sound connection with the Wiener–Khinchin theorem for signal processing, which permits us to determine an effective tighter bound that not only imposes a new subtle set of restrictions upon the selective process of signals and wavelet bases, but also represents an important complement for property testing of unitary operators. Moreover, we establish a hierarchy of tighter bounds, which interpolates between the tightest bound and the Massar–Spindel inequality, as well as its respective link with the discrete Weyl function and tomographic reconstructions of finite quantum states. We also show how the Harper Hamiltonian and discrete Fourier operators can be combined to construct finite ground states which yield the tightest bound of a given finite-dimensional state vector space. Such results touch on some fundamental questions inherent to quantum mechanics and their implications in quantum information theory. -- Highlights: •Conception of a quantum-algebraic framework embracing a new uncertainty principle for unitary operators. •Determination of new restrictions upon the selective process of signals and wavelet bases. •Demonstration of looser bounds interpolating between the tightest bound and the Massar–Spindel inequality. •Construction of finite ground states properly describing the tightest bound. •Establishment of an important connection with the discrete Weyl function.

Marchiolli, M.A., E-mail: marcelo_march@bol.com.br [Avenida General Osório 414, Centro, 14.870-100 Jaboticabal, SP (Brazil); Mendonça, P.E.M.F., E-mail: pmendonca@gmail.com [Academia da Força Aérea, C.P. 970, 13.643-970 Pirassununga, SP (Brazil)]

2013-09-15

45

Disputes over invocation of precaution in the presence of uncertainty are building. This essay finds: (1) analysis of past WTO panel decisions and current EU-US regulatory conflicts suggests that appeals to scientific risk assessment will not resolve emerging conflicts; (2) Bayesian updating strategies, with commitments to modify policies as information emerges, may ameliorate conflicts over precaution in environmental and security affairs. PMID:16304935

Oye, K A

2005-01-01

46

Asymptotic theories of classical micromechanics are built on a fundamental assumption of large separation of scales. For random heterogeneous materials the scale-decoupling assumption however is inapplicable in many circumstances from conventional failure problems to novel small-scale engineering systems. Development of new theories for scale-coupling mechanics and uncertainty quantification is considered to have significant impacts on diverse disciplines. Scale-coupling effects

X. Frank Xu

2009-01-01

47

NASA Astrophysics Data System (ADS)

The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations (SFTs) of pincushion and Hermann grids. Visual systems detect "negative" electric field values for the darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing these from the light "illusory" diagonals. This indicates that the oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics (QM). The SFT of human vision is merely the scaled SFT of QM. Reciprocal-space results of wavelength and momentum mimic the reciprocal relationship between the space variable x and the spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy (physiology of hearing, 1961) performed pressure inputs of Rect x that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves the sign sense of EMF vectors, and is hard-wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

McLeod, David; McLeod, Roger

2008-04-01

48

Statistical uncertainty in soil temperature and volumetric water content and related moisture and heat fluxes predicted by a state-of-the-art soil module (embedded in a numerical weather prediction (NWP) model) is analyzed by Gaussian error-propagation (GEP) principles. This kind of uncertainty results from the indispensable use of empirical soil parameters. Since for the same thermodynamic and hydrological surface forcing and mean
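Gaussian error propagation of the kind referred to above can be sketched generically. The conductive heat-flux relation and the parameter values below are hypothetical placeholders, not the embedded NWP soil module:

```python
import numpy as np

def propagate(f, x, sigma, eps=1e-6):
    """First-order Gaussian error propagation for independent inputs:
    sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2, with numeric partials."""
    x = np.asarray(x, dtype=float)
    grads = np.empty_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        grads[i] = (f(x + dx) - f(x - dx)) / (2 * eps)  # central difference
    return float(np.sqrt(np.sum((grads * np.asarray(sigma)) ** 2)))

# Hypothetical conductive heat-flux relation G = k * dT / dz,
# with assumed parameter values and standard uncertainties.
f = lambda p: p[0] * p[1] / p[2]
u_G = propagate(f, x=[1.2, 3.0, 0.1], sigma=[0.1, 0.2, 0.01])
print(u_G)
```

The same pattern applies to any empirical-parameter-driven flux formula: only `f`, the nominal values, and the input uncertainties change.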

Nicole Mölders; Mihailo Jankov; Gerhard Kramm

2005-01-01

49

Femtoscopic scales in p+p and p+Pb collisions in view of the uncertainty principle

NASA Astrophysics Data System (ADS)

A method for quantum corrections of Hanbury-Brown/Twiss (HBT) interferometric radii produced by semi-classical event generators is proposed. These corrections account for the basic indistinguishability and mutual coherence of closely located emitters caused by the uncertainty principle. A detailed analysis is presented for pion interferometry in p+p collisions at LHC energy (√s = 7 TeV). A prediction is also presented of pion interferometric radii for p+Pb collisions at √s = 5.02 TeV. The hydrodynamic/hydrokinetic model with UrQMD cascade as ‘afterburner’ is utilized for this aim. It is found that quantum corrections to the interferometry radii significantly improve the event-generator results, which typically overestimate the experimental radii of small systems. A successful description of the interferometry structure of p+p collisions within the corrected hydrodynamic model requires study of the thermalization mechanism, still a fundamental issue for ultrarelativistic A+A collisions, and also for high-multiplicity p+p and p+Pb events.

Shapoval, V. M.; Braun-Munzinger, P.; Karpenko, Iu. A.; Sinyukov, Yu. M.

2013-08-01

50

NASA Technical Reports Server (NTRS)

The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, Δt. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter of width ΔE, it is observed that the other member's wave packet collapses upon coincidence detection to a duration Δt such that ΔE·Δt ≈ h/2π, where this duration Δt is an inner time in the sense of Aharonov and Bohm. We have measured Δt by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
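The ΔE·Δt ≈ h/2π relation quoted above is easy to check numerically. The 1 nm filter bandwidth at 700 nm below is a hypothetical value for illustration, not a figure from the experiment:

```python
HBAR = 1.054571817e-34            # reduced Planck constant, J s
H, C = 6.62607015e-34, 2.99792458e8  # Planck constant (J s), c (m/s)

def collapse_duration(delta_e):
    """Wave-packet duration implied by delta_E * delta_t ~ hbar."""
    return HBAR / delta_e

# Energy width of a filter of bandwidth 1 nm centred at 700 nm:
# delta_E = h * c * delta_lambda / lambda^2.
delta_e = H * C * 1e-9 / (700e-9) ** 2
dt = collapse_duration(delta_e)
print(dt)  # a few hundred femtoseconds
```

A narrower filter (smaller ΔE) stretches the collapsed wave packet proportionally, which is the trade-off the interferometer measures.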

Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

1992-01-01

51

Decision Making Under Uncertainty.

National Technical Information Service (NTIS)

This report introduces concepts, principles, and approaches for addressing uncertainty in decision making. The sources of uncertainty in decision making are discussed, emphasizing the distinction between uncertainty and risk, and the characterization of u...

B. K. Harper K. N. Mitchell M. T. Schultz T. S. Bridges

2010-01-01

52

NASA Astrophysics Data System (ADS)

The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. The SFT, due to the action of Heisenberg's uncertainty principle, shows weak spatial localization, which manifests in the following: firstly, it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed) in order to find the SFT. Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (a window), is used for this purpose. A narrow enough window is chosen to localize the signal in time, and, according to Heisenberg's uncertainty principle, this results in significant uncertainty in frequency. If one chooses a wide enough window, the same principle implies increased time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization; that is, it allows one to determine the presence of any frequency in the signal and the interval of its presence.
However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at the current moment of time: it is possible to speak only about a range of frequencies. Likewise, it is impossible to specify precisely the time moment of the presence of this or that frequency: it is possible to speak only about a time frame. It is this feature that imposes major constraints on the applicability of the STFT. Although the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independent of the transform applied, there is a possibility to analyze any signal using an alternative approach: multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis. Thanks to it, low frequencies can be shown in more detail with respect to time, and high ones with respect to frequency. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3-D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the application of the wavelet transform for calculating the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
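The window-width trade-off described above can be made concrete with a small sketch. The 100 Hz test tone is a hypothetical signal; the key fact is that a W-sample window at sampling rate fs resolves frequency only to about fs/W:

```python
import numpy as np

def freq_resolution(window_len, fs):
    """Frequency-bin width (Hz) of an STFT with a window of window_len
    samples: a wider window gives finer frequency but coarser time."""
    return fs / window_len

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
sig = np.sin(2 * np.pi * 100 * t)        # hypothetical 100 Hz test tone

narrow = freq_resolution(32, fs)         # good time, 31.25 Hz bins
wide = freq_resolution(512, fs)          # poor time, ~2 Hz bins

# Windowed spectrum for the wide window; the peak lands near 100 Hz.
spec = np.abs(np.fft.rfft(sig[:512] * np.hanning(512)))
peak_hz = np.argmax(spec) * wide
print(narrow, wide, peak_hz)
```

Shrinking the window sharpens the time interval but smears the same peak across much wider frequency bins, which is exactly the Heisenberg-type constraint the abstract invokes.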

Mazurova, Elena; Lapshin, Aleksey

2013-04-01

53

It is frequently lamented that human factors and ergonomics knowledge does not receive the attention and consideration that it deserves. In this paper I argue that in order to change this situation human factors/ergonomics based system design needs to be positioned as a strategic task within a conceptual framework that incorporates both business and design concerns. The management of uncertainty is presented as a viable candidate for such a framework. A case is described where human factors/ergonomics experts in a railway company have used the management of uncertainty perspective to address strategic concerns at firm level. Furthermore, system design is discussed in view of the relationship between organization and technology more broadly. System designers need to be supported in better understanding this relationship in order to cope with the uncertainties this relationship brings to the design process itself. Finally, the emphasis on uncertainty embedded in the recent surge of introducing risk management across all business sectors is suggested as another opportunity for bringing human factors and ergonomics expertise to the fore. PMID:23622735

Grote, Gudela

2014-01-01

54

NASA Astrophysics Data System (ADS)

The non-negativity of the density operator of a state is faithfully coded in its Wigner distribution, and this coding places on the moments of the Wigner distribution constraints arising from the non-negativity of the density operator. Working in a monomial basis for the algebra Â of operators on the Hilbert space of a bosonic mode, we formulate these constraints in a canonically covariant form which is both concise and explicit. Since the conventional uncertainty relation is such a constraint on the first and second moments, our result constitutes a generalization of the same to all orders. The structure constants of Â, in the monomial basis, are shown to be essentially the SU(2) Clebsch-Gordan coefficients. Our results have applications in quantum state reconstruction using optical homodyne tomography and, when generalized to the n-mode case, which will be done in the second part of this work, will have applications also for continuous-variable quantum information systems involving non-Gaussian states.

Ivan, J. Solomon; Mukunda, N.; Simon, R.

2012-05-01

55

NASA Astrophysics Data System (ADS)

Put two counters at origin O and at particle P, respectively; the wave-number difference counted by the two counters at the same moment is the length x between P and O (as a rod). The metrical result of the known Doppler effect is: x(θ) = x0(1 + β cos θ) (1). Here β = v/c, v is the velocity of the counter relative to the light source, c = c+ = c− is the metrical one-way velocity of light, v·n = v cos θ, where θ is the angle between v and the unit vector n of the light beam pointing from the light source to the counter, and x0 is the metrical length when v = 0. The result counted by a counter in one second is the light-wave frequency: f(θ) = f0(1 − β cos θ) (2). f0 is the metrical frequency when v = 0. From Eq. (1) and Eq. (2): x²(θ) = x0²(1 + 2β cos θ + β² cos² θ); f²(θ) = f0²(1 − 2β cos θ + β² cos² θ). Define the square root of the difference of the squared metrical results in the two contrary directions (θ = 0 and θ = π): Δx = (x²(0) − x²(π))^(1/2) = 2 x0 β^(1/2) (3); Δf = (f²(0) − f²(π))^(1/2) = i 2 f0 β^(1/2) (4); Δx·Δf = i 4 x0 f0 β (5). From p = mv and the variance in absolute average value of Eq. (2), Δf = 2 f0 Δv/(β c), Eq. (5) changes into: Δx·Δp = 2π x0 p (6). Once a particle collides with a CMB photon, its velocity changes as in a quasi-Brownian motion. Let S be the average space-distance between CMB photons; the time interval between two collisions is S/v, where v is the velocity of the particle. Because x0 is the length of an imaginary resting rod, i.e., after every collision the origin O must be reset jumpily at a new position, the jump distance (S/v)·Δv is just the displacement x0 of the particle, where Δv is the variance in velocity caused by each collision. The variance Δp in the momentum of the particle in each collision is the average momentum p0 of a CMB photon, so we obtain x0 = S Δv/v = S Δp/p = S p0/p, and Eq. (6) changes into: Δx·Δp = 2π p0 S (7). The average energy and average momentum of a CMB photon at 2.7 K are: e0 = kT = 3.72×10^-16 erg; p0 = e0/c = 1.24×10^-26 g cm s^-1.
The average number density of CMB photons is about 200/cm³ (or 5.9/cm), as measured from the U2 airplane. The reciprocal of 5.9/cm, 0.17 cm, is just the average free path S of the particle impacting with CMB photons. The virtual photons possess the e0 and p0 of CMB photons owing to energy exchange during their long coexistence. The metrical value of the Casimir force shows that the number density of virtual photons is far larger than that of CMB photons. Most collisions of virtual photons with a particle have no measurable effect (self-counteracting momentum balance). The residual virtual photons in imbalanced collisions with CMB photons are again in a dynamical balance, and both numbers and both average free paths will be equal when a particle has no macro-displacement. In cosmic space the virtual photons and CMB photons gather together, and the total valid average free path of a particle will be equal to 0.085 cm. The action-quantity p0 S on a particle from CMB photons and virtual photons is: p0 S = 1.24×10^-26 g cm s^-1 × 0.085 cm = 1.054×10^-27 erg s. The metrical Planck constant is h/2π = 1.0546×10^-27 erg s. It is worth noting that p0 S and h/2π have the same dimension and that their magnitudes are very close. If we think that the quantum effect comes from the action on the particle by the vacuum virtual photons and CMB photons, then the action-quantity 2π p0 S is just the Planck constant h, and Δx·Δp = h (8). This is just the uncertainty principle; here it is the metrical result of Doppler effects in two contrary directions. The wave-particle duality is likely a quasi-Brownian motion of a particle in vacuum. The nonzero time of the measuring course and the particle's quasi-Brownian motion make it impossible to measure accurately the position x and the momentum p of a particle. The uncertainty principle then becomes a metrical theorem of generalized Newtonian mechanics.
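The closing arithmetic of this abstract can be checked directly. The CGS constants are standard; S = 0.085 cm is the abstract's own assumed effective free path, and e0 = kT is the abstract's definition of the average photon energy:

```python
k_B = 1.380649e-16      # Boltzmann constant, erg/K
c = 2.99792458e10       # speed of light, cm/s
T = 2.7                 # CMB temperature, K
S = 0.085               # abstract's effective mean free path, cm

e0 = k_B * T            # average CMB photon energy (~3.7e-16 erg)
p0 = e0 / c             # average CMB photon momentum (~1.24e-26 g cm/s)
action = p0 * S
print(action)           # ~1.06e-27 erg s, close to hbar = 1.0546e-27
```

The numerical coincidence with ħ is the entire quantitative content of the claim; the interpretation around it is the author's own.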

Chen, Shao-Guang

56

Risk Management Principles for Nanotechnology

Risk management of nanotechnology is challenged by the enormous uncertainties about the risks, benefits, properties, and future direction of nanotechnology applications. Because of these uncertainties, traditional risk management principles such as acceptable risk, cost–benefit analysis, and feasibility are unworkable, as is the newest risk management principle, the precautionary principle. Yet, simply waiting for these uncertainties to be resolved before undertaking

Gary E. Marchant; Douglas J. Sylvester; Kenneth W. Abbott

2008-01-01

57

Quantitative uncertainty assessments and the distribution of risk are under scrutiny, and significant criticism has been made of null hypothesis testing when careful consideration of Type I (false positive) and Type II (false negative) error rates has not been taken into account. An alternative method, equivalence testing, is discussed, yielding more transparency and potentially more precaution in quantifiable uncertainty assessments. With thousands of chemicals needing regulation in the near future and low public trust in the regulatory process, decision models are required with transparency and learning processes to manage this task. Adaptive, iterative, and learning decision-making tools and processes can help decision makers evaluate the significance of Type I or Type II errors on decision alternatives and can reduce the risk of committing Type III errors (accurate answers to the wrong questions). Simplistic cost-benefit-based decision-making tools do not incorporate the complex interconnectedness characterizing environmental risks, nor do they enhance learning or participation, or include social values and ambiguity. Hence, better decision-making tools are required, and MIRA is an attempt to include some of the critical aspects. PMID:16304937
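Equivalence testing, as mentioned above, is commonly implemented as two one-sided tests (TOST). The following is a minimal normal-approximation sketch with a hypothetical effect estimate and equivalence margin, not the MIRA procedure itself:

```python
import math

def tost_p(mean, se, low, high):
    """Two one-sided tests (TOST), normal approximation: the equivalence
    p-value is the larger of the one-sided p-values for
    H0: effect <= low and H0: effect >= high."""
    upper_tail = lambda z: 0.5 * (1 - math.erf(z / math.sqrt(2)))
    return max(upper_tail((mean - low) / se),
               upper_tail((high - mean) / se))

# Hypothetical: estimated effect 0.2, standard error 0.1,
# equivalence margin of +/-0.5 around zero.
p_eq = tost_p(0.2, 0.1, -0.5, 0.5)
print(p_eq < 0.05)  # True: the effect is statistically equivalent to zero
```

Unlike a null-hypothesis test, the burden of proof here is on demonstrating that the effect lies inside the margin, which makes the Type II error rate explicit.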

Sanderson, H; Stahl, C H; Irwin, R; Rogers, M D

2005-01-01

58

Uncertainty in Computational Aerodynamics

NASA Technical Reports Server (NTRS)

An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

2003-01-01

59

NASA Astrophysics Data System (ADS)

Vision, via transform space: "Nature behaves in a reciprocal way;" also, a Rect x pressure input sense-reports as Sinc p, indicating the brain interprets reciprocal "p" space as object space. Use Mott's and Sneddon's Wave Mechanics and Its Applications. Wave transformation functions are strings of positron, electron, proton, and neutron; uncertainty is a semantic artifact. Neutrino-string de Broglie-Schrödinger wave-function models for the electron and positron suggest three-quark models for protons and neutrons. Variably vibrating neutrino-quills of this model, with appropriate mass-energy, can be a vertical proton string, quills leftward; thread the string circumferentially, forming three interlinked circles with "overpasses". Diameters are 2:1:2; the center circle has quills radially outward; call it a down quark, charge −1/3, with charge 2/3 for outward quills, the up quarks of the outer circles. String overlap summations are nodes; nodes also lie far left and right. Strong nuclear forces may be −px. "Dislodging" a positron with a neutrino switches the quark-circle configuration to 1:2:1, 'downers' outside. The unstable neutron charge is 0. Atoms build. With scale factors, the retinal/vision and quantum mechanics spatial Fourier transforms/inverses are equivalent.

Mc Leod, Roger David; Mc Leod, David M.

2007-10-01

60

The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined component-wise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can often be easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning can, if needed, be given to this uncertainty in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between the concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. "Without a measureless and perpetual uncertainty, the drama of human life would be destroyed." Winston Churchill. PMID:18573808

Bartley, David; Lidén, Göran

2008-08-01

61

Uncertainty is triggered by many events during the experience of illness - from hearing bad news to meeting a new doctor. Oncology professionals need to recognize the intense feelings associated with uncertainty and respond empathically to patients. This article describes opportunities to strengthen the therapeutic connection and minimize uncertainty. PMID:24337763

Schapira, Lidia

2014-03-01

62

NASA Astrophysics Data System (ADS)

Measurement uncertainty is one of the key issues in quality assurance. It has become increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different possibilities for taking the uncertainty into account in compliance assessment are explained.

Koch, Michael

63

The Uncertainty Principle: A Reply to Kempen.

ERIC Educational Resources Information Center

Responds to a commentary in this issue by Kempen on an experiment by Frazier and others involving Dutch-language lexical processing. Postulates that it is unclear whether control items were open to complex verbal analysis; that more research is needed to determine how the verb "hebben" is interpreted in context; and that Kempen's account of the results is…

Frazier, Lyn

1995-01-01

64

Nab: Measurement Principles, Apparatus and Uncertainties

The Nab collaboration will perform a precise measurement of a, the electron-neutrino correlation parameter, and b, the Fierz interference term in neutron beta decay, in the Fundamental Neutron Physics Beamline at the SNS, using a novel electric/magnetic field spectrometer and detector design. The experiment is aiming at the 10^-3 accuracy level in Δa/a, and will provide an independent measurement of λ = G_A/G_V, the ratio of axial-vector to vector coupling constants of the nucleon. Nab also plans to perform the first ever measurement of b in neutron decay, which will provide an independent limit on the tensor weak coupling.

Pocanic, Dinko [University of Virginia; Bowman, James D [ORNL; Cianciolo, Vince [ORNL; Greene, Geoffrey [University of Tennessee, Knoxville (UTK); Grzywacz, Robert [University of Tennessee, Knoxville (UTK); Penttila, Seppo [Oak Ridge National Laboratory (ORNL); Rykaczewski, Krzysztof Piotr [ORNL; Young, Glenn R [ORNL]; and the Nab Collaboration

2009-01-01

65

Evaluation of uncertainty visualization techniques for information fusion

This paper highlights the importance of uncertainty visualization in information fusion, reviews general methods of representing uncertainty and presents perceptual and cognitive principles from Tufte, Chambers and Bertin as well as user experiments documented in the literature. Examples of uncertainty representations in information fusion are analyzed using these general theories. These principles can be used in future theoretical evaluations of

Maria Riveiro

2007-01-01

66

Position-momentum uncertainty products

NASA Astrophysics Data System (ADS)

We point out two interesting features of the position-momentum uncertainty product U = ΔxΔp. We show that two special (non-differentiable) eigenstates of the Schrödinger operator with the attractive Dirac delta potential V(x) = -V₀δ(x), V₀ > 0, also satisfy Heisenberg's uncertainty principle by yielding U > ℏ/2. One of these eigenstates is a zero-energy and zero-curvature bound state.
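
The bound state of the attractive delta well is ψ(x) = √κ e^(-κ|x|), for which the product works out to ΔxΔp = ℏ/√2 > ℏ/2. The sketch below (function name and grid parameters are illustrative) verifies this numerically with a simple midpoint rule.

```python
import math

def uncertainty_product(kappa=1.0, hbar=1.0, L=40.0, n=20000):
    """Check U = dx*dp for the delta-well bound state
    psi(x) = sqrt(kappa)*exp(-kappa*|x|); the exact answer is hbar/sqrt(2)."""
    dx = L / n
    xs = [-L / 2 + (i + 0.5) * dx for i in range(n)]
    rho = [kappa * math.exp(-2.0 * kappa * abs(x)) for x in xs]  # |psi|^2
    norm = sum(rho) * dx
    x2 = sum(x * x * r for x, r in zip(xs, rho)) * dx / norm     # <x^2>, <x>=0
    # psi' = -sign(x)*kappa*psi away from x = 0, so <p^2> = (hbar*kappa)^2
    p2 = (hbar * kappa) ** 2
    return math.sqrt(x2 * p2)

# uncertainty_product() ~ 0.7071 ~ hbar/sqrt(2), comfortably above hbar/2
```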

Ahmed, Zafar; Yadav, Indresh

2014-07-01

67

Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration uncertainty was verified from independent measurements of the same sample by demonstrating statistical control of analytical results and the absence

Kaj Heydorn; Thomas Anglov

2002-01-01

68

NSDL National Science Digital Library

This article, authored by P.G. Moore for the Royal Statistical Society's website, provides well-defined exercises to assess the probabilities of decision-making and the degree of uncertainty. The author states the focus of the article as: "When analyzing situations which involve decisions to be made as between alternative courses of action under conditions of uncertainty, decision makers and their advisers are often called upon to assess judgmental probability distributions of quantities whose true values are unknown to them. How can this judgment be taught?" Moore provides five different exercises and even an external reference for those interested in further study of the topic.

Moore, P. G.

2009-04-08

69

NSDL National Science Digital Library

Many physics teachers have an unclear understanding of Bernoulli's principle, particularly when the principle is applied to aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it altogether. The following simplified treatment of the principle ignores most of the complexities of aerodynamics and hopefully will encourage teachers to bring Bernoulli back into the classroom.

Hewitt, Paul G.

2004-09-01

70

The evolution of Bayesian approaches for model uncertainty over the past decade has been remarkable. Catalyzed by advances in methods and technology for posterior computation, the scope of these methods has widened substantially. Major thrusts of these developments have included new methods for semiautomatic prior specification and posterior exploration. To illustrate key aspects of this evolution, the highlights of some

Merlise Clyde; Edward I. George

2004-01-01

71

Generalized Entropic Uncertainty Relations with Tsallis' Entropy

NASA Technical Reports Server (NTRS)

A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.

Portesi, M.; Plastino, A.

1996-01-01

72

NSDL National Science Digital Library

This page, from Kyoto University, provides a discussion of Mach's Principle, a concept that played an important role in forming Einstein's theory of general relativity. Excerpts from Mach's original text are examined and discussed for his ideas that are closely related to this principle. The general ambiguity of Mach's Principle, and Einstein's interpretations of it, are also presented.

Uchii, Soshichi

2007-10-10

73

Uncertainty in the Classroom--Teaching Quantum Physics

ERIC Educational Resources Information Center

The teaching of the Heisenberg uncertainty principle provides one of those rare moments when science appears to contradict everyday life experiences, sparking the curiosity of the interested student. Written at a level appropriate for an able high school student, this article provides ideas for introducing the uncertainty principle and showing how…

Johansson, K. E.; Milstead, D.

2008-01-01

74

An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
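
For the linear case Ax = b cited in the abstract, the sensitivity of an output y = cᵀx to the data b is the adjoint solution λ of Aᵀλ = c, so all derivatives dy/db come from a single extra solve. A minimal 2x2 sketch (Cramer's rule and all names are illustrative):

```python
def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule (illustration only)."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

def adjoint_sensitivity(A, c):
    """dy/db for y = c^T x subject to A x = b is lambda, where A^T lambda = c."""
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]  # transpose of A
    return solve2(At, c)

# For A = [[2, 1], [0, 4]] and y = x[0] (c = [1, 0]):
# adjoint_sensitivity gives dy/db = [0.5, -0.125], matching finite differences.
```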

Thomas, R.E.

1982-03-01

75

The link between entropic uncertainty and nonlocality

NASA Astrophysics Data System (ADS)

Two of the most intriguing features of quantum physics are the uncertainty principle and the occurrence of nonlocal correlations. The uncertainty principle states that there exist pairs of incompatible measurements on quantum systems such that their outcomes cannot both be predicted. On the other hand, nonlocal correlations of measurement outcomes at different locations cannot be explained by classical physics, but appear in the presence of entanglement. Here, we show that these two fundamental quantum effects are quantitatively related. Namely, we provide an entropic uncertainty relation for the outcomes of two binary measurements, where the lower bound on the uncertainty is quantified in terms of the maximum Clauser-Horne-Shimony-Holt value that can be achieved with these measurements. We discuss applications of this uncertainty relation in quantum cryptography, in particular, to certify quantum sources using untrusted devices.

Tomamichel, Marco; Hänggi, Esther

2013-02-01

76

Entropic uncertainty relations in multidimensional position and momentum spaces

Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

Huang Yichen [Department of Physics, University of California, Berkeley, Berkeley, California 94720 (United States)

2011-05-15

77

NSDL National Science Digital Library

This site from HyperPhysics provides a description of Pascal's Principle, which explains how pressure is transmitted in an enclosed fluid. Drawings and sample calculations are provided. Examples illustrating the principle include a hydraulic press and an automobile hydraulic lift.
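
The hydraulic-lift calculation that the page illustrates follows directly from Pascal's principle: the pressure F/A is the same at both pistons, so the output force scales with the area ratio. A tiny sketch with illustrative names:

```python
def lift_force(applied_force_n, small_area_m2, large_area_m2):
    """Pascal's principle: p = F/A is equal at both pistons of an enclosed
    fluid, so F_out = F_in * (A_out / A_in)."""
    return applied_force_n * (large_area_m2 / small_area_m2)

# 100 N on a 0.001 m^2 piston drives a 0.1 m^2 piston with 10000 N
```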

Nave, Carl R.

2011-11-28

78

NASA Astrophysics Data System (ADS)

Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

Lamport, Leslie

2012-08-01

79

Regulating Stock Externalities Under Uncertainty

Using a simple analytical model incorporating benefits of a stock, costs of adjusting the stock, and uncertainty in costs, we uncover several important principles governing the choice of price-based policies (e.g., taxes) relative to quantity-based policies (e.g., tradable permits) for controlling stock externalities. As in Weitzman (Rev. Econom. Stud. 41(4) (1974) 477), the relative slopes of the marginal benefits and

Richard G. Newell; William A. Pizer

2000-01-01

80

Angular performance measure for tighter uncertainty relations

The uncertainty principle places a fundamental limit on the accuracy with which we can measure conjugate quantities. However, the fluctuations of these variables can be assessed in terms of different estimators. We propose an angular performance measure that allows for tighter uncertainty relations for angle and angular momentum. The differences with previous bounds can be significant for particular states and indeed may be amenable to experimental measurement with present technology.

Hradil, Z.; Rehacek, J. [Department of Optics, Palacky University, 17. listopadu 50, 772 00 Olomouc (Czech Republic); Klimov, A. B. [Departamento de Fisica, Universidad de Guadalajara, 44420 Guadalajara, Jalisco (Mexico); Rigas, I.; Sanchez-Soto, L. L. [Departamento de Optica, Facultad de Fisica, Universidad Complutense, E-28040 Madrid (Spain)

2010-01-15

81

Planning with Map Uncertainty.

National Technical Information Service (NTIS)

We describe an efficient method for planning in environments for which prior maps are plagued with uncertainty. Our approach processes the map to determine key areas whose uncertainty is crucial to the planning task. It then incorporates the uncertainty a...

A. Stentz; D. Ferguson

2004-01-01

82

Improved bounds on entropic uncertainty relations

NASA Astrophysics Data System (ADS)

Entropic uncertainty relations place nontrivial lower bounds to the sum of Shannon information entropies for noncommuting observables. Here we obtain a lower bound on the entropy sum for general pairs of observables in finite-dimensional Hilbert space, which improves on the best bound known to date [H. Maassen and J. B. M. Uffink, Phys. Rev. Lett. 60, 1103 (1988)] for a wide class of observables. This result follows from another formulation of the uncertainty principle, the Landau-Pollak inequality, whose relationship to the Maassen-Uffink entropic uncertainty relation is discussed.
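
For a qubit measured in the Z and X bases (mutually unbiased, overlap c = 1/√2), the Maassen-Uffink bound cited above reads H(Z) + H(X) ≥ -2 log₂ c = 1 bit. A quick numerical check over real qubit states (all names are illustrative):

```python
import math

def shannon_bits(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 1e-12)

def entropy_sum(theta):
    """H(Z) + H(X) for the real qubit state cos(t)|0> + sin(t)|1>."""
    a0, a1 = math.cos(theta), math.sin(theta)
    pz = [a0 ** 2, a1 ** 2]
    px = [(a0 + a1) ** 2 / 2, (a0 - a1) ** 2 / 2]  # |<+|psi>|^2, |<-|psi>|^2
    return shannon_bits(pz) + shannon_bits(px)

# The minimum over theta is 1 bit, attained e.g. at theta = pi/4
# (Z outcomes uniform, X outcome certain).
```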

de Vicente, Julio I.; Sánchez-Ruiz, Jorge

2008-04-01

83

NSDL National Science Digital Library

In this lab, students will use a little background information about Bernoulli's principle to figure out how the spinning of a moving ball affects its trajectory. The activity is inquiry-based in that students will discover this relationship on their own.

Horton, Michael

2009-05-30

84

NSDL National Science Digital Library

Bernoulli's principle relates the pressure of a fluid to its elevation and its speed. Bernoulli's equation can be used to approximate these parameters in water, air or any fluid that has very low viscosity. Students learn about the relationships between the components of the Bernoulli equation through real-life engineering examples and practice problems.
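
The relationship the lesson drills can be written as p + ½ρv² + ρgh = constant along a streamline. The sketch below (constants and names are illustrative) solves the equation for the downstream pressure:

```python
RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

def downstream_pressure(p1, v1, h1, v2, h2, rho=RHO_WATER):
    """Bernoulli: p1 + rho*v1^2/2 + rho*g*h1 = p2 + rho*v2^2/2 + rho*g*h2,
    solved for p2 (low-viscosity, incompressible flow along a streamline)."""
    return p1 + 0.5 * rho * (v1 ** 2 - v2 ** 2) + rho * G * (h1 - h2)

# water speeding up from 1 m/s to 3 m/s at constant height loses 4 kPa
```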

Integrated Teaching And Learning Program And Laboratory

85

NSDL National Science Digital Library

On this site from the NASA Glenn Research Center Learning Technologies Project, the science and history of rocketry is explained. Visitors will find out how rocket principles illustrate Newton's Laws of Motion. There is a second page of this site, Practical Rocketry, which discusses the workings of rockets, including propellants, engine thrust control, stability and control systems, and mass.

2008-07-29

86

The basic operating principles, design, and applications of radars are discussed in an introductory text intended for first-year graduate students. Topics addressed include radar measurements, radar target cross sections, radar detection, ground effects, matched filters, ambiguity functions, coded radar signals, and radar measurement accuracy. Consideration is given to processing coherent pulse trains, moving-target indicators, CFAR, SAR, and monopulse antenna tracking.

Nadav Levanon

1988-01-01

87

Alternative Approaches to Uncertainty Calculations for TIMS Isotopic Measurements

Two methods of estimating uncertainty for TIMS U isotopic ratio measurements were evaluated. Although these methods represent fundamentally different approaches both are consistent with the principles outlined in the ISO

R. B. Thomas; R. M. Essex; S. A. Goldberg

2006-01-01

88

NASA Astrophysics Data System (ADS)

The basic operating principles, design, and applications of radars are discussed in an introductory text intended for first-year graduate students. Topics addressed include radar measurements, radar target cross sections, radar detection, ground effects, matched filters, ambiguity functions, coded radar signals, and radar measurement accuracy. Consideration is given to processing coherent pulse trains, moving-target indicators, CFAR, SAR, and monopulse antenna tracking. Extensive diagrams and graphs are provided.

Levanon, Nadav

89

NASA Technical Reports Server (NTRS)

Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.
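
The Doppler principle mentioned above converts a measured frequency shift into a radial velocity via v = f_d λ / 2, where the factor of 2 accounts for the two-way radar path. A one-line sketch with illustrative names:

```python
def radial_velocity(doppler_shift_hz, wavelength_m):
    """Radar Doppler relation: v = f_d * wavelength / 2 (two-way path)."""
    return doppler_shift_hz * wavelength_m / 2.0

# a 100 Hz shift at a 10 cm wavelength corresponds to 5 m/s
```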

Sato, Toru

1989-01-01

90

How to handle calibration uncertainties in high-energy astrophysics

Unlike statistical errors, whose importance has been well established in astronomical applications, uncertainties in instrument calibration are generally ignored. Despite wide recognition that uncertainties in calibration can cause large systematic errors, robust and principled methods to account for them have not been developed, and consequently there is no mechanism by which they can be incorporated into standard astronomical data analysis.

Vinay L. Kashyap; Hyunsook Lee; Aneta Siemiginowska; Jonathan McDowell; Arnold Rots; Jeremy Drake; Pete Ratzlaff; Andreas Zezas; Rima Izem; Alanna Connors; David van Dyk; Taeyoung Park

2008-01-01

91

Uncertainty and Dimensional Calibrations.

National Technical Information Service (NTIS)

The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules, Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty sta...

T. Doiron; J. Stoup

1997-01-01

92

Epistemic uncertainty quantification tutorial

This paper presents a basic tutorial on epistemic uncertainty quantification methods. Epistemic uncertainty, characterizing lack-of-knowledge, is often prevalent in engineering applications. However, the methods we have for analyzing and propagating epistemic uncertainty are not nearly as widely used or well-understood as methods to propagate aleatory uncertainty (e.g. inherent variability characterized by probability distributions). We examine three methods used in propagating
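
One of the simplest ways to propagate epistemic (interval) uncertainty, in contrast to sampling a probability distribution, is to bound the output over the input box. The corner-evaluation sketch below is a generic illustration, not necessarily one of the three methods the paper compares, and is exact only for models monotone in each input:

```python
import itertools

def interval_bounds(f, input_intervals):
    """Propagate interval (epistemic) uncertainty by evaluating f at every
    corner of the input box; exact only when f is monotone in each input."""
    vals = [f(*corner) for corner in itertools.product(*input_intervals)]
    return min(vals), max(vals)

# e.g. interval_bounds(lambda a, b: a * b + 1, [(1, 2), (10, 20)]) -> (11, 41)
```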

Laura Painton Swiler; Randall Lee Mayes; Thomas Lee Paez

2008-01-01

93

Uncertainty in audiometer calibration

NASA Astrophysics Data System (ADS)

The objective of this work is to present a metrology study necessary for the accreditation of audiometer calibration procedures at the National Brazilian Institute of Metrology Standardization and Industrial Quality—INMETRO. A model for the calculation of measurement uncertainty was developed. Metrological aspects relating to audiometer calibration, traceability and measurement uncertainty were quantified through comparison between results obtained at the Industrial Noise Laboratory—LARI of the Federal University of Santa Catarina—UFSC and the Laboratory of Electric/acoustics—LAETA of INMETRO. Similar metrological performance of the measurement system used in both laboratories was obtained, indicating that the interlaboratory results are compatible with the expected values. The uncertainty calculation was based on the documents: EA-4/02 Expression of the Uncertainty of Measurement in Calibration (European Co-operation for Accreditation 1999 EA-4/02 p 79) and Guide to the Expression of Uncertainty in Measurement (International Organization for Standardization 1993 1st edn, corrected and reprinted in 1995, Geneva, Switzerland). Some sources of uncertainty were calculated theoretically (uncertainty type B) and other sources were measured experimentally (uncertainty type A). The global value of uncertainty calculated for the sound pressure levels (SPLs) is similar to that given by other calibration institutions. The results of uncertainty related to measurements of SPL were compared with the maximum uncertainties Umax given in the standard IEC 60645-1: 2001 (International Electrotechnical Commission 2001 IEC 60645-1 Electroacoustics—Audiological Equipment—Part 1:—Pure-Tone Audiometers).
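
The uncertainty budget described above follows the GUM approach: type A (measured) and type B (theoretical) standard uncertainties are combined in quadrature, then expanded with a coverage factor (k = 2 gives roughly 95 % coverage for normal distributions). A minimal sketch with illustrative names, assuming independent inputs with unit sensitivity coefficients:

```python
import math

def expanded_uncertainty(standard_uncertainties, k=2.0):
    """GUM-style combination for independent inputs with unit sensitivity:
    u_c = sqrt(sum u_i^2); expanded uncertainty U = k * u_c."""
    u_c = math.sqrt(sum(u * u for u in standard_uncertainties))
    return k * u_c

# components of 0.3 and 0.4 combine to u_c = 0.5, so U = 1.0 at k = 2
```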

Aurélio Pedroso, Marcos; Gerges, Samir N. Y.; Gonçalves, Armando A., Jr.

2004-02-01

94

Uncertainty and Cognitive Control

A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

2011-01-01

95

Uncertainty, neuromodulation, and attention.

Uncertainty in various forms plagues our interactions with the environment. In a Bayesian statistical framework, optimal inference and prediction, based on unreliable observations in changing contexts, require the representation and manipulation of different forms of uncertainty. We propose that the neuromodulators acetylcholine and norepinephrine play a major role in the brain's implementation of these uncertainty computations. Acetylcholine signals expected uncertainty, coming from known unreliability of predictive cues within a context. Norepinephrine signals unexpected uncertainty, as when unsignaled context switches produce strongly unexpected observations. These uncertainty signals interact to enable optimal inference and learning in noisy and changeable environments. This formulation is consistent with a wealth of physiological, pharmacological, and behavioral data implicating acetylcholine and norepinephrine in specific aspects of a range of cognitive processes. Moreover, the model suggests a class of attentional cueing tasks that involve both neuromodulators and shows how their interactions may be part-antagonistic, part-synergistic. PMID:15944135

Yu, Angela J; Dayan, Peter

2005-05-19

96

[The precautionary principle and the environment].

The precautionary principle is a response to uncertainty in the face of risks to health or the environment. In general, it involves taking measures to avoid potential harm, despite lack of scientific certainty. In recent years it has been applied, not without difficulties, as a legal and political principle in many countries, particularly on the European and International level. In spite of the controversy, the precautionary principle has become an integral component of a new paradigm for the creation of public policies needed to meet today's challenges and those of the future. PMID:15913050

de Cózar Escalante, José Manuel

2005-01-01

97

The Precautionary Principle in Environmental Science

Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of

David Kriebel; Joel Tickner; Paul Epstein; John Lemons; Richard Levins; Edward L. Loechler; Margaret Quinn; Ruthann Rudel; Ted Schettler; Michael Stoto

98

Weak values, 'negative probability', and the uncertainty principle

A quantum transition can be seen as a result of interference between various pathways (e.g., Feynman paths), which can be labeled by a variable f. An attempt to determine the value of f without destroying the coherence between the pathways produces a weak value of f. We show f to be an average obtained with an amplitude distribution which can,

Sokolovski

2007-01-01

99

Weak values, 'negative probability' and the uncertainty principle

A quantum transition can be seen as a result of interference between various pathways (e.g. Feynman paths) which can be labelled by a variable $f$. An attempt to determine the value of $f$ without destroying the coherence between the pathways produces a weak value $\bar{f}$. We show $\bar{f}$ to be an average obtained with an amplitude distribution which can, in general,

D. Sokolovski

2009-01-01

100

Weak values, ``negative probability,'' and the uncertainty principle

A quantum transition can be seen as a result of interference between various pathways (e.g., Feynman paths), which can be labeled by a variable f. An attempt to determine the value of f without destroying the coherence between the pathways produces a weak value f̄. We show f̄ to be an average obtained with an amplitude distribution

D. Sokolovski

2007-01-01

101

The Uncertainty Relation for Quantum Propositions

NASA Astrophysics Data System (ADS)

Logical propositions with the fuzzy modality "Probably" are shown to obey an uncertainty principle very similar to that of Quantum Optics. In the case of such propositions, the partial truth values are in fact probabilities. The corresponding assertions in the metalanguage have complex assertion degrees which can be interpreted as probability amplitudes. In the logical case, the uncertainty relation is about the assertion degree, which plays the role of the phase, and the total number of atomic propositions, which plays the role of the number of modes. In analogy with coherent states in quantum physics, we define as "quantum coherent propositions" those which minimize the above logical uncertainty relation. Finally, we show that there is only one kind of compound quantum-coherent propositions: the "cat state" propositions.

Zizzi, Paola

2013-01-01

102

Deterministic uncertainty analysis

Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig.

Worley, B.A.

1987-01-01

103

MOUSE UNCERTAINTY ANALYSIS SYSTEM

The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

104

Equivalence principles and electromagnetism

NASA Technical Reports Server (NTRS)

The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

Ni, W.-T.

1977-01-01

105

This paper proposes some principles of cyber-warfare. The principles of warfare are well documented, but are not always applicable to cyber-warfare. Differences between cyberspace and the real world suggest some additional principles. This is not intended to be a comprehensive listing of such principles but suggestions leading toward discussion and dialogue. The current candidate list of principles of cyber-warfare

Raymond C. Parks; David P. Duggan

2011-01-01

106

Non-Scalar Uncertainty: Uncertainty in Dynamic Systems.

National Technical Information Service (NTIS)

The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty when working at sufficiently small scales, or, when working at large scales, uncertainty can be allowed by the...

S. G. Martinez

1992-01-01

107

Information Theoretic Quantification of Diagnostic Uncertainty

Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes’ rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians’ deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians’ application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
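
The Bayes'-rule update described above can be made concrete. The helper below (function and parameter names are illustrative) combines sensitivity, specificity, and the pre-test probability into a post-test probability for a positive or negative result:

```python
def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' rule for a binary diagnostic test: P(disease | test result)."""
    if positive:
        p_result_d = sensitivity          # true positive rate
        p_result_nd = 1.0 - specificity   # false positive rate
    else:
        p_result_d = 1.0 - sensitivity    # false negative rate
        p_result_nd = specificity         # true negative rate
    num = p_result_d * pretest
    return num / (num + p_result_nd * (1.0 - pretest))

# pretest 10 %, sensitivity and specificity both 90 %: a positive test
# raises the disease probability only to 50 % -- the kind of unexpected
# result the abstract notes physicians often misjudge
```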

Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

2012-01-01

108

NASA Astrophysics Data System (ADS)

It might come as a disappointment to some chemists, but just as there are uncertainties in physics and mathematics, there are some chemistry questions we may never know the answer to either, suggests Fredric M. Menger.

Menger, Fredric M.

2010-09-01

109

Practice uncertainty: changing perceptions.

Practice uncertainty is inevitable in health care, and there are many contextual factors that can lead to either good or bad outcomes for patients and health care providers. Practice uncertainty is not a well-established concept in the literature, perhaps because of the predominant empirical paradigm and the high value placed on certainty within current health care culture. This study was conducted to explore practice uncertainty and bring this topic into the foreground as a first step toward practice evolution. A shift in the perception of practice uncertainty may change the way in which practitioners experience this phenomenon. This process must start with nursing educators recognizing and acknowledging this phenomenon when it occurs. PMID:23875604

Vaid, Patrycja R; Ewashen, Carol; Green, Theresa

2013-10-01

110

Uncertainty: Medicine's Frequent Companion

... no means rare. The Elusive Gold Standard: The "gold standard" is a concept commonly embraced by doctors — ... one or the other. The biopsy is the gold standard, and there is generally little uncertainty about ...

111

National Technical Information Service (NTIS)

Models in space and space-time are essential for representing the battlespace. For example, they are used in estimating a dynamically evolving danger function or in predicting a waypoint in the presence of uncertainties. In netcentric warfare, the uncerta...

N. A. Cressie

2011-01-01

112

Minimum Uncertainty and Entanglement

NASA Astrophysics Data System (ADS)

We address the question: does a system A, being entangled with another system B, put any constraints on the Heisenberg uncertainty relation (or the Schrödinger-Robertson inequality)? We find that the equality of the uncertainty relation cannot be reached for any two noncommuting observables in finite-dimensional Hilbert spaces if the Schmidt rank of the entangled state is maximal. One consequence is that the lower bound of the uncertainty relation can never be attained for any two observables for qubits, if the state is entangled. For infinite-dimensional Hilbert spaces too, we show that there is a class of physically interesting entangled states for which no two noncommuting observables can attain the minimum uncertainty equality.

Hari Dass, N. D.; Qureshi, Tabish; Sheel, Aditi

2013-06-01
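The claim that entangled states cannot saturate the uncertainty bound can be checked numerically in the simplest case. A sketch assuming the standard Robertson bound and a maximally entangled Bell pair (an illustration of the phenomenon, not code from the paper):

```python
import numpy as np

# Pauli observables for qubit A
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Bell state (|00> + |11>)/sqrt(2); trace out qubit B to get A's state
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)
rho_a = np.einsum('ikjk->ij', rho)          # reduced state = I/2

def variance(obs, state):
    mean = np.trace(state @ obs).real
    return np.trace(state @ obs @ obs).real - mean ** 2

# Robertson bound: Delta(sx)*Delta(sz) >= (1/2)|<[sx, sz]>|
lower = abs(np.trace(rho_a @ (sx @ sz - sz @ sx))) / 2
product = np.sqrt(variance(sx, rho_a) * variance(sz, rho_a))
print(product, lower)   # ~1.0 vs 0.0: the inequality is strict, far from equality
```

For the maximally entangled state the reduced state is maximally mixed, so both variances are maximal while the commutator expectation vanishes; the uncertainty product sits at 1 against a lower bound of 0, illustrating why equality is unattainable.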

113

National Technical Information Service (NTIS)

This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exi...

M. Athans; R. Ku; S. B. Gershwin

1976-01-01

114

Conundrums with uncertainty factors.

The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767

Cooke, Roger

2010-03-01
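The probabilistic reinterpretation of uncertainty factors that this abstract criticizes can be sketched with a toy Monte Carlo calculation. All distributional choices below are illustrative assumptions, not the IRIS methodology: each 10-fold factor is treated as lognormal with median 10^0.5 and 95th percentile 10, so the deterministic value 10 sits at each factor's upper tail.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical probabilistic reading of two 10-fold uncertainty factors
median, p95 = 10 ** 0.5, 10.0
mu = np.log(median)                       # mean of the underlying normal
sigma = (np.log(p95) - mu) / 1.645        # places the 95th percentile at 10

interspecies = rng.lognormal(mu, sigma, n)
intraspecies = rng.lognormal(mu, sigma, n)
composite = interspecies * intraspecies

print(np.percentile(composite, 95))       # ~51, well below the deterministic 100
```

Because the deterministic 10 sits at each factor's upper tail, the product's 95th percentile falls well below the deterministic 10 × 10 = 100; this is the sense in which probabilistic critics call the current practice overprotective, and the strong independence assumptions baked into such a simulation are exactly what the paper challenges.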

115

Dasymetric Modeling and Uncertainty

Dasymetric models increase the spatial resolution of population data by incorporating related ancillary data layers. The role of uncertainty in dasymetric modeling has not yet been fully addressed. Uncertainty is usually present because most population data are themselves uncertain, and/or the geographic processes that connect population and the ancillary data layers are not precisely known. A new dasymetric methodology - the Penalized Maximum Entropy Dasymetric Model (P-MEDM) - is presented that enables these sources of uncertainty to be represented and modeled. The P-MEDM propagates uncertainty through the model and yields fine-resolution population estimates with associated measures of uncertainty. This methodology contains a number of other benefits of theoretical and practical interest. In dasymetric modeling, researchers often struggle with identifying a relationship between population and ancillary data layers. The P-MEDM simplifies this step by unifying how ancillary data are included. The P-MEDM also allows a rich array of data to be included, with disparate spatial resolutions, attribute resolutions, and uncertainties. While the P-MEDM does not necessarily produce more precise estimates than do existing approaches, it does help to unify how data enter the dasymetric model, it increases the types of data that may be used, and it allows geographers to characterize the quality of their dasymetric estimates. We present an application of the P-MEDM that includes household-level survey data combined with higher spatial resolution data such as from census tracts, block groups, and land cover classifications.

Nagle, Nicholas N.; Buttenfield, Barbara P.; Leyk, Stefan; Speilman, Seth

2014-01-01

116

Classification images with uncertainty

Classification image and other similar noise-driven linear methods have found increasingly wider applications in revealing psychophysical receptive field structures or perceptual templates. These techniques are relatively easy to deploy, and the results are simple to interpret. However, being a linear technique, the utility of the classification-image method is believed to be limited. Uncertainty about the target stimuli on the part of an observer will result in a classification image that is the superposition of all possible templates for all the possible signals. In the context of a well-established uncertainty model, which pools the outputs of a large set of linear frontends with a max operator, we show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not be combined, as is conventionally done. We show that when the signal contrast is high, the subimages from the error trials contain a clear high-contrast image that is negatively correlated with the perceptual template associated with the presented signal, relatively unaffected by uncertainty. The subimages also contain a “haze” that is of a much lower contrast and is positively correlated with the superposition of all the templates associated with the erroneous response. In the case of spatial uncertainty, we show that the spatial extent of the uncertainty can be estimated from the classification subimages. We link intrinsic uncertainty to invariance and suggest that this signal-clamped classification-image method will find general applications in uncovering the underlying representations of high-level neural and psychophysical mechanisms.

Tjan, Bosco S.; Nandy, Anirvan S.

2009-01-01

117

NASA Astrophysics Data System (ADS)

The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of N user-specified convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty, as defined by the density field. The problem thus devolves into a minimization problem. Computation of such a spatial decomposition is O(N^2), and it can be computed iteratively, making it possible to update easily, and quickly, over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartograms familiar to the information visualization community for depicting quantities such as voting results per state.
Furthermore, one can dispense with the mesh or edges entirely, replacing them with symbols or glyphs at the generating points (effectively the centers of the polygons). This methodology readily admits rigorous statistical analysis using standard components found in R, and is thus entirely compatible with the visualization packages we use (VisIt and/or ParaView), the language we use (Python), and the UVCDAT environment that provides the programmer and analyst workbench. We will demonstrate the power and effectiveness of this methodology in climate studies. We will further argue that our method of defining (or predicting) values in a region has many advantages over the traditional visualization notion of value at a point.

Jones, P. W.; Strelitz, R. A.

2012-12-01

118

Conditionally valid uncertainty relations

NASA Astrophysics Data System (ADS)

It is shown that the well-defined unbiased measurement or disturbance of a dynamical variable is not maintained for the precise measurement of the conjugate variable, independently of uncertainty relations. The conditionally valid uncertainty relations on the basis of those additional assumptions, which include most of the familiar Heisenberg-type relations, thus become singular for the precise measurement. We clarify some contradicting conclusions in the literature concerning those conditionally valid uncertainty relations: The failure of a naive Heisenberg-type error-disturbance relation and the modified Arthurs-Kelly relation in the recent spin measurement is attributed to this singular behavior. The naive Heisenberg-type error-disturbance relation is formally preserved in quantum estimation theory, which is shown to be based on the strict unbiased measurement and disturbance, but it leads to unbounded disturbance for bounded operators such as spin variables. In contrast, the Heisenberg-type error-error uncertainty relation and the Arthurs-Kelly relation, as conditionally valid uncertainty relations, are consistently maintained.

Fujikawa, Kazuo

2013-07-01

119

Site uncertainty, allocation uncertainty, and superfund liability valuation

The amount and timing of a firm's ultimate financial obligation for contingent liabilities is uncertain and subject to the outcome of future events. We decompose uncertainty about Superfund contingent liabilities into two sources: (1) uncertainty regarding site clean-up cost (site uncertainty); and (2) uncertainty regarding allocation of total site-clean-up cost across multiple parties associated with the site (allocation uncertainty). We

Katherine Campbell; Stephan E. Sefcik; Naomi S. Soderstrom

1998-01-01

120

Measurement uncertainty relations

NASA Astrophysics Data System (ADS)

Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

2014-04-01

121

Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find specific principles unacceptably vague or see them as clearly

Ragnar E. Löfstedt; Baruch Fischhoff; Ilya R. Fischhoff

2002-01-01

122

An Examination of Procedural Justice Principles in China and the U.S.

This paper examines procedural justice principles from a cultural perspective, and examines the relationships between three dimensions of national culture (uncertainty avoidance, societal emphasis on collectivism, and gender egalitarianism), three principles of procedural justice (consistency, social sensitivity, and account-giving), and judgments of fairness. The results suggest that culture can influence employees' perceptions of the fairness of procedural justice principles; different

Jasmine Tata; Ping Ping Fu; Rongxian Wu

2003-01-01

123

Uncertainty and calibration analysis

All measurements contain some deviation from the true value which is being measured. In the common vernacular this deviation between the true value and the measured value is called an inaccuracy, an error, or a mistake. Since all measurements contain errors, it is necessary to accept that there is a limit to how accurate a measurement can be. The uncertainty interval, combined with the confidence level, is one measure of the accuracy of a measurement or value. Without a statement of uncertainty (or a similar parameter) it is not possible to evaluate whether the accuracy of the measurement, or data, is appropriate. The preparation of technical reports, calibration evaluations, and design calculations should consider the accuracy of measurements and data being used. There are many methods to accomplish this. This report provides a consistent method for the handling of measurement tolerances, calibration evaluations and uncertainty calculations. The SRS Quality Assurance (QA) Program requires that the uncertainty of technical data and instrument calibrations be acknowledged and estimated. The QA Program makes some specific technical requirements related to the subject but does not provide a philosophy or method on how uncertainty should be estimated. This report was prepared to provide a technical basis to support the calculation of uncertainties and the calibration of measurement and test equipment for any activity within the Experimental Thermal-Hydraulics (ETH) Group. The methods proposed in this report provide a graded approach for estimating the uncertainty of measurements, data, and calibrations. The method is based on the national consensus standard, ANSI/ASME PTC 19.1.

Coutts, D.A.

1991-03-01
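The root-sum-square combination of uncertainty components that underlies ANSI/ASME PTC 19.1-style analyses can be shown in a few lines. This is a deliberately simplified sketch; the standard itself treats systematic and random components with t-statistics and degrees of freedom, and the component values here are invented:

```python
import math

def combined_uncertainty(systematic, random, coverage=2.0):
    """Root-sum-square combination of independent uncertainty components,
    expanded by a coverage factor (k=2 for roughly 95% confidence)."""
    u = math.sqrt(sum(b ** 2 for b in systematic) + sum(s ** 2 for s in random))
    return coverage * u

# e.g. a gauge with a 0.3-unit calibration bias limit and 0.4 units of
# repeatability scatter (hypothetical numbers):
print(combined_uncertainty([0.3], [0.4]))   # ~1.0 (= 2 * 0.5)
```

Quoting the expanded interval together with its coverage factor is what lets a reader judge, per the report's argument, whether the accuracy of a measurement is appropriate for its use.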

124

NASA Technical Reports Server (NTRS)

An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

Brown, Laurie M.

1993-01-01

125

Optimising uncertainty in physical sample preparation.

Uncertainty associated with the result of a measurement can be dominated by the physical sample preparation stage of the measurement process. In view of this, the Optimised Uncertainty (OU) methodology has been further developed to allow the optimisation of the uncertainty from this source, in addition to that from the primary sampling and the subsequent chemical analysis. This new methodology for the optimisation of physical sample preparation uncertainty (u(prep), estimated as s(prep)) is applied for the first time, to a case study of myclobutanil in retail strawberries. An increase in expenditure (+7865%) on the preparatory process was advised in order to reduce the s(prep) by the 69% recommended. This reduction is desirable given the predicted overall saving, under optimised conditions, of 33,000 pounds Sterling per batch. This new methodology has been shown to provide guidance on the appropriate distribution of resources between the three principal stages of a measurement process, including physical sample preparation. PMID:16222372

Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger

2005-11-01

126

The propagation of uncertainty for humidity calculations

NASA Astrophysics Data System (ADS)

This paper addresses the international humidity community's need for standardization of methods for propagation of uncertainty associated with humidity generators and for handling uncertainty associated with the reference water vapour-pressure and enhancement-factor equations. The paper outlines uncertainty calculations for the mixing ratio, dew-point temperature and relative humidity output from humidity generators, and in particular considers controlling equations for a theoretical hybrid humidity generator combining single-pressure (1-P), two-pressure (2-P) and two-flow (2-F) principles. Also considered is the case where the humidity generator is used as a stable source with traceability derived from a reference hygrometer, i.e. a dew-point meter, a relative humidity meter or a wet-bulb psychrometer. Most humidity generators in use at national metrology institutes can be considered to be special cases of those considered here and sensitivity coefficients for particular types may be extracted. The ability to account for correlations between input variables and between different instances of the evaluation of the reference equations is discussed. The uncertainty calculation examples presented here are representative of most humidity calculations.

Lovell-Smith, J.

2009-12-01

127

Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

NASA Astrophysics Data System (ADS)

For a hundred years violation or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy dependent spacetime metric, M4±C4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

Amoroso, Richard L.

2010-12-01


129

Over the last decade, substantial development and progress have been made in the understanding of the nature of severe accidents and associated fission product release and transport. As part of this continuing effort, the United States Nuclear Regulatory Commission (USNRC) sponsored the development of the Source Term Code Package (STCP), which models core degradation, fission product release from the damaged fuel, and the subsequent migration of the fission products from the primary system to the containment and finally to the environment. The objectives of the QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors) program are: (1) to address the uncertainties associated with input parameters and phenomenological models used in the STCP; and (2) to define reasonable and technically defensible parameter ranges and modelling assumptions for use in the STCP. The uncertainties in the radiological releases to the environment can be defined as the degree of current knowledge associated with the magnitude, the timing, duration, and other pertinent characteristics of the release following a severe nuclear reactor accident. These uncertainties can be quantified by probability density functions (PDF) using the Source Term Code Package as the physical model. An attempt will also be made to address the phenomenological issues not adequately modeled by the STCP, using more advanced, mechanistic models.

Khatib-Rahbar, M.; Park, C.; Davis, R.; Nourbakhsh, H.; Lee, M.; Cazzoli, E.; Schmidt, E.

1986-10-01

130

Uncertainties and Error Propagation

NSDL National Science Digital Library

This item is a tutorial on Uncertainties and Error Propagation. Topics covered include Systematic versus Random Error, Determining Random Errors, Relative and Absolute error, Propagation of errors, Rounding answers properly, and Significant figures. A list of well illustrated problems are embedded throughout the tutorial.

Lindberg, Vern

2008-07-22
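The propagation-of-errors rules covered by the tutorial reduce to quadrature sums: absolute uncertainties add in quadrature for sums and differences, relative uncertainties for products and quotients. A minimal sketch with invented measurement values:

```python
import math

def propagate_product(x, dx, y, dy):
    """Uncertainty of f = x*y for independent errors: relative
    uncertainties add in quadrature."""
    f = x * y
    rel = math.sqrt((dx / x) ** 2 + (dy / y) ** 2)
    return f, f * rel

def propagate_sum(dx, dy):
    """Uncertainty of f = x + y: absolute uncertainties add in quadrature."""
    return math.sqrt(dx ** 2 + dy ** 2)

f, df = propagate_product(10.0, 0.1, 5.0, 0.05)
print(f, df)                      # 50.0, ~0.71
print(propagate_sum(0.3, 0.4))    # ~0.5
```

Rounding the result to match the propagated uncertainty (here 50.0 ± 0.7) is the significant-figures discipline the tutorial closes with.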

131

Principles of project management

NASA Technical Reports Server (NTRS)

The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

1982-01-01

132

Chemical Principles Exemplified

ERIC Educational Resources Information Center

Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

Plumb, Robert C.

1973-01-01

133

Questioning the Equivalence Principle.

National Technical Information Service (NTIS)

The Equivalence Principle (EP) is not one of the 'universal' principles of physics (like the Action Principle). It is a heuristic hypothesis which was introduced by Einstein in 1907, and used by him to construct his theory of General Relativity. In modern...

T. Damour

2001-01-01

134

Principles of plasma diagnostics

Principles of Plasma Diagnostics provides a detailed derivation and discussion of the plasma physics principles on which diagnostics are based, including magnetic measurements, electric probes, refractive index, radiation emission and scattering, and ionic processes. The text is based on first-principles development of the required concepts and includes examples of diagnostics in action taken from fusion research.

Ian H. Hutchinson

1987-01-01

135

Chemical Principles Exemplified

ERIC Educational Resources Information Center

This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

Plumb, Robert C.

1970-01-01

136

Reconsidering Archimedes' Principle

Archimedes' principle as stated originally by Archimedes and in modern texts can lead to an incorrect prediction if the submerged object is in contact with a solid surface. In this paper we look experimentally at a submerged object and show that though the theoretical explanations of the principle are valid, the statement of the principle needs clarification.

Jeffrey Bierman; Eric Kincanon

2003-01-01

137

The ability of a reaction model to predict the combustion behavior of a fuel relies on the rigorous quantification of the kinetic rate parameter uncertainty. Although the accuracy of a detailed kinetic model may be ensured, in principle, by a multi-parameter optimization, the inherent uncertainties in the fundamental combustion targets used for optimization cause the resulting optimized model to be

David A. Sheen; Xiaoqing You; Hai Wang; Terese Løvås

2009-01-01

138

Driving Toward Guiding Principles

As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate.

Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

1999-01-01

139

NASA Astrophysics Data System (ADS)

Juan G. Roederer raises some difficult questions in his Forum article “The Challenge of Global Change” (Eos, Sept. 18, 1990): “How can we sustain a public sense of the common danger of global change while remaining honest in view of the realities of scientific uncertainties? How can we nurture this sense of common danger without making statements based on half-baked ideas, statistically unreliable results, or oversimplified models? How can we strike a balance between the need to overstate a case to attract the attention of the media and the obligation to adhere strictly to the ethos of science?”There are no easy answers to these questions since debate and uncertainty characterize most scientific advancements. Also, it is often very difficult in the early stages of those advancements to determine which ideas are “half-baked,” which results are “statistically unreliable,” or which models are “oversimplified.”

Walker, Daniel A.

140

Majorization entropic uncertainty relations

NASA Astrophysics Data System (ADS)

Entropic uncertainty relations in a finite-dimensional Hilbert space are investigated. Making use of the majorization technique we derive explicit lower bounds for the sum of Rényi entropies describing probability distributions associated with a given pure state expanded in eigenbases of two observables. Obtained bounds are expressed in terms of the largest singular values of submatrices of the unitary rotation matrix. Numerical simulations show that for a generic unitary matrix of size N = 5, our bound is stronger than the well-known result of Maassen and Uffink (MU) with a probability larger than 98%. We also show that the bounds investigated are invariant under the dephasing and permutation operations. Finally, we derive a classical analogue of the MU uncertainty relation, which is formulated for stochastic transition matrices. Dedicated to Iwo Białynicki-Birula on the occasion of his 80th birthday.

Puchała, Zbigniew; Rudnicki, Łukasz; Życzkowski, Karol

2013-07-01
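The Maassen-Uffink (MU) bound that this paper strengthens is easy to evaluate for a small example. The sketch below uses a single qubit measured in the σz and σx eigenbases, with the Hadamard matrix as the connecting unitary; this choice of example is ours, not the N = 5 case studied in the paper:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

# Unitary relating the two measurement eigenbases (qubit Hadamard)
U = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# Maassen-Uffink: H(p) + H(q) >= -2*log2( max_jk |U_jk| )
bound = -2 * np.log2(np.max(np.abs(U)))
print(bound)                        # ~1.0 bit

psi = np.array([1.0, 0.0])          # state |0>
p = np.abs(psi) ** 2                # sigma_z outcome probabilities
q = np.abs(U.conj().T @ psi) ** 2   # sigma_x outcome probabilities
print(shannon(p) + shannon(q))      # ~1.0, saturating the bound
```

For this state the entropy sum exactly meets the MU bound; the paper's majorization bounds are constructed to beat this value for most larger unitaries.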

141

Reachability under uncertainty

The paper studies the problem of reachability for linear systems in the presence of uncertain (unknown but bounded) input disturbances, which may also be interpreted as the action of an adversary in a game-theoretic setting. It defines possible notions of reachability under uncertainty emphasizing the differences between open-loop and closed-loop control. Solution schemes for calculating reachability sets are indicated. The

A. B. Kurzhanski; P. Varaiya

2002-01-01

142

Calibration Under Uncertainty.

This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.

Swiler, Laura Painton; Trucano, Timothy Guy

2005-03-01
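The contrast drawn here between plain squared-difference minimization and calibration that carries measurement error can be made concrete. Below is a sketch of weighted least squares for a one-parameter linear model that also reports a parameter error bar; it is a deliberately simplified stand-in for the Bayesian CUU treatment, and the model, data, and error bars are invented:

```python
import numpy as np

def calibrate(x, y, sigma):
    """Weighted least-squares estimate of theta in y = theta*x, with a
    1-sigma parameter uncertainty derived from the measurement error bars."""
    w = 1.0 / sigma ** 2
    theta = np.sum(w * x * y) / np.sum(w * x ** 2)
    theta_sd = 1.0 / np.sqrt(np.sum(w * x ** 2))
    return theta, theta_sd

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x                     # noise-free data, for illustration only
sigma = np.full(4, 0.5)         # stated experimental error bars
theta, sd = calibrate(x, y, sigma)
print(theta, sd)                # 2.0 with a ~0.09 error bar
```

A full CUU analysis would additionally put a prior on theta and a discrepancy term on the model itself, yielding a posterior rather than a point estimate with a single standard error.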

143

Uncertainty Propagation Through Computer Codes.

National Technical Information Service (NTIS)

The propagation of uncertainty through computer codes has important applications in reactor safety studies. This study reviews a class of techniques for uncertainty propagation and selects the Response Surface and Monte Carlo techniques for further invest...

D. A. Dahlgren; G. P. Steck; R. G. Easterling; R. L. Iman

1978-01-01
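The Monte Carlo technique selected in this study amounts to sampling the uncertain inputs and pushing each sample through the code. A sketch with a trivial stand-in for an expensive simulation code; the model and the input distributions are hypothetical:

```python
import numpy as np

def code(k, q):
    """Stand-in for an expensive simulation code (hypothetical physics)."""
    return q / k        # e.g. a steady-state temperature rise

rng = np.random.default_rng(42)
n = 100_000

# Uncertain inputs: a conductivity k and a heat load q
k = rng.normal(10.0, 0.5, n)
q = rng.normal(100.0, 5.0, n)

out = code(k, q)
lo, hi = np.percentile(out, [2.5, 97.5])
print(out.mean(), lo, hi)   # mean near 10, with an empirical ~95% interval
```

The response-surface technique also reviewed in the report replaces `code` with a cheap fitted surrogate before sampling, trading some fidelity for far fewer code runs.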

144

Uncertainty and Labor Contract Durations.

National Technical Information Service (NTIS)

This paper provides an empirical investigation into the relationship between ex ante U.S. labor contract durations and uncertainty over the period 1970 to 1995. We construct measures of inflation uncertainty as well as aggregate nominal and real uncertain...

R. Rich; J. Tracy

2000-01-01

145

Predictive Uncertainty in Hydrological Forecasting

NASA Astrophysics Data System (ADS)

This work aims at discussing the role and the relevance of "predictive uncertainty" in flood forecasting and water resources management. Predictive uncertainty is here defined as the probability of occurrence of a future value of a predictand (such as water level, discharge or water volume) conditional on prior observations and knowledge as well as on all the information we can obtain on that specific future value, which is typically embodied in one or more hydrological/hydraulic model forecasts. The aim of this work is also to clarify questions such as: What is the conceptual difference between "total model uncertainty" (commonly used when dealing with model verification) and predictive uncertainty (which is used when forecasting into the future)? What is the difference between uncertainty in models, parameters, input/output measurements, and initial and boundary conditions on the one hand, and predictive uncertainty on the other? How can one incorporate all these uncertainties into the predictive uncertainty and, most of all, is it really necessary? The presently available uncertainty processors are then introduced and compared on the basis of their relative performances using operational flood forecasting systems. The uncertainty processors can be continuous (Hydrologic Uncertainty Processor, Bayesian Model Averaging, Model Conditional Processor, etc.) or binary (Logistic Regression, Binary Multivariate Bayesian Processor, etc.) depending on the scope for which they are developed and the type of decision one must take. Finally, the benefits of incorporating predictive uncertainty into the decision-making process will be compared, on actual real-world derived examples, to the ones obtainable when using deterministic forecasts, as currently done in practice.

Todini, E.

2009-04-01

146

Improvement of Statistical Decisions under Parametric Uncertainty

NASA Astrophysics Data System (ADS)

A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

2011-10-01

147

Architecture Principle Specifications

This chapter is concerned with the specification of architecture principles. The focus of this chapter is on the specification itself, and not on the process of specifying. It shows how architectural information, such as architecture principles, can be classified in multiple dimensions. Specifically, architecture principles can be classified along the dimensions: type of information, scope, genericity, detail level, stakeholder, transformation,

Danny Greefhorst; Erik Proper

148

Uncertainty and precaution in environmental management.

In this paper, two different visions of the relationship between science and policy are contrasted with one another: the "modern" vision and the "precautionary" vision. Conditions which must apply in order to invoke the Precautionary Principle are presented, as are some of the main challenges posed by the principle. The following central question remains: If scientific certainty cannot be provided, what may then justify regulatory interventions, and what degree of intervention is justifiable? The notion of "quality of information" is explored, and it is emphasized that there can be no absolute definition of good or bad quality. Collective judgments of quality are only possible through deliberation on the characteristics of the information, and on the relevance of the information to the policy context. Reference to a relative criterion therefore seems inevitable and legal complexities are to be expected. Uncertainty is presented as a multidimensional concept, reaching far beyond the conventional statistical interpretation of the concept. Of critical importance is the development of methods for assessing qualitative categories of uncertainty. Model quality assessment should observe the following rationale: identify a model that is suited to the purpose, yet bears some reasonable resemblance to the "real" phenomena. In this context, "purpose" relates to the policy and societal contexts in which the assessment results are to be used. It is therefore increasingly agreed that judgment of the quality of assessments necessarily involves the participation of non-modellers and non-scientists. A challenging final question is: How to use uncertainty information in policy contexts? More research is required in order to answer this question. PMID:16304928

Krayer von Krauss, M; van Asselt, M B A; Henze, M; Ravetz, J; Beck, M B

2005-01-01

149

Instructional Software Design Principles.

ERIC Educational Resources Information Center

Discusses learner/computer interaction, learner control, sequencing of instructional events, and graphic screen design as effective principles for the design of instructional software, including tutorials. (MBR)

Hazen, Margret

1985-01-01

150

Satellite altitude determination uncertainties

NASA Technical Reports Server (NTRS)

Satellite altitude determination uncertainties will be discussed from the standpoint of the GEOS-C satellite, from the longer range viewpoint afforded by the Geopause concept. Data are focused on methods for short-arc tracking which are essentially geometric in nature. One uses combinations of lasers and collocated cameras. The other method relies only on lasers, using three or more to obtain the position fix. Two typical locales are looked at, the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

Siry, J. W.

1972-01-01

151

Picturing Data With Uncertainty

NASA Technical Reports Server (NTRS)

NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness provides another useful summary for multimodal distributions.
The uncertainty of the multi-valued data can also be interpreted from the number of peaks and the widths of the peaks, as shown by the PDF walls.
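
The per-cell density estimation described above can be sketched with a plain Gaussian kernel density estimate; the sample values, bandwidth, and evaluation grid below are hypothetical, and a real PDF wall would repeat this for every cell along the selected slice:

```python
import math

def gaussian_kde(samples, bandwidth):
    """Smooth density estimate from the values observed at one grid cell."""
    n = len(samples)
    def pdf(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples) / (n * bandwidth * math.sqrt(2 * math.pi))
    return pdf

def count_peaks(pdf, xs):
    """'Roughness': number of local maxima of the density on a grid."""
    ys = [pdf(x) for x in xs]
    return sum(1 for i in range(1, len(ys) - 1)
               if ys[i] > ys[i - 1] and ys[i] > ys[i + 1])

# A bimodal cell: two clusters of plausible outcomes at one map location.
cell = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
pdf = gaussian_kde(cell, bandwidth=0.4)
xs = [i * 0.1 for i in range(-10, 71)]   # evaluation grid from -1.0 to 7.0
roughness = count_peaks(pdf, xs)         # → 2
```

A histogram of the same six values would be jagged and bin-dependent; the smooth estimate makes the two modes, and hence the roughness summary, unambiguous.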

Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

2004-01-01

152

Schwarzschild mass uncertainty

NASA Astrophysics Data System (ADS)

Applying Dirac's procedure to -dependent constrained systems, we derive a reduced total Hamiltonian, resembling an upside-down harmonic oscillator, which generates the Schwarzschild solution in the mini super-spacetime. Associated with the now -dependent Schrödinger equation is a tower of localized Guth-Pi-Barton wave packets, orthonormal and non-singular, admitting equally spaced average-"energy" levels. Our approach is characterized by a universal quantum mechanical uncertainty structure which enters the game already at the flat spacetime level, and accompanies the massive Schwarzschild sector for any arbitrary mean mass. The average black hole horizon surface area is linearly quantized.

Davidson, Aharon; Yellin, Ben

2014-02-01

153

Participatory Development Principles and Practice: Reflections of a Western Development Worker.

ERIC Educational Resources Information Center

Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

Keough, Noel

1998-01-01

154

Probabilistic Mass Growth Uncertainties

NASA Technical Reports Server (NTRS)

Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of the masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

Plumer, Eric; Elliott, Darren

2013-01-01

155

Antarctic Photochemistry: Uncertainty Analysis

NASA Technical Reports Server (NTRS)

Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid to low latitude clean air. One reason is the lower temperatures which result in increased imprecision in kinetic data, assumed to be best characterized at 298K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.

Stewart, Richard W.; McConnell, Joseph R.

1999-01-01

156

Uncertainty in adaptive capacity

NASA Astrophysics Data System (ADS)

The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).

Adger, W. Neil; Vincent, Katharine

2005-03-01

157

NASA Astrophysics Data System (ADS)

The anthropic principle states that the fact of existence of intelligent beings may be a valid explanation of why the universe and laws of physics are as they are. The origin and some of the deeper implications of the principle are investigated. The discussion involves considerations of physics and metaphysics, unified schemes and holism, the nature of physical explanation, realism and idealism, and symmetry.

Rosen, Joe

1985-04-01

158

NSDL National Science Digital Library

This tutorial provides instruction on Pauli's exclusion principle, formulated by physicist Wolfgang Pauli in 1925, which states that no two electrons in an atom can have identical quantum numbers. Topics include a mathematical statement of the principle, descriptions of some of its applications, and its role in ionic and covalent bonding, nuclear shell structure, and nuclear binding energy.

Nave, Rod

159

Hamilton's Principle for Beginners

ERIC Educational Resources Information Center

I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…

Brun, J. L.

2007-01-01

160

This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by

John-Stewart Gordon

2011-01-01

161

Assessment Principles and Tools

The goal of ophthalmology residency training is to produce competent ophthalmologists. Competence can only be determined by appropriately assessing resident performance. There are accepted guiding principles that should be applied to competence assessment methods. These principles are enumerated herein and ophthalmology-specific assessment tools that are available are described.

Golnik, Karl C.

2014-01-01

162

ERIC Educational Resources Information Center

A key aspect of the acquisition of grammar for second language learners involves learning how to make appropriate connections between grammatical forms and the meanings which they typically signal. We argue that learning form/function mappings involves three interrelated principles. The first is the Given-to-New Principle, where existing world…

Batstone, Rob; Ellis, Rod

2009-01-01

163

Variation Principles of Hydrodynamics

It is shown that the Lagrangian equations for the motion of both incompressible and compressible fluids can be derived from variation principles. As has been pointed out by C. C. Lin, an important feature of these principles is the boundary condition: The coordinates of each particle (and not merely the normal component of its velocity) must be specified. A systematic

Carl Eckart

1960-01-01

164

Group environmental preference aggregation: the principle of environmental justice

The aggregation of group environmental preference presents a challenge of principle that has not, as yet, been satisfactorily met. One such principle, referred to as an environmental justice, is established based on a concept of social justice and axioms for rational choice under uncertainty. It requires that individual environmental choices be so decided that their supporters will least mind being anyone at random in the new environment. The application of the principle is also discussed. Its only information requirement is a ranking of alternative choices by each interested party. 25 references.

Davos, C.A.

1986-01-01

165

Mach's principle is the proposition that inertial frames are determined by matter. We put forth and implement a precise correspondence between matter and geometry that realizes Mach's principle. Einstein's equations are not modified and no selection principle is applied to their solutions; Mach's principle is realized wholly within Einstein's general theory of relativity. The key insight is the observation that, in addition to bulk matter, one can also add boundary matter. Given a space-time, and thus the inertial frames, we can read off both boundary and bulk stress tensors, thereby relating matter and geometry. We consider some global conditions that are necessary for the space-time to be reconstructible, in principle, from bulk and boundary matter. Our framework is similar to that of the black hole membrane paradigm and, in asymptotically anti-de Sitter space-times, is consistent with holographic duality.

Khoury, Justin [Perimeter Institute for Theoretical Physics, 31 Caroline St. N., Waterloo, Ontario, Canada N2L 2Y5 (Canada); Center for Particle Cosmology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States); Parikh, Maulik [Institute for Strings, Cosmology, and Astroparticle Physics, Columbia University, New York, New York 10027 (United States); Inter-University Centre for Astronomy and Astrophysics, Post Bag 4, Pune 411007 (India)

2009-10-15

166

NASA Technical Reports Server (NTRS)

The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed constructions, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin-film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation and non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.

Zuk, J.

1976-01-01

167

Earthquake Loss Estimation Uncertainties

NASA Astrophysics Data System (ADS)

The paper addresses reliability issues in loss assessment following strong earthquakes, with application of worldwide systems in emergency mode. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and the offer of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of the disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not beyond the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage to the buildings is not precisely known, by far.
The paper analyzes the influence on the reliability of expected loss estimates at regional and global scales of uncertainties in strong-event parameters determined by Alert Seismological Surveys; in the simulation models used at all stages, from estimating shaking intensity to assessing damage to different elements at risk; and in the databases on elements at risk, such as population and building-stock distribution and the characteristics of critical facilities.

Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

2013-04-01

168

Uncertainty in Seismic Hazard Assessment

NASA Astrophysics Data System (ADS)

Uncertainty is a part of our life, and society has to deal with it, even though it is sometimes difficult to estimate. This is particularly true in seismic hazard assessment for large events, such as the mega-tsunami in Southeast Asia and the great New Madrid earthquakes in the central United States. There are two types of uncertainty in seismic hazard assessment: temporal and spatial. Temporal uncertainty describes distribution of the events in time and is estimated from the historical records, while spatial uncertainty describes distribution of physical measurements generated at a specific point by the events and is estimated from the measurements at the point. These uncertainties are of different characteristics and generally considered separately in hazard assessment. For example, temporal uncertainty (i.e., the probability of exceedance in a period) is considered separately from spatial uncertainty (a confidence level of physical measurement) in flood hazard assessment. Although estimating spatial uncertainty in seismic hazard assessment is difficult because there are not enough physical measurements (i.e., ground motions), it can be supplemented by numerical modeling. For example, the ground motion uncertainty or tsunami uncertainty at a point of interest has been estimated from numerical modeling. Estimating temporal uncertainty is particularly difficult, especially for large earthquakes, because there are not enough instrumental, historical, and geological records. Therefore, the temporal and spatial uncertainties in seismic hazard assessment are of different characteristics and should be determined separately. Probabilistic seismic hazard analysis (PSHA), the most widely used method to assess seismic hazard for various aspects of public and financial policy, uses spatial uncertainty (ground motion uncertainty) to extrapolate temporal uncertainty (ground motion occurrence), however. 
This extrapolation, or so-called ergodic assumption, is caused by a mathematical error in hazard calculation of PSHA: incorrectly equating the conditional exceedance probability of the ground-motion attenuation relationship (a function) to the exceedance probability of the ground-motion uncertainty (a variable). An alternative approach has been developed to correct the error and to determine temporal and spatial uncertainties separately.
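
The temporal uncertainty discussed above is conventionally summarized as a probability of exceedance over an exposure window. As an illustrative sketch (assuming Poisson arrivals, the usual working assumption; the numbers are the standard building-code convention, not taken from this abstract):

```python
import math

# If large events recur on average every T years (Poisson arrivals), the
# probability of at least one event in a window of t years is 1 - exp(-t/T).
def prob_exceedance(recurrence_years, window_years):
    return 1.0 - math.exp(-window_years / recurrence_years)

# The common "10% in 50 years" design level corresponds to roughly a
# 475-year mean recurrence interval.
p = prob_exceedance(475.0, 50.0)   # ≈ 0.10
```

This quantity is purely temporal; the author's point is that it should be kept separate from the spatial (ground-motion) uncertainty rather than merged with it via the ergodic assumption.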

Wang, Z.

2006-12-01

169

Integrating out astrophysical uncertainties

Underground searches for dark matter involve a complicated interplay of particle physics, nuclear physics, atomic physics, and astrophysics. We attempt to remove the uncertainties associated with astrophysics by developing the means to map the observed signal in one experiment directly into a predicted rate at another. We argue that it is possible to make experimental comparisons that are completely free of astrophysical uncertainties by focusing on integral quantities, such as g(v_min) = ∫_{v_min} dv f(v)/v and ∫_{v_thresh} dv v g(v). Direct comparisons are possible when the v_min spaces probed by different experiments overlap. As examples, we consider the possible dark matter signals at CoGeNT, DAMA, and CRESST-Oxygen. We find that the expected rate from CoGeNT in the XENON10 experiment is higher than observed, unless scintillation light output is low. Moreover, we determine that S2-only analyses are constraining, unless the charge yield Q_y < 2.4 electrons/keV. For DAMA to be consistent with XENON10, we find for q_Na = 0.3 that the modulation rate must be extremely high (≳ 70% for m_χ = 7 GeV), while for higher quenching factors, it makes an explicit prediction (0.8-0.9 cpd/kg) for the modulation to be observed at CoGeNT. Finally, we find that CDMS-Si, even with a 10 keV threshold, as well as XENON10, even with low scintillation, would have seen significant rates if the excess events at CRESST arise from elastic WIMP scattering, making it very unlikely to be the explanation of this anomaly.
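
To make the integral quantity concrete, a small numerical sketch (using an illustrative truncated Maxwellian for f(v); the parameter values are hypothetical, not the paper's) shows that g(v_min) is monotonically decreasing in v_min, which is what allows experiments probing overlapping v_min ranges to be compared directly:

```python
import math

# Toy Maxwellian speed distribution f(v) ∝ v^2 exp(-v^2 / v0^2),
# truncated at v_esc and normalized numerically (illustrative values).
v0, v_esc = 220.0, 550.0   # km/s

def f_unnorm(v):
    return v * v * math.exp(-(v / v0) ** 2)

def trapz(fn, a, b, n=2000):
    """Simple trapezoid-rule quadrature."""
    h = (b - a) / n
    return h * (0.5 * fn(a) + sum(fn(a + i * h) for i in range(1, n)) + 0.5 * fn(b))

norm = trapz(f_unnorm, 0.0, v_esc)

def g(v_min):
    """g(v_min) = ∫_{v_min} dv f(v)/v, the halo integral in the rate formula."""
    return trapz(lambda v: f_unnorm(v) / v, v_min, v_esc) / norm

halo_100, halo_300, halo_500 = g(100.0), g(300.0), g(500.0)
```

Because g only ever loses contributions as v_min grows, a measured rate at one threshold bounds the rate at any higher threshold, independently of the (uncertain) shape of f(v).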

Fox, Patrick J. [Theoretical Physics Department, Fermilab, Batavia, Illinois 60510 (United States); School of Natural Sciences, Institute for Advanced Study, Einstein Drive, Princeton, New Jersey 08540 (United States); Liu Jia [Center for Cosmology and Particle Physics, Department of Physics, New York University, New York, New York 10003 (United States); Weiner, Neal [Center for Cosmology and Particle Physics, Department of Physics, New York University, New York, New York 10003 (United States); School of Natural Sciences, Institute for Advanced Study, Einstein Drive, Princeton, New Jersey 08540 (United States)

2011-05-15

170

Exploring Uncertainty with Projectile Launchers

NASA Astrophysics Data System (ADS)

The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between random and systematic uncertainties, or to compare the uncertainties associated with different techniques. In this paper, we describe an experiment suitable for an introductory college-level (or advanced high school) course that uses velocity measurements to clearly show students the effects of both random and systematic uncertainties.
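
The distinction the authors teach can be simulated directly: averaging many trials beats down random scatter but leaves a systematic offset intact. The speed, bias, and noise values below are hypothetical, chosen only to illustrate the effect:

```python
import random

random.seed(42)

true_v = 4.43   # hypothetical launch speed (m/s)
bias = 0.15     # systematic offset of one measurement technique (m/s)
noise = 0.05    # random scatter per trial (m/s)

# 1000 repeated measurements with both error types present.
trials = [true_v + bias + random.gauss(0.0, noise) for _ in range(1000)]
mean_v = sum(trials) / len(trials)

# The mean converges near true_v + bias, not true_v: more repetitions
# shrink the random uncertainty but cannot reveal the systematic one.
```

Only a second, independent technique (as in the paper's velocity comparisons) exposes the bias, since the two methods disagree by more than their random spreads allow.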

Orzel, Chad; Reich, Gary; Marr, Jonathan

2012-12-01

171

Uncertainties in the Astronomical Ephemeris as Constraints on New Physics

NASA Astrophysics Data System (ADS)

Most extensions of the standard model of particle physics predict composition-dependent violations of the universality of free fall (equivalence principle). We test this idea using observational uncertainties in mass, range and mean motion for the Moon and planets, as well as orbit uncertainties for Trojan asteroids and Saturnian satellites. For suitable pairs of solar-system bodies, we derive linearly independent constraints on relative difference in gravitational and inertial mass from modifications to Kepler's third law, the migration of stable Lagrange points, and orbital polarization (the Nordtvedt effect). These constraints can be combined with data on bulk composition to extract limits on violations of the equivalence principle for individual elements relative to one another. These limits are weaker than those from laboratory experiments, but span a much larger volume in composition space.

Warecki, Zoey; Overduin, J.

2014-01-01

172

Implicit learning of arithmetic principles

Past research has investigated children's knowledge of arithmetic principles over development. However, little is known about the mechanisms involved in acquiring principle knowledge. We hypothesize that experience with equations that violate a to-be-learned principle will lead to changes in equation encoding, which in turn will promote acquisition of principle knowledge. Adults' knowledge of an arithmetic principle was evaluated before and

Richard W. Prather; Martha W. Alibali

2008-01-01

173

Maximum predictive power and the superposition principle

NASA Technical Reports Server (NTRS)

In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.

Summhammer, Johann

1994-01-01

174

Analysis of Infiltration Uncertainty

The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the ''Total System Performance Assessment-License Application Methods and Approach'' (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein, and it is based on the use of the models developed in or for ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]).
For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the simulated multi-rectangular region approximating the repository footprint, shown in Figure 1-1. (For brevity, these maps will be referred to as the analog maps, and the corresponding averaged net infiltration values as the analog values.)

R. McCurley

2003-10-27

175

National Technical Information Service (NTIS)

Explains the basic principles of dc motor operation by a progressive development of magnetic fields and shows how a current-carrying device acts when placed in these fields. Explains the positions of maximum and minimum torque.

1994-01-01

176

Chemical Principles Exemplified

ERIC Educational Resources Information Center

Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

Plumb, Robert C.

1972-01-01

177

Principles of plasma electrodynamics

Principles of the linear electrodynamics of uniform plasma in thermodynamic equilibrium are presented. Linear electromagnetic effects in nonequilibrium, spatially nonuniform plasma (plasma equilibrium theory) are studied. Methods for analysis of nonlinear electrodynamic processes in plasma are described.

A. F. Aleksandrov; L. S. Bogdankevich; A. A. Rukhadze

1978-01-01

178

Principles of Nonlinear Optics.

National Technical Information Service (NTIS)

This report contains a summary of essential principles of nonlinear optics such as optical bistability, phase conjugation, and harmonic generation. The origins of nonlinearity are described, tracing it back to its manifestation as the modification of the ...

P. P. Banerjee

1989-01-01

179

Global ethics and principlism.

This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together. PMID:22073817

Gordon, John-Stewart

2011-09-01

180

Buoyancy and Archimedes Principle

NSDL National Science Digital Library

Summary Buoyancy is based on Archimedes' Principle which states that the buoyant force acting upward on an object completely or partially immersed in a fluid equals the weight of the fluid displaced by the ...

181

Archimedes' Principle in Action

ERIC Educational Resources Information Center

The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

Kires, Marian

2007-01-01

182

How Uncertainty Bounds the Shape Index of Simple Cells

We propose a theoretical motivation to quantify actual physiological features, such as the shape index distributions measured by Jones and Palmer in cats and by Ringach in macaque monkeys. We will adopt the uncertainty principle associated with the task of detection of position and orientation as the main tool to provide quantitative bounds on the family of simple cells concretely implemented in primary visual cortex. Mathematics Subject Classification (2010): 62P10, 43A32, 81R15.

2014-01-01

183

NASA Astrophysics Data System (ADS)

A study to find the essential and important matters which can affect the reliable, uninterrupted operation of telecommunications power supply systems and to suggest an optimal uncertainty management scheme is reported. The main goal was to find simple, practical, but effective methods on which uncertainty management and the implementation of its tools can be based. Uncertainty management ensures that there is enough reserve energy and minimizes additional uncertainties. It turned out that an optimal solution can be obtained by means of intelligent supervision, control, and alarm facilities that utilize human reasoning methodology and uncertainty-minimization principles.

Suntio, Teuvo

1992-01-01

184

Principles of Forecasting Project

NSDL National Science Digital Library

Directed by J. Scott Armstrong at the Wharton School of the University of Pennsylvania, the Principles of Forecasting Project seeks to "develop a comprehensive and structured review of the state of knowledge in the field of forecasting" in order to aid future research. The project will lead to a book entitled Principles of Forecasting: A Handbook for Researchers and Practitioners, and sample chapters, contact information, updates, and links to forecasting resources add value to this expanding compilation.

185

Summary In 1954, while reviewing the theory of communication and cybernetics, the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers we found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle.

H. A. Fatmi; G. Resconi

1988-01-01

186

Buoyancy: Archimedes Principle

NSDL National Science Digital Library

This site describes buoyancy (the difference between the upward and downward forces acting on the bottom and the top of an object) and the Archimedes Principle, which states that the buoyant force on a submerged object is equal to the weight of the fluid that is displaced by it. It consists of text descriptions of these principles, using the examples of metal cubes suspended in water and hot air balloons in the atmosphere. Mathematical word problems are included.
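The principle summarized above lends itself to a short numerical sketch; the densities, volume, and function names below are illustrative assumptions, not taken from the site:

```python
# Archimedes' principle: buoyant force = weight of displaced fluid,
# F_b = rho_fluid * V_displaced * g. Values are illustrative assumptions.

RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

def buoyant_force(volume_displaced_m3, rho_fluid=RHO_WATER, g=G):
    """Weight of the displaced fluid, in newtons."""
    return rho_fluid * volume_displaced_m3 * g

def floats(mass_kg, volume_m3, rho_fluid=RHO_WATER):
    """An object floats when its average density is below the fluid's."""
    return (mass_kg / volume_m3) < rho_fluid

# A 1-litre (0.001 m^3) cube fully submerged in water:
f_b = buoyant_force(0.001)  # 9.81 N upward
```

A 1-litre cube of metal (mass well above 1 kg) therefore sinks, while the same volume of cork floats, since only its average density relative to the fluid matters.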

187

Pandemic influenza: certain uncertainties

SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them.

Morens, David M.; Taubenberger, Jeffery K.

2011-01-01

188

Uncertainties in Arctic Precipitation

NASA Astrophysics Data System (ADS)

Arctic precipitation is riddled with measurement biases; addressing the problem is imperative. Our study focuses on comparing various datasets for the region of Siberia, analyzing their biases, and the caution needed when using them. Five sources of data were used: NOAA's products (RAW and Bogdanova's correction), Yang's correction technique, and two reanalysis products (ERA-Interim and NCEP). The reanalysis datasets performed better for some months in comparison to Yang's product, which tends to overestimate precipitation, and the raw dataset, which tends to underestimate. The sources of bias vary from topography to wind to missing data. The final three products chosen show higher biases during the winter and spring seasons. Emphasis on equations which incorporate blizzards, blowing snow, and higher wind speeds is necessary for regions influenced by any or all of these factors; Bogdanova's correction technique is the most robust of all the datasets analyzed and gives the most reasonable results. One of our future goals is to analyze the impact of precipitation uncertainties on water budget analysis for the Siberian rivers.

Majhi, I.; Alexeev, V. A.; Cherry, J. E.; Cohen, J. L.; Groisman, P. Y.

2012-12-01

189

The relevance of extrinsic uncertainty

Extrinsic uncertainty is effective at a competitive equilibrium. This is generic if spot markets are inoperative: the only objects of exchange are assets for the contingent delivery of commodities; and the asset market is incomplete. The structure of payoffs of assets may allow for non-trivial allocations invariant with respect to the extrinsic uncertainty, and the economy with a complete asset

Heracles POLEMARCHAKIS; Luigi VENTURA

1995-01-01

190

The relevance of extrinsic uncertainty

Extrinsic uncertainty is effective at a competitive equilibrium. This is generically the case if commodities are exchanged indirectly, through the exchange of assets, spot markets are inoperative, while the asset market is incomplete. The structure of payoffs of assets may allow for non-trivial allocations invariant with respect to the extrinsic uncertainty, and the economy with a complete asset

Heracles M. POLEMARCHAKIS; Luigi VENTURA

2000-01-01

191

Housing Uncertainty and Childhood Impatience

ERIC Educational Resources Information Center

The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

2011-01-01

192

Uncertainty modeling: Examples and issues

New techniques in uncertainty analysis are illustrated with simplified examples from recent applications of interest for safety science. The applications are: BLEVE (boiling liquid expanding vapor explosions) models, satellite life distributions, dispersion coefficients and modeling contamination in food chains. The examples illustrate the importance of carefully modeling dependence in uncertainties. They also illustrate new uses of conditional Monte Carlo sampling.
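The importance of carefully modeling dependence in uncertainties can be sketched with a small Monte Carlo experiment; the correlation value, sample size, and function names are illustrative assumptions, not from the paper:

```python
import random

def sample_sum_variance(rho, n=20000, seed=1):
    """Monte Carlo estimate of Var(X + Y) for standard normals X, Y
    with correlation rho (Y built as rho*X + sqrt(1 - rho^2)*Z)."""
    rng = random.Random(seed)
    total = total_sq = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)
        z = rng.gauss(0.0, 1.0)
        y = rho * x + (1.0 - rho * rho) ** 0.5 * z
        s = x + y
        total += s
        total_sq += s * s
    mean = total / n
    return total_sq / n - mean * mean

# Analytically Var(X+Y) = 2*(1 + rho): treating correlated inputs as
# independent (rho = 0) understates the uncertainty of the sum.
v_indep = sample_sum_variance(0.0)  # near 2
v_corr = sample_sum_variance(0.8)   # near 3.6
```

The same effect, with dependence ignored versus modeled, is what drives the differences the abstract alludes to in the BLEVE and food-chain applications.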

Roger M. Cooke

1997-01-01

193

Uncertainty determination in QXAFS measurements

NASA Astrophysics Data System (ADS)

Measured uncertainties in QEXAFS measurements are determined using an oversampling technique. The output of an ionization chamber is read through a current amplifier using a sampling analog to digital converter. By oversampling the data, uncertainties are determined simultaneously with data collection.
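A minimal sketch of how oversampling yields an uncertainty alongside the measured value; the signal level, noise, and sample count here are hypothetical, not from the paper:

```python
import math
import random

def mean_and_uncertainty(samples):
    """Mean of oversampled readings and the standard error of that mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    return mean, math.sqrt(var / n)  # standard error of the mean

# Simulated oversampled ADC readings of a 5.0 (arbitrary units) signal
# with Gaussian noise of sigma = 0.1 (illustrative values):
random.seed(0)
readings = [5.0 + random.gauss(0.0, 0.1) for _ in range(400)]
value, sigma = mean_and_uncertainty(readings)
```

Because the standard error shrinks as 1/sqrt(n), reading the same point many times during collection gives both the datum and its uncertainty at once.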

Quintana, J. P. G.

2000-06-01

194

Quantification of Emission Factor Uncertainty

Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

195

Designing for Uncertainty: Three Approaches

ERIC Educational Resources Information Center

Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

Bennett, Scott

2007-01-01

196

Hydrology, society, change and uncertainty

NASA Astrophysics Data System (ADS)

Heraclitus, who predicated that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial. Also, decision making under certainty is mostly trivial. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes the society apprehensive about the future, insecure and credulous to a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is compounded by a misconception in the scientific community confusing science with uncertainty elimination. However, recognizing that uncertainty is inevitable and tightly connected with change will help to appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

Koutsoyiannis, Demetris

2014-05-01

197

Uncertainty of testing methods--what do we (want to) know?

It is important to stimulate innovation for regulatory testing methods. Scrutinizing the knowledge of (un)certainty of data from actual standard in vivo methods could foster the interest in new testing approaches. Since standard in vivo data often are used as reference data for model development, improved uncertainty accountability also would support the validation of new in vitro and in silico methods, as well as the definition of acceptance criteria for the new methods. Hazard and risk estimates, transparent for their uncertainty, could further support the 3Rs, since they may help focus additional information requirements on aspects of highest uncertainty. Here we provide an overview on the various types of uncertainties in quantitative and qualitative terms and suggest improving this knowledge base. We also reference principle concepts on how to use uncertainty information for improved hazard characterization and development of new testing methods. PMID:23665803

Paparella, Martin; Daneshian, Mardas; Hornek-Gausterer, Romana; Kinzl, Maximilian; Mauritz, Ilse; Mühlegger, Simone

2013-01-01

198

Aspects of modeling uncertainty and prediction

Probabilistic assessment of variability in model prediction considers input uncertainty and structural uncertainty. For input uncertainty, understanding of practical origins of probabilistic treatments as well as restrictions and limitations of methodology is much more developed than for structural uncertainty. There is a simple basis for structural uncertainty that parallels that for input uncertainty. Although methodologies for assessing structural uncertainty for models in general are very limited, more options are available for submodels.

McKay, M.D.

1993-12-31

199

Uncertainty and How to Approach the Unknown

Uncertainty is one of the most complex issues plaguing systems architecting. Uncertainty occurs in all life cycle phases of architecting any system, yet, there aren't any proven techniques that can make uncertainty completely disappear. However, a smart architect will at least try to account for uncertainty starting at the conceptual phase. Many architects approach uncertainty in their architecture by using

Angela Bartholomaus; Cihan H. Dagli

200

Credible Computations: Standard and Uncertainty

NASA Technical Reports Server (NTRS)

The discipline of computational fluid dynamics (CFD) is at a crossroads. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are being made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing the credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: a computer code, in terms of its logic, numerics, and fluid dynamics, and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled by verification and validation of these requirements. The verification process assesses whether the problem is solved correctly and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and the results of this analysis highlight which computed quantities and integrated quantities in computations need to be determined accurately and which quantities do not require such attention.
Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties involved in all stages of a process on the final responses. There are two approaches for conducting the uncertainty analysis: experimental and computational. These analyses and approaches are briefly described.
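The sensitivity analysis defined above (sensitivities of output parameters to input parameters) can be sketched with central finite differences; the drag-style model, its parameter names, and its values are illustrative assumptions, not from the paper:

```python
def sensitivities(f, x0, rel_step=1e-6):
    """Central finite-difference sensitivities df/dx_i of a scalar
    model output with respect to each input parameter."""
    sens = []
    for i, xi in enumerate(x0):
        h = rel_step * (abs(xi) or 1.0)
        xp = list(x0); xp[i] = xi + h
        xm = list(x0); xm[i] = xi - h
        sens.append((f(xp) - f(xm)) / (2.0 * h))
    return sens

# Illustrative aerodynamic-style model: drag = 0.5 * rho * v^2 * cd * area
def drag(p):
    rho, v, cd, area = p
    return 0.5 * rho * v * v * cd * area

# Sensitivities at rho=1.2 kg/m^3, v=50 m/s, cd=0.3, area=2 m^2:
s = sensitivities(drag, [1.2, 50.0, 0.3, 2.0])
```

Large sensitivities flag the inputs (here the velocity and drag coefficient) whose uncertainties dominate the output and therefore need the most accurate determination.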

Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

1995-01-01

201

NSDL National Science Digital Library

This newly published document from the Basel Committee on Banking Supervision at the Bank of International Settlements considers the methodology used in determining The Core Principles for Effective Banking Supervision, "a global standard for prudential regulation and supervision," which has been endorsed by many countries worldwide. There are three sections to the report. The first chapter looks at the background for the core principles and "the preconditions for effective banking supervision." The second chapter "raises a few basic considerations regarding the conduct of an assessment and the compilation and presentation of the results," and the last chapter discusses each core principle individually. The 56-page document is available in .pdf format. A thumbnail map of each page, shown on the left, is the best way to navigate the report.

202

Basic Principles of Chromatography

NASA Astrophysics Data System (ADS)

Chromatography has a great impact on all areas of analysis and, therefore, on the progress of science in general. Chromatography differs from other methods of separation in that a wide variety of materials, equipment, and techniques can be used. [Readers are referred to references (1-19) for general and specific information on chromatography.]. This chapter will focus on the principles of chromatography, mainly liquid chromatography (LC). Detailed principles and applications of gas chromatography (GC) will be discussed in Chap. 29. In view of its widespread use and applications, high-performance liquid chromatography (HPLC) will be discussed in a separate chapter (Chap. 28). The general principles of extraction are first described as a basis for understanding chromatography.

Ismail, Baraem; Nielsen, S. Suzanne

203

Differential Landauer's principle

NASA Astrophysics Data System (ADS)

Landauer's principle states that the erasure of information must be a dissipative process. In this paper, we carefully analyze the recording and erasure of information on a physical memory. On the one hand, we show that, in order to record some information, the memory has to be driven out of equilibrium. On the other hand, we derive a differential version of Landauer's principle: We link the rate at which entropy is produced at every time of the erasure process to the rate at which information is erased.
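The bound behind Landauer's principle, k_B T ln 2 of heat per erased bit, admits a one-line computation; the 300 K room-temperature example is illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_bound_joules(temperature_k, bits=1):
    """Minimum heat dissipated when erasing `bits` bits at temperature T."""
    return bits * K_B * temperature_k * math.log(2)

# Erasing one bit at room temperature (300 K):
e_min = landauer_bound_joules(300.0)  # about 2.87e-21 J
```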

Granger, Léo; Kantz, Holger

2013-03-01

204

The Shakespearean Principle Revisited

Let every eye negotiate for itself and trust no agent. That line is from William Shakespeare's Much Ado About Nothing.1 To me, it is a fundamental doctrine of patient care, and I have named it the Shakespearean Principle.2 It stimulates skepticism,3 promotes doubt,4 improves communication, fosters proper decision-making, and protects against a malady that currently plagues our profession—herd mentality.5 This editorial shows what can happen when doctors violate the Shakespearean Principle. The story is real and tells of a woman whose doctor unintentionally killed her. To ensure anonymity, the time and place of the tragedy, as well as the players involved, have been changed.

Fred, Herbert L.

2012-01-01

205

NASA Technical Reports Server (NTRS)

The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

Hankins, D. B.; Wake, W. H.

1981-01-01

206

Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation

This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.
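A hedged sketch of the kind of uncertainty estimation the paper discusses, propagating variances and a covariance term through a simple measurement model; the counting-rate model and all numbers are illustrative assumptions, not from the paper:

```python
# Linear ("sandwich") propagation: for y = f(x), var(y) is approximately
# J cov(x) J^T with J the Jacobian. Illustrative model: a count rate
# r = N / t from a count N and a live time t.

def rate_and_uncertainty(counts, time_s, var_counts, var_time, cov_ct=0.0):
    r = counts / time_s
    # Partial derivatives of r = N/t:
    dr_dn = 1.0 / time_s
    dr_dt = -counts / time_s ** 2
    var_r = (dr_dn ** 2 * var_counts + dr_dt ** 2 * var_time
             + 2.0 * dr_dn * dr_dt * cov_ct)
    return r, var_r ** 0.5

# 10000 counts (Poisson, so var = N) in 100 s with the time known
# to 0.1 s (var_time = 0.01 s^2), assumed uncorrelated:
r, sigma_r = rate_and_uncertainty(10000, 100.0, 10000.0, 0.01)
```

A nonzero `cov_ct` term is exactly the kind of correlated-uncertainty information the EXFOR covariance formats discussed above are designed to carry.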

Smith, D.L. [Argonne National Laboratory, 1710 Avenida Del Mundo 1506, Coronado, CA 92118 (United States)]; Otuka, N. [Nuclear Data Section, International Atomic Energy Agency, Wagramerstrasse 5, A-1400 Wien (Austria)]

2012-12-15

207

Experimental Nuclear Reaction Data Uncertainties: Basic Concepts and Documentation

NASA Astrophysics Data System (ADS)

This paper has been written to provide experimental nuclear data researchers and data compilers with practical guidance on dealing with experimental nuclear reaction data uncertainties. It outlines some of the properties of random variables as well as principles of data uncertainty estimation, and illustrates them by means of simple examples which are relevant to the field of nuclear data. Emphasis is placed on the importance of generating mathematical models (or algorithms) that can adequately represent individual experiments for the purpose of estimating uncertainties in their results. Several types of uncertainties typically encountered in nuclear data experiments are discussed. The requirements and procedures for reporting information on measurement uncertainties for neutron reaction data, so that they will be useful in practical applications, are addressed. Consideration is given to the challenges and opportunities offered by reports, conference proceedings, journal articles, and computer libraries as vehicles for reporting and documenting numerical experimental data. Finally, contemporary formats used to compile reported experimental covariance data in the widely used library EXFOR are discussed, and several samples of EXFOR files are presented to demonstrate their use.

Smith, D. L.; Otuka, N.

2012-12-01

208

The Principle of General Tovariance

We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's

C. J. M. Heunen; N. P. Landsman; B. A. W. Spitters

2008-01-01

209

Statistical Principles in Image Modeling

Images of natural scenes contain a rich variety of visual patterns. To learn and recognize these patterns from natural images, it is necessary to construct statistical models for these patterns. In this review article we describe three statistical principles for modeling image patterns: the sparse coding principle, the minimax entropy principle, and the meaningful alignment principle. We explain these

Ying Nian Wu; Jinhui Li; Ziqiang Liu; Song-Chun Zhu

2007-01-01

210

Non-scalar uncertainty: Uncertainty in dynamic systems

NASA Technical Reports Server (NTRS)

The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty that arises at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology. Applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful to describe the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions.
It was found to be wiser to get an approximate solution to an accurate model than to get a precise solution to a model constrained by simplifying assumptions. Precision has a very heavy cost in present physical models, but this formalism allows the trade between uncertainty and simplicity. It was found that modeling reality sometimes requires that state transition probabilities should be manipulated as nonscalar quantities, finding at the end that there is always a transformation to get back to scalar probability.

Martinez, Salvador Gutierrez

1992-01-01

211

Uncertainty quantification using polynomial chaos expansion with points of monomial cubature rules

This paper proposes an efficient method for estimating uncertainty propagation and identifying influence factors contributing to uncertainty. In general, the system is dominated by some of the main effects and lower-order interactions due to the sparsity-of-effect principle. Therefore, the construction of polynomial chaos expansion with points of monomial cubature rules is particularly attractive in dealing with large computational models. This
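A minimal one-dimensional sketch of polynomial chaos with quadrature points, using a 3-point Gauss-Hermite rule as a stand-in for the paper's monomial cubature rules; the model f(X) = X² and all names are illustrative assumptions, not the paper's method:

```python
import math

# 3-point Gauss-Hermite rule for a standard normal weight
# (exact for polynomials up to degree 5).
NODES = (-math.sqrt(3.0), 0.0, math.sqrt(3.0))
WEIGHTS = (1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0)

def hermite(k, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return (1.0, x, x * x - 1.0)[k]

def pce_coefficients(f, order=2):
    """c_k = E[f(X) He_k(X)] / k! via quadrature, X ~ N(0, 1)."""
    coeffs = []
    for k in range(order + 1):
        num = sum(w * f(x) * hermite(k, x) for x, w in zip(NODES, WEIGHTS))
        coeffs.append(num / math.factorial(k))
    return coeffs

def pce_mean_variance(coeffs):
    """Output statistics read directly off the expansion coefficients."""
    mean = coeffs[0]
    var = sum(math.factorial(k) * c * c for k, c in enumerate(coeffs[1:], 1))
    return mean, var

# Model response f(X) = X^2 (illustrative): exact mean 1, variance 2.
c = pce_coefficients(lambda x: x * x)
m, v = pce_mean_variance(c)
```

Once the coefficients are in hand, the mean and variance of the model output come for free, which is what makes the quadrature-point construction attractive for expensive models.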

D. L. Wei; Z. S. Cui; J. Chen

2008-01-01

212

Uncertainty as a Motivating Variable.

National Technical Information Service (NTIS)

An individual is uncertain when a situation elicits response alternatives no one of which is overwhelmingly dominant and his degree of uncertainty is a function both of the number of competing responses elicited by the situation and the relative response ...

J. T. Lanzetta

1967-01-01

213

Scientific Uncertainty: An Industry Perspective.

ERIC Educational Resources Information Center

Discusses the uncertainties inherent in assessing the nature and extent of any damage that might be attributed to acidic deposition. Probes associated dilemmas related to decisions involving control strategies, and indicates societal and legislative roles for solving this problem. (ML)

Perhac, Ralph

1986-01-01

214

Uncertainty Relation for Smooth Entropies

NASA Astrophysics Data System (ADS)

Uncertainty relations give upper bounds on the accuracy by which the outcomes of two incompatible measurements can be predicted. While established uncertainty relations apply to cases where the predictions are based on purely classical data (e.g., a description of the system’s state before measurement), an extended relation which remains valid in the presence of quantum information has been proposed recently [Berta et al., Nature Phys. 6, 659 (2010), doi:10.1038/nphys1734]. Here, we generalize this uncertainty relation to one formulated in terms of smooth entropies. Since these entropies measure operational quantities such as extractable secret key length, our uncertainty relation is of immediate practical use. To illustrate this, we show that it directly implies security of quantum key distribution protocols. Our security claim remains valid even if the implemented measurement devices deviate arbitrarily from the theoretical model.

Tomamichel, Marco; Renner, Renato

2011-03-01

215

The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

NASA Technical Reports Server (NTRS)

This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

2014-01-01

216

Uncertainty estimation by the concept of virtual instruments

NASA Astrophysics Data System (ADS)

For the calibration of length standards and instruments, various methods are available for which usually an uncertainty according to the GUM [1] can be set up. However, from calibration data of a measuring instrument it is not always evident what the uncertainty will be in an actual measurement (or calibration) using that calibrated instrument. Especially where many measured data are involved, such as in CMM measurements, but also in typical dimensional geometry measurements such as roughness, roundness and flatness measurements, setting up an uncertainty budget according to the GUM for each measurement can be tedious and even impossible. On the other hand, international standards require that for a proof of the conformance to specifications, the measurement uncertainty must be taken into account. Apart from this, it is hardly consistent that much is invested in the calibration of instruments while it remains unclear what the uncertainty is of measurements carried out with these 'calibrated' instruments. In this paper it is shown that the 'standard' GUM uncertainty budget can be modified in several ways to accommodate more complicated measurements. Also, it is shown how this budget can be generated automatically by the measuring instrument, through the simulation of measurements by instruments with alternative metrological characteristics, so-called virtual instruments. This can lead to a measuring instrument where, next to the measured value, also the uncertainty is displayed. It is shown how these principles are already used for roughness instruments, and how they can be used as well for e.g. roundness, cylindricity, flatness and CMM measurements.
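The virtual-instrument idea, simulating an instrument with assumed metrological characteristics and reporting the spread of the simulated readings as the uncertainty, can be sketched as a Monte Carlo evaluation in the spirit of GUM Supplement 1; the scale-error and noise figures below are illustrative assumptions, not from the paper:

```python
import random
import statistics

def virtual_instrument(true_value, scale_err_sigma, noise_sigma, rng):
    """One simulated reading from an instrument with an uncertain
    scale factor and additive noise (illustrative characteristics)."""
    scale = 1.0 + rng.gauss(0.0, scale_err_sigma)
    return scale * true_value + rng.gauss(0.0, noise_sigma)

def simulated_uncertainty(true_value, scale_err_sigma, noise_sigma,
                          n=50000, seed=42):
    """Standard uncertainty of the reading, estimated by simulating
    the virtual instrument many times."""
    rng = random.Random(seed)
    readings = [virtual_instrument(true_value, scale_err_sigma,
                                   noise_sigma, rng) for _ in range(n)]
    return statistics.stdev(readings)

# A 10 mm length measured with 0.1 % scale uncertainty and 5 um noise:
u = simulated_uncertainty(10.0, 0.001, 0.005)
```

Because the simulation runs on the measured value itself, the instrument can display the uncertainty next to the reading, which is the point made in the abstract.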

Haitjema, Han; van Dorp, Bas W.; Morel, M.; Schellekens, Piet H. J.

2001-10-01
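The 'virtual instrument' idea above can be illustrated with a Monte Carlo sketch: simulate many measurements of the same length, each perturbed by a few assumed error sources, and take the spread of the simulated readings as the standard uncertainty. The error magnitudes below (repeatability, display resolution, thermal expansion of steel) are purely illustrative assumptions, not values from the paper.

```python
import random
import statistics

def virtual_measurement(rng, true_len_mm):
    """One simulated reading of a length, perturbed by three
    hypothetical error sources of a dimensional instrument."""
    noise = rng.gauss(0.0, 0.010)                  # repeatability, mm
    quant = rng.uniform(-0.005, 0.005)             # display resolution, mm
    # thermal expansion of steel (11.5e-6 / K) with 0.5 K temperature noise
    temp = rng.gauss(0.0, 0.5) * 11.5e-6 * true_len_mm
    return true_len_mm + noise + quant + temp

def simulate_uncertainty(true_len_mm=100.0, n=20_000, seed=42):
    """Run the virtual instrument n times; the standard deviation of
    the simulated readings is the standard (k=1) uncertainty."""
    rng = random.Random(seed)
    readings = [virtual_measurement(rng, true_len_mm) for _ in range(n)]
    return statistics.mean(readings), statistics.stdev(readings)

mean_mm, u_mm = simulate_uncertainty()
```

For this assumed error model an analytic GUM budget gives u = sqrt(0.010^2 + (0.010/sqrt(12))^2 + (0.5 * 11.5e-6 * 100)^2) = 0.0104 mm, and the simulation converges to the same value; the attraction of the virtual-instrument approach is that the same simulation can be rerun automatically for each new measurement task.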

217

ERIC Educational Resources Information Center

The principles of readability are in every style manual. Readability formulas are in every writing aid. What is missing is the research and theory on which they stand. This short review of readability research spans 100 years. The first part covers the history of adult literacy studies in the U.S., establishing the stratified nature of the adult…

DuBay, William H.

2004-01-01

218

PRINCIPLES OF WATER FILTRATION

This paper reviews principles involved in the processes commonly used to filter drinking water for public water systems. The most common approach is to chemically pretreat water and filter it through a deep (2-1/2 to 3 ft) bed of granular media (coal or sand or combinations of th...

219

National Technical Information Service (NTIS)

We propose a new action principle to be associated with a noncommutative space (A, H, D). The universal formula for the spectral action is (psi, D psi) + Trace(chi(D/lambda)), where psi is a spinor on the Hilbert space, lambda is a scale and chi a positive functi...

A. H. Chamseddine; A. Connes

1996-01-01

220

Principles of Applied Mathematics

NSDL National Science Digital Library

This course, presented by MIT and taught by professor Aslan Kasimov, describes basic principles of applied mathematics. Specifically, the material looks at mathematical analysis of continuum models of various natural phenomena. The course materials include student assignments and exams. MIT presents OpenCourseWare as free educational material online. No registration or enrollment is required to use the materials.

Kasimov, Aslan

2010-12-09

221

Principles of Plasma Diagnostics

This book provides a systematic introduction to the physics of plasma diagnostics measurements. It develops from first principles the concepts needed to plan, execute and interpret plasma measurements, making it a suitable book for graduate students and professionals with little plasma physics background. The book will also be a valuable reference for seasoned plasma physicists, both experimental and theoretical, as

I. H. Hutchinson

2002-01-01

222

Principles of antimicrobial prophylaxis

The most important principle in surgical antibiotic prophylaxis is to ensure high blood levels of antibiotic at the time of anticipated wound contamination. This is best achieved by intravenous administration commenced at the time of induction of anesthesia. The continued efficacy of prophylaxis depends on the implementation of policies that minimize the opportunities for bacteria to acquire resistance to antibiotics.

Douglas W. Burdon

1982-01-01

223

We propose and study the following Mirror Principle: certain sequences of multiplicative equivariant characteristic classes on Kontsevich's stable map moduli spaces can be computed in terms of certain hypergeometric type classes. As applications, we compute the equivariant Euler classes of obstruction bundles induced by any concavex bundles -- including any direct sum of line bundles -- on $\mathbb{P}^n$. This includes

Bong H. Lian; Kefeng Liu; S. T. Yau

1997-01-01

224

Principles of sound ecotoxicology.

We have become progressively more concerned about the quality of some published ecotoxicology research. Others have also expressed concern. It is not uncommon for basic, but extremely important, factors to apparently be ignored. For example, exposure concentrations in laboratory experiments are sometimes not measured, and hence there is no evidence that the test organisms were actually exposed to the test substance, let alone at the stated concentrations. To try to improve the quality of ecotoxicology research, we suggest 12 basic principles that should be considered, not at the point of publication of the results, but during the experimental design. These principles range from carefully considering essential aspects of experimental design through to accurately defining the exposure, as well as unbiased analysis and reporting of the results. Although not all principles will apply to all studies, we offer these principles in the hope that they will improve the quality of the science that is available to regulators. Science is an evidence-based discipline and it is important that we and the regulators can trust the evidence presented to us. Significant resources often have to be devoted to refuting the results of poor research when those resources could be utilized more effectively. PMID:24512103

Harris, Catherine A; Scott, Alexander P; Johnson, Andrew C; Panter, Grace H; Sheahan, Dave; Roberts, Mike; Sumpter, John P

2014-03-18

225

Principles of respiratory protection.

This review describes the various types of respiratory protective devices used in Israel during the Persian Gulf war, and summarizes the relevant physiological concepts of respiratory protection. Physiological principles of modern devices with powered air supply are discussed in detail. Our experience may be useful in the evaluation of new respirators and in finding solutions for problematic subpopulations. PMID:1757237

Arad, M; Epstein, Y; Krasner, E; Danon, Y L; Atsmon, J

1991-01-01

226

National Technical Information Service (NTIS)

A relatively simple proof of the maximum principle is presented. The main objective was to obtain a proof, similar to that due to Halkin, but replacing the use of Brouwer's fixed point theorem by an easily proven contraction mapping theorem. The first use...

G. F. Bryant; D. Q. Mayne

1973-01-01

227

Pattern recognition principles

NASA Technical Reports Server (NTRS)

The present work gives an account of basic principles and available techniques for the analysis and design of pattern processing and recognition systems. Areas covered include decision functions, pattern classification by distance functions, pattern classification by likelihood functions, the perceptron and the potential function approaches to trainable pattern classifiers, statistical approach to trainable classifiers, pattern preprocessing and feature selection, and syntactic pattern recognition.

Tou, J. T.; Gonzalez, R. C.

1974-01-01

228

ERIC Educational Resources Information Center

This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

Martz, Carlton

1999-01-01

229

ERIC Educational Resources Information Center

A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

Kamat, R. V.

1991-01-01

230

Principles of Biomedical Ethics

In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making.

Athar, Shahid

2012-01-01

231

Uncertainty in measurements by counting

NASA Astrophysics Data System (ADS)

Counting is at the base of many higher-level measurements, such as frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Counting also plays a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As regards measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.

Bich, Walter; Pennecchi, Francesca

2012-02-01
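For the simplest case, a Poisson counting process, the uncertainty evaluation reduces to a one-liner: the observed count n is the best estimate of the measurand and sqrt(n) its standard uncertainty. A minimal sketch of that textbook Poisson model (not the authors' full GUM-compliant model):

```python
import math

def counting_result(n_counts, k=2):
    """Poisson counting model: best estimate n, standard uncertainty
    sqrt(n), expanded uncertainty k*sqrt(n) (k=2 for ~95 % coverage)."""
    u = math.sqrt(n_counts)
    return n_counts, u, k * u

n, u, U = counting_result(400)   # 400 counts -> u = 20, U = 40
```

Note that the relative uncertainty falls as 1/sqrt(n), which is why longer counting times or larger samples give proportionally tighter results.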

232

Wildfire Decision Making Under Uncertainty

NASA Astrophysics Data System (ADS)

Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

Thompson, M.

2013-12-01

233

A Typology for Visualizing Uncertainty

Information analysts must rapidly assess information to determine its usefulness in supporting and informing decision makers. In addition to assessing the content, the analyst must also be confident about the quality and veracity of the information. Visualizations can concisely represent vast quantities of information, thus aiding the analyst to examine larger quantities of material; however, visualization programs are challenged to incorporate a notion of confidence or certainty because the factors that influence the certainty or uncertainty of information vary with the type of information and the type of decisions being made. For example, the assessment of potentially subjective human-reported data leads to a large set of uncertainty concerns in fields such as national security, law enforcement (witness reports), and even scientific analysis where data is collected from a variety of individual observers. What's needed is a formal model or framework for describing uncertainty as it relates to information analysis, to provide a consistent basis for constructing visualizations of uncertainty. This paper proposes an expanded typology for uncertainty, drawing from past frameworks targeted at scientific computing. The typology provides general categories for analytic uncertainty, a framework for creating task-specific refinements to those categories, and examples drawn from the national security field.

Thomson, Judi R.; Hetzler, Elizabeth G.; MacEachren, Alan; Gahegan, Mark N.; Pavel, Misha

2005-01-05

234

Structural model uncertainty in stochastic simulation

Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

McKay, M.D.; Morrison, J.D. [Los Alamos National Lab., NM (United States). Technology and Safety Assessment Div.]

1997-09-01

235

The precautionary principle in environmental science.

Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy.

Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M

2001-01-01

236

NASA Astrophysics Data System (ADS)

We consider a Generalized Uncertainty Principle (GUP) framework which predicts a maximal uncertainty in momentum and minimal uncertainties both in position and momentum. We apply supersymmetric quantum mechanics method and the shape invariance condition to obtain the exact harmonic oscillator eigenvalues in this GUP context. We find the supersymmetric partner Hamiltonians and show that the harmonic oscillator belongs to a hierarchy of Hamiltonians with a shift in momentum representation and different masses and frequencies. We also study the effect of a uniform electric field on the harmonic oscillator energy spectrum in this setup.

Asghari, M.; Pedram, P.; Nozari, K.

2013-10-01

237

Principles of Semiconductor Devices

NSDL National Science Digital Library

Home page of an online and interactive textbook, Principles of Semiconductor Devices, written by Bart J. Van Zeghbroeck, Ph.D., Professor in the Department of Electrical and Computer Engineering at the University of Colorado at Boulder. The goal of this text is to provide the basic principles of common semiconductor devices, with a special focus on Metal-Oxide-Semiconductor Field-Effect-Transistors (MOSFETs). A browser environment was chosen so that text, figures and equations can be linked for easy reference. A table of contents, a glossary, active figures and some study aids are integrated with the text with the intention to provide a more effective reference and learning environment. Chapter titles include: Semiconductor Fundamentals, Metal-Semiconductor Junctions, p-n Junctions, Bipolar Transistors, MOS Capacitors, and MOSFET.

Van Zeghbroeck, Bart J.

2011-06-13

238

Common Principles and Multiculturalism

Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea.

Zahedi, Farzaneh; Larijani, Bagher

2009-01-01

239

This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology in radiogenic heat and thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes, such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are focussed on.

Aswathanarayana, U.

1985-01-01

240

NSDL National Science Digital Library

The Principles of Flight Web site is offered by the Pilot's Web Aviation Journal and contains an excellent introduction to the physics of flight. Topics include Newton's laws of motion and force, airfoils, lift and drag, forces acting on an airplane, speed, flight maneuvers, the effects of roll, and more. Each topic contains good illustrations, descriptions, and equations. Overall, the site is an interesting and informative look behind the science of flight.

2001-01-01

241

Principles of lake sedimentology

This book presents a comprehensive outline of basic sedimentological principles for lakes, and focuses on environmental aspects and matters related to lake management and control, i.e. on lake ecology rather than lake geology. It is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents (abridged): Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.

Janasson, L.

1983-01-01

242

Principles of gravitational biology

NASA Technical Reports Server (NTRS)

Physical principles of gravitation are enumerated, including gravitational and inertial forces, weight and mass, weightlessness, size and scale effects, scale limits of gravitational effects, and gravity as biogenic factor. Statocysts, otolithic organs of vertebrates, gravity reception in plants, and clinostat studies for gravitation orientation are reviewed. Chronic acceleration is also studied, as well as physiology of hyper and hypodynamic fields. Responses of animals to a decreased acceleration field are examined, considering postural changes, work capacity, growth, and physiologic deadaptation.

Smith, A. H.

1975-01-01

243

This book is an introduction on fluid mechanics incorporating computer applications. Topics covered are as follows: brief history; what is a fluid; two classes of fluids: liquids and gases; the continuum model of a fluid; methods of analyzing fluid flows; important characteristics of fluids; fundamentals and equations of motion; fluid statics; dimensional analysis and the similarity principle; laminar internal flows; ideal flow; external laminar and channel flows; turbulent flow; compressible flow; fluid flow measurements.

Kreider, J.F.

1985-01-01

244

Uncertainty quantification in reacting flow.

Chemically reacting flow models generally involve inputs and parameters that are determined from empirical measurements, and therefore exhibit a certain degree of uncertainty. Estimating the propagation of this uncertainty into computational model output predictions is crucial for purposes of reacting flow model validation, model exploration, as well as design optimization. Recent years have seen great developments in probabilistic methods and tools for efficient uncertainty quantification (UQ) in computational models. These tools are grounded in the use of Polynomial Chaos (PC) expansions for representation of random variables. The utility and effectiveness of PC methods have been demonstrated in a range of physical models, including structural mechanics, transport in porous media, fluid dynamics, aeronautics, heat transfer, and chemically reacting flow. While high-dimensionality remains nominally an ongoing challenge, great strides have been made in dealing with moderate dimensionality along with non-linearity and oscillatory dynamics. In this talk, I will give an overview of UQ in chemical systems. I will cover both: (1) the estimation of uncertain input parameters from empirical data, and (2) the forward propagation of parametric uncertainty to model outputs. I will cover the basics of forward PC UQ methods with examples of their use. I will also highlight the need for accurate estimation of the joint probability density over the uncertain parameters, in order to arrive at meaningful estimates of model output uncertainties. Finally, I will discuss recent developments on the inference of this density given partial information from legacy experiments, in the absence of raw data.

Marzouk, Youssef M. (MIT, Cambridge, MA); Debusschere, Bert J.; Najm, Habib N.; Berry, Robert Bruce

2010-06-01
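The non-intrusive form of the Polynomial Chaos approach described above can be sketched in a few lines for one uncertain parameter: project the model output onto probabilists' Hermite polynomials by Gauss quadrature, then read the mean and variance off the coefficients. The Arrhenius-style rate model below is an illustrative stand-in, not one of the paper's reacting-flow models.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_moments(model, mu, sigma, order=4, n_quad=8):
    """1-D non-intrusive PCE: expand model(mu + sigma*xi), xi ~ N(0,1),
    in probabilists' Hermite polynomials He_k via Gauss quadrature."""
    xi, w = He.hermegauss(n_quad)            # weight function exp(-xi^2/2)
    w = w / math.sqrt(2.0 * math.pi)         # normalize to the N(0,1) pdf
    y = model(mu + sigma * xi)
    # projection: c_k = E[y * He_k] / E[He_k^2], with E[He_k^2] = k!
    c = [float(np.sum(w * y * He.hermeval(xi, [0.0] * k + [1.0])))
         / math.factorial(k) for k in range(order + 1)]
    mean = c[0]
    var = sum(ck**2 * math.factorial(k) for k, ck in enumerate(c) if k > 0)
    return mean, var

# uncertain (dimensionless) activation energy E ~ N(1.0, 0.1^2)
mean, var = pce_moments(lambda E: np.exp(-E), mu=1.0, sigma=0.1)
```

For this lognormal example the exact moments are known in closed form (mean e^{-0.995}, variance e^{-1.99}(e^{0.01}-1)) and the order-4 expansion reproduces them to high accuracy; realistic reacting-flow UQ applies the same construction with multi-dimensional polynomial bases over the joint parameter density, which is why the accurate estimation of that density, stressed in the abstract, matters.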

245

Communicating Uncertainties on Climate Change

NASA Astrophysics Data System (ADS)

The term 'uncertainty' in common language is confusing, since in one of its most usual senses it refers to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate this notion through the media to a wide public. From its scientific basis to impact assessment, the climate change issue is subject to a large number of sources of uncertainty. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches to the notion. Faced with this diversity of approaches, communicating uncertainties on climate change is a great challenge. It is further complicated by the diversity of the targets of communication on climate change, from stakeholders and policy makers to the wide public. We will present the process chosen by the IPCC to communicate uncertainties in its assessment reports, taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples illustrating how to avoid the above-mentioned ambiguity when dealing with this kind of communication.

Planton, S.

2009-09-01

246

Teaching professionalism: general principles.

There are educational principles that apply to the teaching of professionalism during undergraduate education and postgraduate training. It is axiomatic that there is a single cognitive base that applies with increasing moral force as students enter medical school, progress to residency or registrar training, and enter practice. While parts of this body of knowledge are easier to teach and learn at different stages of an individual's career, it remains a definable whole at all times and should be taught as such. Self-reflection on theoretical and real issues encountered in the life of a student, resident or practitioner is essential to the acquisition of experiential learning and the incorporation of the values and behaviors of the professional; however, the opportunities to provide situations where this can take place will change as an individual progresses through the system, as will the sophistication of the level of learning. Teaching the cognitive base of professionalism and providing opportunities for the internalization of its values and behaviors are the cornerstones of the organization of the teaching of professionalism at all levels. Situated learning theory appears to provide practical guidance as to how this may be implemented. While the application of this theory will vary with the type of curriculum, the institutional culture and the resources available, the principles outlined should remain constant. PMID:16753716

Cruess, Richard L; Cruess, Sylvia R

2006-05-01

247

NASA Astrophysics Data System (ADS)

The principle of least action in its original form à la Maupertuis is used to explain geodetic and frame-dragging precessions, which are customarily attributed to curved space-time in general relativity. The least-time equations of motion agree with observations and are also in concert with general relativity. Yet according to the least-time principle, gravitation does not relate to the mathematical metric of space-time, but to a tangible energy density embodied by photons. The density of free space is in balance with the total mass of the Universe in accord with the Planck law. Likewise, a local photon density and its phase distribution are in balance with the mass and charge distribution of a local body. Here gravitational force is understood as an energy density difference that will diminish when the oppositely polarized pairs of photons co-propagate from the energy-dense system of bodies to the energy-sparse system of the surrounding free space. Thus when the body changes its state of motion, the surrounding energy density must accommodate the change. The concurrent resistance in restructuring the surroundings, ultimately involving the entire Universe, is known as inertia. The all-around propagating energy density couples everything with everything else in accord with Mach's principle.

Annila, Arto

2012-06-01

248

The Economics of Uncertainty Vi.

National Technical Information Service (NTIS)

The empirical evidence--from laboratory experiments and economic observations--which may be relevant to the Bernoulli Principle is reviewed. It is concluded that, apart from obvious mistakes, people seem to make the decisions which will maximize expected ...

K. Borch

1965-01-01

249

Uncertainty representation using fuzzy measures.

We introduce the fuzzy measure and discuss its use as a unifying structure for modeling knowledge about an uncertain variable. We show that a large class of well-established types of uncertainty representations can be modeled within this framework. A view of the Dempster-Shafer (D-S) belief structure as an uncertainty representation corresponding to a set of possible fuzzy measures is discussed. A methodology for generating this set of fuzzy measures from a belief structure is described. A measure of entropy associated with a fuzzy measure is introduced and its manifestation for different fuzzy measures is described. The problem of decision making under uncertainty, in which the uncertainty is represented by a fuzzy measure, is then considered. The Choquet integral is introduced as providing a generalization of the expected value to this environment. PMID:18238099

Yager, R R

2002-01-01
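The Choquet integral mentioned above has a compact discrete form: sort the criteria by score and telescope the score differences against the fuzzy measure of the upper-level sets. A minimal sketch; the measure used in the example is an illustrative additive one, not from the paper:

```python
def choquet(values, mu):
    """Discrete Choquet integral of `values` (element -> score) with
    respect to a fuzzy measure `mu` (frozenset -> weight), where
    mu(full set) = 1, mu(empty set) = 0, and mu is monotone."""
    elems = sorted(values, key=values.get)       # ascending scores
    total, prev = 0.0, 0.0
    for i, e in enumerate(elems):
        upper = frozenset(elems[i:])             # elements scoring >= values[e]
        total += (values[e] - prev) * mu[upper]
        prev = values[e]
    return total

# for an additive measure the Choquet integral is the ordinary expectation
mu = {frozenset({'a', 'b'}): 1.0, frozenset({'a'}): 0.4,
      frozenset({'b'}): 0.6, frozenset(): 0.0}
result = choquet({'a': 1.0, 'b': 3.0}, mu)   # 0.4*1 + 0.6*3 = 2.2
```

Choosing a non-additive `mu` (e.g. lowering the weight of {'b'} alone) makes the same code model interaction between criteria, which is exactly the generalization of the expected value the abstract refers to.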

250

Climate negotiations under scientific uncertainty

How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk.

Barrett, Scott; Dannenberg, Astrid

2012-01-01

251

Estimating uncertainty in resolution tests

NASA Astrophysics Data System (ADS)

Resolution testing of imaging optical equipment is still commonly performed using the USAF 1951 target. The limiting resolution is normally calculated from the group and element that can just be resolved by an observer. Although resolution testing has limitations, its appeal lies in the fact that it is a quick test with low complexity. Resolution uncertainty can serve as a diagnostic tool, aid in understanding observer variability, and assist in planning experiments. It may also be necessary to satisfy a customer requirement or international standard. This paper derives theoretical results for estimating resolution and calculating its uncertainty, based on observer measurements, while taking the target spatial-frequency quantization into account. We show that estimating the resolution by simply averaging the target spatial frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the results derived can be incorporated into a larger uncertainty analysis.

Goncalves, Duarte P.; Griffith, Derek J.

2006-05-01
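The spatial-frequency quantization the paper accounts for comes straight from the USAF 1951 layout: element frequencies form a geometric grid in steps of 2^(1/6), so an observer's limiting resolution can only land on discrete values. A sketch of that grid, using the standard target formula (this is not the paper's improved estimator):

```python
def usaf_frequency(group, element):
    """Spatial frequency, in line pairs per mm, of a USAF 1951
    resolution target element; elements run 1..6 within each group."""
    return 2.0 ** (group + (element - 1) / 6.0)

# consecutive elements differ by a fixed factor of 2**(1/6) ~ 1.122,
# so 'just resolved' readings are quantized to ~12 % steps
step = usaf_frequency(0, 2) / usaf_frequency(0, 1)
```

Because of that roughly 12 % grid, simply averaging the threshold frequencies reported by several observers yields a biased resolution estimate, which is the bias the improved estimator in the paper corrects for.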

252

Linear Programming Problems for Generalized Uncertainty

ERIC Educational Resources Information Center

Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

Thipwiwatpotjana, Phantipa

2010-01-01

253

Uncertainty of temperature measurement with thermal cameras

All main international metrological organizations are proposing a parameter called uncertainty as a measure of the accuracy of measurements. A mathematical model that enables the calculations of uncertainty of temperature measurement with thermal cameras is presented. The standard uncertainty or the expanded uncertainty of temperature measurement of the tested object can be calculated when the bounds within which the real

Krzysztof Chrzanowski; Robert Matyszkiel; Joachim Fischer; Jaroslaw Barela

2001-01-01

254

Nanosecond delay with subpicosecond uncertainty.

We have combined a commercially available, variable-length coaxial delay line (trombone line) with a high-resolution linear translation system. The result is better resolution and lower uncertainty in the achievable delays than previously available. The range of delay is 0 ps to approximately 1250 ps, the bidirectional resolution is 2.0 ps, the unidirectional resolution is 0.2 ps, and the uncertainty (95% confidence interval) in the measured delay is +/-0.09 ps. Drift, temperature dependence, repeatability, linearity, and hysteresis were also examined. PMID:17764341

Larson, Donald R; Paulter, Nicholas G

2007-08-01
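The mechanical precision implied by these numbers is easy to estimate: a change in path length ΔL changes the one-way delay by ΔL/(v_f·c), where v_f is the line's velocity factor. A quick sketch; the velocity factor below is an assumed typical value for coaxial line, not the specification of the instrument in the paper:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_ps(length_m, velocity_factor=0.7):
    """One-way propagation delay (ps) of a line of the given length."""
    return length_m / (velocity_factor * C) * 1e12

def length_mm_for_delay(dt_ps, velocity_factor=0.7):
    """Line length (mm) corresponding to a given one-way delay (ps)."""
    return dt_ps * 1e-12 * velocity_factor * C * 1e3

# the 0.2 ps unidirectional resolution corresponds to ~0.04 mm of line
dl_mm = length_mm_for_delay(0.2)
```

In other words, sub-picosecond delay resolution demands positioning the line length to a few tens of micrometres, which is why the trombone line had to be combined with a high-resolution linear translation system.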

255

Nanosecond delay with subpicosecond uncertainty

NASA Astrophysics Data System (ADS)

We have combined a commercially available, variable-length coaxial delay line (trombone line) with a high-resolution linear translation system. The result is better resolution and lower uncertainty in the achievable delays than previously available. The range of delay is 0 ps to approximately 1250 ps, the bidirectional resolution is 2.0 ps, the unidirectional resolution is 0.2 ps, and the uncertainty (95% confidence interval) in the measured delay is +/-0.09 ps. Drift, temperature dependence, repeatability, linearity, and hysteresis were also examined.

Larson, Donald R.; Paulter, Nicholas G.

2007-08-01

256

Awe, uncertainty, and agency detection.

Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events. PMID:24247728

Valdesolo, Piercarlo; Graham, Jesse

2014-01-01

257

Guiding Principles for Diabetes Care.

National Technical Information Service (NTIS)

The National Diabetes Education Program (NDEP) has developed these Guiding Principles for Diabetes Care to help the health care team manage the disease effectively. The principles outline seven essential components of quality diabetes care that form the b...

2004-01-01

258

A methodology for quantifying uncertainty in models

This paper, condensed from McKay et al. (1992), outlines an analysis of uncertainty in the output of computer models arising from uncertainty in inputs (parameters). Uncertainty of this type most often arises when proper input values are imprecisely known. Uncertainty in the output is quantified by its probability distribution, which results from treating the inputs as random variables. The assessment of which inputs are important with respect to uncertainty (sensitivity analysis) is done relative to the probability distribution of the output.
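The propagation described above (inputs treated as random variables, output uncertainty summarized by its distribution, sensitivity judged against that distribution) can be sketched with plain Monte Carlo sampling. The model function and the input distributions below are illustrative assumptions, not those of McKay et al.:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(a, b):
    # Hypothetical computer model: output responds nonlinearly to two inputs.
    return a * np.exp(0.5 * b)

# Imprecisely known inputs are treated as random variables.
n = 10_000
a = rng.normal(2.0, 0.3, n)    # input 'a' with an assumed std. dev. of 0.3
b = rng.uniform(0.5, 1.5, n)   # input 'b' known only to lie in [0.5, 1.5]

y = model(a, b)

# Output uncertainty is characterized by the distribution of y.
print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")
print(f"95% interval = [{np.percentile(y, 2.5):.3f}, {np.percentile(y, 97.5):.3f}]")

# Crude sensitivity ranking: correlation of each input with the output.
for name, x in (("a", a), ("b", b)):
    print(f"corr({name}, y) = {np.corrcoef(x, y)[0, 1]:.2f}")
```

More refined sensitivity measures (e.g. variance-based indices) follow the same pattern: rank inputs by how much of the output distribution's variation they explain.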

McKay, M.D.; Beckman, R.J.

1993-09-01

259

Archimedes' Principle in General Coordinates

ERIC Educational Resources Information Center

Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…

Ridgely, Charles T.

2010-01-01

260

Uncertainty of height information in coherence scanning interferometry

NASA Astrophysics Data System (ADS)

Coherence scanning interferometry (CSI) with a broadband light source (also known as white light interferometry) is, besides the confocal technique, one of the most popular optical principles for measuring surface topography. Compared to coherent interferometry, the broadband light source leads, theoretically, to unambiguous phase information. The paper describes the properties of the correlogram in the spatial and in the frequency domain. All deviations from the ideal correlogram are expressed by an additional phase term. The uncertainty of height information is discussed for both the frequency domain analysis (FDA) proposed by de Groot and the Hilbert transform. For the frequency domain analysis, the uncertainty is quantified by the Cramér-Rao bound. The second part of the paper deals with the phase evaluation of the correlogram, which is necessary to achieve a high vertical resolution. Because the envelope function is often distorted, phase jumps lead to ambiguous height information. In particular, this effect can be observed when measuring rough surfaces.

Seewig, J.; Böttner, T.; Broschart, D.

2011-05-01

261

Measuring uncertainty by extracting fuzzy rules using rough sets

NASA Technical Reports Server (NTRS)

Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as ways to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules a corresponding measure of how strongly these rules are believed is constructed. From this, the idea of how far a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.

Worm, Jeffrey A.

1991-01-01

262

Complex Correspondence Principle

NASA Astrophysics Data System (ADS)

Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

Bender, Carl M.; Hook, Daniel W.; Meisinger, Peter N.; Wang, Qing-Hai

2010-02-01

263

NSDL National Science Digital Library

This activity is used in a Principles of Sociology class for undergraduate students. It examines the labor force and factors that affect occupation over time in the United States on a state-by-state basis, using a customized data set made by combining census information from 1950-1990. It guides students through data manipulation using WebCHIP software found at DataCounts!. To open WebCHIP with the dataset for the activity, please see the instructions and links in the exercise documents under teaching materials. For more information on how to use WebCHIP, see the How To section on DataCounts!

Ciabattari, Theresa

264

Principles of Environmental Chemistry

NASA Astrophysics Data System (ADS)

Roy M. Harrison, Editor; RSC Publishing; ISBN 0854043713; x + 363 pp.; 2006; $69.95. Environmental chemistry is an interdisciplinary science that includes chemistry of the air, water, and soil. Although it may be confused with green chemistry, which deals with potential pollution reduction, environmental chemistry is the scientific study of the chemical and biochemical principles that occur in nature. Therefore, it is the study of the sources, reactions, transport, effects, and fates of chemical species in the air, water, and soil environments, and the effect of human activity on them. Environmental chemistry not only explores each of these environments, but also closely examines the interfaces and boundaries where the environments intersect.

Hathaway, Ruth A.

2007-07-01

265

Exploring Uncertainty with Projectile Launchers

ERIC Educational Resources Information Center

The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

Orzel, Chad; Reich, Gary; Marr, Jonathan

2012-01-01

266

Impact of orifice metering uncertainties

In a recent study, a utility attributed 38% of its unaccounted-for (UAF) gas to orifice metering uncertainty biasing caused by straightening vanes. How this was determined, and how it applies to the company's orifice meters, is described. Almost all (97%) of the company's UAF gas was found to be attributable to identifiable accounting procedures, measurement problems, theft and leakage.

Stuart, J.W. (Pacific Gas and Electric Co., San Francisco, CA (USA))

1990-12-01

267

Quantification of entanglement via uncertainties

We show that entanglement of pure multiparty states can be quantified by means of quantum uncertainties of certain basic observables through the use of a measure that was initially proposed by Klyachko et al. [Appl. Phys. Lett. 88, 124102 (2006)] for bipartite systems.

Klyachko, Alexander A.; Oeztop, Baris; Shumovsky, Alexander S. [Faculty of Science, Bilkent University, Bilkent, Ankara, 06800 Turkey (Turkey)

2007-03-15

268

HISTORY MATCHING WITH UNCERTAINTY QUANTIFICATION

We present three methods for history matching and uncertainty quantification, tested on a synthetic test case. The test case contains active grid blocks. There are six production wells in the reservoir. A Bayesian approach is used. Based on production data and well observations of permeability and porosity, samples from the posterior distribution for permeability and porosity are generated.

Lars Holden; Harald H. Soleng; Anne Randi

269

Spatial uncertainty and ecological models

Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

Jager, Yetta [ORNL]; King, Anthony Wayne [ORNL]

2004-07-01

270

Mechanism Design with Execution Uncertainty

We introduce the notion of fault tolerant mechanism design, which extends the standard game theoretic framework of mechanism design to allow for uncertainty about execution. Specifically, we define the problem of task allocation in which the private information of the agents is not only their costs to attempt the tasks, but also their probabilities of failure.

Ryan Porter; Amir Ronen; Yoav Shoham; Moshe Tennenholtz

2002-01-01

271

The first supplement to the international document Guide to the Expression of Uncertainty in Measurement suggests applying the principle of maximum entropy when assigning a probability distribution to a measurable quantity based on various types of information. This paper discusses the optimization algorithms used in maximum entropy distribution estimation. By analyzing the characteristics of the non-linear programming problem in this paper,
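For the simplest case of such an assignment (a discrete support with a known mean), the maximum entropy distribution has a closed exponential (Gibbs) form, and the single Lagrange multiplier can be found by bisection because the constrained mean is monotone in it. The helper below is an illustrative sketch, not the paper's algorithm:

```python
import numpy as np

def maxent_discrete(x, mean_target, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Max-entropy pmf on support x subject to sum(p * x) = mean_target.

    The solution has the Gibbs form p_i proportional to exp(lam * x_i);
    the multiplier lam is found by bisection on the (monotone) mean.
    """
    x = np.asarray(x, dtype=float)

    def pmf(lam):
        w = np.exp(lam * (x - x.max()))   # shift exponent for stability
        return w / w.sum()

    lo, hi = lam_lo, lam_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if pmf(mid) @ x < mean_target:
            lo = mid
        else:
            hi = mid
    return pmf(0.5 * (lo + hi))

# With the mean at the midpoint of a symmetric support, maximum entropy
# recovers the uniform pmf, as expected.
p = maxent_discrete([0, 1, 2, 3, 4], 2.0)
print(np.round(p, 4))
```

Additional moment constraints (e.g. a known variance) add further multipliers, turning the bisection into the kind of non-linear programming problem the paper addresses.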

Fang Xinghua; Song Mingshun

2010-01-01

272

Structural Damage Assessment under Uncertainty

NASA Astrophysics Data System (ADS)

Structural damage assessment has applications in the majority of engineering structures and mechanical systems ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring system are to ascertain the condition of a structure and to provide an evaluation of changes as a function of time, as well as providing an early warning of an unsafe condition. There are many structural health monitoring and assessment techniques developed for research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and the development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of vibration-based monitoring systems is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed to be used for continuous monitoring of structures. The robust damage detection approach developed is based on an ensemble of dimensional reduction algorithms for improved damage-sensitive feature extraction. For damage localization, the determination of an experimental structural model was performed based on output-only modal analysis.
An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are isolated based on the integration of sensitivity analysis and statistical sampling, which minimizes the occurrence of false-damage indication due to uncertainty. To perform diagnostic decision-making under uncertainty, an evidential reasoning approach for damage assessment is developed for addressing the possible imprecision in the damage localization results. The newly developed damage detection and localization techniques are applied and validated through both vibration test data from literature and in house laboratory experiments.
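One member of such an ensemble of dimension-reduction detectors can be sketched as a PCA-subspace novelty detector trained on baseline (undamaged) features: a feature vector far from the baseline subspace is flagged as possible damage. The synthetic data, the number of retained components and the percentile threshold below are all illustrative assumptions, not the dissertation's settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Baseline (undamaged) feature vectors, e.g. modal or spectral features.
baseline = rng.normal(0.0, 1.0, (200, 8))
baseline[:, 0] *= 3.0                     # one dominant direction

# Fit a PCA subspace to the baseline data via SVD.
mu = baseline.mean(axis=0)
_, _, vt = np.linalg.svd(baseline - mu, full_matrices=False)
components = vt[:2]                       # keep 2 principal directions

def reconstruction_error(x):
    """Distance from the baseline subspace; large => possible damage."""
    centered = x - mu
    projected = centered @ components.T @ components
    return np.linalg.norm(centered - projected, axis=-1)

# Set the alarm threshold from the baseline errors themselves.
threshold = np.percentile(reconstruction_error(baseline), 99)

# A feature vector shifted well outside the baseline subspace trips the alarm.
test = mu + 10.0 * np.ones(8)
print(reconstruction_error(test) > threshold)
```

In an ensemble, several such detectors (different reductions, different feature sets) would vote, reducing false indications from any single projection.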

Lopez Martinez, Israel

273

Principles of Safety Pharmacology

Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacodynamic/pharmacokinetic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice, the burden of proof and to highlight areas of intensive activity (such as testing for drug-induced rare event liability, and the challenge of testing the safety of so-called biologics (antibodies, gene therapy and so on)).

Pugsley, M K; Authier, S; Curtis, M J

2008-01-01

274

[Principles of callus distraction].

Callus distraction is based on the principle of regenerating bone by continuous distraction of proliferating callus tissue. It has become the standard treatment of significant leg shortening and large bone defects. Due to many problems and complications, exact preoperative planning, operative technique and careful postoperative follow-up are essential. External fixators can be used for all indications of callus distraction. However, due to pin tract infections, pain and loss of mobility caused by soft tissue transfixation, fixators are applied in patients with open growth plates, simultaneous lengthening with continuous deformity corrections, and increased risk of infection. Distraction over an intramedullary nail allows removal of the external fixator at the end of distraction before callus consolidation (monorail method). The intramedullary nail protects newly formed callus tissue and reduces the risk of axial deviation and refractures. Recently developed, fully intramedullary lengthening devices eliminate fixator-associated complications and accelerate return to normal daily activities. This review describes principles of callus distraction, potential complications and their management. PMID:15452653

Hankemeier, S; Bastian, L; Gosling, T; Krettek, C

2004-10-01

275

Principle of relative locality

We propose a deepening of the relativity principle according to which the invariant arena for nonquantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them. This framework, in which absolute locality is replaced by relative locality, results from deforming energy-momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of energy-momentum space geometry, such as its curvature, torsion and nonmetricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of energy-momentum space with a metric compatible connection and constant curvature.

Amelino-Camelia, Giovanni [Dipartimento di Fisica, Universita 'La Sapienza', and Sez. Roma1 INFN, P. le A. Moro 2, 00185 Roma (Italy); Freidel, Laurent; Smolin, Lee [Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo, Ontario N2J 2Y5 (Canada); Kowalski-Glikman, Jerzy [Institute for Theoretical Physics, University of Wroclaw, Pl. Maxa Borna 9, 50-204 Wroclaw (Poland)

2011-10-15

276

Cosmology with Minimal Length Uncertainty Relations

NASA Astrophysics Data System (ADS)

We study the effects of the existence of a minimal observable length in the phase space of classical and quantum de Sitter (dS) and anti-de Sitter (AdS) cosmology. Since this length has been suggested in quantum gravity and string theory, its effects in the early universe might be expected. Adopting the existence of such a minimal length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between minisuperspace variables and their momentum operators. We extend these deformed commutation relations to the corresponding deformed Poisson algebra in the classical limit. Using the resulting Poisson and Heisenberg relations, we then construct the classical and quantum cosmology of dS and AdS models in a canonical framework. We show that in classical dS cosmology this effect yields an inflationary universe in which the rate of expansion is larger than that of the usual dS universe. Also, for the AdS model it is shown that the GUP might change the oscillatory nature of the corresponding cosmology. We also study the effects of the GUP in quantized models through approximate analytical solutions to the Wheeler-DeWitt (WD) equation, in the limit of a small scale factor for the universe, and compare the results with the ordinary quantum cosmology in each case.
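The minimal-length deformation referred to here is conventionally written in the quadratic form of Kempf, Mangano and Mann (the abstract does not spell out its exact convention, so this is the standard form rather than necessarily the paper's):

```latex
% Deformed Heisenberg algebra with deformation parameter \beta > 0
[\,x, p\,] = i\hbar \left( 1 + \beta p^{2} \right),
% which, for states with \langle p \rangle = 0, implies
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2} \left( 1 + \beta (\Delta p)^{2} \right),
% and hence a minimal observable length
\Delta x_{\min} = \hbar \sqrt{\beta}.
```

Minimizing the right-hand side over Δp (the minimum occurs at Δp = 1/√β) gives the quoted minimal length, which is why no state can be localized more sharply than ħ√β.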

Vakili, Babak

277

Principles for high-quality, high-value testing.

A survey of doctors working in two large NHS hospitals identified over 120 laboratory tests, imaging investigations and investigational procedures that they considered to be overused. A common suggestion in this survey was that more training was required, and this prompted the development of a list of core principles for high-quality, high-value testing. The list can be used as a framework for training and as a reference source. The core principles are: (1) Base testing practices on the best available evidence. (2) Apply the evidence on test performance with careful judgement. (3) Test efficiently. (4) Consider the value (and affordability) of a test before requesting it. (5) Be aware of the downsides and drivers of overdiagnosis. (6) Confront uncertainties. (7) Be patient-centred in your approach. (8) Consider ethical issues. (9) Be aware of normal cognitive limitations and biases when testing. (10) Follow the 'knowledge journey' when teaching and learning these core principles. PMID:22740357

Power, Michael; Fell, Greg; Wright, Michael

2013-02-01

278

Principles for high-quality, high-value testing

A survey of doctors working in two large NHS hospitals identified over 120 laboratory tests, imaging investigations and investigational procedures that they considered to be overused. A common suggestion in this survey was that more training was required, and this prompted the development of a list of core principles for high-quality, high-value testing. The list can be used as a framework for training and as a reference source. The core principles are: (1) Base testing practices on the best available evidence. (2) Apply the evidence on test performance with careful judgement. (3) Test efficiently. (4) Consider the value (and affordability) of a test before requesting it. (5) Be aware of the downsides and drivers of overdiagnosis. (6) Confront uncertainties. (7) Be patient-centred in your approach. (8) Consider ethical issues. (9) Be aware of normal cognitive limitations and biases when testing. (10) Follow the ‘knowledge journey’ when teaching and learning these core principles.

Power, Michael; Fell, Greg; Wright, Michael

2013-01-01

279

Quantification of uncertainty in geochemical reactions

NASA Astrophysics Data System (ADS)

Predictions of reactive transport in the subsurface are routinely compromised by both model (structural) and parametric uncertainties. We present a set of computational tools for quantifying these two types of uncertainties. The model uncertainty is resolved at the molecular scale where epistemic uncertainty incorporates aleatory uncertainty. The parametric uncertainty is resolved at both molecular and continuum (Darcy) scales. We use the proposed approach to quantify uncertainty in modeling the sorption of neptunium through a competitive ion exchange. This radionuclide is of major concern for various high-level waste storage projects because of its relatively long half-life and its high-solubility and low-sorption properties. We demonstrate how parametric and model uncertainties affect one's ability to estimate the distribution coefficient. The uncertainty quantification tools yield complete probabilistic descriptions of key parameters affecting the fate and migration of neptunium in the subsurface, rather than merely the lower statistical moments. This is important, since these distributions are highly skewed.

Srinivasan, Gowri; Tartakovsky, Daniel M.; Robinson, Bruce A.; Aceves, Alejandro B.

2007-12-01

280

Experimental uncertainty estimation and statistics for data having interval uncertainty.

This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
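The easy cases of the descriptive statistics the report covers can be illustrated directly: the mean and the median are monotone in each observation, so their tightest bounds follow from applying them endpoint-wise to the intervals (variance and other nonmonotone statistics are harder, as the report discusses). The data below are invented:

```python
import numpy as np

# Each measurement is an epistemic interval [lo, hi] rather than a point.
data = np.array([
    [1.0, 1.4],
    [0.8, 1.1],
    [1.2, 1.9],
    [0.9, 1.3],
])

lo, hi = data[:, 0], data[:, 1]

# The mean and median are monotone in each observation, so their tightest
# bounds come from applying them to the lower and upper endpoints separately.
mean_bounds = (lo.mean(), hi.mean())
median_bounds = (np.median(lo), np.median(hi))

print(f"mean   in [{mean_bounds[0]:.3f}, {mean_bounds[1]:.3f}]")
print(f"median in [{median_bounds[0]:.3f}, {median_bounds[1]:.3f}]")
```

For the variance, by contrast, the extremes need not occur at interval endpoints simultaneously, which is why the report treats its computability as a function of interval overlap and width.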

Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

2007-05-01

281

Evaluating the uncertainty of input quantities in measurement models

NASA Astrophysics Data System (ADS)

The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. 
We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.

Possolo, Antonio; Elster, Clemens

2014-06-01

282

Scientific basis for the Precautionary Principle.

The Precautionary Principle is based on two general criteria: (a) appropriate public action should be taken in response to limited, but plausible and credible, evidence of likely and substantial harm; (b) the burden of proof is shifted from demonstrating the presence of risk to demonstrating the absence of risk. Not much has been written about the scientific basis of the precautionary principle, apart from the uncertainty that characterizes epidemiologic research on chronic disease, and the use of surrogate evidence when human evidence cannot be provided. It is proposed in this paper that a new scientific paradigm, based on the theory of evolution, is emerging; this might offer stronger support to the need for precaution in the regulation of environmental risks. Environmental hazards do not consist only in direct attacks to the integrity of DNA or other macromolecules. They can consist in changes that take place already in utero, and that condition disease risks many years later. Also, environmental exposures can act as "stressors", inducing hypermutability (the mutator phenotype) as an adaptive response. Finally, environmental changes should be evaluated against a background of a not-so-easily modifiable genetic make-up, inherited from a period in which humans were mainly hunter-gatherers and had dietary habits very different from the current ones. PMID:15990131

Vineis, Paolo

2005-09-01

283

Dynamical principles in neuroscience

Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?

Rabinovich, Mikhail I.; Varona, Pablo; Selverston, Allen I.; Abarbanel, Henry D. I. [Institute for Nonlinear Science, University of California, San Diego, 9500 Gilman Drive 0402, La Jolla, California 92093-0402 (United States); GNB, Departamento de Ingenieria Informatica, Universidad Autonoma de Madrid, 28049 Madrid (Spain); Department of Physics and Marine Physical Laboratory, Scripps Institution of Oceanography (United States)]

2006-10-15

284

There is a growing pressure on clinical chemistry laboratories to conform to quality standards that require the evaluation and expression of the uncertainty of results of measurement. Nevertheless, there is some reluctance to accept the uncertainty concept in the analytical community due to the difficulty of evaluating uncertainty in practice. For example, the uncertainty of some uncertainty components is often not known very well in clinical chemistry measurements, such as those associated with matrix effects or with the values of the calibrators. Moreover, it is not clear how to interpret uncertainty in relation to diagnostic criteria, reference ranges and other decision limits in clinical chemistry practice. Hence, the value of reporting the uncertainty of the measurement result is not obvious. In this paper a relatively simple, logical procedure for evaluating measurement uncertainty is suggested, based on the principles in the Guide to the Expression of Uncertainty in Measurement (GUM). The measurement process is partitioned into elements that are well known to the analyst, namely sampling, calibration, and analysis. The corresponding model function expresses the result of a measurement as the value obtained by the analytical procedure multiplied by the correction factors for sampling bias, for bias caused by the calibrators, and for other types of bias. Under normal conditions, when the measurement procedure is validated and corrected for all known bias, the expected value of each correction factor is one. The uncertainty that remains with regard to sampling, manufacturing of calibrators and other types of bias is combined with the analytical imprecision to yield a combined uncertainty of a result of measurement. The advantages of this approach are: (i) Data from the method validation, internal quality control and from participation in external quality control schemes can be used as input in the uncertainty evaluation process.
(ii) The partition of the measurement into well-defined tasks highlights the different responsibilities of the clinical chemistry laboratory and of the manufacturer of reagents and calibrators. (iii) The approach can be used to harmonize the uncertainty evaluation process, which is particularly relevant for laboratories seeking accreditation under ISO 17025. The application of the proposed model is demonstrated by evaluating the uncertainty of a result of a measurement of prolactin in human serum. In the example it is shown how to treat the uncertainty associated with a calibrator supplied with a commercial analytical kit, and how to evaluate the uncertainty associated with matrix effects. PMID:11758604
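The multiplicative model described above, with correction factors of expected value one, lends itself to a simple quadrature combination of relative standard uncertainties. A minimal sketch, assuming independent components; the numerical values are hypothetical, not taken from the paper:

```python
import math

def combined_uncertainty(value, rel_uncertainties):
    """Combine independent relative standard uncertainties in quadrature,
    as in the GUM for a purely multiplicative model function."""
    u_rel = math.sqrt(sum(u ** 2 for u in rel_uncertainties))
    return u_rel, value * u_rel

# Hypothetical prolactin result: analytical value 250 mIU/L with relative
# standard uncertainties from imprecision, the calibrator and matrix effects.
u_rel, u_abs = combined_uncertainty(250.0, [0.03, 0.02, 0.015])
print(u_rel, u_abs)
```

Each relative uncertainty enters symmetrically, so data from method validation and quality-control schemes can simply be appended to the list.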

Kristiansen, J

2001-10-01

285

Quantifying the uncertainty in heritability.

The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large. PMID:24670270

Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

2014-05-01

286

Uncertainty in flood risk mapping

NASA Astrophysics Data System (ADS)

A flood refers to a sharp increase of water level or volume in rivers and seas caused by sudden rainstorms or melting ice due to natural factors. In this paper, the flooding of riverside urban areas caused by sudden rainstorms is studied. In this context, flooding occurs when the water runs above the level of the minor river bed and enters the major river bed. The level of the major bed determines the magnitude and risk of the flooding. The prediction of the flooding extent is usually deterministic, and corresponds to the expected limit of the flooded area. However, there are many sources of uncertainty in the process of obtaining these limits, which influence the obtained flood maps used for watershed management or as instruments for territorial and emergency planning. In addition, small variations in the delineation of the flooded area can be translated into erroneous risk prediction. Therefore, maps that reflect the uncertainty associated with the flood modeling process have started to be developed, associating a degree of likelihood with the boundaries of the flooded areas. In this paper an approach is presented that enables the influence of parameter uncertainty, which depends on the type of Land Cover Map (LCM) and Digital Elevation Model (DEM), on the estimated values of the peak flow and the delineation of flooded areas to be evaluated (different peak flows correspond to different flood areas). The approach requires modeling the DEM uncertainty and its propagation to the catchment delineation. The results obtained in this step enable a catchment with fuzzy geographical extent to be generated, where a degree of possibility of belonging to the basin is assigned to each elementary spatial unit. Since the fuzzy basin may be considered as a fuzzy set, the fuzzy area of the basin may be computed, generating a fuzzy number. The catchment peak flow is then evaluated using fuzzy arithmetic.
With this methodology a fuzzy number is obtained for the peak flow, indicating all possible peak flow values and the possibility of their occurrence. To produce the LCM, a supervised soft classifier is used to classify a satellite image and a possibility distribution is assigned to the pixels. These extra data provide additional land cover information at the pixel level and allow the assessment of the classification uncertainty, which is then considered in the identification of the parameter uncertainty used to compute the peak flow. The proposed approach was applied to produce vulnerability and risk maps that integrate uncertainty in the urban area of Leiria, Portugal. A SPOT-4 satellite image and DEMs of the region were used, and the peak flow was computed using the Soil Conservation Service method. HEC-HMS, HEC-RAS, Matlab and ArcGIS software programs were used. The analysis of the results obtained for the presented case study enables the order of magnitude of the uncertainty in the watershed peak flow value to be assessed and the areas most susceptible to flood risk to be identified.
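Fuzzy arithmetic of the kind used here for the peak flow can be sketched with alpha-cut interval arithmetic on triangular fuzzy numbers. The fuzzy runoff coefficient and flow term below are invented illustrations, not values from the study:

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def mul_intervals(x, y):
    """Interval multiplication, the core operation of fuzzy arithmetic."""
    p = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return (min(p), max(p))

# Hypothetical fuzzy runoff coefficient and fuzzy flow term:
c = (0.4, 0.5, 0.6)          # dimensionless
q = (80.0, 100.0, 120.0)     # m^3/s
for alpha in (0.0, 0.5, 1.0):
    print(alpha, mul_intervals(alpha_cut(c, alpha), alpha_cut(q, alpha)))
```

At alpha = 1 the product collapses to the most possible value; at alpha = 0 it spans the full range of possible peak flows.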

Gonçalves, Luisa M. S.; Fonte, Cidália C.; Gomes, Ricardo

2014-05-01

287

Predictive uncertainty in environmental modelling.

Artificial neural networks have proved an attractive approach to non-linear regression problems arising in environmental modelling, such as statistical downscaling, short-term forecasting of atmospheric pollutant concentrations and rainfall run-off modelling. However, environmental datasets are frequently very noisy and characterized by a noise process that may be heteroscedastic (having input dependent variance) and/or non-Gaussian. The aim of this paper is to review existing methodologies for estimating predictive uncertainty in such situations and, more importantly, to illustrate how a model of the predictive distribution may be exploited in assessing the possible impacts of climate change and to improve current decision making processes. The results of the WCCI-2006 predictive uncertainty in environmental modelling challenge are also reviewed, suggesting a number of areas where further research may provide significant benefits. PMID:17531441

Cawley, Gavin C; Janacek, Gareth J; Haylock, Malcolm R; Dorling, Stephen R

2007-05-01

288

A Qualitative Approach to Uncertainty

NASA Astrophysics Data System (ADS)

We focus on modelling dual epistemic attitudes (belief-disbelief, knowledge-ignorance, like-dislike) of an agent. This provides an interesting way to express different levels of uncertainties explicitly in the logical language. After introducing a dual modal framework, we discuss the different possibilities of an agent's attitude towards a proposition that can be expressed in this framework, and provide a preliminary look at the dynamics of the situation.

Ghosh, Sujata; Velázquez-Quesada, Fernando R.

289

Parametric Uncertainty Modeling using LFTs

In this paper a general approach for modelling structured real-valued parametric perturbations is presented. It is based on a decomposition of perturbations into linear fractional transformations (LFTs), and is applicable to rational multi-dimensional (ND) polynomial perturbations of entries in state-space models. Model reduction is used to reduce the size of the uncertainty structure. The procedure will be applied for the

Paul Lambrechts; Jan Terlouw; Samir Bennani; Maarten Steinbuch

1993-01-01

290

Fuzzy-algebra uncertainty assessment

A significant number of analytical problems (for example, abnormal-environment safety analysis) depend on data that are partly or mostly subjective. Since fuzzy algebra depends on subjective operands, we have been investigating its applicability to these forms of assessment, particularly for portraying uncertainty in the results of PRA (probabilistic risk analysis) and in risk-analysis-aided decision-making. Since analysis results can be a major contributor to a safety-measure decision process, risk management depends on relating uncertainty to only known (not assumed) information. The uncertainties due to abnormal environments are even more challenging than those in normal-environment safety assessments, and therefore require an even more judicious approach. Fuzzy algebra matches these requirements well. One of the most useful aspects of this work is that we have shown the potential for significant differences (especially in perceived margin relative to a decision threshold) between fuzzy assessment and probabilistic assessment based on subtle factors inherent in the choice of probability distribution models. We have also shown the relation of fuzzy algebra assessment to "bounds" analysis, as well as a description of how analyses can migrate from bounds analysis to fuzzy-algebra analysis, and to probabilistic analysis as information about the process to be analyzed is obtained. Instructive examples are used to illustrate the points.

Cooper, J.A. [Sandia National Labs., Albuquerque, NM (United States)]; Cooper, D.K. [Naval Research Lab., Washington, DC (United States)]

1994-12-01

291

Uncertainty propagation in nuclear forensics.

Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined. PMID:24607529
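For a parent decaying to a stable daughter that was completely removed at chemical separation, the atom ratio R = N_daughter/N_parent grows as exp(lambda*t) - 1, so t = ln(1 + R)/lambda and first-order propagation gives u(t) = u(R)/(lambda*(1 + R)). A sketch under those assumptions; the half-life is the real 239Pu value, but the measured ratio is invented:

```python
import math

def age_from_atom_ratio(ratio, half_life, u_ratio):
    """Age since chemical separation from the daughter/parent atom ratio
    R = exp(lambda * t) - 1, assuming a stable daughter, no daughter atoms
    at separation, and first-order uncertainty propagation."""
    lam = math.log(2) / half_life
    t = math.log(1.0 + ratio) / lam
    u_t = u_ratio / (lam * (1.0 + ratio))  # |dt/dR| * u(R)
    return t, u_t

# Hypothetical measurement: 239Pu -> 235U pair (half-life 24110 a),
# atom ratio 0.00172 with 1% relative standard uncertainty.
t, u_t = age_from_atom_ratio(0.00172, 24110.0, 1.72e-5)
print(t, u_t)
```

For R much smaller than 1, as in young material, t is nearly linear in R and the relative uncertainty of the age approximately equals that of the ratio.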

Pommé, S; Jerome, S M; Venchiarutti, C

2014-07-01

292

Uncertainty compliant design flood estimation

NASA Astrophysics Data System (ADS)

Infrastructures are commonly designed with reference to target values of flood peak, estimated using probabilistic techniques such as flood frequency analysis. The application of these techniques involves levels of uncertainty which are sometimes quantified but normally not accounted for explicitly in the decision regarding design discharges. The present approach aims at defining a procedure which enables Uncertainty Compliant Design (UNCODE) values of flood peaks to be obtained. To pursue this goal, we first demonstrate the equivalence of the Standard design based on the return period and the cost-benefit procedure, when linear cost and damage functions are used. We then use this result to assign an expected cost to estimation errors, thus setting a framework to obtain a design flood estimator which minimizes the total expected cost. This procedure properly accounts for the uncertainty which is inherent in the frequency curve estimation. Application of the UNCODE procedure to real cases leads to a remarkable displacement of the design flood from the Standard values. UNCODE estimates are systematically larger than the Standard ones, with substantial differences (up to 55%) when large return periods or short data samples are considered.
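The cost-benefit side of the argument can be illustrated with a toy expected-cost minimization over a flood frequency curve. Everything below (Gumbel parameters, linear construction cost, damage value) is hypothetical and only shows the shape of the idea, not the UNCODE procedure itself:

```python
import math

def gumbel_exceedance(q, mu=100.0, beta=30.0):
    """Annual exceedance probability of discharge q under a Gumbel frequency curve."""
    return 1.0 - math.exp(-math.exp(-(q - mu) / beta))

def expected_total_cost(q_design, unit_cost=1.0, damage=500.0):
    # linear construction cost plus damage weighted by exceedance probability
    return unit_cost * q_design + damage * gumbel_exceedance(q_design)

# Brute-force search for the design discharge minimizing expected total cost.
best = min(range(100, 400), key=expected_total_cost)
print(best, expected_total_cost(best))
```

With linear cost and damage functions the minimizer plays the role of the design value; accounting for frequency-curve uncertainty shifts this optimum upward, which is the effect the abstract reports.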

Botto, A.; Ganora, D.; Laio, F.; Claps, P.

2014-05-01

293

Quantifying Uncertainty in Epidemiological Models

Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.

Ramanathan, Arvind [ORNL]; Jha, Sumit Kumar [University of Central Florida]

2012-01-01

294

Quantification of uncertainties in composites

NASA Technical Reports Server (NTRS)

An integrated methodology is developed for computationally simulating the probabilistic composite material properties at all composite scales. The simulation requires minimum input consisting of the description of uncertainties at the lowest scale (fiber and matrix constituents) of the composite and in the fabrication process variables. The methodology allows the determination of the sensitivity of the composite material behavior to all the relevant primitive variables. This information is crucial for reducing the undesirable scatter in composite behavior at its macro scale by reducing the uncertainties in the most influential primitive variables at the micro scale. The methodology is computationally efficient. The computational time required by the methodology described herein is an order of magnitude less than that for Monte Carlo Simulation. The methodology has been implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of the methodology/code are demonstrated by simulating the uncertainties in the heat-transfer, thermal, and mechanical properties of a typical laminate and comparing the results with the Monte Carlo simulation method and experimental data. The important observation is that the computational simulation for probabilistic composite mechanics has sufficient flexibility to capture the observed scatter in composite properties.

Liaw, D. G.; Singhal, S. N.; Murthy, P. L. N.; Chamis, Christos C.

1993-01-01

295

The precautionary principle is incoherent.

This article argues that no version of the precautionary principle can be reasonably applied to decisions that may lead to fatal outcomes. In support of this strong claim, a number of desiderata are proposed, which reasonable rules for rational decision making ought to satisfy. Thereafter, two impossibility theorems are proved, showing that no version of the precautionary principle can satisfy the proposed desiderata. These theorems are directly applicable to recent discussions of the precautionary principle in medicine, biotechnology, environmental management, and related fields. The impossibility theorems do not imply, however, that the precautionary principle is of no relevance at all in policy discussions. Even if it is not a reasonable rule for rational decision making, it is possible to interpret the precautionary principle in other ways, e.g., as an argumentative tool or as an epistemic principle favoring a reversed burden of proof. PMID:16834620

Peterson, Martin

2006-06-01

296

Uncertainty Analysis of Instrument Calibration and Application.

National Technical Information Service (NTIS)

Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of indiv...

J. S. Tripp; P. Tcheng

1999-01-01

297

Measurement Uncertainty Considerations for Coordinate Measuring Machines.

National Technical Information Service (NTIS)

The report examines some uncertainty considerations for dimensional measurements performed on a three axis coordinate measuring machine (CMM). The interaction between measurement uncertainty and part tolerance is briefly presented, and the factors affecti...

B. Borchardt, G. Caskey, S. D. Phillips

1993-01-01

298

Extended Forward Sensitivity Analysis for Uncertainty Quantification.

National Technical Information Service (NTIS)

This report presents the forward sensitivity analysis method as a means for quantification of uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a black box approach. The simulation tool is treated as an unk...

H. Zhao; V. A. Mousseau

2008-01-01

299

Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

NASA Astrophysics Data System (ADS)

A newly developed automatic testing system, used in the laboratory for diffusion pump performance measurement, is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, the influence of each component and test step on the final uncertainty is studied. Using the differential method, the mathematical model for the systematic uncertainty transfer function is established. Finally, by case study, the combined uncertainties of manual operation and automatic operation are compared with each other (6.11% and 5.87%, respectively). The reasonableness and practicality of this newly developed automatic testing system are proved.

Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

300

Uncertainty in visual processes predicts geometrical optical illusions.

It is proposed in this paper that many geometrical optical illusions, as well as illusory patterns due to motion signals in line drawings, are due to the statistics of visual computations. The interpretation of image patterns is preceded by a step where image features such as lines, intersections of lines, or local image movement must be derived. However, there are many sources of noise or uncertainty in the formation and processing of images, and they cause problems in the estimation of these features; in particular, they cause bias. As a result, the locations of features are perceived erroneously and the appearance of the patterns is altered. The bias occurs with any visual processing of line features; under average conditions it is not large enough to be noticeable, but illusory patterns are such that the bias is highly pronounced. Thus, the broader message of this paper is that there is a general uncertainty principle which governs the workings of vision systems, and optical illusions are an artifact of this principle. PMID:14751556

Fermüller, Cornelia; Malm, Henrik

2004-03-01

301

Calibration uncertainty of assembled array hydrophones

This paper presents a method to predict the uncertainty of a procedure for calibration of fully assembled array hydrophones in a shallow water environment. All error sources (due to measurement elements, instruments, background noise, calibration setup and environment) of the calibration procedure are defined and combined to predict the final calibration uncertainty theoretically. The methodology for predicting the calibration uncertainty

Y. Gao; P. Harvey; P. Cooper; P. Baker

2010-01-01

302

Optimization Under Uncertainty in Online Trading Agents

Reasoning about uncertainty is an increasingly important aspect of automated decision making in domains such as airline crew scheduling, vehicle routing, and supply chain management. In this thesis, I examine the impact of various types of uncertainty on automated reasoning in such domains, as well as effective methods for addressing the uncertainty. The specific problem facing agents in the

Michael Benisch

303

Inflation and inflation uncertainty in Turkey

This article investigates the relationship between inflation and inflation uncertainty, and the impact of monetary policy on this relationship, using monthly Turkish inflation data from January 1984 to October 2005. The results from various types of GARCH-M models indicate that higher inflation rates lead to greater inflation uncertainty. On the other hand, the effect of inflation uncertainty on inflation is found

Sami Keskek; Mehmet Orhan

2010-01-01

304

Assessment of Uncertainty-Infused Scientific Argumentation

ERIC Educational Resources Information Center

Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

2014-01-01

305

Magnetic Core Memory Principles

NSDL National Science Digital Library

A researcher from the Department of Physics and Astronomy at the University of Glasgow provides this website on Magnetic RAM (MRAM) -- a non-volatile memory storage system similar to Flash memory except that it uses less power and switches faster. Predicting that "2005 could see mass production of MRAM parts" to be used in powering instant-on computers and computers that are in stand-by power-savings mode (as is currently done with PDAs and laptops), the author reviews some of the physical challenges yet to be overcome. The website provides some basic information on magnetic memory and binary notation, as well as sections on: the Principle of the Magnetic Memory, The Rectangular Hysteresis Loop, A Magnetic Memory Element, Arrangement of Magnetic Core Memories, Relation between the Decimal and Binary Codes, How Numbers Are Stored in a Memory, How a Binary-Coded Decimal Digit is 'written in,' How a Digit is 'read out,' and a Complete Wiring Diagram of a Matrix Plane.

Doherty, Frederico A.

2008-01-23

306

ERIC Educational Resources Information Center

Compares three theories examining the role of communication in producing and coping with subjective uncertainty. Notes that uncertainty reduction theory offers axioms and derived theorems that describe communicative and noncommunicative causes and consequences of uncertainty. Compares meanings of "uncertainty" in the three theories as well as the…

Bradac, James J.

2001-01-01

307

NASA Astrophysics Data System (ADS)

Summary: Since the topographical data obtained from LiDAR (Light Detection and Ranging) measurements are superior in resolution and accuracy to conventional geospatial data, aerial LiDAR has been widely used over the last decade for obtaining geospatial information. However, digital terrain models made from LiDAR data retain some degree of uncertainty as a result of the measurement principles and the operational limitations of LiDAR surveying. LiDAR cannot precisely measure topographical elements such as ground undulation covered by vegetation, curbstones, etc. Such instrumental and physical uncertainties may impact an estimated result in an inundation flow simulation. Meanwhile, how much and in what way these topographical uncertainties affect calculated results is not understood. To evaluate the effect of topographical uncertainty on the calculated inundation flow, three representative terrains were prepared that included errors in elevation. The topographical uncertainty introduced was generated using a fractal algorithm in order to represent the spatial structure of the elevation uncertainty. Then, inundation flows over the model terrains were calculated with an unstructured finite volume flow model that solves the shallow water equations. The sensitivity of the elevation uncertainty on the calculated inundation propagation, especially the local flow velocity, was evaluated. The predictability of inundation flow over complex topography is discussed, as well as its relationship to topographical features.

Tsubaki, Ryota; Kawahara, Yoshihisa

2013-04-01

308

The propagation of uncertainty with calibration equations

NASA Astrophysics Data System (ADS)

A new method for propagating uncertainty, based on interpolation theory, is developed to solve the problem of propagating uncertainty in linear interpolating equations. The method is extended to nonlinear equations, and to over-determined linear or nonlinear equations fitted by least-squares methods. The paper also provides an algebraic explanation of the rationale for calibration, describes some effects of correlation on uncertainty propagation and investigates the properties of interpolation errors. Several examples are described showing the application of the method to polynomial interpolation, the amplification of uncertainty with extrapolation and the effects of correlation between uncertainties in corrections.
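The interpolation-based propagation can be sketched for the simplest case, Lagrange interpolation with uncorrelated calibration-point uncertainties, where the basis polynomials act as the sensitivity coefficients; extrapolation then visibly amplifies the uncertainty, as the paper notes. The calibration points and uncertainties below are hypothetical:

```python
def lagrange_basis(xs, x, i):
    """i-th Lagrange basis polynomial at x; for uncorrelated calibration-point
    uncertainties these act as the sensitivity coefficients."""
    li = 1.0
    for j, xj in enumerate(xs):
        if j != i:
            li *= (x - xj) / (xs[i] - xj)
    return li

def interp_with_uncertainty(xs, ys, us, x):
    ls = [lagrange_basis(xs, x, i) for i in range(len(xs))]
    y = sum(l * yi for l, yi in zip(ls, ys))
    u = sum((l * ui) ** 2 for l, ui in zip(ls, us)) ** 0.5
    return y, u

# Hypothetical three-point calibration with equal point uncertainties:
xs, ys, us = [0.0, 50.0, 100.0], [0.1, 49.8, 100.2], [0.05, 0.05, 0.05]
print(interp_with_uncertainty(xs, ys, us, 75.0))   # interpolation
print(interp_with_uncertainty(xs, ys, us, 150.0))  # extrapolation amplifies u
```

The basis coefficients always sum to one inside and outside the calibration range, but outside it they grow in magnitude with alternating signs, so the quadrature sum, and hence the propagated uncertainty, is amplified.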

White, D. R.; Saunders, P.

2007-07-01

309

Uncertainty assessment in probabilistic risk assessment

This paper focuses on our proposal for the different roles that data and expert opinion play in uncertainty analysis. Parameters for which reliable data exist are estimated by classical statistical techniques. Their uncertainty bounds are statistical confidence limits. Uncertainty about data-free parameters is expressed as a range, or set, of plausible values, with no probabilistic connotations. For parameters with both data and opinion sources, conditional confidence limits can be used to assess both total uncertainty, and the separate contributions of data-based and data-free uncertainties.

Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

1985-01-01

310

Error models for uncertainty quantification

NASA Astrophysics Data System (ADS)

In groundwater modeling, uncertainty on the permeability field leads to a stochastic description of the aquifer system, in which the quantities of interests (e.g., groundwater fluxes or contaminant concentrations) are considered as stochastic variables and described by their probability density functions (PDF) or by a finite number of quantiles. Uncertainty quantification is often evaluated using Monte-Carlo simulations, which employ a large number of realizations. As this leads to prohibitive computational costs, techniques have to be developed to keep the problem computationally tractable. The Distance-based Kernel Method (DKM) [1] limits the computational cost of the uncertainty quantification by reducing the stochastic space: first, the realizations are clustered based on the response of a proxy; then, the full model is solved only for a subset of realizations defined by the clustering and the quantiles are estimated from this limited number of realizations. Here, we present a slightly different strategy that employs an approximate model rather than a proxy: we use the Multiscale Finite Volume method (MsFV) [2,3] to compute an approximate solution for each realization, and to obtain a first assessment of the PDF. In this context, DKM is then used to identify a subset of realizations for which the exact model is solved and compared with the solution of the approximate model. This allows highlighting and correcting possible errors introduced by the approximate model, while keeping full statistical information on the ensemble of realizations. Here, we test several strategies to compute the model error, correct the approximate model and achieve an optimal PDF estimation. We present a case study in which we predict the breakthrough curve of an ideal tracer for an ensemble of realizations generated via Multiple Point Direct Sampling [4] with a training image obtained from a 2D section of the Herten permeability field [5]. [1] C. Scheidt and J. 
Caers, "Representing spatial uncertainty using distances and kernels", Math. Geosci. (2009). [2] P. Jenny et al., "Multi-scale finite-volume method for elliptic problems in subsurface flow simulation", J. Comp. Phys., 187(1) (2003). [3] I. Lunati and S. H. Lee, "An operator formulation of the multiscale finite-volume method with correction function", Multiscale Model. Simul., 8(1) (2009). [4] G. Mariethoz, P. Renard, and J. Straubhaar, "The Direct Sampling method to perform multiple-point geostatistical simulations", Water Resour. Res., 46 (2010). [5] P. Bayer et al., "Three-dimensional high resolution fluvio-glacial aquifer analog", J. Hydrol., 405 (2011).

Josset, L.; Scheidt, C.; Lunati, I.

2012-12-01

311

[Ethical principles in psychiatric action].

There is no specifically psychiatric ethic. The ethical principles for practical action in psychiatry have to be adapted from the generally accepted ethical principles, which are grounded in a psychobiologically developed ethic of love: honesty, discretion, empathy, patience, distance, consistency, accountability, tolerance, economic neutrality. PMID:24983582

Rüther, Eckart

2014-07-01

312

Principles of instructed language learning

This article represents an attempt to draw together findings from a range of second language acquisition studies in order to formulate a set of general principles for language pedagogy. These principles address such issues as the nature of second language (L2) competence (as formulaic and rule-based knowledge), the contributions of both focus on meaning and on form, the need to

Rod Ellis

2005-01-01

313

Design Principles for Children's Technology

Designers of children's technology and software face distinctive challenges. Many design principles used for adult interfaces cannot be applied to children's products because the needs, skills, and expectations of this user population are drastically different than those of adults. In recent years, designers have started developing design principles for children, but this work has not been collected in one place.

Sonia Chiasson; Carl Gutwin

314

Principles of Instructed Language Learning

ERIC Educational Resources Information Center

This article represents an attempt to draw together findings from a range of second language acquisition studies in order to formulate a set of general principles for language pedagogy. These principles address such issues as the nature of second language (L2) competence (as formulaic and rule-based knowledge), the contributions of both focus on…

Ellis, Rod

2005-01-01

315

Data Analysis: Fundamental Counting Principle

NSDL National Science Digital Library

This lesson plan presents an activity where students use charts and tree diagrams to show the possible outcomes of probability experiments and the likelihood of each event. In the plan the teacher guides the class to understand and apply the fundamental counting principle. Two independent worksheets provide students with more practice creating sample spaces and applying the fundamental counting principle.
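The fundamental counting principle (the number of outcomes for a sequence of independent choices is the product of the number of options at each stage) can be checked by enumerating a sample space; the clothing choices below are an invented example:

```python
from itertools import product
from math import prod

# Three independent choices: 3 shirts x 2 pants x 2 pairs of shoes.
choices = [["red", "blue", "green"], ["jeans", "khakis"], ["sneakers", "boots"]]
sample_space = list(product(*choices))
print(len(sample_space))              # enumerated outcomes -> 12
print(prod(len(c) for c in choices))  # counting principle  -> 3 * 2 * 2 = 12
```

Enumerating the tuples mirrors the tree diagram from the lesson, while the product gives the count without listing anything.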

2012-01-01

316

Reflection principles in computational logic

We introduce the concept of reflection principle as a knowledge representation paradigm in a computational logic setting. Reflection principles are expressed as certain kinds of logic schemata intended to capture the basic properties of the domain knowledge to be modelled. Reflection is then used to instantiate these schemata to answer specific queries about the domain. This differs from other approaches

Jonas Barklund; Pierangelo Dell'acqua; Stefania Costantini; Gaetano Aurelio Lanzarone

2000-01-01

317

Computational principles of movement neuroscience

Unifying principles of movement have emerged from the computational study of motor control. We review several of these principles and show how they apply to processes such as motor planning, control, estimation, prediction and learning. Our goal is to demonstrate how specific models emerging from the computational approach provide a theoretical framework for movement neuroscience.

Zoubin Ghahramani; Daniel M. Wolpert

2000-01-01

318

Quantum principles and free particles. [evaluation of partitions]

NASA Technical Reports Server (NTRS)

The quantum principles that establish the energy levels and degeneracies needed to evaluate the partition functions are explored. The uncertainty principle is associated with the dual wave-particle nature of the model used to describe quantized gas particles. The Schroedinger wave equation is presented as a generalization of Maxwell's wave equation; the former applies to all particles while the Maxwell equation applies to the special case of photon particles. The size of the quantum cell in phase space and the representation of momentum as a space derivative operator follow from the uncertainty principle. A consequence of this is that steady-state problems that are space-time dependent for the classical model become only space dependent for the quantum model and are often easier to solve. The partition function is derived for quantized free particles and, at normal conditions, the result is the same as that given by the classical phase integral. The quantum corrections that occur at very low temperatures or high densities are derived. These corrections for the Einstein-Bose gas qualitatively describe the condensation effects that occur in liquid helium, but are unimportant for most practical purposes otherwise. However, the corrections for the Fermi-Dirac gas are important because they quantitatively describe the behavior of high-density conduction electron gases in metals and explain the zero point energy and low specific heat exhibited in this case.
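For reference, the classical phase-integral result that the quantized free-particle partition function reproduces at normal conditions is the standard translational partition function of an ideal gas (a textbook formula, not quoted from the report):

```latex
q_{\text{trans}}
  = \frac{V}{h^{3}} \int e^{-p^{2}/2mk_{B}T}\, d^{3}p
  = V \left( \frac{2\pi m k_{B} T}{h^{2}} \right)^{3/2}
```

The quantum corrections discussed in the abstract appear as density- and temperature-dependent departures from this expression.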

1976-01-01

319

Conditional Uncertainty in Anthropogenic Global Climate Change

NASA Astrophysics Data System (ADS)

Although the uncertainty associated with human-induced climate change is less than that in many other human activities such as economic management and warfare, the uncertainties in the climate system have assumed a disproportionate profile in public debate. Achieving improved public understanding is dependent on consistent use of the various categories of change and their respective uncertainties. Probably the most important distinction to be made is between uncertainties associated with uncertain societal choices and uncertainties associated with the consequences of such choices. For the biogeochemical system, categories of uncertainty are adapted from those used in the study of uncertainty for the REgional Carbon Assessment and Processes (RECCAP) study. These are then extended and applied to the discussion of the combined carbon-climate system. Characterising uncertainties in future change requires a consistent approach to propagating into the future the uncertainties associated with the past and present state of the climate system. Again, previous analysis for the carbon system is extended to the carbon-climate system. The potential category ambiguities that arise from feedbacks between climate and carbon are identified and resolved. A consistent characterisation of the uncertainties in the earth system provides a basis for factoring the overall uncertainty into human and natural contributions.

Enting, I. G.

2012-12-01

320

Performance of Trajectory Models with Wind Uncertainty

NASA Technical Reports Server (NTRS)

Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst various RUC time-lagged ensemble forecasts. This proof of concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members) allowing identification of regional variations in uncertainty.
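The spread-based uncertainty estimate can be sketched as follows. The grid size, member count, and synthetic wind fields below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical 4-member time-lagged ensemble of wind forecasts
# valid at the same time, on a small grid: shape (member, y, x).
rng = np.random.default_rng(0)
base = rng.normal(20.0, 5.0, size=(10, 10))  # synthetic wind field, m/s
members = np.stack(
    [base + rng.normal(0.0, 2.0, base.shape) for _ in range(4)]
)

# The spread among lagged members serves as the per-grid-point
# uncertainty estimate for the wind prediction.
spread = members.std(axis=0, ddof=1)

# A trajectory predictor could then carry mean +/- spread downstream.
print(float(spread.mean()))
```

In the study, the members come from successively older RUC forecast cycles verifying at the same time rather than from synthetic perturbations.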

Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.

2009-01-01

321

Hydrotectonics; principles and relevance

Hydrotectonics combines the principles of hydraulics and rock mechanics. The hypothesis assumes that: (1) no faults are truly planar, (2) opposing noncongruent wavy wallrock surfaces form chambers and bottlenecks along the fault, and (3) most thrusting occurs beneath the water table. These physical constraints permit the following dynamics. Shear displacement accompanying faulting must constantly change the volume of each chamber. Addition of ground water liquefies dry fault breccia to a heavy incompressible viscous muddy breccia I call fault slurry. When the volume of a chamber along a thrust fault decreases faster than its fault slurry can escape laterally, overpressurized slurry is hydraulically injected into the base of near-vertical fractures in the otherwise impervious overriding plate. Breccia pipes commonly form where such fissures intersect. Alternating decrease and increase in volume of the chamber subjects this injection slurry to reversible surges that not only raft and abrade huge clasts sporadically spalled from the walls of the conduit but also act as a forceful hydraulic ram which periodically widens the conduit and extends its top. If the pipe perforates a petroleum reservoir, leaking hydrocarbons float to its top. Sudden faulting may generate a powerful water hammer that can be amplified at some distal narrow ends of the anastomosing plumbing system, where the shock may produce shatter cones. If vented on the Earth's surface, the muddy breccia, now called extrusion slurry, forms a mud volcano. This hypothesis suggests that many highly disturbed features presently attributed to such catastrophic processes as subsurface explosions or meteorite impacts are due to the rheology of tectonic slurry in an intermittently reactivated pressure-relief tube rooted in a powerful reciprocating hydrotectonic pump activated by a long-lived deep-seated thrust fault.

Kopf, R. W.

1982-01-01

322

Periodontal diseases are among the most common diseases affecting humans. Dental biofilm is a contributor to the etiology of most periodontal diseases. It is also widely accepted that immunological and inflammatory responses to biofilm components are manifested by signs and symptoms of periodontal disease. The outcome of such interaction is modulated by risk factors (modifiers), either inherent (genetic) or acquired (environmental), significantly affecting the initiation and progression of different periodontal disease phenotypes. While definitive genetic determinants responsible for either susceptibility or resistance to periodontal disease have yet to be identified, many factors affecting the pathogenesis have been described, including smoking, diabetes, obesity, medications, and nutrition. Currently, periodontal diseases are classified based upon clinical disease traits using radiographs and clinical examination. Advances in genomics, molecular biology, and personalized medicine may result in new guidelines for unambiguous disease definition and diagnosis in the future. Recent studies have implied relationships between periodontal diseases and systemic conditions. Answering critical questions regarding host-parasite interactions in periodontal diseases may provide new insight in the pathogenesis of other biomedical disorders. Therapeutic efforts have focused on the microbial nature of the infection, as active treatment centers on biofilm disruption by non-surgical mechanical debridement with antimicrobial and sometimes anti-inflammatory adjuncts. The surgical treatment aims at gaining access to periodontal lesions and correcting unfavorable gingival/osseous contours to achieve a periodontal architecture that will provide for more effective oral hygiene and periodontal maintenance. 
In addition, advances in tissue engineering have provided innovative means to regenerate/repair periodontal defects, based upon principles of guided tissue regeneration and utilization of growth factors/biologic mediators. To maintain periodontal stability, these treatments need to be supplemented with long-term maintenance (supportive periodontal therapy) programs. PMID:23240942

Dentino, Andrew; Lee, Seokwoo; Mailhot, Jason; Hefti, Arthur F

2013-02-01

323

Aspects of universally valid Heisenberg uncertainty relation

NASA Astrophysics Data System (ADS)

A numerical illustration of a universally valid Heisenberg uncertainty relation, which was proposed recently, is presented by using the experimental data on spin-measurements by J. Erhart et al. [Nat. Phys. 8, 185 (2012)]. This uncertainty relation is closely related to a modified form of the Arthurs-Kelly uncertainty relation, which is also tested by the spin-measurements. The universally valid Heisenberg uncertainty relation always holds, but both the modified Arthurs-Kelly uncertainty relation and the Heisenberg error-disturbance relation proposed by Ozawa, which was analyzed in the original experiment, fail in the present context of spin-measurements, and the cause of their failure is identified with the assumptions of unbiased measurement and disturbance. It is also shown that all the universally valid uncertainty relations are derived from Robertson's relation and thus the essence of the uncertainty relation is exhausted by Robertson's relation, as is widely accepted.
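Robertson's relation, which the abstract identifies as the essence of these uncertainty relations, is the standard inequality for two observables (a textbook statement, not quoted from the paper):

```latex
\sigma_A \,\sigma_B \;\geq\; \frac{1}{2}\,\bigl|\langle [\hat{A}, \hat{B}] \rangle\bigr|
```

For conjugate position and momentum, \([\hat{x}, \hat{p}] = i\hbar\), this reduces to the familiar \(\sigma_x \sigma_p \geq \hbar/2\).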

Fujikawa, Kazuo; Umetsu, Koichiro

2013-01-01

324

Neural correlates of intolerance of uncertainty.

Many future events are unpredictable, which is considered unacceptable by individuals with an intolerance of uncertainty (IU). We investigated the influence of two related personality traits, IU and habitual worrying, on neural correlates of affective uncertainty with functional magnetic resonance imaging. Thirty females viewed a warning cue that always preceded an aversive picture, a safety cue that always preceded a neutral picture and an uncertainty cue that signaled that an aversive or a neutral picture might be shown (probability: 50%:50%). The processing of uncertainty was associated with activation of the posterior frontomedian cortex (PFMC), the dorsolateral prefrontal cortex, and the anterior cingulate cortex. IU and habitual worrying were positively correlated with amygdala activity during experienced uncertainty. Moreover, IU correlated negatively with PFMC activity. This response pattern might reflect that uncertainty is threatening to individuals high in IU and that they lack adequate cognitive mechanisms to cope with the uncertainty. PMID:20570602

Schienle, Anne; Köchel, Angelika; Ebner, Franz; Reishofer, Gernot; Schäfer, Axel

2010-08-01

325

Radiologist Uncertainty and the Interpretation of Screening

Objective To determine radiologists’ reactions to uncertainty when interpreting mammography and the extent to which radiologist uncertainty explains variability in interpretive performance. Methods The authors used a mailed survey to assess demographic and clinical characteristics of radiologists and reactions to uncertainty associated with practice. Responses were linked to radiologists’ actual interpretive performance data obtained from 3 regionally located mammography registries. Results More than 180 radiologists were eligible to participate, and 139 consented for a response rate of 76.8%. Radiologist gender, more years interpreting, and higher volume were associated with lower uncertainty scores. Positive predictive value, recall rates, and specificity were more affected by reactions to uncertainty than sensitivity or negative predictive value; however, none of these relationships was statistically significant. Conclusion Certain practice factors, such as gender and years of interpretive experience, affect uncertainty scores. Radiologists’ reactions to uncertainty do not appear to affect interpretive performance.

Carney, Patricia A.; Elmore, Joann G.; Abraham, Linn A.; Gerrity, Martha S.; Hendrick, R. Edward; Taplin, Stephen H.; Barlow, William E.; Cutter, Gary R.; Poplack, Steven P.; D'Orsi, Carl J.

2011-01-01

326

Few Group Collapsing of Covariance Matrix Data Based on a Conservation Principle

A new algorithm for a rigorous collapsing of covariance data is proposed, derived, implemented, and tested. The method is based on a conservation principle that allows the uncertainty calculated in a fine group energy structure for a specific integral parameter, using as weights the associated sensitivity coefficients, to be preserved at a broad energy group structure.
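The conservation principle can be illustrated with a small sensitivity-weighted collapse. The 6-to-2 group structure and random data below are a hypothetical example; the paper's actual algorithm may differ in detail:

```python
import numpy as np

# Hypothetical fine structure: 6 energy groups collapsed into 2 broad
# groups of 3 each, weighted by sensitivity coefficients.
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 6))
C_fine = A @ A.T                         # symmetric covariance matrix
s_fine = rng.uniform(0.5, 1.5, size=6)   # sensitivity coefficients
groups = [list(range(0, 3)), list(range(3, 6))]

# Broad-group sensitivities are sums over the member fine groups.
s_broad = np.array([s_fine[g].sum() for g in groups])

# Collapse: each broad-group covariance element is the sensitivity-
# weighted sum of the corresponding fine-group block.
C_broad = np.array(
    [[s_fine[gi] @ C_fine[np.ix_(gi, gj)] @ s_fine[gj]
      / (s_broad[I] * s_broad[J])
      for J, gj in enumerate(groups)]
     for I, gi in enumerate(groups)]
)

# Conservation check: the integral-parameter variance is preserved.
var_fine = s_fine @ C_fine @ s_fine
var_broad = s_broad @ C_broad @ s_broad
assert np.isclose(var_fine, var_broad)
```

This weighting makes the broad-group quadratic form reproduce the fine-group one exactly, which is the conservation property the abstract describes.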

H. Hiruta; G. Palmiotti; M. Salvatores; R. Arcilla, Jr.; R. D. McKnight; G. Aliberti; P. Oblozinsky; W. S. Yang

2008-12-01

327

A principle with quality assurance of ion chromatography (IC) is presented. Since the majority of scientists and customers are interested in the determination of the true amount of analyte in real samples, the focus of attention should be directed towards the concept of accuracy rather than focussing on precision. By exploiting the principle of pooled calibrations and retainment of all outliers it was possible to obtain full correspondence between calibration uncertainty and repetition uncertainty, which for the first time evidences statistical control in experiments with ion chromatography. Anions of bromide were analysed and the results were subjected to quality assurance (QA). It was found that the limit of quantification (LOQ) was significantly underestimated by up to a factor of 30 with respect to the determination of concentration of unknowns. The concept of lower-limit of analysis (LLA) and upper-limit of analysis (ULA) were found to provide more acceptable limits with respect to reliable analysis with a limited number of repetitions. An excellent correspondence was found between calibration uncertainty and repetition uncertainty. These findings comply with earlier investigations of method validations where it was found that the principle of pooled calibrations provides a more realistic picture of the analytical performance with the drawback, however, that generally higher levels of uncertainties should be accepted, as compared to contemporary literature values. The implications to the science of analytical chemistry in general and to method validations in particular are discussed. PMID:23040989
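The idea of pooling calibration runs and deriving a calibration-based uncertainty for an unknown can be sketched as follows. The concentrations, signals, and three-run structure are synthetic assumptions; the paper's IC data and exact statistics may differ:

```python
import numpy as np

# Hypothetical pooled calibration: fit one line to all three
# calibration runs together, retaining all points (no outlier removal).
rng = np.random.default_rng(2)
conc = np.tile([0.5, 1.0, 2.0, 4.0, 8.0], 3)   # mg/L, three pooled runs
signal = 12.0 * conc + rng.normal(0.0, 0.8, conc.size)

slope, intercept = np.polyfit(conc, signal, 1)
resid = signal - (slope * conc + intercept)
s_y = resid.std(ddof=2)                         # pooled residual std. dev.

# Concentration of an unknown and its calibration-based uncertainty
# (standard prediction formula for inverse linear calibration).
y_unknown = 30.0
x_unknown = (y_unknown - intercept) / slope
n = conc.size
s_x = (s_y / slope) * np.sqrt(
    1.0 + 1.0 / n
    + (x_unknown - conc.mean()) ** 2 / ((conc - conc.mean()) ** 2).sum()
)
print(x_unknown, s_x)
```

Pooling all runs into one fit is what lets the calibration uncertainty be compared directly against the repetition uncertainty, as the abstract describes.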

Andersen, Jens E T; Mikolajczak, Maria; Wojtachnio-Zawada, Katarzyna Olga; Nicolajsen, Henrik Vigan

2012-11-01

328

NASA Astrophysics Data System (ADS)

At the end of the XIXth century, physics was dominated by two main theories: classical (or Newtonian) mechanics and electromagnetism. To be entirely correct, we should add thermodynamics, which seemed to be grounded on different principles, but whose links with mechanics were progressively better understood thanks to the work of Maxwell and Boltzmann, among others. Classical mechanics, born with Galileo and Newton, claimed to explain the motion of lumps of matter under the action of forces. The paradigm for a lump of matter is a particle, or a corpuscle, which one can intuitively think of as a billiard ball of tiny dimensions, and which will be dubbed a micro-billiard ball in what follows. The second main component of XIXth century physics, electromagnetism, is a theory of the electric and magnetic fields and also of optics, thanks to the synthesis between electromagnetism and optics performed by Maxwell, who understood that light waves are nothing other than a particular case of electromagnetic waves. We had, on the one hand, a mechanical theory where matter exhibiting a discrete character (particles) was carried along well localized trajectories and, on the other hand, a wave theory describing continuous phenomena which did not involve transport of matter. The two theories addressed different domains, the only obvious link being the law giving the force on a charged particle submitted to an electromagnetic field, or Lorentz force. In 1905, Einstein put an end to this dichotomic wave/particle view and launched two revolutions of physics: special relativity and quantum physics. First, he showed that Newton's equations of motion must be modified when the particle velocities are not negligible with respect to that of light: this is the special relativity revolution, which introduces in mechanics a quantity characteristic of optics, the velocity of light. 
However, this is an aspect of the Einsteinian revolution which will not interest us directly, with the exception of Chapter 7. Then Einstein introduced the particle aspect of light: in modern language, he introduced the quantum properties of the electromagnetic field, epitomized by the concept of photon. After briefly recalling the main properties of waves in classical physics, this chapter will lead us to the heart of the quantum world, elaborating on an example which is studied in some detail, the Mach-Zehnder interferometer. This apparatus is widely used today in physics laboratories, but we shall limit ourselves to a schematic description, at the level of what my experimental colleagues would call "a theorist's version of an interferometer".

Bellac, Michel Le

2014-11-01

329

Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations. PMID:19874664

Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

2009-10-01

330

Uncertainty Analysis of Instrument Calibration and Application

NASA Technical Reports Server (NTRS)

Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
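First-order propagation through a defining functional expression, including a correlated-error term of the kind the paper treats, can be sketched as follows. The function, values, and correlation coefficient are illustrative assumptions:

```python
import numpy as np

# Hypothetical derived quantity f(p, q) = p / q (e.g. a pressure ratio).
p, q = 101.3, 98.7        # measured values (assumed units: kPa)
sp, sq = 0.15, 0.12       # standard uncertainties of p and q
cov_pq = 0.5 * sp * sq    # assumed correlation of 0.5 between sensors

f = p / q
dfdp = 1.0 / q            # partial derivatives of f
dfdq = -p / q**2

# First-order (Taylor) propagation with the covariance term included.
var_f = (dfdp * sp) ** 2 + (dfdq * sq) ** 2 + 2.0 * dfdp * dfdq * cov_pq
print(f, np.sqrt(var_f))
```

Here the correlation reduces the combined uncertainty because the two partial derivatives have opposite signs; ignoring correlated precision error would overstate the uncertainty of the ratio.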

Tripp, John S.; Tcheng, Ping

1999-01-01

331

Novel Mathematical and Computational Techniques for Robust Uncertainty Quantification.

National Technical Information Service (NTIS)

Uncertainty quantification refers to a broad set of techniques for understanding the impact of uncertainties in complicated mechanical and physical systems. In this context 'uncertainty' can take on many meanings. Aleatoric uncertainty refers to inherent ...

D. Gottlieb J. Hesthaven P. Dupuis

2011-01-01

332

Principles of Pharmacotherapy: I. Pharmacodynamics

This paper and the ensuing series present the principles guiding and affecting the ability of drugs to produce therapeutic benefit or untoward harm. The principles of pharmacodynamics and pharmacokinetics, the physiologic basis of adverse drug reactions and suitable antidotal therapy, and the biologic basis of drug allergy, drug-drug interactions, pharmacogenetics, teratology and hematologic reactions to chemicals are explored. These principles serve to guide those administering and using drugs to attain the maximum benefit and least attendant harm from their use. Such is the goal of rational therapeutics.

Pallasch, Thomas J.

1988-01-01

333

Ethical principles--emergency medicine.

Neither law nor religion, bioethics absorbs and applies elements of both. Its theories, principles, and methods stem from various philosophical schools. Practitioners use case-based reasoning to apply bioethics to clinical situations, usually giving most weight to patients' autonomy and values, but also incorporating other relevant bioethical principles, including those encompassed in professional oaths and codes. Emergency clinicians must be able to recognize bioethical dilemmas, have action plans based on their readings and discussions, and have a method through which to apply ethical principles in clinical settings. This article provides an overview of ethical considerations and guidelines for emergency clinicians. PMID:16877128

Iserson, Kenneth V

2006-08-01

334

OECD Principles of Corporate Governance

NSDL National Science Digital Library

The "Organisation for Economic Co-operation and Development Principles of Corporate Governance" sets out a structure for directing and controlling corporate businesses. This document (html or .pdf) consists of five sections detailing the principles: "The rights of shareholders," "The equitable treatment of shareholders," "The role of stakeholders in corporate governance," "Disclosure and transparency," and "The responsibilities of the board," as well as annotations for each of the sections. Be sure to visit the OECD Principles of Corporate Governance Q&A page, linked at the top of the page.

335

NASA Astrophysics Data System (ADS)

It is shown that the Bekenstein-Hawking entropy of black holes can accept a correction that effects on the radiation tunneling probability. By assumption of a spatially flat universe accompanied with expansion of metric, we could obtain an expression for entropy of black hole that is changing with respect to time and Bekenstein-Hawking temperature.

Khani, F.; Baghbani, R.; Darvishi, M. T.

2013-01-01

336

NASA Technical Reports Server (NTRS)

A review on the current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

1993-01-01

337

NASA Technical Reports Server (NTRS)

A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.

Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro

1993-01-01

338

Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.

ERIC Educational Resources Information Center

Argues that reliance on the outcome of quantitative standardized tests to assess student performance is a misplaced quest for certainty in an uncertain world. Reviews and lauds Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill. (PKP)

Wassermann, Selma

2001-01-01

339

Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.

Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P. [Department of Medical Physics in Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg (Germany); Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg, Germany and Department of Radiation Oncology, University Clinic Heidelberg, 69120 Heidelberg (Germany); Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Medical Physics in Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg (Germany)

2012-04-15

340

Theoretical uncertainty in baryon oscillations

We discuss the systematic uncertainties in the recovery of dark energy properties from the use of baryon acoustic oscillations as a standard ruler. We demonstrate that while unknown relativistic components in the universe prior to recombination would alter the sound speed, the inferences for dark energy from low-redshift surveys are unchanged so long as the microwave background anisotropies can measure the redshift of matter-radiation equality, which they can do to sufficient accuracy. The mismeasurement of the radiation and matter densities themselves (as opposed to their ratio) would manifest as an incorrect prediction for the Hubble constant at low-redshift. In addition, these anomalies do produce subtle but detectable features in the microwave anisotropies.

Eisenstein, Daniel [Steward Observatory, University of Arizona, Tucson, Arizona 85721 (United States); White, Martin [Departments of Physics and Astronomy, University of California, Berkeley, California 94720 (United States)

2004-11-15

341

What does the precautionary principle mean for evidence-based dentistry?

The precautionary principle calls for preventive actions in the face of uncertain information about risks. It serves as a compass to better guide more health-protective decisions in the face of complex risks. Applying precaution requires thinking more broadly about risks, taking an interdisciplinary approach to science and policy, and considering a wide range of alternatives to potentially harmful activities. While often criticized as antiscientific, the precautionary principle represents a challenge to scientists and public health professionals to develop newer and more effective tools for characterizing and preventing complex risks, in addition to being more explicit about uncertainties. This article examines the role and application of precaution in the context of dental practice, where activities that may convey risks also have public health benefits, and risk trade-offs are a possibility. We conclude that the precautionary principle is not at odds with, but rather complements evidence-based practice in situations of scientific uncertainty and complex risks. PMID:17138389

Tickner, Joel; Coffin, Melissa

2006-03-01

342

Optimization principles for convective heat transfer

Optimization for convective heat transfer plays a significant role in energy saving and high-efficiency utilizing. We compared two optimization principles for convective heat transfer, the minimum entropy generation principle and the entransy dissipation extremum principle, and analyzed their physical implications and applicability. We derived the optimization equation for each optimization principle. The theoretical analysis indicates that both principles can be

Qun Chen; Moran Wang; Ning Pan; Zeng-Yuan Guo

2009-01-01

343

Extrema Principles Of Dissipation In Fluids

NASA Technical Reports Server (NTRS)

The report discusses the application of the principle of least action and other variational or extrema principles to the dissipation of energy and production of entropy in fluids. The principle of least action has been applied successfully to the dynamics of particles and to quantum mechanics, but it is not universally accepted that variational principles are applicable to thermodynamics and hydrodynamics. The report argues for the applicability of some extrema principles to some simple flows.

Horne, W. Clifton; Karamcheti, Krishnamurty

1991-01-01

344

Evacuation decision-making: process and uncertainty

The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability although few legal actions have ensued. Finally it is concluded that if liability for evacuations is assumed by the federal government, the concept of a ''precautionary'' evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs.

Mileti, D.; Sorensen, J.; Bogard, W.

1985-09-01

345

Environmental load uncertainties for offshore structures

A methodology for assessing the effect of different sources of uncertainty on the calculation of load effect on offshore structures is presented. A consistent classification of uncertainties was adopted and used as a basis to develop models to estimate the effect of different uncertainties on specified design loads. It is shown that distribution parameter uncertainties arising from limitations on the quantity of statistical data are not likely to have a significant effect on design loads. By contrast, model uncertainties can greatly increase the design loads, and the increase is sensitive to the probabilistic models used to describe model error. The methodology and results can be used by design engineers to take model uncertainties into account in estimating specified loads. They also form the basis for developing and calibrating a new information-sensitive code format.

Nessim, M.A.; Hong, H.P. [Centre for Engineering Research Inc., Edmonton, Alberta (Canada); Jordaan, I.J. [Memorial Univ. of Newfoundland, St. John's, Newfoundland (Canada). Faculty of Engineering and Applied Science

1995-11-01

346

NOTE: Uncertainty analysis in polymer gel dosimetry

NASA Astrophysics Data System (ADS)

Verification of advanced radiotherapy treatment modalities requires measurement of three-dimensional absorbed dose distributions with high spatial resolution and precision. Polymer gel dosimeters combined with magnetic resonance imaging may be able to fulfil this requirement. However, verification requires that the uncertainty in the dosimeter is well known. One method of estimating the overall uncertainty in polymer gel dosimeters involves the propagation of the uncertainty in the R2 (nuclear magnetic resonance relaxation rate) map and the uncertainties in the calibration data. This work shows that using this method with current data suggests that the lowest uncertainty currently obtainable is about 3% at 8 Gy and 7% at 2 Gy. Furthermore, the most significant reductions in overall uncertainty will be achieved by reducing the noise in the R2 map.
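The propagation method described in this abstract can be sketched as a first-order (GUM-style) combination of the R2-map noise and the linear-calibration uncertainties. All numerical values below are hypothetical, chosen only so the result lands near the roughly 3% at 8 Gy figure the abstract quotes; the linear calibration R2 = R2_0 + m·D is an assumed form.

```python
import math

def dose_uncertainty(R2, u_R2, R2_0, u_R20, m, u_m):
    """Propagate R2-map noise and calibration uncertainties into dose.

    Assumes a linear calibration R2 = R2_0 + m*D, so D = (R2 - R2_0)/m.
    Standard first-order (GUM-style) propagation; all inputs are hypothetical.
    """
    D = (R2 - R2_0) / m
    # Partial derivatives: dD/dR2 = 1/m, dD/dR2_0 = -1/m, dD/dm = -D/m
    u_D = math.sqrt((u_R2 / m) ** 2 + (u_R20 / m) ** 2 + (D * u_m / m) ** 2)
    return D, u_D

# Illustrative (made-up) values: R2 in 1/s, dose in Gy
D, u_D = dose_uncertainty(R2=3.0, u_R2=0.05, R2_0=1.0, u_R20=0.03, m=0.25, u_m=0.005)
print(f"D = {D:.2f} Gy +/- {u_D:.2f} Gy ({100 * u_D / D:.1f}%)")
```

With these invented inputs the relative uncertainty comes out near 3.5%, and the R2-noise term dominates, which is consistent with the abstract's point that reducing noise in the R2 map gives the largest gain.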

Baldock, C.; Murry, P.; Kron, T.

1999-11-01

347

Uncertainty analysis in polymer gel dosimetry.

Verification of advanced radiotherapy treatment modalities requires measurement of three-dimensional absorbed dose distributions with high spatial resolution and precision. Polymer gel dosimeters combined with magnetic resonance imaging may be able to fulfil this requirement. However, verification requires that the uncertainty in the dosimeter is well known. One method of estimating the overall uncertainty in polymer gel dosimeters involves the propagation of the uncertainty in the R2 (nuclear magnetic resonance relaxation rate) map and the uncertainties in the calibration data. This work shows that using this method with current data suggests that the lowest uncertainty currently obtainable is about 3% at 8 Gy and 7% at 2 Gy. Furthermore, the most significant reductions in overall uncertainty will be achieved by reducing the noise in the R2 map. PMID:10588291

Baldock, C; Murry, P; Kron, T

1999-11-01

348

The Precautionary Principle Also Applies to Public Health Actions

The precautionary principle asserts that the burden of proof for potentially harmful actions by industry or government rests on the assurance of safety and that when there are threats of serious damage, scientific uncertainty must be resolved in favor of prevention. Yet we in public health are sometimes guilty of not adhering to this principle. Examples of actions with unintended negative consequences include the addition of methyl tert-butyl ether to gasoline in the United States to decrease air pollution, the drilling of tube wells in Bangladesh to avoid surface water microbial contamination, and villagewide parenteral antischistosomiasis therapy in Egypt. Each of these actions had unintended negative consequences. Lessons include the importance of multidisciplinary approaches to public health and the value of risk–benefit analysis, of public health surveillance, and of a functioning tort system—all of which contribute to effective precautionary approaches.

Goldstein, Bernard D.

2001-01-01

349

Some Aspects of uncertainty in computational fluid dynamics results

NASA Technical Reports Server (NTRS)

Uncertainties are inherent in computational fluid dynamics (CFD). These uncertainties need to be systematically addressed and managed. Sources of these uncertainties are discussed. Some recommendations are made for the quantification of CFD uncertainties. A practical method of uncertainty analysis is based on sensitivity analysis. When CFD is used to design fluid dynamic systems, sensitivity-uncertainty analysis is essential.

Mehta, U. B.

1991-01-01

350

Energy Principle with Global Invariants.

National Technical Information Service (NTIS)

A variational principle is proposed for constructing equilibria with minimum energy in a toroidal plasma. The total energy is minimized subject to global invariants which act as constraints during relaxation of the plasma. These global integrals of motion...

A. Bhattacharjee R. L. Dewar

1981-01-01

351

Fundamental principles of particle detectors

This paper goes through the fundamental physics of particle-matter interactions which is necessary for the detection of these particles with detectors. A listing of 41 concepts and detector principles is given. 14 refs., 11 figs.

Fernow, R.C.

1988-01-01

352

Modern Measurement Principles Moderne Messprinzipien.

National Technical Information Service (NTIS)

Recent measurement techniques involving piezoelectric sensors, piezoresistive sensors, sensors using the servo principle, incremental sensors, coded sensors, gyroscopic technique, strain gages, radar, optical sensors, and lasers were evaluated. An example...

F. Mettin

1977-01-01

353

Get Provoked: Applying Tilden's Principles.

ERIC Educational Resources Information Center

This address given to the Division of Interpretation, Yellowstone National Park, Interpretive Training, June 1993, examines successes and failures in interpretive programs for adults and children in light of Tilden's principles. (LZ)

Shively, Carol A.

1995-01-01

354

Principles of the Dynamic Theory.

National Technical Information Service (NTIS)

Generalizations of the classical Thermodynamic Laws are adopted as the fundamental principles of the proposed theory, hereafter called the Dynamic Theory. An important role is played by an integrating factor which makes the energy exchange with the enviro...

P. E. Williams

1977-01-01

355

A precautionary principle for dual use research in the life sciences.

Most life science research entails dual-use complexity and may be misused for harmful purposes, e.g. biological weapons. The Precautionary Principle applies to special problems characterized by complexity in the relationship between human activities and their consequences. This article examines whether the principle, so far mainly used in environmental and public health issues, is applicable and suitable to the field of dual-use life science research. Four central elements of the principle are examined: threat, uncertainty, prescription and action. Although charges against the principle exist - for example that it stifles scientific development, lacks practical applicability and is poorly defined and vague - the analysis concludes that a Precautionary Principle is applicable to the field. Certain factors such as credibility of the threat, availability of information, clear prescriptive demands on responsibility and directives on how to act, determine the suitability and success of a Precautionary Principle. Moreover, policy-makers and researchers share a responsibility for providing and seeking information about potential sources of harm. A central conclusion is that the principle is meaningful and useful if applied as a context-dependent moral principle and allowed flexibility in its practical use. The principle may then inspire awareness-raising and the establishment of practical routines which appropriately reflect the fact that life science research may be misused for harmful purposes. PMID:19594724

Kuhlau, Frida; Höglund, Anna T; Evers, Kathinka; Eriksson, Stefan

2011-01-01

356

Experimental joint quantum measurements with minimum uncertainty.

Quantum physics constrains the accuracy of joint measurements of incompatible observables. Here we test tight measurement-uncertainty relations using single photons. We implement two independent, idealized uncertainty-estimation methods, the three-state method and the weak-measurement method, and adapt them to realistic experimental conditions. Exceptional quantum state fidelities of up to 0.99998(6) allow us to verge upon the fundamental limits of measurement uncertainty. PMID:24483993

Ringbauer, Martin; Biggerstaff, Devon N; Broome, Matthew A; Fedrizzi, Alessandro; Branciard, Cyril; White, Andrew G

2014-01-17

357

Robust Preliminary Space Mission Design under Uncertainty

This chapter presents the design of a space mission at a preliminary stage, when uncertainties are high. At this particular stage, an insufficient consideration for uncertainty could lead to a wrong decision on the feasibility of the mission. Contrary to the traditional margin approach, the methodology presented here explicitly introduces uncertainties in the design process. The overall system design is

Massimiliano Vasile; Nicolas Croisard

358

Importance of scientific uncertainty in decision making

Uncertainty in environmental decision making should not be thought of as a problem that is best ignored. In fact, as is illustrated in a simple example, we often informally make use of awareness of uncertainty by hedging decisions away from large losses. This hedging can be made explicit and formalized using the methods of decision analysis. While scientific uncertainty is
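The hedging described here can be made explicit with a tiny expected-loss calculation in the style of decision analysis. The two actions, the state probabilities, and the loss values below are invented for illustration: the point is only that the action avoiding the large loss can win even when the bad state is the less likely one.

```python
# States: pollutant load turns out "low" or "high"; scientific uncertainty
# is expressed as (hypothetical) probabilities over the states.
p = {"low": 0.7, "high": 0.3}

# Hypothetical losses (arbitrary units): strict control costs more up front
# but avoids the large loss if the load turns out to be high.
loss = {
    "relaxed_control": {"low": 1.0, "high": 20.0},
    "strict_control":  {"low": 5.0, "high": 6.0},
}

def expected_loss(action):
    # Expectation of the loss over the uncertain state.
    return sum(p[s] * loss[action][s] for s in p)

best = min(loss, key=expected_loss)
for a in loss:
    print(a, expected_loss(a))
print("choose:", best)
```

Here relaxed control has expected loss 6.7 versus 5.3 for strict control, so the formal analysis hedges toward strict control despite "high" being the less probable state.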

Kenneth H. Reckhow

1994-01-01

359

Uncertainty analysis of emergency cooling system flows

The Emergency Cooling System (ECS) for SRP reactors is designed to provide core cooling for postulated incidents. Recently, tests were completed in L Reactor to better define pipeline and fitting loss coefficients that are needed to predict ECS flows. Because the existing flow data are primarily based on experimental data, an experimental uncertainty exists in those flows and respective loss coefficients. The uncertainty analysis and resulting uncertainties of the data are discussed in this report.

Mcallister, Jr, J E

1988-01-01

360

Geostatistical modelling of uncertainty in soil science

This paper addresses the issue of modelling the uncertainty about the value of continuous soil attributes at any particular unsampled location (local uncertainty), as well as jointly over several locations (multiple-point or spatial uncertainty). Two approaches are presented: kriging-based and simulation-based techniques that can be implemented within a parametric (e.g. multi-Gaussian) or non-parametric (indicator) framework. As expected in theory and

P. Goovaerts

2001-01-01

361

Principles of Scanning Probe Microscopy

NSDL National Science Digital Library

This site offers a beautifully illustrated introduction to the principles of scanning probe microscopy. The text is interspersed with links to additional information, much of it from the Interface Physics Group at Leiden University. There are several animations included and links to visual galleries which illustrate both principles and utilization of scanning probe microscopy. An additional "links" page takes the user to sites of research groups involved in ongoing developmental work in surface science.

Frenken, Joost

2011-04-07

362

Spectral optimization and uncertainty quantification in combustion modeling

NASA Astrophysics Data System (ADS)

Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. 
Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates has recently been published, wherein the experimentally-obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that will reproduce the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered. Lastly, we show that while the capability of the multispecies measurement presents a step-change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data of global combustion properties are still necessary to predict fuel combustion.
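As a loose illustration of the kind of kinetic-rate uncertainty propagation discussed above, one can write a rate constant as k = k0 · f^x with x uniform on [-1, 1], a common spectral parameterization in which the uncertainty factor f bounds k within [k0/f, k0·f], and sample it through a toy surrogate for an ignition-delay-like observable. The nominal rate, the factor f, and the surrogate are all invented; this is not the paper's model.

```python
import random
import statistics

random.seed(0)

# Spectral parameterization of rate uncertainty: k = k0 * f**x, x in [-1, 1],
# so k stays within [k0/f, k0*f]. Values below are illustrative only.
k0, f = 1.0e6, 2.0  # nominal rate constant and its uncertainty factor

def ignition_delay(k):
    # Toy surrogate: observable inversely proportional to the rate constant.
    return 1.0e3 / k

samples = [ignition_delay(k0 * f ** random.uniform(-1.0, 1.0))
           for _ in range(20000)]
mean = statistics.mean(samples)
sd = statistics.pstdev(samples)
print(f"tau = {mean:.3e} +/- {sd:.3e} (arbitrary units)")
```

A factor-of-two rate uncertainty maps directly into a factor-of-two spread in the surrogate observable, which is the sense in which unconstrained parameter uncertainty leaves predictions only qualitatively trustworthy.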

Sheen, David Allan

363

Developmental principles: fact or fiction.

While still at school, most of us are deeply impressed by the underlying principles that so beautifully explain why the chemical elements are ordered as they are in the periodic table, and may wonder, with the theoretician Brian Goodwin, "whether there might be equally powerful principles that account for the awe-inspiring diversity of body forms in the living realm". We have considered the arguments for developmental principles, conclude that they do exist and have specifically identified features that may generate principles associated with Hox patterning of the main body axis in bilaterian metazoa in general and in the vertebrates in particular. We wonder whether this exercise serves any purpose. The features we discuss were already known to us as parts of developmental mechanisms and defining developmental principles (how, and at which level?) adds no insight. We also see little profit in the proposal by Goodwin that there are principles outside the emerging genetic mechanisms that need to be taken into account. The emerging developmental genetic hierarchies already reveal a wealth of interesting phenomena, whatever we choose to call them. PMID:22489210

Durston, A J

2012-01-01

364

Developmental Principles: Fact or Fiction

While still at school, most of us are deeply impressed by the underlying principles that so beautifully explain why the chemical elements are ordered as they are in the periodic table, and may wonder, with the theoretician Brian Goodwin, “whether there might be equally powerful principles that account for the awe-inspiring diversity of body forms in the living realm”. We have considered the arguments for developmental principles, conclude that they do exist and have specifically identified features that may generate principles associated with Hox patterning of the main body axis in bilaterian metazoa in general and in the vertebrates in particular. We wonder whether this exercise serves any purpose. The features we discuss were already known to us as parts of developmental mechanisms and defining developmental principles (how, and at which level?) adds no insight. We also see little profit in the proposal by Goodwin that there are principles outside the emerging genetic mechanisms that need to be taken into account. The emerging developmental genetic hierarchies already reveal a wealth of interesting phenomena, whatever we choose to call them.

Durston, A. J.

2012-01-01

365

Visualizing node attribute uncertainty in graphs

NASA Astrophysics Data System (ADS)

Visualizations can potentially misrepresent information if they ignore or hide the uncertainties that are usually present in the data. While various techniques and tools exist for visualizing uncertainty in scientific visualizations, there are very few tools that primarily focus on visualizing uncertainty in graphs or network data. With the popularity of social networks and other data sets that are best represented by graphs, there is a pressing need for visualization systems to show the uncertainties present in the data. This paper focuses on visualizing a particular type of uncertainty in graphs: we assume that nodes in a graph can have one or more attributes, and each of these attributes may have an uncertainty associated with it. Unlike previous efforts that visualize node or edge uncertainty by changing the appearance of the nodes or edges, e.g. by blurring, the approach in this paper is to use the spatial layout of the graph to represent the uncertainty information. We describe a prototype tool that incorporates several uncertainty-to-spatial-layout mappings and describe a scenario showing how it might be used for a visual analysis task.

Cesario, Nathaniel; Pang, Alex; Singh, Lisa

2011-01-01

366

Updated uncertainty budgets for NIST thermocouple calibrations

NASA Astrophysics Data System (ADS)

We have recently updated the uncertainty budgets for calibrations in the NIST Thermocouple Calibration Laboratory. The purpose for the updates has been to 1) revise the estimated values of the relevant uncertainty elements to reflect the current calibration facilities and methods, 2) provide uncertainty budgets for every standard calibration service offered, and 3) make the uncertainty budgets more understandable to customers by expressing all uncertainties in units of temperature (°C) rather than emf. We have updated the uncertainty budgets for fixed-point calibrations of type S, R, and B thermocouples and comparison calibrations of type R and S thermocouples using a type S reference standard. In addition, we have constructed new uncertainty budgets for comparison calibrations of type B thermocouples using a type B reference standard as well as using both a type S and type B reference standard (for calibration over a larger range). We have updated the uncertainty budgets for comparison calibrations of base-metal thermocouples using a type S reference standard and alternately using a standard platinum resistance thermometer reference standard. Finally, we have constructed new uncertainty budgets for comparison tests of noble-metal and base-metal thermoelements using a type S reference standard. A description of these updates is presented in this paper.
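Expressing an emf uncertainty in temperature units, as the updated budgets do, amounts to dividing by the local sensitivity dE/dT (the Seebeck coefficient) at the temperature of interest. The numbers below are rough illustrative values, not NIST's: a type S thermocouple near 1000 °C has a sensitivity of very roughly 11 µV/°C.

```python
def emf_to_temp_uncertainty(u_emf_uV, seebeck_uV_per_C):
    """Convert an emf uncertainty (in microvolts) into temperature units
    (deg C) by dividing by the local Seebeck coefficient dE/dT."""
    return u_emf_uV / seebeck_uV_per_C

# Illustrative numbers only: ~2 uV emf uncertainty, ~11 uV/deg C sensitivity
# (approximately representative of a type S thermocouple near 1000 deg C).
u_T = emf_to_temp_uncertainty(u_emf_uV=2.0, seebeck_uV_per_C=11.0)
print(f"u(T) ~ {u_T:.2f} deg C")
```

Because dE/dT varies with temperature and thermocouple type, the same emf uncertainty corresponds to different temperature uncertainties across a calibration range, which is why stating budgets in °C is more transparent for customers.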

Meyer, C. W.; Garrity, K. M.

2013-09-01

367

Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods

The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations, CILTS should be considered as having something like a 'factor of two' uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
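A minimal version of the first-principles error analysis mentioned in the abstract: at steady state the air change rate is A = E / (C · V) (emission rate over concentration times volume), so the relative uncertainties of the three inputs combine in quadrature at first order. The input values below are invented for illustration and are not from the article.

```python
import math

def ach_with_uncertainty(E, u_E, C, u_C, V, u_V):
    """Steady-state CILTS estimate: air change rate A = E / (C * V),
    with relative input uncertainties combined in quadrature (first order)."""
    A = E / (C * V)
    rel = math.sqrt((u_E / E) ** 2 + (u_C / C) ** 2 + (u_V / V) ** 2)
    return A, rel

# Illustrative: emission rate E in mg/h, concentration C in mg/m^3,
# house volume V in m^3; uncertainties are hypothetical.
A, rel = ach_with_uncertainty(E=5.0, u_E=0.5, C=0.02, u_C=0.003, V=300.0, u_V=15.0)
print(f"A = {A:.2f} 1/h, relative uncertainty ~ {100 * rel:.0f}%")
```

With these invented inputs the combined relative uncertainty is about 19%, in the same ballpark as the abstract's roughly 20% figure for controlled field deployments.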

Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

2013-12-01

368

NASA Astrophysics Data System (ADS)

With growing interest in understanding the magnitudes and sources of uncertainty in hydrological modeling, the difficult problem of characterizing model structure adequacy is now attracting considerable attention. Here, we examine this problem via a model-structure-independent approach based in information theory. In particular, we (a) discuss how to assess and compute the information content in multivariate hydrological data, (b) present practical methods for quantifying the uncertainty and shared information in data while accounting for heteroscedasticity, (c) show how these tools can be used to estimate the best achievable predictive performance of a model (for a system given the available data), and (d) show how model adequacy can be characterized in terms of the magnitude and nature of its aleatory uncertainty that cannot be diminished (and is resolvable only up to specification of its density), and its epistemic uncertainty that can, in principle, be suitably resolved by improving the model. An illustrative modeling example is provided using catchment-scale data from three river basins, the Leaf and Chunky River basins in the United States and the Chuzhou basin in China. Our analysis shows that the aleatory uncertainty associated with making catchment simulations using this data set is significant (~50%). Further, estimated epistemic uncertainties of the HyMod, SAC-SMA, and Xinanjiang model hypotheses indicate that considerable room for model structural improvements remain.

Gong, Wei; Gupta, Hoshin V.; Yang, Dawen; Sricharan, Kumar; Hero, Alfred O.

2013-04-01

369

The principle of finiteness - a guideline for physical laws

NASA Astrophysics Data System (ADS)

I propose a new principle in physics-the principle of finiteness (FP). It stems from the definition of physics as a science that deals with measurable dimensional physical quantities. Since measurement results including their errors, are always finite, FP postulates that the mathematical formulation of legitimate laws in physics should prevent exactly zero or infinite solutions. I propose finiteness as a postulate, as opposed to a statement whose validity has to be corroborated by, or derived theoretically or experimentally from other facts, theories or principles. Some consequences of FP are discussed, first in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The corrected Lorentz transformations include an additional translation term depending on the minimum length epsilon. The relativistic gamma is replaced by a corrected gamma, that is finite for v=c. To comply with FP, physical laws should include the relevant extremum finite values in their mathematical formulation. An important prediction of FP is that there is a maximum attainable relativistic mass/energy which is the same for all subatomic particles, meaning that there is a maximum theoretical value for cosmic rays energy. The Generalized Uncertainty Principle required by Quantum Gravity is actually a necessary consequence of FP at Planck's scale. Therefore, FP may possibly contribute to the axiomatic foundation of Quantum Gravity.

Sternlieb, Abraham

2013-04-01

370

16 CFR 260.6 - General principles.

Code of Federal Regulations, 2010 CFR

...CLAIMS § 260.6 General principles. The following general principles apply to all environmental...relative type size and proximity to the claim being qualified...exceptions to this general principle. For example, if an...

2009-01-01

371

Hamilton's principle applied to piezomagnetism and related variational principles

NASA Astrophysics Data System (ADS)

In piezomagnetism, the fundamental equations have been developed in differential form [e.g., V. I. Alshits and A. N. Darinskii, Wave Motion 15, 265-283 (1992)]. Alternatively, they may be expressed in variational form with its well-known features; this is the topic of this paper. First, the magnetic vector, that is, the gradient of the magnetic potential, is introduced [cf. the authors, Int. J. Solids Struct. 40, 4699-4706 (2003)]. Second, the sufficient conditions based on the energy argument are enumerated for a unique solution in the fundamental equations. Third, Hamilton's principle is stated and a three-field variational principle is obtained. The principle yields only the divergence equations and some natural boundary conditions, and it has the remaining fundamental equations as its constraint conditions. The conditions are generally undesirable in computation, and they are accordingly removed through an involutory transformation [e.g., the authors, Int. J. Eng. Sci. 40, 457-489 (2002)]. Thus, a unified variational principle operating on all the field variables is derived in piezomagnetism. The principle is shown, as special cases, to recover some of the earlier ones. [Work supported by TUBA.]

Dokmeci, M. Cengiz; Altay, Gulay

2001-05-01

372

As tensions between payers, responsible for ensuring prudent and principled use of scarce resources, and both providers and patients, who legitimately want access to technologies from which they could benefit, continue to mount, interest in approaches to managing the uncertainty surrounding the introduction of new health technologies has heightened. The purpose of this project was to compile an inventory of

Tania Stafinski; Christopher J. McCabe; Devidas Menon

2010-01-01

373

Uncertainty Analysis of CROPGRO-Cotton Model

NASA Astrophysics Data System (ADS)

Applications of crop simulation models have become an inherent part of research and decision making. As many decision making processes rely solely on the results obtained from simulation models, consideration of model uncertainties along with model accuracy has also become increasingly important. The newly developed CROPGRO-Cotton model is a complex simulation model that has been heavily parameterized. The values of those parameters were obtained from the literature, which also carries uncertainties. The true uncertainty associated with important model parameters was not known. The objective of this study was to estimate the uncertainties associated with model parameters and the associated uncertainties in model outputs. The uncertainty assessment was carried out using the widely accepted Generalized Likelihood Uncertainty Estimation (GLUE) technique. Data for this analysis were collected from four different experiments at three geographic locations. Primary results show that the uncertainty in model input parameters was narrowed down significantly from the prior knowledge of selected parameters. The expected means of parameters obtained from their posterior distributions were not considerably different from their prior means and default values in the model. However, importantly, the coefficients of variation of those parameters were reduced considerably. Maximum likelihood estimates of selected parameters improved the model performance. The fit of the model to measured LAI and biomass components was reasonably good, with R-squared values for total above-ground biomass for all four sites ranging between 0.86 and 0.98. The approximate reduction of uncertainties in input parameters ranged between 25%-85%, and the corresponding reductions in model output uncertainties ranged between 62%-76%.
Most of the measurements were covered within the 95% confidence interval estimated from 2.5% and 97.5% quantiles of cumulative distributions of model outputs generated from posterior distribution of model parameters. The study demonstrated an efficient prediction of uncertainties in model input and outputs using a widely accepted GLUE methodology.
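The GLUE procedure the study relies on can be sketched in a few lines: sample parameter sets from a prior, score each with an informal likelihood, keep the "behavioral" sets above a threshold, and form likelihood-weighted estimates and prediction bounds. The one-parameter toy model and "observations" below are invented; the crop model itself is far more complex.

```python
import random

random.seed(1)

# Toy model y = a * x; synthetic "observations" generated near a_true = 2.0.
xs = [1, 2, 3, 4, 5]
obs = [2.1, 3.9, 6.2, 8.1, 9.8]

def sse(a):
    # Sum of squared errors of the toy model against the observations.
    return sum((a * x - y) ** 2 for x, y in zip(xs, obs))

# GLUE: sample from a uniform prior, score with an informal likelihood,
# keep the top 10% as "behavioral" sets, then weight by likelihood.
prior = [random.uniform(0.5, 4.0) for _ in range(5000)]
scored = [(a, 1.0 / (sse(a) + 1e-9)) for a in prior]
threshold = sorted(w for _, w in scored)[int(0.9 * len(scored))]
behavioral = [(a, w) for a, w in scored if w >= threshold]

wsum = sum(w for _, w in behavioral)
a_mean = sum(a * w for a, w in behavioral) / wsum
print(f"{len(behavioral)} behavioral sets, weighted mean a = {a_mean:.2f}")
```

Quantiles of the likelihood-weighted predictions from the behavioral sets would give the kind of 2.5%-97.5% uncertainty band the abstract describes.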

Pathak, T. B.; Jones, J. W.; Fraisse, C.; Wright, D.; Hoogenboom, G.; Judge, J.

2009-12-01

374

A double-slit ‘which-way’ experiment on the complementarity–uncertainty debate

A which-way measurement in Young's double-slit will destroy the interference pattern. Bohr claimed this complementarity between wave- and particle-behaviour is enforced by Heisenberg's uncertainty principle: distinguishing two positions a distance s apart transfers a random momentum q ~ ℏ/s to the particle. This claim has been subject to debate: Scully et al (1991 Nature 351 111) asserted that in
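The momentum kick of order ℏ/s in the abstract is easy to evaluate numerically. The slit separation below is hypothetical, chosen only to give a sense of scale; the Planck constant value is the CODATA figure.

```python
# Evaluate the abstract's momentum-kick estimate q ~ hbar / s for an
# illustrative slit separation (the value of s is hypothetical).
hbar = 1.054571817e-34  # reduced Planck constant, J*s (CODATA)
s = 1.0e-6              # slit separation: 1 micrometre (hypothetical)
q = hbar / s
print(f"q ~ {q:.3e} kg*m/s")
```

Even for a micrometre-scale separation the kick is around 1e-28 kg·m/s, tiny in absolute terms but comparable to the transverse momentum spread that sets the fringe spacing, which is why it suffices to wash out the interference.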

R. Mir; J. S. Lundeen; M. W. Mitchell; A. M. Steinberg; J. L. Garretson; H. M. Wiseman

2007-01-01

375

If risk management for the professional use of dispersive nanomaterials is hampered by a lack of reliable information, the responsible manager and the policy makers have to choose to make the precautionary principle operational for the nanotech workplace. This study presents some tools that can be useful for the health & safety manager and for nanotech workers to deal with uncertainties in the nano-workplace. PMID:21485779

van Broekhuizen, Pieter

2011-02-01

376

Classical continuum theories are formulated based on the assumption of large scale separation. For scale-coupling problems involving uncertainties, novel multiscale methods are desired. In this study, by employing the generalized variational principles, a Green-function-based multiscale method is formulated to decompose a boundary value problem with random microstructure into a slow scale deterministic problem and a fast scale stochastic one. The

X. Frank Xu; Xi Chen; Lihua Shen

2009-01-01

377

EDITORIAL: Squeezed states and uncertainty relations

NASA Astrophysics Data System (ADS)

This special issue of Journal of Optics B: Quantum and Semiclassical Optics is composed mainly of extended versions of talks and papers presented at the Eighth International Conference on Squeezed States and Uncertainty Relations held in Puebla, Mexico on 9-13 June 2003. The Conference was hosted by Instituto de Astrofísica, Óptica y Electrónica, and the Universidad Nacional Autónoma de México. This series of meetings began at the University of Maryland, College Park, USA, in March 1991. The second and third workshops were organized by the Lebedev Physical Institute in Moscow, Russia, in 1992 and by the University of Maryland Baltimore County, USA, in 1993, respectively. Afterwards, it was decided that the workshop series should be held every two years. Thus the fourth meeting took place at the University of Shanxi in China and was supported by the International Union of Pure and Applied Physics (IUPAP). The next three meetings in 1997, 1999 and 2001 were held in Lake Balatonfüred, Hungary, in Naples, Italy, and in Boston, USA, respectively. All of them were sponsored by IUPAP. The ninth workshop will take place in Besançon, France, in 2005. The conference has now become one of the major international meetings on quantum optics and the foundations of quantum mechanics, where most of the active research groups throughout the world present their new results. Accordingly this conference has been able to align itself to the current trend in quantum optics and quantum mechanics. The Puebla meeting covered most extensively the following areas: quantum measurements, quantum computing and information theory, trapped atoms and degenerate gases, and the generation and characterization of quantum states of light. The meeting also covered squeeze-like transformations in areas other than quantum optics, such as atomic physics, nuclear physics, statistical physics and relativity, as well as optical devices. 
There were many new participants at this meeting, particularly from Latin American countries including, of course, Mexico. There were many talks on the subjects traditionally covered in this conference series, including quantum fluctuations, different forms of squeezing, various kinds of nonclassical states of light, and distinct representations of the quantum superposition principle, such as even and odd coherent states. The entanglement phenomenon, frequently in the form of the EPR paradox, is responsible for the main advantages of quantum engineering compared with classical methods. Even though entanglement has been known since the early days of quantum mechanics, its properties, such as the most appropriate entanglement measures, are still under current investigation. The phenomena of dissipation and decoherence of initially pure states are very important because fast decoherence can destroy all the advantages of quantum processes in teleportation, quantum computing and image processing. For this reason, methods of controlling decoherence, such as by the use of different kinds of nonlinearities and deformations, are also under study. From the very beginning of quantum mechanics, the uncertainty relations were basic inequalities distinguishing the classical and quantum worlds. Among the theoretical methods for quantum optics and quantum mechanics, this conference covered phase space and group representations, such as the Wigner and probability distribution functions, which provide an alternative approach to the Schrödinger or Heisenberg picture. Different forms of probability representations of quantum states are important tools to be applied in studying various quantum phenomena, such as quantum interference, decoherence and quantum tomography. They have also been established as a very useful tool in all branches of classical optics. 
From the mathematical point of view, it is well known that the coherent and squeezed states are representations of the Lorentz group. It was noted throughout the conference that another form of the Lorentz group, namely the 2 × 2 representation of the group SL(2,C), is becoming

Jauregue-Renaud, Rocio; Kim, Young S.; Man'ko, Margarita A.; Moya-Cessa, Hector

2004-06-01

378

Ideas underlying quantification of margins and uncertainties (QMU): a white paper.

This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.
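The separation of aleatory and epistemic uncertainty emphasized above is often operationalized with a nested ("double-loop") Monte Carlo, where each epistemic draw yields a full aleatory distribution. This is a generic sketch with invented numbers (threshold, load statistics), not the report's method.

```python
import numpy as np

rng = np.random.default_rng(7)

CAPACITY = 85.0  # assumed performance threshold (illustrative)

def reliability(load_mean, n=10000):
    # Inner loop: aleatory variability of the load for a fixed
    # epistemic state of knowledge.
    loads = rng.normal(load_mean, 5.0, n)
    return np.mean(loads < CAPACITY)  # P(margin > 0)

# Outer loop: the load mean itself is poorly known (epistemic).
rels = [reliability(rng.uniform(60.0, 80.0)) for _ in range(50)]

# The spread of `rels` is an epistemic "horsetail": a band of possible
# reliabilities rather than a single best-estimate number.
print(round(min(rels), 3), round(max(rels), 3))
```

Reporting the band rather than collapsing it into one number is what keeps the two kinds of uncertainty separate for risk-informed decision making.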

Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

2006-09-01

379

Uncertainty reasoning in expert systems

NASA Technical Reports Server (NTRS)

Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
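The three choices listed above — representations for "small" and "big", operations for "and"/"or", and a method to turn uncertain recommendations into a precise control — can be illustrated with a minimal sketch. The membership ranges and the single rule are invented for illustration, and two common "and" operations (minimum and product) are compared.

```python
import numpy as np

# (1) Representations: membership functions for "big velocity" (m/s)
# and "small distance" (m), here simple ramps with assumed ranges.
def big(v, lo=5.0, hi=20.0):
    return np.clip((v - lo) / (hi - lo), 0.0, 1.0)

def small(d, lo=1.0, hi=10.0):
    return np.clip((hi - d) / (hi - lo), 0.0, 1.0)

# (2) Two common choices for "and": minimum (Zadeh) and product.
def and_min(a, b): return min(a, b)
def and_prod(a, b): return a * b

# Rule: "IF velocity is big AND distance is small THEN brake hard"
v, d = 15.0, 3.0
alpha_min = and_min(big(v), small(d))
alpha_prod = and_prod(big(v), small(d))

# (3) Centroid defuzzification of the clipped output set gives a
# precise braking command in [0, 1].
u = np.linspace(0.0, 1.0, 201)            # candidate braking levels
brake = np.minimum(u, alpha_min)          # output membership, clipped
u_star = (u * brake).sum() / brake.sum()  # crisp braking command
print(alpha_min, alpha_prod, round(u_star, 3))
```

Which pair of choices is optimal is exactly the non-linear optimization problem the abstract addresses with group-theoretic methods.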

Kreinovich, Vladik

1993-01-01

380

Strong majorization entropic uncertainty relations

NASA Astrophysics Data System (ADS)

We analyze entropic uncertainty relations in a finite-dimensional Hilbert space and derive several strong bounds for the sum of two entropies obtained in projective measurements with respect to any two orthogonal bases. We improve the recent bounds by Coles and Piani [P. Coles and M. Piani, Phys. Rev. A 89, 022112 (2014), 10.1103/PhysRevA.89.022112], which are known to be stronger than the well-known result of Maassen and Uffink [H. Maassen and J. B. M. Uffink, Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103]. Furthermore, we find a bound based on majorization techniques, which also happens to be stronger than the recent results involving the largest singular values of submatrices of the unitary matrix connecting both bases. The first set of bounds gives better results for unitary matrices close to the Fourier matrix, while the second one provides a significant improvement in the opposite sectors. Some results derived admit generalization to arbitrary mixed states, so that corresponding bounds are increased by the von Neumann entropy of the measured state. The majorization approach is finally extended to the case of several measurements.
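The Maassen–Uffink bound that the abstract improves upon can be checked numerically for the simplest case, a qubit measured in two mutually unbiased bases; the state and the choice of bases below are illustrative.

```python
import numpy as np

def shannon(p, eps=1e-12):
    p = p[p > eps]
    return -(p * np.log2(p)).sum()

# Two orthonormal bases of C^2: computational and Hadamard (mutually
# unbiased, so the bound is as strong as possible for a qubit).
A = np.eye(2)
B = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

# Maassen-Uffink: H(A) + H(B) >= -2 log2 max_ij |<a_i|b_j>|
c = np.abs(A.conj().T @ B).max()
mu_bound = -2.0 * np.log2(c)   # equals 1 bit for these bases

# Check the inequality for a pure state |psi>.
psi = np.array([np.cos(0.3), np.sin(0.3)])
pA = np.abs(A.conj().T @ psi) ** 2
pB = np.abs(B.conj().T @ psi) ** 2
total = shannon(pA) + shannon(pB)
print(round(mu_bound, 3), round(total, 3))
```

The bounds derived in the paper replace `mu_bound` with larger quantities (from majorization or submatrix singular values) while the left-hand side is unchanged.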

Rudnicki, Łukasz; Puchała, Zbigniew; Życzkowski, Karol

2014-05-01

381

Predicting the performance of large-scale plants can be difficult due to model uncertainties and other factors, meaning that one can be almost certain that the prediction will diverge from the plant performance over time. In this paper, output multiplicative uncertainty models are used as dynamical models of the prediction error. These proposed dynamical uncertainty models result in an upper and lower

P. F. Odgaard; J. Stoustrup; B. Mataji

382

Microform calibration uncertainties of Rockwell diamond indenters

The Rockwell hardness test is a mechanical testing method for evaluating a property of metal products. National and international comparisons in Rockwell hardness tests show significant differences. Uncertainties in the geometry of the Rockwell diamond indenters are largely responsible for these differences. By using a stylus instrument, with a series of calibration and check standards, and calibration and uncertainty calculation

J. F. Song; F. F. Jr. Rudder; T. V. Vorburger; J. H. Smith

1995-01-01

383

Uncertainties in air quality model predictions

As a result of several air quality model evaluation exercises involving a large number of source scenarios and types of models, it is becoming clear that the magnitudes of the uncertainties in model predictions are similar from one application to another. When considering continuous point sources and receptors at distances of about 0.1 km to 1 km downwind, the uncertainties

S. R. Hanna

1993-01-01

384

Worry, Intolerance of Uncertainty, and Statistics Anxiety

ERIC Educational Resources Information Center

Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

Williams, Amanda S.

2013-01-01

385

Observations on the worst case uncertainty

NASA Astrophysics Data System (ADS)

The paper discusses the computation of the worst-case uncertainty (WCU) in common measurement problems. The usefulness of computing the WCU alongside the standard uncertainty is illustrated. A set of equations to compute the WCU in almost all practical situations is presented. The application of the equations to real-world cases is shown.
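A minimal sketch of the WCU alongside the GUM-style standard uncertainty, for a hypothetical power measurement P = V²/R; all values and error bounds are invented for illustration.

```python
import numpy as np

# Power dissipated in a resistor, P = V**2 / R, with worst-case
# error bounds on the inputs (illustrative numbers).
V, R = 10.0, 50.0
bV, bR = 0.1, 0.5   # worst-case bounds: ±0.1 V, ±0.5 ohm

# Sensitivity coefficients dP/dV and dP/dR.
cV = 2.0 * V / R      # 0.4 W/V
cR = -V**2 / R**2     # -0.04 W/ohm

# Worst-case uncertainty: absolute sensitivities times the bounds add.
wcu = abs(cV) * bV + abs(cR) * bR

# Standard (GUM) uncertainty: treat the bounds as half-widths of
# uniform distributions (u = b/sqrt(3)) and combine in quadrature.
uV, uR = bV / np.sqrt(3.0), bR / np.sqrt(3.0)
u_std = np.hypot(cV * uV, cR * uR)
print(wcu, round(u_std, 4))
```

The WCU is always at least as large as the standard uncertainty, since the linear sum cannot be smaller than the root-sum-of-squares of the scaled terms; the paper's point is that both numbers are useful.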

Fabbiano, Laura; Giaquinto, Nicola; Savino, Mario; Vacca, Gaetano

2013-09-01

386

Efficient uncertainty analyses using fast probability integration

Fast probability integration (FPI) algorithms are adapted, extended, and used to perform nuclear engineering uncertainty analyses. Methods are presented to improve the efficiency and precision of FPI for frequently encountered input distributions, to permit quick estimates of extreme model output quantiles, and to provide appropriate sensitivity and uncertainty importance measures. Advantages and disadvantages of FPI as a stand-alone method are

F. Eric Haskin; Bevan D. Staple; Chuanyi Ding

1996-01-01

387

Evacuation decision-making: process and uncertainty

The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical

D. Mileti; J. Sorensen; W. Bogard

1985-01-01

388

MODELING TECHNICAL TRADE BARRIERS UNDER UNCERTAINTY

As traditional forms of agricultural protection continue to decline, agricultural interests will likely seek alternative protection in the form of technical barriers. A flexible framework for theoretically and empirically analyzing technical barriers under various sources of uncertainty is derived. Attention is focused on uncertainty arising from the variation in the product attribute levels, a source not yet considered by the

Alan P. Ker

2000-01-01

389

Spiritual uncertainty: exemplars of 2 hospice patients.

Spirituality is important to persons approaching the end of life. The ambiguous nature of dying and spirituality creates many opportunities for uncertainty. This article presents 2 exemplars from hospice patients about the different ways that spiritual uncertainty affected their dying experience. PMID:24919092

Stephenson, Pamela Shockey

2014-01-01

390

Teaching Measurement and Uncertainty the GUM Way

NASA Astrophysics Data System (ADS)

This paper describes a course aimed at developing understanding of measurement and uncertainty in the introductory physics laboratory. The course materials, in the form of a student workbook, are based on the probabilistic framework for measurement as recommended by the International Organization for Standardization in their publication Guide to the Expression of Uncertainty in Measurement (GUM).
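A GUM-style Type A evaluation of the kind such a course covers can be sketched in a few lines; the readings below are fabricated illustrative data.

```python
import numpy as np

# GUM Type A evaluation: repeated readings of a length (mm).
readings = np.array([10.02, 10.05, 9.98, 10.01, 10.03, 9.99, 10.04, 10.00])

n = readings.size
best = readings.mean()            # best estimate of the measurand
s = readings.std(ddof=1)          # experimental standard deviation
u = s / np.sqrt(n)                # standard uncertainty of the mean
U = 2.0 * u                       # expanded uncertainty, coverage k = 2

print(f"{best:.3f} mm, u = {u:.4f} mm, U(k=2) = {U:.4f} mm")
```

The probabilistic framing — the result is a distribution of reasonably attributable values, summarized by a best estimate and a standard uncertainty — is the core idea the GUM-based workbook teaches.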

Buffler, Andy; Allie, Saalih; Lubben, Fred

2008-12-01

391

Identification with modeling uncertainty and reconfigurable control

The problem of obtaining reliable estimates of uncertainty in the parameters identified through a least-squares algorithm is discussed. Estimates based on a stochastic analysis, an analysis assuming bounded noise, and a sensitivity analysis are reviewed. The results are compared and illustrated using experimental data obtained on a DC motor. The need for methods of estimation of uncertainty is justified in
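The stochastic (white-noise) estimate of parameter uncertainty reviewed above can be sketched on a simulated first-order motor model; the model structure, noise level, and input signal are assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated first-order model: w[k+1] = a*w[k] + b*u[k] + noise.
a_true, b_true = 0.9, 0.5
N = 200
u = rng.standard_normal(N)
w = np.zeros(N + 1)
for k in range(N):
    w[k + 1] = a_true * w[k] + b_true * u[k] + 0.05 * rng.standard_normal()

# Least-squares identification: theta solves Phi @ theta ~= y.
Phi = np.column_stack([w[:-1], u])
y = w[1:]
theta = np.linalg.lstsq(Phi, y, rcond=None)[0]

# Stochastic uncertainty estimate: parameter covariance under the
# assumption of white measurement noise.
sigma2 = ((y - Phi @ theta) ** 2).sum() / (N - 2)
cov = sigma2 * np.linalg.inv(Phi.T @ Phi)
std = np.sqrt(np.diag(cov))
print(theta.round(3), std.round(4))
```

The bounded-noise and sensitivity analyses compared in the paper would start from the same `Phi`, `y` data but replace the covariance computation with set-membership or perturbation bounds.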

Marc Bodson

1993-01-01

392

Uncertainty analysis of statistical downscaling methods

Three downscaling models namely Statistical Down-Scaling Model (SDSM), Long Ashton Research Station Weather Generator (LARS-WG) model and Artificial Neural Network (ANN) model have been compared in terms various uncertainty assessments exhibited in their downscaled results of daily precipitation, daily maximum and minimum temperatures. In case of daily maximum and minimum temperature, uncertainty is assessed by comparing monthly mean and variance

Mohammad Sajjad Khan; Paulin Coulibaly; Yonas Dibike

2006-01-01

393

The Marketing Mix Decision Under Uncertainty

This paper develops a marketing mix model under uncertainty using the Capital Asset Pricing Model (CAPM) valuation framework. The model is general because it treats price, advertising and personal selling simultaneously, and allows for general patterns of uncertainty. Because the manager often lacks precise quantitative information about the sales response function, the analysis focuses on the qualitative properties of the

Harsharanjeet S. Jagpal; Ivan E. Brick

1982-01-01

394

Disaggregated total uncertainty measure for credal sets

We present a new approach to measure uncertainty/information applicable to theories based on convex sets of probability distributions, also called credal sets. A definition of a total disaggregated uncertainty measure on credal sets is proposed in this paper motivated by recent outcomes. This definition is based on the upper and lower values of Shannon's entropy for a credal set. We
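The upper and lower Shannon entropies underlying the proposed measure can be illustrated for the simplest credal set, a binary variable whose probability is known only to lie in an interval; the interval [0.2, 0.4] below is arbitrary.

```python
import numpy as np

def H(p):
    """Shannon entropy (bits) of a binary distribution (p, 1-p)."""
    q = np.array([p, 1.0 - p])
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

# Credal set: P(x=1) is known only to lie in [0.2, 0.4].
lo, hi = 0.2, 0.4
grid = np.linspace(lo, hi, 2001)
ent = np.array([H(p) for p in grid])
S_upper, S_lower = ent.max(), ent.min()  # upper/lower Shannon entropy
print(round(S_upper, 4), round(S_lower, 4))
```

In disaggregated readings of such measures, the lower entropy is associated with conflict and the gap between upper and lower entropy with non-specificity, which is the kind of decomposition the abstract proposes.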

J. Abellán; G. J. Klir; S. Moral

2006-01-01

395

Impact of uncertainty on modeling and testing

NASA Technical Reports Server (NTRS)

A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

Coleman, Hugh W.; Brown, Kendall K.

1995-01-01

396

Designing for Uncertainty: Three Approaches

Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described using data from recent surveys. Many of these data are

Scott Bennett

2007-01-01

397

Critical analysis of uncertainties during particle filtration

NASA Astrophysics Data System (ADS)

Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of the microspheres suspension have the greatest influence on the overall CSU in filter permeability, which agrees excellently with results obtained from Monte Carlo simulations. Two model parameters, "maximum critical retention concentration" and "minimum injection velocity", and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces the experimental "critical retention concentration vs velocity" data better than the second model, which contains the total electrostatic force, whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data.

Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

2012-09-01

398

Critical analysis of uncertainties during particle filtration.

Using the law of propagation of uncertainties we show how equipment- and measurement-related uncertainties contribute to the overall combined standard uncertainties (CSU) in filter permeability and in modelling the results for polystyrene latex microspheres filtration through a borosilicate glass filter at various injection velocities. Standard uncertainties in dynamic viscosity and volumetric flowrate of the microspheres suspension have the greatest influence on the overall CSU in filter permeability, which agrees excellently with results obtained from Monte Carlo simulations. Two model parameters, "maximum critical retention concentration" and "minimum injection velocity", and their uncertainties were calculated by fitting two quadratic mathematical models to the experimental data using a weighted least squares approximation. Uncertainty in the internal cake porosity has the highest impact on modelling uncertainties in critical retention concentration. The model with the internal cake porosity reproduces the experimental "critical retention concentration vs velocity" data better than the second model, which contains the total electrostatic force, whose value and uncertainty have not been reliably calculated due to the lack of experimental dielectric data. PMID:23020418
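The law of propagation of uncertainty applied to a Darcy-type permeability, k = QμL/(AΔP), can be sketched as follows; all values and standard uncertainties are illustrative, not the paper's data.

```python
import numpy as np

# Darcy permeability k = Q*mu*L / (A*dP), with illustrative values and
# standard uncertainties (SI units; not the paper's measurements).
vals = dict(Q=2.0e-8, mu=1.0e-3, L=0.05, A=1.0e-4, dP=2.0e4)
u    = dict(Q=4.0e-10, mu=2.0e-5, L=1.0e-4, A=5.0e-7, dP=1.0e2)

def k(Q, mu, L, A, dP):
    return Q * mu * L / (A * dP)

# For a pure product/quotient model the law of propagation of
# uncertainty reduces to adding relative variances:
# (u_k/k)^2 = sum_i (u_xi/xi)^2
rel_var = sum((u[name] / vals[name]) ** 2 for name in vals)
k0 = k(**vals)
u_k = k0 * np.sqrt(rel_var)

# Fractional contribution of each input to the combined variance;
# here flowrate Q and viscosity mu dominate, echoing the abstract.
contrib = {name: (u[name] / vals[name]) ** 2 / rel_var for name in vals}
print(f"k = {k0:.3e} +/- {u_k:.3e} m^2")
print(max(contrib, key=contrib.get))
```

Ranking the `contrib` terms is how one identifies which input uncertainties (here viscosity and flowrate) drive the combined standard uncertainty.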

Badalyan, Alexander; Carageorgos, Themis; Bedrikovetsky, Pavel; You, Zhenjiang; Zeinijahromi, Abbas; Aji, Keyiseer

2012-09-01

399

Identification of Uncertainties Caused by Flow Qualities

It is necessary to assess flow qualities more precisely in order to estimate the uncertainties of wind tunnel test data at higher angles of attack and lower Mach numbers. The uncertainties caused by the flow qualities were then identified by repeat tests at different positions in a supersonic wind tunnel. First, the applied conditions were described through the discussions of

Shinji Nagai; Junichi Akatsuka; Hidetoshi Iijima; Hiroshi Kanda; Mitsunori Watanabe; Mamoru Sato

2007-01-01

400

Uncertainty Propagation in an Ecosystem Nutrient Budget.

New aspects and advancements in classical uncertainty propagation methods were used to develop a nutrient budget with associated error for a northern Gulf of Mexico coastal embayment. Uncertainty was calculated for budget terms by propagating the standard error and degrees of fr...

401

DO MODEL UNCERTAINTY WITH CORRELATED INPUTS

The effect of correlation among the input parameters and variables on the output uncertainty of the Streeter-Phelps water quality model is examined. Three uncertainty analysis techniques are used: sensitivity analysis, first-order error analysis, and Monte Carlo simulation. Modifie...
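The Monte Carlo treatment of correlated inputs can be sketched for the Streeter-Phelps dissolved-oxygen deficit equation; the rate statistics, correlation, and loading values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Streeter-Phelps DO deficit (mg/L) at time t (days); L0, D0 and the
# rate statistics below are illustrative values.
def deficit(kd, ka, t=2.0, L0=10.0, D0=1.0):
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
           + D0 * np.exp(-ka * t)

mean = [0.3, 0.7]              # deoxygenation kd, reaeration ka (1/day)
sd = np.array([0.03, 0.06])

def mc_std(rho, n=50000):
    # Monte Carlo with a specified input correlation between kd and ka.
    cov = np.array([[sd[0] ** 2, rho * sd[0] * sd[1]],
                    [rho * sd[0] * sd[1], sd[1] ** 2]])
    kd, ka = rng.multivariate_normal(mean, cov, n).T
    return deficit(kd, ka).std()

s_indep, s_corr = mc_std(0.0), mc_std(0.8)
# kd and ka push the deficit in opposite directions, so positive input
# correlation cancels part of the output variance.
print(round(s_indep, 4), round(s_corr, 4))
```

Comparing the two standard deviations shows why ignoring input correlation can substantially overstate (or understate) output uncertainty, which is the effect the abstract examines.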

402

The Economic Implications of Carbon Cycle Uncertainty

This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon-dioxide concentrations. We find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, equivalent to a change in concentration target of up to 100 ppmv.

Smith, Steven J.; Edmonds, James A.

2006-10-17

403

Uncertainties in derived temperature-height profiles

NASA Technical Reports Server (NTRS)

Nomographs were developed for relating uncertainty in temperature T to uncertainty in the observed height profiles of both pressure p and density ρ. The relative uncertainty δT/T is seen to depend not only upon the relative uncertainties δp/p or δρ/ρ, and to a small extent upon the value of T or H, but primarily upon the sampling-height increment Δh, the height increment between successive observations of p or ρ. For a fixed value of δp/p, the value of δT/T varies inversely with Δh. No limit exists in the fineness of usable height resolution of T which may be derived from densities, while a fine height resolution in pressure-height data leads to temperatures with unacceptably large uncertainties.
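The inverse dependence of the temperature uncertainty on the sampling-height increment can be reproduced from the hypsometric relation for a pressure-derived layer temperature; the 8 km scale height and 0.1% pressure uncertainty below are illustrative assumptions.

```python
import numpy as np

# Layer-mean temperature from a pressure-height profile via the
# hypsometric relation: T = (g/R) * dh / ln(p1/p2).
# With independent pressure errors dp/p at both levels,
#   dT/T = sqrt(2) * (dp/p) / ln(p1/p2),
# and since ln(p1/p2) ~ dh/H, dT/T varies inversely with dh.
g, R = 9.80665, 287.05  # m/s^2, J/(kg K)

def layer_T(p1, p2, dh):
    return (g / R) * dh / np.log(p1 / p2)

def rel_T_uncert(p1, p2, rel_dp):
    return np.sqrt(2.0) * rel_dp / np.log(p1 / p2)

p1 = 100000.0
for dh in (500.0, 1000.0, 2000.0):       # sampling-height increment (m)
    p2 = p1 * np.exp(-dh / 8000.0)       # assumed 8 km scale height
    print(dh, round(layer_T(p1, p2, dh), 1),
          round(100 * rel_T_uncert(p1, p2, 1e-3), 2))  # dT/T in %
```

Doubling Δh halves ln(p1/p2)'s denominator effect and thus halves δT/T, which is exactly the behaviour the nomographs encode.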

Minzner, R. A.

1974-01-01

404

Dealing with uncertainties - communication between disciplines

NASA Astrophysics Data System (ADS)

Climate adaptation research inevitably involves uncertainty issues - whether people are building a model, using climate scenarios, or evaluating policy processes. However, do they know which uncertainties are relevant in their field of work? And which uncertainties exist in the data from other disciplines that they use (e.g. climate data, land use, hydrological data), and how do they propagate? From experiences in Dutch research programmes on climate change we know that disciplines often deal differently with uncertainties. This complicates communication between disciplines and also with the various users of data and information on climate change and its impacts. In October 2012 an autumn school was organized within the Knowledge for Climate Research Programme in the Netherlands, with dealing with and communicating about uncertainties as its central theme - in climate and socio-economic scenarios, in impact models, and in the decision-making process. The lectures and discussions contributed to the development of a common frame of reference (CFR) for dealing with uncertainties. The common frame contains the following: 1. Common definitions (typology of uncertainties, robustness); 2. Common understanding (why do we consider it important to take uncertainties into account) and aspects on which we disagree (how far should scientists go in communication?); 3. Documents that are considered important by all participants; 4. Do's and don'ts in dealing with uncertainties and communicating about uncertainties (e.g. know your audience, check how your figures are interpreted); 5. Recommendations for further actions (e.g. need for a platform to exchange experiences). The CFR is meant to help researchers in climate adaptation to work together and communicate on climate change (better interaction between disciplines). It is also meant to help researchers to explain to others (e.g. 
decision makers) why and when researchers agree and when and why they disagree, and on what exactly. During the presentation some results of this autumn school will be presented.

Overbeek, Bernadet; Bessembinder, Janette

2013-04-01

405

Treatment of Uncertainties in Probabilistic Tsunami Hazard

NASA Astrophysics Data System (ADS)

Over the last few years, we have developed a framework for developing probabilistic tsunami inundation maps, which includes comprehensive quantification of earthquake recurrence as well as uncertainties, and applied it to the development of a tsunami hazard map of California. The various uncertainties in tsunami source and propagation models are an integral part of a comprehensive probabilistic tsunami hazard analysis (PTHA), and often drive the hazard at low probability levels (i.e. long return periods). There is no unique manner in which uncertainties are included in the analysis, although in general we distinguish between "natural" or aleatory variability, such as slip distribution and event magnitude, and uncertainties due to an incomplete understanding of the behavior of the earth, called epistemic uncertainties, such as scaling relations and rupture segmentation. Aleatory uncertainties are typically included through integration over distribution functions based on regression analyses, whereas epistemic uncertainties are included using logic trees. We will discuss how the different uncertainties were included in our recent probabilistic tsunami inundation maps for California, and their relative importance on the final results. Including these uncertainties in offshore exceedance waveheights is straightforward, but the problem becomes more complicated once the non-linearity of near-shore propagation and inundation is encountered. By using the probabilistic off-shore waveheights as input level for the inundation models, the uncertainties up to that point can be included in the final maps. PTHA provides a consistent analysis of tsunami hazard and will become an important tool in diverse areas such as coastal engineering and land use planning. The inclusive nature of the analysis, where few assumptions are made a priori as to which sources are significant, means that a single analysis can provide a comprehensive view of the hazard and its dominant sources. 
Furthermore, by considering fractile results in addition to average and/or median hazard, we can extend its usefulness to areas where worst-case scenarios are usually preferred, such as evacuation planning.

Thio, H. K.

2012-12-01

406

The 4th Thermodynamic Principle?

It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of the Matter makes it possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

Montero Garcia, Jose de la Luz [Institute for Scientific and Technological Information (IDICT), National Capitol, Havana (Cuba); Novoa Blanco, Jesus Francisco

2007-04-28

407

Variational Principles of Extensible Elastica

NASA Astrophysics Data System (ADS)

An extensible elastica is a rigorous mathematical model of the Bernoulli-Euler beam whose cross-sections remain plane and normal to the axis after deformations. The principle of virtual work for the extensible elastica expressed in terms of the normal strain and rotation of the axis is derived from the principle of virtual work in the three-dimensional elasticity. And it is shown that the derived principle yields the exact equilibrium equations for a beam in the large deformations and rotations. Utilizing linear constitutive equations, we get the theorem of stationary potential energy expressed also in terms of the axial strain and rotation. And, from the Trefftz criterion on the second variation of the potential energy, we get the buckling equations for the extensible elastica, which give the buckling load higher than the Euler load for a cantilever elastica subjected to compressive end load.

Kondo, Kyohei

408

Using Interpolation to Estimate System Uncertainty in Gene Expression Experiments

The widespread use of high-throughput experimental assays designed to measure the entire complement of a cell's genes or gene products has led to vast stores of data that are extremely plentiful in terms of the number of items they can measure in a single sample, yet often sparse in the number of samples per experiment due to their high cost. This often leads to datasets where the number of treatment levels or time points sampled is limited, or where there are very small numbers of technical and/or biological replicates. Here we introduce a novel algorithm to quantify the uncertainty in the unmeasured intervals between biological measurements taken across a set of quantitative treatments. The algorithm provides a probabilistic distribution of possible gene expression values within unmeasured intervals, based on a plausible biological constraint. We show how quantification of this uncertainty can be used to guide researchers in further data collection by identifying which samples would likely add the most information to the system under study. Although the context for developing the algorithm was gene expression measurements taken over a time series, the approach can be readily applied to any set of quantitative systems biology measurements taken following quantitative (i.e. non-categorical) treatments. In principle, the method could also be applied to combinations of treatments, in which case it could greatly simplify the task of exploring the large combinatorial space of future possible measurements.

Falin, Lee J.; Tyler, Brett M.

2011-01-01

409

Investment, regulation, and uncertainty: Managing new plant breeding techniques.

As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

2014-01-01

410

The legal status of Uncertainty

NASA Astrophysics Data System (ADS)

An exponential improvement of numerical weather prediction (NWP) models has been observed over the last decade (Lynch, 2008). Civil Protection (CP) systems have exploited meteorological services in order to redeploy their actions towards the prediction and prevention of events rather than towards an exclusively response-oriented mechanism1. Nevertheless, experience tells us that NWP models, even when assisted by real-time observations, are far from deterministic. Complications frequently emerge in medium- to long-range forecasts, which are subject to sudden modification. Short-term forecasts, seen through the lens of criminal trials2, are to the same extent scarcely reliable (Molini et al., 2009). One episode of wrong forecasts in the Italian panorama has deeply frightened CP operators: the operational NWP model missed a meteorological adversity that caused deaths and severe damage in the province of Vibo Valentia (2006). This event turned into a widely discussed trial, lasting over three years, brought against those who held positions of legal guardianship within the CP. A first set of data is now available showing that, in concomitance with the Vibo Valentia trial, the number of alerts issued rose almost threefold. We sustain the hypothesis that the incipient overcriminalization (Husak, 2008) of CPs is increasing the number of false alerts, with the consequent effect of weakening alert perception and response among the citizenry (Breznitz, 1984). The widespread misunderstanding of this issue, i.e. the inherent uncertainty in weather predictions, by prosecutors and judges, and more generally by those who deal with law and justice, is creating the basis for defensive behaviour3 within CPs. This paper therefore analyses the social and legal relevance of uncertainty in the process of issuing meteo-hydrological alerts by CPs.
Footnotes: 1 The Italian Civil Protection is working in this direction since 1992 (L. 225/92). An example of this effort is clearly given by the Prime Minister Decree (DPCM 20/12/2001 "Linee guida relative ai piani regionali per la programmazione delle attivita' di previsione, prevenzione e lotta attiva contro gli incendi boschivi - Guidelines for regional plans for the planning of prediction, prevention and forest fires fighting activities") that, already in 2001, emphasized "the most appropriate approach to pursue the preservation of forests is to promote and encourage prediction and prevention activities rather than giving priority to the emergency-phase focused on fire-fighting". 2 Supreme Court of the United States, In re Winship (No. 778), No. 778 argued: 20 January 1970, decided: 31 March 1970: Proof beyond a reasonable doubt, which is required by the Due Process Clause in criminal trials, is among the "essentials of due process and fair treatment" 3 In Kessler and McClellan (1996): "Defensive medicine is a potentially serious social problem: if fear of liability drives health care providers to administer treatments that do not have worthwhile medical benefits, then the current liability system may generate inefficiencies much larger than the costs of compensating malpractice claimants".

Altamura, M.; Ferraris, L.; Miozzo, D.; Musso, L.; Siccardi, F.

2011-03-01

411

Beyond Bellman's principle of optimality; the principle of \\

Bellman's principle of optimality and his dynamic programming technique for computing optimal sequential-decisions may not apply to problems involving uncertain, non-noisy exogenous-variables. In this paper, we show that if the uncertain behavior of non-noisy exogenous-variables can be modeled by a class of spline-expressions, with known basis-functions and unknown, \\

C. D. Johnson

2005-01-01

412

Green chemistry: principles and practice.

Green Chemistry is a relatively new emerging field that strives to work at the molecular level to achieve sustainability. The field has received widespread interest in the past decade due to its ability to harness chemical innovation to meet environmental and economic goals simultaneously. Green Chemistry has a framework of a cohesive set of Twelve Principles, which have been systematically surveyed in this critical review. This article covers the concepts of design and the scientific philosophy of Green Chemistry with a set of illustrative examples. Future trends in Green Chemistry are discussed with the challenge of using the Principles as a cohesive design system (93 references). PMID:20023854

Anastas, Paul; Eghbali, Nicolas

2010-01-01

413

Ilizarov principles of deformity correction

Ilizarov frames provide a versatile fixation system for the management of bony deformities, fractures and their complications. The frames give stability, soft tissue preservation, adjustability and functionality, allowing bone to realise its full osteogenic potential. It is important that we have a clear and concise understanding of the Ilizarov principles of deformity correction to make the best use of this fixation system. In this review article, the history of the Ilizarov frame, the basic science behind it, the mechanical principles governing its use and the clinical use of the fixation system are discussed.

Spiegelberg, B; Parratt, T; Dheerendra, SK; Khan, WS; Jennings, R; Marsh, DR

2010-01-01

414

Grasping Objects with Environmentally Induced Position Uncertainty

Due to noisy motor commands and imprecise and ambiguous sensory information, there is often substantial uncertainty about the relative location between our body and objects in the environment. Little is known about how well people manage and compensate for this uncertainty in purposive movement tasks like grasping. Grasping objects requires reach trajectories to generate object-fingers contacts that permit stable lifting. For objects with position uncertainty, some trajectories are more efficient than others in terms of the probability of producing stable grasps. We hypothesize that people attempt to generate efficient grasp trajectories that produce stable grasps at first contact without requiring post-contact adjustments. We tested this hypothesis by comparing human uncertainty compensation in grasping objects against optimal predictions. Participants grasped and lifted a cylindrical object with position uncertainty, introduced by moving the cylinder with a robotic arm over a sequence of 5 positions sampled from a strongly oriented 2D Gaussian distribution. Preceding each reach, vision of the object was removed for the remainder of the trial and the cylinder was moved one additional time. In accord with optimal predictions, we found that people compensate by aligning the approach direction with the covariance angle to maintain grasp efficiency. This compensation results in a higher probability of achieving stable grasps at first contact than non-compensation strategies when grasping objects with directional position uncertainty, and the results provide the first demonstration that humans compensate for uncertainty in a complex purposive task.

Christopoulos, Vassilios N.; Schrater, Paul R.

2009-01-01

415

Uncertainty quantification approaches for advanced reactor analyses.

The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases that statistically account for uncertainties of all types is generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
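The 95%/95% criterion described above is commonly met via Wilks' nonparametric tolerance-limit formula, which fixes the number of best-estimate code runs needed. A minimal sketch (this is the standard first-order formula, offered for illustration, not necessarily the procedure the report recommends):

```python
def wilks_first_order(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the largest observed result
    bounds the `coverage` quantile with probability `confidence`
    (first-order Wilks criterion: 1 - coverage**n >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

runs_95_95 = wilks_first_order()  # 59 runs for the classic 95%/95% criterion
```

This is why "59 runs" appears so often in best-estimate-plus-uncertainty LOCA analyses: with 59 random samples, the sample maximum exceeds the 95th percentile of the output with at least 95% confidence.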

Briggs, L. L.; Nuclear Engineering Division

2009-03-24

416

Uncertainty and global climate change research

The Workshop on Uncertainty and Global Climate Change Research was held March 22--23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics, and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessments using decision-analytic techniques as a foundation are key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision-analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

1994-06-01

417

Propagated Uncertainty in Scattering in Humidified Nephelometers

NASA Astrophysics Data System (ADS)

Atmospheric aerosols exert a cooling effect at the surface, directly by scattering and absorbing incident sunlight and indirectly by serving as seeds for cloud droplets. They are highly variable liquid or solid particles suspended in the gas phase whose climate impact is associated with their chemical composition and microphysical properties. One such aerosol property is hygroscopic growth: the increase in aerosol size and scattering with the uptake of water as relative humidity (RH) increases. Particle size is strongly linked to the wavelength of light scattered and absorbed. Uncertainty, defined as the parameter which characterizes the dispersion of the values about the measured quantity1, can effectively place a measured value into perspective. Small uncertainties in instrument sensors can propagate into large errors in the measured hygroscopic growth of aerosols. The uncertainties in the aerosol scattering coefficients and in the hygroscopic growth fit parameter were calculated. A considerable contribution to the propagated uncertainties stems from imprecise RH sensors. The RH-dependent uncertainty of aerosol hygroscopic growth has never been reported in the literature; however, an increased uncertainty was calculated for aerosols with lower hygroscopic growth, particularly those in clean and wet conditions. 1. Cook, R. R., Assessment of Uncertainties of Measurement for Calibration & Testing Laboratories. National Association of Testing Authorities, Australia, 2002.
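As a sketch of how an RH-sensor error propagates into scattering enhancement, one can use the common gamma parameterization of hygroscopic growth together with first-order error propagation (the functional form and the reference humidity are assumptions for illustration; the abstract does not name its fit form):

```python
def growth_factor(rh, gamma, rh_ref=0.40):
    """Gamma parameterization of the scattering enhancement factor
    (assumed form): f(RH) = ((1 - RH_ref) / (1 - RH))**gamma."""
    return ((1.0 - rh_ref) / (1.0 - rh)) ** gamma

def sigma_f_from_rh(rh, gamma, sigma_rh, rh_ref=0.40):
    """First-order propagation of an RH-sensor uncertainty into f(RH).
    Since d(ln f)/dRH = gamma / (1 - RH), we get
    sigma_f = f * gamma / (1 - RH) * sigma_RH."""
    f = growth_factor(rh, gamma, rh_ref)
    return f * gamma / (1.0 - rh) * sigma_rh

# A 2% RH error matters far more near saturation than at moderate humidity.
s_wet = sigma_f_from_rh(rh=0.85, gamma=0.5, sigma_rh=0.02)
s_dry = sigma_f_from_rh(rh=0.60, gamma=0.5, sigma_rh=0.02)
```

The 1/(1 - RH) factor shows why the same sensor imprecision produces a much larger uncertainty in wet conditions, consistent with the abstract's finding.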

Morrow, H. A.; Jefferson, A.; Sherman, J. P.; Andrews, E.; Sheridan, P. J.; Hageman, D.; Ogren, J. A.

2013-12-01

418

Constructing the uncertainty of due dates.

By its nature, the date that a baby is predicted to be born, or the due date, is uncertain. How women construct the uncertainty of their due dates may have implications for when and how women give birth. In the United States as many as 15% of births occur before 39 weeks because of elective inductions or cesarean sections, putting these babies at risk for increased medical problems after birth and later in life. This qualitative study employs a grounded theory approach to understand the decisions women make on how and when to give birth. Thirty-three women who were pregnant or had given birth within the past 2 years participated in key informant or small-group interviews. The results suggest that women interpret the uncertainty of their due dates as a reason to wait for birth and as a reason to start the process early; however, information about a baby's brain development in the final weeks of pregnancy may persuade women to remain pregnant longer. The uncertainties of due dates are analyzed using Babrow's problematic integration, which distinguishes between epistemological and ontological uncertainty. The results point to a third type of uncertainty, axiological uncertainty. Axiological uncertainty is rooted in the values and ethics of outcomes. PMID:24266788

Vos, Sarah C; Anthony, Kathryn E; O'Hair, H Dan

2014-10-01

419

Quantifying uncertainty in global aerosol and forcing

NASA Astrophysics Data System (ADS)

Aerosol-cloud-climate effects are a major source of uncertainty in climate models so it is important to identify and quantify the sources of the uncertainty and thereby direct research efforts. Here we perform a variance-based sensitivity analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of uncertainty in model-estimated present-day CCN concentrations, cloud drop number concentrations and indirect forcing. New emulator techniques enable an unprecedented amount of statistical information to be extracted from a global aerosol model. Twenty-eight model parameters covering essentially all important aerosol processes and emissions were defined based on expert elicitation. A sensitivity analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each monthly model grid cell from an ensemble of 168 one-year model simulations covering the uncertainty space of the 28 parameters. Variance decomposition enables the importance of the parameters for CCN uncertainty to be ranked from local to global scales. Among the most important parameters are the sizes of primary particles and the cloud-processing of aerosol, but most of the parameters are important for CCN uncertainty somewhere on the globe. We also show that uncertainties in forcing over the industrial period are sensitive to a different set of parameters than those that are important for present-day CCN.
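Variance-based (Sobol) first-order sensitivity indices of the kind ranked in this study can be illustrated with a brute-force double-loop Monte Carlo on a toy model (a sketch only; the study itself builds emulators precisely because this brute-force approach is unaffordable for a 28-parameter global aerosol model):

```python
import numpy as np

def first_order_sobol(model, dim=3, n_outer=1000, n_inner=1000, seed=1):
    """Double-loop Monte Carlo estimate of first-order Sobol indices
    S_i = Var(E[Y | X_i]) / Var(Y) for independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    var_y = model(rng.uniform(size=(200000, dim))).var()  # total variance
    indices = []
    for i in range(dim):
        cond_means = []
        for _ in range(n_outer):
            x = rng.uniform(size=(n_inner, dim))
            x[:, i] = rng.uniform()          # freeze X_i at a single value
            cond_means.append(model(x).mean())
        indices.append(np.var(cond_means) / var_y)
    return np.array(indices)

# Toy model: Y = X0 + 0.1*X1 (X2 inert), so S0 should dominate.
S = first_order_sobol(lambda x: x[:, 0] + 0.1 * x[:, 1])
```

Ranking the resulting indices per output (here, per grid cell and month in the study) is what identifies which of the 28 parameters dominate CCN uncertainty where.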

Carslaw, Ken; Lee, Lindsay; Reddington, Carly; Mann, Graham; Spracklen, Dominick; Stier, Philip; Pierce, Jeffrey

2013-04-01

420

Precaution, law and principles of good administration.

The precautionary principle is a legal principle concerned with the process of how decisions are made. In implementing and interpreting it regard must be had to the surrounding legal culture and in particular the principles of good administration in operation. Highlighting those principles emphasises that within a particular jurisdiction there is often very little agreement over their nature. Within the European Union contradictory principles are the product of: assumptions about risk problem-solving, the ambiguous nature of European administration, a concern with accountability in the face of recent food controversies, and the impact of international trade rules. These contradictory principles present a number of challenges for implementing the precautionary principle. PMID:16304930

Fisher, E

2005-01-01

421

PREDICTING HYDROLOGICAL MODELS UNCERTAINTY: USE OF MACHINE LEARNING

This paper presents a methodology for assessing total model uncertainty using machine learning techniques. Historical model errors are assumed to be an indicator of total model uncertainty. The model uncertainty is measured in the form of quantiles of the model errors, or prediction intervals (PIs), and such an expression of uncertainty comprises all sources of uncertainty (e.g. model structure, model parameters, input data).
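The core idea, wrapping new forecasts in empirical quantiles of historical model errors, can be sketched as follows (unconditional quantiles for illustration; the machine learning step in the paper makes the quantiles depend on input conditions):

```python
import numpy as np

def error_quantile_pi(residuals, forecasts, alpha=0.10):
    """Prediction interval from empirical quantiles of historical model
    errors (observed minus modeled): a (1 - alpha) interval per forecast."""
    lo = np.quantile(residuals, alpha / 2.0)
    hi = np.quantile(residuals, 1.0 - alpha / 2.0)
    forecasts = np.asarray(forecasts, dtype=float)
    return forecasts + lo, forecasts + hi

rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 1.0, 5000)   # stand-in for historical model errors
low, high = error_quantile_pi(residuals, [10.0, 12.0])
```

Because the interval is built from observed errors rather than from any one error source, it bundles structural, parametric and input-data uncertainty into a single total-uncertainty band.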

Durga Lal Shrestha; Dimitri Solomatine

2006-01-01

422

Uncertainty quantification using interval modeling with performance sensitivity

In this paper an interval modeling approach for uncertainty quantification of a structure with significant parameter variation is presented. Model uncertainty can be categorized as dominant uncertainty due to structural variation, such as joint uncertainty and temperature change, and minor uncertainty associated with other factors. In this paper, a singular value decomposition (SVD) technique is used to decompose parameter variations

Jiann-Shiun Lew; Lucas G. Horta

2007-01-01

423

Uncertainty Considerations for Ballistic Limit Equations

NASA Technical Reports Server (NTRS)

The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA attempts to determine the overall risk associated with a particular mission by factoring in all known risks (and their corresponding uncertainties, if known) to the spacecraft during its mission. The threat to mission and human life posed by the micro-meteoroid & orbital debris (MMOD) environment is one of these risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the International Space Station. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. With so many uncertainties believed to be present in the models used within BUMPER II, providing uncertainty bounds with BUMPER II results would appear to be essential to properly evaluating its predictions of MMOD risk. The uncertainties in BUMPER II come primarily from three areas: damage prediction/ballistic limit equations, environment models, and failure criteria definitions. In order to quantify the overall uncertainty bounds on MMOD risk predictions, the uncertainties in these three areas must be identified. In this paper, possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the shuttle and station versions of BUMPER II are presented and discussed. We begin the paper with a review of the current approaches used by NASA to perform a PRA for the Space Shuttle and the International Space Station, followed by a review of the results of a recent sensitivity analysis performed by NASA using the shuttle version of the BUMPER II code. Following a discussion of the various equations that are encoded in BUMPER II, we propose several possible approaches for establishing uncertainty bounds for the equations within BUMPER II. 
We conclude with an evaluation of these approaches and present a recommendation regarding which of them would be the most appropriate to follow.

Schonberg, W. P.; Evans, H. J.; Williamsen, J. E.; Boyer, R. L.; Nakayama, G. S.

2005-01-01

424

Principles of Adult Learning Scale.

ERIC Educational Resources Information Center

The Principles of Adult Learning Scale (PALS) was developed and validated for measuring congruency between adult education practitioners' actual observable classroom behavior and their expressed belief in the collaborative teaching-learning mode. This model is a learner-centered instruction method in which learner and practitioner share authority…

Conti, Gary J.

425

Principles underlying the design of \\

BACKGROUND: Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a \\

Anna J Wilson; Stanislas Dehaene; Philippe Pinel; Susannah K Revkin; Laurent Cohen; David Cohen

2006-01-01

426

Management Principles for Nonproliferation Organizations

This paper identifies business models and six management principles that can be applied by a nonproliferation organization to maximize the value and effectiveness of its products. The organizations responsible for reducing the nuclear proliferation threat have experienced a substantial growth in responsibility and visibility since the September 11 attacks. Since then, the international community has witnessed revelations of clandestine nuclear

Sarah L. Frazar; Gretchen Hund

2012-01-01

427

Mapping Principles for Conceptual Metaphors


Kathleen Ahrens

428

Physical principles of heat pipes

Heat pipes are used whenever high rates of heat transfer or the control or conversion of heat flows are required. This book covers the physical principles of operation of heat pipes and choice of working fluid related to temperature range. The authors demonstrate how performance is limited by capillary pumping action in the wick together with impedance to liquid and

M. N. Ivanovskii; V. P. Sorokin; I. V. Yagodkin

1982-01-01

429

Nursing Principles & Skills. Teacher Edition.

ERIC Educational Resources Information Center

This curriculum guide contains 14 units for a course on nursing principles and skills needed by practical nurses. The 14 units of instruction cover the following: (1) using medical terminology; (2) practicing safety procedures; (3) using the nursing process for care planning; (4) using infection control techniques; (5) preparing a patient…

Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

430

NSDL National Science Digital Library

At this site, the author uses the concept of center of gravity and Newton's third law of motion to explain the principle of rocket flight. The explanation is qualitative for the most part and does not require any higher math for understanding.

Stern, David

431

Invariance principles and plasma confinement

If the equations that describe the observed anomalous transport in magnetic confinement systems are invariant under a scale transformation then any confinement time calculated from them must exhibit the same invariance, no matter how intractable the calculation. This principle places constraints on the form of the confinement time scaling which are characteristic of the plasma model represented by the equations.

J. W. Connor

1988-01-01

432

On the competitive exclusion principle

The Competitive Exclusion Principle, formulated by V. Volterra (Memorie del R. Comitato Talassografico Italiano, 131, 1–142, 1927) for a number of species competing for a common ecological niche, is extended to a number of species competing for many ecological niches.

Aldo Rescigno; Irvin W. Richardson

1965-01-01

433

Classroom Demonstrations of Polymer Principles.

ERIC Educational Resources Information Center

Describes several techniques to help students visualize the principles of polymer chemistry. Outlines demonstrations using simple materials (such as pop-it beads, thread, and wire) to illustrate the size of macromolecules, the composition of copolymers, and the concept of molecular mass. (TW)

Rodriguez, F.; And Others

1987-01-01

434

Aesthetic Principles for Instructional Design

ERIC Educational Resources Information Center

This article offers principles that contribute to developing the aesthetics of instructional design. Rather than describing merely the surface qualities of things and events, the concept of aesthetics as applied here pertains to heightened, integral experience. Aesthetic experiences are those that are immersive, infused with meaning, and felt as…

Parrish, Patrick E.

2009-01-01

435

Demonstrating Fermat's Principle in Optics

ERIC Educational Resources Information Center

We demonstrate Fermat's principle in optics by a simple experiment using reflection from an arbitrarily shaped one-dimensional reflector. We investigated a range of possible light paths from a lamp to a fixed slit by reflection in a curved reflector and showed by direct measurement that the paths along which light is concentrated have either…

Paleiov, Orr; Pupko, Ofir; Lipson, S. G.

2011-01-01

436

Principles for Teaching Problem Solving

NSDL National Science Digital Library

This 14-page monograph addresses the need to teach problem solving and other higher order thinking skills. After summarizing research and positions of various organizations, it defines several models and describes cognitive and attitudinal components of problem solving and the types of knowledge that are required. The authors provide a list of principles for teaching problem solving and include a list of references.

Kirkley, Rob F.

2003-01-01

437

Electronic Structure Principles and Aromaticity

ERIC Educational Resources Information Center

The relationship between aromaticity and stability in molecules on the basis of quantities such as hardness and electrophilicity is explored. The findings reveal that aromatic molecules are less energetic, harder, less polarizable, and less electrophilic as compared to antiaromatic molecules, as expected from the electronic structure principles.

Chattaraj, P. K.; Sarkar, U.; Roy, D. R.

2007-01-01

438

Understanding Bernoulli's principle through simulations

Computer simulations are used to develop a deeper understanding of Bernoulli's principle. Hard disks undergoing elastic collisions are injected into a Venturi nozzle and the pressure in the narrow throat of the nozzle is compared to the pressure in the wider section of the pipe. This model system is an ideal student project because the theory and programming are straightforward,

Brian E. Faulkner; F. Marty Ytreberg

2011-01-01

439

Uncertainty analysis for Ulysses safety evaluation report

NASA Technical Reports Server (NTRS)

As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator, the Interagency Nuclear Safety Review Panel (INSRP) performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

Frank, Michael V.

1991-01-01

440

Uncertainty: the Curate's egg in financial economics.

Economic theories of uncertainty are unpopular with financial experts. As sociologists, we rightly refuse predictions, but the uncertainties of money are constantly sifted and turned into semi-denial by a financial economics set on somehow beating the future. Picking out 'bits' of the future as 'risk' and 'parts' as 'information' is attractive but socially dangerous, I argue, because money's promises are always uncertain. New studies of uncertainty are reversing sociology's neglect of the unavoidable inability to know the forces that will shape the financial future. PMID:24712756

Pixley, Jocelyn

2014-06-01

441

Uncertainty in weather and climate prediction

Following Lorenz's seminal work on chaos theory in the 1960s, probabilistic approaches to prediction have come to dominate the science of weather and climate forecasting. This paper gives a perspective on Lorenz's work and how it has influenced the ways in which we seek to represent uncertainty in forecasts on all lead times from hours to decades. It looks at how model uncertainty has been represented in probabilistic prediction systems and considers the challenges posed by a changing climate. Finally, the paper considers how the uncertainty in projections of climate change can be addressed to deliver more reliable and confident assessments that support decision-making on adaptation and mitigation.

Slingo, Julia; Palmer, Tim

2011-01-01

442

Propagating boundary uncertainties using polynomial expansions

NASA Astrophysics Data System (ADS)

The method of polynomial chaos expansions is illustrated by showing how uncertainties in boundary conditions specifying the flow from the Caribbean Sea into the Gulf of Mexico manifest as uncertainties in a model's simulation of the Gulf's surface elevation field. The method, which has been used for a variety of engineering applications, is explained within an oceanographic context and its advantages and disadvantages are discussed. The method's utility requires that the spatially and temporally varying uncertainties of the inflow be characterized by a small number of independent random variables, which here correspond to amplitudes of spatiotemporal modes inferred from an available boundary climatology.
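A one-dimensional polynomial chaos expansion with a Gaussian random variable can be sketched as follows (the toy model and truncation order are assumptions for illustration; the study applies the method with inflow-mode amplitudes as the random variables):

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(model, order, n_quad=40):
    """Project model(X), X ~ N(0,1), onto probabilists' Hermite polynomials:
    c_k = E[model(X) He_k(X)] / k!   (since E[He_k(X)^2] = k!),
    with the expectation computed by Gauss-Hermite quadrature."""
    nodes, weights = hermegauss(n_quad)
    weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0,1) density
    y = model(nodes)
    coeffs = []
    for k in range(order + 1):
        basis = np.zeros(k + 1)
        basis[k] = 1.0
        he_k = hermeval(nodes, basis)         # He_k evaluated at the nodes
        coeffs.append(np.sum(weights * y * he_k) / factorial(k))
    return np.array(coeffs)

# Toy "model": y = exp(x). Output statistics follow from the coefficients alone.
c = pce_coefficients(np.exp, order=8)
mean = c[0]                                              # analytically sqrt(e)
var = sum(c[k] ** 2 * factorial(k) for k in range(1, 9))  # analytically e(e - 1)
```

The attraction for uncertainty propagation is visible here: once the coefficients are known, the output mean and variance (and full distribution, by sampling the expansion) come essentially for free, without rerunning the model.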

Thacker, W. C.; Srinivasan, A.; Iskandarani, M.; Knio, O. M.; Hénaff, M. Le

443

Uncertainty Quantification on Prompt Fission Neutrons Spectra

Uncertainties in the evaluated prompt fission neutron spectra present in ENDF/B-VII.0 are assessed in the framework of the Los Alamos model. The methodology used to quantify the uncertainties on an evaluated spectrum is introduced. We also briefly review the Los Alamos model and single out the parameters that have the largest influence on the calculated results. Using a Kalman filter, experimental data and uncertainties are introduced to constrain model parameters and construct an evaluated covariance matrix for the prompt neutron spectrum. Preliminary results are shown in the case of neutron-induced fission of ²³⁵U from thermal up to 15 MeV incident energies.
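The Kalman-filter step, folding experimental data into model parameters and producing an updated (evaluated) covariance, can be sketched generically (a textbook linear update, not the authors' specific implementation):

```python
import numpy as np

def kalman_update(p, P, resid, Cy, S):
    """One linear Kalman (generalized least-squares) update of model
    parameters against experimental data.
    p     : prior parameter vector       P  : prior parameter covariance
    resid : experiment minus model       Cy : experimental data covariance
    S     : sensitivity matrix d(model)/d(parameters)."""
    K = P @ S.T @ np.linalg.inv(S @ P @ S.T + Cy)  # Kalman gain
    p_post = p + K @ resid
    P_post = (np.eye(len(p)) - K @ S) @ P          # reduced posterior covariance
    return p_post, P_post

# One scalar parameter, one scalar datum, equal prior and data weights:
p_post, P_post = kalman_update(
    np.array([0.0]), np.array([[1.0]]),
    np.array([2.0]), np.array([[1.0]]), np.array([[1.0]]))
```

The posterior parameter covariance `P_post`, pushed back through the sensitivities, is what yields an evaluated covariance matrix for the spectrum itself.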

Talou, P. [T-16, Nuclear Physics Group, Los Alamos National Laboratory, NM 87545 (United States)], E-mail: talou@lanl.gov; Madland, D.G.; Kawano, T. [T-16, Nuclear Physics Group, Los Alamos National Laboratory, NM 87545 (United States)

2008-12-15

444

High-level waste qualification: Managing uncertainty

A vitrification facility is being developed by the U.S. Department of Energy (DOE) at the West Valley Demonstration Plant (WVDP) near Buffalo, New York, where approximately 300 canisters of high-level nuclear waste glass will be produced. To assure that the produced waste form is acceptable, uncertainty must be managed. Statistical issues arise due to sampling, waste variations, processing uncertainties, and analytical variations. This paper presents elements of a strategy to characterize and manage the uncertainties associated with demonstrating that an acceptable waste form product is achieved. Specific examples are provided within the context of statistical work performed by Pacific Northwest Laboratory (PNL).

Pulsipher, B.A.

1993-09-01

445

Uncertainty analysis for Ulysses safety evaluation report

NASA Astrophysics Data System (ADS)

As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator, the Interagency Nuclear Safety Review Panel (INSRP) performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

Frank, Michael V.

446

Two basic Uncertainty Relations in Quantum Mechanics

NASA Astrophysics Data System (ADS)

In the present article, we discuss two types of uncertainty relations in Quantum Mechanics: multiplicative and additive inequalities for two canonical observables. The multiplicative uncertainty relation was discovered by Heisenberg. A few years later, in 1930, Erwin Schrödinger generalized it and made it more precise than the original. The additive uncertainty relation is based on the three independent second statistical moments in Quantum Mechanics: Cov(q,p), Var(q), and Var(p). We discuss the symmetry of both types of relations and the applicability of the additive form to the estimation of the total error.
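For reference, the multiplicative relation below is the textbook Robertson-Schrödinger inequality; the additive form shown is one variant obtainable from it via the arithmetic-geometric mean inequality (for variables scaled to common units), and is not necessarily the exact form used in the paper.

```latex
% Multiplicative (Heisenberg 1927, strengthened by Schroedinger 1930):
\operatorname{Var}(q)\,\operatorname{Var}(p) \;\ge\; \frac{\hbar^{2}}{4} + \operatorname{Cov}(q,p)^{2}

% One additive variant (AM-GM applied to the relation above):
\operatorname{Var}(q) + \operatorname{Var}(p) \;\ge\; 2\sqrt{\frac{\hbar^{2}}{4} + \operatorname{Cov}(q,p)^{2}}
```

Here Cov(q,p) denotes the symmetrized quantum covariance, ⟨(qp+pq)/2⟩ − ⟨q⟩⟨p⟩.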

Angelow, Andrey

2011-04-01

447

Variational principles in general relativity

NASA Astrophysics Data System (ADS)

A fully relativistic description of the two-body problem in General Relativity is one of the outstanding unsolved problems in Einstein's theory. A detailed description of the orbital parameters of binary systems has become even more pressing with the advent of gravitational wave detection schemes. This thesis details our efforts to generate astrophysically interesting solutions to the two-body problem. The thesis consists of two main parts. The first part presents an analytical variational principle for describing binary neutron stars undergoing irrotational fluid flow. The variational principle is a powerful tool for generating accurate estimates of orbital parameters for neutron stars in quasi-equilibrium orbits. The second part of the thesis details the numerical application of the variational principle by solving the initial value problem for binary black holes in quasi-equilibrium circular orbits. The analysis draws from the novel “puncture” method of describing the black holes, and relies on nonlinear adaptive multigrid techniques for generating numerical results. These tools allow us to generate sequences of circular orbits, which in turn allow us to approximate the orbital parameters for the system up to and including the innermost stable circular orbit. We compare our numerical results to post-Newtonian expectations and we generate numerical data regarding the geometry and the gravitational radiation emitted by the system. We arrive at two important conclusions. First, the analytical variational principle describing binary neutron stars in irrotational motion provides a road map for future numerical simulations, and also lends credence to previous simulations by other authors. Second, the numerical application and description of binary black holes in quasi-equilibrium circular orbits simplifies the analyses of previous authors, and allows for the imposition of realistic boundary data in the simulations with relatively high grid densities.
Both the variational principle and its application may be used to help generate accurate estimates of the orbital parameters for waveform templates needed by gravitational wave observatories around the world.

Baker, Brian Douglas

2002-12-01

448

The Principle of Energetic Consistency

NASA Technical Reports Server (NTRS)

A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. 
The principle of energetic consistency implies that, to precisely the extent that growing modes are important in data assimilation, this term is also important.
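The conservation statement at the heart of the principle is easy to verify numerically in a toy setting. The sketch below uses a rotation matrix as a stand-in for energy-conserving linear dynamics (an assumption for illustration, not the advection dynamics of the abstract) and checks that |mean|² + trace(P) is invariant under the forecast step.

```python
import numpy as np

# Illustrative check of energetic consistency for linear, energy-conserving
# dynamics: if M is orthogonal (so the "energy" |x|^2 is conserved), then
# |mean|^2 + trace(P) is invariant under the forecast step
#   mean -> M @ mean,   P -> M @ P @ M.T

theta = 0.3
M = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation: M.T @ M = I

mean = np.array([1.0, 2.0])
P = np.array([[0.5, 0.1],
              [0.1, 0.3]])                        # conditional covariance

energy_before = mean @ mean + np.trace(P)
for _ in range(100):                              # 100 forecast steps
    mean = M @ mean
    P = M @ P @ M.T
energy_after = mean @ mean + np.trace(P)
```

Replacing M with a slightly dissipative matrix (norm less than one) makes `energy_after` drift downward, which is the spurious loss of total variance the abstract warns about.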

Cohn, Stephen E.

2009-01-01

449

Microform Calibration Uncertainties of Rockwell Diamond Indenters.

National Technical Information Service (NTIS)

National and international comparisons in Rockwell hardness tests show significant differences. Uncertainties in the geometry of the Rockwell diamond indenters are largely responsible for these differences. By using a stylus instrument, with a series of c...

J. F. Song; F. F. Rudder; T. V. Vorburger; J. H. Smith

1995-01-01

450

QUANTIFICATION OF UNCERTAINTY IN COMPUTATIONAL FLUID DYNAMICS

This review covers Verification, Validation, Confirmation and related subjects for computational fluid dynamics (CFD), including error taxonomies, error estimation and banding, convergence rates, surrogate estimators, nonlinear dynamics, and error estimation for grid adaptation vs Quantification of Uncertainty.

P. J. Roache

1997-01-01

451

Approximate Techniques for Representing Nuclear Data Uncertainties

Computational tools are available to utilize sensitivity and uncertainty (S/U) methods for a wide variety of applications in reactor analysis and criticality safety. S/U analysis generally requires knowledge of the underlying uncertainties in evaluated nuclear data, as expressed by covariance matrices; however, only a few nuclides currently have covariance information available in ENDF/B-VII. Recently new covariance evaluations have become available for several important nuclides, but a complete set of uncertainties for all materials needed in nuclear applications is unlikely to be available for several years at least. Therefore if the potential power of S/U techniques is to be realized for near-term projects in advanced reactor design and criticality safety analysis, it is necessary to establish procedures for generating approximate covariance data. This paper discusses an approach to create applications-oriented covariance data by applying integral uncertainties to differential data within the corresponding energy range.
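The construction described above, applying an integral uncertainty uniformly to differential data within an energy range, can be sketched as a block covariance matrix with full correlation inside each range. All group boundaries, cross sections, and uncertainty values below are illustrative placeholders, not evaluated nuclear data.

```python
import numpy as np

# Hedged sketch: build an approximate covariance matrix by applying a single
# integral relative uncertainty to all differential (group-wise) data within
# an energy range, fully correlated inside each range, uncorrelated between
# ranges.  All numbers are illustrative placeholders.

sigma = np.array([4.2, 3.1, 2.5, 1.8, 1.2, 0.9])   # group cross sections (barns)

# Two energy ranges: groups 0-2 carry a 5% integral uncertainty,
# groups 3-5 carry 10%.
ranges = [(slice(0, 3), 0.05), (slice(3, 6), 0.10)]

cov = np.zeros((6, 6))
for sl, rel in ranges:
    s = sigma[sl]
    cov[sl, sl] = np.outer(rel * s, rel * s)       # rank-1 fully-correlated block

std = np.sqrt(np.diag(cov))                        # group standard deviations
```

Each block is rank one by construction, which encodes the assumption that the only information available is a single integral measurement per range; more structure would require genuine differential covariance data.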

Williams, Mark L. [ORNL]; Broadhead, Bryan L. [ORNL]; Dunn, Michael E. [ORNL]; Rearden, Bradley T. [ORNL]

2007-01-01

452

Games with Non-Probabilistic Uncertainty.

National Technical Information Service (NTIS)

The thesis studies games with non-probabilistic uncertainty about some parameters that affect the rewards of the players. The goal is to understand whether players should be optimistic or pessimistic in such situations. The first chapter provides a brief ...

J. W. Lee

2010-01-01

453

Effect of Uncertainty on Deterministic Runway Scheduling

NASA Technical Reports Server (NTRS)

Active runway scheduling involves scheduling departures for takeoff and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler, comparing it against a first-come-first-serve (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is then tested against FCFS. The comparison covers both system performance (throughput and system delay) and predictability, at varying levels of congestion. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as uncertainty that increases for later aircraft. Results indicate that the deterministic approach consistently outperforms first-come-first-serve in both system performance and predictability.
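The first-come-first-serve baseline against which the scheduler is compared can be sketched in a few lines: serve aircraft in order of ready time, delaying each as needed to honor a minimum separation. The times and separation value are illustrative, not from the paper.

```python
# Hedged sketch of a first-come-first-serve runway baseline: aircraft take
# the runway in order of ready time, delayed as needed to honor a minimum
# separation.  Times and separation are illustrative placeholders.

def fcfs_schedule(ready_times, separation):
    """Return runway times for aircraft served in ready-time order."""
    times = []
    last = float("-inf")
    for r in sorted(ready_times):
        t = max(r, last + separation)   # wait for readiness AND separation
        times.append(t)
        last = t
    return times

ready = [0.0, 10.0, 11.0, 12.0, 90.0]       # seconds (illustrative)
schedule = fcfs_schedule(ready, separation=60.0)

# delay accumulates whenever aircraft bunch more tightly than the separation
total_delay = sum(t - r for t, r in zip(schedule, sorted(ready)))
```

A deterministic scheduler improves on this by also choosing the sequence, not just the times; the paper's point is that the advantage survives even when ready times are uncertain.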

Gupta, Gautam; Malik, Waqar; Jung, Yoon C.

2012-01-01

454

Performance Incentives and Planning under Uncertainty.

National Technical Information Service (NTIS)

The report discusses the use of the performance incentive function (PIF) by planning organizations when there is subjective or objective uncertainty. It is proved that a PIF can be constructed which achieves both allocational and distributional optimality...

G. G. Hildebrandt; L. D. Tyson

1976-01-01

455

Management of Uncertainty in Military Scene Analysis.

National Technical Information Service (NTIS)

Research into modeling and managing uncertainty in military scene analysis is presented. A new method of logical inference was developed for the case where propositions are modeled by possibility distributions. This scheme was tested in a prototype fuzzy ...

J. M. Keller; R. M. Crownover; R. W. McLaren

1988-01-01

456

Measurement uncertainty of silicone oil leakage testing.

National Technical Information Service (NTIS)

An evaluation has been performed to determine the uncertainty of silicone tracer fluid leakage measurements for an environmental sensing device. The units are tested with an instrument which can measure silicone tracer fluid vapor by the gas chromatograph...

W. E. Holland

1991-01-01

457

Quantifying reliability uncertainty : a proof of concept.

This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
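The Bayesian side of the approach described above can be sketched for go/no-go data: each component's success probability gets a Beta posterior, and series-system reliability is the product, sampled by Monte Carlo. The test counts, prior choice, and system layout below are illustrative assumptions, not the paper's actual data; note the zero-failure, few-test component, whose wide posterior reproduces the sensitivity the abstract mentions.

```python
import numpy as np

# Hedged sketch of Bayesian reliability quantification for go/no-go data:
# each component's success probability gets a Beta posterior (uniform
# Beta(1,1) prior -- an assumption), and series-system reliability is the
# product of component reliabilities, sampled by Monte Carlo.

rng = np.random.default_rng(42)

# (tests, failures) per component in a 3-component series system
data = [(50, 1), (30, 0), (5, 0)]   # last: zero failures in very few tests

samples = np.ones(100_000)
for n, f in data:
    # Beta(1 + successes, 1 + failures) posterior for this component
    samples *= rng.beta(1 + n - f, 1 + f, size=samples.size)

# credible interval for system reliability
lower, median, upper = np.quantile(samples, [0.05, 0.5, 0.95])
```

Dropping the zero-failure component's test count from 5 to 50 visibly tightens the interval, which is the kind of sensitivity the abstract reports for components with no observed failures.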

Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F.; Fatherley, Quinn (Los Alamos National Laboratory, Los Alamos, NM); Anderson-Cook, Christine (Los Alamos National Laboratory, Los Alamos, NM); Wilson, Alyson G. (Los Alamos National Laboratory, Los Alamos, NM); Zurn, Rena M.

2009-10-01

458

Mitigating Provider Uncertainty in Service Provision Contracts.

National Technical Information Service (NTIS)

Uncertainty is an inherent property of open, distributed and multi-party systems. The economic viability of mutually beneficial relationships between the constituent parties in these systems relies on the ability of each party to effectively quantify and ...

A. van Moorsel; C. Smith

2007-01-01

459

Radiometer Design Analysis Based Upon Measurement Uncertainty

NASA Technical Reports Server (NTRS)

This paper introduces a method for predicting the performance of a radiometer design based on calculating the measurement uncertainty. The variety in radiometer designs and the demand for improved radiometric measurements justify the need for a more general and comprehensive method to assess system performance. Radiometric resolution, or sensitivity, is a figure of merit that has been commonly used to characterize the performance of a radiometer. However, when evaluating the performance of a calibration design for a radiometer, the use of radiometric resolution has limited application. These limitations are overcome by considering instead the measurement uncertainty. A method for calculating measurement uncertainty for a generic radiometer design, including its calibration algorithm, is presented. The result is a generalized technique by which system calibration architectures and design parameters can be studied to optimize instrument performance for given requirements and constraints. Example applications demonstrate the utility of using measurement uncertainty as a figure of merit.
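The flavor of such a calculation can be sketched for the simplest case: a two-point (hot/cold reference) calibration, with input uncertainties propagated to the retrieved scene temperature by first-order Gaussian propagation. The calibration model and every numeric value below are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

# Hedged sketch: propagate calibration uncertainties through a two-point
# (hot/cold reference) radiometer calibration into the retrieved scene
# brightness temperature.  All values are illustrative placeholders.

def retrieve(Th, Tc, vh, vc, vs):
    """Linear two-point calibration: map scene counts vs to temperature."""
    gain = (Th - Tc) / (vh - vc)
    return Tc + gain * (vs - vc)

# nominal inputs: reference temperatures (K) and detector counts
x0 = dict(Th=300.0, Tc=77.0, vh=820.0, vc=310.0, vs=600.0)
# 1-sigma uncertainty of each input (illustrative)
sig = dict(Th=0.2, Tc=0.5, vh=1.0, vc=1.0, vs=1.5)

T0 = retrieve(**x0)

# first-order (Gaussian) propagation with finite-difference sensitivities
var = 0.0
for name, s in sig.items():
    xp = dict(x0)
    xp[name] += 1e-4
    dT_dx = (retrieve(**xp) - T0) / 1e-4   # sensitivity dT/d(input)
    var += (dT_dx * s) ** 2

uT = np.sqrt(var)   # combined standard uncertainty of retrieved T (K)
```

Repeating the calculation across candidate calibration architectures and design parameters, as the paper proposes, turns `uT` into a figure of merit for comparing designs.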

Racette, Paul E.; Lang, Roger H.

2004-01-01

460

Numerical Uncertainty Quantification for Radiation Analysis Tools.

National Technical Information Service (NTIS)

Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle d...

B. Anderson; M. Clowdsley; S. Blattnig

2007-01-01

461

Uncertainty Quantification using Exponential Epi-Splines.

National Technical Information Service (NTIS)

We quantify uncertainty in complex systems by a flexible, nonparametric framework for estimating probability density functions of output quantities of interest. The framework systematically incorporates soft information about the system from engineering j...

J. O. Royset; N. Sukumar; R. J. Wets

2013-01-01

462

Runoff Prediction Uncertainty for Ungauged Agricultural Watersheds.

National Technical Information Service (NTIS)

A physically based stochastic watershed model is used to estimate runoff prediction uncertainty for small agricultural watersheds in Hastings, Nebraska. The stochastic nature of the model results from postulating a probabilistic model for parameter estima...

D. M. Goldman; M. A. Marino; A. D. Feldman

1990-01-01

463

Uncertainties in Solar Synoptic Magnetic Flux Maps

NASA Astrophysics Data System (ADS)

Magnetic flux synoptic charts are critical for a reliable modeling of the corona and heliosphere. Until now, however, these charts were provided without uncertainty estimates. The uncertainties are due to instrumental noise in the measurements and to the spatial variance of the magnetic flux distribution that contributes to each bin in the synoptic chart. We describe here a simple method to compute synoptic magnetic flux maps and their corresponding magnetic flux spatial variance charts that can be used to estimate the uncertainty in the results of coronal models. We have tested this approach by computing a potential-field source-surface model of the coronal field for a Monte Carlo simulation of Carrington synoptic magnetic flux maps generated from the variance map. We show that these