Fischer, Andreas
2016-11-01
Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exists, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
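Restated compactly (the symbols below are our labels, not the paper's notation), the fundamental limit described above scales as

$$ \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}}, $$

where σ_v is the achievable velocity uncertainty, |v| the absolute value of the velocity, and P_s the scattered light power; the same scaling is obtained for both the Doppler and the time-of-flight principle.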
Graham, Jeffrey K; Smith, Myron L; Simons, Andrew M
2014-07-22
All organisms are faced with environmental uncertainty. Bet-hedging theory expects unpredictable selection to result in the evolution of traits that maximize the geometric-mean fitness even though such traits appear to be detrimental over the shorter term. Despite the centrality of fitness measures to evolutionary analysis, no direct test of the geometric-mean fitness principle exists. Here, we directly distinguish between predictions of competing fitness maximization principles by testing Cohen's 1966 classic bet-hedging model using the fungus Neurospora crassa. The simple prediction is that propagule dormancy will evolve in proportion to the frequency of 'bad' years, whereas the prediction of the alternative arithmetic-mean principle is the evolution of zero dormancy as long as the expectation of a bad year is less than 0.5. Ascospore dormancy fraction in N. crassa was allowed to evolve under five experimental selection regimes that differed in the frequency of unpredictable 'bad years'. Results were consistent with bet-hedging theory: final dormancy fraction in 12 genetic lineages across 88 independently evolving samples was proportional to the frequency of bad years, and evolved both upwards and downwards as predicted from a range of starting dormancy fractions. These findings suggest that selection results in adaptation to variable rather than to expected environments. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
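A minimal Cohen-style sketch (our notation; the payoff structure is illustrative and not taken from the paper) makes the two competing fitness principles concrete. With dormancy fraction d, germination payoff B in a good year, dormant survival s, and bad-year frequency p, the yearly fitnesses are W_good = (1-d)B + ds and W_bad = ds, so

$$ \bar{W}_{\text{arith}} = (1-p)\,W_{\text{good}} + p\,W_{\text{bad}}, \qquad \bar{W}_{\text{geom}} = W_{\text{good}}^{\,1-p}\, W_{\text{bad}}^{\,p}. $$

The arithmetic mean is linear in d and is maximized at d = 0 whenever (1-p)B > s, whereas the geometric mean vanishes at d = 0 (a single bad year eliminates the lineage) and is maximized at an interior dormancy fraction that increases with p, which is the bet-hedging prediction tested above.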
Statistical Entropy of Dirac Field Outside RN Black Hole and Modified Density Equation
NASA Astrophysics Data System (ADS)
Cao, Fei; He, Feng
2012-02-01
Statistical entropy of the Dirac field in the Reissner-Nordström black hole space-time is computed using the state density equation corrected by the generalized uncertainty principle to all orders in the Planck length, together with the WKB approximation. The result shows that the statistical entropy is proportional to the horizon area, and the present result is convergent without any artificial cutoff.
Statistical Entropy of the G-H-S Black Hole to All Orders in Planck Length
NASA Astrophysics Data System (ADS)
Sun, Hangbin; He, Feng; Huang, Hai
2012-02-01
Considering corrections to all orders in the Planck length to the quantum state density from the generalized uncertainty principle, we calculate the statistical entropy of the scalar field near the horizon of the Garfinkle-Horowitz-Strominger (G-H-S) black hole without any artificial cutoff. It is shown that the entropy is proportional to the horizon area.
Continuous quantum measurements and the action uncertainty principle
NASA Astrophysics Data System (ADS)
Mensky, Michael B.
1992-09-01
The path-integral approach to the quantum theory of continuous measurements has been developed in preceding works of the author. According to this approach, the measurement amplitude determining the probabilities of different outputs of the measurement can be evaluated in the form of a restricted path integral (a path integral "in finite limits"). With the help of the measurement amplitude, the maximum deviation of measurement outputs from the classical one can easily be determined. The aim of the present paper is to express this result in the simpler and more transparent form of a specific uncertainty principle (called the action uncertainty principle, AUP). The simplest (but weak) form of the AUP is δS ≳ ℏ, where S is the action functional. It can be applied for a simple derivation of the Bohr-Rosenfeld inequality for the measurability of the gravitational field. A stronger form of the AUP, with wider application (for ideal measurements performed in the quantum regime), is |∫_{t'}^{t''} (δS[q]/δq(t)) Δq(t) dt| ≃ ℏ, where the paths [q] and [Δq] stand, respectively, for the measurement output and the measurement error. It can also be presented in the symbolic form Δ(Equation) Δ(Path) ≃ ℏ. This means that the deviation of the observed (measured) motion from that obeying the classical equation of motion is inversely proportional to the uncertainty in the path (the latter uncertainty resulting from the measurement error). A consequence of the AUP is that improving the measurement precision beyond the threshold of the quantum regime decreases the information resulting from the measurement.
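In display form, the two statements of the AUP quoted in this abstract read

$$ \delta S \;\gtrsim\; \hbar \qquad \text{and} \qquad \left|\int_{t'}^{t''} \frac{\delta S[q]}{\delta q(t)}\,\Delta q(t)\,dt\right| \;\simeq\; \hbar, $$

with S the action functional, [q] the measurement output path, and [Δq] the measurement error.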
What is the uncertainty principle of non-relativistic quantum mechanics?
NASA Astrophysics Data System (ADS)
Riggs, Peter J.
2018-05-01
After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
Statistical Entropy of Vaidya-de Sitter Black Hole to All Orders in Planck Length
NASA Astrophysics Data System (ADS)
Sun, HangBin; He, Feng; Huang, Hai
2012-06-01
Considering corrections to all orders in the Planck length to the quantum state density from the generalized uncertainty principle, we calculate the statistical entropy of the scalar field near the event horizon and the cosmological horizon of the Vaidya-de Sitter black hole without any artificial cutoff. It is shown that the entropy is a linear sum of the event horizon area and the cosmological horizon area, with similar proportionality parameters related to the changing rate of the horizon position. This differs from the static and stationary cases.
Black hole complementarity with the generalized uncertainty principle in Gravity's Rainbow
NASA Astrophysics Data System (ADS)
Gim, Yongwan; Um, Hwajin; Kim, Wontae
2018-02-01
When gravitation is combined with quantum theory, the Heisenberg uncertainty principle could be extended to the generalized uncertainty principle, which is accompanied by a minimal length. To see how the generalized uncertainty principle works in the context of black hole complementarity, we calculate the energy required to duplicate information for the Schwarzschild black hole. It shows that the duplication of information is not allowed, and black hole complementarity remains valid even assuming the generalized uncertainty principle. On the other hand, the generalized uncertainty principle with the minimal length could lead to a modification of the conventional dispersion relation in light of Gravity's Rainbow, where the minimal length is invariant as well as the speed of light. Revisiting the gedanken experiment, we show that the no-cloning theorem for black hole complementarity can be made valid in the regime of Gravity's Rainbow for a certain combination of parameters.
Gamma-Ray Telescope and Uncertainty Principle
ERIC Educational Resources Information Center
Shivalingaswamy, T.; Kagali, B. A.
2012-01-01
Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faizal, Mir, E-mail: f2mir@uwaterloo.ca; Majumder, Barun, E-mail: barunbasanta@iitgn.ac.in
In this paper, we incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. After that we incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nizami, Lance
2010-03-01
Norwich's Entropy Theory of Perception (1975-present) is a general theory of perception based on Shannon's Information Theory. Among many bold claims, the Entropy Theory presents a truly astounding result: that Stevens' Law with an index of 1, an empirical power relation of direct proportionality between perceived taste intensity and stimulus concentration, arises from theory alone. Norwich's theorizing starts with several extraordinary hypotheses. First, 'multiple, parallel receptor-neuron units' without collaterals 'carry essentially the same message to the brain', i.e. the rate-level curves are identical. Second, sensation is proportional to firing rate. Third, firing rate is proportional to the taste receptor's 'resolvable uncertainty'. Fourth, the 'resolvable uncertainty' is obtained from Shannon's Information Theory. Finally, 'resolvable uncertainty' also depends upon the microscopic thermodynamic density fluctuation of the tasted solute. Norwich proves that density fluctuation is density variance, which is proportional to solute concentration, all based on the theory of fluctuations in fluid composition from Tolman's classic physics text, 'The Principles of Statistical Mechanics'. Altogether, according to Norwich, perceived taste intensity is theoretically proportional to solute concentration. Such a universal rule for taste, one that is independent of solute identity, personal physiological differences, and psychophysical task, is truly remarkable and is well-deserving of scrutiny. Norwich's crucial step was the derivation of density variance. That step was meticulously reconstructed here. It transpires that the appropriate fluctuation is Tolman's mean-square fractional density fluctuation, not density variance as used by Norwich. Tolman's algebra yields a 'Stevens Index' of -1 rather than 1. As the 'Stevens Index' empirically always exceeds zero, the index of -1 suggests that it is risky to infer psychophysical laws of sensory response from information theory and stimulus physics while ignoring empirical biological transformations, such as sensory transduction. Indeed, it raises doubts as to whether the Entropy Theory actually describes psychophysical laws at all.
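The sign flip at the heart of this critique can be reconstructed in two lines (our notation, assuming with Norwich that perceived intensity tracks the chosen fluctuation measure). For a dilute solute, particle-number fluctuations in a fixed sensing volume obey ⟨(ΔN)²⟩ = ⟨N⟩, and concentration c ∝ ⟨N⟩, so

$$ \langle(\Delta N)^{2}\rangle \propto c \;\Rightarrow\; \text{index } +1, \qquad \left\langle \left(\frac{\Delta N}{\langle N\rangle}\right)^{2} \right\rangle \propto \frac{1}{c} \;\Rightarrow\; \text{index } -1, $$

so replacing the variance with Tolman's mean-square fractional fluctuation reverses the predicted Stevens index.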
Extrapolation, uncertainty factors, and the precautionary principle.
Steel, Daniel
2011-09-01
This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards. Copyright © 2011 Elsevier Ltd. All rights reserved.
Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?
ERIC Educational Resources Information Center
Robertson, Bill
2016-01-01
Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…
NASA Astrophysics Data System (ADS)
Li, Ziyi
2017-12-01
Generalized uncertainty principle (GUP), also known as the generalized uncertainty relation, is the modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation", which is about the size of the Planck scale (10^-35 m). Taking into account this basic scale of existence, we need to fix a new common form of Heisenberg's uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially for conditions of high temperature and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics, but the present theory of the femtosecond laser is still built on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. And we designed three typical systems from micro to macro size to estimate the feasibility of our theoretical model and method, respectively in a chemical solution, a crystal lattice and a nuclear fission reactor.
Entropy of a (1+1)-dimensional charged black hole to all orders in the Planck length
NASA Astrophysics Data System (ADS)
Kim, Yong-Wan; Park, Young-Jai
2013-02-01
We study the statistical entropy of a scalar field on the (1+1)-dimensional Maxwell-dilaton background without an artificial cutoff by considering corrections to all orders in the Planck length, obtained from a generalized uncertainty principle applied to the quantum state density. In contrast to the previous results for d ≥ 3 dimensional cases, we obtain an entropy that cannot be adjusted, because it is independent of the minimal length, which would otherwise play the role of an adjustable parameter. However, this entropy is still proportional to the Bekenstein-Hawking entropy.
The Uncertainty Principle in the Presence of Quantum Memory
NASA Astrophysics Data System (ADS)
Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato
2010-03-01
One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
Disturbance, the uncertainty principle and quantum optics
NASA Technical Reports Server (NTRS)
Martens, Hans; Demuynck, Willem M.
1993-01-01
It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.
On different types of uncertainties in the context of the precautionary principle.
Aven, Terje
2011-10-01
Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.
Verification of the Uncertainty Principle by Using Diffraction of Light Waves
ERIC Educational Resources Information Center
Nikolic, D.; Nesic, Lj
2011-01-01
We described a simple idea for experimental verification of the uncertainty principle for light waves. We used a single-slit diffraction of a laser beam for measuring the angular width of zero-order diffraction maximum and obtained the corresponding wave number uncertainty. We will assume that the uncertainty in position is the slit width. For the…
Entropy bound of local quantum field theory with generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Kim, Yong-Wan; Lee, Hyung Won; Myung, Yun Soo
2009-03-01
We study the entropy bound for local quantum field theory (LQFT) with the generalized uncertainty principle. The generalized uncertainty principle naturally provides a UV cutoff to the LQFT as a gravity effect. Imposing the non-gravitational collapse condition as the UV-IR relation, we find that the maximal entropy of a bosonic field is limited by the entropy bound A^{3/4} rather than A, with A the boundary area.
ERIC Educational Resources Information Center
Harbola, Varun
2011-01-01
In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…
The principle of proportionality revisited: interpretations and applications.
Hermerén, Göran
2012-11-01
The principle of proportionality is used in many different contexts. Some of these uses and contexts are first briefly indicated. This paper focusses on the use of this principle as a moral principle. I argue that under certain conditions the principle of proportionality is helpful as a guide in decision-making. But it needs to be clarified and to be used with some flexibility as a context-dependent principle. Several interpretations of the principle are distinguished, using three conditions as a starting point: importance of objective, relevance of means, and most favourable option. The principle is then tested against an example, which suggests that a fourth condition, focusing on non-excessiveness, needs to be added. I will distinguish between three main interpretations of the principle, some primarily with uses in research ethics, others with uses in other areas of bioethics, for instance in comparisons of therapeutic means and ends. The relations between the principle of proportionality and the precautionary principle are explored in the following section. It is concluded that the principles are different and may even clash. In the next section the principle of proportionality is applied to some medical examples drawn from research ethics and bioethics. In concluding, the status of the principle of proportionality as a moral principle is discussed. What has been achieved so far and what remains to be done is finally summarized.
Comparison of Classical and Quantum Mechanical Uncertainties.
ERIC Educational Resources Information Center
Peslak, John, Jr.
1979-01-01
Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)
Self-completeness and the generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero
2013-11-01
The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.
Quantum corrections to newtonian potential and generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Scardigli, Fabio; Lambiase, Gaetano; Vagenas, Elias
2017-08-01
We use the leading quantum corrections to the Newtonian potential to compute the deformation parameter of the generalized uncertainty principle. By assuming only general relativity as the theory of gravitation, and the thermal nature of the GUP corrections to the Hawking spectrum, our calculation gives, to first order, a specific numerical result. We briefly discuss the physical meaning of this value and compare it with the previously obtained bounds on the generalized uncertainty principle deformation parameter.
Uncertainty principle in loop quantum cosmology by Moyal formalism
NASA Astrophysics Data System (ADS)
Perlov, Leonid
2018-03-01
In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle is between the variable c, with the meaning of connection, and μ, having the meaning of the physical cell volume to the power 2/3, i.e., v^{2/3}, or a plaquette area. Since both μ and c are not operators, but rather random variables, the Robertson derivation of the uncertainty principle, which works for Hermitian operators, cannot be used. Instead we use the Wigner-Moyal-Groenewold phase-space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive from it both canonical and path-integral quantum mechanics, as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is the expression for the Wigner function on the space of cylindrical wave functions defined on R_b in the c variables rather than in the dual-space μ variables.
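For orientation, the standard quantum-mechanical Wigner function on which the Moyal formalism is built (before its adaptation here to the holonomy-flux algebra) is

$$ W(q,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty} \psi^{*}(q+y)\,\psi(q-y)\,e^{2ipy/\hbar}\,dy, $$

from which uncertainty relations follow by taking moments of q and p against W.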
An uncertainty principle for unimodular quantum groups
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crann, Jason; Université Lille 1 - Sciences et Technologies, UFR de Mathématiques, Laboratoire de Mathématiques Paul Painlevé - UMR CNRS 8524, 59655 Villeneuve d'Ascq Cédex; Kalantar, Mehrdad, E-mail: jason-crann@carleton.ca, E-mail: mkalanta@math.carleton.ca
2014-08-15
We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
NASA Astrophysics Data System (ADS)
Latifah, E.; Imanullah, M. N.
2018-03-01
One of the objectives of fisheries management is to reach long-term sustainable benefits of the fish stocks while reducing the risk of severe or irreversible damage to the marine ecosystem. Achieving this objective requires good scientific knowledge and understanding of fisheries management, including scientific data and information on the fish stock, fishing catch, distribution, migration, the proportion of mature fish, the mortality rate and reproduction, as well as knowledge of the impact of fishing on dependent and associated species and other species belonging to the same ecosystem, and further the impact of climate change and climate variability on the fish stocks and marine ecosystem. Lack of this scientific knowledge may lead to high levels of uncertainty. The precautionary principle is one of the basic environmental principles needed in overcoming this problem. An essence of this principle is that, in the face of serious risk resulting from limited scientific knowledge or the absence of complete evidence of harm, this should not prevent precautionary measures from being taken to minimize risks and protect the fish stocks and ecosystem. This study aims to examine how the precautionary principle in fisheries management is formulated in the international legal framework, especially under the climate change framework.
Constraining the generalized uncertainty principle with the atomic weak-equivalence-principle test
NASA Astrophysics Data System (ADS)
Gao, Dongfeng; Wang, Jin; Zhan, Mingsheng
2017-04-01
Various models of quantum gravity imply Planck-scale modifications of Heisenberg's uncertainty principle into a so-called generalized uncertainty principle (GUP). The GUP effects on high-energy physics, cosmology, and astrophysics have been extensively studied. Here, we focus on the weak-equivalence-principle (WEP) violation induced by the GUP. Results from the WEP test with the 85Rb-87Rb dual-species atom interferometer are used to set upper bounds on parameters in two GUP proposals. A 10^45-level bound on the Kempf-Mangano-Mann proposal and a 10^27-level bound on Maggiore's proposal, which are consistent with bounds from other experiments, are obtained. All these bounds have huge room for improvement in the future.
NASA Astrophysics Data System (ADS)
Williams, Q.
2018-05-01
The thermal conductivity of iron alloys at high pressures and temperatures is a critical parameter in governing (a) the present-day heat flow out of Earth's core, (b) the inferred age of Earth's inner core, and (c) the thermal evolution of Earth's core and lowermost mantle. It is, however, one of the least well-constrained important geophysical parameters, with current estimates for end-member iron under core-mantle boundary conditions varying by about a factor of 6. Here, the current state of calculations, measurements, and inferences that constrain thermal conductivity at core conditions is reviewed. The applicability of the Wiedemann-Franz law, commonly used to convert electrical resistivity data to thermal conductivity data, is probed: here, whether the constant of proportionality, the Lorenz number, remains constant at extreme conditions is of vital importance. Electron-electron inelastic scattering and increases in Fermi-liquid-like behavior may cause uncertainties in thermal conductivities derived from both first-principles calculations and electrical conductivity measurements. Additional uncertainties include the role of alloying constituents and local magnetic moments of iron in modulating the thermal conductivity. Thus, uncertainties in thermal conductivity remain pervasive, and hence a broad range of core heat flows and inner-core ages appear to remain plausible.
Moutel, G; Hergon, E; Duchange, N; Bellier, L; Rouger, P; Hervé, C
2005-02-01
The precautionary principle first appeared in France during the health crisis following the contamination of patients with HIV via blood transfusion. This study analyses whether the risk associated with blood transfusion was taken into account early enough considering the context of scientific uncertainty between 1982 and 1985. The aim was to evaluate whether a precautionary principle was applied and whether it was relevant. First, we investigated the context of scientific uncertainty and controversies prevailing between 1982 and 1985. Then we analysed the attitude and decisions of the French authorities in this situation to determine whether a principle of precaution was applied. Finally, we explored the reasons at the origin of the delay in controlling the risk. Despite the scientific uncertainties associated with the potential risk of HIV contamination by transfusion in 1983, we found that a list of recommendations aiming to reduce this risk was published in June of that year. In the prevailing climate of uncertainty, these measures could be seen as precautionary. However, the recommended measures were not widely applied. Cultural, structural and economic factors hindered their implementation. Our analysis provides insight into the use of precautionary principle in the domain of blood transfusion and, more generally, medicine. It also sheds light on the expectations that health professionals should have of this principle. The aim of the precautionary principle is to manage rather than to reduce scientific uncertainty. The principle is not a futile search for zero risk. Rather, it is a principle for action allowing precautionary measures to be taken. However, we show that these measures must appear legitimate to be applied. This legitimacy requires an adapted decision-making process, involving all those concerned in the management of collective risks.
Uncertainty principles for inverse source problems for electromagnetic and elastic waves
NASA Astrophysics Data System (ADS)
Griesmaier, Roland; Sylvester, John
2018-06-01
In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.
Single-Slit Diffraction and the Uncertainty Principle
ERIC Educational Resources Information Center
Rioux, Frank
2005-01-01
A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.
NASA Astrophysics Data System (ADS)
Velentzas, Athanasios; Halkia, Krystallia
2011-08-01
In this work an attempt is made to explore the possible value of using Thought Experiments (TEs) in teaching physics to upper secondary education students. Specifically, a qualitative research project is designed to investigate the extent to which the Thought Experiment (TE) called `Heisenberg's Microscope', as it has been transformed by Gamow for the public in his book Mr. Tompkins in Paperback, can function as a tool in the teaching of the `uncertainty principle'. The sample in the research consisted of 40 Greek students, in 11 groups of 3-4 students each. The findings of this study reveal that the use of this TE has positive results in teaching the uncertainty principle. Students, based on the TE, were able (i) to derive a formula of the uncertainty principle, (ii) to explain that the uncertainty principle is a general principle in nature and it is not a result of incompleteness of the experimental devices and (iii) to argue that it is impossible to determine the trajectory of a particle as a mathematical line.
The action uncertainty principle and quantum gravity
NASA Astrophysics Data System (ADS)
Mensky, Michael B.
1992-02-01
Results of the path-integral approach to the quantum theory of continuous measurements were formulated in a preceding paper in the form of an inequality of the type of the uncertainty principle. The new inequality was called the action uncertainty principle, AUP. It was shown that the AUP allows one to find, in a simple way, which outputs of the continuous measurements will occur with high probability. Here a simpler form of the AUP will be formulated, δS ≳ ℏ. When applied to quantum gravity, it leads in a very simple way to the Rosenfeld inequality for the measurability of the average curvature.
An uncertainty budget for VHF and UHF reflectometers
NASA Astrophysics Data System (ADS)
Ridler, N. M.; Medley, C. J.
1992-05-01
Details of the derivation of an uncertainty budget for one-port immittance or complex voltage reflection coefficient measuring instruments, operating at VHF and UHF in the 14 mm 50 ohm coaxial line size, are reported. The principles of the uncertainty budget are given, along with experimental results obtained using six-port reflectometers and a network analyzer as the measuring instruments. Details of the types of calibration for which the uncertainty budget is suitable are reported. Various aspects of the uncertainty budget are considered, and general principles and the treatment of the type A and type B contributions are discussed. Experimental results obtained using the uncertainty budget are given. A summary of uncertainties for the six-port reflectometers and the HP8753B automatic network analyzer is also given.
Fries, James F; Krishnan, Eswar
2004-01-01
The concept of 'equipoise', or the 'uncertainty principle', has been represented as a central ethical principle, and holds that a subject may be enrolled in a randomized controlled trial (RCT) only if there is true uncertainty about which of the trial arms is most likely to benefit the patient. We sought to estimate the frequency with which equipoise conditions were met in industry-sponsored RCTs in rheumatology, to explore the reasons for any deviations from equipoise, to examine the concept of 'design bias', and to consider alternative ethical formulations that might improve subject safety and autonomy. We studied abstracts accepted for the 2001 American College of Rheumatology meetings that reported RCTs, acknowledged industry sponsorship, and had clinical end-points (n = 45), and examined the proportion of studies that favored the registration or marketing of the sponsor's drug. In every trial (45/45) results were favorable to the sponsor, indicating that results could have been predicted in advance solely by knowledge of sponsorship (P < 0.0001). Equipoise clearly was being systematically violated. Publication bias appeared to be an incomplete explanation for this dramatic result; this bias occurs after a study is completed. Rather, we hypothesize that 'design bias', in which extensive preliminary data are used to design studies with a high likelihood of being positive, is the major cause of the asymmetric results. Design 'bias' occurs before the trial is begun and is inconsistent with the equipoise principle. However, design bias increases scientific efficiency, decreases drug development costs, and limits the number of subjects required, probably reducing aggregate risks to participants. Conceptual and ethical issues were found with the equipoise principle, which encourages performance of negative studies; ignores patient values, patient autonomy, and social benefits; is applied at a conceptually inappropriate decision point (after randomization rather than before); and is in conflict with the Belmont, Nuremberg, and other sets of ethical principles, as well as with US Food and Drug Administration procedures. We propose a principle of 'positive expected outcomes', which informs the assessment that a trial is ethical, together with a restatement of the priority of personal autonomy.
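The headline statistic admits a one-line check: under strict equipoise each trial would favor the sponsor's arm with probability 1/2, so the chance of 45 favorable results out of 45 independent trials is

$$ P \;=\; (1/2)^{45} \;\approx\; 2.8\times10^{-14}, $$

far below the P < 0.0001 quoted above.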
Polar Wavelet Transform and the Associated Uncertainty Principles
NASA Astrophysics Data System (ADS)
Shah, Firdous A.; Tantary, Azhar Y.
2018-06-01
The polar wavelet transform, a generalized form of the classical wavelet transform, has been extensively used in science and engineering for finding directional representations of signals in higher dimensions. The aim of this paper is to establish new uncertainty principles associated with the polar wavelet transforms in L2(R2). Firstly, we study some basic properties of the polar wavelet transform and then derive the associated generalized version of the Heisenberg-Pauli-Weyl inequality. Finally, following the idea of Beckner (Proc. Amer. Math. Soc. 123, 1897-1905, 1995), we derive the logarithmic version of the uncertainty principle for the polar wavelet transforms in L2(R2).
Time-Frequency Representations for Speech Signals.
1987-06-01
and subsequent processing can take these weights into account. This is, in principle, safer, but practically it is much harder to think about processing...and frequency along the other. But how should this idea be made precise (the well-known uncertainty principle of Fourier analysis is one of the thorny...produce similar results. 2.3. Non-stationarity ...it is the unique shape that meets the uncertainty principle with equality. 2.2. The quasi-stationary
The Uncertainty Principle, Virtual Particles and Real Forces
ERIC Educational Resources Information Center
Jones, Goronwy Tudor
2002-01-01
This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…
Djulbegovic, Benjamin
2009-01-01
Background Progress in clinical medicine relies on the willingness of patients to take part in experimental clinical trials, particularly randomized controlled trials (RCTs). Before agreeing to enroll in clinical trials, patients require guarantees that they will not knowingly be harmed and will have the best possible chances of receiving the most favorable treatments. This guarantee is provided by the acknowledgment of uncertainty (equipoise), which removes ethical dilemmas and makes it easier for patients to enroll in clinical trials. Methods Since the design of clinical trials is mostly affected by clinical equipoise, the “clinical equipoise hypothesis” has been postulated. If the uncertainty requirement holds, this means that investigators cannot predict what they are going to discover in any individual trial that they undertake. In some instances, new treatments will be superior to standard treatments, while in others, standard treatments will be superior to experimental treatments, and in still others, no difference will be detected between new and standard treatments. It is hypothesized that there must be a relationship between the overall pattern of treatment successes and the uncertainties that RCTs are designed to address. Results An analysis of published trials shows that the results cannot be predicted at the level of individual trials. However, the results also indicate that the overall pattern of discovery of treatment success across a series of trials is predictable and is consistent with clinical equipoise hypothesis. The analysis shows that we can discover no more than 25% to 50% of successful treatments when they are tested in RCTs. The analysis also indicates that this discovery rate is optimal in helping to preserve the clinical trial system; a high discovery rate (eg, a 90% to 100% probability of success) is neither feasible nor desirable since under these circumstances, neither the patient nor the researcher has an interest in randomization. This in turn would halt the RCT system as we know it. Conclusions The “principle or law of clinical discovery” described herein predicts the efficiency of the current system of RCTs at generating discoveries of new treatments. The principle is derived from the requirement for uncertainty or equipoise as a precondition for RCTs, the precept that paradoxically drives discoveries of new treatments while limiting the proportion and rate of new therapeutic discoveries. PMID:19910921
Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models
NASA Technical Reports Server (NTRS)
Terazawa, Hidezumi
1996-01-01
The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.
Two new kinds of uncertainty relations
NASA Technical Reports Server (NTRS)
Uffink, Jos
1994-01-01
We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.
Generalized uncertainty principle: implications for black hole complementarity
NASA Astrophysics Data System (ADS)
Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han
2014-12-01
At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg uncertainty principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large N rescaling if one assumes only Heisenberg's uncertainty principle, the application of the GUP may save complementarity, but only if a certain N-dependence is also assumed. This raises two important questions beyond the scope of this work, i.e., whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.
“Stringy” coherent states inspired by generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Ghosh, Subir; Roy, Pinaki
2012-05-01
Coherent states with the fractional revival property, which explicitly satisfy the generalized uncertainty principle (GUP), have been constructed in the context of the generalized harmonic oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which otherwise would be of purely academic interest. The effective phase space is non-canonical (or non-commutative in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg uncertainty principle. The fractional revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics are sub-Poissonian. The correspondence principle is deformed in an interesting way. Our computational scheme is very simple, as it requires only first-order corrected energy values and undeformed basis states.
The Generalized Uncertainty Principle and Harmonic Interaction in Three Spatial Dimensions
NASA Astrophysics Data System (ADS)
Hassanabadi, H.; Hooshmand, P.; Zarrinkamar, S.
2015-01-01
In three spatial dimensions, the generalized uncertainty principle is considered under an isotropic harmonic oscillator interaction in both non-relativistic and relativistic regions. By using novel transformations and separations of variables, the exact analytical solution of energy eigenvalues as well as the wave functions is obtained. Time evolution of the non-relativistic region is also reported.
ERIC Educational Resources Information Center
Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie
2011-01-01
Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…
A review of the generalized uncertainty principle.
Tawfik, Abdel Nasser; Diab, Abdel Magied
2015-12-01
Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.
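One commonly quoted form of the GUP reviewed here (conventions differ across the proposals discussed; β is the model-dependent deformation parameter) is

$$ \Delta x\,\Delta p \;\geq\; \frac{\hbar}{2}\left[\,1+\beta\,(\Delta p)^{2}\right], \qquad \Delta x_{\min} \;=\; \hbar\sqrt{\beta}, $$

so minimizing the right-hand side over Δp yields the minimal measurable length; the higher-order GUP approaches mentioned above add further powers of (Δp)² and can also encode a maximum momentum.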
Antonopoulou, Lila; van Meurs, Philip
2003-11-01
The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not merely contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof based on the "cause-effect" model has been produced. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges, including the case of "mad cow" disease and the case of the production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health.
Uncertainty relations with the generalized Wigner-Yanase-Dyson skew information
NASA Astrophysics Data System (ADS)
Fan, Yajing; Cao, Huaixin; Wang, Wenhua; Meng, Huixian; Chen, Liang
2018-07-01
The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. We introduce the generalized Wigner-Yanase-Dyson correlation and the related quantities. Various properties of them are discussed. Finally, we establish several generalizations of uncertainty relation expressed in terms of the generalized Wigner-Yanase-Dyson skew information.
Model-free adaptive speed control on travelling wave ultrasonic motor
NASA Astrophysics Data System (ADS)
Di, Sisi; Li, Huafeng
2018-01-01
This paper introduces a new data-driven control (DDC) method for the speed control of the ultrasonic motor (USM). The model-free adaptive control (MFAC) strategy is presented in terms of its principles, algorithms, and parameter selection. To verify the efficiency of the proposed method, a speed-frequency-time model, which contains all the measurable nonlinearity and uncertainties based on experimental data, was established for simulation to mimic the USM operating system. Furthermore, the model was identified using the particle swarm optimization (PSO) method. Then, the control of the simulated system using MFAC was evaluated under different expectations in terms of overshoot, rise time and steady-state error. Finally, the MFAC results were compared with those of proportional-integral-derivative (PID) control to demonstrate its advantages in controlling a general random system.
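As a concrete illustration of the MFAC scheme described above, here is a minimal compact-form MFAC loop in Python. The toy plant, the gains (eta, mu, rho, lam) and the reference value are illustrative placeholders, not the paper's identified speed-frequency-time model:

```python
import numpy as np

def mfac_speed_control(y_ref=5.0, steps=100,
                       eta=1.0, mu=1.0, rho=0.8, lam=1.0, phi0=0.5):
    """Compact-form model-free adaptive control (MFAC) sketch.

    The pseudo-partial derivative phi is estimated online from measured
    input/output data only; the controller itself uses no plant model.
    """
    def plant(y, u):
        # Stand-in for the motor's nonlinear speed response (illustrative).
        return 0.1 * y + 9.0 * np.tanh(0.1 * u)

    y = y_prev = 0.0
    u = u_prev = 0.0
    phi = phi0
    history = []
    for _ in range(steps):
        du, dy = u - u_prev, y - y_prev
        # Projection-type update of the pseudo-partial derivative.
        phi += eta * du / (mu + du ** 2) * (dy - phi * du)
        if abs(phi) < 1e-5:          # reset keeps the estimate well-posed
            phi = phi0
        u_prev, y_prev = u, y
        # One-step-ahead control law driving y toward the speed reference.
        u = u + rho * phi / (lam + phi ** 2) * (y_ref - y)
        y = plant(y_prev, u)
        history.append(y)
    return history

speeds = mfac_speed_control()
print(f"final speed: {speeds[-1]:.3f} (reference 5.0)")
```

Because phi is re-estimated at every step from input/output increments alone, the same loop adapts to the nonlinearity and drift that make fixed-gain PID tuning difficult, which is the advantage the comparison above is testing.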
Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle
NASA Astrophysics Data System (ADS)
Oppenheim, Jacob N.; Magnasco, Marcelo O.
2013-01-01
The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
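The linear-filter benchmark that the subjects beat is the Gabor limit: for a temporal width Δt and a frequency width Δf defined as standard deviations of a signal's energy distributions in time and frequency,

$$ \Delta t\,\Delta f \;\geq\; \frac{1}{4\pi}, $$

with equality only for Gaussian-envelope signals; exceeding this product perceptually therefore implies nonlinear processing rather than any violation of Fourier analysis.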
Other ways of measuring `Big G'
NASA Astrophysics Data System (ADS)
Rothleitner, Christian
2016-03-01
In 1798, the British scientist Henry Cavendish performed the first laboratory experiment to determine the gravitational force between two massive bodies. From his result, Newton's gravitational constant, G, was calculated. Cavendish's measurement principle was the torsion balance invented by John Michell some 15 years before. During the following two centuries, more than 300 new measurements followed. Although technology - and physics - developed rapidly during this time, surprisingly, most experiments were still based on the same principle. In fact, the most accurate determination of G to date is a measurement based on the torsion balance principle. Despite the fact that G was one of the first fundamental physical constants ever measured, and despite the huge number of experiments performed on it to this day, its CODATA recommended value still has the highest standard measurement uncertainty when compared to other fundamental physical constants. Even more serious is the fact that even measurements based on the same principle often do not overlap within their attributed standard uncertainties. It must be assumed that various experiments are subject to one or more unknown biases. In this talk I will present some alternative experimental setups to the torsion balance which have been performed or proposed to measure G. Although their estimated uncertainties are often higher than most torsion balance experiments, revisiting such ideas is worthwhile. Advances in technology could offer solutions to problems which were previously insurmountable, these solutions could result in lower measurement uncertainties. New measurement principles could also help to uncover hidden systematic effects.
On entropic uncertainty relations in the presence of a minimal length
NASA Astrophysics Data System (ADS)
Rastegin, Alexey E.
2017-07-01
Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. Then the position and momentum operators satisfy the modified commutation relation, for which more than one algebraic representation is known. One of them is described by auxiliary momentum so that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account a finiteness of measurement resolution.
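For reference, the unmodified baseline that these relations deform is the Białynicki-Birula and Mycielski entropic uncertainty relation for the differential Shannon entropies of the position and momentum densities,

$$ h(x) + h(p) \;\geq\; \ln(e\pi\hbar), $$

and, as described above, the minimal-length corrections both shift this bound and force a distinction between the physically true and the auxiliary momentum.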
NASA Astrophysics Data System (ADS)
Mazurova, Elena; Lapshin, Aleksey
2013-04-01
The method of discrete linear transformations, which can be implemented through the algorithms of the Standard Fourier Transform (SFT), the Short-Time Fourier Transform (STFT) or the Wavelet Transform (WT), is effective for calculating the components of the deflection of the vertical from discrete values of the gravity anomaly. Owing to the action of Heisenberg's uncertainty principle, the SFT exhibits weak spatial localization, which manifests as follows: firstly, to compute the SFT it is necessary to know the initial digital signal on the complete number line (for a one-dimensional transform) or in the whole two-dimensional space (for a two-dimensional transform). Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the Fourier coefficients are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies present in the digital signal throughout the whole time period. To overcome this limitation it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (the window), is used for this purpose. If a narrow window is chosen to localize the signal in time then, according to Heisenberg's uncertainty principle, the uncertainty in frequency becomes significant. If one chooses a wide window, the same principle implies increased time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum is, on the contrary, spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization: it allows one to detect the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell precisely what frequency is present in the signal at a given moment of time (one can speak only of a range of frequencies), and it is impossible to specify precisely the moment at which a given frequency is present (one can speak only of a time frame). It is this feature that imposes major constraints on the applicability of the STFT. Although the problems of resolution in time and frequency result from a physical phenomenon (Heisenberg's uncertainty principle) and exist independently of the transform applied, any signal can be analyzed using an alternative approach: multiresolution analysis (MRA). The wavelet transform is one of the methods for performing an MRA-type analysis; it represents low frequencies in more detail with respect to frequency, and high frequencies in more detail with respect to time. The paper presents the results of calculating the components of the deflection of the vertical by the SFT, STFT and WT. The results are presented in the form of 3D models that visually show the action of Heisenberg's uncertainty principle in the specified algorithms. The research conducted allows us to recommend the wavelet transform for calculating the components of the deflection of the vertical in the near-field zone. Keywords: Standard Fourier Transform, Short-Time Fourier Transform, Wavelet Transform, Heisenberg's uncertainty principle.
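The window trade-off described above is easy to see numerically. A minimal sketch (an editorial illustration using SciPy's STFT with assumed parameters, not the authors' geodetic data):

```python
import numpy as np
from scipy import signal

fs = 1000.0                               # assumed sampling rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 100.0 * t)         # steady 100 Hz tone
x[int(0.5 * fs)] += 5.0                   # sharp transient at t = 0.5 s

for nperseg in (64, 1024):                # narrow vs. wide analysis window
    f, tt, Zxx = signal.stft(x, fs=fs, nperseg=nperseg)
    print(f"nperseg={nperseg:5d}: frequency bin {f[1] - f[0]:6.2f} Hz, "
          f"time step {(tt[1] - tt[0]) * 1e3:6.1f} ms")
```

The narrow window resolves the transient's timing but smears the tone over many frequency bins; the wide window does the opposite, which is exactly the Heisenberg-type trade-off the abstract describes.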
Tightening the entropic uncertainty bound in the presence of quantum memory
NASA Astrophysics Data System (ADS)
Adabi, F.; Salimi, S.; Haseli, S.
2016-06-01
The uncertainty principle is a fundamental principle in quantum physics. It implies that the measurement outcomes of two incompatible observables cannot be predicted simultaneously. In quantum information theory, this principle can be expressed in terms of entropic measures. M. Berta et al. [Nat. Phys. 6, 659 (2010), 10.1038/nphys1734] have shown that the uncertainty bound can be altered by considering a particle serving as a quantum memory, correlated with the primary particle. In this article, we obtain a lower bound for the entropic uncertainty in the presence of a quantum memory by adding an additional term depending on the Holevo quantity and the mutual information. We conclude that our lower bound is tighter than that of Berta et al. when the accessible information about the measurement outcomes is less than the mutual information of the joint state. Some examples are investigated for which our lower bound is tighter than Berta et al.'s. Using our lower bound, a lower bound for the entanglement of formation of bipartite quantum states is obtained, as well as an upper bound for the regularized distillable common randomness.
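For readers unfamiliar with the Berta et al. relation that this paper tightens, the sketch below (an editorial illustration, not the paper's improved bound) evaluates S(Q|B) + S(R|B) ≥ log2(1/c) + S(A|B) for a Bell pair, where a perfect quantum memory drives both sides to zero:

```python
import numpy as np

def entropy_bits(rho):
    """von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def partial_trace_A(rho_AB):
    """Trace out the first qubit of a two-qubit density matrix."""
    return rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

def cond_entropy_after(rho_AB, basis):
    """S(X|B) when `basis` is measured on subsystem A."""
    rho_XB = np.zeros_like(rho_AB)
    for v in basis:
        P = np.kron(np.outer(v, v.conj()), np.eye(2))
        rho_XB = rho_XB + P @ rho_AB @ P
    return entropy_bits(rho_XB) - entropy_bits(partial_trace_A(rho_XB))

# Bell state (|00> + |11>)/sqrt(2): B is a perfect quantum memory for A.
psi = np.zeros(4); psi[0] = psi[3] = 1.0 / np.sqrt(2.0)
rho_AB = np.outer(psi, psi)

Z = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
X = [np.array([1.0, 1.0]) / np.sqrt(2.0), np.array([1.0, -1.0]) / np.sqrt(2.0)]

lhs = cond_entropy_after(rho_AB, Z) + cond_entropy_after(rho_AB, X)
c = max(abs(np.vdot(z, x))**2 for z in Z for x in X)   # max basis overlap
S_A_given_B = entropy_bits(rho_AB) - entropy_bits(partial_trace_A(rho_AB))
rhs = np.log2(1.0 / c) + S_A_given_B
print(f"{lhs:.3f} >= {rhs:.3f}")   # 0.000 >= 0.000: both sides vanish
```

The negative conditional entropy S(A|B) = -1 of the entangled state is what lowers the bound below the memoryless value log2(1/c) = 1.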
Measurement of optical to electrical and electrical to optical delays with ps-level uncertainty.
Peek, H Z; Pinkert, T J; Jansweijer, P P M; Koelemeij, J C J
2018-05-28
We present a new measurement principle to determine the absolute time delay of a waveform from an optical reference plane to an electrical reference plane and vice versa. We demonstrate a method based on this principle with 2 ps uncertainty. This method can be used to perform accurate time-delay determinations of optical transceivers used in fiber-optic time-dissemination equipment. As a result, the time scales in the optical and electrical domains can be related to each other with the same uncertainty. We expect this method to be a breakthrough in high-accuracy time transfer and in the absolute calibration of time-transfer equipment.
Generalized uncertainty principle and the maximum mass of ideal white dwarfs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rashidi, Reza, E-mail: reza.rashidi@srttu.edu
The effects of a generalized uncertainty principle on the structure of an ideal white dwarf star are investigated. The equation describing the equilibrium configuration of the star is a generalized form of the Lane-Emden equation. It is proved that the star always has a finite size. It is then argued that the maximum mass of such an ideal white dwarf tends to infinity, as opposed to the conventional case where it has a finite value.
Bschir, Karim
2017-04-01
Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.
Against proportional shortfall as a priority-setting principle.
Altmann, Samuel
2018-05-01
As the demand for healthcare rises, so does the need for priority setting in healthcare. In this paper, I consider a prominent priority-setting principle: proportional shortfall. My purpose is to argue that proportional shortfall, as a principle, should not be adopted. My key criticism is that proportional shortfall fails to consider past health. Proportional shortfall is justified as it supposedly balances concern for prospective health while still accounting for lifetime health, even though past health is deemed irrelevant. Accounting for this lifetime perspective means that the principle may indirectly consider past health by accounting for how far an individual is from achieving a complete, healthy life. I argue that proportional shortfall does not account for this lifetime perspective as it fails to incorporate the fair innings argument as originally claimed, undermining its purported justification. I go on to demonstrate that the case for ignoring past health is weak, and argue that past health is at least sometimes relevant for priority-setting decisions. Specifically, when an individual's past health has a direct impact on current or future health, and when one individual has enjoyed significantly more healthy life years than another. Finally, I demonstrate that by ignoring past illnesses, even those entirely unrelated to their current illness, proportional shortfall can lead to instances of double jeopardy, a highly problematic implication. These arguments give us reason to reject proportional shortfall. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Technical Reports Server (NTRS)
Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.
1992-01-01
The energy-time uncertainty principle is on a different footing than the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, and not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that the product of delta E and delta t is approximately equal to Planck's constant divided by 2 pi, where this duration delta t is an inner time, in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
Generalized uncertainty principles and quantum field theory
NASA Astrophysics Data System (ADS)
Husain, Viqar; Kothawala, Dawood; Seahra, Sanjeev S.
2013-01-01
Quantum mechanics with a generalized uncertainty principle arises through a representation of the commutator $[\hat{x},\hat{p}] = i f(\hat{p})$. We apply this deformed quantization to free scalar field theory for $f_\pm = 1 \pm \beta p^2$. The resulting quantum field theories have a rich fine-scale structure. For small-wavelength modes, the Green's function for $f_+$ exhibits a remarkable transition from Lorentz to Galilean invariance, whereas for $f_-$ such modes effectively do not propagate. In both cases Lorentz invariance is recovered at long wavelengths.
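For concreteness, one standard momentum-space representation consistent with the commutator quoted above (an editorial sketch, taking ħ = 1, not necessarily the representation used in the paper) is

```latex
\hat{p}\,\psi(p) = p\,\psi(p), \qquad
\hat{x}\,\psi(p) = i f(p)\,\partial_p \psi(p),
```

so that

```latex
[\hat{x},\hat{p}]\,\psi
 = i f(p)\,\partial_p\big(p\,\psi\big) - i\,p\,f(p)\,\partial_p\psi
 = i f(p)\,\psi .
```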
The uncertainty principle and quantum chaos
NASA Technical Reports Server (NTRS)
Chirikov, Boris V.
1993-01-01
The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here, but typically only on finite and different time scales. The ultimate origin of this universal quantum stability lies in the fundamental uncertainty principle, which makes the phase space, and hence the spectrum of bounded quantum motion, discrete. A reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.
GUP parameter from quantum corrections to the Newtonian potential
NASA Astrophysics Data System (ADS)
Scardigli, Fabio; Lambiase, Gaetano; Vagenas, Elias C.
2017-04-01
We propose a technique to compute the deformation parameter of the generalized uncertainty principle by using the leading quantum corrections to the Newtonian potential. We assume only General Relativity as the theory of gravitation, and the thermal nature of the GUP corrections to the Hawking spectrum. With these minimal assumptions our calculation gives, to first order, a specific numerical result. The physical meaning of this value is discussed and compared with the previously obtained bounds on the deformation parameter of the generalized uncertainty principle.
Proportionality, just war theory and weapons innovation.
Forge, John
2009-03-01
Just wars are supposed to be proportional responses to aggression: the costs of war must not greatly exceed the benefits. This proportionality principle raises a corresponding 'interpretation problem': what are the costs and benefits of war, and how are they to be determined? It also raises a 'measurement problem': how are costs and benefits to be balanced? And it raises a problem about scope: how far into the future do the states of affairs to be measured stretch? It is argued here that weapons innovation always introduces costs, and that these costs cannot be determined in advance of going to war. Three examples are given: the atomic bomb, the AK-47, and the ancient Greek catapult. It is therefore argued that the proportionality principle is inapplicable prospectively. Some replies to the argument are discussed and rejected. Some more general defences of the proportionality principle are considered and also rejected. Finally, the significance of the argument for Just War Theory as a whole is discussed.
Computing the Entropy of Kerr-Newman Black Hole Without Brick Walls Method
NASA Astrophysics Data System (ADS)
Zhang, Li-Chun; Wu, Yue-Qin; Li, Huai-Fan; Ren, Zhao
By using the entanglement entropy method, the statistical entropy of the Bose and Fermi fields in a thin film is calculated and the Bekenstein-Hawking entropy of the Kerr-Newman black hole is obtained. Here, the Bose and Fermi fields are entangled with the quantum states in the Kerr-Newman black hole and are outside of the horizon. The divergence of the brick-wall model is avoided, without any cutoff, by the new equation of state density obtained with the generalized uncertainty principle. The calculation implies that the high-density quantum states near the event horizon are strongly correlated with the quantum states in the black hole. The black hole entropy is a quantum effect; it is an intrinsic characteristic of space-time. The ultraviolet cutoff in the brick-wall model is unreasonable. The generalized uncertainty principle should be considered in the high-energy quantum field near the event horizon. From the calculation, the constant λ introduced in the generalized uncertainty principle is related to the polar angle θ in an axisymmetric space-time.
Cope, F W
1981-01-01
The Weber psychophysical law, which describes much experimental data on perception by man, is derived from the Heisenberg uncertainty principle on the assumption that human perception occurs by energy detection by superconductive microregions within man. This suggests that psychophysical perception by man might be considered merely a special case of physical measurement in general. The reverse derivation, i.e., derivation of the Heisenberg principle from the Weber law, may be of even greater interest. It suggests that physical measurements could be regarded as relative to the perceptions by the detectors within man. Thus one may develop a "human" theory of relativity that could have the advantage of eliminating hidden assumptions by forcing physical theories to conform more completely to the measurements made by man rather than to concepts that might not accurately describe nature.
Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception
Bejjanki, Vikranth Rao; Clayards, Meghan; Knill, David C.; Aslin, Richard N.
2011-01-01
Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one, and participants' performance is consistent with an optimal model in which environmental, within-category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks. PMID:21637344
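The normative model referenced here, with cues weighted by their inverse variances, fits in a few lines (a sketch with hypothetical numbers, not the authors' categorical extension):

```python
def combine_cues(cues):
    """Reliability-weighted cue combination: w_i proportional to 1/sigma_i^2.
    `cues` is a list of (estimate, sigma) pairs; returns the fused
    estimate and its (smaller) standard deviation."""
    weights = [1.0 / s**2 for _, s in cues]
    total = sum(weights)
    fused = sum(w * x for w, (x, _) in zip(weights, cues)) / total
    return fused, (1.0 / total) ** 0.5

# hypothetical: audition says 10.0 +/- 2.0, vision says 12.0 +/- 1.0
print(combine_cues([(10.0, 2.0), (12.0, 1.0)]))   # -> (11.6, ~0.894)
```

For categorical dimensions, as the abstract notes, within-category environmental variability adds to each effective sigma, so optimal weights are no longer set by sensory reliability alone.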
29 CFR 1608.1 - Statement of purpose.
Code of Federal Regulations, 2010 CFR
2010-07-01
... upon the principles of title VII. Any uncertainty as to the meaning and application of title VII in... Commission believes that it is now necessary to clarify and harmonize the principles of title VII in order to... who comply with the principles of title VII. (b) Purposes of title VII. Congress enacted title VII in...
ERIC Educational Resources Information Center
Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.
2002-01-01
Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…
Modelling and identification for control of gas bearings
NASA Astrophysics Data System (ADS)
Theisen, Lukas R. S.; Niemann, Hans H.; Santos, Ilmar F.; Galeazzi, Roberto; Blanke, Mogens
2016-03-01
Gas bearings are popular for their high-speed capabilities, low friction and clean operation, but suffer from poor damping, which poses challenges for safe operation in the presence of disturbances. Feedback control can achieve enhanced damping but requires low-complexity models of the dominant dynamics over the entire operating range. Models from first principles are complex and sensitive to parameter uncertainty. This paper presents an experimental technique for "in situ" identification of a low-complexity model of a rotor-bearing-actuator system and demonstrates identification over relevant ranges of rotational speed and gas injection pressure. This is obtained using parameter-varying linear models that are found to capture the dominant dynamics. The approach is shown to be easily applied and to suit subsequent control design. Based on the identified models, decentralised proportional control is designed and shown to obtain the required damping in theory and in a laboratory test rig.
Bidirectional active control of structures with type-2 fuzzy PD and PID
NASA Astrophysics Data System (ADS)
Paul, Satyam; Yu, Wen; Li, Xiaoou
2018-03-01
Proportional-derivative and proportional-integral-derivative (PD/PID) controllers are popular algorithms in structural vibration control. In order to maintain a minimal regulation error, PD/PID control requires large proportional and derivative gains. The control performance is unsatisfactory because of the large uncertainties in the buildings. In this paper, a type-2 fuzzy system is applied to compensate the unknown uncertainties, and is combined with the PD/PID control. We prove the stability of these fuzzy PD and PID controllers. The sufficient conditions can be used for choosing the gains of the PD/PID controllers. The theoretical results are verified on a two-storey building prototype. The experimental results validate our analysis.
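For context, the control law that the fuzzy system compensates is the standard discrete-time PID below (a generic editorial sketch with hypothetical gains; the paper's contribution is the type-2 fuzzy compensation and its stability proof, not this controller):

```python
class PID:
    """Discrete PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_e = None

    def step(self, e):
        """Compute the control signal for the current regulation error e."""
        self.integral += e * self.dt
        de = 0.0 if self.prev_e is None else (e - self.prev_e) / self.dt
        self.prev_e = e
        return self.kp * e + self.ki * self.integral + self.kd * de

ctrl = PID(kp=50.0, ki=5.0, kd=10.0, dt=0.01)   # hypothetical gains
print(ctrl.step(0.20), ctrl.step(0.15))
```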
Dynamics of non-holonomic systems with stochastic transport
NASA Astrophysics Data System (ADS)
Holm, D. D.; Putkaradze, V.
2018-01-01
This paper formulates a variational approach for treating observational uncertainty and/or computational model errors as stochastic transport in dynamical systems governed by action principles under non-holonomic constraints. For this purpose, we derive, analyse and numerically study the example of an unbalanced spherical ball rolling under gravity along a stochastic path. Our approach uses the Hamilton-Pontryagin variational principle, constrained by a stochastic rolling condition, which we show is equivalent to the corresponding stochastic Lagrange-d'Alembert principle. In the example of the rolling ball, the stochasticity represents uncertainty in the observation and/or error in the computational simulation of the angular velocity of rolling. The influence of the stochasticity on the deterministically conserved quantities is investigated both analytically and numerically. Our approach applies to a wide variety of stochastic, non-holonomically constrained systems, because it preserves the mathematical properties inherited from the variational principle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zozor, Steeve; Portesi, Mariela; Sanchez-Moreno, Pablo
The position-momentum uncertainty-like inequality based on moments of arbitrary order for d-dimensional quantum systems, which is a generalization of the celebrated Heisenberg formulation of the uncertainty principle, is improved here by use of the Renyi-entropy-based uncertainty relation. The accuracy of the resulting lower bound is physico-computationally analyzed for the two main prototypes in d-dimensional physics: the hydrogenic and oscillator-like systems.
2012-08-01
habitats for specific species of trout. The report noted that these uncertainties — and the SMEs, who had past experience in such topic areas — were...reduce uncertainty in HREP projects is reflected in the completion of the Pool 11 Islands (UMRS RM 583-593) HREP in 2003. In 1989 the Browns Lake
Generalized Entropic Uncertainty Relations with Tsallis' Entropy
NASA Technical Reports Server (NTRS)
Portesi, M.; Plastino, A.
1996-01-01
A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
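A minimal sketch of the Tsallis q-entropy used here (an editorial illustration; it reduces to the Shannon entropy, in nats, as q → 1):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy S_q = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-9:               # q -> 1 limit: Shannon entropy
        return float(-(p * np.log(p)).sum())
    return float((1.0 - (p**q).sum()) / (q - 1.0))

uniform = [0.25] * 4
print(tsallis_entropy(uniform, 2.0))      # 0.75
print(tsallis_entropy(uniform, 1.0))      # ln 4 ~ 1.386
```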
NASA Astrophysics Data System (ADS)
Zhang, Jun; Zhang, Yang; Yu, Chang-Shui
2015-06-01
The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting, canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail.
Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar
2012-05-01
Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for concentrations of creatinine below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. This diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
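The two-component evaluation described above can be sketched as follows (one common reading of the Nordtest approach, with hypothetical QC numbers; the guideline itself offers several variants for the bias component):

```python
import math

def expanded_uncertainty(u_rw, biases, u_cref, k=2.0):
    """Nordtest-style combined uncertainty: u_c = sqrt(u_Rw^2 + u_bias^2),
    where u_Rw is the within-laboratory reproducibility and u_bias combines
    the RMS of observed biases with the reference-value uncertainty."""
    rms_bias = math.sqrt(sum(b * b for b in biases) / len(biases))
    u_bias = math.sqrt(rms_bias**2 + u_cref**2)
    return k * math.sqrt(u_rw**2 + u_bias**2)

# hypothetical creatinine quality-control data, all in % relative
print(expanded_uncertainty(u_rw=3.3, biases=[2.0, -1.5, 2.5], u_cref=1.5))
```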
On the Monte Carlo simulation of electron transport in the sub-1 keV energy range.
Thomson, Rowan M; Kawrakow, Iwan
2011-08-01
The validity of "classic" Monte Carlo (MC) simulations of electron and positron transport at sub-1 keV energies is investigated in the context of quantum theory. Quantum theory dictates that uncertainties on the position and energy-momentum four-vectors of radiation quanta obey Heisenberg's uncertainty relation; however, these uncertainties are neglected in "classical" MC simulations of radiation transport in which position and momentum are known precisely. Using the quantum uncertainty relation and electron mean free path, the magnitudes of uncertainties on electron position and momentum are calculated for different kinetic energies; a validity bound on the classical simulation of electron transport is derived. In order to satisfy the Heisenberg uncertainty principle, uncertainties of 5% must be assigned to position and momentum for 1 keV electrons in water; at 100 eV, these uncertainties are 17 to 20% and are even larger at lower energies. In gaseous media such as air, these uncertainties are much smaller (less than 1% for electrons with energy 20 eV or greater). The classical Monte Carlo transport treatment is questionable for sub-1 keV electrons in condensed water as uncertainties on position and momentum must be large (relative to electron momentum and mean free path) to satisfy the quantum uncertainty principle. Simulations which do not account for these uncertainties are not faithful representations of the physical processes, calling into question the results of MC track structure codes simulating sub-1 keV electron transport. Further, the large difference in the scale at which quantum effects are important in gaseous and condensed media suggests that track structure measurements in gases are not necessarily representative of track structure in condensed materials on a micrometer or a nanometer scale.
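The size of these quantum bounds follows directly from Δx·Δp ≥ ħ/2. Taking equal relative uncertainties f on position (as a fraction of the mean free path λ) and on momentum gives f = sqrt(ħ / (2pλ)). The sketch below evaluates this with non-relativistic momenta and illustrative, assumed mean free path values (the paper's exact percentages depend on its tabulated mean free paths):

```python
import math

HBAR = 1.054571817e-34   # J*s
M_E = 9.1093837015e-31   # kg
EV = 1.602176634e-19     # J

def min_relative_uncertainty(kinetic_eV, mfp_m):
    """Smallest f satisfying (f * mfp) * (f * p) = hbar / 2."""
    p = math.sqrt(2.0 * M_E * kinetic_eV * EV)   # non-relativistic momentum
    return math.sqrt(HBAR / (2.0 * p * mfp_m))

# illustrative (assumed) mean free paths for electrons in liquid water
for e_eV, mfp_nm in [(1000.0, 1.5), (100.0, 0.4)]:
    f = min_relative_uncertainty(e_eV, mfp_nm * 1e-9)
    print(f"{e_eV:6.0f} eV, mfp {mfp_nm} nm: f ~ {100.0 * f:.0f}%")
```

With these inputs the bound comes out near the 5% and 17-20% figures quoted above, and it shrinks dramatically in gases because λ is orders of magnitude longer there.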
Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.
Peters, Achim; McEwen, Bruce S; Friston, Karl
2017-09-01
The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected, and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: reducing uncertainty requires cerebral energy. The characteristic of the vertebrate brain to prioritize its own high energy is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual with 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
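The identification of uncertainty with entropy, i.e. 'expected surprise', is compact enough to state in code (a trivial editorial sketch, not from the paper):

```python
import math

def entropy_bits(p):
    """Shannon entropy H = E[-log2 p(x)]: the expected surprise."""
    return sum(-pi * math.log2(pi) for pi in p if pi > 0)

print(entropy_bits([0.5, 0.5]))      # maximal uncertainty for two outcomes: 1 bit
print(entropy_bits([0.99, 0.01]))    # outcome nearly certain: ~0.08 bits
```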
Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.
Rogers, Michael D
2003-06-01
Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.
The statistical fluctuation study of quantum key distribution in means of uncertainty principle
NASA Astrophysics Data System (ADS)
Liu, Dunwei; An, Huiyao; Zhang, Xiaoyu; Shi, Xuemei
2018-03-01
Imperfect single-photon emission by lasers, photon signal attenuation, and the propagation of errors have long caused serious difficulties in practical long-distance quantum key distribution (QKD) experiments. In this paper, we study the uncertainty principle in metrology and use this tool to analyze the statistical fluctuation of the number of received single photons, the yield of single photons, and the quantum bit error rate (QBER). We then calculate the error between the measured and true value of each parameter, and consider the propagation of errors among all measured values. We restate the Gottesman-Lo-Lutkenhaus-Preskill (GLLP) formula in terms of these parameters and generate QKD simulation results. In this study, as the coding photon length increases, the safe distribution distance grows. When the coding photon length is N = 10^11, the safe distribution distance is almost 118 km, a more conservative bound than the 127 km obtained without the uncertainty principle. Our study is thus in line with established theory while being more realistic.
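As a toy illustration of the finite-size fluctuations at issue (a normal-approximation sketch with hypothetical counts, not the paper's uncertainty-principle-based treatment):

```python
import math

def qber_upper_bound(n_errors, n_sifted, z=3.0):
    """Observed QBER plus a z-sigma binomial fluctuation allowance; the
    pessimistic value is what a finite-key analysis would feed into a
    GLLP-style key-rate formula."""
    e = n_errors / n_sifted
    return e, e + z * math.sqrt(e * (1.0 - e) / n_sifted)

print(qber_upper_bound(250, 10_000))       # (0.025, ~0.0297)
print(qber_upper_bound(2_500, 100_000))    # allowance shrinks as blocks grow
```

The shrinking allowance with block size mirrors the abstract's finding that longer coding photon lengths extend the safe distribution distance.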
Participatory Development Principles and Practice: Reflections of a Western Development Worker.
ERIC Educational Resources Information Center
Keough, Noel
1998-01-01
Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)
The Irrelevance of the Risk-Uncertainty Distinction.
Roser, Dominic
2017-10-01
Precautionary Principles are often said to be appropriate for decision-making in contexts of uncertainty such as climate policy. Contexts of uncertainty are contrasted to contexts of risk depending on whether we have probabilities or not. Against this view, I argue that the risk-uncertainty distinction is practically irrelevant. I start by noting that the history of the distinction between risk and uncertainty is more varied than is sometimes assumed. In order to examine the distinction, I unpack the idea of having probabilities, in particular by distinguishing three interpretations of probability: objective, epistemic, and subjective probability. I then claim that if we are concerned with whether we have probabilities at all, regardless of how low their epistemic credentials are, then we almost always have probabilities for policy-making. The reason is that subjective and epistemic probability are the relevant interpretations of probability and we almost always have subjective and epistemic probabilities. In contrast, if we are only concerned with probabilities that have sufficiently high epistemic credentials, then we obviously do not always have probabilities. Climate policy, for example, would then be a case of decision-making under uncertainty. But, so I argue, we should not dismiss probabilities with low epistemic credentials. Rather, when they are the best available probabilities our decision principles should make use of them. And, since they are almost always available, the risk-uncertainty distinction remains irrelevant.
NASA Astrophysics Data System (ADS)
Zhou, Shiwei; Chen, Ge-Rui
Recently, some approaches to quantum gravity have indicated that a minimal measurable length l_p ~ 10^-35 m should be considered; a direct implication of the minimal measurable length is the generalized uncertainty principle (GUP). Taking the effect of the GUP into account, Hawking radiation of massless scalar particles from a Schwarzschild black hole is investigated by the use of Damour-Ruffini's method. The original Klein-Gordon equation is modified. It is found that the corrected Hawking temperature is related to the energy of the emitted particles. Some discussion appears in the last section.
Uncertainty information in climate data records from Earth observation
NASA Astrophysics Data System (ADS)
Merchant, C. J.
2017-12-01
How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is demonstrating metrologically sound methodologies addressing this problem for four key historical CDRs. FIDUCEO methods of uncertainty analysis (which also tend to lead to improved FCDRs and CDRs) could support coherent treatment of uncertainty across FCDRs to CDRs and higher level products for a wide range of essential climate variables.
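The scale-dependence point, that error effects negligible per pixel can dominate large-scale averages, is a one-line consequence of error correlation (an editorial sketch with assumed numbers):

```python
import numpy as np

# Uncertainty of an n-pixel mean when each datum has an independent
# component u_indep and a fully correlated (systematic) component u_corr:
# only the independent part averages down.
n = 10_000
u_indep, u_corr = 0.5, 0.1                 # assumed per-datum components (K)
u_mean = np.sqrt(u_indep**2 / n + u_corr**2)
print(u_mean)                              # ~0.1 K: the correlated part wins
```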
Generalized uncertainty principle and quantum gravity phenomenology
NASA Astrophysics Data System (ADS)
Bosso, Pasquale
The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.
Method for measuring target rotation angle by theodolites
NASA Astrophysics Data System (ADS)
Sun, Zelin; Wang, Zhao; Zhai, Huanchun; Yang, Xiaoxu
2013-05-01
To overcome the disadvantages of current theodolite-based measurement methods in environments involving shock, long working hours and similar conditions, this paper proposes a new method for 3D coordinate measurement that is based on an immovable measuring coordinate system. According to the measuring principle, the mathematical model is established and the measurement uncertainty is analysed. The measurement uncertainty of the new method is a function of the theodolite observation angles and their uncertainty, and can be reduced by optimizing the theodolites' placement. Compared to other methods, this method allows the theodolite positions to be changed in the measuring process, and mutual collimation between the theodolites is not required. The experimental results show that the measurement model and the optimal placement principle are correct, and the measurement error is less than 0.01° after optimizing the theodolites' placement.
Burr, Tom; Croft, Stephen; Jarman, Kenneth D.
2015-09-05
The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance...
Consensus building for interlaboratory studies, key comparisons, and meta-analysis
NASA Astrophysics Data System (ADS)
Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza
2017-06-01
Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
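Of the three procedures, the DerSimonian-Laird one is the most compact to sketch (a standard method-of-moments implementation with made-up inputs, not the NIST Consensus Builder code):

```python
import numpy as np

def dersimonian_laird(x, u):
    """Consensus value for measured values x with stated standard
    uncertainties u; tau^2 is the method-of-moments estimate of the
    'dark uncertainty' (between-laboratory variance)."""
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    w = 1.0 / u**2                                   # fixed-effect weights
    xbar = np.sum(w * x) / np.sum(w)
    q = np.sum(w * (x - xbar)**2)                    # Cochran's Q
    tau2 = max(0.0, (q - (len(x) - 1))
               / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    wstar = 1.0 / (u**2 + tau2)                      # inflated variances
    mu = np.sum(wstar * x) / np.sum(wstar)
    return mu, float(np.sqrt(1.0 / np.sum(wstar))), float(np.sqrt(tau2))

# made-up interlaboratory results: values and stated standard uncertainties
mu, u_mu, tau = dersimonian_laird([10.10, 9.85, 10.60, 10.20],
                                  [0.10, 0.15, 0.20, 0.10])
print(f"consensus {mu:.3f} +/- {u_mu:.3f}, dark uncertainty {tau:.3f}")
```

This is the additive-effects approach described above: the stated variances are inflated by tau^2 rather than multiplied by a common factor.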
Garnett, Kenisha; Parsons, David J
2017-03-01
The precautionary principle was formulated to provide a basis for political action to protect the environment from potentially severe or irreversible harm in circumstances of scientific uncertainty that prevent a full risk or cost-benefit analysis. It underpins environmental law in the European Union and has been extended to include public health and consumer safety. The aim of this study was to examine how the precautionary principle has been interpreted and subsequently applied in practice, whether these applications were consistent, and whether they followed the guidance from the Commission. A review of the literature was used to develop a framework for analysis, based on three attributes: severity of potential harm, standard of evidence (or degree of uncertainty), and nature of the regulatory action. This was used to examine 15 pieces of legislation or judicial decisions. The decision whether or not to apply the precautionary principle appears to be poorly defined, with ambiguities inherent in determining what level of uncertainty and significance of hazard justifies invoking it. The cases reviewed suggest that the Commission's guidance was not followed consistently in forming legislation, although judicial decisions tended to be more consistent and to follow the guidance by requiring plausible evidence of potential hazard in order to invoke precaution. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
NASA Technical Reports Server (NTRS)
Von Roos, O.
1978-01-01
The limitations that Heisenberg's uncertainty principle imposes on the sequential detection of extremely weak signals (gravitational radiation, for instance) have been explored recently. A variety of schemes have been proposed to circumvent these limitations. Although all of the earlier attempts have proven fruitless, a recent proposal seems quite promising. The scheme, consisting of two harmonic oscillators interacting with each other in a peculiar way, allows for an exact analytical solution, which is derived here. If it can be assumed that the expectation value of one of the canonical variables of the total system suffices to monitor the weak signal, it can be shown that, in the absence of thermal noise, arbitrarily weak signals can in principle be measured without interference from the uncertainty principle.
Hoffmann, Sabine; Laurier, Dominique; Rage, Estelle; Guihenneuc, Chantal; Ancelet, Sophie
2018-01-01
Exposure measurement error represents one of the most important sources of uncertainty in epidemiology. When exposure uncertainty is not or only poorly accounted for, it can lead to biased risk estimates and a distortion of the shape of the exposure-response relationship. In occupational cohort studies, the time-dependent nature of exposure and changes in the method of exposure assessment may create complex error structures. When a method of group-level exposure assessment is used, individual worker practices and the imprecision of the instrument used to measure the average exposure for a group of workers may give rise to errors that are shared between workers, within workers or both. In contrast to unshared measurement error, the effects of shared errors remain largely unknown. Moreover, exposure uncertainty and magnitude of exposure are typically highest for the earliest years of exposure. We conduct a simulation study based on exposure data of the French cohort of uranium miners to compare the effects of shared and unshared exposure uncertainty on risk estimation and on the shape of the exposure-response curve in proportional hazards models. Our results indicate that uncertainty components shared within workers cause more bias in risk estimation and a more severe attenuation of the exposure-response relationship than unshared exposure uncertainty or exposure uncertainty shared between individuals. These findings underline the importance of careful characterisation and modeling of exposure uncertainty in observational studies.
On the Minimal Length Uncertainty Relation and the Foundations of String Theory
Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; ...
2011-01-01
We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.
van de Wetering, E J; Stolk, E A; van Exel, N J A; Brouwer, W B F
2013-02-01
Economic evaluations are increasingly used to inform decisions regarding the allocation of scarce health care resources. To systematically incorporate societal preferences into these evaluations, quality-adjusted life year gains could be weighted according to some equity principle, the most suitable of which is a matter of frequent debate. While many countries still struggle with equity concerns for priority setting in health care, the Netherlands has reached a broad consensus to use the concept of proportional shortfall. Our study evaluates the concept and its support in the Dutch health care context. We discuss arguments in the Netherlands for using proportional shortfall and difficulties in transitioning from principle to practice. In doing so, we address universal issues leading to a systematic consideration of equity concerns for priority setting in health care. The article thus has relevance to all countries struggling with the formalization of equity concerns for priority setting.
NASA Astrophysics Data System (ADS)
Alonso-Serrano, Ana; Dąbrowski, Mariusz P.; Gohar, Hussain
2018-02-01
We investigate the generalized uncertainty principle (GUP) corrections to the entropy content and the information flux of black holes, as well as the corrections to the sparsity of the Hawking radiation at the late stages of evaporation. We find that, owing to these quantum-gravity-motivated corrections, the entropy flow per particle decreases on the approach to the Planck scale, due to better accuracy in counting the number of microstates. We also show that the radiation flow is no longer sparse when the mass of a black hole approaches the Planck mass, which is not the case for non-GUP calculations.
Kamiura, Moto; Sano, Kohei
2017-10-01
The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, based on this principle, is an effective algorithm for multi-armed bandit problems. In a previous study it was defined by a set of heuristic formulation patterns. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with statistical upper bounds of the confidence intervals of the expected rewards. The unification of the formulation enhances the universality of the Overtaking method. Consequently, we obtain a new Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that, in the context of multi-armed bandit problems, the principle of optimism in the face of uncertainty should be regarded not as a heuristic but as a statistics-based consequence of the law of large numbers for the sample mean of rewards and of the estimation of upper bounds of expected rewards. Copyright © 2017 Elsevier B.V. All rights reserved.
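The UCB baseline the authors compare against implements optimism in the face of uncertainty directly: each arm's score is its sample mean plus a confidence half-width. A sketch of standard UCB1 (the paper's Overtaking method itself is not reproduced here):

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """Play the arm maximizing mean + sqrt(2 ln t / n_i): an optimistic
    upper confidence bound on its expected reward."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1                      # initialize: play each arm once
        else:
            arm = max(range(n_arms),
                      key=lambda i: sums[i] / counts[i]
                      + math.sqrt(2.0 * math.log(t) / counts[i]))
        sums[arm] += pull(arm)
        counts[arm] += 1
    return counts

rates = [0.4, 0.6]                           # hypothetical Bernoulli arms
pulls = ucb1(lambda i: float(random.random() < rates[i]), 2, 10_000)
print(pulls)                                 # most pulls go to the 0.6 arm
```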
Plasmonic trace sensing below the photon shot noise limit
Pooser, Raphael C.; Lawrie, Benjamin J.
2015-12-09
Plasmonic sensors are important detectors of biochemical trace compounds, but those that utilize optical readout are approaching their absolute limits of detection as defined by the Heisenberg uncertainty principle in both differential intensity and phase readout. However, the use of more general minimum uncertainty states in the form of squeezed light can push the noise floor in these sensors below the shot noise limit (SNL) in one analysis variable at the expense of another. Here, we demonstrate a quantum plasmonic sensor whose noise floor is reduced below the SNL in order to perform index of refraction measurements with sensitivities unobtainable with classical plasmonic sensors. The increased signal-to-noise ratio can result in faster detection of analyte concentrations that were previously lost in the noise. These benefits are the hallmarks of a sensor exploiting quantum readout fields in order to manipulate the limits of the Heisenberg uncertainty principle.
Reconstructing signals from noisy data with unknown signal and noise covariance.
Oppermann, Niels; Robbers, Georg; Ensslin, Torsten A
2011-10-01
We derive a method to reconstruct Gaussian signals from linear measurements with Gaussian noise. This new algorithm is intended for applications in astrophysics and other sciences. The starting point of our considerations is the principle of minimum Gibbs free energy, which was previously used to derive a signal reconstruction algorithm handling uncertainties in the signal covariance. We extend this algorithm to simultaneously uncertain noise and signal covariances using the same principles in the derivation. The resulting equations are general enough to be applied in many different contexts. We demonstrate the performance of the algorithm by applying it to specific example situations and compare it to algorithms not allowing for uncertainties in the noise covariance. The results show that the method we suggest performs very well under a variety of circumstances and is indeed qualitatively superior to the other methods in cases where uncertainty in the noise covariance is present.
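As a concrete baseline for the known-covariance case that this work generalizes, a Wiener-filter reconstruction can be sketched as follows; the dimensions, response matrix, and covariances are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sig, n_data = 50, 40
R = rng.normal(size=(n_data, n_sig))    # linear response (hypothetical)
S = 2.0 * np.eye(n_sig)                 # signal covariance, assumed known
N = 0.5 * np.eye(n_data)                # noise covariance, assumed known

s = rng.multivariate_normal(np.zeros(n_sig), S)           # true signal
d = R @ s + rng.multivariate_normal(np.zeros(n_data), N)  # noisy data

# Wiener filter: posterior mean m = S R^T (R S R^T + N)^{-1} d
m = S @ R.T @ np.linalg.solve(R @ S @ R.T + N, d)
print(np.mean((m - s) ** 2))  # reconstruction error, well below prior variance
```

The algorithm in the paper replaces the fixed S and N above with simultaneous estimates derived from the minimum Gibbs free energy principle.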
Making Decisions about an Educational Game, Simulation or Workshop: A 'Game Theory' Perspective.
ERIC Educational Resources Information Center
Cryer, Patricia
1988-01-01
Uses game theory to help practitioners make decisions about educational games, simulations, or workshops whose outcomes depend to some extent on chance. Highlights include principles for making decisions involving risk; elementary laws of probability; utility theory; and principles for making decisions involving uncertainty. (eight references)…
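As a toy illustration of the distinction the digest draws between decisions involving risk and decisions involving uncertainty; the numbers are hypothetical:

```python
# Two candidate session designs whose payoff depends partly on chance.
p_good = 0.7                                   # assumed probability of a good run
payoffs = {"game": (8, 2), "lecture": (5, 5)}  # (good-run, bad-run) utilities

for option, (u_good, u_bad) in payoffs.items():
    expected = p_good * u_good + (1 - p_good) * u_bad  # criterion under risk
    maximin = min(u_good, u_bad)                       # criterion under uncertainty
    print(option, expected, maximin)
```

Expected utility favors the game (6.2 versus 5), while the maximin rule, which assumes no known probabilities, favors the lecture.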
The Less You Know: The Utility of Ambiguity and Uncertainty in Counter-Terrorism
2015-03-01
Contents excerpt: V. Ethical and Legal Implications; A. Legality and an Open Society; B. Ethics and Uncertainty. ...results in cries of immoral overreaction. Israel wrestles with this ethical dilemma/public relations problem during every Intifada. An out-of-proportion
NASA Astrophysics Data System (ADS)
Kheiri, R.
2016-09-01
As an undergraduate exercise, in an article (2012 Am. J. Phys. 80 780-14), quantum and classical uncertainties for dimensionless variables of position and momentum were evaluated in three potentials: infinite well, bouncing ball, and harmonic oscillator. While original quantum uncertainty products depend on ħ and the number of states (n), a dimensionless approach makes the comparison between quantum uncertainty and classical dispersion possible by excluding ħ. But the question is whether the uncertainty still remains dependent on quantum number n. In the above-mentioned article there lies this contrast: on the one hand, the dimensionless quantum uncertainty of the potential box approaches classical dispersion only in the limit of large quantum numbers (n → ∞), consistent with the correspondence principle. On the other hand, similar evaluations for bouncing ball and harmonic oscillator potentials are equal to their classical counterparts independent of n. This equality may hide the quantum feature of low energy levels. In the current study, we change the potential intervals in order to make them symmetric for the linear potential and non-symmetric for the quadratic potential. As a result, it is shown in this paper that the dimensionless quantum uncertainty of these potentials in the new potential intervals is expressed in terms of quantum number n. In other words, the uncertainty requires the correspondence principle in order to approach the classical limit. Therefore, it can be concluded that the dimensionless analysis, as a useful pedagogical method, does not take away the quantum feature of the n-dependence of quantum uncertainty in general. Moreover, our numerical calculations include the higher powers of the position for the potentials.
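For the record, the n-dependence discussed above is explicit in the standard infinite-well result

$$\Delta x\,\Delta p \;=\; \frac{\hbar}{2}\sqrt{\frac{\pi^{2}n^{2}}{3}-2};$$

after scaling x by the well width L and p by the momentum magnitude $n\pi\hbar/L$, the dimensionless product approaches its classical counterpart $1/\sqrt{12}$ only as $n\to\infty$, in line with the correspondence principle.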
ERIC Educational Resources Information Center
Wiseman, Alexander W.; Astiz, M. Fernanda; Baker, David P.
2014-01-01
The rise in globalisation studies in comparative education places neo-institutional theory at the centre of many debates among comparative education researchers. However, uncertainty about how to interpret neo-institutional theory still persists among educational comparativists. With this uncertainty comes misinterpretation of its principles,…
ERIC Educational Resources Information Center
Cole, Charles; Cantero, Pablo; Sauve, Diane
1998-01-01
Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…
Role of information theoretic uncertainty relations in quantum theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz; ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin; Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk
2015-04-15
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
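For finite-dimensional observables, the Shannon-entropy benchmark that such Rényi-based relations generalize is the Maassen-Uffink bound:

$$H(A)+H(B)\;\ge\;-2\ln c,\qquad c=\max_{j,k}\,\lvert\langle a_{j}\vert b_{k}\rangle\rvert,$$

where $\lvert a_{j}\rangle$ and $\lvert b_{k}\rangle$ are the eigenvectors of the two observables.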
Myhr, Anne Ingeborg; Myskja, Bjørn K
2011-04-01
Nanoparticles have multifaceted advantages in drug administration as vaccine delivery and hence hold promise for improving protection of farmed fish against diseases caused by pathogens. However, there are concerns that the benefits associated with distribution of nanoparticles may also be accompanied by risks to the environment and health. The complexity of the natural and social systems involved implies that the information acquired in quantified risk assessments may be inadequate for evidence-based decisions. One controversial strategy for dealing with this kind of uncertainty is the precautionary principle. A few years ago, a UNESCO expert group suggested a new approach for implementation of the principle. Here we compare the UNESCO principle with earlier versions and explore its advantages and disadvantages by applying the UNESCO version to the use of PLGA nanoparticles for delivery of vaccines in aquaculture. Finally, we discuss whether a combined scientific and ethical analysis that involves the concept of responsibility will enable approaches that can provide a supplement to the precautionary principle as a basis for decision-making in areas of scientific uncertainty, such as the application of nanoparticles in the vaccination of farmed fish.
Fermi Blobs and the Symplectic Camel: A Geometric Picture of Quantum States
NASA Astrophysics Data System (ADS)
Gosson, Maurice A. de
We have explained in previous work the correspondence between the standard squeezed coherent states of quantum mechanics, and quantum blobs, which are the smallest phase space units compatible with the uncertainty principle of quantum mechanics and having the symplectic group as a group of symmetries. In this work, we discuss the relation between quantum blobs and a certain level set (which we call "Fermi blob") introduced by Enrico Fermi in 1930. Fermi blobs allow us to extend our previous results not only to the excited states of the generalized harmonic oscillator in n dimensions, but also to arbitrary quadratic Hamiltonians. As is the case for quantum blobs, we can evaluate Fermi blobs using a topological notion related to the uncertainty principle: the symplectic capacity of a phase space set. The definition of this notion is made possible by Gromov's symplectic non-squeezing theorem, nicknamed the "principle of the symplectic camel".
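For context, the characteristic number attached to a quantum blob in this framework is its symplectic capacity, which by Gromov's non-squeezing theorem is the smallest value compatible with the uncertainty principle:

$$c(\mathcal{Q})\;=\;\pi\hbar\;=\;\tfrac{1}{2}h.$$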
ERIC Educational Resources Information Center
Howe, Christine; Luthman, Stefanie; Ruthven, Kenneth; Mercer, Neil; Hofmann, Riikka; Ilie, Sonia; Guardia, Paula
2015-01-01
Reflecting concerns about student attainment and participation in mathematics and science, the Effecting Principled Improvement in STEM Education ("epiSTEMe") project attempted to support pedagogical advancement in these two disciplines. Using principles identified as effective in the research literature (and combining these in a novel…
Lorentz invariance violation and generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag
2016-01-01
There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum through a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the redshift dependence of the Hubble parameter in early-type galaxies. We find that Lorentz invariance violation (LIV) has two types of contributions to the time-of-flight delay Δt comparable with those observations. Although the OPERA measurement of an apparently faster-than-light muon neutrino turned out to be wrong, we utilize its main features to estimate the relative change Δv in the speed of the muon neutrino in dependence on redshift z. Accordingly, the results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR), considering both the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity, and the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence are essential inputs for reliably confronting our calculations with UHECR data. The sensitivity factor is related to the special time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds of the parameter a that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.
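A generic first-order expression often quoted for such time-of-flight delays over a propagation distance D is shown below; the abstract does not give the authors' exact formula, so this is context rather than their result:

$$\Delta t \;\simeq\; \xi\,\frac{E}{E_{\mathrm{QG}}}\,\frac{D}{c},$$

with $E_{\mathrm{QG}}$ the quantum-gravity scale and $\xi$ the sensitivity factor the abstract refers to.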
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.
Outdoor sensor-based operation of autonomous robots has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and unpredictability of the environment, i.e., lack of full knowledge of the environment characteristics and dynamics. Two basic principles, or philosophies, and their associated methodologies are proposed in an attempt to remedy some of these difficulties. The first principle is based on the concept of "minimal model" for accomplishing given tasks and proposes to utilize only the minimum level of information and precision necessary to accomplish elemental functions of complex tasks. This approach diverges completely from the direction taken by most artificial vision studies, which conventionally call for crisp and detailed analysis of every available component in the perception data. The paper first reviews the basic concepts of this approach and discusses its pragmatic feasibility when embodied in a behaviorist framework. The second principle deals with implicit representation of uncertainties using Fuzzy Set Theory-based approximations and approximate reasoning, rather than explicit (crisp) representation through calculation and conventional propagation techniques. A framework which merges these principles and approaches is presented, and its application to the problem of sensor-based outdoor navigation of a mobile robot is discussed. Results of navigation experiments with a real car in actual outdoor environments are also discussed to illustrate the feasibility of the overall concept.
DeCaro, Daniel A; Arnold, Craig Anthony (Tony); Frimpong Boamah, Emmanuel; Garmestani, Ahjond S
2017-03-01
Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people's decision making that cloud their judgment and create conflict. These systems must also satisfy people's fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance.
Zhou, Lin; Long, Shitong; Tang, Biao; Chen, Xi; Gao, Fen; Peng, Wencui; Duan, Weitao; Zhong, Jiaqi; Xiong, Zongyuan; Wang, Jin; Zhang, Yuanzhong; Zhan, Mingsheng
2015-07-03
We report an improved test of the weak equivalence principle by using a simultaneous ⁸⁵Rb-⁸⁷Rb dual-species atom interferometer. We propose and implement a four-wave double-diffraction Raman transition scheme for the interferometer, and demonstrate its ability in suppressing common-mode phase noise of Raman lasers after their frequencies and intensity ratios are optimized. The statistical uncertainty of the experimental data for the Eötvös parameter η is 0.8×10⁻⁸ at 3200 s. With various systematic errors corrected, the final value is η = (2.8±3.0)×10⁻⁸. The major uncertainty is attributed to the Coriolis effect.
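For reference, the Eötvös parameter reported here is the normalized differential free-fall acceleration of the two isotopes:

$$\eta \;=\; 2\,\frac{g(^{85}\mathrm{Rb})-g(^{87}\mathrm{Rb})}{g(^{85}\mathrm{Rb})+g(^{87}\mathrm{Rb})}.$$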
Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?
NASA Astrophysics Data System (ADS)
Majumder, Barun; Sen, Sourav
2012-10-01
In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [5] in simple quantum mechanical systems and study its thermodynamic properties. We have assumed that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with the results found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic quantities are exactly the same as the polymer results, but the length scale considered has a theoretically different origin. Hence we note the need for further study to investigate whether these two approaches are conceptually connected at a fundamental level.
Principles of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Landé, Alfred
2013-10-01
Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ(x) and σ(p); 11. Complementarity; 12. Mathematical relation between ρ(x) and σ(p) for free particles; 13. General relation between ρ(q) and σ(p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ(t) and σ(ε); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψp(q) and Xq(p); 39. Differential equation for φβ(q); 40. The general probability amplitude Φβ'(Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schrödinger's equation for non-conservative systems; 46. Perturbation theory; 47. Orthogonality, normalization and Hermitian conjugacy; 48. General matrix elements; Part IV. The Principle of Correspondence: 49. Contact transformations in classical mechanics; 50. Point transformations; 51. Contact transformations in quantum mechanics; 52. Constants of motion and angular co-ordinates; 53. Periodic orbits; 54. De Broglie and Schrödinger function; correspondence to classical mechanics; 55. Packets of probability; 56. Correspondence to hydrodynamics; 57. Motion and scattering of wave packets; 58. Formal correspondence between classical and quantum mechanics; Part V. Mathematical Appendix: Principle of Invariance: 59. The general theorem of transformation; 60. Operator calculus; 61. Exchange relations; three criteria for conjugacy; 62. First method of canonical transformation; 63. Second method of canonical transformation; 64. Proof of the transformation theorem; 65. Invariance of the matrix elements against unitary transformations; 66. Matrix mechanics; Index of literature; Index of names and subjects.
Lujan, Heidi L; DiCarlo, Stephen E
2014-12-01
Students are naturally curious and inquisitive with powerful intrinsic motives to probe, learn, and understand their world. Accordingly, class activities must capitalize on this inherently energetic and curious nature so that learning becomes a lifelong activity where students take initiative for learning, are skilled in learning, and want to learn new things. This report describes a student-centered class activity, the "flipped exam," designed to achieve this goal. The flipped exam was a collaborative, group effort, and learning was interactive. It included a significant proportion (∼30-35%) of material not covered in class. This required students to actively search for content and context, dynamically making connections between what they knew and what they learned, grappling with complexity, uncertainty, and ambiguity, and finally discovering answers to important questions. Accordingly, the need or desire to know was the catalyst for meaningful learning. Student assessment was determined by behavioral noncognitive parameters that were based on the observation of the student and the student's work as well as cognitive parameters (i.e., the student's score on the examination). It is our view that the flipped exam provided a student-centered activity in which students discovered, because of the need to know and opportunities for discussion, the important concepts and principles we wanted them to learn. Copyright © 2014 The American Physiological Society.
The physical origins of the uncertainty theorem
NASA Astrophysics Data System (ADS)
Giese, Albrecht
2013-10-01
The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.
Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle
Shoufan Fang; George Z. Gertner
2000-01-01
When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...
Uncertainty, imprecision, and the precautionary principle in climate change assessment.
Borsuk, M E; Tomassini, L
2005-01-01
Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
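A minimal numerical sketch of one such class, the ε-contamination neighborhood of a nominal distribution, with entirely hypothetical costs and probabilities:

```python
import numpy as np

costs = np.array([1.0, 4.0, 9.0])   # hypothetical costs of three climate outcomes
p0 = np.array([0.5, 0.3, 0.2])      # nominal outcome probabilities
eps = 0.1                           # contamination level

# Class of measures: {(1 - eps) * p0 + eps * q : q any distribution}.
# Upper/lower expectations put the contaminated mass on the extremes.
upper = (1 - eps) * costs @ p0 + eps * costs.max()
lower = (1 - eps) * costs @ p0 + eps * costs.min()
print(lower, upper)   # bounds on expected cost instead of a point value
```

Ranking emissions policies by their `upper` values implements the minimum upper expected cost criterion that the abstract links to the precautionary principle.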
The action uncertainty principle for continuous measurements
NASA Astrophysics Data System (ADS)
Mensky, Michael B.
1996-02-01
The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is represented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (a generalized fictitious force) is restricted by the AUP ∫|δF(t)| Δa(t) dt ≲ ħ and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in the phase space should be of the order of ħ. The width of the band depends on the measurement resolution while its length is determined by the deviation of the system, due to the measurement, from classical behavior.
DC motor proportional control system for orthotic devices
NASA Technical Reports Server (NTRS)
Blaise, H. T.; Allen, J. R.
1972-01-01
Multi-channel proportional control system for operation of dc motors for use with externally-powered orthotic arm braces is described. Components of circuitry and principles of operation are described. Schematic diagram of control circuit is provided.
2015-04-01
of the state. Such threats may come into existence when the organizing principles of two states contradict each other in a context where the...security is that the normal condition of actors in a market economy is one of risk, competition, and uncertainty. In other words, the actors in the...liberal principles, federative states have no natural unifying principle and, consequently, are more vulnerable to dismemberment, separatism, and
NASA Astrophysics Data System (ADS)
Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik
2018-06-01
Evaluation of uncertainty of thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of the measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. The evaluation takes into account correlation between input quantities as well as uncertainty attributed to thermal effects, neither of which was included in earlier publications. The temperature dependence of the group refractive index of silicon was found to be nonlinear and to vary widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.
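For reference, the GUM law of propagation of uncertainty that such an evaluation applies, including the correlation terms mentioned above, is

$$u_{c}^{2}(y)\;=\;\sum_{i}\left(\frac{\partial f}{\partial x_{i}}\right)^{2}u^{2}(x_{i})\;+\;2\sum_{i<j}\frac{\partial f}{\partial x_{i}}\frac{\partial f}{\partial x_{j}}\,u(x_{i},x_{j}),$$

where $y=f(x_{1},\dots,x_{n})$ is the measurement function and $u(x_{i},x_{j})$ are the input covariances.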
Conditional uncertainty principle
NASA Astrophysics Data System (ADS)
Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun
2018-04-01
We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
NASA Astrophysics Data System (ADS)
Ahmad, Zeeshan; Viswanathan, Venkatasubramanian
2016-08-01
Computationally-guided material discovery is being increasingly employed using a descriptor-based screening through the calculation of a few properties of interest. A precise understanding of the uncertainty associated with first-principles density functional theory calculated property values is important for the success of descriptor-based screening. The Bayesian error estimation approach has been built in to several recently developed exchange-correlation functionals, which allows an estimate of the uncertainty associated with properties related to the ground state energy, for example, adsorption energies. Here, we propose a robust and computationally efficient method for quantifying uncertainty in mechanical properties, which depend on the derivatives of the energy. The procedure involves calculating energies around the equilibrium cell volume with different strains and fitting the obtained energies to the corresponding energy-strain relationship. At each strain, we use instead of a single energy, an ensemble of energies, giving us an ensemble of fits and thereby, an ensemble of mechanical properties associated with each fit, whose spread can be used to quantify its uncertainty. The generation of ensemble of energies is only a post-processing step involving a perturbation of parameters of the exchange-correlation functional and solving for the energy non-self-consistently. The proposed method is computationally efficient and provides a more robust uncertainty estimate compared to the approach of self-consistent calculations employing several different exchange-correlation functionals. We demonstrate the method by calculating the uncertainty bounds for several materials belonging to different classes and having different structures using the developed method. We show that the calculated uncertainty bounds the property values obtained using three different GGA functionals: PBE, PBEsol, and RPBE. Finally, we apply the approach to calculate the uncertainty associated with the DFT-calculated elastic properties of solid state Li-ion and Na-ion conductors.
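A minimal sketch of the fit-ensemble idea with made-up numbers; in the actual method the ensemble spread comes from perturbing exchange-correlation parameters and re-evaluating energies non-self-consistently, not from random noise:

```python
import numpy as np

rng = np.random.default_rng(2)
strains = np.linspace(-0.02, 0.02, 7)
e_mean = 0.5 * 120.0 * strains**2   # toy energy-strain curve, E ~ (k/2) s^2

# Stand-in for the ensemble of energies at each strain.
ensemble = e_mean + rng.normal(0.0, 1e-4, size=(2000, strains.size))

# One quadratic fit per ensemble member; the curvature plays the role of
# the mechanical property (an elastic constant, up to volume factors).
curvatures = np.array([2.0 * np.polyfit(strains, e, 2)[0] for e in ensemble])
print(curvatures.mean(), curvatures.std())  # property value and its uncertainty
```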
1990-01-01
least-squares sense by adding a penalty term proportional to the square of the divergence to the variational principle. At the start of this project...principle required for stable solutions of the electromagnetic field: it must be possible to express the basis functions used in the finite element method as...principle to derive several different methods for computing stable solutions to electromagnetic field problems. To understand the above principle, notice that
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pramanik, Souvik, E-mail: souvick.in@gmail.com; Moussa, Mohamed, E-mail: mohamed.ibrahim@fsc.bu.edu.eg; Faizal, Mir, E-mail: f2mir@uwaterloo.ca
In this paper, the deformation of the Heisenberg algebra, consistent with both the generalized uncertainty principle and doubly special relativity, has been analyzed. It has been observed that, though this algebra can give rise to fractional derivative terms in the corresponding quantum mechanical Hamiltonian, a formal meaning can be given to them by using the theory of harmonic extensions of functions. Based on this argument, the expression of the propagator of the path integral corresponding to the deformed Heisenberg algebra has been obtained. In particular, the consistent expression of the one-dimensional free particle propagator has been evaluated explicitly. With this propagator in hand, it has been shown that, even in the free particle case, the normal generalized uncertainty principle and doubly special relativity yield very different results.
Thermodynamics of a class of regular black holes with a generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Maluf, R. V.; Neves, Juliano C. S.
2018-05-01
In this article, we present a study on thermodynamics of a class of regular black holes. Such a class includes the Bardeen and Hayward regular black holes. We obtain thermodynamic quantities such as the Hawking temperature, entropy, and heat capacity for the entire class. As part of an effort to indicate some physical observable to distinguish regular black holes from singular black holes, we suggest that regular black holes are colder than singular black holes. Moreover, contrary to the Schwarzschild black hole, this class of regular black holes may be thermodynamically stable. From a generalized uncertainty principle, we also obtain the quantum-corrected thermodynamics for the studied class. Such quantum corrections provide a logarithmic term for the quantum-corrected entropy.
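The quantum-corrected entropy mentioned above typically takes the generic form below; the coefficients depend on the GUP parameters and are not given in the abstract:

$$S \;=\; \frac{A}{4\ell_{P}^{2}} \;+\; c_{1}\ln\frac{A}{\ell_{P}^{2}} \;+\; \mathrm{const},$$

where A is the horizon area and $c_{1}$ carries the GUP correction.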
1980-08-01
extended to include influence of time in transit, perishability, and uncertainty in market...Alfred Marshall, Principles of Economics, 9th Edition (Macmillan...the Maritime Administration, U.S. Department of Commerce, Washington, D.C. (1974). Marshall, Alfred. Principles of Economics. 9th Edition. Macmillan
NASA Astrophysics Data System (ADS)
Hey, Anthony J. G.; Walters, Patrick
This book provides a descriptive, popular account of quantum physics. The basic topics addressed include: waves and particles, the Heisenberg uncertainty principle, the Schroedinger equation and matter waves, atoms and nuclei, quantum tunneling, the Pauli exclusion principle and the elements, quantum cooperation and superfluids, Feynman rules, weak photons, quarks, and gluons. The applications of quantum physics to astrophysics, nuclear technology, and modern electronics are addressed.
H2/H∞ control for grid-feeding converter considering system uncertainty
NASA Astrophysics Data System (ADS)
Li, Zhongwen; Zang, Chuanzhi; Zeng, Peng; Yu, Haibin; Li, Shuhui; Fu, Xingang
2017-05-01
Three-phase grid-feeding converters (GFCs) are key components for integrating distributed generation and renewable power sources into the power utility. Conventionally, proportional-integral and proportional-resonant control strategies are applied to control the output power or current of a GFC, but these strategies have poor transient performance and are not robust against uncertainties and volatility in the system. This paper proposes a H2/H∞-based control strategy, which can mitigate the above restrictions. Uncertainty and disturbance are included in the formulation of the GFC state-space model, making it reflect practical system conditions more accurately. The paper uses a convex optimisation method to design the H2/H∞-based optimal controller. Instead of using a guess-and-check method, the paper uses particle swarm optimisation to search for an H2/H∞-optimal controller. Several case studies, implemented in both simulation and experiment, verify the superiority of the proposed control strategy over traditional PI control methods, especially under dynamic and variable system conditions.
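A minimal particle swarm optimisation sketch of the controller-search step; the two-parameter toy cost stands in for the H2/H∞ objective, which the paper evaluates on the GFC model:

```python
import numpy as np

rng = np.random.default_rng(4)

def cost(k):
    # Stand-in objective: a quadratic bowl. In the paper this would be
    # the H2/Hinf performance index of the closed loop for gains k.
    return np.sum((k - np.array([1.5, -0.5])) ** 2, axis=1)

n, dim = 30, 2
x = rng.uniform(-5.0, 5.0, (n, dim))
v = np.zeros((n, dim))
pbest, pcost = x.copy(), cost(x)

for _ in range(200):
    g = pbest[np.argmin(pcost)]                       # global best
    v = 0.7 * v + 1.5 * rng.random((n, dim)) * (pbest - x) \
                + 1.5 * rng.random((n, dim)) * (g - x)
    x = x + v
    c = cost(x)
    better = c < pcost
    pbest[better], pcost[better] = x[better], c[better]

print(pbest[np.argmin(pcost)])  # converges near the toy optimum (1.5, -0.5)
```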
A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Jin; Yu, Yaming; Van Dyk, David A.
2014-10-20
Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.
Uncertainty for Part Density Determination: An Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Mario Orlando
2016-12-14
Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these fluid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically using the NIST Uncertainty Machine, as a viable alternative method.
Planck Constant Determination from Power Equivalence
NASA Astrophysics Data System (ADS)
Newell, David B.
2000-04-01
Equating mechanical to electrical power links the kilogram, the meter, and the second to the practical realizations of the ohm and the volt derived from the quantum Hall and the Josephson effects, yielding an SI determination of the Planck constant. The NIST watt balance uses this power equivalence principle, and in 1998 measured the Planck constant with a combined relative standard uncertainty of 8.7 × 10⁻⁸, the most accurate determination to date. The next generation of the NIST watt balance is now being assembled. Modifications to the experimental facilities have been made to reduce the uncertainty components from vibrations and electromagnetic interference. A vacuum chamber has been installed to reduce the uncertainty components associated with performing the experiment in air. Most of the apparatus is in place and diagnostic testing of the balance should begin this year. Once a combined relative standard uncertainty of one part in 10⁸ has been reached, the power equivalence principle can be used to monitor the possible drift in the artifact mass standard, the kilogram, and provide an accurate alternative definition of mass in terms of fundamental constants. *Electricity Division, Electronics and Electrical Engineering Laboratory, Technology Administration, U.S. Department of Commerce. Contribution of the National Institute of Standards and Technology, not subject to copyright in the U.S.
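In outline, the power equivalence principle equates mechanical and electrical power, and realizing the volt and the ohm through the Josephson and quantum-Hall effects makes the electrical side proportional to h:

$$m g v \;=\; U I \;=\; C\,f_{1}f_{2}\,\frac{h}{4},$$

where $f_{1}$ and $f_{2}$ are the Josephson microwave frequencies and C is a known dimensionless factor; solving for h gives the determination described above.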
Particles, Waves, and the Interpretation of Quantum Mechanics
ERIC Educational Resources Information Center
Christoudouleas, N. D.
1975-01-01
Presents an explanation, without mathematical equations, of the basic principles of quantum mechanics. Includes wave-particle duality, the probability character of the wavefunction, and the uncertainty relations. (MLH)
The equivalence principle in a quantum world
NASA Astrophysics Data System (ADS)
Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre
2015-09-01
We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).
The Sapir-Whorf hypothesis and inference under uncertainty.
Regier, Terry; Xu, Yang
2017-11-01
The Sapir-Whorf hypothesis holds that human thought is shaped by language, leading speakers of different languages to think differently. This hypothesis has sparked both enthusiasm and controversy, but despite its prominence it has only occasionally been addressed in computational terms. Recent developments support a view of the Sapir-Whorf hypothesis in terms of probabilistic inference. This view may resolve some of the controversy surrounding the Sapir-Whorf hypothesis, and may help to normalize the hypothesis by linking it to established principles that also explain other phenomena. On this view, effects of language on nonlinguistic cognition or perception reflect standard principles of inference under uncertainty. WIREs Cogn Sci 2017, 8:e1440. doi: 10.1002/wcs.1440. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Rothleitner, Christian; Francis, Olivier
2014-04-01
An original setup is presented to measure the Newtonian Constant of Gravitation G. It is based on the same principle as used in ballistic absolute gravimeters. The differential acceleration of three simultaneously freely falling test masses is measured in order to determine G. In this paper, a description of the experimental setup is presented. A detailed uncertainty budget estimates the relative uncertainty to be of the order of 5.3 × 10⁻⁴; however, with some improvements a relative uncertainty in G of one part in 10⁴ could be feasible.
Patterns of Hierarchy in Formal and Principled Moral Reasoning.
ERIC Educational Resources Information Center
Zeidler, Dana Lewis
Measurements of formal reasoning and principled moral reasoning ability were obtained from a sample of 99 tenth grade students. Specific modes of formal reasoning (proportional reasoning, controlling variables, probabilistic, correlational and combinatorial reasoning) were first examined. Findings support the notion of hierarchical relationships…
Position-momentum uncertainty relations in the presence of quantum memory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp; Berta, Mario; Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich
2014-12-15
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
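For finite-outcome observables, the memory-assisted relation that this work extends to continuous spectra is the one due to Berta et al.:

$$S(X|B)+S(Z|B)\;\ge\;\log_{2}\frac{1}{c}+S(A|B),\qquad c=\max_{j,k}\,\lvert\langle x_{j}\vert z_{k}\rangle\rvert^{2},$$

where the conditional entropy $S(A|B)$ can be negative for entangled states, lowering the bound.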
Steering the measured uncertainty under decoherence through local PT -symmetric operations
NASA Astrophysics Data System (ADS)
Shi, Wei-Nan; Wang, Dong; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Ye, Liu
2018-07-01
The uncertainty principle is viewed as one of the appealing properties in the context of quantum mechanics, which intrinsically offers a lower bound with regard to the measurement outcomes of a pair of incompatible observables within a given system. In this letter, we attempt to observe entropic uncertainty in the presence of quantum memory under different local noisy channels. To be specific, we develop the dynamics of the measured uncertainty under local bit-phase-flipping (unital) and depolarization (nonunital) noise, respectively, and put forward an effective strategy to manipulate the magnitude of the uncertainty of interest by means of parity-time-symmetric (PT-symmetric) operations on the subsystem to be measured. It is interesting to find that there exist different evolution characteristics of the uncertainty in the channels considered here, i.e. monotonic behavior in the nonunital channels and non-monotonic behavior in the unital channels. Moreover, the amount of the measured uncertainty can be reduced to some degree by properly modulating the PT-symmetric operations.
Confidence Intervals for Proportion Estimates in Complex Samples. Research Report. ETS RR-06-21
ERIC Educational Resources Information Center
Oranje, Andreas
2006-01-01
Confidence intervals are an important tool to indicate uncertainty of estimates and to give an idea of probable values of an estimate if a different sample from the population was drawn or a different sample of measures was used. Standard symmetric confidence intervals for proportion estimates based on a normal approximation can yield bounds…
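A small numerical illustration of why the normal-approximation (Wald) interval misbehaves for proportions near 0 or 1, alongside the standard Wilson correction; the sample values are made up:

```python
import numpy as np

def wald(p_hat, n, z=1.96):
    half = z * np.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - half, p_hat + half          # symmetric; can exit [0, 1]

def wilson(p_hat, n, z=1.96):
    denom = 1.0 + z**2 / n
    centre = (p_hat + z**2 / (2.0 * n)) / denom
    half = z * np.sqrt(p_hat * (1.0 - p_hat) / n + z**2 / (4.0 * n**2)) / denom
    return centre - half, centre + half        # asymmetric; stays inside [0, 1]

print(wald(0.02, 50))     # lower bound is negative
print(wilson(0.02, 50))   # both bounds are valid proportions
```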
Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation
NASA Astrophysics Data System (ADS)
Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.
2012-12-01
This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in model input as well as non-uniqueness in selecting different AI methods. Using one single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs Bayesian model averaging (BMA) technique to address the issue of using one single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC) that follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), artificial neural network (ANN) and neurofuzzy (NF) to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined three AI models and produced better fitting than individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model is nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored by using one AI model.
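A minimal sketch of the BIC-weighted averaging and variance decomposition described above, with made-up model outputs:

```python
import numpy as np

k_hat = np.array([3.2, 4.1, 3.6])        # hydraulic conductivity estimates
var_within = np.array([0.4, 0.6, 0.5])   # within-model variances
bic = np.array([102.0, 105.5, 110.2])    # BIC of each AI model

# Posterior model weights from BIC (parsimony principle).
w = np.exp(-0.5 * (bic - bic.min()))
w /= w.sum()

k_bma = w @ k_hat                        # model-averaged estimate
var_between = w @ (k_hat - k_bma) ** 2   # uncertainty from model non-uniqueness
var_total = w @ var_within + var_between
print(k_bma, var_total)
```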
NASA Technical Reports Server (NTRS)
Madejski, Greg
1994-01-01
We report the soft X-ray spectrum of the BL Lac object AO 0235+164, observed with the Einstein Observatory Imaging Proportional Counter (IPC). This object (z = 0.94) has an intervening galaxy (or a protogalactic disk) at z = 0.524 present in the line of sight, producing both radio and optical absorption lines in the background BL Lac continuum. The X-ray spectrum exhibits a substantial soft X-ray cutoff, corresponding to several times that expected from our own Galaxy; we interpret that excess cutoff as due to the intervening galaxy. The comparison of the hydrogen column density inferred from the 21 cm radio data and the X-ray absorption allows, in principle, the determination of the elemental abundances in the intervening galaxy. However, the uncertainties in both the H I spin temperature and X-ray spectral parameters only loosely restrict these abundances to be 2 +/- 1 solar, which even at the lower limit appears higher than that inferred from studies of samples of optical absorption-line systems.
A hybrid method for provincial scale energy-related carbon emission allocation in China.
Bai, Hongtao; Zhang, Yingxuan; Wang, Huizhi; Huang, Yanying; Xu, He
2014-01-01
Achievement of carbon emission reduction targets proposed by national governments relies on provincial/state allocations. In this study, a hybrid method for provincial energy-related carbon emissions allocation in China was developed to provide a good balance between production- and consumption-based approaches. In this method, provincial energy-related carbon emissions are decomposed into direct emissions of local activities other than thermal power generation and indirect emissions as a result of electricity consumption. Based on the carbon reduction efficiency principle, the responsibility for embodied emissions of provincial product transactions is assigned entirely to the production area. The responsibility for carbon generation during the production of thermal power is borne by the electricity consumption area, which ensures that regions with different resource endowments have rational development space. Empirical studies were conducted to examine the hybrid method, and three indices (per capita GDP, resource endowment index, and the proportion of energy-intensive industries) were screened to preliminarily interpret the differences among China's regional carbon emissions. Uncertainty analysis and a discussion of this method are also provided herein.
Questions Students Ask: Beta Decay.
ERIC Educational Resources Information Center
Koss, Jordan; Hartt, Kenneth
1988-01-01
Answers a student's question about the emission of a positron from a nucleus. Discusses the problem from the aspects of the uncertainty principle, beta decay, the Fermi Theory, and modern physics. (YP)
The redoubtable ecological periodic table
Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...
Computer-Aided Experiment Planning toward Causal Discovery in Neuroscience.
Matiasz, Nicholas J; Wood, Justin; Wang, Wei; Silva, Alcino J; Hsu, William
2017-01-01
Computers help neuroscientists to analyze experimental results by automating the application of statistics; however, computer-aided experiment planning is far less common, due to a lack of similar quantitative formalisms for systematically assessing evidence and uncertainty. While ontologies and other Semantic Web resources help neuroscientists to assimilate required domain knowledge, experiment planning requires not only ontological but also epistemological (e.g., methodological) information regarding how knowledge was obtained. Here, we outline how epistemological principles and graphical representations of causality can be used to formalize experiment planning toward causal discovery. We outline two complementary approaches to experiment planning: one that quantifies evidence per the principles of convergence and consistency, and another that quantifies uncertainty using logical representations of constraints on causal structure. These approaches operationalize experiment planning as the search for an experiment that either maximizes evidence or minimizes uncertainty. Despite work in laboratory automation, humans must still plan experiments and will likely continue to do so for some time. There is thus a great need for experiment-planning frameworks that are not only amenable to machine computation but also useful as aids in human reasoning.
Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.
2011-01-01
The broad physical and biological principles behind climate change and its potential large scale ecological impacts on biota are fairly well understood, although likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should precede in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. ?? 2010 Elsevier Ltd.
Darnaude, Audrey M.
2016-01-01
Background Mixture models (MM) can be used to describe mixed stocks considering three sets of parameters: the total number of contributing sources, their chemical baseline signatures and their mixing proportions. When all nursery sources have been previously identified and sampled for juvenile fish to produce baseline nursery-signatures, mixing proportions are the only unknown set of parameters to be estimated from the mixed-stock data. Otherwise, the number of sources, as well as some/all nursery-signatures may need to be also estimated from the mixed-stock data. Our goal was to assess bias and uncertainty in these MM parameters when estimated using unconditional maximum likelihood approaches (ML-MM), under several incomplete sampling and nursery-signature separation scenarios. Methods We used a comprehensive dataset containing otolith elemental signatures of 301 juvenile Sparus aurata, sampled in three contrasting years (2008, 2010, 2011), from four distinct nursery habitats. (Mediterranean lagoons) Artificial nursery-source and mixed-stock datasets were produced considering: five different sampling scenarios where 0–4 lagoons were excluded from the nursery-source dataset and six nursery-signature separation scenarios that simulated data separated 0.5, 1.5, 2.5, 3.5, 4.5 and 5.5 standard deviations among nursery-signature centroids. Bias (BI) and uncertainty (SE) were computed to assess reliability for each of the three sets of MM parameters. Results Both bias and uncertainty in mixing proportion estimates were low (BI ≤ 0.14, SE ≤ 0.06) when all nursery-sources were sampled but exhibited large variability among cohorts and increased with the number of non-sampled sources up to BI = 0.24 and SE = 0.11. Bias and variability in baseline signature estimates also increased with the number of non-sampled sources, but tended to be less biased, and more uncertain than mixing proportion ones, across all sampling scenarios (BI < 0.13, SE < 0.29). Increasing separation among nursery signatures improved reliability of mixing proportion estimates, but lead to non-linear responses in baseline signature parameters. Low uncertainty, but a consistent underestimation bias affected the estimated number of nursery sources, across all incomplete sampling scenarios. Discussion ML-MM produced reliable estimates of mixing proportions and nursery-signatures under an important range of incomplete sampling and nursery-signature separation scenarios. This method failed, however, in estimating the true number of nursery sources, reflecting a pervasive issue affecting mixture models, within and beyond the ML framework. Large differences in bias and uncertainty found among cohorts were linked to differences in separation of chemical signatures among nursery habitats. Simulation approaches, such as those presented here, could be useful to evaluate sensitivity of MM results to separation and variability in nursery-signatures for other species, habitats or cohorts. PMID:27761305
NASA Astrophysics Data System (ADS)
Yanai, R. D.; Bae, K.; Levine, C. R.; Lilly, P.; Vadeboncoeur, M. A.; Fatemi, F. R.; Blum, J. D.; Arthur, M.; Hamburg, S.
2013-12-01
Ecosystem nutrient budgets are difficult to construct and even more difficult to replicate. As a result, uncertainty in the estimates of pools and fluxes is rarely reported, and opportunities to assess confidence through replicated measurements are rare. In this study, we report nutrient concentrations and contents of soil and biomass pools in northern hardwood stands in replicate plots within replicate stands in 3 age classes (14-19 yr, 26-29 yr, and > 100 yr) at the Bartlett Experimental Forest, USA. Soils were described by quantitative soil pits in three plots per stand, excavated by depth increment to the C horizon and analyzed by a sequential extraction procedure. Variation in soil mass among pits within stands averaged 28% (coefficient of variation); variation among stands within an age class ranged from 9-25%. Variation in nutrient concentrations was higher still (averaging 38% within element, depth increment, and extraction type), perhaps because the depth increments contained varying proportions of genetic horizons. To estimate nutrient contents of aboveground biomass, we propagated model uncertainty through allometric equations and found errors ranging from 3-7%, depending on the stand. The variation in biomass among plots within stands (6-19%) was always larger than the allometric uncertainties. Measured nutrient concentrations of tree tissues were more variable than the biomass estimates. Foliage had the lowest variability (averaging 16% for Ca, Mg, K, N and P within age class and species), and wood had the highest (averaging 30%), when reported in proportion to the mean, because concentrations in wood are low. For Ca content of aboveground biomass, sampling variation was the greatest source of uncertainty. Coefficients of variation among plots within a stand averaged 16%; stands within an age class ranged from 5-25% CV, including uncertainties in tree allometry and tissue chemistry. Uncertainty analysis can help direct research effort to the areas most in need of improvement. In systems such as the one we studied, more intensive sampling would be the best approach to reducing uncertainty, as natural spatial variation was higher than model or measurement uncertainties.
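A hedged sketch of the kind of Monte Carlo propagation used for biomass nutrient contents (all allometric coefficients, tree diameters and concentrations below are invented placeholders, not values from the study):

```python
# Illustrative Monte Carlo propagation of allometric and tissue-chemistry
# uncertainty to a plot-level Ca content; all parameter values are invented.
import numpy as np

rng = np.random.default_rng(1)
dbh = np.array([12.0, 18.0, 25.0, 31.0])      # tree diameters on a plot, cm

n = 10_000
# Allometric model ln(biomass) = a + b*ln(dbh); parameter uncertainty is
# approximated here as independent normals for simplicity.
a = rng.normal(-2.0, 0.05, n)
b = rng.normal(2.4, 0.03, n)
biomass = np.exp(a[:, None] + b[:, None] * np.log(dbh))   # kg per tree

# Tissue Ca concentration (mg/g), with measurement variability.
ca_conc = rng.normal(1.2, 0.2, n)

# kg biomass * (mg/g = g/kg) = g Ca per plot
plot_ca = biomass.sum(axis=1) * ca_conc
mean = plot_ca.mean()
print(f"Ca content: {mean:.0f} g, CV = {100*plot_ca.std()/mean:.1f}%")
```

Comparing the CV with allometric terms switched off against the full CV reproduces the kind of source attribution reported above.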
NASA Astrophysics Data System (ADS)
Ghorbani, A.; Farahani, M. Mahmoodi; Rabbani, M.; Aflaki, F.; Waqifhosain, Syed
2008-01-01
In this paper we propose an uncertainty estimation for the analytical results obtained from the determination of Ni, Pb and Al by solid-phase extraction and inductively coupled plasma optical emission spectrometry (SPE-ICP-OES). The procedure is based on the retention of the analytes in the form of 8-hydroxyquinoline (8-HQ) complexes on a mini column of XAD-4 resin and subsequent elution with nitric acid. The influence of various analytical parameters, including the amount of solid phase, pH, elution factors (concentration and volume of the eluting solution), volume of sample solution, and amount of ligand, on the extraction efficiency of the analytes was investigated. To estimate the uncertainty of the analytical results, we propose assessing trueness using spiked samples. Two types of bias are calculated in the assessment of trueness: a proportional bias and a constant bias. We applied a nested design to calculate the proportional bias and the Youden method to calculate the constant bias. The proportional bias is estimated from spiked samples: the concentration found is plotted against the concentration added, and the slope of the standard-addition curve is an estimate of the method recovery. The estimated average recovery in Karaj river water is (1.004±0.0085) for Ni, (0.999±0.010) for Pb and (0.987±0.008) for Al.
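The two bias estimates can be illustrated in a few lines of Python; the data are invented, and the regressions simply formalize the plots described above:

```python
# Sketch of the two bias estimates (invented data).
# Proportional bias: slope of "found vs. added" from spiked samples.
# Constant bias (Youden): intercept of result vs. sample size.
import numpy as np

added = np.array([0.0, 10.0, 20.0, 40.0])        # ug/L Ni added
found = np.array([2.1, 12.0, 22.2, 41.9])        # ug/L Ni found
slope, intercept = np.polyfit(added, found, 1)
print(f"recovery (proportional bias) = {slope:.3f}")  # ~1.00 means no bias

sample_mass = np.array([0.5, 1.0, 1.5, 2.0])     # g of sample taken
result = np.array([5.3, 10.1, 15.2, 20.0])       # ug analyte measured
_, youden_intercept = np.polyfit(sample_mass, result, 1)
print(f"constant bias (Youden intercept) = {youden_intercept:.2f} ug")
```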
Measurements of Acceleration Due to Gravity.
ERIC Educational Resources Information Center
Crummett, Bill
1990-01-01
The principal means by which g has been measured are summarized. Discussed are "Kater's Reversible Pendulum," falling rules, and interferometry methods. Types of corrections and various sources of uncertainty are considered. (CW)
Heisenberg Uncertainty and the Allowable Masses of the Up Quark and Down Quark
NASA Astrophysics Data System (ADS)
Orr, Brian
2004-05-01
A possible explanation for the inability to attain deterministic measurements of an elementary particle's energy, as given by the Heisenberg Uncertainty Principle, manifests itself in an interesting anthropic consequence of Andrei Linde's Self-reproducing Inflationary Multiverse model. In Linde's model, the physical laws and constants that govern our universe adopt other values in other universes, due to variable Higgs fields. While the physics in our universe allow for the advent of life and consciousness, the physics necessary for life are not likely to exist in other universes -- Linde demonstrates this through a kind of Darwinism for universes. Our universe, then, is unique. But what are the physical laws and constants that make our universe what it is? Craig Hogan identifies five physical constants that are not bound by symmetry. Fine-tuning these constants gives rise to the basic behavior and structures of the universe. Three of the non-symmetric constants are fermion masses: the up quark mass, the down quark mass, and the electron mass. I will explore Linde's and Hogan's works by comparing the amount of uncertainty in quark masses, as calculated from the Heisenberg Uncertainty Principle, to the range of quark mass values consistent with our observed universe. Should the fine-tuning of the up quark and down quark masses be greater than the range of Heisenberg uncertainties in their respective masses (as I predict, due to quantum tunneling), then perhaps there is a correlation between the measured Heisenberg uncertainty in quark masses and the fine-tuning of masses required for our universe to be as it is. References: C. Hogan, "Why the Universe is Just So," Reviews of Modern Physics 72(4), 1149-1161 (Oct. 2000); A. Linde, "The Self-Reproducing Inflationary Universe," Scientific American 271(5), 48-55 (Nov. 1994).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehra, J.
1987-05-01
In this paper, the main outlines of the discussions of Niels Bohr with Albert Einstein, Werner Heisenberg, and Erwin Schroedinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory - formulated in fall 1926 by Dirac, London, and Jordan - Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such - formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930 - were continued during the next decades. All these aspects are briefly summarized.
Enhancing the Therapy Experience Using Principles of Video Game Design.
Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison
2016-02-01
This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.
Quantitative basis for component factors of gas flow proportional counting efficiencies
NASA Astrophysics Data System (ADS)
Nichols, Michael C.
This dissertation investigates the counting efficiency calibration of a gas flow proportional counter with beta-particle emitters in order to (1) determine by measurements and simulation the values of the component factors of beta-particle counting efficiency for a proportional counter, (2) compare the simulation results and measured counting efficiencies, and (3) determine the uncertainty of the simulation and measurements. Monte Carlo simulation results from the MCNP5 code were compared with measured counting efficiencies as a function of sample thickness for 14C, 89Sr, 90Sr, and 90Y. The Monte Carlo model simulated strontium carbonate with areal thicknesses from 0.1 to 35 mg cm-2. The samples were precipitated as strontium carbonate with areal thicknesses from 3 to 33 mg cm-2, mounted on membrane filters, and counted on a low-background gas flow proportional counter. The estimated fractional standard deviation was 2–4% (except 6% for 14C) for efficiency measurements of the radionuclides. The Monte Carlo simulations have uncertainties estimated to be 5–6% for 14C and 2.4% for 89Sr, 90Sr, and 90Y. The curves of simulated counting efficiency vs. sample areal thickness agreed within 3% with the curves of best fit drawn through the 25–49 measured points for each of the four radionuclides. Contributions from this research include development of uncertainty budgets for the analytical processes; evaluation of alternative methods for determining the chemical yield critical to the measurement process; correction of a bias found in the MCNP normalization of beta-spectra histograms; clarification of the interpretation of the commonly used ICRU beta-particle spectra for use by MCNP; and evaluation of instrument parameters as applied to the simulation model to obtain estimates of the counting efficiency from simulated pulse-height tallies.
Minimizing Significant Figure Fuzziness.
ERIC Educational Resources Information Center
Fields, Lawrence D.; Hawkes, Stephen J.
1986-01-01
Addresses the principles and problems associated with the use of significant figures. Explains uncertainty, the meaning of significant figures, the Simple Rule, the Three Rule, and the 1-5 Rule. Also provides examples of the Rules. (ML)
Generalized Uncertainty Principle and Parikh-Wilczek Tunneling
NASA Astrophysics Data System (ADS)
Mehdipour, S. Hamid
We investigate the modifications of the Hawking radiation by the Generalized Uncertainty Principle (GUP) and the tunneling process. By using the GUP-corrected de Broglie wavelength, the squeezing of the fundamental momentum cell, and consequently a GUP-corrected energy, we find nonthermal effects which lead to a nonzero statistical correlation function between the probabilities of tunneling of two massive particles with different energies. The recovery of part of the information from the black hole radiation is then feasible. From another point of view, the inclusion of quantum gravity effects through the GUP expression can halt the evaporation process, so that a stable black hole remnant is left behind, containing the other part of the black hole information content. Therefore, these features of the Planck-scale corrections may solve the information problem in black hole evaporation.
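For orientation, a commonly used one-parameter form of the GUP and the corresponding corrected de Broglie wavelength, in our notation (the paper's exact expressions may differ), are:

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[\,1+\beta\,(\Delta p)^{2}\right],
\qquad
\lambda \;\simeq\; \frac{2\pi\hbar}{p}\left(1+\beta p^{2}\right),
```

where β > 0 encodes the Planck-scale correction that squeezes the fundamental momentum cell and thereby deforms the emitted spectrum away from thermality.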
Modification of Schrödinger-Newton equation due to braneworld models with minimal length
NASA Astrophysics Data System (ADS)
Bhat, Anha; Dey, Sanjib; Faizal, Mir; Hou, Chenguang; Zhao, Qin
2017-07-01
We study the correction of the energy spectrum of a gravitational quantum well due to the combined effect of the braneworld model with infinite extra dimensions and the generalized uncertainty principle. The correction terms arise from a natural deformation of a semiclassical theory of quantum gravity governed by the Schrödinger-Newton equation based on a minimal length framework. The twofold correction in the energy yields new values of the spectrum, which are closer to the values obtained in the GRANIT experiment. This raises the possibility that the combined theory of semiclassical quantum gravity and the generalized uncertainty principle may provide an intermediate theory between the semiclassical and the full theory of quantum gravity. We also outline a schematic experimental set-up that may guide the understanding of these phenomena in the laboratory.
Principles of proportional recovery after stroke generalize to neglect and aphasia.
Marchi, N A; Ptak, R; Di Pietro, M; Schnider, A; Guggisberg, A G
2017-08-01
Motor recovery after stroke can be characterized by two different patterns. A majority of patients recover about 70% of initial impairment, whereas some patients with severe initial deficits show little or no improvement. Here, we investigated whether recovery from visuospatial neglect and aphasia is also separated into two different groups and whether similar proportions of recovery can be expected for the two cognitive functions. We assessed 35 patients with neglect and 14 patients with aphasia at 3 weeks and 3 months after stroke using standardized tests. Recovery patterns were classified with hierarchical clustering, and the proportion of recovery was estimated from initial impairment using a linear regression analysis. Patients were reliably clustered into two different groups. For patients in the first cluster (n = 40), recovery followed a linear model where improvement was proportional to initial impairment and achieved 71% of maximal possible recovery for both cognitive deficits. Patients in the second cluster (n = 9) exhibited poor recovery (<25% of initial impairment). Our findings indicate that improvement from neglect or aphasia after stroke shows the same dichotomy and proportionality as observed in motor recovery. This is suggestive of common underlying principles of plasticity, which apply to motor and cognitive functions. © 2017 EAN.
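The analysis pattern (clustering recovery fractions, then regressing change on initial impairment within the fitting cluster) can be sketched as follows; the patient data are simulated, not the study's:

```python
# Minimal sketch of a proportional-recovery analysis on invented data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
initial = rng.uniform(10, 100, 49)                  # initial impairment score
is_fitter = rng.random(49) > 0.2
change = np.where(is_fitter,
                  0.7 * initial + rng.normal(0, 5, 49),   # proportional recovery
                  rng.uniform(0, 0.25, 49) * initial)     # poor recovery

# Cluster patients on the fraction of impairment recovered.
frac = (change / initial).reshape(-1, 1)
labels = fcluster(linkage(frac, method="ward"), t=2, criterion="maxclust")

good = labels == (np.bincount(labels)[1:].argmax() + 1)  # majority cluster
slope = np.polyfit(initial[good], change[good], 1)[0]
print(f"proportion recovered in fitting cluster: {slope:.2f}")  # ~0.70
```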
Uncertainty in age-specific harvest estimates and consequences for white-tailed deer management
Collier, B.A.; Krementz, D.G.
2007-01-01
Age structure proportions (proportion of harvested individuals within each age class) are commonly used as support for regulatory restrictions and input for deer population models. Such use requires critical evaluation when harvest regulations force hunters to selectively harvest specific age classes, due to the impact on the underlying population age structure. We used a stochastic population simulation model to evaluate the impact of using harvest proportions to evaluate changes in population age structure under a selective harvest management program at two scales. Using harvest proportions to parameterize the age-specific harvest segment of the model at the local scale showed that predictions of post-harvest age structure did not vary dependent upon whether selective harvest criteria were in use or not. At the county scale, yearling frequency in the post-harvest population increased, but model predictions indicated that the post-harvest population size of 2.5 years old males would decline below levels found before implementation of the antler restriction, reducing the number of individuals recruited into older age classes. Across the range of age-specific harvest rates modeled, our simulation predicted that underestimation of age-specific harvest rates has considerable influence on predictions of post-harvest population age structure. We found that the consequence of uncertainty in harvest rates corresponds to uncertainty in predictions of residual population structure, and this correspondence is proportional to scale. Our simulations also indicate that regardless of the use of harvest proportions or harvest rates, at either the local or county scale the modeled selective harvest criteria (SHC) had a high probability (>0.60 and >0.75, respectively) of eliminating recruitment into >2.5 years old age classes. Although frequently used to increase population age structure, our modeling indicated that selective harvest criteria can decrease or eliminate the number of white-tailed deer recruited into older age classes. Thus, we suggest that using harvest proportions for management planning and evaluation should be viewed with caution. In addition, we recommend that managers focus more attention on estimation of age-specific harvest rates, and on modeling approaches which combine harvest rates with information from harvested individuals, to further increase their ability to effectively manage deer populations under selective harvest programs. © 2006 Elsevier B.V. All rights reserved.
Origins and implications of the relationship between warming and cumulative carbon emissions
NASA Astrophysics Data System (ADS)
Raupach, M. R.; Davis, S. J.; Peters, G. P.; Andrew, R. M.; Canadell, J.; Le Quere, C.
2014-12-01
A near-linear relationship between warming (T) and cumulative carbon emissions (Q) is a robust finding from numerous studies. This finding opens biophysical questions concerning (1) its theoretical basis, (2) the treatment of non-CO2 forcings, and (3) uncertainty specifications. Beyond these biophysical issues, a profound global policy question is raised: (4) how can a quota on cumulative emissions be shared? Here, an integrated survey of all four issues is attempted. (1) Proportionality between T and Q is an emergent property of a linear carbon-climate system forced by exponentially increasing CO2 emissions. This idealisation broadly explains past but not future near-proportionality between T and Q: in future, the roles of non-CO2 forcings and carbon-climate nonlinearities become important, and trajectory dependence becomes stronger. (2) The warming effects of short-lived non-CO2 forcers depend on instantaneous rather than cumulative fluxes. However, inertia in emissions trajectories reinstates some of the benefits of a cumulative emissions approach, with residual trajectory dependence comparable to that for CO2. (3) Uncertainties arise from several sources: climate projections, carbon-climate feedbacks, and residual trajectory dependencies in CO2 and other emissions. All of these can in principle be combined into a probability distribution P(T|Q) for the warming T from given cumulative CO2 emissions Q. Present knowledge of P(T|Q) allows quantification of the tradeoff between mitigation ambition and climate risk. (4) Cumulative emissions consistent with a given warming target and climate risk are a finite common resource that will inevitably be shared, creating a tragedy-of-the-commons dilemma. Sharing options range from "inertia" (present distribution of emissions is maintained) to "equity" (cumulative emissions are distributed equally per-capita). Both extreme options lead to emissions distributions that are unrealisable in practice, but a blend of the two extremes may be realisable. This perspective provides a means for nations to compare the global consequences of their own proposed emissions quotas if others were to act in a consistent way, a critical step towards achieving consensus.
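In symbols (our notation, with κ standing for the transient climate response to cumulative emissions, TCRE), the near-linear relation and its probabilistic treatment can be written as:

```latex
T \;\approx\; \kappa\, Q ,
\qquad
\Pr\!\left(T \le T^{*}\,\middle|\,Q\right) \;=\; \int_{0}^{T^{*}} P(T \mid Q)\,\mathrm{d}T ,
```

so that a warming target T* and an acceptable climate risk jointly determine an allowable cumulative quota Q, the finite common resource whose sharing is discussed above.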
Energy and Uncertainty in General Relativity
NASA Astrophysics Data System (ADS)
Cooperstock, F. I.; Dupre, M. J.
2018-03-01
The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured, as well as their detectors, are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector over a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We also address misconceptions of our approach held by certain authors.
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.; Culas, Donald E.
1991-01-01
Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide a possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of belief in each rule is constructed. From this, the degree to which a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
NASA Astrophysics Data System (ADS)
Liang, D.; Liu, X.
2017-12-01
A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Although great progress has been made over the years, many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model, representing the combined impact of all uncertain factors on the spatial structure of the model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
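As a toy illustration of the sequential Bayesian integration described here (a maximum-entropy, i.e. uniform, prior updated with noisy observations; all numbers invented):

```python
# Schematic grid-based Bayesian update: a max-entropy prior on a horizon
# depth is sequentially updated with noisy borehole picks (invented data).
import numpy as np

depth = np.linspace(90.0, 110.0, 401)          # candidate horizon depths, m
prior = np.ones_like(depth)                    # max-entropy prior on [90, 110]
prior /= prior.sum()

obs, sigma = [101.2, 99.7, 100.6], 1.5         # borehole picks and data error, m
post = prior.copy()
for y in obs:                                  # sequential uncertainty integration
    like = np.exp(-0.5 * ((y - depth) / sigma) ** 2)
    post *= like
    post /= post.sum()

mean = (depth * post).sum()
sd = np.sqrt(((depth - mean) ** 2 * post).sum())
print(f"posterior depth: {mean:.2f} +/- {sd:.2f} m")
```

The same multiply-and-renormalize step applies whether the likelihood encodes data error, spatial randomness, or expert (cognitive) information.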
ERIC Educational Resources Information Center
Jackett, Dwane
1990-01-01
Described is a science activity which illustrates the principle of uncertainty using a computer simulation of bacterial reproduction. Procedures and results are discussed. Several illustrations of results are provided. The availability of a computer program is noted. (CW)
ERIC Educational Resources Information Center
Howe, Christine; Ilie, Sonia; Guardia, Paula; Hofmann, Riikka; Mercer, Neil; Riga, Fran
2015-01-01
In response to continuing concerns about student attainment and participation in science and mathematics, the "epiSTEMe" project took a novel approach to pedagogy in these two disciplines. Using principles identified as effective in the research literature (and combining these in a fashion not previously attempted), the project developed…
Teaching the Economics of Urban Sprawl in the Principles of Economics Course
ERIC Educational Resources Information Center
Eckenrod, Sarah B.; Holahan, William L.
2004-01-01
The authors provide an explanation of urban sprawl using topics commonly taught in the principles of economics course. Specifically, employing the concepts of congestible public goods, they explain that underpriced road usage leads to an inefficiently large proportion of the population moving farther from the cities. Increased demand for highway…
Experiences of Uncertainty in Men With an Elevated PSA
Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather
2016-01-01
A significant proportion of men, ages 50 to 70 years, have, and continue to receive prostate specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men’s reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. PMID:25979635
Holtschlag, D.J.; Koschik, J.A.
2001-01-01
St. Clair and Detroit Rivers are connecting channels between Lake Huron and Lake Erie in the Great Lakes waterway, and form part of the boundary between the United States and Canada. St. Clair River, the upper connecting channel, drains 222,400 square miles and has an average flow of about 182,000 cubic feet per second. Water from St. Clair River combines with local inflows and discharges into Lake St. Clair before flowing into Detroit River. In some reaches of St. Clair and Detroit Rivers, islands and dikes split the flow into two to four branches. Even when the flow in a reach is known, proportions of flows within individual branches of a reach are uncertain. Simple linear regression equations, subject to a flow continuity constraint, are developed to provide estimators of these proportions and flows. The equations are based on 533 paired measurements of flow in 13 reaches forming 31 branches. The equations provide a means for computing the expected values and uncertainties of steady-state flows on the basis of flow conditions specified at the upstream boundaries of the waterway. In 7 upstream reaches, flow is considered fixed because it can be determined on the basis of flows specified at waterway boundaries and flow continuity. In these reaches, the uncertainties of flow proportions indicated by the regression equations can be used directly to determine the uncertainties of the corresponding flows. In the remaining 6 downstream reaches, flow is considered uncertain because these reaches do not receive flow from all the branches of an upstream reach, or they receive flow from some branches of more than one upstream reach. Monte Carlo simulation analysis is used to quantify this increase in uncertainty associated with the propagation of uncertainties from upstream reaches to downstream reaches. To eliminate the need for Monte Carlo simulations for routine calculations, polynomial regression equations are developed to approximate the variation in uncertainties as a function of flow at the headwaters of St. Clair River. Finally, monthly flow-duration data on the main channels of St. Clair and Detroit Rivers are used with the equations developed in this report to estimate the steady-state flow-duration characteristics of selected branches.
Saposnik, Gustavo; Johnston, S Claiborne
2016-04-01
Acute stroke care represents a challenge for decision makers. Decisions based on erroneous assessments may generate false expectations of patients and their family members, and potentially inappropriate medical advice. Game theory is the analysis of interactions between individuals to study how conflict and cooperation affect our decisions. We reviewed principles of game theory that could be applied to medical decisions under uncertainty. Medical decisions in acute stroke care are usually made under constraints: a short period of time, imperfect clinical information, and limited understanding of patients' and families' values and beliefs. Game theory brings some strategies to help us manage complex medical situations under uncertainty. For example, it offers a different perspective by encouraging the consideration of different alternatives through the understanding of patients' preferences and the careful evaluation of cognitive distortions when applying 'real-world' data. The stag-hunt game teaches us the importance of trust to strengthen cooperation for a successful patient-physician interaction that goes beyond a good or poor clinical outcome. The application of game theory to stroke care may improve our understanding of complex medical situations and help clinicians make practical decisions under uncertainty. © 2016 World Stroke Organization.
Uncertainty in simulating wheat yields under climate change
NASA Astrophysics Data System (ADS)
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P. J.; Rötter, R. P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P. K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, A. J.; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, R.; Heng, L.; Hooker, J.; Hunt, L. A.; Ingwersen, J.; Izaurralde, R. C.; Kersebaum, K. C.; Müller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.; Olesen, J. E.; Osborne, T. M.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M. A.; Shcherbak, I.; Steduto, P.; Stöckle, C.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J. W.; Williams, J. R.; Wolf, J.
2013-09-01
Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
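A toy variance decomposition (invented effect sizes, not the study's data) shows how the shares of projection uncertainty attributable to crop models versus downscaled GCMs can be compared:

```python
# Toy two-way decomposition of projection variance (all numbers invented).
import numpy as np

rng = np.random.default_rng(4)
n_crop, n_gcm = 25, 10
crop_eff = rng.normal(0, 12, n_crop)          # % yield change per crop model
gcm_eff = rng.normal(0, 6, n_gcm)             # % yield change per GCM
impact = (crop_eff[:, None] + gcm_eff[None, :]
          + rng.normal(0, 3, (n_crop, n_gcm)))  # interaction/noise

var_crop = impact.mean(axis=1).var(ddof=1)    # spread across crop models
var_gcm = impact.mean(axis=0).var(ddof=1)     # spread across GCMs
total = var_crop + var_gcm
print(f"crop-model share: {100*var_crop/total:.0f}%, "
      f"GCM share: {100*var_gcm/total:.0f}%")
```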
The fragmentation instability of a black hole with f(R) global monopole under GUP
NASA Astrophysics Data System (ADS)
Chen, Lingshen; Cheng, Hongbo
2018-03-01
We study the fragmentation of black holes containing an f(R) global monopole under the generalized uncertainty principle (GUP), showing the influences of this kind of monopole, the f(R) theory, and the GUP on the evolution of black holes. We focus on the possibility that the black hole breaks into two parts by means of the second law of thermodynamics. We derive the entropies of the initial black hole and the broken parts when the generalization of Heisenberg's uncertainty principle is introduced. We find that, without the generalization, the f(R) global monopole black hole remains stable instead of splitting, because the entropy difference is negative. The fragmentation of the black hole will happen if the black hole entropies are limited by the GUP, and a considerable deviation from general relativity leads to the case where the mass of one fragmented black hole is smaller and the other one's mass is larger.
NASA Astrophysics Data System (ADS)
Hill Clarvis, M.; Allan, A.; Hannah, D. M.
2013-12-01
Climate change has significant ramifications for water law and governance, yet there is strong evidence that legal regulations have often failed to protect environments or promote sustainable development. Scholars have increasingly suggested that the preservation and restoration paradigms of legislation and regulation are no longer adequate for climate change related challenges in complex and cross-scale social-ecological systems. This is due mainly to past assumptions of stationarity and uniformitarianism and the perception of ecosystem change as predictable and reversible. This paper reviews the literature on law and resilience and then presents and discusses a set of practical examples of legal mechanisms from the water resources management sector, identified according to a set of guiding principles from the literature on adaptive capacity, adaptive governance, and adaptive and integrated water resources management. It then assesses the aptness of these different measures in light of scientific evidence of increased uncertainty and changing ecological baselines. The review demonstrates that a number of best-practice examples attempt to integrate adaptive elements of flexibility, iterativity, connectivity and subsidiarity into a variety of legislative mechanisms, suggesting that the tension between resilience and the law is not as significant as many scholars have suggested. However, while many of the mechanisms may indeed be suitable for addressing challenges relating to current levels of change and uncertainty, analysis across a broader range of uncertainty highlights challenges relating to the more irreversible changes associated with greater levels of warming. Furthermore, the paper identifies a set of prerequisites that are fundamental to the successful implementation of such mechanisms, namely monitoring and data sharing and financial and technical capacity, particularly in nations that are most at risk and have the least data infrastructure. The article aims to contribute to both theory and practice, enabling policy makers to translate resilience-based terminology and adaptive governance principles into clear instructions for incorporating uncertainty into legislation and policy design.
Unifying decoherence and the Heisenberg Principle
NASA Astrophysics Data System (ADS)
Janssens, Bas
2017-08-01
We exhibit three inequalities involving quantum measurement, all of which are sharp and state independent. The first inequality bounds the performance of joint measurement. The second quantifies the trade-off between the measurement quality and the disturbance caused on the measured system. Finally, the third inequality provides a sharp lower bound on the amount of decoherence in terms of the measurement quality. This gives a unified description of both the Heisenberg uncertainty principle and the collapse of the wave function.
Demetrius, L
2000-09-07
The science of thermodynamics is concerned with understanding the properties of inanimate matter in so far as they are determined by changes in temperature. The Second Law asserts that in irreversible processes there is a uni-directional increase in thermodynamic entropy, a measure of the degree of uncertainty in the thermal energy state of a randomly chosen particle in the aggregate. The science of evolution is concerned with understanding the properties of populations of living matter in so far as they are regulated by changes in generation time. Directionality theory, a mathematical model of the evolutionary process, establishes that in populations subject to bounded growth constraints, there is a uni-directional increase in evolutionary entropy, a measure of the degree of uncertainty in the age of the immediate ancestor of a randomly chosen newborn. This article reviews the mathematical basis of directionality theory and analyses the relation between directionality theory and statistical thermodynamics. We exploit an analytic relation between temperature, and generation time, to show that the directionality principle for evolutionary entropy is a non-equilibrium extension of the principle of a uni-directional increase of thermodynamic entropy. The analytic relation between these directionality principles is consistent with the hypothesis of the equivalence of fundamental laws as one moves up the hierarchy, from a molecular ensemble where the thermodynamic laws apply, to a population of replicating entities (molecules, cells, higher organisms), where evolutionary principles prevail. Copyright 2000 Academic Press.
Mallucci, Patrick; Branford, Olivier Alexandre
2015-10-01
There are few objective analyses in the plastic surgical literature to define an aesthetically pleasing template for breast shape and proportion. The authors previously identified key objective parameters that define breast aesthetic ideals in 2 studies: an observational analysis of 100 models with natural breasts, and a population analysis with 1315 respondents. From these data a simple yet reproducible formula for surgical planning in breast augmentation has been developed to consistently achieve beautiful breasts, namely the ICE principle. This article proposes that this principle be used as the basis for design in aesthetic breast surgery. Copyright © 2015 Elsevier Inc. All rights reserved.
Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates
NASA Astrophysics Data System (ADS)
Grassi, Giacomo; Monni, Suvi; Federici, Sandro; Achard, Frederic; Mollicone, Danilo
2008-07-01
A common paradigm when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC) is that high uncertainties in input data—i.e., area change and C stock change/area—may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools—already existing in UNFCCC decisions and IPCC guidance documents—may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.
Is the Precautionary Principle Really Incoherent?
Boyer-Kassem, Thomas
2017-11-01
The Precautionary Principle has been an increasingly important principle in international treaties since the 1980s. Through varying formulations, it states that when an activity can lead to a catastrophe for human health or the environment, measures should be taken to prevent it even if the cause-and-effect relationship is not fully established scientifically. The Precautionary Principle has been critically discussed from many sides. This article concentrates on a theoretical argument by Peterson (2006) according to which the Precautionary Principle is incoherent with other desiderata of rational decision making, and thus cannot be used as a decision rule that selects an action among several ones. I claim here that Peterson's argument fails to establish the incoherence of the Precautionary Principle, by attacking three of its premises. I argue (i) that Peterson's treatment of uncertainties lacks generality, (ii) that his Archimedean condition is problematic for incommensurability reasons, and (iii) that his explication of the Precautionary Principle is not adequate. This leads me to conjecture that the Precautionary Principle can be envisaged as a coherent decision rule, again. © 2017 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Li, B.; Lee, H. C.; Duan, X.; Shen, C.; Zhou, L.; Jia, X.; Yang, M.
2017-09-01
The dual-energy CT-based (DECT) approach holds promise in reducing the overall uncertainty in proton stopping-power-ratio (SPR) estimation as compared to the conventional stoichiometric calibration approach. The objective of this study was to analyze the factors contributing to uncertainty in SPR estimation using the DECT-based approach and to derive a comprehensive estimate of the range uncertainty associated with SPR estimation in treatment planning. Two state-of-the-art DECT-based methods were selected and implemented on a Siemens SOMATOM Force DECT scanner. The uncertainties were first divided into five independent categories. The uncertainty associated with each category was estimated for lung, soft and bone tissues separately. A single composite uncertainty estimate was eventually determined for three tumor sites (lung, prostate and head-and-neck) by weighting the relative proportion of each tissue group for that specific site. The uncertainties associated with the two selected DECT methods were found to be similar; the following results therefore apply to both methods. The overall uncertainty (1σ) in SPR estimation with the DECT-based approach was estimated to be 3.8%, 1.2% and 2.0% for lung, soft and bone tissues, respectively. The dominant factor contributing to uncertainty in the DECT approach was the imaging uncertainties, followed by the DECT modeling uncertainties. Our study showed that the DECT approach can reduce the overall range uncertainty to approximately 2.2% (2σ) in clinical scenarios, in contrast to the previously reported 1%.
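A schematic composite-uncertainty calculation in the spirit of the study follows; the per-category values, the quadrature combination and the tissue weights are placeholders, and the paper's actual weighting scheme may differ:

```python
# Schematic composite SPR uncertainty (all values invented placeholders).
import numpy as np

# Per-tissue SPR uncertainty (1 sigma, %) from independent categories,
# combined in quadrature within each tissue group.
categories = {                       # columns: lung, soft, bone
    "imaging":  np.array([3.2, 0.9, 1.5]),
    "modeling": np.array([1.8, 0.6, 1.1]),
    "other":    np.array([0.8, 0.4, 0.6]),
}
per_tissue = np.sqrt(sum(u**2 for u in categories.values()))

# Weight by the relative proportion of each tissue group for a site.
weights = {"lung": [0.5, 0.4, 0.1], "prostate": [0.0, 0.8, 0.2]}
for site, w in weights.items():
    composite = np.sqrt(np.sum((np.array(w) * per_tissue) ** 2))
    print(f"{site}: composite SPR uncertainty ~ {composite:.2f}% (1 sigma)")
```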
Constrained sampling experiments reveal principles of detection in natural scenes.
Sebastian, Stephen; Abrams, Jared; Geisler, Wilson S
2017-07-11
A fundamental everyday visual task is to detect target objects within a background scene. Using relatively simple stimuli, vision science has identified several major factors that affect detection thresholds, including the luminance of the background, the contrast of the background, the spatial similarity of the background to the target, and uncertainty due to random variations in the properties of the background and in the amplitude of the target. Here we use an experimental approach based on constrained sampling from multidimensional histograms of natural stimuli, together with a theoretical analysis based on signal detection theory, to discover how these factors affect detection in natural scenes. We sorted a large collection of natural image backgrounds into multidimensional histograms, where each bin corresponds to a particular luminance, contrast, and similarity. Detection thresholds were measured for a subset of bins spanning the space, where a natural background was randomly sampled from a bin on each trial. In low-uncertainty conditions, both the background bin and the amplitude of the target were fixed, and, in high-uncertainty conditions, they varied randomly on each trial. We found that thresholds increase approximately linearly along all three dimensions and that detection accuracy is unaffected by background bin and target amplitude uncertainty. The results are predicted from first principles by a normalized matched-template detector, where the dynamic normalizing gain factor follows directly from the statistical properties of the natural backgrounds. The results provide an explanation for classic laws of psychophysics and their underlying neural mechanisms.
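A minimal sketch of a normalized matched-template detector follows; this is our construction for illustration, and the authors' gain model and parameters may differ:

```python
# Sketch: template response divided by a gain that grows with background
# luminance (L), contrast (C) and template similarity (S). Invented gains.
import numpy as np

def detect(patch, template, k=(1.0, 1.0, 1.0)):
    L = patch.mean()                                  # background luminance
    C = patch.std() / max(L, 1e-9)                    # RMS contrast
    t = template - template.mean()                    # zero-mean template
    S = abs((patch * t).sum()) / (np.linalg.norm(patch) * np.linalg.norm(t) + 1e-9)
    gain = k[0] * L + k[1] * C + k[2] * S             # normalizing gain factor
    return (patch * t).sum() / max(gain, 1e-9)        # normalized response

rng = np.random.default_rng(5)
template = np.outer(np.hanning(16), np.hanning(16))   # stand-in target
bg = rng.normal(50, 2, (16, 16))                      # stand-in background
r_absent = detect(bg, template)
r_present = detect(bg + 10.0 * template, template)
print(f"response absent {r_absent:.3f} vs present {r_present:.3f}")
```

Thresholding this normalized response reproduces, qualitatively, the approximately linear threshold growth along the luminance, contrast and similarity dimensions described above.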
Uncertainty in Bohr's response to the Heisenberg microscope
NASA Astrophysics Data System (ADS)
Tanona, Scott
2004-09-01
In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.
Lineal energy calibration of mini tissue-equivalent gas-proportional counters (TEPC)
NASA Astrophysics Data System (ADS)
Conte, V.; Moro, D.; Grosswendt, B.; Colautti, P.
2013-07-01
Mini TEPCs are cylindrical gas proportional counters with a sensitive-volume diameter of 1 mm or less. The lineal energy calibration of these tiny counters can be performed with an external gamma-ray source. However, to do that, a method to obtain a simple and precise spectral mark has to be found first, and then the keV/μm value of this mark determined. A precise method (less than 1% uncertainty) to identify this mark is described here, and the lineal energy value of the mark has been measured for different simulated site sizes by using a 137Cs gamma source and a cylindrical TEPC equipped with a precision internal 244Cm alpha-particle source and filled with a propane-based tissue-equivalent gas mixture. Mini TEPCs can be calibrated in terms of lineal energy, by exposing them to 137Cs sources, with an overall uncertainty of about 5%.
Adaptive approaches to biosecurity governance.
Cook, David C; Liu, Shuang; Murphy, Brendan; Lonsdale, W Mark
2010-09-01
This article discusses institutional changes that may facilitate an adaptive approach to biosecurity risk management where governance is viewed as a multidisciplinary, interactive experiment acknowledging uncertainty. Using the principles of adaptive governance, evolved from institutional theory, we explore how the concepts of lateral information flows, incentive alignment, and policy experimentation might shape Australia's invasive species defense mechanisms. We suggest design principles for biosecurity policies emphasizing overlapping complementary response capabilities and the sharing of invasive species risks via a polycentric system of governance. © 2010 Society for Risk Analysis
Scenario-based fitted Q-iteration for adaptive control of water reservoir systems under uncertainty
NASA Astrophysics Data System (ADS)
Bertoni, Federica; Giuliani, Matteo; Castelletti, Andrea
2017-04-01
Over recent years, mathematical models have been widely used to support the planning and management of water resources systems. Yet the increasing uncertainties in their inputs, due to increased variability in hydrological regimes, are a major challenge to the optimal operation of these systems. Such uncertainty, boosted by the projected changing climate, violates the stationarity principle generally used for describing hydro-meteorological processes, which assumes time-persistent statistical characteristics of a given variable as inferred from historical data. As this principle is unlikely to be valid in the future, the probability density function used for modeling stochastic disturbances (e.g., inflows) becomes an additional uncertain parameter of the problem, which can be described in a deterministic, set-membership fashion. This study contributes a novel method for designing optimal, adaptive policies for controlling water reservoir systems under climate-related uncertainty. The proposed method, called scenario-based Fitted Q-Iteration (sFQI), extends the original Fitted Q-Iteration algorithm by enlarging the state space to include the space of the uncertain system parameters (i.e., the uncertain climate scenarios). As a result, sFQI embeds the set-membership uncertainty of the future inflow scenarios in the action-value function and is able to approximate, with a single learning process, the optimal control policy associated with any scenario included in the uncertainty set. The method is demonstrated on a synthetic water system, consisting of a regulated lake operated for ensuring reliable water supply to downstream users. Numerical results show that the sFQI algorithm successfully identifies adaptive solutions to operate the system under different inflow scenarios, which outperform the control policy designed under historical conditions. Moreover, the sFQI policy generalizes over inflow scenarios not directly experienced during the policy design, thus alleviating the risk of mis-adaptation, namely the design of a solution fully adapted to a scenario that is different from the one that actually occurs.
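A compact sketch of the sFQI idea, with the scenario parameter appended to the state so that a single action-value function covers the whole uncertainty set; the reservoir dynamics, reward and scenario set below are toy stand-ins, not the authors' system:

```python
# Toy scenario-augmented fitted Q-iteration (sFQI) sketch.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.default_rng(6)
actions = np.linspace(0.0, 1.0, 5)          # release fractions
scenarios = [0.8, 1.0, 1.2]                 # inflow multipliers (uncertainty set)

def step(s, a, theta):
    inflow = theta * rng.uniform(0.5, 1.5)
    s2 = np.clip(s - a * s + inflow, 0.0, 10.0)
    reward = -abs(a * s - 1.0)              # penalize deviation from demand
    return s2, reward

# Collect transitions over the enlarged state (storage, scenario).
D = [(s, th, a, *step(s, a, th))
     for s in np.linspace(0, 10, 30) for th in scenarios
     for a in actions for _ in range(5)]
X = np.array([(s, th, a) for s, th, a, *_ in D])
S2 = np.array([(s2, th) for s, th, a, s2, r in D])
R = np.array([r for *_, r in D])

Q = None
for _ in range(30):                         # FQI iterations
    if Q is None:
        y = R
    else:
        q_next = np.max([Q.predict(np.column_stack([S2, np.full(len(S2), a)]))
                         for a in actions], axis=0)
        y = R + 0.95 * q_next
    Q = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, y)

# The learned policy adapts to whichever scenario component is in the state.
s, th = 5.0, 1.2
best = actions[np.argmax([Q.predict([[s, th, a]])[0] for a in actions])]
print(f"best release fraction at storage {s}, scenario {th}: {best}")
```

Because the scenario multiplier is just another state feature, one regression pass learns a policy conditioned on every scenario in the set, which is the generalization property claimed above.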
Quantum Mechanics predicts evolutionary biology.
Torday, J S
2018-07-01
Nowhere are the shortcomings of conventional descriptive biology more evident than in the literature on Quantum Biology. In the ongoing effort to apply Quantum Mechanics to evolutionary biology, merging Quantum Mechanics with the fundamentals of evolution as the First Principles of Physiology (namely negentropy, chemiosmosis and homeostasis) offers an authentic opportunity to understand how and why physics constitutes the basic principles of biology. Negentropy and chemiosmosis confer determinism on the unicell, whereas homeostasis constitutes Free Will because it offers a probabilistic range of physiologic set points. Similarly, on this basis several principles of Quantum Mechanics also apply directly to biology. The Pauli Exclusion Principle is both deterministic and probabilistic, whereas non-localization and the Heisenberg Uncertainty Principle are both probabilistic, providing the long-sought-after ontologic and causal continuum from physics to biology and evolution as the holistic integration recognized as consciousness for the first time. Copyright © 2018 Elsevier Ltd. All rights reserved.
Uncertainty Analysis of Thermal Comfort Parameters
NASA Astrophysics Data System (ADS)
Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages
2015-08-01
International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
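A generic Monte Carlo propagation of sensor uncertainties through the two indices can be sketched as follows; the `pmv` function below is a crude placeholder rather than the ISO 7730 heat-balance expression, while the PPD formula is the standard one:

```python
# Monte Carlo propagation of input uncertainties through PMV/PPD.
# NOTE: pmv() is a rough placeholder, NOT the ISO 7730 expression;
# substitute the full standard formula in practice.
import numpy as np

def pmv(ta, tr, vel, rh, met=1.2, clo=0.5):
    # placeholder comfort index, roughly neutral near 24 C operative temp
    t_op = 0.5 * (ta + tr)
    return 0.25 * (t_op - 24.0) - 0.5 * np.sqrt(np.maximum(vel, 0)) \
           + 0.002 * (rh - 50)

def ppd(pmv_val):
    # ISO 7730: PPD depends only on PMV
    return 100 - 95 * np.exp(-0.03353 * pmv_val**4 - 0.2179 * pmv_val**2)

rng = np.random.default_rng(7)
n = 100_000
ta = rng.normal(23.0, 0.2, n)     # air temperature, +/-0.2 C (1 sigma)
tr = rng.normal(23.5, 0.3, n)     # mean radiant temperature
vel = rng.normal(0.10, 0.02, n)   # air speed
rh = rng.normal(50.0, 3.0, n)     # relative humidity

pmv_s = pmv(ta, tr, vel, rh)
print(f"PMV = {pmv_s.mean():+.2f} +/- {pmv_s.std():.2f}")
print(f"PPD = {ppd(pmv_s).mean():.1f}% +/- {ppd(pmv_s).std():.1f}%")
```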
Towards traceability in CO2 line strength measurements by TDLAS at 2.7 µm
NASA Astrophysics Data System (ADS)
Pogány, Andrea; Ott, Oliver; Werhahn, Olav; Ebert, Volker
2013-11-01
Direct tunable diode laser absorption spectroscopy (TDLAS) was combined in this study with metrological principles on the determination of uncertainties to measure the line strengths of the P36e and P34e lines of 12C16O2 in the ν1+ν3 band at 2.7 μm. Special emphasis was put on traceability and a concise, well-documented uncertainty assessment. We have quantitatively analyzed the uncertainty contributions of different experimental parameters to the uncertainty of the line strength. Establishment of the wavenumber axis and the gas handling procedure proved to be the two major contributors to the final uncertainty. The obtained line strengths at 296 K are 1.593×10-20 cm/molecule for the P36e and 1.981×10-20 cm/molecule for the P34e line, with relative expanded uncertainties of 1.1% and 1.3%, respectively (k=2, corresponding to a 95% confidence level). The measured line strength values are in agreement with literature data (line strengths listed in the HITRAN and GEISA databases) but show an uncertainty that is at least a factor of 2 lower.
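The underlying relation and a GUM-style uncertainty combination can be sketched as follows; the input values and budget entries are invented, chosen only to be of a plausible order:

```python
# Sketch of the line-strength relation S = A / (N * L), with A the
# integrated absorbance, N the number density and L the path length.
# All numerical values below are invented placeholders.
import numpy as np

A = 0.20            # integrated absorbance, cm^-1 (area under the line)
L = 50.0            # absorption path length, cm
p, T = 101325.0, 296.0
kB = 1.380649e-23
N = p / (kB * T) * 1e-6 * 0.01      # molecules/cm^3 for a 1% CO2 mixture

S = A / (N * L)
# Relative standard uncertainties of the inputs (illustrative budget).
u_rel = {"area fit": 0.003, "wavenumber axis": 0.004, "gas handling": 0.004,
         "pressure": 0.001, "temperature": 0.001, "path length": 0.001}
u_S = S * np.sqrt(sum(u**2 for u in u_rel.values()))
print(f"S = {S:.3e} cm/molecule, U(k=2) = {100*2*u_S/S:.1f}%")
```

Summing the squared relative contributions makes it immediately visible that the wavenumber axis and gas handling terms dominate the combined uncertainty, as reported above.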
On Entropy Production in the Madelung Fluid and the Role of Bohm's Potential in Classical Diffusion
NASA Astrophysics Data System (ADS)
Heifetz, Eyal; Tsekov, Roumen; Cohen, Eliahu; Nussinov, Zohar
2016-07-01
The Madelung equations map the non-relativistic time-dependent Schrödinger equation into hydrodynamic equations of a virtual fluid. While the von Neumann entropy remains constant, we demonstrate that an increase of the Shannon entropy, associated with this Madelung fluid, is proportional to the expectation value of its velocity divergence. Hence, the Shannon entropy may grow (or decrease) due to an expansion (or compression) of the Madelung fluid. These effects result from the interference between solutions of the Schrödinger equation. Growth of the Shannon entropy due to expansion is common in diffusive processes. However, in the latter the process is irreversible, while the processes in the Madelung fluid are always reversible. The relations between interference, compressibility and variation of the Shannon entropy are then examined in several simple examples. Furthermore, we demonstrate that for classical diffusive processes, the "force" accelerating diffusion has the form of the positive gradient of the quantum Bohm potential. Expressing the diffusion coefficient in terms of the Planck constant then reveals the lower bound, given by the Heisenberg uncertainty principle, on the product of the gas mean free path and the Brownian momentum.
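In compact form (our notation, assuming the standard Madelung velocity field u), the two relations highlighted in this abstract read:

```latex
\frac{\mathrm{d}S}{\mathrm{d}t} \;\propto\; \langle \nabla\!\cdot\mathbf{u} \rangle ,
\qquad
\ell\, p_{B} \;\gtrsim\; \frac{\hbar}{2},
```

with S the Shannon entropy of the Madelung fluid, ℓ the gas mean free path and p_B the Brownian momentum.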
Research on the method of establishing the total radiation meter calibration device
NASA Astrophysics Data System (ADS)
Gao, Jianqiang; Xia, Ming; Xia, Junwen; Zhang, Dong
2015-10-01
A pyranometer is an instrument used to measure solar radiation; depending on how it is installed, it can measure total solar radiation, reflected radiation, or, with the help of a shading device, scattered radiation. It operates on the thermoelectric effect: the sensing element is a wire-wound, plated, multi-junction thermopile whose surface carries a black coating of high absorptance. The hot junctions lie in the sensing surface while the cold junctions are located in the body, and the temperature difference between them produces a thermoelectric potential. In the linear range, the output signal is proportional to the solar irradiance. As a nationally authorized legal metrology organization, the national meteorological station is responsible for disseminating traceable values of solar and terrestrial radiation to the national meteorological industry. The calibration uses a comparison method with an indoor solar simulator: at the same location, the standard pyranometer and the pyranometer under test alternately measure the irradiance, and the radiation sensitivity of the test instrument is calculated from the known sensitivity of the standard. This paper mainly describes the design of the indoor calibration device and the calibration method; the uncertainty of the calibration result is also evaluated.
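A sketch of the comparison calibration in code (readings and sensitivities invented):

```python
# Comparison calibration: alternate readings under the solar simulator give
# the test pyranometer's sensitivity from the standard's known sensitivity.
import numpy as np

S_ref = 9.60e-6        # standard pyranometer sensitivity, V/(W/m^2)
V_ref = np.array([9.51e-3, 9.55e-3, 9.49e-3])   # standard readings, V
V_test = np.array([8.02e-3, 8.05e-3, 7.99e-3])  # test readings, V

E = V_ref.mean() / S_ref                 # simulator irradiance, W/m^2
S_test = V_test.mean() / E               # test sensitivity, V/(W/m^2)
u_rel = np.sqrt((V_ref.std(ddof=1) / V_ref.mean())**2 +
                (V_test.std(ddof=1) / V_test.mean())**2)
print(f"E ~ {E:.0f} W/m^2, S_test = {S_test:.3e} V/(W/m^2) +/- {100*u_rel:.2f}%")
```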
Probabilistic Risk Assessment to Inform Decision Making: Frequently Asked Questions
This document presents general concepts and principles of Probabilistic Risk Assessment (PRA), describes how PRA can improve the bases of Agency decisions, and provides illustrations of how PRA has been used in risk estimation and in describing the uncertainty in decision making.
The precautionary principle and pharmaceutical risk management.
Callréus, Torbjörn
2005-01-01
Although it is often vigorously contested and has several different formulations, the precautionary principle has in recent decades guided environmental policy making in the face of scientific uncertainty. Originating from a criticism of traditional risk assessment, the key element of the precautionary principle is the justification for acting in the face of uncertain knowledge about risks. In the light of its growing invocation in various areas that are related to public health and recently in relation to drug safety issues, this article presents an introductory review of the main elements of the precautionary principle and some arguments conveyed by its advocates and opponents. A comparison of the characteristics of pharmaceutical risk management and environmental policy making (i.e. the setting within which the precautionary principle evolved), indicates that several important differences exist. If believed to be of relevance, in order to avoid arbitrary and unpredictable decision making, both the interpretation and possible application of the precautionary principle need to be adapted to the conditions of pharmaceutical risk management.
Application of fuzzy system theory in addressing the presence of uncertainties
NASA Astrophysics Data System (ADS)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.
2015-02-01
In this paper, the combination of fuzzy system theory with the finite element method is presented and discussed as a way to deal with uncertainties. Uncertainties need to be addressed in order to prevent the failure of materials in engineering. Three types of uncertainty can be distinguished: stochastic, epistemic and error uncertainties. This paper considers epistemic uncertainty, which arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are scarce. A fuzzy analysis comprises several steps: the crisp inputs are first converted to fuzzy inputs through a fuzzification process, followed by the main step, known as the mapping process, which establishes the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented; defuzzification is the important step that converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method. A minimal numerical sketch of this chain is given below.
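The sketch below walks the fuzzification, extension-principle mapping, and centroid-defuzzification chain, assuming a triangular fuzzy input and a monotonic stand-in response function in place of a real finite-element solve; all names and values are hypothetical.

```python
import numpy as np

def alpha_cut(a, b, c, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

def response(x):
    """Stand-in for a finite-element response quantity, e.g. a tip deflection."""
    return 0.05 * x + 2e-4 * x**2

alphas = np.linspace(0.0, 1.0, 11)
# Fuzzy input: a material-parameter-like quantity with support [90, 110], peak 100.
cuts = [alpha_cut(90.0, 100.0, 110.0, a) for a in alphas]
# Extension principle for a monotonic map: transform the interval endpoints.
out_cuts = [(response(lo), response(hi)) for lo, hi in cuts]

# Centroid defuzzification from the stacked alpha-cuts (coarse quadrature).
mids = np.array([0.5 * (lo + hi) for lo, hi in out_cuts])
widths = np.array([hi - lo for lo, hi in out_cuts])
centroid = np.sum(mids * widths) / np.sum(widths)
print(f"defuzzified response: {centroid:.4f}")
```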
Towards Statistically Undetectable Steganography
2011-06-30
[Fragmentary report excerpt: a figure caption comparing detectability for payload sizes proportional to √N and to N for LSB replacement steganography in never-compressed cover images, followed by a truncated publication list including J. Fridrich, Steganography in Digital Media: Principles, Algorithms, and Applications, Cambridge University Press, November 2009; an article "...Images for Applications in Steganography," IEEE Trans. on Info. Forensics and Security, vol. 3(2), pp. 247-258, 2008; and conference papers beginning with T. Filler.]
Uncertainty in quantum mechanics: faith or fantasy?
Penrose, Roger
2011-12-13
The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.
Fundamental uncertainty limit for speckle displacement measurements.
Fischer, Andreas
2017-09-01
The basic metrological task in speckle photography is to quantify displacements of speckle patterns, allowing for instance the investigation of the mechanical load and modification of objects with rough surfaces. However, the fundamental limit of the measurement uncertainty due to photon shot noise is unknown. For this reason, the Cramér-Rao bound (CRB) is derived for speckle displacement measurements, representing the squared minimal achievable measurement uncertainty. As a result, the CRB for speckle patterns is only two times the CRB for an ideal point light source. Hence, speckle photography is an optimal measurement approach for contactless displacement measurements on rough surfaces. In agreement with a derivation from Heisenberg's uncertainty principle, the CRB depends on the number of detected photons and the diffraction limit of the imaging system described by the speckle size. The theoretical results are verified and validated, demonstrating the capability for displacement measurements with nanometer resolution.
Norman, Rosana; Barnes, Brendon; Mathee, Angela; Bradshaw, Debbie
2007-08-01
To estimate the burden of respiratory ill health in South African children and adults in 2000 from exposure to indoor air pollution associated with household use of solid fuels. World Health Organization comparative risk assessment (CRA) methodology was followed. The South African Census 2001 was used to derive the proportion of households using solid fuels for cooking and heating by population group. Exposure estimates were adjusted by a ventilation factor taking into account the general level of ventilation in the households. Population-attributable fractions were calculated and applied to revised burden of disease estimates for each population group. Monte Carlo simulation-modelling techniques were used for uncertainty analysis. South Africa. Black African, coloured, white and Indian children under 5 years of age and adults aged 30 years and older. Mortality and disability-adjusted life years (DALYs) from acute lower respiratory infections in children under 5 years, and chronic obstructive pulmonary disease and lung cancer in adults 30 years and older. An estimated 20% of South African households were exposed to indoor smoke from solid fuels, with marked variation by population group. This exposure was estimated to have caused 2,489 deaths (95% uncertainty interval 1,672 - 3,324) or 0.5% (95% uncertainty interval 0.3 - 0.6%) of all deaths in South Africa in 2000. The loss of healthy life years comprised a slightly smaller proportion of the total: 60,934 DALYs (95% uncertainty interval 41,170 - 81,246) or 0.4% of all DALYs (95% uncertainty interval 0.3 - 0.5%) in South Africa in 2000. Almost 99% of this burden occurred in the black African population. The most important interventions to reduce this impact include access to cleaner household fuels, improved stoves, and better ventilation.
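For readers unfamiliar with the comparative-risk-assessment arithmetic, a toy sketch follows; the exposure prevalence echoes the 20% reported above, but the relative risk and burden figures are hypothetical stand-ins, and the attributable-fraction formula is the standard Levin form.

```python
# Sketch of the population-attributable-fraction (PAF) arithmetic used in
# CRA studies like the one above. RR and the burden figure are hypothetical.

p = 0.20      # proportion of households exposed to solid-fuel smoke
RR = 2.3      # illustrative relative risk for the exposed group

PAF = p * (RR - 1.0) / (p * (RR - 1.0) + 1.0)   # Levin's formula
deaths_total = 52000                             # hypothetical cause-specific deaths
print(f"PAF = {PAF:.1%}")
print(f"attributable deaths ~ {PAF * deaths_total:.0f}")
```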
NASA Astrophysics Data System (ADS)
Irby, Victor D.
2004-09-01
The concept and subsequent experimental verification of the proportionality between pulse amplitude and detector transit time for microchannel-plate detectors is presented. This discovery has led to considerable improvement in the overall timing resolution for the detection of high-energy γ-photons. Utilizing a 22Na positron source, a full width at half maximum (FWHM) timing resolution of 138 ps has been achieved. This FWHM includes the detector transit-time spread for both chevron-stack-type detectors, the timing spread due to uncertainties in annihilation location, all electronic uncertainty and any remaining quantum mechanical uncertainty. The first measurement of the minimum quantum uncertainty in the time interval between detection of the two annihilation photons is reported. The experimental results give strong evidence against instantaneous spatial localization of γ-photons due to measurement-induced nonlocal quantum wavefunction collapse. The experimental results are also the first that imply momentum is conserved only after the quantum uncertainty in time has elapsed (Yukawa H 1935 Proc. Phys. Math. Soc. Japan 17 48).
Experiences of Uncertainty in Men With an Elevated PSA.
Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather
2015-05-15
A significant proportion of men aged 50 to 70 years have received, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting the informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.
The actual content of quantum theoretical kinematics and mechanics
NASA Technical Reports Server (NTRS)
Heisenberg, W.
1983-01-01
First, exact definitions are supplied for the terms: position, velocity, energy, etc. (of the electron, for instance), such that they are valid also in quantum mechanics. Canonically conjugated variables are determined simultaneously only with a characteristic uncertainty. This uncertainty is the intrinsic reason for the occurrence of statistical relations in quantum mechanics. Mathematical formulation is made possible by the Dirac-Jordan theory. Beginning from the basic principles thus obtained, macroscopic processes are understood from the viewpoint of quantum mechanics. Several imaginary experiments are discussed to elucidate the theory.
Mass Uncertainty and Application For Space Systems
NASA Technical Reports Server (NTRS)
Beech, Geoffrey
2013-01-01
Expected development maturity under contract (spec) should correlate with the Project/Program-approved MGA depletion schedule in the Mass Properties Control Plan. If the specification is an NTE (not-to-exceed) value, MGA is inclusive of actual MGA (A5 & A6). If the specification is not an NTE value (e.g., nominal), then MGA values are reduced by the A5 values and A5 represents the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass × assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic. A worked example of this bookkeeping follows.
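A minimal numeric rendering of the relations above, with hypothetical item masses and MGA percentages:

```python
# Worked example of the mass bookkeeping defined above (values hypothetical).
# Basic mass carries no embedded margin; MGA is applied per the approved
# depletion schedule; Predicted = Basic + MGA.

items = {
    # item: (basic_mass_kg, assessed MGA fraction from the schedule)
    "structure": (120.0, 0.12),
    "avionics":  ( 45.0, 0.20),
    "harness":   ( 18.0, 0.30),
}

predicted = {k: b * (1.0 + g) for k, (b, g) in items.items()}
agg_basic = sum(b for b, _ in items.values())
agg_pred = sum(predicted.values())

# Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic
agg_mga_pct = (agg_pred - agg_basic) / agg_basic
print(f"aggregate basic: {agg_basic:.1f} kg, predicted: {agg_pred:.1f} kg")
print(f"aggregate MGA: {agg_mga_pct:.1%}")
```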
NASA Astrophysics Data System (ADS)
van den Hoek, Ronald; Brugnach, Marcela; Hoekstra, Arjen
2013-04-01
In the 20th century, flood management was dominated by rigid structures - such as dikes and dams - intended to strictly regulate and control water systems. Although the application of these rigid structures has been successful in the recent past, their negative implications for ecosystems and natural processes are often not properly taken into account. Therefore, flood management practices are currently moving towards more nature-inclusive approaches. Building with Nature (BwN) is such a new approach of nature-inclusive flood management in the Netherlands, which aims to utilize natural dynamics (e.g., wind and currents) and natural materials (e.g., sediment and vegetation) for the realization of effective flood infrastructure, while providing opportunities for nature development. However, the natural dynamics driving a project based on BwN design principles are inherently unpredictable. Furthermore, our factual knowledge base regarding the socio-ecological system in which a BwN initiative is implemented is incomplete. Moreover, in recent years decision-makers have increasingly aimed to involve local stakeholders in the development of promising flood management initiatives. These stakeholders and other actors involved can have diverging views regarding the project, can perceive unanticipated implications and can choose unforeseen action paths. In short, while a project based on BwN design principles - like any human intervention - definitely has implications for the socio-ecological system, both the extent to which these particular implications will occur and the response of stakeholders are highly uncertain. In this paper, we study the Safety Buffer Oyster Dam case - a BwN pilot project - and address the interplay between the project's implications, the uncertainties regarding these implications and the action paths chosen by the local stakeholders and project team. We determine how the implications of the Safety Buffer project are viewed by local stakeholders, identify the frames and uncertainties related to these implications, and classify these uncertainties according to their nature and level. We describe which action paths are chosen by the local stakeholders and project team regarding the implications identified. Our research shows that there is a correspondence between the level of uncertainty about the implications identified and the action paths chosen by the actors involved. This suggests that the inherent deep uncertainty in projects based on BwN principles calls for more adaptable and flexible strategies to cope with the implications of these initiatives.
Uncertainty in Simulating Wheat Yields Under Climate Change
NASA Technical Reports Server (NTRS)
Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.;
2013-01-01
Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
Robert Frost and the Poetry of Physics.
ERIC Educational Resources Information Center
Coletta, W. John; Tamres, David H.
1992-01-01
Examines five poems by Robert Frost that illustrate Frost's interest in science. The poems include allusions to renowned physicists, metaphoric descriptions of some famous physics experiments, explorations of complementarity as enunciated by Bohr, and poetic formulations of Heisenberg's uncertainty principle. (20 references) (MDH)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, Tom; Croft, Stephen; Jarman, Kenneth D.
The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over the GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
Research implications of science-informed, value-based decision making.
Dowie, Jack
2004-01-01
In 'Hard' science, scientists correctly operate as the 'guardians of certainty', using hypothesis testing formulations and value judgements about error rates and time discounting that make classical inferential methods appropriate. But these methods can neither generate most of the inputs needed by decision makers in their time frame, nor generate them in a form that allows them to be integrated into the decision in an analytically coherent and transparent way. The need for transparent accountability in public decision making under uncertainty and value conflict means the analytical coherence provided by the stochastic Bayesian decision analytic approach, drawing on the outputs of Bayesian science, is needed. If scientific researchers are to play the role they should be playing in informing value-based decision making, they need to see themselves also as 'guardians of uncertainty', ensuring that the best possible current posterior distributions on relevant parameters are made available for decision making, irrespective of the state of the certainty-seeking research. The paper distinguishes the actors employing different technologies in terms of the focus of the technology (knowledge, values, choice); the 'home base' mode of their activity on the cognitive continuum of varying analysis-to-intuition ratios; and the underlying value judgements of the activity (especially error loss functions and time discount rates). Those who propose any principle of decision making other than the banal 'Best Principle', including the 'Precautionary Principle', are properly interpreted as advocates seeking to have their own value judgements and preferences regarding mode location apply. The task for accountable decision makers, and their supporting technologists, is to determine the best course of action under the universal conditions of uncertainty and value difference/conflict.
Cosmological horizons, uncertainty principle, and maximum length quantum mechanics
NASA Astrophysics Data System (ADS)
Perivolaropoulos, L.
2017-05-01
The cosmological particle horizon is the maximum measurable length in the Universe. The existence of such a maximum observable length scale implies a modification of the quantum uncertainty principle. Thus, due to the nonlocality of quantum mechanics, the global properties of the Universe could produce a signature on the behavior of local quantum systems. A generalized uncertainty principle (GUP) that is consistent with the existence of such a maximum observable length scale l_max is Δx Δp ≥ (ℏ/2) · 1/(1 - α Δx²), where α = l_max⁻² ≃ (H0/c)² (H0 is the Hubble parameter and c is the speed of light). In addition to the existence of a maximum measurable length l_max = 1/√α, this form of GUP also implies the existence of a minimum measurable momentum p_min = (3√3/4) ℏ√α. Using an appropriate representation of the position and momentum quantum operators, we show that the spectrum of the one-dimensional harmonic oscillator becomes Ē_n = 2n + 1 + λ_n ᾱ, where Ē_n ≡ 2E_n/(ℏω) is the dimensionless, properly normalized n-th energy level, ᾱ is a dimensionless parameter with ᾱ ≡ αℏ/(mω), and λ_n ∼ n² for n ≫ 1 (we show the full form of λ_n in the text). For a typical vibrating diatomic molecule and l_max = c/H0, we find ᾱ ∼ 10⁻⁷⁷, and therefore for such a system this effect is beyond the reach of current experiments. However, this effect could be more important in the early Universe and could produce signatures in the primordial perturbation spectrum induced by quantum fluctuations of the inflaton field.
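As a rough sanity check on the vanishing magnitude quoted above, the sketch below evaluates ᾱ = αℏ/(mω) for assumed CO-like molecular constants and a standard value of H0; these numbers give ᾱ of order 10⁻⁷⁵, the same utterly negligible scale (the abstract's 10⁻⁷⁷ presumably reflects a different choice of molecular constants).

```python
# Order-of-magnitude check of the dimensionless GUP parameter
# alpha_bar = alpha * hbar / (m * omega) quoted above, for a CO-like
# diatomic molecule (molecular constants approximate and assumed).

hbar = 1.055e-34   # J s
c = 2.998e8        # m/s
H0 = 2.2e-18       # Hubble parameter, 1/s (~67.8 km/s/Mpc)

alpha = (H0 / c) ** 2   # = 1 / l_max^2, with l_max = c / H0
m = 1.14e-26            # reduced mass of CO, kg
omega = 4.1e14          # vibrational angular frequency of CO, rad/s

alpha_bar = alpha * hbar / (m * omega)
print(f"l_max ~ {c / H0:.2e} m")
print(f"alpha_bar ~ {alpha_bar:.1e}")   # ~1e-75 for these constants
```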
DOT National Transportation Integrated Search
2008-01-01
Unexpected delays due to traffic incidents represent a significant proportion of overall delay, especially in urban areas. The resulting uncertainty can represent major costs to businesses and travelers, as well as restrict employment opportunities. ...
NASA Astrophysics Data System (ADS)
Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2017-09-01
The uncertainty principle sets a lower bound on the measurement precision for a pair of non-commuting observables, and is hence highly nontrivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, are coupled through thermal entanglement. We show that the temperature and magnetic field can lead to an inflation of the measurement uncertainty, stemming from the reduction of the system's quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully determined by the observed quantum correlation of the system; secondly, the dynamical behaviors of the measurement uncertainty are distinctly different for ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
[Invariants of the anthropometrical proportions].
Smolianinov, V V
2012-01-01
This work offers a general interpretation of the modulor as a scale of segment proportions for anthropometric modules (the extremities and the body). The objectives of this study were: 1) to substantiate the idea of a growth modulor; 2) to use modern empirical data to validate the principle of linear similarity for anthropometric segments; 3) to specify the system of invariants for constitutional anthropometry.
Weak values, 'negative probability', and the uncertainty principle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokolovski, D.
2007-10-15
A quantum transition can be seen as a result of interference between various pathways (e.g., Feynman paths), which can be labeled by a variable f. An attempt to determine the value of f without destroying the coherence between the pathways produces a weak value of f. We show f to be an average obtained with an amplitude distribution which can, in general, take negative values, which, in accordance with the uncertainty principle, need not contain information about the actual range of f which contributes to the transition. It is also demonstrated that the moments of such alternating distributions have a number of unusual properties which may lead to a misinterpretation of the weak-measurement results. We provide a detailed analysis of weak measurements with and without post-selection. Examples include the double-slit diffraction experiment, weak von Neumann and von Neumann-like measurements, traversal time for an elastic collision, phase time, and local angular momentum.
"He loves me, he loves me not . . . ": uncertainty can increase romantic attraction.
Whitchurch, Erin R; Wilson, Timothy D; Gilbert, Daniel T
2011-02-01
This research qualifies a social psychological truism: that people like others who like them (the reciprocity principle). College women viewed the Facebook profiles of four male students who had previously seen their profiles. They were told that the men (a) liked them a lot, (b) liked them only an average amount, or (c) liked them either a lot or an average amount (uncertain condition). Comparison of the first two conditions yielded results consistent with the reciprocity principle. Participants were more attracted to men who liked them a lot than to men who liked them an average amount. Results for the uncertain condition, however, were consistent with research on the pleasures of uncertainty. Participants in the uncertain condition were most attracted to the men-even more attracted than were participants who were told that the men liked them a lot. Uncertain participants reported thinking about the men the most, and this increased their attraction toward the men.
NASA Astrophysics Data System (ADS)
Grenn, Michael W.
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. H_R may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on H_R, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in H_R and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimating ΔE.
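A toy illustration of the central quantity follows; it computes only the Shannon entropy of a requirements-by-quality-level histogram and does not reproduce the REF's quantum-statistics counting of the possibilities P (all values hypothetical).

```python
import math

# Toy illustration of the requirements-entropy idea: entropy of the
# distribution of R requirements across N quality levels. Counts are
# hypothetical; the REF's full machinery is not reproduced here.

def shannon_entropy(counts):
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

early = [40, 30, 15, 10, 5]   # most requirements at low quality levels
late = [1, 2, 3, 4, 90]       # most requirements near the desired state

print(f"early H_R = {shannon_entropy(early):.2f} bits")
print(f"late  H_R = {shannon_entropy(late):.2f} bits")  # lower: less uncertainty
```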
Exploration of quantum-memory-assisted entropic uncertainty relations in a noninertial frame
NASA Astrophysics Data System (ADS)
Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Shi, Jia-Dong; Ye, Liu
2017-05-01
The uncertainty principle offers a bound on the accuracy of the simultaneous measurement outcomes for two incompatible observables. In this letter, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) when the particle to be measured resides in an open system and another particle serves as quantum memory in a noninertial frame. In such a scenario, the collective influence of unital and nonunital noise environments, and of the relativistic motion of the system, on the QMA-EUR is examined. By numerical analysis, we conclude that, firstly, the noises and the Unruh effect can both increase the uncertainty, owing to the decoherence of the bipartite system induced by the noise or the Unruh effect; secondly, the uncertainty is more affected by the noises than by the Unruh effect of the acceleration; thirdly, unital noises can reduce the uncertainty in the long-time regime. We give a possible physical interpretation of these results: the information of interest is redistributed among the bipartite system, the noisy environment and the physically inaccessible region in the noninertial frame. Therefore, we claim that our observations provide an insight into the dynamics of the entropic uncertainty in a noninertial frame, and might be important to quantum precision measurement under relativistic motion.
Singularity of the time-energy uncertainty in adiabatic perturbation and cycloids on a Bloch sphere
Oh, Sangchul; Hu, Xuedong; Nori, Franco; Kais, Sabre
2016-01-01
Adiabatic perturbation is shown to be singular from the exact solution of a spin-1/2 particle in a uniformly rotating magnetic field. Due to a non-adiabatic effect, its quantum trajectory on a Bloch sphere is a cycloid traced by a circle rolling along an adiabatic path. As the magnetic field rotates more and more slowly, the time-energy uncertainty, proportional to the length of the quantum trajectory, calculated by the exact solution is entirely different from the one obtained by the adiabatic path traced by the instantaneous eigenstate. However, the non-adiabatic Aharonov-Anandan geometric phase, measured by the area enclosed by the exact path, approaches smoothly the adiabatic Berry phase, proportional to the area enclosed by the adiabatic path. The singular limit of the time-energy uncertainty and the regular limit of the geometric phase are associated with the arc length and arc area of the cycloid on a Bloch sphere, respectively. Prolate and curtate cycloids are also traced by different initial states outside and inside of the rolling circle, respectively. The axis trajectory of the rolling circle, parallel to the adiabatic path, is shown to be an example of transitionless driving. The non-adiabatic resonance is visualized by the number of cycloid arcs. PMID:26916031
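The proportionality invoked above between the time-energy uncertainty and the length of the quantum trajectory is the Anandan-Aharonov relation; for a two-level system it can be written (standard form, notation ours) as

```latex
\frac{ds}{dt} = \frac{2\,\Delta E(t)}{\hbar}
\qquad\Longrightarrow\qquad
\int \Delta E \, dt = \frac{\hbar}{2}\, s ,
```

where s is the arc length traced by the state on the Bloch sphere. The exact (cycloidal) path and the adiabatic path have different lengths however slowly the field rotates, which is the geometric origin of the singular limit described above.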
NASA Astrophysics Data System (ADS)
Parada, M.; Sbarbaro, D.; Borges, R. A.; Peres, P. L. D.
2017-01-01
The use of robust design techniques such as those based on H2 and H∞ for tuning proportional integral (PI) and proportional integral derivative (PID) controllers has been limited to a small set of processes. This work addresses the problem by considering a wide set of possible plants, both first- and second-order continuous-time systems with time delays and zeros, leading to PI and PID controllers. The use of structured uncertainties to handle neglected dynamics allows the range of processes considered to be expanded. The proposed approach takes into account the robustness of the controller with respect to these structured uncertainties by using the small-gain theorem. In addition, improved performance is sought through the minimisation of an upper bound on the closed-loop system H2 norm. A Lyapunov-Krasovskii-type functional is used to obtain delay-dependent design conditions. The controller design is accomplished by means of a convex optimisation procedure formulated using linear matrix inequalities. In order to illustrate the flexibility of the approach, several examples considering recycle compensation, reduced-order controller design and a practical implementation are addressed. Numerical experiments are provided in each case to highlight the main characteristics of the proposed design method.
Application of fuzzy system theory in addressing the presence of uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.
In this paper, the combinations of fuzzy system theory with the finite element methods are present and discuss to deal with the uncertainties. The present of uncertainties is needed to avoid for prevent the failure of the material in engineering. There are three types of uncertainties, which are stochastic, epistemic and error uncertainties. In this paper, the epistemic uncertainties have been considered. For the epistemic uncertainty, it exists as a result of incomplete information and lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and this method is most appropriate to interpret the uncertainty compared to statisticalmore » approach when the deal with the lack of data. Fuzzy system theory contains a number of processes started from converting the crisp input to fuzzy input through fuzzification process and followed by the main process known as mapping process. The term mapping here means that the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process to allow the conversion of the fuzzy output to crisp outputs. Several illustrative examples are given and from the simulation, the result showed that propose the method produces more conservative results comparing with the conventional finite element method.« less
NASA Astrophysics Data System (ADS)
Shababi, Homa; Chung, Won Sang
2018-04-01
In this paper, using the new type of D-dimensional nonperturbative Generalized Uncertainty Principle (GUP) that predicts both a minimal length uncertainty and a maximal observable momentum, we first obtain the maximally localized states and relate them to those of [P. Pedram, Phys. Lett. B 714, 317 (2012)]. Then, in the context of our proposed GUP and using the generalized Schrödinger equation, we solve some important problems including the particle in a box and the one-dimensional hydrogen atom. Next, applying modified Bohr-Sommerfeld quantization, we obtain the energy spectra of the quantum harmonic oscillator and the quantum bouncer. Finally, as an example, we investigate some statistical properties of a free particle, including the partition function and internal energy, in the presence of the mentioned GUP.
Scaling Up Decision Theoretic Planning to Planetary Rover Problems
NASA Technical Reports Server (NTRS)
Meuleau, Nicolas; Dearden, Richard; Washington, Rich
2004-01-01
Because of communication limits, planetary rovers must operate autonomously for extended durations. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are not able to address the challenges of future missions because of several apparent limits. On the other hand, decision theory provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure, concurrency, and continuous state variables. We describe two techniques currently under development that specifically address these issues and allow decision-theoretic solution techniques to scale up to planetary rover planning problems involving a small number of goals.
Reply to "Comment on 'Fractional quantum mechanics' and 'Fractional Schrödinger equation' ".
Laskin, Nick
2016-06-01
The fractional uncertainty relation is a mathematical formulation of Heisenberg's uncertainty principle in the framework of fractional quantum mechanics. Two mistaken statements presented in the Comment have been revealed. The origin of each mistaken statement has been clarified and corrected statements have been made. A map between standard quantum mechanics and fractional quantum mechanics has been presented to emphasize the features of fractional quantum mechanics and to avoid misinterpretations of the fractional uncertainty relation. It has been shown that the fractional probability current equation is correct in the area of its applicability. Further studies have to be done to find meaningful quantum physics problems with involvement of the fractional probability current density vector and the extra term emerging in the framework of fractional quantum mechanics.
The Introduction of Agility into Albania.
ERIC Educational Resources Information Center
Smith-Stevens, Eileen J.; Shkurti, Drita
1998-01-01
Describes a plan to introduce and achieve a national awareness of agility (and easy entry into the world market) for Albania through the relatively stable higher-education order. Agility's four strategic principles are enriching the customer, cooperating to enhance competitiveness, organizing to master change and uncertainty, and leveraging the…
Trends in Modern War Gaming: The Art of Conversation
2014-01-01
overtook, and in some cases replaced, these thought processes. As Jung aptly noted, "in the West, consciousness has been developed mainly through...Werner Heisenberg's uncertainty principle in 1927, successive generations of quantum theorists have moved well beyond anecdotal claims into the realm of
Local Telephone Costs and the Design of Rate Structures,
1981-05-01
guide the setting of prices for the multi-product regulated firm. Economic efficiency can be increased by designing rate structures that incorporate the... basic principles developed from this theory. These principles call for provisionally pricing each of the firm's outputs at its marginal cost, testing...rule--prices are increased above marginal costs in inverse proportion to the individual price elasticities of demand. This paper applies ratemaking
NASA Astrophysics Data System (ADS)
Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2018-04-01
The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound on predicting the outcomes of measurements of a pair of incompatible observables. In this work, we develop dynamical features of quantum-memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolution of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It has been found that a larger coupling strength J of the ferromagnetic ( J < 0 ) and antiferromagnetic ( J > 0 ) chains can effectively degrade the measurement uncertainty. Besides, it turns out that higher temperature can induce an inflation of the uncertainty because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field | B |, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. (written out below) and conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is dramatically associated with the uncertainty. Remarkably, we put forward a possible physical interpretation to explain the evolutionary phenomenon of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on the entropic uncertainty under the Heisenberg XXZ model and hence be of importance to quantum precision measurement over solid state-based quantum information processing.
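For reference, the initial bound of Berta et al. mentioned above takes the standard form

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2\frac{1}{c} + S(A|B),
\qquad
c \equiv \max_{i,j}\,\bigl|\langle x_i | z_j \rangle\bigr|^2 ,
```

where S(X|B) and S(Z|B) are the conditional von Neumann entropies of the measurement outcomes given the memory B, and c is the maximal overlap of the two measurement bases. A negative conditional entropy S(A|B), i.e. entanglement with the memory, tightens the bound, which is why stronger thermal entanglement in the chain lowers the measurement uncertainty.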
MICROSCOPE Mission: First Results of a Space Test of the Equivalence Principle.
Touboul, Pierre; Métris, Gilles; Rodrigues, Manuel; André, Yves; Baghi, Quentin; Bergé, Joël; Boulanger, Damien; Bremer, Stefanie; Carle, Patrice; Chhun, Ratana; Christophe, Bruno; Cipolla, Valerio; Damour, Thibault; Danto, Pascale; Dittus, Hansjoerg; Fayet, Pierre; Foulon, Bernard; Gageant, Claude; Guidotti, Pierre-Yves; Hagedorn, Daniel; Hardy, Emilie; Huynh, Phuong-Anh; Inchauspe, Henri; Kayser, Patrick; Lala, Stéphanie; Lämmerzahl, Claus; Lebat, Vincent; Leseur, Pierre; Liorzou, Françoise; List, Meike; Löffler, Frank; Panet, Isabelle; Pouilloux, Benjamin; Prieur, Pascal; Rebray, Alexandre; Reynaud, Serge; Rievers, Benny; Robert, Alain; Selig, Hanns; Serron, Laura; Sumner, Timothy; Tanguy, Nicolas; Visser, Pieter
2017-12-08
According to the weak equivalence principle, all bodies should fall at the same rate in a gravitational field. The MICROSCOPE satellite, launched in April 2016, aims to test its validity at the 10^{-15} precision level, by measuring the force required to maintain two test masses (of titanium and platinum alloys) exactly in the same orbit. A nonvanishing result would correspond to a violation of the equivalence principle, or to the discovery of a new long-range force. Analysis of the first data gives δ(Ti,Pt)=[-1±9(stat)±9(syst)]×10^{-15} (1σ statistical uncertainty) for the titanium-platinum Eötvös parameter characterizing the relative difference in their free-fall accelerations.
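To make the reported quantity concrete, a toy computation of the Eötvös parameter from two free-fall accelerations follows; the numbers are hypothetical, and the actual MICROSCOPE analysis works with in-orbit differential force measurements and a full error budget rather than such a direct difference.

```python
# Illustrative computation of the Eotvos parameter from two free-fall
# accelerations (numbers hypothetical).

a_Ti = 7.964123456789           # titanium test-mass acceleration, m/s^2
a_Pt = a_Ti * (1.0 - 1.0e-15)   # platinum, offset by one part in 10^15

delta = 2.0 * (a_Ti - a_Pt) / (a_Ti + a_Pt)
# Note: a relative difference of 1e-15 sits close to the resolution of
# IEEE double precision (~2.2e-16), one reason real analyses avoid
# forming such small differences of large numbers directly.
print(f"Eotvos parameter delta(Ti, Pt) ~ {delta:.1e}")
```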
How measurement science can improve confidence in research results.
Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T
2018-04-01
The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.
Parameter uncertainty in simulations of extreme precipitation and attribution studies.
NASA Astrophysics Data System (ADS)
Timmermans, B.; Collins, W. D.; O'Brien, T. A.; Risser, M. D.
2017-12-01
The attribution of extreme weather events, such as heavy rainfall, to anthropogenic influence involves the analysis of their probability in simulations of climate. The climate models used, however, such as the Community Atmosphere Model (CAM), employ approximate physics that gives rise to "parameter uncertainty"—uncertainty about the most accurate or optimal values of numerical parameters within the model. In particular, approximate parameterisations for convective processes are well known to be influential in the simulation of precipitation extremes. Towards examining the impact of this source of uncertainty on attribution studies, we investigate the importance of components—through their associated tuning parameters—of parameterisations relating to deep and shallow convection, and cloud and aerosol microphysics in CAM. We hypothesise that as numerical resolution is increased, the change in the proportion of variance induced by perturbed parameters associated with the respective components is consistent with the decreasing applicability of the underlying hydrostatic assumptions. For example, the relative influence of deep convection should diminish as resolution approaches that at which convection can be resolved numerically (∼10 km). We quantify the relationship between the relative proportion of variance induced and numerical resolution by conducting computer experiments that examine precipitation extremes over the contiguous U.S. In order to mitigate the enormous computational burden of running ensembles of long climate simulations, we use variable-resolution CAM and employ both extreme value theory and surrogate modelling techniques ("emulators"). We discuss the implications of the relationship between parameterised convective processes and resolution both in the context of attribution studies and progression towards models that fully resolve convection.
Hu, Bifeng; Zhao, Ruiying; Chen, Songchao; Zhou, Yue; Jin, Bin; Li, Yan; Shi, Zhou
2018-04-10
Assessing heavy metal pollution and delineating pollution are the bases for evaluating pollution and determining a cost-effective remediation plan. Most existing studies are based on the spatial distribution of pollutants but ignore related uncertainty. In this study, eight heavy-metal concentrations (Cr, Pb, Cd, Hg, Zn, Cu, Ni, and As) were collected at 1040 sampling sites in a coastal industrial city in the Yangtze River Delta, China. The single pollution index (PI) and Nemerow integrated pollution index (NIPI) were calculated for every surface sample (0-20 cm) to assess the degree of heavy metal pollution. Ordinary kriging (OK) was used to map the spatial distribution of heavy metal content and NIPI. Then, we delineated composite heavy metal contamination based on the uncertainty produced by indicator kriging (IK). The results showed that mean values of all PIs and NIPIs were at safe levels. Heavy metals were most accumulated in the central portion of the study area. Based on IK, the spatial probability of composite heavy metal pollution was computed. The probability of composite contamination in the central core urban area was highest. A probability of 0.6 was found to be the optimum probability threshold to delineate polluted areas from unpolluted areas for integrative heavy metal contamination. Results of pollution delineation based on uncertainty showed the proportion of false negative error areas was 6.34%, while the proportion of false positive error areas was 0.86%. The accuracy of the classification was 92.80%. This indicates the method we developed is a valuable tool for delineating heavy metal pollution.
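A minimal sketch of the two indexes follows, assuming their standard definitions (single index PI_i = C_i/S_i against a reference value, and the Nemerow index combining the mean and maximum PI); the concentrations and reference values below are hypothetical.

```python
import numpy as np

# PI and Nemerow integrated pollution index (NIPI), standard definitions
# assumed; measured concentrations C and reference standards S are
# hypothetical stand-ins, in mg/kg.

C = {"Cr": 61.0, "Pb": 28.0, "Cd": 0.15, "Hg": 0.08,
     "Zn": 95.0, "Cu": 30.0, "Ni": 27.0, "As": 9.0}
S = {"Cr": 90.0, "Pb": 35.0, "Cd": 0.20, "Hg": 0.15,
     "Zn": 100.0, "Cu": 35.0, "Ni": 40.0, "As": 15.0}

PI = np.array([C[k] / S[k] for k in C])
NIPI = np.sqrt((PI.mean() ** 2 + PI.max() ** 2) / 2.0)
print(f"max PI = {PI.max():.2f}, mean PI = {PI.mean():.2f}, NIPI = {NIPI:.2f}")
```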
Tire Force Estimation using a Proportional Integral Observer
NASA Astrophysics Data System (ADS)
Farhat, Ahmad; Koenig, Damien; Hernandez-Alcantara, Diana; Morales-Menendez, Ruben
2017-01-01
This paper addresses a method for detecting critical stability situations in the lateral vehicle dynamics by estimating the non-linear part of the tire forces. These forces indicate the road-holding performance of the vehicle. The estimation method is based on a robust fault detection and estimation approach which minimizes the sensitivity of the residual to disturbances and uncertainties. It consists in the design of a Proportional Integral Observer (PIO), minimizing the well-known H∞ norm for worst-case uncertainty and disturbance attenuation while meeting a transient response specification. This multi-objective problem is formulated as a Linear Matrix Inequality (LMI) feasibility problem in which a cost function subject to LMI constraints is minimized. This approach is employed to generate a set of switched robust observers for uncertain switched systems, where the convergence of the observer is ensured using a Multiple Lyapunov Function (MLF). While the forces to be estimated cannot be physically measured, a simulation scenario with CarSim™ is presented to illustrate the developed method.
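A minimal discrete-time sketch of the proportional integral observer idea follows; the toy second-order model and gains are illustrative stand-ins (no H∞/LMI synthesis or switching is performed here), showing only how the integral correction recovers a slowly varying unknown input such as a tire-force component.

```python
import numpy as np

# Minimal discrete-time Proportional Integral Observer (PIO) sketch:
# the integral state f_hat estimates an unknown input entering through B.
# Matrices and gains are illustrative, not a tuned vehicle model.

dt = 0.01
A = np.array([[0.0, 1.0], [-4.0, -1.5]])   # toy lateral-dynamics state matrix
B = np.array([[0.0], [1.0]])               # unknown-input (force) channel
C_mat = np.array([[1.0, 0.0]])             # measured output

Lp = np.array([[8.0], [12.0]])             # proportional observer gain
Li = np.array([[30.0]])                    # integral gain on the output error

x = np.array([[0.1], [0.0]])               # true state
f_true = 0.5                               # unknown constant input to recover
x_hat = np.zeros((2, 1))
f_hat = np.zeros((1, 1))

for _ in range(2000):
    # plant (Euler discretization)
    x = x + dt * (A @ x + B * f_true)
    y = C_mat @ x
    # PIO: proportional correction on the state, integral estimate of f
    e = y - C_mat @ x_hat
    x_hat = x_hat + dt * (A @ x_hat + B @ f_hat + Lp @ e)
    f_hat = f_hat + dt * (Li @ e)

print(f"estimated unknown input: {f_hat[0, 0]:.3f} (true {f_true})")
```

The augmented error dynamics for these gains are stable, so after the 20 s simulated horizon the integral state settles on the unknown input.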
Medical Evidence Influence on Inpatients and Nurses Pain Ratings Agreement
Samolsky Dekel, Boaz Gedaliahu; Gori, Alberto; Vasarri, Alessio; Sorella, Maria Cristina; Di Nino, Gianfranco; Melotti, Rita Maria
2016-01-01
Biased pain evaluation due to automated heuristics driven by symptom uncertainty may undermine pain treatment; medical evidence moderators are thought to play a role in such circumstances. We explored, in this cross-sectional survey, the effect of such moderators (e.g., nurse awareness of patients' pain experience and treatment) on the agreement between n = 862 inpatients' self-reported pain and n = 115 nurses' pain ratings using a numerical rating scale. We assessed the mean absolute difference, agreement (κ statistics), and correlation (Spearman rank) of inpatients' and nurses' pain ratings, and analyzed the dependence of the congruence categories' (CCs: underestimation, congruence, and overestimation) proportions upon pain categories for each medical evidence moderator (χ² analysis). Pain rating agreement and correlation were limited; the CC proportions were further modulated by the studied moderators. Medical evidence promoted in nurses overestimation of low and underestimation of high inpatients' self-reported pain. Knowledge of the negative influence of automated heuristics driven by symptom uncertainty and medical-evidence moderators on pain evaluation may render pain assessment more accurate. PMID:27445633
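As a pointer to how such agreement statistics are computed in practice, a sketch with synthetic stand-in ratings follows; scipy and scikit-learn provide the Spearman and κ routines, and the banding of the 0-10 scale into mild/moderate/severe categories is an assumption, not the study's coding.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Synthetic paired 0-10 pain ratings (stand-ins, not the study's data).
rng = np.random.default_rng(0)
patient = rng.integers(0, 11, size=200)                      # self-reported NRS
nurse = np.clip(patient + rng.integers(-3, 4, 200), 0, 10)   # noisy proxy rating

def to_cat(x):
    # assumed NRS banding: 0-3 mild, 4-6 moderate, 7-10 severe
    return np.digitize(x, bins=[4, 7])

kappa = cohen_kappa_score(to_cat(patient), to_cat(nurse))
rho, p = spearmanr(patient, nurse)
mad = np.mean(np.abs(patient.astype(float) - nurse))
print(f"kappa={kappa:.2f}, spearman rho={rho:.2f} (p={p:.1e}), mean |diff|={mad:.2f}")
```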
Principles for high-quality, high-value testing.
Power, Michael; Fell, Greg; Wright, Michael
2013-02-01
A survey of doctors working in two large NHS hospitals identified over 120 laboratory tests, imaging investigations and investigational procedures that they considered to be overused. A common suggestion in this survey was that more training was required, and this prompted the development of a list of core principles for high-quality, high-value testing. The list can be used as a framework for training and as a reference source. The core principles are: (1) Base testing practices on the best available evidence. (2) Apply the evidence on test performance with careful judgement. (3) Test efficiently. (4) Consider the value (and affordability) of a test before requesting it. (5) Be aware of the downsides and drivers of overdiagnosis. (6) Confront uncertainties. (7) Be patient-centred in your approach. (8) Consider ethical issues. (9) Be aware of normal cognitive limitations and biases when testing. (10) Follow the 'knowledge journey' when teaching and learning these core principles.
Good modeling practice guidelines for applying multimedia models in chemical assessments.
Buser, Andreas M; MacLeod, Matthew; Scheringer, Martin; Mackay, Don; Bonnell, Mark; Russell, Mark H; DePinto, Joseph V; Hungerbühler, Konrad
2012-10-01
Multimedia mass balance models of chemical fate in the environment have been used for over 3 decades in a regulatory context to assist decision making. As these models become more comprehensive, reliable, and accepted, there is a need to recognize and adopt principles of Good Modeling Practice (GMP) to ensure that multimedia models are applied with transparency and adherence to accepted scientific principles. We propose and discuss 6 principles of GMP for applying existing multimedia models in a decision-making context, namely 1) specification of the goals of the model assessment, 2) specification of the model used, 3) specification of the input data, 4) specification of the output data, 5) conduct of a sensitivity and possibly also uncertainty analysis, and finally 6) specification of the limitations and limits of applicability of the analysis. These principles are justified and discussed with a view to enhancing the transparency and quality of model-based assessments. Copyright © 2012 SETAC.
Lierman, S; Veuchelen, L
2005-01-01
The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other view thinks the effects are underestimated. Both views have good scientific arguments in their favour. The nuclear field, in both industry and medicine, has had to deal with this controversy for many decades. One can argue that the optimisation approach of keeping effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because the effects are stochastic, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field, and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues will be dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law, and what lessons can be learned from the field of radiation protection? 2. What can ALARA contribute to the discussion of the Precautionary Principle, and vice versa, in particular as far as legal sanctions and liability are concerned? It will be shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-07
... accounting principles (``GAAP''). The rule was a clarification, rather than a limitation, of the repudiation... has created uncertainty for securitization participants. On June 12, 2009, the Financial Accounting Standards Board (``FASB'') finalized modifications to GAAP through Statement of Financial Accounting...
Quantum Theory from Observer's Mathematics Point of View
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khots, Dmitriy; Khots, Boris
2010-05-04
This work considers the linear (time-dependent) Schrödinger equation, the quantum theory of two-slit interference, wave-particle duality for single photons, and the uncertainty principle in a setting of arithmetic, algebra, and topology provided by Observer's Mathematics, see [1]. Certain theoretical results and communications pertaining to these topics are also provided.
STRATEGIC PLAN FOR THE OFFICE OF RESEARCH AND DEVELOPMENT, MAY 1996
The ORD Strategic Plan is based on nine principles: (1) Focus research on the greatest risks to people and the environment, (2) Focus research on reducing uncertainty in risk assessment, (3) Balance human health and ecological research, (4) Work for customers and clients, (5) Mai...
Quantum Theory, the Uncertainty Principle, and the Alchemy of Standardized Testing.
ERIC Educational Resources Information Center
Wassermann, Selma
2001-01-01
Argues that reliance on the outcome of quantitative standardized tests to assess student performance is a misplaced quest for certainty in an uncertain world. Reviews and lauds a Canadian teacher-devised qualitative diagnostic tool, "Profiles of Student Behaviors," composed of 20 behavioral patterns in student knowledge, attitude, and skill.…
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
NASA Astrophysics Data System (ADS)
Scheingraber, Christoph; Käser, Martin; Allmann, Alexander
2017-04-01
Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention, but epistemic location uncertainty has so far not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios is systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or the proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
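A toy sketch of the general idea, under stated assumptions (uniformly sampled unknown coordinates and an invented distance-decay damage field; this is not the authors' framework):

```python
# Toy sketch: for risk items with unknown coordinates, sample candidate
# locations within their known region and propagate to the portfolio loss
# distribution. All inputs and the hazard field are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_samples = 1000, 200
values = rng.lognormal(mean=12.0, sigma=1.0, size=n_items)   # insured values

def mean_damage_ratio(xy):
    """Hypothetical hazard field: damage decays with distance from a source."""
    d = np.linalg.norm(xy - np.array([50.0, 50.0]), axis=1)
    return 0.3 * np.exp(-d / 30.0)

losses = np.empty(n_samples)
for k in range(n_samples):
    xy = rng.uniform(0.0, 100.0, size=(n_items, 2))  # sampled unknown locations
    losses[k] = np.sum(values * mean_damage_ratio(xy))

print("mean loss:", losses.mean(), "loss CV:", losses.std() / losses.mean())
```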
The first determination of the Planck constant with the joule balance NIM-2
NASA Astrophysics Data System (ADS)
Li, Zhengkun; Zhang, Zhonghua; Lu, Yunfeng; Hu, Pengcheng; Liu, Yongmeng; Xu, Jinxin; Bai, Yang; Zeng, Tao; Wang, Gang; You, Qiang; Wang, Dawei; Li, Shisong; He, Qing; Tan, Jiubin
2017-10-01
The National Institute of Metrology (NIM, China) proposed a joule balance method to measure the Planck constant in 2006, and built the first prototype NIM-1 to verify its principle, reaching a relative uncertainty of 8.9 × 10⁻⁶ by 2013. Since 2013, a new joule balance, NIM-2, has been designed with a series of improvements to reduce the measurement uncertainty. By April 2017, NIM-2 had been constructed and could be employed to measure the Planck constant in vacuum. A first measurement with NIM-2 yields a Planck constant of 6.626 069 2(16) × 10⁻³⁴ J s with a relative uncertainty of 2.4 × 10⁻⁷. The determination differs in relative terms by -1.27 × 10⁻⁷ from the CODATA 2014 value. Further improvement of NIM-2 is in progress, towards a 10⁻⁸-level uncertainty in the future.
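As a quick arithmetic check of the quoted relative difference, using the CODATA 2014 value h = 6.626 070 040(81) × 10⁻³⁴ J s (recalled here as standard background, not taken from the abstract):

$$ \frac{h_{\text{NIM-2}} - h_{\text{CODATA14}}}{h_{\text{CODATA14}}} = \frac{(6.626\,069\,2 - 6.626\,070\,040)\times 10^{-34}}{6.626\,070\,040\times 10^{-34}} \approx -1.27\times 10^{-7}. $$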
Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control
NASA Astrophysics Data System (ADS)
Deffner, Sebastian; Campbell, Steve
2017-11-01
One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
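For reference, the two named bounds in their standard form for the minimal time τ required to evolve a pure state to an orthogonal one, where ΔE is the energy standard deviation (Mandelstam-Tamm) and ⟨E⟩ the mean energy above the ground state (Margolus-Levitin):

$$ \tau \;\ge\; \tau_{\mathrm{QSL}} \;=\; \max\!\left\{ \frac{\pi\hbar}{2\,\Delta E},\; \frac{\pi\hbar}{2\,\langle E\rangle} \right\}. $$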
A precautionary principle for dual use research in the life sciences.
Kuhlau, Frida; Höglund, Anna T; Evers, Kathinka; Eriksson, Stefan
2011-01-01
Most life science research entails dual-use complexity and may be misused for harmful purposes, e.g. biological weapons. The Precautionary Principle applies to special problems characterized by complexity in the relationship between human activities and their consequences. This article examines whether the principle, so far mainly used in environmental and public health issues, is applicable and suitable to the field of dual-use life science research. Four central elements of the principle are examined: threat, uncertainty, prescription and action. Although charges against the principle exist - for example that it stifles scientific development, lacks practical applicability and is poorly defined and vague - the analysis concludes that a Precautionary Principle is applicable to the field. Certain factors such as credibility of the threat, availability of information, clear prescriptive demands on responsibility and directives on how to act, determine the suitability and success of a Precautionary Principle. Moreover, policy-makers and researchers share a responsibility for providing and seeking information about potential sources of harm. A central conclusion is that the principle is meaningful and useful if applied as a context-dependent moral principle and allowed flexibility in its practical use. The principle may then inspire awareness-raising and the establishment of practical routines which appropriately reflect the fact that life science research may be misused for harmful purposes. © 2009 Blackwell Publishing Ltd.
Principles of urban transportation
DOT National Transportation Integrated Search
1951-07-01
One of the predominant characteristics of modern life in the United States has been the increasing proportion of the population concentrated in metropolitan areas. This growth of large urban centers within relatively narrow geographic areas would hav...
On Some Aspects of Study on Dimensions and Proportions of Church Architecture
NASA Astrophysics Data System (ADS)
Kolobaeva, T. V.
2017-11-01
Architecture forms and arranges the environment required for a comfortable life and human activity. The modern principles of architectural space arrangement and form making are embodied in a reliable system of building rules used in design. Architects apply these principles, together with knowledge drawn from the special and regulatory literature, when performing a particular creative task. This system of accumulated knowledge is often perceived as a ready-made stereotype, with little regard for the understanding of form making and the experience of the architects and thinkers of previous ages. We attempt to restore this connection, since the specific form-making regularities known to ancient architects should be taken into account. The paper gives an insight into some aspects of traditional dimensions and proportions of church architecture.
The Systemic Control of Growth
Boulan, Laura; Milán, Marco; Léopold, Pierre
2015-01-01
Growth is a complex process that is intimately linked to the developmental program to form adults with proper size and proportions. Genetics is an important determinant of growth, as exemplified by the role of local diffusible molecules in setting up organ proportions. In addition, organisms use adaptive responses that allow them to modulate individual size according to environmental cues, for example nutrition. Here, we describe some of the physiological principles participating in the determination of final individual size. PMID:26261282
Estimating uncertainties in complex joint inverse problems
NASA Astrophysics Data System (ADS)
Afonso, Juan Carlos
2016-04-01
Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, all uncertainty estimates are model dependent; therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes, by G. Box and M. Gunzburger respectively, of special significance for inversion practitioners and for this session: "...all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related to the forward and statistical models, I will also address other uncertainties associated with data and uncertainty propagation.
Accounting for uncertainty in health economic decision models by using model averaging.
Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D
2009-04-01
Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment.
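A minimal sketch of the weighting step described above, using Akaike weights (the same formula applies to BIC differences); all numbers are hypothetical placeholders:

```python
# Information-criterion model averaging: weight each candidate model by
# exp(-delta_i / 2), where delta_i is its AIC (or BIC) difference from the
# best model, then average the models' predictions with those weights.
import numpy as np

aic = np.array([210.3, 212.1, 215.8])         # one AIC per candidate model
delta = aic - aic.min()                       # differences from the best model
w = np.exp(-0.5 * delta)
w /= w.sum()                                  # model weights, sum to 1

preds = np.array([1.20, 1.35, 1.10])          # hypothetical model outputs
print("weights:", w, "averaged prediction:", np.dot(w, preds))
```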
Measuring uncertainty by extracting fuzzy rules using rough sets
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.
1991-01-01
Despite the advancements in the computer industry over the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as ways to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly each rule is believed is constructed. From this, the question of how well a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
First-Principles Calculation of the Third Virial Coefficient of Helium
Garberoglio, Giovanni; Harvey, Allan H.
2009-01-01
Knowledge of the pair and three-body potential-energy surfaces of helium is now sufficient to allow calculation of the third density virial coefficient, C(T), with significantly smaller uncertainty than that of existing experimental data. In this work, we employ the best available pair and three-body potentials for helium and calculate C(T) with path-integral Monte Carlo (PIMC) calculations supplemented by semiclassical calculations. The values of C(T) presented extend from 24.5561 K to 10 000 K. In the important metrological range of temperatures near 273.16 K, our uncertainties are smaller than the best experimental results by approximately an order of magnitude, and the reduction in uncertainty at other temperatures is at least as great. For convenience in calculation of C(T) and its derivatives, a simple correlating equation is presented. PMID:27504226
Pace, D C; Pipes, R; Fisher, R K; Van Zeeland, M A
2014-11-01
New phase space mapping and uncertainty analysis of energetic ion loss data in the DIII-D tokamak provides experimental results that serve as valuable constraints in first-principles simulations of energetic ion transport. Beam ion losses are measured by the fast ion loss detector (FILD) diagnostic system consisting of two magnetic spectrometers placed independently along the outer wall. Monte Carlo simulations of mono-energetic and single-pitch ions reaching the FILDs are used to determine the expected uncertainty in the measurements. Modeling shows that the variation in gyrophase of 80 keV beam ions at the FILD aperture can produce an apparent measured energy signature spanning across 50-140 keV. These calculations compare favorably with experiments in which neutral beam prompt loss provides a well known energy and pitch distribution.
Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.
2013-12-01
The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters the expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations, CILTS should be considered as having something like a "factor of two" uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
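A sketch of what a first-principles error analysis looks like at its simplest, assuming steady state so that the inferred air flow is the tracer injection rate divided by the measured concentration; the numbers are hypothetical, not the article's:

```python
# First-order (quadrature) error propagation for the CILTS idea: at steady
# state Q = S / C, so relative errors in S and C add in quadrature.
import math

S, u_S = 5.0e-6, 0.05      # injection rate and its 5% relative uncertainty
C, u_C = 2.0e-9, 0.10      # mean tracer concentration and its 10% relative uncertainty

Q = S / C                            # inferred air flow
u_Q = math.sqrt(u_S**2 + u_C**2)     # relative uncertainty of Q
print(f"Q = {Q:.3g}, relative uncertainty = {u_Q:.1%}")
```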
Analysis and Design of Launch Vehicle Flight Control Systems
NASA Technical Reports Server (NTRS)
Wie, Bong; Du, Wei; Whorton, Mark
2008-01-01
This paper describes the fundamental principles of launch vehicle flight control analysis and design. In particular, the classical "drift-minimum" and "load-minimum" control principles are re-examined, and their performance and stability robustness with respect to modeling uncertainties and a gimbal angle constraint are discussed. It is shown that an additional feedback of angle-of-attack or lateral acceleration can significantly improve the overall performance and robustness, especially in the presence of unexpected large wind disturbances. Non-minimum-phase structural filtering of "unstably interacting" bending modes of large flexible launch vehicles is also shown to be effective and robust.
The quantum limit for gravitational-wave detectors and methods of circumventing it
NASA Technical Reports Server (NTRS)
Thorne, K. S.; Caves, C. M.; Sandberg, V. D.; Zimmermann, M.; Drever, R. W. P.
1979-01-01
The Heisenberg uncertainty principle prevents the monitoring of the complex amplitude of a mechanical oscillator more accurately than a certain limit value. This 'quantum limit' is a serious obstacle to the achievement of a 10⁻²¹ gravitational-wave detection sensitivity. This paper examines the principles of the back-action evasion technique and finds that this technique may be able to overcome the problem of the quantum limit. Back-action evasion does not solve, however, other problems of detection, such as weak coupling, large amplifier noise, and large Nyquist noise.
Semiclassical black holes expose forbidden charges and censor divergent densities
NASA Astrophysics Data System (ADS)
Brustein, Ram; Medved, A. J. M.
2013-09-01
Classically, the horizon of a Schwarzschild black hole (BH) is a rigid surface of infinite redshift; whereas the uncertainty principle dictates that the semiclassical (would-be) horizon cannot be fixed in space nor can it exhibit any divergences. We propose that this distinction underlies the BH information-loss paradox, the apparent absence of BH hair, the so-called trans-Planckian problem and the recent "firewall" controversy. We argue that the correct prescription is to first integrate out the fluctuations of the background geometry and only then evaluate matter observables. The basic idea is illustrated using a system of two strongly coupled harmonic oscillators, with the heavier oscillator representing the background. We then apply our proposal to matter fields near a BH horizon, initially treating the matter fields as classical and the background as semiclassical. In this case, the average value of the associated current does not vanish; so that it is possible, in principle, to measure the global charge of the BH. Then the matter is, in addition to the background, treated quantum mechanically. We show that the average energy density of matter as seen by an asymptotic observer is finite and proportional to the BH entropy, rather than divergent. We discuss the implications of our results for the various controversial issues concerning BH physics.
NASA Astrophysics Data System (ADS)
Hyeon, Changbong; Hwang, Wonseok
2017-07-01
Using Brownian motion in periodic potentials V(x) tilted by a force f, we provide physical insight into the thermodynamic uncertainty relation, a recently conjectured principle for statistical errors and irreversible heat dissipation in nonequilibrium steady states. According to the relation, nonequilibrium output generated from dissipative processes necessarily incurs an energetic cost or heat dissipation q, and in order to limit the output fluctuation within a relative uncertainty ε, at least 2k_BT/ε² of heat must be dissipated. Our model shows that this bound is attained not only at near-equilibrium [f ≪ V′(x)] but also at far-from-equilibrium [f ≫ V′(x)], and more generally when the dissipated heat is normally distributed. Furthermore, the energetic cost is maximized near the critical force, when the barrier separating the potential wells is about to vanish and the fluctuation of Brownian particles is maximized. These findings indicate that the deviation of the heat distribution from Gaussianity gives rise to the inequality of the uncertainty relation, further clarifying its meaning. Our derivation of the uncertainty relation also recognizes a bound on nonequilibrium fluctuations: the variance of the dissipated heat (σ_q²) increases with its mean (μ_q) and cannot be smaller than 2k_BTμ_q.
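In compact form, the two bounds discussed in the abstract read (with ε the relative uncertainty of the output X and q the mean dissipated heat):

$$ \varepsilon^{2}\, q \;\ge\; 2k_{B}T, \qquad \varepsilon^{2}=\frac{\mathrm{Var}(X)}{\langle X\rangle^{2}}; \qquad\qquad \sigma_{q}^{2} \;\ge\; 2k_{B}T\,\mu_{q}. $$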
Entropic uncertainty and measurement reversibility
NASA Astrophysics Data System (ADS)
Berta, Mario; Wehner, Stephanie; Wilde, Mark M.
2016-07-01
The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
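For reference, the EUR-QSI of Berta et al. in its usual form, where c quantifies the maximal overlap of the two measurement bases and S(·|B) denotes the conditional von Neumann entropy given the quantum memory B:

$$ S(X|B) + S(Z|B) \;\ge\; \log_{2}\frac{1}{c} + S(A|B), \qquad c=\max_{x,z}\big|\langle\psi_{x}|\phi_{z}\rangle\big|^{2}. $$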
Effects of Lambertian sources design on uniformity and measurements
NASA Astrophysics Data System (ADS)
Cariou, Nadine; Durell, Chris; McKee, Greg; Wilks, Dylan; Glastre, Wilfried
2014-10-01
Integrating sphere (IS) based uniform sources are a primary tool for ground-based calibration, characterization and testing of flight radiometric equipment. The idea of a Lambertian field of energy is a very useful tool in radiometric testing, but this concept is being challenged in many ways by newly lowered uncertainty goals. At an uncertainty goal of 2%, uniformity needs to be assessed carefully in addition to calibration uncertainties, as even sources with 0.5% uniformity now form substantial proportions of uncertainty budgets. The paper explores integrating sphere design options for achieving 99.5% and better uniformity of exit port radiance and spectral irradiance created by an integrating sphere. Uniformity in broad spectrum and in spectral bands is explored. We discuss mapping techniques and results as a function of observed uniformity, as well as laboratory testing customized to match the customer's instrumentation field of view. We also discuss recommendations based on the basic commercial instrumentation we have used to validate, inspect, and improve the correlation of uniformity measurements with the intended application.
UNCERTAINTY IN SOURCE PARTITIONING USING STABLE ISOTOPES
Stable isotope analyses are often used to quantify the contribution of multiple sources to a mixture, such as proportions of food sources in an animal's diet, C3 vs. C4 plant inputs to soil organic carbon, etc. Linear mixing models can be used to partition two sources with a sin...
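A minimal sketch of the two-source case referred to above, in its generic textbook form (the δ values are hypothetical, not from the abstract):

```python
# Standard two-source, one-isotope linear mixing model: the mixture's delta
# value is a value-weighted average of the two sources' delta values, so the
# source proportion follows by linear interpolation.
def source_proportion(d_mix, d_a, d_b):
    """Fraction of source A in the mixture from isotope ratios (delta units)."""
    return (d_mix - d_b) / (d_a - d_b)

f_a = source_proportion(d_mix=-18.0, d_a=-27.0, d_b=-12.0)  # e.g. C3 vs C4 inputs
print(f"source A fraction: {f_a:.2f}, source B fraction: {1 - f_a:.2f}")
```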
``From Fundamental Motives to Rational Expectation Equilibrium [REE, henceforth] of Indeterminacy''
NASA Astrophysics Data System (ADS)
Maksoed, Ssi, Wh-
For the ``Principle of Indeterminacy'' from Heisenberg: ``one of the fundamental cornerstones of quantum mechanics is the Heisenberg uncertainty principle'', whereby canonically conjugate quantities can be determined simultaneously only with a characteristic indeterminacy [M. Arevalo Aguilar, et al.]. Accompanying Alfred North Whitehead's conclusion in ``The Aims of Education'' that mathematical symbols are artificial before new meanings are given to them, two kinds of fundamental motives, (i) expectation-expectation and (ii) expectation-certainty, occur inherently with the determinacy properties of rational expectation equilibrium (REE, henceforth) - Guido Ascari & Tiziano Ropele: ``Trend inflation, Taylor principle & Indeterminacy'', Kiel Institute, June 2007. Further, relative price expressions can be compared for their α and (1 - α) configurations in the expression of possible activity. Acknowledgment to Prof[asc]. Dr. Bobby Eka Gunara for ``made a rank through physics'' denotes...
NASA Astrophysics Data System (ADS)
Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu
2017-06-01
The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements of a pair of incompatible observables in a quantum system. It is therefore of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by thermal entanglement in a spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can raise the measurement uncertainty through the reduction of the thermal entanglement; explicitly, higher temperature, stronger magnetic field or larger field inhomogeneity results in inflation of the uncertainty. Besides, distinct dynamical behaviors of the uncertainty are found for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we verify that the measurement uncertainty is dramatically anti-correlated with the purity of the bipartite spin system: greater purity results in lower measurement uncertainty, and vice versa. Our observations might therefore provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in versatile systems, particularly solid-state systems.
NASA Astrophysics Data System (ADS)
Kato, S.; Rutan, D. A.; Rose, F. G.; Loeb, N. G.
2017-12-01
The surface of the Earth receives solar radiation (shortwave) and emission from the atmosphere (longwave). In the global annual mean, approximately 12% of the solar radiation incident on the surface is reflected and the rest is absorbed by the surface. The surface emits radiation proportional to the fourth power of its temperature. Although the uncertainty in global and annual mean surface irradiances has been estimated in earlier studies (Zhang et al. 1995, 2004; L'Ecuyer et al. 2008; Stephens et al. 2012; Kato et al. 2012), only a few studies have estimated the uncertainty in computed surface irradiances at smaller spatial and temporal scales (Zhang et al. 1995, 2004; Kato et al. 2012). We use surface observations at 46 buoys and 36 land sites and the newly released Edition 4.0 Clouds and the Earth's Radiant Energy System (CERES) Energy Balanced and Filled (EBAF)-surface data product to estimate the uncertainty in regional and zonal monthly mean downward shortwave and longwave surface irradiances. The root-mean-square difference of monthly mean computed and observed irradiances is used for the regional uncertainty. The uncertainty is separated into bias and spatially random components. The random component decreases when irradiances are averaged over a larger area, nearly inversely proportional to the number of surface observation sites. The presentation provides the uncertainty in the regional and zonal monthly mean downward surface irradiances over ocean and land. References: Kato, S., N. G. Loeb, D. A. Rutan, F. G. Rose, S. Sun-Mack, W. F. Miller, and Y. Chen, 2012, Surv. Geophys., 33, 395-412, doi:10.1007/s10712-012-9179-x. L'Ecuyer, T. S., N. B. Wood, T. Haladay, G. L. Stephens, and P. W. Stackhouse Jr., 2008, J. Geophys. Res., 113, D00A15, doi:10.1029/2008JD009951. Stephens, G. L., and Coauthors, 2012, Nat. Geosci., 5, 691-696, doi:10.1038/ngeo1580. Zhang, Y., W. B. Rossow, A. A. Lacis, V. Oinas, and M. I. Mishchenko, 2004, J. Geophys. Res., 109, D19105, doi:10.1029/2003JD004457. Zhang, Y.-C., W. B. Rossow, and A. A. Lacis, 1995, J. Geophys. Res., 100, 1149-1165.
Köppel, René; Eugster, Albert; Ruf, Jürg; Rentsch, Jürg
2012-01-01
The quantification of meat proportions in raw and boiled sausage according to the recipe was evaluated using three different calibrators. To measure the DNA contents from beef, pork, sheep (mutton), and horse, a tetraplex real-time PCR method was applied. Nineteen laboratories analyzed four meat products each made of different proportions of beef, pork, sheep, and horse meat. Three kinds of calibrators were used: raw and boiled sausages of known proportions ranging from 1 to 55% of meat, and a dilution series of DNA from muscle tissue. In general, results generated using calibration sausages were more accurate than those resulting from the use of DNA from muscle tissue, and exhibited smaller measurement uncertainties. Although differences between uses of raw and boiled calibration sausages were small, the most precise and accurate results were obtained by calibration with fine-textured boiled reference sausages.
2013-03-01
Incompleteness Theorem (used in mathematics), and Heisenberg's Uncertainty Principle as it pertains to the mathematical foundations of quantum ... "subtlety which normal consciousness cannot even see..." It is in our PME system where we can create the time to develop the unconscious realms of the
Human Resource Planning: An Introduction. Report 312.
ERIC Educational Resources Information Center
Reilly, Peter
This report is designed to give readers an introduction to the principles of human resource planning (HRP) and the areas in which it can be used, including those facing today's managers. Chapter 1 outlines why some organizations no longer plan, describes the background of change and uncertainty that discouraged them, and defines HRP. Chapter 2…
Quantum information aspects of noncommutative quantum mechanics
NASA Astrophysics Data System (ADS)
Bertolami, Orfeu; Bernardini, Alex E.; Leal, Pedro
2018-01-01
Some fundamental aspects related with the construction of Robertson-Schrödinger-like uncertainty-principle inequalities are reported in order to provide an overall description of quantumness, separability and nonlocality of quantum systems in the noncommutative phase-space. Some consequences of the deformed noncommutative algebra are also considered in physical systems of interest.
The Heisenberg Uncertainty Principle Demonstrated with An Electron Diffraction Experiment
ERIC Educational Resources Information Center
Matteucci, Giorgio; Ferrari, Loris; Migliori, Andrea
2010-01-01
An experiment analogous to the classical diffraction of light from a circular aperture has been realized with electrons. The results are used to introduce undergraduate students to the wave behaviour of electrons. The diffraction fringes produced by the circular aperture are compared to those predicted by quantum mechanics and are exploited to…
Franchisees in Crisis: Using Action Learning to Self-Organise
ERIC Educational Resources Information Center
O'Donoghue, Carol
2011-01-01
The present article describes the use of action learning by a group of 30 franchisees to organise themselves and work through a period of upheaval and uncertainty when their parent company faced liquidation. Written from the perspective of one of the franchisees who found herself adopting action learning principles to facilitate the group, it…
Quantization of spacetime based on a spacetime interval operator
NASA Astrophysics Data System (ADS)
Chiang, Hsu-Wen; Hu, Yao-Chieh; Chen, Pisin
2016-04-01
Motivated by both concepts of Adler's recent work on utilizing Clifford algebra as the linear line element ds = ⟨γ_μ⟩ dX^μ and the fermionization of the cylindrical worldsheet Polyakov action, we introduce a new type of spacetime quantization that is fully covariant. The theory is based on the reinterpretation of Adler's linear line element as ds = γ_μ⟨λγ^μ⟩, where λ is the characteristic length of the theory. We name this new operator the "spacetime interval operator" and argue that it can be regarded as a natural extension of the one-forms in the U(su(2)) noncommutative geometry. By treating the Fourier momentum as the particle momentum, the generalized uncertainty principle of the U(su(2)) noncommutative geometry, as an approximation to the generalized uncertainty principle of our theory, is derived and is shown to have a lowest-order correction term of order p², similar to that of Snyder's. The holographic nature of the theory is demonstrated, and the predicted fuzziness of the geodesic is shown to be much smaller than conceivable astrophysical bounds.
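For context, a common form of a generalized uncertainty principle with a lowest-order quadratic momentum correction (the Kempf-Mangano-Mann form, stated here as standard background rather than as the paper's specific result):

$$ [\hat{x},\hat{p}] = i\hbar\left(1+\beta\hat{p}^{2}\right) \;\;\Rightarrow\;\; \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta\,\Delta p^{2}+\beta\,\langle\hat{p}\rangle^{2}\right). $$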
Risk-based principles for defining and managing water security
Hall, Jim; Borgomeo, Edoardo
2013-01-01
The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616
The Species Delimitation Uncertainty Principle
Adams, Byron J.
2001-01-01
If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874
The Rosiwal Principle and the regolithic distributions of solar-wind elements
NASA Technical Reports Server (NTRS)
Criswell, D. R.
1975-01-01
The in situ accumulation of solar-wind elements is studied for the purpose of determining the extent of applicability of the Rosiwal Principle, which states that grain exposure area is proportional to the fraction of the unit volume occupied by the grains. The test involves measurement of the relative concentrations of inert gases and reactive elements across sets of lunar fines samples for which mean grain size, sorting, and minimum radius of surface correlation are known. In some cases, the quantity of an element implanted into the lunar fines from the solar wind is found to be surface correlated, and the implications of this relationship are considered. According to the Rosiwal Principle, coarse soils should retain less inert gas than fine soils. The Principle can also be applied to species volatilized or sputtered from the lunar surface and redeposited locally.
Stronger steerability criterion for more uncertain continuous-variable systems
NASA Astrophysics Data System (ADS)
Chowdhury, Priyanka; Pramanik, Tanumoy; Majumdar, A. S.
2015-10-01
We derive a fine-grained uncertainty relation for the measurement of two incompatible observables on a single quantum system of continuous variables, and show that continuous-variable systems are more uncertain than discrete-variable systems. Using the derived fine-grained uncertainty relation, we formulate a stronger steering criterion that is able to reveal the steerability of NOON states, which has hitherto not been possible using other criteria. We further obtain a monogamy relation for our steering inequality, which leads to an in-principle improved lower bound on the secret key rate of a one-sided device-independent quantum key distribution protocol for continuous variables.
Saviano, Alessandro Morais; Francisco, Fabiane Lacerda; Ostronoff, Celina Silva; Lourenço, Felipe Rebello
2015-01-01
The aim of this study was to develop, optimize, and validate a microplate bioassay for relative potency determination of linezolid in pharmaceutical samples using quality-by-design and design space approaches. In addition, a procedure is described for estimating relative potency uncertainty based on microbiological response variability. The influence of culture media composition was studied using a factorial design, and a central composite design was adopted to study the influence of inoculum proportion and triphenyltetrazolium chloride on microbial growth. The microplate bioassay was optimized regarding the responses of low, medium, and high doses of linezolid, negative and positive controls, and the slope, intercept, and correlation coefficient of dose-response curves. According to the optimization results, design space ranges were established using: (a) low (1.0 μg/mL), medium (2.0 μg/mL), and high (4.0 μg/mL) doses of pharmaceutical samples and linezolid chemical reference substance; (b) Staphylococcus aureus ATCC 653 in an inoculum proportion of 10%; (c) antibiotic No. 3 culture medium at pH 7.0±0.1; (d) 6 h incubation at 37.0±0.1 °C; and (e) addition of 50 μL of 0.5% (w/v) triphenyltetrazolium chloride solution. The microplate bioassay was linear (r² = 0.992), specific, precise (repeatability RSD = 2.3% and intermediate precision RSD = 4.3%), accurate (mean recovery = 101.4%), and robust. The overall measurement uncertainty was reasonable considering the increased variability inherent in microbiological response. The final uncertainty was comparable with those obtained with other microbiological assays, as well as with chemical methods.
NASA Astrophysics Data System (ADS)
Cécillon, Lauric; Baudin, François; Chenu, Claire; Houot, Sabine; Jolivet, Romain; Kätterer, Thomas; Lutfalla, Suzanne; Macdonald, Andy; van Oort, Folkert; Plante, Alain F.; Savignac, Florence; Soucémarianadin, Laure N.; Barré, Pierre
2018-05-01
Changes in global soil carbon stocks have considerable potential to influence the course of future climate change. However, a portion of soil organic carbon (SOC) has a very long residence time (>100 years) and may not contribute significantly to terrestrial greenhouse gas emissions during the next century. The size of this persistent SOC reservoir is presumed to be large. Consequently, it is a key parameter required for the initialization of SOC dynamics in ecosystem and Earth system models, but there is considerable uncertainty in the methods used to quantify it. Thermal analysis methods provide cost-effective information on SOC thermal stability that has been shown to be qualitatively related to SOC biogeochemical stability. The objective of this work was to build the first quantitative model of the size of the centennially persistent SOC pool based on thermal analysis. We used a unique set of 118 archived soil samples from four agronomic experiments in northwestern Europe with long-term bare fallow and non-bare fallow treatments (e.g., manure amendment, cropland and grassland) as a sample set for which estimating the size of the centennially persistent SOC pool is relatively straightforward. At each experimental site, we estimated the average concentration of centennially persistent SOC and its uncertainty by applying a Bayesian curve-fitting method to the observed declining SOC concentration over the duration of the long-term bare fallow treatment. Overall, the estimated concentrations of centennially persistent SOC ranged from 5 to 11 g C kg⁻¹ of soil (lowest and highest boundaries of four 95% confidence intervals). Then, by dividing the site-specific concentrations of persistent SOC by the total SOC concentration, we could estimate the proportion of centennially persistent SOC in the 118 archived soil samples and the associated uncertainty. The proportion of centennially persistent SOC ranged from 0.14 (standard deviation of 0.01) to 1 (standard deviation of 0.15). Samples were subjected to thermal analysis by Rock-Eval 6 that generated a series of 30 parameters reflecting their SOC thermal stability and bulk chemistry. We trained a nonparametric machine-learning algorithm (random forests multivariate regression model) to predict the proportion of centennially persistent SOC in new soils using Rock-Eval 6 thermal parameters as predictors. We evaluated the model predictive performance with two different strategies. We first used a calibration set (n = 88) and a validation set (n = 30) with soils from all sites. Second, to test the sensitivity of the model to pedoclimate, we built a calibration set with soil samples from three out of the four sites (n = 84). The multivariate regression model accurately predicted the proportion of centennially persistent SOC in the validation set composed of soils from all sites (R² = 0.92, RMSEP = 0.07, n = 30). The uncertainty of the model predictions was quantified by a Monte Carlo approach that produced conservative 95% prediction intervals across the validation set. The predictive performance of the model decreased when predicting the proportion of centennially persistent SOC in soils from one fully independent site with a different pedoclimate, yet the mean error of prediction only slightly increased (R² = 0.53, RMSEP = 0.10, n = 34).
This model, based on Rock-Eval 6 thermal analysis, can thus be used to predict the proportion of centennially persistent SOC, with known uncertainty, in new soil samples from different pedoclimates, at least for sites whose Rock-Eval 6 thermal characteristics are similar to those included in the calibration set. Our study reinforces the evidence that there is a link between the thermal and biogeochemical stability of soil organic matter and demonstrates that Rock-Eval 6 thermal analysis can be used to quantify the size of the centennially persistent organic carbon pool in temperate soils.
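A hedged sketch of the modelling approach described above: a random-forest regression from thermal parameters to the persistent-SOC proportion, with the spread across trees as a crude uncertainty proxy. The data shapes are synthetic, and this is not the authors' exact pipeline (which used a Monte Carlo approach for its prediction intervals):

```python
# Random-forest regression from 30 Rock-Eval-style thermal parameters to the
# proportion of centennially persistent SOC, on synthetic placeholder data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(88, 30))        # 30 thermal parameters per sample
y = rng.uniform(0.14, 1.0, size=88)  # persistent-SOC proportions (targets)

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

X_new = rng.normal(size=(5, 30))
per_tree = np.stack([t.predict(X_new) for t in model.estimators_])
print("prediction:", per_tree.mean(axis=0))
print("tree spread (std):", per_tree.std(axis=0))  # rough uncertainty proxy
```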
Applications of statistics to medical science, IV survival analysis.
Watanabe, Hiroshi
2012-01-01
The fundamental principles of survival analysis are reviewed. In particular, the Kaplan-Meier method and a proportional hazard model are discussed. This work is the last part of a series in which medical statistics are surveyed.
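A minimal worked example of the Kaplan-Meier product-limit step discussed in the review (hypothetical follow-up data; in practice a library such as lifelines would be used):

```python
# Kaplan-Meier estimator written out to show the principle: at each event
# time, multiply the running survival estimate by (1 - events / at-risk).
import numpy as np

time = np.array([2., 3., 3., 5., 8., 8., 9., 12.])    # follow-up times
event = np.array([1, 1, 0, 1, 1, 1, 0, 1])            # 1 = event, 0 = censored

surv = 1.0
for t in np.unique(time[event == 1]):
    at_risk = np.sum(time >= t)              # subjects still under observation
    d = np.sum((time == t) & (event == 1))   # events at time t
    surv *= 1.0 - d / at_risk                # product-limit step
    print(f"t={t:4.1f}  S(t)={surv:.3f}")
```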
Optimality in the Development of Intestinal Crypts
Itzkovitz, Shalev; Blat, Irene C.; Jacks, Tyler; Clevers, Hans; van Oudenaarden, Alexander
2012-01-01
Intestinal crypts in mammals are comprised of long-lived stem cells and shorter-lived progenies. These two populations are maintained in specific proportions during adult life. Here, we investigate the design principles governing the dynamics of these proportions during crypt morphogenesis. Using optimal control theory, we show that a proliferation strategy known as a “bang-bang” control minimizes the time to obtain a mature crypt. This strategy consists of a surge of symmetric stem cell divisions, establishing the entire stem cell pool first, followed by a sharp transition to strictly asymmetric stem cell divisions, producing nonstem cells with a delay. We validate these predictions using lineage tracing and single-molecule fluorescence in situ hybridization of intestinal crypts in infant mice, uncovering small crypts that are entirely composed of Lgr5-labeled stem cells, which become a minority as crypts continue to grow. Our approach can be used to uncover similar design principles in other developmental systems. PMID:22304925
Optimality in the Development of Intestinal Crypts
NASA Astrophysics Data System (ADS)
van Oudenaarden, Alexander
2012-02-01
Intestinal crypts in mammals are composed of long-lived stem cells and shorter-lived progenies, maintained in tight proportions during adult life. Here we ask what design principles govern the dynamics of these proportions during crypt morphogenesis. We use optimal control theory to show that a stem cell proliferation strategy known as 'bang-bang' control minimizes the time to obtain a mature crypt. This strategy consists of a surge of symmetric stem cell divisions, establishing the entire stem cell pool first, followed by a sharp transition to strictly asymmetric stem cell divisions, producing non-stem cells with a delay. We validate these predictions using lineage tracing and single-molecule fluorescence in situ hybridization of intestinal crypts in newborn mice and find that small crypts are entirely composed of Lgr5 stem cells, which become a minority as crypts grow further. Our approach can be used to uncover similar design principles in other developmental systems.
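A toy simulation of the 'bang-bang' strategy described in the two entries above, under assumed pool sizes (all parameters hypothetical):

```python
# All-symmetric stem cell divisions until the stem pool is full, then a
# sharp switch to strictly asymmetric divisions producing non-stem progeny.
N_STEM_TARGET = 16      # mature stem cell pool size (assumed)
N_TOTAL_TARGET = 64     # mature crypt size, stem + non-stem (assumed)

stem, nonstem, generations = 1, 0, 0
while stem + nonstem < N_TOTAL_TARGET:
    if stem < N_STEM_TARGET:
        stem *= 2                 # symmetric: every stem cell -> two stem cells
    else:
        nonstem += stem           # asymmetric: each stem cell adds one non-stem cell
    generations += 1

print(f"generations to maturity: {generations} (stem={stem}, nonstem={nonstem})")
```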
Huchra, J P
1992-04-17
The Hubble constant is the constant of proportionality between recession velocity and distance in the expanding universe. It is a fundamental property of cosmology that sets both the scale and the expansion age of the universe. It is determined by measuring galaxy recession velocities and distances. Despite the development of new techniques for the measurement of galaxy distances, both calibration uncertainties and debates over systematic errors remain. Current determinations still range over nearly a factor of 2; the higher values favored by most local measurements are not consistent with many theories of the origin of large-scale structure and stellar evolution.
NASA Astrophysics Data System (ADS)
Sutton, Jonathan E.; Guo, Wei; Katsoulakis, Markos A.; Vlachos, Dionisios G.
2016-04-01
Kinetic models based on first principles are becoming common place in heterogeneous catalysis because of their ability to interpret experimental data, identify the rate-controlling step, guide experiments and predict novel materials. To overcome the tremendous computational cost of estimating parameters of complex networks on metal catalysts, approximate quantum mechanical calculations are employed that render models potentially inaccurate. Here, by introducing correlative global sensitivity analysis and uncertainty quantification, we show that neglecting correlations in the energies of species and reactions can lead to an incorrect identification of influential parameters and key reaction intermediates and reactions. We rationalize why models often underpredict reaction rates and show that, despite the uncertainty being large, the method can, in conjunction with experimental data, identify influential missing reaction pathways and provide insights into the catalyst active site and the kinetic reliability of a model. The method is demonstrated in ethanol steam reforming for hydrogen production for fuel cells.
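A toy illustration of why such correlations matter, assuming two strongly correlated activation energies propagated through an Arrhenius-type rate ratio (all numbers invented; this is not the paper's model or method):

```python
# Propagate parameter errors through exp(-(E1 - E2)/kT), with and without
# correlation between E1 and E2: correlated errors largely cancel in the
# difference, so neglecting the correlation inflates the output uncertainty.
import numpy as np

rng = np.random.default_rng(3)
kT = 0.06                                    # eV, roughly 700 K
mean = np.array([0.8, 0.75])                 # hypothetical barriers (eV)
sigma, rho = 0.15, 0.9                       # DFT-like error, strong correlation
cov_corr = sigma**2 * np.array([[1, rho], [rho, 1]])
cov_ind = sigma**2 * np.eye(2)

for name, cov in [("correlated", cov_corr), ("independent", cov_ind)]:
    E = rng.multivariate_normal(mean, cov, size=100_000)
    ratio = np.exp(-(E[:, 0] - E[:, 1]) / kT)   # selectivity-like quantity
    print(name, "spread of log10(ratio):", np.log10(ratio).std().round(2))
```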
Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chunyuan; Stevens, Andrew J.; Chen, Changyou
2016-08-10
Learning the representation of shape cues in 2D & 3D objects for recognition is a fundamental task in computer vision. Deep neural networks (DNNs) have shown promising performance on this task. Due to the large variability of shapes, accurate recognition relies on good estimates of model uncertainty, ignored in traditional training of DNNs, typically learned via stochastic optimization. This paper leverages recent advances in stochastic gradient Markov Chain Monte Carlo (SG-MCMC) to learn weight uncertainty in DNNs. It yields principled Bayesian interpretations for the commonly used Dropout/DropConnect techniques and incorporates them into the SG-MCMC framework. Extensive experiments on 2D & 3D shape datasets and various DNN models demonstrate the superiority of the proposed approach over stochastic optimization. Our approach yields higher recognition accuracy when used in conjunction with Dropout and Batch-Normalization.
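For concreteness, one update of stochastic gradient Langevin dynamics, the simplest SG-MCMC scheme, in generic form (not necessarily the paper's exact sampler; grad_log_post is an assumed user-supplied stochastic estimate of the log-posterior gradient):

```python
# One SGLD step: a gradient ascent step on the log-posterior plus Gaussian
# noise with variance equal to the step size, so the iterates (after burn-in,
# with decaying step size) are approximate posterior samples of the weights.
import numpy as np

def sgld_step(theta, grad_log_post, minibatch, step_size, rng):
    """theta <- theta + (eps/2) * grad log p(theta | data) + N(0, eps)."""
    g = grad_log_post(theta, minibatch)          # stochastic gradient estimate
    noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * g + noise
```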
Long Term Uncertainty Investigations of 1 MN Force Calibration Machine at NPL, India (NPLI)
NASA Astrophysics Data System (ADS)
Kumar, Rajesh; Kumar, Harish; Kumar, Anil; Vikram
2012-01-01
The present paper is an attempt to study the long term uncertainty of 1 MN hydraulic multiplication system (HMS) force calibration machine (FCM) at the National Physical Laboratory, India (NPLI), which is used for calibration of the force measuring instruments in the range of 100 kN - 1 MN. The 1 MN HMS FCM was installed at NPLI in 1993 and was built on the principle of hydraulic amplifications of dead weights. The best measurement capability (BMC) of the machine is ± 0.025% (
Gregersen, I B; Arnbjerg-Nielsen, K
2012-01-01
Several extraordinary rainfall events have occurred in Denmark within the last few years. For each event, problems occurred in urban areas as the capacity of the existing drainage systems was exceeded. Adaptation to climate change is necessary but also very challenging, as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers to the initiation and implementation of adaptation strategies is therefore the uncertainty in predicting the magnitude of extreme rainfall in the future. This challenge is explored through the application and discussion of three different theoretical decision support strategies: the precautionary principle, the minimax strategy and Bayesian decision support. The reviewed decision support strategies all proved valuable for addressing the identified uncertainties, and are best applied together, as each yields information that improves decision making and thus enables more robust decisions.
Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.
Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A
2014-01-01
The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty-both in the functional network edges and the corresponding aggregate measures of network topology-are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here-appropriate for static and dynamic network inference and different statistical measures of coupling-permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.
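The trial-based resampling idea can be sketched as follows, with zero-lag correlation standing in for the coupling measure; the paper's procedure and statistics differ in detail.

```python
import numpy as np

def edge_confidence_intervals(trials, n_boot=1000, alpha=0.05, seed=0):
    """Bootstrap over trials to attach uncertainty to network edges.

    trials: array of shape (n_trials, n_sensors, n_samples). Edges are
    the trial-averaged correlation between sensor time series, a simple
    stand-in for the coupling measure of choice.
    """
    rng = np.random.default_rng(seed)
    n_trials = trials.shape[0]

    def network(subset):
        return np.mean([np.corrcoef(t) for t in subset], axis=0)

    boots = np.array([network(trials[rng.integers(0, n_trials, n_trials)])
                      for _ in range(n_boot)])
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return network(trials), lo, hi  # point estimate and per-edge CI
```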
NASA Astrophysics Data System (ADS)
Gu, Tingwei; Kong, Deren; Shang, Fei; Chen, Jing
2018-04-01
This paper describes the merits and demerits of different sensors for measuring propellant gas pressure, the applicable range of the frequently used dynamic pressure calibration methods, and the working principle of absolute quasi-static pressure calibration based on the drop-weight device. The main factors affecting the accuracy of pressure calibration are analyzed from the two aspects of the force sensor and the piston area. To calculate the effective area of the piston rod and to evaluate the uncertainty of the relation between the force sensor output and the corresponding peak pressure in the absolute quasi-static pressure calibration process, a method based on the least squares principle is proposed. According to the relevant quasi-static pressure calibration experimental data, the least squares fitting model between the peak force and the peak pressure, and the effective area of the piston rod together with its measurement uncertainty, are obtained. The fitting model is tested with an additional group of experiments, and the peak pressure obtained by the existing high-precision comparison calibration method is taken as the reference value. The test results show that the peak pressure obtained by the least squares fitting model is closer to the reference value than the one directly calculated from the cross-sectional area of the piston rod. When the peak pressure is higher than 150 MPa, the percentage difference is less than 0.71%, which can meet the requirements of practical application.
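The least squares step can be sketched with hypothetical calibration pairs as follows; the fitted slope plays the role of the effective piston area, and none of the numbers come from the paper.

```python
import numpy as np

# Hypothetical calibration pairs: reference peak pressure (MPa) and the
# corresponding peak force (kN) read from the force sensor.
pressure_MPa = np.array([50.1, 100.3, 150.2, 200.8, 250.9])
force_kN = np.array([61.0, 122.4, 183.1, 244.6, 305.8])

# Least squares fit F = a*p + b; the slope a (kN/MPa) corresponds to an
# effective piston area of a*1000 mm^2, since 1 MPa = 1 N/mm^2.
a, b = np.polyfit(pressure_MPa, force_kN, 1)
effective_area_mm2 = a * 1000.0

# The residual standard deviation feeds the uncertainty budget of the fit.
residuals = force_kN - (a * pressure_MPa + b)
s = np.sqrt(np.sum(residuals**2) / (len(force_kN) - 2))
print(effective_area_mm2, s)
```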
Hollon, Matthew F
2015-01-01
Web-based tools in medical education offer opportunities to teach important principles from the general competencies of graduate medical education in innovative ways. Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. The frequency with which residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project, from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correctly when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents.
Prevalence of Pervasive Developmental Disorders in Two Canadian Provinces
ERIC Educational Resources Information Center
Ouellette-Kuntz, Helene; Coo, Helen; Yu, C. T.; Chudley, Albert E.; Noonan, Andrea; Breitenbach, Marlene; Ramji, Nasreen; Prosick, Talia; Bedard, Angela; Holden, Jeanette J. A.
2006-01-01
Although it is generally accepted that the proportion of children diagnosed with pervasive developmental disorders (PDDs) has increased in the past two decades, there is no consensus on the prevalence of these conditions. The accompanying large rise in demand for services, together with uncertainty regarding the extent to which the observed…
Rosi, G.; D'Amico, G.; Cacciapuoti, L.; Sorrentino, F.; Prevedelli, M.; Zych, M.; Brukner, Č.; Tino, G. M.
2017-01-01
The Einstein equivalence principle (EEP) has a central role in the understanding of gravity and space–time. In its weak form, or weak equivalence principle (WEP), it directly implies equivalence between inertial and gravitational mass. Verifying this principle in a regime where the relevant properties of the test body must be described by quantum theory has profound implications. Here we report on a novel WEP test for atoms: a Bragg atom interferometer in a gravity gradiometer configuration compares the free fall of rubidium atoms prepared in two hyperfine states and in their coherent superposition. The use of the superposition state allows testing genuine quantum aspects of EEP with no classical analogue, which have remained completely unexplored so far. In addition, we measure the Eötvös ratio of atoms in two hyperfine levels with relative uncertainty in the low 10⁻⁹ range, improving previous results by almost two orders of magnitude. PMID:28569742
Estimation of divergence from Hardy-Weinberg form.
Stark, Alan E
2015-08-01
The Hardy–Weinberg (HW) principle explains how random mating (RM) can produce and maintain a population in equilibrium, that is, with constant genotypic proportions. When proportions diverge from HW form, it is of interest to estimate the fixation index F, which reflects the degree of divergence. Starting from a sample of genotypic counts, a mixed procedure gives first the orthodox estimate of gene frequency q and then a Bayesian estimate of F, based on a credible prior distribution of F, which is described here.
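For context, the classical moment estimate of F from genotype counts is sketched below; the paper's mixed orthodox/Bayesian procedure is not reproduced here.

```python
def fixation_index(n_AA, n_Aa, n_aa):
    """Classical moment estimate of the fixation index F.

    F = 1 - H_obs / H_exp, where H_obs is the observed heterozygote
    proportion and H_exp = 2*q*(1-q) is the Hardy-Weinberg expectation.
    """
    n = n_AA + n_Aa + n_aa
    q = (2 * n_AA + n_Aa) / (2 * n)   # orthodox gene-frequency estimate
    h_obs = n_Aa / n
    h_exp = 2 * q * (1 - q)
    return 1 - h_obs / h_exp

# Hypothetical counts: F > 0 indicates a deficit of heterozygotes
print(fixation_index(n_AA=50, n_Aa=30, n_aa=20))  # about 0.34
```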
Djulbegovic, Benjamin; Cantor, Alan; Clarke, Mike
2003-01-01
Previous research has identified methodological problems in the design and conduct of randomized trials that could, if left unaddressed, lead to biased results. In this report we discuss one such problem, inadequate control intervention, and argue that it can be by far the most important design characteristic of randomized trials in overestimating the effect of new treatments. Current guidelines for the design and reporting of randomized trials, such as the Consolidated Standards of Reporting Trials (CONSORT) statement, do not address the choice of the comparator intervention. We argue that an adequate control intervention can be selected if people designing a trial explicitly take into consideration the ethical principle of equipoise, also known as "the uncertainty principle."
Decoherence effect on quantum-memory-assisted entropic uncertainty relations
NASA Astrophysics Data System (ADS)
Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2018-01-01
The uncertainty principle provides a bound on the precision with which any two incompatible observables can be predicted, and thereby plays a nontrivial role in quantum precision measurement. In this work, we observe the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noises. Herein, we derive the dynamical evolution of the entropic uncertainty with respect to the measurements when the canonical GAD noises act and particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in three realistic scenarios: in the first, particle A is affected by the environmental (GAD) noise while particle B, serving as quantum memory, is free from any noise; in the second, particle B is affected by the external noise while particle A is not; and in the third, both particles suffer from the noise. By analytical methods, it turns out that the uncertainty is determined not fully by the quantum correlation evolution of the composite system consisting of A and B, but rather by the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation of the behavior of the uncertainty evolution in terms of the mixedness of the observed system, arguing that the uncertainty is closely correlated with that mixedness. Finally, we put forward a simple and effective strategy to reduce the measurement uncertainty of interest by means of quantum partially collapsed measurements. Our explorations might therefore offer insight into the dynamics of the entropic uncertainty relation in a realistic system, and be of importance to quantum precision measurement during quantum information processing.
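The incompatibility term entering such relations can be evaluated directly. The sketch below computes the Maassen-Uffink bound log2(1/c) for qubit Pauli Z and X measurements, a simpler special case than the memory-assisted relation studied in the paper.

```python
import numpy as np

# Eigenbases (as columns) of two incompatible qubit observables.
z_basis = np.array([[1, 0], [0, 1]], dtype=complex)                # Pauli Z
x_basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Pauli X

# Maximal overlap c = max_{i,j} |<x_i|z_j>|^2 between the two bases.
overlaps = np.abs(x_basis.conj().T @ z_basis) ** 2
c = overlaps.max()

# Incompatibility term of the entropic uncertainty relation
# S(X|B) + S(Z|B) >= log2(1/c) + S(A|B):
print(np.log2(1.0 / c))  # 1 bit for mutually unbiased qubit bases
```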
NASA Astrophysics Data System (ADS)
Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.
2014-11-01
In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a 'valid' measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an 'outlier' measurement. Finally, the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data and experimental measurements. In this work, U_68.5 uncertainties are estimated at the 68.5% confidence level while U_95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
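One such correlation SNR metric, the primary peak ratio, can be sketched as follows; the FFT-based correlation and the fixed exclusion neighborhood are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

def primary_peak_ratio(window_a, window_b, exclusion=2):
    """Primary-to-secondary peak ratio (PPR) of a correlation plane."""
    fa, fb = np.fft.fft2(window_a), np.fft.fft2(window_b)
    corr = np.real(np.fft.ifft2(fa * np.conj(fb)))
    corr -= corr.min()                    # minimum subtraction, as above
    i, j = np.unravel_index(np.argmax(corr), corr.shape)
    primary = corr[i, j]
    # Mask a neighborhood around the primary peak so the secondary peak
    # is a genuinely distinct maximum, not the primary peak's flank.
    masked = corr.copy()
    masked[max(0, i - exclusion):i + exclusion + 1,
           max(0, j - exclusion):j + exclusion + 1] = 0.0
    return primary / masked.max()
```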
A statistical method for lung tumor segmentation uncertainty in PET images based on user inference.
Zheng, Chaojie; Wang, Xiuying; Feng, Dagan
2015-01-01
PET has been widely accepted as an effective imaging modality for lung tumor diagnosis and treatment. However, standard criteria for delineating tumor boundary from PET have yet to be developed, largely due to the relatively low quality of PET images, uncertain tumor boundary definition, and the variety of tumor characteristics. In this paper, we propose a statistical solution to segmentation uncertainty on the basis of user inference. We first define the uncertainty segmentation band on the basis of a segmentation probability map constructed from the Random Walks (RW) algorithm; then, based on the extracted features of the user inference, we use Principal Component Analysis (PCA) to formulate the statistical model for labeling the uncertainty band. We validated our method on 10 lung PET-CT phantom studies from the public RIDER collections [1] and 16 clinical PET studies where tumors were manually delineated by two experienced radiologists. The methods were validated using the Dice similarity coefficient (DSC) to measure the spatial volume overlap. Our method achieved an average DSC of 0.878 ± 0.078 on phantom studies and 0.835 ± 0.039 on clinical studies.
Experimental validation of a self-calibrating cryogenic mass flowmeter
NASA Astrophysics Data System (ADS)
Janzen, A.; Boersch, M.; Burger, B.; Drache, J.; Ebersoldt, A.; Erni, P.; Feldbusch, F.; Oertig, D.; Grohmann, S.
2017-12-01
The Karlsruhe Institute of Technology (KIT) and the WEKA AG jointly develop a commercial flowmeter for application in helium cryostats. The flowmeter functions according to a new thermal measurement principle that eliminates all systematic uncertainties and enables self-calibration during real operation. Ideally, the resulting uncertainty of the measured flow rate depends only on signal noise, which is typically very small relative to the measured value. Under real operating conditions, cryoplant-dependent flow rate fluctuations induce an additional uncertainty, which follows from the sensitivity of the method. This paper presents experimental results with helium at temperatures between 30 and 70 K and flow rates in the range of 4 to 12 g/s. The experiments were carried out in a control cryostat of the 2 kW helium refrigerator of the TOSKA test facility at KIT. Inside the cryostat, the new flowmeter was installed in series with a Venturi tube that was used for reference measurements. The measurement results demonstrate the self-calibration capability during real cryoplant operation. The influences of temperature and flow rate fluctuations on the self-calibration uncertainty are discussed.
While Heisenberg Is Not Looking: The Strength of "Weak Measurements" in Educational Research
ERIC Educational Resources Information Center
Geelan, David R.
2015-01-01
The concept of "weak measurements" in quantum physics is a way of "cheating" the Uncertainty Principle. Heisenberg stated (and 85 years of experiments have demonstrated) that it is impossible to know both the position and momentum of a particle with arbitrary precision. More precise measurements of one decrease the precision…
Using Familiar Contexts to Ease the Transition between A-Level and First-Year Degree-Level Chemistry
ERIC Educational Resources Information Center
Turner, John J.
2013-01-01
This article endeavours to define how an understanding of the context of chemical principles and processes investigated at A-level (post-16) and earlier can be continued and contribute to easing the tensions and uncertainties encountered by chemistry and chemical engineering students on entry to university. The importance of using chemistry…
The Role of Health Education in Addressing Uncertainty about Health and Cell Phone Use--A Commentary
ERIC Educational Resources Information Center
Ratnapradipa, Dhitinut; Dundulis, William P., Jr.; Ritzel, Dale O.; Haseeb, Abdul
2012-01-01
Although the fundamental principles of health education remain unchanged, the practice of health education continues to evolve in response to the rapidly changing lifestyles and technological advances. Emerging health risks are often associated with these lifestyle changes. The purpose of this article is to address the role of health educators…
From Autonomy to Relationships: Productive Engagement with Uncertainty
ERIC Educational Resources Information Center
Clegg, J.; Lansdall-Welfare, R.
2010-01-01
This paper argues that we are at a point of change in ID services, that new ideas and different frames of reference are required to take services forward in the 21st century. We describe how contemporary thinking in architecture, philosophy and organisational theory can assist in generating service principles for specialist services that allow us…
ERIC Educational Resources Information Center
Dunne, Ciaran
2011-01-01
Although many academics and policymakers espouse the idea of an intercultural curriculum in principle, the practical implementation of this is problematic for several reasons. Firstly, the ambiguity and uncertainty that often surrounds key concepts complicates the articulation of cogent rationales and goals. Secondly, there may be no clear vision…
A Methodology for Validation of High Resolution Combat Models
1988-06-01
Table-of-contents fragments: The Teleological Problem; The Epistemological Problem; The Uncertainty Principle. Surviving text discusses theoretical issues: "The Teleological Problem" concerns how a model by its nature formulates an explicit cause-and-effect relationship that excludes others, and the role of "experts" in establishing the standard for reality, noting that generalization from personal experience is often hampered by its parochial aspects.
Shen, Yu-Ming; Le, Lien D; Wilson, Rory; Mansmann, Ulrich
2017-01-09
Biomarkers providing evidence for patient-treatment interaction are key in the development and practice of personalized medicine. Knowledge that a patient with a specific feature - as demonstrated through a biomarker - would have an advantage under a given treatment vs. a competing treatment can aid immensely in medical decision-making. Statistical strategies to establish evidence for continuous biomarkers are complex, and their formal results are thus not easy to communicate. Good graphical representations would help to translate such findings for use in the clinical community. Although general guidelines on how to present figures in clinical reports are available, there remains little guidance for figures elucidating the role of continuous biomarkers in patient-treatment interaction (CBPTI). To address the current lack of comprehensive reviews or adequate guides on graphical presentation within this topic, our study proposes presentation principles for CBPTI plots. In order to understand current practice, we review the development of CBPTI methodology and how CBPTI plots are currently used in clinical research. The quality of a CBPTI plot is determined by how well the presentation provides key information for clinical decision-making. Several criteria for a good CBPTI plot are proposed, including general principles of visual display, use of units presenting absolute outcome measures, appropriate quantification of statistical uncertainty, correct display of benchmarks, and informative content for answering clinical questions, especially on the quantitative advantage a specific treatment offers an individual patient. We examined the development of CBPTI methodology over the years 2000-2014, and reviewed how CBPTI plots were used in clinical research in six major clinical journals from 2013-2014, using the principle of theoretical saturation. Each CBPTI plot found was assessed for the appropriateness of its presentation and clinical utility. In our review, a total of seven methodological papers and five clinical reports used CBPTI plots, which we categorized into four types: those that distinguish the outcome effect for each treatment group; those that show the outcome differences between treatment groups (by either partitioning all individuals into subpopulations or modelling the functional form of the interaction); those that evaluate the proportion of population impact of the biomarker; and those that show the classification accuracy of the biomarker. Current practice in clinical reports suffers from methodological shortcomings: statistical uncertainty is not presented, outcome measures are scaled in relative rather than absolute units, benchmarks are used incorrectly, and plots are often uninformative for answering clinical questions. There is considerable scope for improvement in the graphical representation of CBPTI in clinical reports. The current challenge is to develop instruments for high-quality graphical plots which not only convey quantitative concepts to readers with limited statistical knowledge, but also facilitate medical decision-making.
Maximum predictive power and the superposition principle
NASA Technical Reports Server (NTRS)
Summhammer, Johann
1994-01-01
In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.
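One standard way to realize such an associated variable, stated here as an assumption consistent with the abstract rather than as the paper's construction, is the variance-stabilizing arcsine transform:

```latex
% For a probability p estimated from N runs, \Delta p \approx
% \sqrt{p(1-p)/N} depends on p itself. The transformed variable
\chi = 2\arcsin\sqrt{p}, \qquad
\Delta\chi \approx \left|\frac{d\chi}{dp}\right| \Delta p
  = \frac{1}{\sqrt{p(1-p)}}\sqrt{\frac{p(1-p)}{N}}
  = \frac{1}{\sqrt{N}},
% has an uncertainty interval that depends only on the number of runs
% and strictly decreases with it.
```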
A novel continuous fractional sliding mode control
NASA Astrophysics Data System (ADS)
Muñoz-Vázquez, A. J.; Parra-Vega, V.; Sánchez-Orta, A.
2017-10-01
A new fractional-order controller is proposed, whose novelty is twofold: (i) it withstands a class of continuous but not necessarily differentiable disturbances as well as uncertainties and unmodelled dynamics, and (ii) based on a principle of dynamic memory resetting of the differintegral operator, an invariant sliding mode is enforced in finite time. Both (i) and (ii) account for exponential convergence of tracking errors, and the resetting principle is instrumental in demonstrating closed-loop stability, robustness and a sustained sliding motion, as well as in showing that high frequencies are filtered out of the control signal. The proposed methodology is illustrated with a representative simulation study.
Uncertainties in selected river water quality data
NASA Astrophysics Data System (ADS)
Rode, M.; Suhr, U.
2007-02-01
Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. Empirical quality of river water quality data is rarely certain and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected river water quality data, i.e. suspended sediment, nitrogen fraction, phosphorus fraction, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2005). A literature review was carried out including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerable higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location has also considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicates. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be stated well suited for field and laboratory situations for all considered constituents. Analytical errors can contribute considerably to the overall uncertainty of river water quality data. Temporal autocorrelation of river water quality data is present but literature on general behaviour of water quality compounds is rare. For meso scale river catchments (500-3000 km2) reasonable yearly dissolved load calculations can be achieved using biweekly sample frequencies. For suspended sediments none of the methods investigated produced very reliable load estimates when weekly concentrations data were used. Uncertainties associated with loads estimates based on infrequent samples will decrease with increasing size of rivers.
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
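A minimal sketch of the two ingredients named above, Bayes' rule for a binary test and the entropy quantifying residual diagnostic uncertainty, follows; the numbers are hypothetical.

```python
import math

def post_test_probability(pretest, sensitivity, specificity, positive):
    """Bayes' rule for a binary diagnostic test result."""
    if positive:
        num = sensitivity * pretest
        den = num + (1 - specificity) * (1 - pretest)
    else:
        num = (1 - sensitivity) * pretest
        den = num + specificity * (1 - pretest)
    return num / den

def entropy_bits(p):
    """Binary entropy: residual diagnostic uncertainty in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

pre = 0.30                       # hypothetical pre-test probability
post = post_test_probability(pre, 0.90, 0.80, positive=False)
# Uncertainty change for this particular result; it can be negative for
# an unexpected result, though its expectation over results (the mutual
# information between test and disease) never is.
print(post, entropy_bits(pre) - entropy_bits(post))
```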
Borrmann, Robin
2010-01-01
This article examines whether the use of Depleted Uranium (DU) munitions can be considered illegal under current public international law. The analysis covers the law of arms control and focuses in particular on international humanitarian law. The article argues that DU ammunition cannot be addressed adequately under existing treaty-based weapon bans, such as the Chemical Weapons Convention, due to the fact that DU does not meet the criteria required to trigger the applicability of those treaties. Furthermore, it is argued that continuing uncertainties regarding the effects of DU munitions impede a reliable review of the legality of their use under various principles of international law, including the prohibition on employing indiscriminate weapons; the prohibition on weapons that are intended, or may be expected, to cause widespread, long-term and severe damage to the natural environment; and the prohibition on causing unnecessary suffering or superfluous injury. All of these principles require complete knowledge of the effects of the weapon in question. Nevertheless, the author argues that the same uncertainty places restrictions on the use of DU under the precautionary principle. The paper concludes with an examination of whether or not there is a need for--and if so whether there is a possibility of achieving--a Convention that comprehensively outlaws the use, transfer and stockpiling of DU weapons, as proposed by some non-governmental organisations (NGOs).
Hayford, Sarah R.; Agadjanian, Victor
2012-01-01
In many high-fertility countries, and especially in sub-Saharan Africa, substantial proportions of women give non-numeric responses when asked about desired family size. Demographic transition theory has interpreted responses of “don’t know” or “up to God” as evidence of fatalistic attitudes toward childbearing. Alternatively, these responses can be understood as meaningful reactions to uncertainty about the future. Following this latter approach, we use data from rural Mozambique to test the hypothesis that non-numeric responses are more common when uncertainty about the future is greater. We expand on previous research linking child mortality and non-numeric fertility preferences by testing the predictive power of economic conditions, marital instability, and adult mortality. Results show that uncertainty related to adult and child mortality and to economic conditions predicts non-numeric responses, while marital stability is less strongly related. PMID:26430294
Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R. Nicholas
2018-01-01
Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such disposition, regarding experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and an excellent convergent validity with Part B. A bifactor model was the best fitting model for Part B. Based on these results, we compared the hierarchical factor scores with summated ratings for clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores. PMID:29632505
Nuclear Forensics and Radiochemistry: Radiation Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rundberg, Robert S.
Radiation detection is necessary for isotope identification and assay in nuclear forensic applications. The principles of operation of gas proportional counters, scintillation counters, germanium and silicon semiconductor counters will be presented. Methods for calibration and potential pitfalls in isotope quantification will be described.
A Crisis of Legendary Proportions.
ERIC Educational Resources Information Center
Simpson, Christopher
2001-01-01
Describes the activities of Indiana University's crisis communications team during the Bob Knight controversy. Discusses how the school's response was based on four crisis communications principles: create a plan, appoint a single spokesperson, respond with open and continuous communications, and expect the unexpected. (EV)
NASA Astrophysics Data System (ADS)
Audet, J.; Martinsen, L.; Hasler, B.; de Jonge, H.; Karydi, E.; Ovesen, N. B.; Kronvang, B.
2014-11-01
Eutrophication of aquatic ecosystems caused by excess concentrations of nitrogen and phosphorus may have harmful consequences for biodiversity and poses a health risk to humans via water supplies. Reduction of nitrogen and phosphorus losses to aquatic ecosystems involves implementation of costly measures, and reliable monitoring methods are therefore essential to select appropriate mitigation strategies and to evaluate their effects. Here, we compare the performances and costs of three methodologies for the monitoring of nutrients in rivers: grab sampling; time-proportional sampling; and passive sampling using flow-proportional samplers. Assuming hourly time-proportional sampling to be the best estimate of the "true" nutrient load, our results showed that the risk of obtaining wrong total nutrient load estimates by passive samplers is high despite similar costs as the time-proportional sampling. Our conclusion is that for passive samplers to provide a reliable monitoring alternative, further development is needed. Grab sampling was the cheapest of the three methods and was more precise and accurate than passive sampling. We conclude that although monitoring employing time-proportional sampling is costly, its reliability precludes unnecessarily high implementation expenses.
The role of uncertainty and reward on eye movements in a virtual driving task
Sullivan, Brian T.; Johnson, Leif; Rothkopf, Constantin A.; Ballard, Dana; Hayhoe, Mary
2012-01-01
Eye movements during natural tasks are well coordinated with ongoing task demands and many variables could influence gaze strategies. Sprague and Ballard (2003) proposed a gaze-scheduling model that uses a utility-weighted uncertainty metric to prioritize fixations on task-relevant objects and predicted that human gaze should be influenced by both reward structure and task-relevant uncertainties. To test this conjecture, we tracked the eye movements of participants in a simulated driving task where uncertainty and implicit reward (via task priority) were varied. Participants were instructed to simultaneously perform a Follow Task where they followed a lead car at a specific distance and a Speed Task where they drove at an exact speed. We varied implicit reward by instructing the participants to emphasize one task over the other and varied uncertainty in the Speed Task with the presence or absence of uniform noise added to the car's velocity. Subjects' gaze data were classified for the image content near fixation and segmented into looks. Gaze measures, including look proportion, duration and interlook interval, showed that drivers more closely monitor the speedometer if it had a high level of uncertainty, but only if it was also associated with high task priority or implicit reward. The interaction observed appears to be an example of a simple mechanism whereby the reduction of visual uncertainty is gated by behavioral relevance. This lends qualitative support for the primary variables controlling gaze allocation proposed in the Sprague and Ballard model. PMID:23262151
NASA Astrophysics Data System (ADS)
Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.
2017-12-01
Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net storage and emissions within the Agriculture, Forestry and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as of stable or changing wetlands, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light-detection and ranging elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
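The unbiased area estimation described above is consistent with the standard stratified design-based estimator; a sketch under that assumption (the implementation used for the NGGI analysis may differ):

```python
import numpy as np

def unbiased_class_areas(confusion, mapped_areas):
    """Stratified design-based area estimates from an error matrix.

    confusion[i, j]: reference-sample counts with map class i and
    reference class j; mapped_areas[i]: mapped area of class i.
    Returns error-adjusted class areas and their standard errors.
    """
    confusion = np.asarray(confusion, dtype=float)
    mapped_areas = np.asarray(mapped_areas, dtype=float)
    a_tot = mapped_areas.sum()
    n_i = confusion.sum(axis=1, keepdims=True)   # samples per map class
    W = (mapped_areas / a_tot)[:, None]          # map-class area weights
    q = confusion / n_i                          # within-class proportions
    area_hat = a_tot * (W * q).sum(axis=0)       # commission/omission adjusted
    var = (W**2 * q * (1 - q) / (n_i - 1)).sum(axis=0)
    return area_hat, a_tot * np.sqrt(var)

# Hypothetical two-class example: stable wetland vs. change
areas, se = unbiased_class_areas([[180, 20], [15, 85]], [9000.0, 1000.0])
```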
Uncertainty and the difficulty of thinking through disjunctions.
Shafir, E
1994-01-01
This paper considers the relationship between decision under uncertainty and thinking through disjunctions. Decision situations that lead to violations of Savage's sure-thing principle are examined, and a variety of simple reasoning problems that often generate confusion and error are reviewed. The common difficulty is attributed to people's reluctance to think through disjunctions. Instead of hypothetically traveling through the branches of a decision tree, it is suggested, people suspend judgement and remain at the node. This interpretation is applied to instances of decision making, information search, deductive and inductive reasoning, probabilistic judgement, games, puzzles and paradoxes. Some implications of the reluctance to think through disjunctions, as well as potential corrective procedures, are discussed.
Lunar Laser Ranging Science: Gravitational Physics and Lunar Interior and Geodesy
NASA Technical Reports Server (NTRS)
Williams, James G.; Turyshev, Slava G.; Boggs, Dale H.; Ratcliff, J. Todd
2004-01-01
Laser pulses fired at retroreflectors on the Moon provide very accurate ranges. Analysis yields information on Earth, Moon, and orbit. The highly accurate retroreflector positions have uncertainties less than a meter. Tides on the Moon show strong dissipation, with Q = 33 ± 4 at a month and a weak dependence on period. Lunar rotation depends on interior properties; a fluid core is indicated with radius approximately 20% that of the Moon. Tests of relativistic gravity verify the equivalence principle to ±1.4×10⁻¹³, limit deviations from Einstein's general relativity, and show no rate of change of the gravitational constant, Ġ/G, with uncertainty 9×10⁻¹³/yr.
High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME
NASA Astrophysics Data System (ADS)
Otis, Richard A.; Liu, Zi-Kui
2017-05-01
One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts to develop new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.
Early object labels: the case for a developmental lexical principles framework.
Golinkoff, R M; Mervis, C B; Hirsh-Pasek, K
1994-02-01
Universally, object names make up the largest proportion of any word type found in children's early lexicons. Here we present and critically evaluate a set of six lexical principles (some previously proposed and some new) for making object label learning a manageable task. Overall, the principles have the effect of reducing the amount of information that language-learning children must consider for what a new word might mean. These principles are constructed by children in a two-tiered developmental sequence, as a function of their sensitivity to linguistic input, contextual information, and social-interactional cues. Thus, the process of lexical acquisition changes as a result of the particular principles a given child has at his or her disposal. For children who have only the principles of the first tier (reference, extendibility, and object scope), word learning has a deliberate and laborious look. The principles of the second tier (categorical scope, 'novel name-nameless category' or N3C, and conventionality) enable the child to acquire many new labels rapidly. The present unified account is argued to have a number of advantages over treating such principles separately and non-developmentally. Further, the explicit recognition that the acquisition and operation of these principles is influenced by the child's interpretation of both linguistic and non-linguistic input is seen as an advance.
Morin, Xavier; Thuiller, Wilfried
2009-05-01
Obtaining reliable predictions of species range shifts under climate change is a crucial challenge for ecologists and stakeholders. At the continental scale, niche-based models have been widely used in the last 10 years to predict the potential impacts of climate change on species distributions all over the world, although these models do not include any mechanistic relationships. In contrast, species-specific, process-based predictions remain scarce at the continental scale. This is regrettable because to secure relevant and accurate predictions it is always desirable to compare predictions derived from different kinds of models applied independently to the same set of species and using the same raw data. Here we compare predictions of range shifts under climate change scenarios for 2100 derived from niche-based models with those of a process-based model for 15 North American boreal and temperate tree species. A general pattern emerged from our comparisons: niche-based models tend to predict a stronger level of extinction and a greater proportion of colonization than the process-based model. This result likely arises because niche-based models do not take phenotypic plasticity and local adaptation into account. Nevertheless, as the two kinds of models rely on different assumptions, their complementarity is revealed by common findings. Both modeling approaches highlight a major potential limitation on species tracking their climatic niche because of migration constraints and identify similar zones where species extirpation is likely. Such convergent predictions from models built on very different principles provide a useful way to offset uncertainties at the continental scale. This study shows that the use in concert of both approaches with their own caveats and advantages is crucial to obtain more robust results and that comparisons among models are needed in the near future to gain accuracy regarding predictions of range shifts under climate change.
Asymmetric Uncertainty Expression for High Gradient Aerodynamics
NASA Technical Reports Server (NTRS)
Pinier, Jeremy T
2012-01-01
When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
The Role Institutional Research Plays in Navigating the Current Economic Uncertainty
ERIC Educational Resources Information Center
Worley, Mary Beth
2008-01-01
Nationally, state spending for public higher education has been declining as a proportion of state general fund expenditures. Traditionally state-appropriated budgets for public colleges and universities tend to be cut during times of economic crisis, but often these funds are not always restored once the crisis has passed. As a result "the…
Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties
NASA Astrophysics Data System (ADS)
Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.
2018-01-01
Limited data exist on emissions from agriculture-driven deforestation, and available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with input datasets (activity data and emissions factors) were used to combine the datasets, where the most certain datasets contribute the most. This method utilizes all the input data, while minimizing the uncertainty of the emissions estimate. The uncertainty of input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. Area of deforestation, and the agriculture-driver factor (the extent to which agriculture drives deforestation), were the most uncertain components of the emissions estimates, thus improvement in the uncertainties related to these estimates will provide the greatest reductions in the uncertainties of emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%), and Africa had the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, and these peaked at 974 ± 148 Mt CO₂ yr⁻¹ in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO₂ yr⁻¹), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation are ±62.4% (average over 1990-2015), and uncertainties were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting, and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend that new data providers include this information.
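Combining datasets so that "the most certain datasets contribute the most" suggests an inverse-variance weighting; a sketch under that assumption (the paper's exact combination scheme may differ):

```python
import numpy as np

def inverse_variance_combine(estimates, sigmas):
    """Weighted mean in which more certain inputs contribute more.

    estimates: independent estimates of the same quantity (e.g. the
    deforestation area of one country); sigmas: their 1-sigma
    uncertainties. Returns the combined value and its uncertainty.
    """
    est = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    combined = np.sum(w * est) / np.sum(w)
    return combined, np.sqrt(1.0 / np.sum(w))

# Hypothetical activity-data estimates (Mha) from two sources
print(inverse_variance_combine([4.2, 4.9], [0.4, 0.9]))
```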
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Elshall, A. S.; Hanor, J. S.
2012-12-01
Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental constructs such as mathematical expressions on the one hand, and empirical observations such as observation data on the other hand, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied the HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainties. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures with respect to the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models. The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
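A minimal numerical sketch of the weighting idea behind HBMA (our illustration, not the paper's implementation): approximate each calibrated model's evidence with a BIC-like score, convert scores to posterior weights under equal priors, and marginalize over the hierarchy to weight the competing propositions for each uncertain component. All scores below are hypothetical.

```python
import numpy as np

# Sketch of HBMA weight aggregation, assuming BIC-approximated model evidence
# and equal priors. The hierarchy mirrors the study's four uncertain
# components: 3 variograms x 2 stationarity assumptions x 2 geological
# structures x 2 calibration data sets = 24 models.

rng = np.random.default_rng(0)
bic = rng.normal(210.0, 3.0, size=(3, 2, 2, 2))   # hypothetical BIC scores

w = np.exp(-0.5 * (bic - bic.min()))              # shifted for numerical stability
w /= w.sum()                                      # joint weights over the 24 models

# Marginal weight of each competing proposition: sum the joint weights over
# all the other levels of the hierarchy.
print("variogram weights:   ", w.sum(axis=(1, 2, 3)))
print("stationarity weights:", w.sum(axis=(0, 2, 3)))
print("structure weights:   ", w.sum(axis=(0, 1, 3)))
print("data-set weights:    ", w.sum(axis=(0, 1, 2)))
```

The marginal weights make explicit which proposition dominates each uncertain component, which is the kind of uncertainty prioritization the abstract describes.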
Uncertainty information in climate data records from Earth observation
NASA Astrophysics Data System (ADS)
Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang
2017-07-01
The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.
Affirmative Action and the Principle of Equality
ERIC Educational Resources Information Center
Sasseen, Robert F.
1976-01-01
A critical look is taken at affirmative action, which is called a preferential policy of proportional employment. The author suggests that affirmative action actually denies citizens equality of opportunity, writing racial distinctions into law and holding contempt for Blacks and other pretended beneficiaries. (LBH)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mace, Emily K.; Aalseth, Craig E.; Bonicalzi, Ricco
Characterization of two sets of custom unequal-length proportional counters is underway at Pacific Northwest National Laboratory (PNNL). These detectors will be used in measurements to determine the absolute activity concentration of gaseous radionuclides (e.g., 37Ar). A set of three detectors has been fabricated based on previous PNNL ultra-low-background proportional counter (ULBPC) designs and now operates in PNNL's shallow underground counting laboratory. A second set of four counters has also been fabricated using clean assembly of OFHC copper components for use in an above-ground counting laboratory. Characterization of both sets of detectors is underway with measurements of background rates, gas gain, energy resolution, and shielding considerations. These results will be presented along with uncertainty estimates of future absolute gas counting measurements.
Wave-Particle Dualism in Action
NASA Astrophysics Data System (ADS)
Schleich, Wolfgang P.
The wave-particle dualism (that is, the wave nature of particles and the particle nature of light), the uncertainty relation of Werner Heisenberg, and the principle of complementarity formulated by Niels Bohr represent pillars of quantum theory. We provide an introduction to these fascinating yet strange aspects of the microscopic world and summarize key experiments confirming these concepts so alien to our daily life.
CARA: Cognitive Architecture for Reasoning About Adversaries
2012-01-20
In the synthesis approach taken here the KIDS principle (Keep It Descriptive, Stupid) applies, and agents and organizations are profiled in great detail… We developed two algorithms to make forecasts about adversarial behavior. We developed game-theoretical approaches to reason about group behavior. We developed methods to automatically make forecasts about group behavior, together with methods to quantify the uncertainty inherent in such forecasts.
ERIC Educational Resources Information Center
Al-Hudawi, Shafeeq Hussain Vazhathodi; Fong, Rosy Lai Su; Musah, Mohammed Borhandden; Tahir, Lokman Mohd
2014-01-01
In the Malaysian context, all educational processes at the national level are envisioned by the National Education Philosophy (NEP). The NEP was formed in 1988 in line with the National Principles (Rukun Negara) with the ultimate aims of building a united and progressive society (Ministry of Education, 2001). However, there is uncertainty whether…
Entanglement Entropy of the Six-Dimensional Horowitz-Strominger Black Hole
NASA Astrophysics Data System (ADS)
Li, Huai-Fan; Zhang, Sheng-Li; Wu, Yue-Qin; Ren, Zhao
By using the entanglement entropy method, the statistical entropy of the Bose and Fermi fields in a thin film is calculated and the Bekenstein-Hawking entropy of the six-dimensional Horowitz-Strominger black hole is obtained. Here, the Bose and Fermi fields are entangled with the quantum states in the six-dimensional Horowitz-Strominger black hole, and the fields are outside of the horizon. The divergence of the brick-wall model is avoided, without any cutoff, by the new equation of state density obtained with the generalized uncertainty principle. The calculation implies that the high-density quantum states near the event horizon are strongly correlated with the quantum states in the black hole. The black hole entropy is a quantum effect. It is an intrinsic characteristic of space-time. The ultraviolet cutoff in the brick-wall model is unreasonable. The generalized uncertainty principle should be considered in the high-energy quantum field near the event horizon. Using the quantum statistical method, we directly calculate the partition function of the Bose and Fermi fields in the background of the six-dimensional black hole. The difficulty in solving the wave equations of various particles is overcome.
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on properly prepared samples and properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, a comparative method that requires a studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
Lorentz violation and generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Lambiase, Gaetano; Scardigli, Fabio
2018-04-01
Investigations on possible violation of Lorentz invariance have been widely pursued in the last decades, both from theoretical and experimental sides. A comprehensive framework to formulate the problem is the standard model extension (SME) proposed by A. Kostelecky, where violation of Lorentz invariance is encoded into specific coefficients. Here we present a procedure to link the deformation parameter β of the generalized uncertainty principle to the SME coefficients of the gravity sector. The idea is to compute the Hawking temperature of a black hole in two different ways. The first way involves the deformation parameter β, and therefore we get a deformed Hawking temperature containing the parameter β. The second way involves a deformed Schwarzschild metric containing the Lorentz violating terms s̄^{μν} of the gravity sector of the SME. The comparison between the two different techniques yields a relation between β and s̄^{μν}. In this way bounds on β transferred from s̄^{μν} are improved by many orders of magnitude when compared with those derived in other gravitational frameworks. Also the opposite possibility of bounds transferred from β to s̄^{μν} is briefly discussed.
Duong, Thien C.; Hackenberg, Robert E.; Landa, Alex; ...
2016-09-20
In this paper, thermodynamic and kinetic diffusivities of uranium–niobium (U–Nb) are re-assessed by means of the CALPHAD (CALculation of PHAse Diagram) methodology. In order to improve the consistency and reliability of the assessments, first-principles calculations are coupled with CALPHAD. In particular, heats of formation of γ-U–Nb are estimated and verified using various density-functional theory (DFT) approaches. These thermochemistry data are then used as constraints to guide the thermodynamic optimization process in such a way that the mutual consistency between first-principles calculations and the CALPHAD assessment is satisfactory. In addition, long-term aging experiments are conducted in order to generate new phase equilibria data at the γ2/α+γ2 boundary. These data are meant to verify the thermodynamic model. Assessment results are generally in good agreement with experiments and previous calculations, without showing the artifacts that were observed in previous modeling. The mutually consistent thermodynamic description is then used to evaluate the atomic mobility and diffusivity of γ-U–Nb. Finally, Bayesian analysis is conducted to evaluate the uncertainty of the thermodynamic model and its impact on the system's phase stability.
Two additional principles for determining which species to monitor.
Wilson, Howard B; Rhodes, Jonathan R; Possingham, Hugh P
2015-11-01
Monitoring to detect population declines is widespread, but also costly. There is, consequently, a need to optimize monitoring to maximize cost-effectiveness. Here we develop a quantitative decision analysis framework for how to optimally allocate resources for monitoring among species. By keeping the framework simple, we analytically establish two new principles about which species are optimal to monitor for detecting declines: (1) those that lie on the boundary between species being allocated resources for conservation action and species that are not and (2) those with the greatest uncertainty in whether they are declining. These two principles are in addition to other factors that are also important in monitoring decisions, such as complementarity. We demonstrate the efficacy of these principles when other factors are not present, and show how the two principles can be combined. This analysis demonstrates that the most cost-effective species to monitor are ones where the information gained from monitoring is most likely to change the allocation of funds for action, not necessarily the most vulnerable or endangered. We suggest these results are general and apply to all ecological monitoring, not just of biological species: monitoring and information are only valuable when they are likely to change how people act.
Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea
2017-01-01
Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation to include all of the possible sources of uncertainty related to the CI model with assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model factors hypothesis. The results are discussed for the level and type of reliable information and insights they provide to decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; therefore, this method constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.
MacGillivray, Brian H
2017-08-01
In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.
Li, Yongming; Tong, Shaocheng
The problem of active fault-tolerant control (FTC) is investigated for the large-scale nonlinear systems in nonstrict-feedback form. The nonstrict-feedback nonlinear systems considered in this paper consist of unstructured uncertainties, unmeasured states, unknown interconnected terms, and actuator faults (e.g., bias fault and gain fault). A state observer is designed to solve the unmeasurable state problem. Neural networks (NNs) are used to identify the unknown lumped nonlinear functions so that the problems of unstructured uncertainties and unknown interconnected terms can be solved. By combining the adaptive backstepping design principle with the combination Nussbaum gain function property, a novel NN adaptive output-feedback FTC approach is developed. The proposed FTC controller can guarantee that all signals in all subsystems are bounded, and the tracking errors for each subsystem converge to a small neighborhood of zero. Finally, numerical results of practical examples are presented to further demonstrate the effectiveness of the proposed control strategy.
Uncertainty Analysis in Space Radiation Protection
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.
2011-01-01
Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.
Farmer, T D; Shaw, P J; Williams, I D
2015-05-01
European nations are compelled to reduce reliance on landfill as a destination for household waste, and should, in principle, achieve this goal with due recognition of the aims and principles of the waste hierarchy. Past research has predominantly focused on recycling, whilst interactions between changing waste destinies, causes and drivers of household waste management change, and potential consequences for the goal of the waste hierarchy are less well understood. This study analysed Local Authority Collected Waste (LACW) for England, at national, regional and sub-regional level, in terms of the destination of household waste to landfill, incineration and recycling. Information about waste partnerships, waste management infrastructure and collection systems was collected to help identify and explain changes in waste destinies. Since 1996, the proportion of waste landfilled in England has decreased, in tandem with increases in recycling and incineration. At the regional and sub-regional (Local Authority; LA) level, there have been large variations in the relative proportions of waste landfilled, incinerated and recycled or composted. Annual increases in the proportion of household waste incinerated were typically larger than increases in the proportion recycled. The observed changes took place in the context of legal and financial drivers, and the circumstances of individual LAs (e.g. landfill capacity) also explained the changes seen. Where observed, shifts from landfill towards incineration constitute an approach whereby waste management moves up the waste hierarchy as opposed to an attempt to reach the most preferred option(s); in terms of resource efficiency, this practice is sub-optimal. The requirement to supply incinerators with a feedstock over their lifespan reduces the benefits of developing recycling and waste reduction, although access to incineration infrastructure permits short-term and marked decreases in the proportion of LACW landfilled. We conclude that there is a need for clearer national strategy and co-ordination to inform and guide policy, practice, planning and investment in infrastructure such that waste management can be better aligned with the principles of the circular economy and resource efficiency. If the ongoing stand-off between national political figures and the waste sector continues, England's waste policy remains destined for indecision. Copyright © 2015 Elsevier Ltd. All rights reserved.
The Precautionary Principle and the Tolerability of Blood Transfusion Risks.
Kramer, Koen; Zaaijer, Hans L; Verweij, Marcel F
2017-03-01
Tolerance for blood transfusion risks is very low, as evidenced by the implementation of expensive blood tests and the rejection of gay men as blood donors. Is this low risk tolerance supported by the precautionary principle, as defenders of such policies claim? We discuss three constraints on applying (any version of) the precautionary principle and show that respecting these implies tolerating certain risks. Consistency means that the precautionary principle cannot prescribe precautions that it must simultaneously forbid taking, considering the harms they might cause. Avoiding counterproductivity requires rejecting precautions that cause more harm than they prevent. Proportionality forbids taking precautions that are more harmful than adequate alternatives. When applying these constraints, we argue, attention should not be restricted to harms that are human caused or that affect human health or the environment. Tolerating transfusion risks can be justified if available precautions have serious side effects, such as high social or economic costs.
NASA Astrophysics Data System (ADS)
Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian
2018-01-01
In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.
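As a toy illustration of the error-propagation problem the abstract describes (plain Monte Carlo on a generic two-step rate model, not the paper's adaptive sparse grids or its Co3O4 mechanism; every number below is an assumption for illustration):

```python
import numpy as np

# Propagating bounded DFT energy errors to a turnover frequency (TOF).
# Two sequential steps with assumed barriers of 0.8 and 0.9 eV, prefactor
# 1e13 1/s, and uniform +/-0.2 eV errors on each barrier.

rng = np.random.default_rng(0)
kB_T = 0.0257  # eV, ~300 K

def tof(dE1, dE2):
    k1 = 1e13 * np.exp(-(0.8 + dE1) / kB_T)
    k2 = 1e13 * np.exp(-(0.9 + dE2) / kB_T)
    return k1 * k2 / (k1 + k2)   # effective rate limited by the slower step

n = 100_000
e1 = rng.uniform(-0.2, 0.2, n)   # DFT error, step 1
e2 = rng.uniform(-0.2, 0.2, n)   # DFT error, step 2
log_tof = np.log10(tof(e1, e2))
print("TOF spread (orders of magnitude):", log_tof.max() - log_tof.min())

# Crude variance-based (first-order) sensitivity of log-TOF to each error,
# estimated from conditional means on equal-count bins.
for name, e in [("dE1", e1), ("dE2", e2)]:
    bins = np.quantile(e, np.linspace(0, 1, 21))
    idx = np.digitize(e, bins[1:-1])
    cond_mean = np.array([log_tof[idx == i].mean() for i in range(20)])
    counts = np.bincount(idx, minlength=20)
    main_effect = np.sum(counts / n * (cond_mean - log_tof.mean())**2)
    print(name, "first-order index ~", main_effect / log_tof.var())
```

Even this toy version reproduces the qualitative point: bounded errors of a few tenths of an eV translate into several orders of magnitude of TOF uncertainty, while the sensitivity indices still identify which barrier controls the reactivity.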
Hollon, Matthew F.
2015-01-01
Background: By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Objectives: Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. Method: The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. Results: The frequency with which residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correctly when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Conclusions: Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents. PMID:26653701
Basic Principles of Electrical Network Reliability Optimization in Liberalised Electricity Market
NASA Astrophysics Data System (ADS)
Oleinikova, I.; Krishans, Z.; Mutule, A.
2008-01-01
The authors propose to select long-term solutions to the reliability problems of electrical networks at the stage of development planning. The guidelines or basic principles of such optimization are: 1) its dynamical nature; 2) development sustainability; 3) integrated solution of the problems of network development and electricity supply reliability; 4) consideration of information uncertainty; 5) concurrent consideration of the network and generation development problems; 6) application of specialized information technologies; 7) definition of requirements for independent electricity producers. In the article, the major aspects of the liberalized electricity market, its functions and tasks are reviewed, with emphasis placed on the optimization of electrical network development as a significant component of sustainable management of power systems.
Theoretical aspects of the equivalence principle
NASA Astrophysics Data System (ADS)
Damour, Thibault
2012-09-01
We review several theoretical aspects of the equivalence principle (EP). We emphasize the unsatisfactory fact that the EP maintains the absolute character of the coupling constants of physics, while general relativity and its generalizations (Kaluza-Klein, …, string theory) suggest that all absolute structures should be replaced by dynamical entities. We discuss the EP-violation phenomenology of dilaton-like models, which is likely to be dominated by the linear superposition of two effects: a signal proportional to the nuclear Coulomb energy, related to the variation of the fine-structure constant, and a signal proportional to the surface nuclear binding energy, related to the variation of the light quark masses. We recall various theoretical arguments (including a recently proposed anthropic argument) suggesting that the EP be violated at a small, but not unmeasurably small level. This motivates the need for improved tests of the EP. These tests are probing new territories in physics that are related to deep, and mysterious, issues in fundamental physics.
Not all built the same? A comparative study of electoral systems and population health.
Patterson, Andrew C
2017-09-01
Much literature depicts a worldwide democratic advantage in population health. However, less research compares health outcomes in the different kinds of democracy or autocracy. In an examination of 179 countries as they existed between 1975 and 2012, advantages in life expectancy and infant health appear most reliably for democracies that include the principle of proportional representation in their electoral rules. Compared to closed autocracies, they had up to 12 or more years of life expectancy on average, 75% less infant mortality, and double the savings in overall mortality for most other age groups. Majoritarian democracies, in contrast, did not experience longitudinal improvements in health relative to closed autocracies. Instead their population health appeared to be on par with or even superseded by competitive autocracies in most models. Findings suggest that the principle of proportional representation may be good for health at the national level. Implications and limitations are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)
NASA Astrophysics Data System (ADS)
Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.
2016-06-01
We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
Akazawa, K; Nakamura, T; Moriguchi, S; Shimada, M; Nose, Y
1991-07-01
Small sample properties of the maximum partial likelihood estimates for Cox's proportional hazards model depend on the sample size, the true values of regression coefficients, covariate structure, censoring pattern and possibly baseline hazard functions. Therefore, it would be difficult to construct a formula or table to calculate the exact power of a statistical test for the treatment effect in any specific clinical trial. The simulation program, written in SAS/IML, described in this paper uses Monte-Carlo methods to provide estimates of the exact power for Cox's proportional hazards model. For illustrative purposes, the program was applied to real data obtained from a clinical trial performed in Japan. Since the program does not assume any specific function for the baseline hazard, it is, in principle, applicable to any censored survival data as long as they follow Cox's proportional hazards model.
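The same Monte-Carlo idea can be sketched outside SAS/IML. A hedged Python analogue using the lifelines package (exponential baseline hazard, a binary treatment arm with true log-hazard ratio beta, and uniform administrative censoring are illustrative assumptions, not the paper's trial design):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)

def simulated_power(n=100, beta=np.log(0.6), n_sim=500, alpha=0.05):
    """Estimate power of the Wald test for the treatment effect in a Cox model."""
    rejections = 0
    for _ in range(n_sim):
        x = rng.integers(0, 2, n)                    # treatment arm indicator
        t = rng.exponential(1.0 / np.exp(beta * x))  # event times, h(t|x) = exp(beta*x)
        c = rng.uniform(0, 3, n)                     # censoring times
        df = pd.DataFrame({"T": np.minimum(t, c),
                           "E": (t <= c).astype(int),
                           "x": x})
        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        if cph.summary.loc["x", "p"] < alpha:
            rejections += 1
    return rejections / n_sim

print("Estimated power:", simulated_power())
```

As in the paper, no parametric form for the baseline hazard is required by the test itself; the exponential choice here only generates the synthetic data.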
Stoto, Michael A.
2002-01-01
Two examples-the "swine flu affair" in 1976 and the emergence of HIV in the blood supply in the early 1980s-illustrate the difficulties of decision-making in public health. Both cases illustrate trade-offs between product risks and public health benefits, especially with regard to uncertainty in estimates of product risks, public health risks, and the benefits of prevention. The cases also illustrate the tendency of public health policy makers to go all the way or do nothing at all, rather than consider intermediate options that can be adapted as new information emerges. This review suggests three lessons for public health policy makers: (1) be open and honest about scientific uncertainty; (2) communicate with the public, even when the facts are not clear; and (3) consider intermediate, adaptable policy options, such as obtaining more information, thus reducing uncertainty, and building in decision points to reconsider initial policies. Underlying all of these lessons is the need to commission studies to resolve important uncertainties and increase the information base for public communication, and to review regulations and other policy options in the light of the new data that emerge. PMID:12576534
Learning in Noise: Dynamic Decision-Making in a Variable Environment
Gureckis, Todd M.; Love, Bradley C.
2009-01-01
In engineering systems, noise is a curse, obscuring important signals and increasing the uncertainty associated with measurement. However, the negative effects of noise and uncertainty are not universal. In this paper, we examine how people learn sequential control strategies given different sources and amounts of feedback variability. In particular, we consider people’s behavior in a task where short- and long-term rewards are placed in conflict (i.e., the best option in the short-term is worst in the long-term). Consistent with a model based on reinforcement learning principles (Gureckis & Love, in press), we find that learners differentially weight information predictive of the current task state. In particular, when cues that signal state are noisy and uncertain, we find that participants’ ability to identify an optimal strategy is strongly impaired relative to equivalent amounts of uncertainty that obscure the rewards/valuations of those states. In other situations, we find that noise and uncertainty in reward signals may paradoxically improve performance by encouraging exploration. Our results demonstrate how experimentally-manipulated task variability can be used to test predictions about the mechanisms that learners engage in dynamic decision making tasks. PMID:20161328
Uncertainty relations as Hilbert space geometry
NASA Technical Reports Server (NTRS)
Braunstein, Samuel L.
1994-01-01
Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable as well as merely what is allowed.
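In standard estimation-theoretic notation (our notation, not quoted from the paper), the contrast is between the single-observable Heisenberg-type bound and the multi-sample Cramér-Rao bound:

```latex
% Heisenberg-type bound from one pair of conjugate observables:
\Delta X \, \Delta P \;\ge\; \frac{\hbar}{2}
% Multi-sample (classical statistics) bound after N repetitions,
% with F(X) the Fisher information of the outcome distribution:
\delta X \;\ge\; \frac{1}{\sqrt{N\,F(X)}}
```

The second bound applies to any parameter with a well-defined outcome distribution, which is why it covers cases like phase and time where no natural conjugate observable exists.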
A Model-Based Prognostics Approach Applied to Pneumatic Valves
NASA Technical Reports Server (NTRS)
Daigle, Matthew J.; Goebel, Kai
2011-01-01
Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
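A stripped-down sketch of the particle-filter prognostics loop (a generic scalar damage model with an unknown wear rate, not the paper's physics-based valve model; every constant is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
dt, threshold, n_p, n_steps = 1.0, 1.0, 1000, 30

# Particles over (damage state d, wear rate w)
d = np.zeros(n_p)
w = rng.uniform(0.005, 0.05, n_p)

true_d, true_w = 0.0, 0.02
for k in range(n_steps):
    true_d += true_w * dt
    y = true_d + rng.normal(0, 0.01)                 # noisy measurement

    d = d + w * dt + rng.normal(0, 0.002, n_p)       # predict (process noise)
    lik = np.exp(-0.5 * ((y - d) / 0.01) ** 2)       # Gaussian likelihood
    lik /= lik.sum()
    idx = rng.choice(n_p, n_p, p=lik)                # resample
    d, w = d[idx], w[idx] + rng.normal(0, 5e-4, n_p) # jitter keeps diversity

# Predict end of life (EOL) per particle by noise-free propagation
# to the damage threshold; the spread of eol is the prediction uncertainty.
t_now = n_steps * dt
eol = t_now + (threshold - d) / np.maximum(w, 1e-6)
print("EOL median and 90% interval:",
      np.median(eol), np.percentile(eol, [5, 95]))
```

The percentile interval at the end is the uncertainty-management payoff the abstract refers to: the filter yields a distribution over end-of-life rather than a point estimate.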
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
NASA Astrophysics Data System (ADS)
Dorini, F. A.; Cecconello, M. S.; Dorini, L. B.
2016-04-01
It is recognized that handling uncertainty is essential to obtain more reliable results in modeling and computer simulation. This paper aims to discuss the logistic equation subject to uncertainties in two parameters: the environmental carrying capacity, K, and the initial population density, N0. We first provide the closed-form results for the first probability density function of time-population density, N(t), and its inflection point, t*. We then use the Maximum Entropy Principle to determine both K and N0 density functions, treating such parameters as independent random variables and considering fluctuations of their values for a situation that commonly occurs in practice. Finally, closed-form results for the density functions and statistical moments of N(t), for a fixed t > 0, and of t* are provided, considering the uniform distribution case. We carried out numerical experiments to validate the theoretical results and compared them against that obtained using Monte Carlo simulation.
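The closed-form pieces are easy to check by simulation. A sketch under assumed uniform supports for K and N0 and a fixed growth rate r (our choices, for illustration only):

```python
import numpy as np

# Logistic growth with random parameters:
#   N(t) = K / (1 + (K/N0 - 1) * exp(-r t)),
# with K and N0 independent uniform random variables.

rng = np.random.default_rng(3)
r, n = 1.0, 200_000
K  = rng.uniform(80.0, 120.0, n)    # environmental carrying capacity
N0 = rng.uniform(5.0, 15.0, n)      # initial population density (K > N0 here)

def N(t):
    return K / (1.0 + (K / N0 - 1.0) * np.exp(-r * t))

# Inflection point: N(t*) = K/2  =>  t* = ln(K/N0 - 1) / r
t_star = np.log(K / N0 - 1.0) / r

print("E[N(2)] ~", N(2.0).mean(), " Var[N(2)] ~", N(2.0).var())
print("E[t*] ~", t_star.mean(), " 90% interval:", np.percentile(t_star, [5, 95]))
```

Histograms of N(t) and t_star from such samples can be compared directly against the paper's closed-form first probability density functions for the uniform case.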
Experimental Uncertainty Associated with Traveling Wave Excitation
2014-09-15
[Figure-list residue; recoverable captions: Fig. 2.9, Schematic of the Lumped Model [6]; Fig. 2.10, Multiple Coupled Pendulum [7].] As a model to describe the physical system, the authors chose to employ a coupled pendulum model to represent a rotor. This system is shown in Figure 2.10. System mistuning is introduced by altering pendulum lengths. All other system parameters are equal. A linear viscous proportional damping force is…
Matthew Warren; Kristell Hergoualc' h; J. Boone Kauffman; Daniel Murdiyarso; Randall Kolka
2017-01-01
Background: A large proportion of the world's tropical peatlands occur in Indonesia where rapid conversion and associated losses of carbon, biodiversity and ecosystem services have brought peatland management to the forefront of Indonesia's climate mitigation efforts. We evaluated peat volume from two commonly referenced maps of peat distribution and depth...
Least-Squares Analysis of Data with Uncertainty in "y" and "x": Algorithms in Excel and KaleidaGraph
ERIC Educational Resources Information Center
Tellinghuisen, Joel
2018-01-01
For the least-squares analysis of data having multiple uncertain variables, the generally accepted best solution comes from minimizing the sum of weighted squared residuals over all uncertain variables, with, for example, weights in x_i taken as inversely proportional to the variance δ_{x_i}^2. A complication…
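One standard realization of that minimization for a straight line is the iterated effective-variance fit sketched below (our sketch, not a transcription of the article's Excel or KaleidaGraph algorithms): each residual is weighted by 1/(σ_y² + b²σ_x²), and since the weights depend on the fitted slope b, the fit is iterated.

```python
import numpy as np

def fit_line_xy(x, y, sx, sy, n_iter=10):
    """Fit y = a + b*x with uncertainties sx, sy in both variables."""
    b = np.polyfit(x, y, 1)[0]            # unweighted starting slope
    for _ in range(n_iter):
        w = 1.0 / (sy**2 + (b * sx)**2)   # effective-variance weights
        W = w.sum()
        xbar, ybar = (w * x).sum() / W, (w * y).sum() / W
        b = (w * (x - xbar) * (y - ybar)).sum() / (w * (x - xbar)**2).sum()
        a = ybar - b * xbar
    return a, b

# Synthetic test: true line y = 1 + 2x, with noise in both x and y.
rng = np.random.default_rng(4)
x_true = np.linspace(0, 10, 20)
sx, sy = 0.2 * np.ones(20), 0.5 * np.ones(20)
x = x_true + rng.normal(0, sx)
y = 1.0 + 2.0 * x_true + rng.normal(0, sy)
print(fit_line_xy(x, y, sx, sy))   # should be near (1.0, 2.0)
```

The b-dependent weights are exactly the point the abstract makes: because the weights change with the fit, the problem is no longer ordinary weighted least squares.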
Children with Autism Spectrum Disorders Who Do Not Develop Phrase Speech in the Preschool Years
ERIC Educational Resources Information Center
Norrelgen, Fritjof; Fernell, Elisabeth; Eriksson, Mats; Hedvall, Asa; Persson, Clara; Sjölin, Maria; Gillberg, Christopher; Kjellmer, Liselotte
2015-01-01
There is uncertainty about the proportion of children with autism spectrum disorders who do not develop phrase speech during the preschool years. The main purpose of this study was to examine this ratio in a population-based community sample of children. The cohort consisted of 165 children (141 boys, 24 girls) with autism spectrum disorders aged…
Valuating Privacy with Option Pricing Theory
NASA Astrophysics Data System (ADS)
Berthold, Stefan; Böhme, Rainer
One of the key challenges in the information society is responsible handling of personal data. An often-cited reason why people fail to make rational decisions regarding their own informational privacy is the high uncertainty about future consequences of information disclosures today. This chapter builds an analogy to financial options and draws on principles of option pricing to account for this uncertainty in the valuation of privacy. For this purpose, the development of a data subject's personal attributes over time and the development of the attribute distribution in the population are modeled as two stochastic processes, which fit into the Binomial Option Pricing Model (BOPM). Possible applications of such valuation methods to guide decision support in future privacy-enhancing technologies (PETs) are sketched.
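For reference, the generic Cox-Ross-Rubinstein pricing kernel that the BOPM analogy builds on (the mapping of spot value and time horizon onto attribute values is the chapter's; the code below is only the standard pricer):

```python
import numpy as np

def crr_price(spot, strike, rate, sigma, T, steps, call=True):
    """European option price on a CRR binomial tree."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))          # up factor
    d = 1.0 / u                              # down factor
    q = (np.exp(rate * dt) - d) / (u - d)    # risk-neutral up probability
    # Terminal asset values and payoffs
    j = np.arange(steps + 1)
    s = spot * u**j * d**(steps - j)
    v = np.maximum(s - strike, 0.0) if call else np.maximum(strike - s, 0.0)
    # Backward induction through the tree
    disc = np.exp(-rate * dt)
    for _ in range(steps):
        v = disc * (q * v[1:] + (1 - q) * v[:-1])
    return v[0]

print(crr_price(100, 100, 0.05, 0.2, 1.0, 200))  # ~10.45, the Black-Scholes value
```

In the privacy analogy, the option's time value corresponds to the uncertainty about how the disclosed attribute's (re)identification value will drift before it is exploited.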
Lowe, Winsor H; McPeek, Mark A
2014-08-01
Dispersal is difficult to quantify and often treated as purely stochastic and extrinsically controlled. Consequently, there remains uncertainty about how individual traits mediate dispersal and its ecological effects. Addressing this uncertainty is crucial for distinguishing neutral versus non-neutral drivers of community assembly. Neutral theory assumes that dispersal is stochastic and equivalent among species. This assumption can be rejected on principle, but common research approaches tacitly support the 'neutral dispersal' assumption. Theory and empirical evidence that dispersal traits are under selection should be broadly integrated in community-level research, stimulating greater scrutiny of this assumption. A tighter empirical connection between the ecological and evolutionary forces that shape dispersal will enable richer understanding of this fundamental process and its role in community assembly. Copyright © 2014 Elsevier Ltd. All rights reserved.
Equilibration and analysis of first-principles molecular dynamics simulations of water
NASA Astrophysics Data System (ADS)
Dawson, William; Gygi, François
2018-03-01
First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
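The Gelman-Rubin diagnostic itself is a few lines; a minimal sketch (the (m runs, n samples) input shape and the synthetic data are our assumptions, for illustration):

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for m runs of length n."""
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)   # between-run variance
    W = chains.var(axis=1, ddof=1).mean()     # within-run variance
    var_hat = (n - 1) / n * W + B / n         # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(5)
# 32 synthetic "runs" of a scalar observable (e.g. Kohn-Sham energy samples),
# each with a small run-to-run offset mimicking incomplete equilibration.
runs = rng.normal(0.0, 1.0, size=(32, 580)) + rng.normal(0, 0.05, (32, 1))
print("R-hat:", gelman_rubin(runs))  # values near 1 indicate equilibration
```

Applied to blocks of an FPMD trajectory, R-hat drifting toward 1 gives an operational equilibration criterion of the kind the paper evaluates.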
Squeezed spin states: Squeezing the spin uncertainty relations
NASA Technical Reports Server (NTRS)
Kitagawa, Masahiro; Ueda, Masahito
1993-01-01
The notion of squeezing in spin systems is clarified, and the principle for spin squeezing is shown. Two twisting schemes are proposed as building blocks for spin squeezing and are shown to reduce the standard quantum noise, S/2, of the coherent S-spin state down to the order of S^{1/3} and 1/2. Applications to partition noise suppression are briefly discussed.
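The one-axis twisting scheme can be checked numerically with dense spin matrices; a sketch (S and the twisting angle are illustrative choices, and this reproduces only the qualitative reduction below S/2, not the paper's full analysis):

```python
import numpy as np
from scipy.linalg import expm

S = 10
m = np.arange(S, -S - 1, -1.0)            # basis ordered m = S ... -S
Jz = np.diag(m)
Jp = np.diag(np.sqrt(S * (S + 1) - m[1:] * (m[1:] + 1)), 1)  # raising operator
Jx, Jy = (Jp + Jp.T) / 2, (Jp - Jp.T) / (2j)

# Coherent spin state along +x: eigenvector of Jx with maximal eigenvalue
vals, vecs = np.linalg.eigh(Jx)
psi0 = vecs[:, -1]

def min_transverse_var(psi):
    """Minimal spin variance in the y-z plane, optimized over direction."""
    ev = lambda A: np.real(psi.conj() @ A @ psi)
    vy = ev(Jy @ Jy) - ev(Jy) ** 2
    vz = ev(Jz @ Jz) - ev(Jz) ** 2
    cyz = ev(Jy @ Jz + Jz @ Jy) / 2 - ev(Jy) * ev(Jz)
    return 0.5 * (vy + vz) - 0.5 * np.hypot(vy - vz, 2 * cyz)

print("coherent-state noise:", min_transverse_var(psi0), "= S/2 =", S / 2)
psi = expm(-1j * 0.05 * Jz @ Jz) @ psi0   # one-axis twisting evolution
print("squeezed noise:", min_transverse_var(psi))
```

The coherent state starts exactly at the standard quantum noise S/2, and the Jz² twisting pushes the minimal transverse variance below it.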
NASA Technical Reports Server (NTRS)
Fortini, Pierluigi; Onofrio, Roberto; Rioli, Alessandro
1993-01-01
A review of current efforts to approach and to surpass the fundamental limit in the sensitivity of the Weber type gravitational wave antennae is reported. Applications of quantum non-demolition techniques to the concrete example of an antenna resonant with the transducer are discussed in detail. Analogies and differences from the framework of the squeezed states in quantum optics are discussed.
On the Compressive Sensing Systems (Part 1)
2015-02-01
resolution between targets of classical radar is limited by the radar uncertainty principle. B. Fundamentals on CS and CS-Based Radar (CSR): Under appropriate conditions, CSR can beat the traditional radar. We now consider K targets with unknown range-velocities and corresponding reflection… sparse target scene. A CSR has the following features: 1) eliminating the need for a matched filter at the receiver; 2) requiring low sampling bandwidth
Uncertainty Analysis Principles and Methods
2007-09-01
error source. The Data Processor converts binary coded numbers to values, performs D/A curve fitting and applies any correction factors that may be… describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical…
Exact symmetries in the velocity fluctuations of a hot Brownian swimmer
NASA Astrophysics Data System (ADS)
Falasco, Gianmaria; Pfaller, Richard; Bregulla, Andreas P.; Cichos, Frank; Kroy, Klaus
2016-09-01
Symmetries constrain dynamics. We test this fundamental physical principle, experimentally and by molecular dynamics simulations, for a hot Janus swimmer operating far from thermal equilibrium. Our results establish scalar and vectorial steady-state fluctuation theorems and a thermodynamic uncertainty relation that link the fluctuating particle current to its entropy production at an effective temperature. A Markovian minimal model elucidates the underlying nonequilibrium physics.
Measuring Speed Of Rotation With Two Brushless Resolvers
NASA Technical Reports Server (NTRS)
Howard, David E.
1995-01-01
Speed of rotation of shaft is measured by use of two brushless shaft-angle resolvers aligned so that they are electrically and mechanically in phase with each other. Resolvers and associated circuits generate voltage proportional to speed of rotation (omega) in both magnitude and sign. Measurement principle exploits a simple trigonometric identity.
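One trigonometric identity consistent with the scheme described (an illustration; the abstract does not spell out the circuit) is that the resolver outputs sin(theta) and cos(theta), cross-multiplied with their time derivatives, yield the rotation rate directly:

    \[
    \cos\theta \,\frac{d}{dt}\bigl(\sin\theta\bigr) - \sin\theta \,\frac{d}{dt}\bigl(\cos\theta\bigr)
    = \omega\left(\cos^2\theta + \sin^2\theta\right) = \omega .
    \]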
Low order H∞ optimal control for ACFA blended wing body aircraft
NASA Astrophysics Data System (ADS)
Haniš, T.; Kucera, V.; Hromčík, M.
2013-12-01
Advanced nonconvex nonsmooth optimization techniques for fixed-order H∞ robust control are proposed in this paper for the design of flight control systems (FCS) with a prescribed structure. Compared to classical techniques - tuning and successive closure of particular single-input single-output (SISO) loops such as dampers, attitude stabilizers, etc. - all loops are designed simultaneously by means of a quite intuitive selection of weighting filters. In contrast to standard optimization techniques (H2, H∞ optimization), though, the resulting controller respects the prescribed structure in terms of engaged channels and orders (e.g., proportional (P), proportional-integral (PI), and proportional-integral-derivative (PID) controllers). In addition, robustness with regard to multimodel uncertainty is also addressed, which is of utmost importance for aerospace applications. In this way, robust controllers for various Mach numbers, altitudes, or mass cases can be obtained directly, based only on the particular mathematical models for the respective combinations of the flight parameters.
Design and experimental evaluation of robust controllers for a two-wheeled robot
NASA Astrophysics Data System (ADS)
Kralev, J.; Slavov, Ts.; Petkov, P.
2016-11-01
The paper presents the design and experimental evaluation of two alternative μ-controllers for robust vertical stabilisation of a two-wheeled self-balancing robot. The controller designs are based on models derived by identification from closed-loop experimental data. In the first design, a signal-based uncertainty representation obtained directly from the identification procedure is used, which leads to a controller of order 29. In the second design, the signal uncertainty is approximated by an input multiplicative uncertainty, which leads to a controller of order 50, subsequently reduced to 30. The performance of the two μ-controllers is compared with the performance of a conventional linear quadratic controller with a 17th-order Kalman filter. A proportional-integral controller of the rotational motion around the vertical axis is implemented as well. The control code is generated using Simulink® controller models and is embedded in a digital signal processor. Results from the simulation of the closed-loop system as well as experimental results obtained during the real-time implementation of the designed controllers are given. The theoretical investigation and experimental results confirm that the closed-loop system achieves robust performance with respect to the uncertainties related to the identified robot model.
Caresana, Marco; Helmecke, Manuela; Kubancak, Jan; Manessi, Giacomo Paolo; Ott, Klaus; Scherpelz, Robert; Silari, Marco
2014-10-01
This paper discusses an intercomparison campaign performed in the mixed radiation field at the CERN-EU high-energy Reference Field (CERF) facility. Various instruments were employed: conventional and extended-range rem counters, including a novel instrument called LUPIN, a bubble detector using an active counting system (ABC 1260), and two tissue-equivalent proportional counters (TEPCs). The results show that the extended-range instruments agree well within their uncertainties and within 1σ with the H*(10) FLUKA value. The conventional rem counters are in good agreement within their uncertainties and underestimate H*(10) as measured by the extended-range instruments and as predicted by FLUKA. The TEPCs slightly overestimate the FLUKA value but are nonetheless consistent with it when the comparatively large total uncertainties are taken into account, and indicate that the non-neutron part of the stray field accounts for ∼30% of the total H*(10). © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Connell-Carrick, Kelli
2007-01-01
Methamphetamine use and production are changing child welfare practice. Methamphetamine is a significant public health threat (National Institute of Justice, 1999) reaching epidemic proportions (Anglin, Burke, Perrochet, Stamper, & Dawud-Nouris, 2000). The manufacturing of methamphetamine is a serious problem for the child welfare system, yet…
Economic method for measuring ultra-low flow rates of fluids
NASA Technical Reports Server (NTRS)
Bogdanovic, J. A.; Keller, W. F.
1970-01-01
Capillary tube flowmeter measures ultra-low flows of very corrosive fluids (such as chlorine trifluoride and liquid fluorine) and other liquids with reasonable accuracy. Flowmeter utilizes differential pressure transducer and operates on the principle that for laminar flow in the tube, pressure drop is proportional to flow rate.
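The laminar-flow proportionality the abstract invokes is the Hagen-Poiseuille relation for a capillary of radius r and length L carrying fluid of viscosity mu:

    \[
    \Delta p = \frac{8 \mu L}{\pi r^4}\, Q,
    \]

so the pressure drop \Delta p seen by the differential transducer is directly proportional to the volumetric flow rate Q.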
A Lab Exercise Explaining Hardy-Weinberg Equilibrium and Evolution Effectively.
ERIC Educational Resources Information Center
Winterer, Juliette
2001-01-01
Presents a set of six activities in population genetics for a college-level biology course that helps students understand the Hardy-Weinberg principle. Activities focus on characterizing a population, Hardy-Weinberg proportions, genetic drift, mutation and selection, population size and divergence, and secondary contact. The only materials…
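The Hardy-Weinberg and genetic drift activities rest on two standard calculations: genotype proportions p^2 : 2pq : q^2 for allele frequency p, and binomial resampling of a finite gene pool. A minimal sketch in Python (population sizes and seeds are illustrative):

    import numpy as np

    def hardy_weinberg(p):
        # Expected genotype proportions (AA, Aa, aa) for allele frequency p.
        q = 1.0 - p
        return p**2, 2 * p * q, q**2

    def drift(p, n, generations, rng):
        # Genetic drift: resample 2n allele copies each generation.
        for _ in range(generations):
            p = rng.binomial(2 * n, p) / (2 * n)
        return p

    rng = np.random.default_rng(1)
    print(hardy_weinberg(0.6))       # (0.36, 0.48, 0.16)
    print(drift(0.6, 50, 100, rng))  # allele frequency after drift in a small population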
Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).
ERIC Educational Resources Information Center
Guthrie, Jim
1995-01-01
Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.
2015-12-01
Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that is compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.
Finding meaning in art: Preferred levels of ambiguity in art appreciation
Jakesch, Martina; Leder, Helmut
2011-01-01
Uncertainty is typically not desirable in everyday experiences, but uncertainty in the form of ambiguity may be a defining feature of aesthetic experiences of modern art. In this study, we examined different hypotheses concerning the quantity and quality of information appreciated in art. Artworks were shown together with auditorily presented statements. We tested whether the amount of information, the amount of matching information, or the proportion of matching to nonmatching statements apparent in a picture (levels of ambiguity) affect liking and interestingness. Only the levels of ambiguity predicted differences in the two dependent variables. These findings reveal that ambiguity is an important determinant of aesthetic appreciation and that a certain level of ambiguity is appreciable. PMID:19565431
High-precision half-life determination for 21Na using a 4 π gas-proportional counter
NASA Astrophysics Data System (ADS)
Finlay, P.; Laffoley, A. T.; Ball, G. C.; Bender, P. C.; Dunlop, M. R.; Dunlop, R.; Hackman, G.; Leslie, J. R.; MacLean, A. D.; Miller, D.; Moukaddam, M.; Olaizola, B.; Severijns, N.; Smith, J. K.; Southall, D.; Svensson, C. E.
2017-08-01
A high-precision half-life measurement for the superallowed β+ transition between the isospin T = 1/2 mirror nuclei 21Na and 21Ne has been performed at the TRIUMF-ISAC radioactive ion beam facility yielding T1/2 = 22.4506(33) s, a result that is a factor of 4 more precise than the previous world-average half-life for 21Na and represents the single most precisely determined half-life for a transition between mirror nuclei to date. The contribution to the uncertainty in the 21Na Ft^mirror value due to the half-life is now reduced to the level of the nuclear-structure-dependent theoretical corrections, leaving the branching ratio as the dominant experimental uncertainty.
Auger recombination in sodium iodide
NASA Astrophysics Data System (ADS)
McAllister, Andrew; Kioupakis, Emmanouil; Åberg, Daniel; Schleife, André
2014-03-01
Scintillators are an important tool used to detect high energy radiation - both in the interest of national security and in medicine. However, scintillator detectors currently suffer from lower energy resolutions than expected from basic counting statistics. This has been attributed to non-proportional light yield compared to incoming radiation, but the specific mechanism for this non-proportionality has not been identified. Auger recombination is a non-radiative process that could be contributing to the non-proportionality of scintillating materials. Auger recombination comes in two types - direct and phonon-assisted. We have used first-principles calculations to study Auger recombination in sodium iodide, a well characterized scintillating material. Our findings indicate that phonon-assisted Auger recombination is stronger in sodium iodide than direct Auger recombination. Computational resources provided by LLNL and NERSC. Funding provided by NA-22.
Trajectory formation principles are the same after mild or moderate stroke
van Dokkum, Liesjet Elisabeth Henriette; Froger, Jérôme; Gouaïch, Abdelkader; Laffont, Isabelle
2017-01-01
When we make rapid reaching movements, we have to trade speed for accuracy. To do so, the trajectory of our hand is the result of an optimal balance between feed-forward and feed-back control in the face of signal-dependent noise in the sensorimotor system. How far do these principles of trajectory formation still apply after a stroke, for persons with mild to moderate sensorimotor deficits who have recovered some reaching ability? Here, we examine the accuracy of fast hand-reaching movements with a focus on the information capacity of the sensorimotor system and its relation to trajectory formation in young adults, in persons who had a stroke, and in age-matched control participants. We find that persons with stroke follow the same trajectory formation principles, albeit parameterized differently in the face of higher sensorimotor uncertainty. Higher directional errors after a stroke result in less feed-forward control, hence more feed-back loops responsible for segmented movements. As a consequence, movements are globally slower to reach the imposed accuracy, and the information throughput of the sensorimotor system is lower after a stroke. The fact that the most abstract principles of motor control remain after a stroke suggests that clinicians can capitalize on existing theories of motor control and learning to derive principled rehabilitation strategies. PMID:28329000
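The information-capacity measure invoked here is typically a Fitts-type throughput, relating movement time MT to an index of difficulty for a target of width W at distance D (a standard formulation, assumed rather than quoted from the paper):

    \[
    MT = a + b \log_2\!\left(\frac{2D}{W}\right), \qquad
    \text{throughput} \approx \frac{\log_2(2D/W)}{MT}.
    \]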
The principle of finiteness - a guideline for physical laws
NASA Astrophysics Data System (ADS)
Sternlieb, Abraham
2013-04-01
I propose a new principle in physics: the principle of finiteness (FP). It stems from the definition of physics as a science that deals with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, FP postulates that the mathematical formulation of legitimate laws in physics should prevent exactly zero or infinite solutions. I propose finiteness as a postulate, as opposed to a statement whose validity has to be corroborated by, or derived theoretically or experimentally from, other facts, theories or principles. Some consequences of FP are discussed, first in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The corrected Lorentz transformations include an additional translation term depending on the minimum length epsilon. The relativistic gamma is replaced by a corrected gamma that is finite for v=c. To comply with FP, physical laws should include the relevant extremum finite values in their mathematical formulation. An important prediction of FP is that there is a maximum attainable relativistic mass/energy which is the same for all subatomic particles, meaning that there is a maximum theoretical value for cosmic-ray energy. The Generalized Uncertainty Principle required by Quantum Gravity is actually a necessary consequence of FP at the Planck scale. Therefore, FP may possibly contribute to the axiomatic foundation of Quantum Gravity.
Practices in Adequate Structural Design
NASA Technical Reports Server (NTRS)
Ryan, Robert S.
1989-01-01
Structural design and verification of space vehicles and space systems is a very tricky and awe-inspiring business, particularly for manned missions. Failures of missions with loss of life are devastating personally and nationally. The scope of the problem is driven by high performance requirements which push state-of-the-art technologies, creating high sensitivities to small variations and uncertainties. Ensuring safe, reliable flight dictates the use of sound principles, procedures, analysis, and testing. Many of those principles, which were refocused by the Space Shuttle Challenger (51-L) accident on January 28, 1986, and the activities conducted to ensure safe shuttle reflights, are discussed. The emphasis will be focused on engineering, while recognizing that program and project management are also key to success.
Non-precautionary aspects of toxicology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grandjean, Philippe
2005-09-01
Empirical studies in toxicology aim at deciphering complex causal relationships, especially in regard to human disease etiologies. Several scientific traditions limit the usefulness of documentation from current toxicological research, in regard to decision-making based on the precautionary principle. Among non-precautionary aspects of toxicology are the focus on simplified model systems and the effects of single hazards, one by one. Thus, less attention is paid to sources of variability and uncertainty, including individual susceptibility, impacts of mixed and variable exposures, susceptible life-stages, and vulnerable communities. In emphasizing the need for confirmatory evidence, toxicology tends to penalize false positives more than false negatives. An important source of uncertainty is measurement error that results in misclassification, especially in regard to exposure assessment. Standard statistical analysis assumes that the exposure is measured without error, and imprecisions will usually result in an underestimation of the dose-effect relationship. In testing whether an effect could be considered a possible result of natural variability, a 5% limit for 'statistical significance' is usually applied, even though it may rule out many findings of causal associations, simply because the study was too small (and thus lacked statistical power) or because some imprecision or limited sensitivity of the parameters precluded a more definitive observation. These limitations may be aggravated when toxicology is influenced by vested interests. Because current toxicology overlooks the important goal of achieving a better characterization of uncertainties and their implications, research approaches should be revised and strengthened to counteract the innate ideological biases, thereby supporting our confidence in using toxicology as a main source of documentation and in using the precautionary principle as a decision procedure in the public policy arena.
Cyber Warfare: New Character with Strategic Results
2013-03-01
The advent of cyber warfare has sparked a debate amongst theorists as to whether timeless Clausewitzian principles remain true in the 21st century...Violence, uncertainty, and rationality still accurately depict the nature of cyber warfare , however, its many defining attributes and means by which...this style of warfare is conducted has definitively changed the character of war. Although cyber warfare is contested in the cyber domain, it often
ASRDI oxygen technology survey. Volume 6: Flow measurement instrumentation
NASA Technical Reports Server (NTRS)
Mann, D. B.
1974-01-01
A summary is provided of information available on liquid and gaseous oxygen flowmetering, including an evaluation of commercial meters. The instrument types, physical principles of measurement, and performance characteristics are described. Problems in achieving flow measurements with less than plus or minus two percent uncertainty are reviewed. Recommendations concerning work on flow reference systems, the use of surrogate fluids, and standard tests for oxygen flow measurements are also presented.
Strategic alliance as a competitive tactics for biological-pharmacy industry.
Liu, Chuanming; Wang, Ling; Qi, Ershi
2005-01-01
The biological-pharmacy industry comprises biotechnology companies and pharmaceutical makers. Because of the uncertainty and time lags in this field, the former are confronted with a lack of capital while the latter face the need to improve technological innovation and product development. This paper analyzes the basic operating principles of strategic alliances and puts forward related strategies for biological-pharmacy enterprises.
ERIC Educational Resources Information Center
Balve, Patrick; Krüger, Volker; Tolstrup Sørensen, Lene
2017-01-01
Problem-based learning (PBL) has proven to be highly effective for educating students in an active and self-motivated manner in various disciplines. Student projects carried out following PBL principles are very dynamic and carry a high level of uncertainty, both conditions under which agile project management approaches are assumed to be highly…
Quantum noise and the threshold of hearing
NASA Technical Reports Server (NTRS)
Bialek, W.; Schweitzer, A.
1985-01-01
It is argued that the sensitivity of the ear reaches a limit imposed by the uncertainty principle. This is possible only if the receptor cell holds the detector elements in a special nonequilibrium state which has the same noise characteristics as a ground (T = 0 K) state. To accomplish this 'active cooling' the molecular dynamics of the system must maintain quantum mechanical coherence over the time scale of the measurement.
ERIC Educational Resources Information Center
Malgieri, Massimiliano; Onorato, Pasquale; De Ambrosis, Anna
2017-01-01
In this paper we present the results of a research-based teaching-learning sequence on introductory quantum physics based on Feynman's sum over paths approach in the Italian high school. Our study focuses on students' understanding of two founding ideas of quantum physics, wave particle duality and the uncertainty principle. In view of recent…
Robust regression on noisy data for fusion scaling laws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verdoolaege, Geert, E-mail: geert.verdoolaege@ugent.be; Laboratoire de Physique des Plasmas de l'ERM - Laboratorium voor Plasmafysica van de KMS
2014-11-15
We introduce the method of geodesic least squares (GLS) regression for estimating fusion scaling laws. Based on straightforward principles, the method is easily implemented, yet it clearly outperforms established regression techniques, particularly in cases of significant uncertainty on both the response and predictor variables. We apply GLS for estimating the scaling of the L-H power threshold, resulting in estimates for ITER that are somewhat higher than predicted earlier.
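Fusion scaling laws such as the L-H power threshold are conventionally fit as power laws, which become linear in logarithmic space; GLS performs this regression while treating both the response and the predictors as uncertain. The form below is the standard ansatz, not notation from the paper:

    \[
    P_{\mathrm{thr}} = \beta_0 \prod_i x_i^{\beta_i}
    \quad\Longleftrightarrow\quad
    \ln P_{\mathrm{thr}} = \ln \beta_0 + \sum_i \beta_i \ln x_i .
    \]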
Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.
Hsieh, I-Hui; Saberi, Kourosh
2016-02-01
How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction.
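Gabor's principle referenced above bounds the product of a signal's effective duration and bandwidth:

    \[
    \Delta t \,\Delta f \ge \frac{1}{4\pi},
    \]

so a pulse lasting a fraction of a cycle necessarily has a broad spectrum, which is what makes pitch extraction from such pulses difficult.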
NASA Astrophysics Data System (ADS)
Braun, David J.; Sutas, Andrius; Vijayakumar, Sethu
2017-01-01
Theory predicts that parametrically excited oscillators, tuned to operate under resonant conditions, are capable of the large-amplitude oscillation useful in diverse applications, such as signal amplification, communication, and analog computation. However, due to amplitude saturation caused by nonlinearity, lack of robustness to model uncertainty, and limited sensitivity to parameter modulation, these oscillators require fine-tuning and strong modulation to generate robust large-amplitude oscillation. Here we present a principle of self-tuning parametric feedback excitation that alleviates the above-mentioned limitations. This is achieved using a minimalistic control implementation that performs (i) self-tuning (slow parameter adaptation) and (ii) feedback pumping (fast parameter modulation), without sophisticated signal processing of past observations. The proposed approach provides near-optimal amplitude maximization without requiring model-based control computation, previously perceived as inevitable for implementing optimal control principles in practical applications. Experimental implementation of the theory shows that the oscillator tunes itself near the onset of dynamic bifurcation to achieve extreme sensitivity to small resonant parametric perturbations. As a result, it achieves large-amplitude oscillations by capitalizing on the effect of nonlinearity, despite substantial model uncertainties and strong unforeseen external perturbations. We envision the present finding to provide an effective and robust approach to parametric excitation when it comes to real-world applications.
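A generic model of the kind of parametrically pumped oscillator described here is a damped Mathieu-type equation with the pump near twice the natural frequency (a sketch under assumed notation, not the authors' equations):

    \[
    \ddot{x} + \gamma \dot{x} + \omega_0^2 \left[ 1 + \epsilon(t) \cos\bigl(2\omega_0 t + \phi\bigr) \right] x = 0,
    \]

where, in the scheme described, a slow loop adapts the operating point toward the bifurcation threshold and a fast loop modulates \epsilon(t) in feedback.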
Public Perceptions of Regulatory Costs, Their Uncertainty and Interindividual Distribution.
Johnson, Branden B; Finkel, Adam M
2016-06-01
Public perceptions of both risks and regulatory costs shape rational regulatory choices. Despite decades of risk perception studies, this article is the first on regulatory cost perceptions. A survey of 744 U.S. residents probed: (1) How knowledgeable are laypeople about regulatory costs incurred to reduce risks? (2) Do laypeople see official estimates of cost and benefit (lives saved) as accurate? (3) (How) do preferences for hypothetical regulations change when mean-preserving spreads of uncertainty replace certain cost or benefit? and (4) (How) do preferences change when unequal interindividual distributions of hypothetical regulatory costs replace equal distributions? Respondents overestimated costs of regulatory compliance, while assuming agencies underestimate costs. Most assumed agency estimates of benefits are accurate; a third believed both cost and benefit estimates are accurate. Cost and benefit estimates presented without uncertainty were slightly preferred to those surrounded by "narrow uncertainty" (a range of costs or lives entirely within a personally-calibrated zone without clear acceptance or rejection of tradeoffs). Certain estimates were more preferred than "wide uncertainty" (a range of agency estimates extending beyond these personal bounds, thus posing a gamble between favored and unacceptable tradeoffs), particularly for costs as opposed to benefits (but even for costs a quarter of respondents preferred wide uncertainty to certainty). Agency-acknowledged uncertainty in general elicited mixed judgments of honesty and trustworthiness. People preferred egalitarian distributions of regulatory costs, despite skewed actual cost distributions, and preferred progressive cost distributions (the rich pay a greater than proportional share) to regressive ones. Efficient and socially responsive regulations require disclosure of much more information about regulatory costs and risks. © 2016 Society for Risk Analysis.
Ekerdt, David J
2005-01-01
The assisted living environment lacks the satisfying clarity of the consumer model (a stay at the Holiday Inn) or the medical model (the hospital or nursing home). Yet the ambiguity of assisted living is unavoidable because it shelters individuals whose needs are changing, the model of care requires extensive negotiation with residents, and staff members must continually compromise as they implement the principles. Assisted living is a place where uncertainty is managed, not resolved. This indicates a need for the further pursuit of qualitative research, such as reported by these articles and others (e.g., Carder, 2002), to explore how participants construct, make sense of, and interpret their daily experience in assisted living.
Apparatus for accurate density measurements of fluids based on a magnetic suspension balance
NASA Astrophysics Data System (ADS)
Gong, Maoqiong; Li, Huiya; Guo, Hao; Dong, Xueqiang; Wu, J. F.
2012-06-01
A new apparatus for accurate pressure, density and temperature (p, ρ, T) measurements over wide ranges of (p, ρ, T) (90 K to 290 K; 0 MPa to 3 MPa; 0 kg/m3 to 2000 kg/m3) is described. This apparatus is based on a magnetic suspension balance which applies Archimedes' buoyancy principle. In order to verify the new apparatus, comprehensive (p, ρ, T) measurements on pure nitrogen were carried out. The maximum relative standard uncertainty in density is 0.09%. The maximum standard uncertainty in temperature is 5 mK, and that in pressure is 250 Pa for the 1.5 MPa full-scale range and 390 Pa for the 3 MPa full-scale range, respectively. The experimental data were compared with selected literature data and good agreement was found.
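The buoyancy principle underlying such a balance reduces to weighing a sinker of calibrated volume V(T, p) in vacuum and in the fluid (standard single-sinker practice, assumed here rather than taken from the abstract):

    \[
    \rho_{\mathrm{fluid}} = \frac{m_{\mathrm{vac}} - m_{\mathrm{fluid}}}{V(T, p)},
    \]

where m_vac and m_fluid are the balance readings without and with buoyancy, respectively.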
Physics of Non-Inertial Reference Frames
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamalov, Timur F.
2010-12-22
Physics of non-inertial reference frames is a generalization of Newton's laws to arbitrary reference frames. It is a system of general axioms for classical and quantum mechanics. The first, the Kinematics Principle, reads: the kinematic state of a body free of forces is conserved and is equal in absolute value to an invariant of the observer's reference frame. The second, the Dynamics Principle, extends Newton's second law to non-inertial reference frames and also contains additional variables, namely higher derivatives of the coordinates. The Dynamics Principle reads: a force induces a change in the kinematic state of the body and is proportional to the rate of its change. This means that if the kinematic invariant of the reference frame is the n-th derivative with respect to time, then the dynamics of a body affected by the force F is described by a 2n-th order differential equation. The third, the Statics Principle, reads: the sum of all forces acting on a body at rest is equal to zero.
Developing a policy for paediatric biobanks: principles for good practice
Hens, Kristien; Van El, Carla E; Borry, Pascal; Cambon-Thomsen, Anne; Cornel, Martina C; Forzano, Francesca; Lucassen, Anneke; Patch, Christine; Tranebjaerg, Lisbeth; Vermeulen, Eric; Salvaterra, Elena; Tibben, Aad; Dierickx, Kris
2013-01-01
The participation of minors in biobank research can offer great benefits for science and health care. However, as minors are a vulnerable population they are also in need of adequate protective measures when they are enrolled in research. Research using biobanked biological samples from children poses additional ethical issues to those raised by research using adult biobanks. For example, small children have only limited capacity, if any, to understand the meaning and implications of the research and to give a documented agreement to it. Older minors are gradually acquiring this capacity. We describe principles for good practice related to the inclusion of minors in biobank research, focusing on issues related to benefits and subsidiarity, consent, proportionality and return of results. Some of these issues are currently heavily debated, and we conclude by providing principles for good practice for policy makers of biobanks, researchers and anyone involved in dealing with stored tissue samples from children. Actual implementation of the principles will vary according to different jurisdictions. PMID:22713814
Greek classicism in living structure? Some deductive pathways in animal morphology.
Zweers, G A
1985-01-01
Classical temples in ancient Greece show two deterministic illusionistic principles of architecture, which govern their functional design: geometric proportionalism and a set of illusion-strengthening rules in the proportionalism's "stochastic margin". Animal morphology, in its mechanistic-deductive revival, applies just one architectural principle, which is not always satisfactory. Whether a "Greek Classical" situation occurs in the architecture of living structure is to be investigated by extreme testing with deductive methods. Three deductive methods for the explanation of living structure in animal morphology are proposed: the parts, the compromise, and the transformation deduction. The methods are based upon the systems concept for an organism, the flow chart for a functionalistic picture, and the network chart for a structuralistic picture, with the "optimal design" serving as the architectural principle for living structure. These methods show clearly the high explanatory power of deductive methods in morphology, but they also make one open end explicit: neutral issues do exist. Full explanation of living structure requires three entries: functional design within architectural and transformational constraints. The transformational constraint necessarily brings in a stochastic component: an at-random variation that acts as a sort of "free management space". This variation must be a variation from the deterministic principle of the optimal design, since any transformation requires space for plasticity in structure and action, and flexibility in role fulfilment. Nevertheless, the question finally arises whether a situation similar to that of Greek Classical temples exists for animal structure. This would mean that the at-random variation found when the optimal design is used to explain structure comprises, apart from a stochastic part, real deviations that form yet another deterministic part. This deterministic part could be a set of rules that governs actualization in the "free management space".
Kottke, Thomas E; Huebsch, Jacquelyn A; McGinnis, Paul; Nichols, Jolleen M; Parker, Emily D; Tillema, Juliana O; Maciosek, Michael V
2016-01-01
Context: Primary care practice. Objective: To test whether the principles of complex adaptive systems are applicable to implementation of team-based primary care. Design: We used complex adaptive system principles to implement team-based care in a private, five-clinic primary care practice. We compared randomly selected samples of patients with coronary heart disease (CHD) and diabetes before system implementation (March 1, 2009, to February 28, 2010) and after system implementation (December 1, 2011, to March 31, 2013). Main Outcome Measures: Rates of patients meeting the composite goals for CHD (blood pressure < 140/90 mmHg, low-density lipoprotein cholesterol level < 100 mg/dL, tobacco-free, and using aspirin unless contraindicated) and diabetes (CHD goal plus hemoglobin A1c concentration < 8%) before and after the intervention. We also measured provider and patient satisfaction with preventive services. Results: The proportion of patients with CHD who met the composite goal increased from 40.3% to 59.9% (p < 0.0001) because documented aspirin use increased (65.2%–97.5%, p < 0.0001) and attainment of the cholesterol goal increased (77.0%–83.9%, p = 0.0041). The proportion of diabetic patients meeting the composite goal rose from 24.5% to 45.4% (p < 0.0001) because aspirin use increased (58.6%–97.6%, p < 0.0001). Increased percentages of patients meeting the CHD and diabetes composite goals were not significantly different (p = 0.2319). Provider satisfaction with preventive services delivery increased significantly (p = 0.0017). Patient satisfaction improved but not significantly. Conclusion: Principles of complex adaptive systems can be used to implement team-based care systems for patients with CHD and possibly diabetic patients. PMID:26784851
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, T.D.; Shaw, P.J.; Williams, I.D., E-mail: idw@soton.ac.uk
Highlights: • Critical analysis of municipal waste management practices and performance in England. • Trends visualised via innovative ternary plots and changes and reasons explored. • Performance 1996–2013 moved slowly away from landfill dominance. • Large variations in %s of waste landfilled, incinerated and recycled/composted. • Progress to resource efficiency slow; affected by poor planning and hostile disputes. - Abstract: European nations are compelled to reduce reliance on landfill as a destination for household waste, and should, in principle, achieve this goal with due recognition of the aims and principles of the waste hierarchy. Past research has predominantly focused on recycling, whilst interactions between changing waste destinies, causes and drivers of household waste management change, and potential consequences for the goal of the waste hierarchy are less well understood. This study analysed Local Authority Collected Waste (LACW) for England, at national, regional and sub-regional level, in terms of the destination of household waste to landfill, incineration and recycling. Information about waste partnerships, waste management infrastructure and collection systems was collected to help identify and explain changes in waste destinies. Since 1996, the proportion of waste landfilled in England has decreased, in tandem with increases in recycling and incineration. At the regional and sub-regional (Local Authority; LA) level, there have been large variations in the relative proportions of waste landfilled, incinerated and recycled or composted. Annual increases in the proportion of household waste incinerated were typically larger than increases in the proportion recycled. The observed changes took place in the context of legal and financial drivers, and the circumstances of individual LAs (e.g. landfill capacity) also explained the changes seen. Where observed, shifts from landfill towards incineration constitute an approach whereby waste management moves up the waste hierarchy as opposed to an attempt to reach the most preferred option(s); in terms of resource efficiency, this practice is sub-optimal. The requirement to supply incinerators with a feedstock over their lifespan reduces the benefits of developing recycling and waste reduction, although access to incineration infrastructure permits short-term and marked decreases in the proportion of LACW landfilled. We conclude that there is a need for clearer national strategy and co-ordination to inform and guide policy, practice, planning and investment in infrastructure such that waste management can be better aligned with the principles of the circular economy and resource efficiency. If the ongoing stand-off between national political figures and the waste sector continues, England's waste policy remains destined for indecision.
The ends of uncertainty: Air quality science and planning in Central California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fine, James
Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990's. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.
Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods
NASA Astrophysics Data System (ADS)
Davis, A. D.
2015-12-01
The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.
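The Bayesian framework sketched in the abstract combines a prior over the uncertain parameters theta with the observation likelihood, then pushes the posterior through the predictive model to the quantity of interest Q (generic notation, not the authors' own):

    \[
    \pi(\theta \mid d) \propto L(d \mid \theta)\,\pi_0(\theta), \qquad
    \pi(Q) = \int q(Q \mid \theta)\, \pi(\theta \mid d)\, d\theta .
    \]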
Spacetime and Euclidean geometry
NASA Astrophysics Data System (ADS)
Brill, Dieter; Jacobson, Ted
2006-04-01
Using only the principle of relativity and Euclidean geometry we show in this pedagogical article that the square of proper time or length in a two-dimensional spacetime diagram is proportional to the Euclidean area of the corresponding causal domain. We use this relation to derive the Minkowski line element by two geometric proofs of the spacetime Pythagoras theorem.
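In null coordinates the proportionality can be stated compactly: for an interval from the origin to the event (ct, x), with u = ct - x and v = ct + x, the causal domain is the rectangle [0,u] x [0,v], whose Euclidean area in the (x, ct) plane is uv/2 (the factor of one half is convention-dependent, so treat this as a sketch of the result rather than the article's own derivation):

    \[
    c^2\tau^2 = (ct)^2 - x^2 = uv, \qquad
    A_{\mathrm{causal\ domain}} = \tfrac{1}{2}\,uv = \tfrac{1}{2}\,c^2\tau^2 .
    \]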
Challenges and Opportunities for State Systems of Community Colleges: A Document Analysis
ERIC Educational Resources Information Center
Salinas, Cristobal; Friedel, Janice Nahra
2016-01-01
From their very beginnings, the community colleges have demonstrated a commitment to their founding principles of access, affordability, and quality. However, following February 2009, when President Obama set forth the agenda for U.S. postsecondary education to have the highest proportion of college graduates in the world by 2020, priority shifted…
Economic, neurobiological, and behavioral perspectives on building America’s future workforce
Knudsen, Eric I.; Heckman, James J.; Cameron, Judy L.; Shonkoff, Jack P.
2006-01-01
A growing proportion of the U.S. workforce will have been raised in disadvantaged environments that are associated with relatively high proportions of individuals with diminished cognitive and social skills. A cross-disciplinary examination of research in economics, developmental psychology, and neurobiology reveals a striking convergence on a set of common principles that account for the potent effects of early environment on the capacity for human skill development. Central to these principles are the findings that early experiences have a uniquely powerful influence on the development of cognitive and social skills and on brain architecture and neurochemistry, that both skill development and brain maturation are hierarchical processes in which higher level functions depend on, and build on, lower level functions, and that the capacity for change in the foundations of human skill development and neural circuitry is highest earlier in life and decreases over time. These findings lead to the conclusion that the most efficient strategy for strengthening the future workforce, both economically and neurobiologically, and improving its quality of life is to invest in the environments of disadvantaged children during the early childhood years. PMID:16801553
Direct bonding in diastema closure--high drama, immediate resolution.
Blitz, N
1996-07-01
Aesthetic rehabilitation in complex diastema closure cases is guided by the principles of proportion. The width-to-length ratio of the centrals must be pleasing. Achievement of this proper balance dictates treatment. It determines the following: 1) the amount of distal proximal reduction; 2) the decision to completely veneer the incisors vs. just adding to the interproximal; 3) the number of teeth to be treated; 4) the placement and location of naturally occurring prominences and concavities to create the illusion of a narrower tooth. The proper accommodation of these four topics will permit the maintenance or restoration of acceptable dimensions in the centrals. If they are made to appear harmonious, then the principle of "golden proportion" (1.6:1:0.6) can be achieved among the centrals, laterals and cuspids. Direct bonding in diastema closure cases allows the dentist and the patient complete control in the formation of that smile. This treatment modality is challenging and ultimately rewarding for the patient and the dentist. At times it enables us to restore form and function and to make our patients whole again, not just figuratively but literally.
Forest management under uncertainty for multiple bird population objectives
Moore, C.T.; Plummer, W.T.; Conroy, M.J.; Ralph, C. John; Rich, Terrell D.
2005-01-01
We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest-interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.
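The belief-updating step described at the end is a Bayes update of the model weights: if model k currently carries weight p_k and assigns likelihood L_k to the new monitoring data, then (generic adaptive-management notation, not the authors' own):

    \[
    p_k' = \frac{p_k \, L_k(\text{data})}{\sum_j p_j \, L_j(\text{data})} .
    \]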
Petrich, Nicholas T.; Spak, Scott N.; Carmichael, Gregory R.; Hu, Dingfei; Martinez, Andres; Hornbuckle, Keri C.
2013-01-01
Passive air samplers (PAS), including polyurethane foam (PUF), are widely deployed as an inexpensive and practical way to sample semi-volatile pollutants. However, concentration estimates from PAS rely on constant empirical mass transfer rates, which add unquantified uncertainties to concentrations. Here we present a method for modeling hourly sampling rates for semi-volatile compounds from hourly meteorology using first-principles chemistry, physics, and fluid dynamics, calibrated from depuration experiments. This approach quantifies and explains observed effects of meteorology on variability in compound-specific sampling rates and analyte concentrations; simulates nonlinear PUF uptake; and recovers synthetic hourly concentrations at a reference temperature. Sampling rates are evaluated for polychlorinated biphenyl congeners at a network of Harner model samplers in Chicago, Illinois during 2008, finding simulated average sampling rates within analytical uncertainty of those determined from loss of depuration compounds, and confirming quasi-linear uptake. Results indicate hourly, daily and interannual variability in sampling rates, sensitivity to temporal resolution in meteorology, and predictable volatility-based relationships between congeners. We quantify the importance of each simulated process to sampling rates and mass transfer and assess uncertainty contributed by advection, molecular diffusion, volatilization, and flow regime within the PAS, finding PAS chamber temperature contributes the greatest variability to total process uncertainty (7.3%). PMID:23837599
Zagmutt, Francisco J; Sempier, Stephen H; Hanson, Terril R
2013-10-01
Emerging diseases (ED) can have devastating effects on agriculture. Consequently, agricultural insurance for ED can develop if basic insurability criteria are met, including the capability to estimate the severity of ED outbreaks with associated uncertainty. The U.S. farm-raised channel catfish (Ictalurus punctatus) industry was used to evaluate the feasibility of using a disease spread simulation modeling framework to estimate the potential losses from new ED for agricultural insurance purposes. Two stochastic models were used to simulate the spread of ED between and within channel catfish ponds in Mississippi (MS) under high, medium, and low disease impact scenarios. The mean (95% prediction interval (PI)) proportion of ponds infected within disease-impacted farms was 7.6% (3.8%, 22.8%), 24.5% (3.8%, 72.0%), and 45.6% (4.0%, 92.3%), and the mean (95% PI) proportion of fish mortalities in ponds affected by the disease was 9.8% (1.4%, 26.7%), 49.2% (4.7%, 60.7%), and 88.3% (85.9%, 90.5%) for the low, medium, and high impact scenarios, respectively. The farm-level mortality losses from an ED were up to 40.3% of the total farm inventory and can be used for insurance premium rate development. Disease spread modeling provides a systematic way to organize the current knowledge on the ED perils and, ultimately, use this information to help develop actuarially sound agricultural insurance policies and premiums. However, the estimates obtained will include a large amount of uncertainty driven by the stochastic nature of disease outbreaks, by the uncertainty in the frequency of future ED occurrences, and by the often sparse data available from past outbreaks. © 2013 Society for Risk Analysis.
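The scenario summaries reported (means with 95% prediction intervals for loss proportions) are the typical output of a stochastic simulation post-processed as in the sketch below; the beta distributions here merely stand in for the authors' disease-spread models and are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(42)

    def summarize_losses(a, b, n_sims=100_000):
        # Draw simulated loss proportions and report the mean and 95% PI.
        p = rng.beta(a, b, size=n_sims)
        lo, hi = np.percentile(p, [2.5, 97.5])
        return p.mean(), (lo, hi)

    print(summarize_losses(2, 18))   # hypothetical low-impact scenario
    print(summarize_losses(18, 2))   # hypothetical high-impact scenario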
Quantifying and communicating the uncertainty of mineral resource evaluations
NASA Astrophysics Data System (ADS)
Mee, Katy; Marchant, Ben; Mankelow, Joseph; Deady, Eimear
2015-04-01
Three-dimensional subsurface models are increasingly being used to assess the value of sand and gravel mineral deposits. Planners might use this information to decide when deposits should be protected from new developments. The models are generally based on interpretations of relatively sparse boreholes and are therefore uncertain. This uncertainty propagates into the predictions of the value of the deposit and must be quantified and communicated to planners in a manner which permits informed decision-making. We discuss these issues in relation to a 60 km by 40 km study area in the south of England. We use the interpretations of 630 boreholes to build statistical models of the subsurface. Mineral deposit categories are defined in terms of the ratio of mineral depth to overburden depth and the proportion of fine particles within the mineral. We use a linear model of coregionalization to model the spatial distribution of these parameters. Furthermore, we use stochastic simulation methods to produce maps of the probability of each category of mineral deposit occurring at each location in the study area. These maps indicate where deposits of suitable sand and gravel might be expected to occur. However, they are only telling us the probability that if a borehole was to be drilled at a location that its contents would satisfy the criteria of each mineral category. Planners require information for areas much larger than a single borehole. Therefore, we demonstrate how the model can be up-scaled to a 1 km2 site. We again use a stochastic simulation method to produce box-whisker plots which illustrate the proportions of gravels, sands, fine sands and fine material that are predicted to occur in the region and the uncertainty associated with the predictions.
Carvalho, A B; Sampaio, M C; Varandas, F R; Klaczko, L B
1998-01-01
Most sexually reproducing species have sexual proportions around 1:1. This major biological phenomenon remained unexplained until 1930, when FISHER proposed that it results from a mechanism of natural selection. Here we report the first experimental test of his model that obeys all its assumptions. We used a naturally occurring X-Y meiotic drive system--the sex-ratio trait of Drosophila mediopunctata--to generate female-biased experimental populations. As predicted by FISHER, these populations evolved toward equal sex proportions due to natural selection, by accumulation of autosomal alleles that direct the parental reproductive effort toward the rare sex. Classical Fisherian evolution is a rather slow mechanism: despite a very large amount of genetic variability, the experimental populations evolved from 16% of males to 32% of males in 49 generations and would take 330 generations (29 years) to reach 49%. This slowness has important implications for species potentially endangered by skewed sexual proportions, such as reptiles with temperature-dependent sex determination. PMID:9504919
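A toy individual-based simulation of the Fisherian mechanism tested above (not a model of the fly experiment itself) shows the sex ratio creeping from a female-biased start toward 1:1; the population size, mutation step, and fitness form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy model of Fisher's principle: each parent carries a heritable
# "fraction of sons" strategy m.  Because every offspring has one father
# and one mother, reproduction through the rarer sex pays off more, which
# pushes the population mean of m toward 0.5.
N, generations = 2000, 200
m = np.full(N, 0.16)                      # start female-biased, 16% males

for g in range(generations):
    M = m.mean()                          # current population male fraction
    # Shaw-Mohler-type fitness: sons pay off as 1/M, daughters as 1/(1-M)
    w = m / M + (1 - m) / (1 - M)
    parents = rng.choice(N, size=N, p=w / w.sum())
    m = np.clip(m[parents] + rng.normal(0, 0.01, N), 0.01, 0.99)

print(f"mean sex ratio after {generations} generations: {m.mean():.3f}")
```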
Portfolio theory and cost-effectiveness analysis: a further discussion.
Sendi, Pedram; Al, Maiwenn J; Rutten, Frans F H
2004-01-01
Portfolio theory has been suggested as a means to improve the risk-return characteristics of investments in health-care programs through diversification when costs and effects are uncertain. This approach is based on the assumption that the investment proportions are not subject to uncertainty and that the budget can be invested in toto in health-care programs. In the present paper we develop an algorithm that accounts for the fact that investment proportions in health-care programs may be uncertain (due to the uncertainty associated with costs) and limited (due to the size of the programs). The initial budget allocation across programs may therefore be revised at the end of the investment period to cover the extra costs of some programs with the leftover budget of other programs in the portfolio. Once the total budget is equivalent to or exceeds the expected costs of the programs in the portfolio, the initial budget allocation policy does not impact the risk-return characteristics of the combined portfolio, i.e., there is no benefit from diversification anymore. The applicability of portfolio methods to improve the risk-return characteristics of investments in health care is limited to situations where the available budget is much smaller than the expected costs of the programs to be funded.
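A brief numerical sketch of the diversification effect discussed above, for a hypothetical two-program portfolio with uncertain, independent health effects; all distributions and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical programs with uncertain health effects per unit of budget.
# Diversification lowers the spread of the combined return; as the paper
# argues, this only matters while the budget is small relative to the
# expected costs of the programs.
n = 50_000
effect_a = rng.normal(10.0, 3.0, n)   # QALYs per unit budget, program A (assumed)
effect_b = rng.normal(9.0, 2.0, n)    # QALYs per unit budget, program B (assumed)

for w in (0.0, 0.25, 0.5, 0.75, 1.0):            # share invested in A
    portfolio = w * effect_a + (1 - w) * effect_b
    print(f"w_A={w:.2f}  mean={portfolio.mean():6.2f}  sd={portfolio.std():.2f}")
```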
Target Uncertainty Mediates Sensorimotor Error Correction
Vijayakumar, Sethu; Wolpert, Daniel M.
2017-01-01
Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323
Target Uncertainty Mediates Sensorimotor Error Correction.
Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M
2017-01-01
Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.
Perfetti, Christopher M.; Rearden, Bradley T.
2016-03-01
The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization and reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.
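For orientation, the reference quantity mentioned at the end, a direct perturbation sensitivity coefficient, can be illustrated with a toy response function standing in for a full transport calculation; the functional form below is an assumption chosen only for the demonstration.

```python
# Direct perturbation sensitivity: perturb one input (here a toy cross
# section) up and down, re-evaluate the response, and form the
# central-difference relative sensitivity S = (dk/k) / (dsigma/sigma).

def k_of_sigma(sigma_f: float, sigma_c: float = 0.4) -> float:
    """Toy response: an 'eigenvalue' that rises with fission, falls with capture."""
    return sigma_f / (sigma_f + sigma_c)

def direct_perturbation_sensitivity(sigma: float, delta: float = 0.01) -> float:
    k0 = k_of_sigma(sigma)
    k_plus = k_of_sigma(sigma * (1 + delta))
    k_minus = k_of_sigma(sigma * (1 - delta))
    return (k_plus - k_minus) / (2 * delta * k0)

print(direct_perturbation_sensitivity(1.0))   # ~0.286 for this toy model
```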
NASA Astrophysics Data System (ADS)
Siddique, Sami; Jaffray, David
2007-03-01
A central purpose of image-guidance is to assist the interventionalist with feedback of geometric performance in the direction of therapy delivery. Tradeoffs exist between accuracy, precision and the constraints imposed by parameters used in the generation of images. A framework that uses geometric performance as feedback to control these parameters can balance such tradeoffs in order to maintain the requisite localization precision for a given clinical procedure. We refer to this principle as Active Image-Guidance (AIG). This framework requires estimates of the uncertainty in the estimated location of the object of interest. In this study, a simple fiducial marker detected under X-ray fluoroscopy is considered and it is shown that a relation exists between the applied imaging dose and the uncertainty in localization for a given observer. A robust estimator of the location of a fiducial in the thorax during respiration under X-ray fluoroscopy is demonstrated using a particle filter based approach that outputs estimates of the location and the associated spatial uncertainty. This approach gives an RMSE of 1.3 mm, and the uncertainty estimates are found to be correlated with the error in the estimates. Furthermore, the particle filtering approach is employed to output location estimates and the associated uncertainty not only at instances of pulsed exposure but also between exposures. Such a system has applications in image-guided interventions (surgery, radiotherapy, interventional radiology) where there are latencies between the moment of imaging and the act of intervention.
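A minimal sketch of a particle filter that outputs both a location estimate and its spatial uncertainty, in the spirit of the estimator described above; the respiratory motion model, noise levels, and frame rate are illustrative assumptions rather than the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Track a fiducial moving with respiration in 1-D and report both a
# location estimate and its spatial uncertainty (the posterior spread).
n_particles, n_frames, dt = 500, 60, 0.1          # 10 Hz imaging (assumed)
true_pos = lambda t: 10.0 * np.sin(2 * np.pi * 0.25 * t)  # mm, 15 breaths/min

particles = rng.normal(0.0, 5.0, n_particles)
sigma_meas, sigma_proc = 1.0, 2.0                 # mm

for k in range(n_frames):
    particles += rng.normal(0.0, sigma_proc, n_particles)   # predict (diffuse)
    z = true_pos(k * dt) + rng.normal(0.0, sigma_meas)      # noisy fluoro fix
    w = np.exp(-0.5 * ((z - particles) / sigma_meas) ** 2)  # Gaussian likelihood
    w /= w.sum()
    est = np.sum(w * particles)                             # posterior mean
    unc = np.sqrt(np.sum(w * (particles - est) ** 2))       # posterior spread
    particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample

print(f"final estimate {est:.2f} mm "
      f"(truth {true_pos((n_frames - 1) * dt):.2f} mm), uncertainty {unc:.2f} mm")
```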
Krom, André
2011-10-01
Effective infectious disease control may require states to restrict the liberty of individuals. Since preventing harm to others is almost universally accepted as a legitimate (prima facie) reason for restricting the liberty of individuals, it seems plausible to employ a mid-level harm principle in infectious disease control. Moral practices like infectious disease control support - or even require - a certain level of theory-modesty. However, employing a mid-level harm principle in infectious disease control faces at least three problems. First, it is unclear what we gain by attaining convergence on a specific formulation of the harm principle. Likely candidates for convergence, a harm principle aimed at preventing harmful conduct, supplemented by considerations of effectiveness and always choosing the least intrusive means, still leave ample room for normative disagreement. Second, while mid-level principles are sometimes put forward in response to the problem of normative theories attaching different weight to moral principles, employing a mid-level harm principle completely leaves open how to determine what weight to attach to it in application. Third, there appears to be a trade-off between attaining convergence and finding a formulation of the harm principle that can justify liberty-restrictions in all situations of contagion, including interventions that are commonly allowed. These are not reasons to abandon mid-level theorizing altogether. But there is no reason to be too theory-modest in applied ethics. Morally justifying, for example, whether a liberty-restriction in infectious disease control is proportional to the aim of harm-prevention promptly requires moving beyond the mid-level harm principle. © 2011 Blackwell Publishing Ltd.
Integral control for population management.
Guiver, Chris; Logemann, Hartmut; Rebarber, Richard; Bill, Adam; Tenhumberg, Brigitte; Hodgson, Dave; Townley, Stuart
2015-04-01
We present a novel management methodology for restocking a declining population. The strategy uses integral control, a concept ubiquitous in control theory which has not been applied to population dynamics. Integral control is based on dynamic feedback, using measurements of the population to inform management strategies, and is robust to model uncertainty, an important consideration for ecological models. We demonstrate from first principles why such an approach to population management is suitable via theory and examples.
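A minimal sketch of integral control applied to restocking, under an assumed linear population model; the growth factor, target, and gain below are illustrative. The point is that the integral action drives the population to the target without exact knowledge of the growth factor.

```python
# Declining population x_{k+1} = a*x_k + u_k with growth factor a < 1,
# driven to a target abundance by feeding back the accumulated (integral)
# error; robustness comes from not needing to know a exactly.
a, target, gain = 0.8, 100.0, 0.15
x, integral = 20.0, 0.0

for year in range(40):
    error = target - x             # measured shortfall
    integral += error              # integral action accumulates the error
    u = max(0.0, gain * integral)  # number restocked (cannot be negative)
    x = a * x + u

print(f"abundance after 40 years: {x:.1f} (target {target})")
```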
A survey and new measurements of ice vapor pressure at temperatures between 170 and 250 K
NASA Technical Reports Server (NTRS)
Marti, James; Mauersberger, Konrad
1993-01-01
New measurements of ice vapor pressures at temperatures between 170 and 250 K are presented and published vapor pressure data are summarized. An empirical vapor pressure equation was derived and allows prediction of vapor pressures between 170 K and the triple point of water with an accuracy of approximately 2 percent. Predictions obtained agree, within experimental uncertainty, with the most reliable equation derived from thermodynamic principles.
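As a sketch, an empirical fit of this simple Clausius-Clapeyron-like form can be evaluated as follows; the coefficients are the ones commonly quoted for the Marti-Mauersberger equation (p in Pa, T in K) and should be treated here as assumed values, since the abstract itself does not state them.

```python
import numpy as np

# Empirical ice vapor-pressure fit of the form log10(p) = A/T + B.
# Coefficients as commonly quoted for the Marti-Mauersberger fit (assumed).
A, B = -2663.5, 12.537

def ice_vapor_pressure(T_kelvin):
    """Saturation vapor pressure over ice, Pa, for roughly 170 K to 273.16 K."""
    return 10.0 ** (A / np.asarray(T_kelvin) + B)

for T in (170.0, 200.0, 250.0, 273.16):
    print(f"T = {T:6.2f} K  ->  p = {ice_vapor_pressure(T):.3e} Pa")
```

At the triple point (273.16 K) this form returns roughly 611 Pa, consistent with the accepted triple-point pressure of water.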
Asymmetric information and economics
NASA Astrophysics Data System (ADS)
Frieden, B. Roy; Hawkins, Raymond J.
2010-01-01
We present an expression of the economic concept of asymmetric information with which it is possible to derive the dynamical laws of an economy. To illustrate the utility of this approach we show how the assumption of optimal information flow leads to a general class of investment strategies including the well-known Q theory of Tobin. Novel consequences of this formalism include a natural definition of market efficiency and an uncertainty principle relating capital stock and investment flow.
Thermal expansion in UO2 determined by high-energy X-ray diffraction
Guthrie, M.; Benmore, C. J.; Skinner, L. B.; ...
2016-06-24
In this study, we present crystallographic analyses of high-energy X-ray diffraction data on polycrystalline UO2 up to the melting temperature. The Rietveld refinements of our X-ray data are in agreement with previous measurements, but are systematically located around the upper bound of their uncertainty, indicating a slightly steeper trend of thermal expansion compared to established values. This observation is consistent with recent first principles calculations.
Cultivating the Grapevine: An Analysis of Rumor Principles and Concepts
2015-12-01
Communications expert Timothy Coombs claims, "most protesters were pro-Tibet and were upset by China's treatment of Tibet, while other ..." (Crisis Communication: International Perspectives on Hits and Misses, ed. Amiso M. George and Cornelius B. Pratt, New York: Routledge Taylor and Francis) ... ambiguity and uncertainty. As Bordia and DiFonzo assert, rumor is a form of interpersonal communication, within a group, which is intended to "reduce ..."
Statistical aspects of the Klein-Gordon oscillator in the framework of GUP
NASA Astrophysics Data System (ADS)
Khosropour, B.
2018-01-01
Investigations in perturbative string theory and quantum gravity suggest that there is a measurable minimal length in nature. In this work, according to the generalized uncertainty principle, we study the statistical characteristics of the Klein-Gordon oscillator (KLO). The modified energy spectrum of the KLO is obtained. The generalized thermodynamical quantities of the KLO, such as the partition function, mean energy, and entropy, are calculated using the modified energy spectrum.
2011-11-30
Mankiw, N. G. (2006). Principles of Economics (4th ed.). Mason, OH: Thomson South-Western. ... When the choice to in-source or outsource an installation function or service requirement exists, in these challenging economic times, it is now more ... decision uncertainties.
Wave Functions for Time-Dependent Dirac Equation under GUP
NASA Astrophysics Data System (ADS)
Zhang, Meng-Yao; Long, Chao-Yun; Long, Zheng-Wen
2018-04-01
In this work, the time-dependent Dirac equation is investigated in the generalized uncertainty principle (GUP) framework. It is possible to construct exact solutions of the Dirac equation when the time-dependent potentials satisfy the proper conditions. In (1+1) dimensions, the analytical wave functions of the Dirac equation under the GUP have been obtained for two kinds of time-dependent potentials. Supported by the National Natural Science Foundation of China under Grant No. 11565009
Ultraviolet Spectral Irradiance Scale Comparison: 210 nm to 300 nm
Thompson, Ambler; Early, Edward A.; O’Brian, Thomas R.
1998-01-01
Comparison of the irradiances from a number of ultraviolet spectral irradiance standards, based on different physical principles, showed agreement to within their combined standard uncertainties as assigned to them by NIST. The wavelength region of the spectral irradiance comparison was from 210 nm to 300 nm. The spectral irradiance sources were: an electron storage ring, 1000 W quartz-halogen lamps, deuterium arc lamps, and a windowless argon miniarc. PMID:28009378
The Restricted Isometry Property for Time-Frequency Structured Random Matrices
2011-06-16
tests illustrating the use of Ψg for compressive sensing are presented in [41]. They illustrate that empirically Ψg performs very similarly to ... Candès, E. J., Romberg, J., Tao, T.: Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory 52(2), 489-509 (2006). [12] Candès, E. J., Romberg, J., Tao, T.: Stable signal recovery from incomplete and inaccurate measurements. Comm. ...
Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?
Geier, J.; Voss, C.I.; Dverstorp, B.
2002-01-01
A simple evaluation can be used to characterize the capacity of crystalline bedrock to act as a barrier to release radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealized models of pore geometry. Application to an intensively characterized site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geological barrier performance. Comparison with seven other less intensively characterized crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterized by state-of-the art site investigation methods prior to repository construction. A simple evaluation provides a simple and robust practical approach for inclusion in performance assessment.
Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?
Geier, J.; Voss, C.I.; Dverstorp, B.
2002-01-01
A simple evaluation can be used to characterise the capacity of crystalline bedrock to act as a barrier to releases of radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealised models of pore geometry. Application to an intensively characterised site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geologic-barrier performance. Comparison with seven other less intensively characterised crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterised by state-of-the art site investigation methods prior to repository construction. A simple evaluation provides a simple and robust practical approach for inclusion in performance assessment.
Proton elastic form factor ratios to Q² = 3.5 GeV² by polarization transfer
NASA Astrophysics Data System (ADS)
Punjabi, V.; Perdrisat, C. F.; Aniol, K. A.; Baker, F. T.; Berthot, J.; Bertin, P. Y.; Bertozzi, W.; Besson, A.; Bimbot, L.; Boeglin, W. U.; Brash, E. J.; Brown, D.; Calarco, J. R.; Cardman, L. S.; Chai, Z.; Chang, C.-C.; Chen, J.-P.; Chudakov, E.; Churchwell, S.; Cisbani, E.; Dale, D. S.; Leo, R. De; Deur, A.; Diederich, B.; Domingo, J. J.; Epstein, M. B.; Ewell, L. A.; Fissum, K. G.; Fleck, A.; Fonvieille, H.; Frullani, S.; Gao, J.; Garibaldi, F.; Gasparian, A.; Gerstner, G.; Gilad, S.; Gilman, R.; Glamazdin, A.; Glashausser, C.; Gomez, J.; Gorbenko, V.; Green, A.; Hansen, J.-O.; Howell, C. R.; Huber, G. M.; Iodice, M.; de Jager, C. W.; Jaminion, S.; Jiang, X.; Jones, M. K.; Kahl, W.; Kelly, J. J.; Khayat, M.; Kramer, L. H.; Kumbartzki, G.; Kuss, M.; Lakuriki, E.; Laveissière, G.; Lerose, J. J.; Liang, M.; Lindgren, R. A.; Liyanage, N.; Lolos, G. J.; Macri, R.; Madey, R.; Malov, S.; Margaziotis, D. J.; Markowitz, P.; McCormick, K.; McIntyre, J. I.; Meer, R. L.; Michaels, R.; Milbrath, B. D.; Mougey, J. Y.; Nanda, S. K.; Offermann, E. A.; Papandreou, Z.; Pentchev, L.; Petratos, G. G.; Piskunov, N. M.; Pomatsalyuk, R. I.; Prout, D. L.; Quéméner, G.; Ransome, R. D.; Raue, B. A.; Roblin, Y.; Roche, R.; Rutledge, G.; Rutt, P. M.; Saha, A.; Saito, T.; Sarty, A. J.; Smith, T. P.; Sorokin, P.; Strauch, S.; Suleiman, R.; Takahashi, K.; Templon, J. A.; Todor, L.; Ulmer, P. E.; Urciuoli, G. M.; Vernin, P.; Vlahovic, B.; Voskanyan, H.; Wijesooriya, K.; Wojtsekhowski, B. B.; Woo, R. J.; Xiong, F.; Zainea, G. D.; Zhou, Z.-L.
2005-05-01
The ratio of the proton elastic electromagnetic form factors, GEp/GMp, was obtained by measuring Pt and Pℓ, the transverse and longitudinal recoil proton polarization components, respectively, for the elastic e⃗p → ep⃗ reaction in the four-momentum transfer squared range of 0.5 to 3.5 GeV². In the single-photon exchange approximation, GEp/GMp is directly proportional to Pt/Pℓ. The simultaneous measurement of Pt and Pℓ in a polarimeter reduces systematic uncertainties. The results for GEp/GMp show a systematic decrease with increasing Q², indicating for the first time a definite difference in the distribution of charge and magnetization in the proton. The data have been reanalyzed and their systematic uncertainties have become significantly smaller than those reported previously.
Traas, T P; Luttik, R; Jongbloed, R H
1996-08-01
In previous studies, the risk of toxicant accumulation in food chains was used to calculate quality criteria for surface water and soil. A simple algorithm was used to calculate maximum permissible concentrations [MPC = no-observed-effect concentration/bioconcentration factor (NOEC/BCF)]. These studies were limited to simple food chains. This study presents a method to calculate MPCs for more complex food webs of predators. The previous method is expanded. First, toxicity data (NOECs) for several compounds were corrected for differences between laboratory animals and animals in the wild. Second, for each compound, it was assumed these NOECs were a sample of a log-logistic distribution of mammalian and avian NOECs. Third, bioaccumulation factors (BAFs) for major food items of predators were collected and were assumed to derive from different log-logistic distributions of BAFs. Fourth, MPCs for each compound were calculated using Monte Carlo sampling from the NOEC and BAF distributions. An uncertainty analysis for cadmium was performed to identify the most uncertain parameters of the model. Model analysis indicated that most of the prediction uncertainty of the model can be ascribed to the uncertainty of species sensitivity as expressed by NOECs. A very small proportion of model uncertainty is contributed by BAFs from food webs. Correction factors for the conversion of NOECs from laboratory conditions to the field have some influence on the final value of the MPC5, but the total prediction uncertainty of the MPC is quite large. It is concluded that the uncertainty in species sensitivity is quite large. To avoid unethical toxicity testing with mammalian or avian predators, the use of this uncertainty in the proposed method of calculating MPC distributions cannot be avoided. The fifth percentile of the MPC is suggested as a safe value for top predators.
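A compact sketch of the four-step Monte Carlo scheme described above; the log-logistic parameterization and all numbers are invented stand-ins, not the study's fitted distributions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Draw NOECs and BAFs from log-logistic distributions, form
# MPC = NOEC / BAF, and take the fifth percentile (MPC5) as the
# protective value for top predators.
def log_logistic(rng, median, scale, size):
    """Sample a log-logistic variate: log(X) is logistic(log(median), scale)."""
    u = rng.uniform(size=size)
    return median * (u / (1 - u)) ** scale

n = 100_000
noec = log_logistic(rng, median=50.0, scale=0.35, size=n)  # mg/kg food (assumed)
baf = log_logistic(rng, median=2.0, scale=0.25, size=n)    # food-web BAF (assumed)
mpc = noec / baf

print(f"median MPC {np.median(mpc):.1f}, MPC5 = {np.percentile(mpc, 5):.1f}")
```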
[Selected ethical problems of oncologic patients during the terminal period].
Iwaszczyszyn, J; Kwiecińska, A
2001-01-01
A patient suffering from a terminal disease depends on his environment more than any other patient. He often suffers from nervous breakdown, anxiety and fear, and is usually unprotected from the environment. The fast development of medical science and its technicization can lead towards dehumanization and a lack of psychological and spiritual care, which should be based on clear ethical principles. The main lines of the ethical principles of the Health Service, included in the Deontological Code of Physicians and the Collection of Ethical Principles for a Qualified Nurse, are the main rules of how to proceed so as to fulfill the rule: "the benefit of the patient is the superior law." According to its speciality, Palliative Medicine also introduces four general ethical principles: 1. The patient's will is a rule of treatment. 2. The principle of proportion--benefits from the treatment should be higher than losses and suffering from iatrogenic acting. 3. The principle of equality--stopping a treatment does not differ from not undertaking treatment. 4. The principle of relativity--life is not an absolute good, death is not an absolute evil. The holistic approach of Palliative Medicine also determines specific ethical attitudes, especially in the following: 1. Communication between a therapist and a patient and his family (interpersonal attitudes). 2. Procedures for lessening suffering and its interpretation according to culture, tradition and religion ("nonsense and significance of suffering"). 3. Negation of euthanasia. 4. Spiritual, psychological and social care of patients.
Age-related inequalities in health and healthcare: the life stages approach.
Jecker, Nancy S
2018-06-01
How should healthcare systems prepare to care for growing numbers and proportions of older people? Older people generally suffer worse health than younger people do. Should societies take steps to reduce age-related health inequalities? Some express concern that doing so would increase age-related inequalities in healthcare. This paper addresses this debate by (1) presenting an argument in support of three principles for distributing scarce resources between age groups; (2) framing these principles of age group justice in terms of life stages; and (3) indicating policy implications that merit further attention in light of rapidly aging societies. © 2017 John Wiley & Sons Ltd.
Climate system properties determining the social cost of carbon
NASA Astrophysics Data System (ADS)
Otto, Alexander; Todd, Benjamin J.; Bowerman, Niel; Frame, David J.; Allen, Myles R.
2013-06-01
The choice of an appropriate scientific target to guide global mitigation efforts is complicated by uncertainties in the temperature response to greenhouse gas emissions. Much climate policy discourse has been based on the equilibrium global mean temperature increase following a concentration stabilization scenario. This is determined by the equilibrium climate sensitivity (ECS) which, in many studies, shows persistent, fat-tailed uncertainty. However, for many purposes, the equilibrium response is less relevant than the transient response. Here, we show that one prominent policy variable, the social cost of carbon (SCC), is generally better constrained by the transient climate response (TCR) than by the ECS. Simple analytic expressions show the SCC to be directly proportional to the TCR under idealized assumptions when the rate at which we discount future damage equals 2.8%. Using ensemble simulations of a simple climate model we find that knowing the true value of the TCR can reduce the relative uncertainty in the SCC substantially more, up to a factor of 3, than knowing the ECS under typical discounting assumptions. We conclude that the TCR, which is better constrained by observations, less subject to fat-tailed uncertainty and more directly related to the SCC, is generally preferable to the ECS as a single proxy for the climate response in SCC calculations.
Robust gaze-steering of an active vision system against errors in the estimated parameters
NASA Astrophysics Data System (ADS)
Han, Youngmo
2015-01-01
Gaze-steering is often used to broaden the viewing range of an active vision system. Gaze-steering procedures are usually based on estimated parameters such as image position, image velocity, depth and camera calibration parameters. However, there may be uncertainties in these estimated parameters because of measurement noise and estimation errors. In this case, robust gaze-steering cannot be guaranteed. To compensate for such problems, this paper proposes a gaze-steering method based on a linear matrix inequality (LMI). In this method, we first propose a proportional derivative (PD) control scheme on the unit sphere that does not use depth parameters. This proposed PD control scheme can avoid uncertainties in the estimated depth and camera calibration parameters, as well as inconveniences in their estimation process, including the use of auxiliary feature points and highly non-linear computation. Furthermore, the control gain of the proposed PD control scheme on the unit sphere is designed using LMI such that the designed control is robust in the presence of uncertainties in the other estimated parameters, such as image position and velocity. Simulation results demonstrate that the proposed method provides a better compensation for uncertainties in the estimated parameters than the contemporary linear method and steers the gaze of the camera more steadily over time than the contemporary non-linear method.
Cost of remembering a bit of information
NASA Astrophysics Data System (ADS)
Chiuchiù, D.; López-Suárez, M.; Neri, I.; Diamantini, M. C.; Gammaitoni, L.
2018-05-01
In 1961, Landauer [R. Landauer, IBM J. Res. Develop. 5, 183 (1961), 10.1147/rd.53.0183] pointed out that resetting a binary memory requires a minimum energy of k_BT ln(2). However, once written, any memory is doomed to lose its content if no action is taken. To avoid memory losses, a refresh procedure is periodically performed. We present a theoretical model and an experiment on a microelectromechanical system to evaluate the minimum energy required to preserve one bit of information over time. Two main conclusions are drawn: (i) in principle, the energetic cost to preserve information for a fixed time duration with a given error probability can be arbitrarily reduced if the refresh procedure is performed often enough, and (ii) the Heisenberg uncertainty principle sets an upper bound on the memory lifetime.
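Back-of-envelope numbers for the quantities involved can be computed as below; the quadratic reduction of the per-refresh cost is a purely illustrative assumption used to mimic conclusion (i), not the paper's actual model.

```python
import numpy as np

# Landauer reset bound k_B*T*ln(2) at room temperature, plus a toy refresh
# schedule in which the energy per refresh falls faster than the refresh
# count grows, so the total cost of preserving the bit shrinks.
k_B, T = 1.380649e-23, 300.0                    # J/K, K
landauer = k_B * T * np.log(2)
print(f"Landauer reset bound at 300 K: {landauer:.3e} J")

for n_refresh in (1, 10, 100, 1000):            # refreshes per storage period
    per_refresh = landauer / n_refresh**2       # assumed toy scaling
    total = n_refresh * per_refresh
    print(f"{n_refresh:5d} refreshes -> total cost {total:.3e} J")
```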
Evaluating the uncertainty of input quantities in measurement models
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Elster, Clemens
2014-06-01
The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
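A minimal example of the kind of propagation exercise the discussion envisages: a Type A (Gaussian) input and a Type B (rectangular) input propagated through a simple measurement model by Monte Carlo. The model R = V/I and every number are invented; the paper's own examples are in R, while Python is used here.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Propagate input-quantity uncertainty through a measurement model by
# Monte Carlo: R = V / I, with a Type A (Gaussian, from repeated readings)
# voltage and a Type B (rectangular, from a certificate) current.
n = 200_000
V = rng.normal(5.000, 0.002, n)        # volts, Type A evaluation
I = rng.uniform(0.999, 1.001, n)       # amperes, Type B evaluation
R = V / I

print(f"R = {R.mean():.4f} ohm, u(R) = {R.std(ddof=1):.4f} ohm")
print("95% coverage interval:", np.percentile(R, [2.5, 97.5]).round(4))
```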
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hub, Martina; Thieke, Christian; Kessler, Marc L.
2012-04-15
Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in context of other mapping tasks as well.
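The core idea, that a given geometric uncertainty translates into a dose-accumulation error scaled by the local dose gradient, can be sketched in one dimension as follows; the dose profile and the 2 mm registration uncertainty are assumed for illustration, and the first-order product is a simplification of the paper's sensitivity analysis.

```python
import numpy as np

# First-order estimate of mapped-dose uncertainty: registration error
# matters most where the dose gradient is steep (e.g., in a penumbra).
dx = 1.0                                          # mm voxel spacing
x = np.arange(0, 100, dx)
dose = 60.0 / (1.0 + np.exp((x - 50.0) / 3.0))    # Gy, steep falloff at 50 mm

grad = np.abs(np.gradient(dose, dx))              # Gy/mm
sigma_geom = 2.0                                  # mm, assumed registration uncertainty
dose_error = grad * sigma_geom                    # Gy, first-order uncertainty

print(f"max dose gradient {grad.max():.2f} Gy/mm -> "
      f"up to {dose_error.max():.1f} Gy mapping uncertainty")
```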
NASA Astrophysics Data System (ADS)
Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.
2017-04-01
Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and resolution.
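A deliberately simplified data-driven postprocessor in the same spirit, though not the authors' Pareto-optimality method: empirical error quantiles are learned separately for low and high flows, which copes with heteroscedastic errors without distributional assumptions. The synthetic data below are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# Dress a deterministic forecast with empirical error quantiles learned
# separately for low and high flows (a crude flow-conditional band).
n = 3000
obs = rng.gamma(2.0, 20.0, n)              # synthetic streamflow, m3/s
fcst = obs * rng.normal(1.0, 0.15, n)      # forecast with flow-proportional error

split = np.median(fcst)
bands = {}
for name, mask in (("low", fcst <= split), ("high", fcst > split)):
    err = obs[mask] - fcst[mask]
    bands[name] = np.percentile(err, [5, 95])   # empirical 90% band

new_fcst = 80.0                                  # a new deterministic forecast
lo, hi = bands["high" if new_fcst > split else "low"]
print(f"forecast {new_fcst:.0f} m3/s, 90% interval "
      f"({new_fcst + lo:.0f}, {new_fcst + hi:.0f}) m3/s")
```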
Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.
2012-01-01
Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in context of other mapping tasks as well. PMID:22482640
Can neutrino mass be deduced from beta particle spectrum?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Semkow, T.M.
1993-12-31
With faith in the 17-keV neutrino being uncertain, it is important to examine the effects of detector resolution and response on the detection limits of a massive neutrino. The authors use Fermi theory and generate by Monte Carlo up to 5·10⁹ β⁻ decay events from ³⁵S. The β⁻ spectra are then resolved by χ² minimization. We show that, given high statistics and accurate knowledge of the response function, it should be possible to detect a neutrino mass with a proportional detector, particularly with the gas-scintillation proportional detector, in addition to semiconductor detectors. This paper presents a design of a double-chamber Xe gas-scintillation proportional detector in which backscattering effects are suppressed. However, even slight uncertainties in the response functions, as well as ~10⁻³ relative energy nonlinearities in the β⁻ spectrum, may create an artificial effect of neutrino mass.
Uncertainty in geocenter estimates in the context of ITRF2014
NASA Astrophysics Data System (ADS)
Riddell, Anna R.; King, Matt A.; Watson, Christopher S.; Sun, Yu; Riva, Riccardo E. M.; Rietbroek, Roelof
2017-05-01
Uncertainty in the geocenter position and its subsequent motion affects positioning estimates on the surface of the Earth and downstream products such as site velocities, particularly the vertical component. The current version of the International Terrestrial Reference Frame, ITRF2014, derives its origin as the long-term averaged center of mass as sensed by satellite laser ranging (SLR), and by definition, it adopts only linear motion of the origin with uncertainty determined using a white noise process. We compare weekly SLR translations relative to the ITRF2014 origin, with network translations estimated from station displacements from surface mass transport models. We find that the proportion of variance explained in SLR translations by the model-derived translations is on average less than 10%. Time-correlated noise and nonlinear rates, particularly evident in the Y and Z components of the SLR translations with respect to the ITRF2014 origin, are not fully replicated by the model-derived translations. This suggests that translation-related uncertainties are underestimated when a white noise model is adopted and that substantial systematic errors remain in the data defining the ITRF origin. When using a white noise model, we find uncertainties in the rate of SLR X, Y, and Z translations of ±0.03, ±0.03, and ±0.06, respectively, increasing to ±0.13, ±0.17, and ±0.33 (mm/yr, 1 sigma) when a power law and white noise model is adopted.
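The effect of time-correlated noise on rate uncertainties can be illustrated with a small Monte Carlo experiment; AR(1) noise is used here as a simple stand-in for the power-law noise discussed above, and all settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Fit a linear rate to many synthetic translation series (noise amplitude
# 1 mm) and compare the scatter of the fitted rates with and without
# temporal correlation: the white-noise case understates the spread.
n_weeks, n_sim, phi = 520, 2000, 0.9          # 10 years of weekly solutions
t = np.arange(n_weeks) / 52.0                 # time in years

def rate_scatter(correlated: bool) -> float:
    rates = np.empty(n_sim)
    for i in range(n_sim):
        e = rng.normal(0.0, 1.0, n_weeks)     # mm
        if correlated:
            for k in range(1, n_weeks):       # AR(1) recursion
                e[k] += phi * e[k - 1]
        rates[i] = np.polyfit(t, e, 1)[0]     # fitted rate, mm/yr
    return rates.std()

print(f"white-noise rate scatter     : {rate_scatter(False):.3f} mm/yr")
print(f"correlated-noise rate scatter: {rate_scatter(True):.3f} mm/yr")
```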
ON HIGHLY CLUMPED MAGNETIC WIND MODELS FOR COOL EVOLVED STARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, G. M.
2010-09-10
Recently, it has been proposed that the winds of non-pulsating and non-dusty K and M giants and supergiants may be driven by some form of magnetic pressure acting on highly clumped wind material. While many researchers believe that magnetic processes are responsible for cool evolved stellar winds, existing MHD and Alfven wave-driven wind models have magnetic fields that are essentially radial and tied to the photosphere. The clumped magnetic wind scenario is quite different in that the magnetic flux is also being carried away from the star with the wind. We test this clumped wind hypothesis by computing continuum radio fluxes from the ζ Aur semiempirical model of Baade et al., which is based on wind-scattered line profiles. The radio continuum opacity is proportional to the electron density squared, while the line scattering opacity is proportional to the gas density. This difference in proportionality provides a test for the presence of large clumping factors. We derive the radial distribution of clump factors (CFs) for ζ Aur by comparing the nonthermal pressures required to produce the semiempirical velocity distribution with the expected thermal pressures. The CFs are ~5 throughout the sub-sonic inner wind region and then decline outward. These implied clumping factors lead to excess radio emission at 2.0 cm, while at 6.2 cm they improve agreement with the smooth unclumped model. Smaller clumping factors of ~2 lead to better overall agreement but also increase the discrepancy at 2 cm. These results do not support the magnetic clumped wind hypothesis and instead suggest that inherent uncertainties in the underlying semiempirical model probably dominate uncertainties in predicted radio fluxes. However, new ultraviolet line and radio continuum observations are needed to test the new generations of inhomogeneous magnetohydrodynamic wind models.
Assessing climate change-robustness of protected area management plans-The case of Germany.
Geyer, Juliane; Kreft, Stefan; Jeltsch, Florian; Ibisch, Pierre L
2017-01-01
Protected areas are arguably the most important instrument of biodiversity conservation. To keep them fit under climate change, their management needs to be adapted to address related direct and indirect changes. In our study we focus on the adaptation of conservation management planning, evaluating management plans of 60 protected areas throughout Germany with regard to their climate change-robustness. First, climate change-robust conservation management was defined using 11 principles and 44 criteria, which followed an approach similar to sustainability standards. We then evaluated the performance of individual management plans concerning the climate change-robustness framework. We found that climate change-robustness of protected areas hardly exceeded 50 percent of the potential performance, with most plans ranking in the lower quarter. Most Natura 2000 protected areas, established under conservation legislation of the European Union, belong to the sites with especially poor performance, with lower values in smaller areas. In general, the individual principles showed very different rates of accordance with our principles, but similarly low intensity. Principles with generally higher performance values included holistic knowledge management, public accountability and acceptance as well as systemic and strategic coherence. Deficiencies were connected to dealing with the future and uncertainty. Lastly, we recommended the presented principles and criteria as essential guideposts that can be used as a checklist for working towards more climate change-robust planning.
Assessing climate change-robustness of protected area management plans—The case of Germany
Geyer, Juliane; Kreft, Stefan; Jeltsch, Florian; Ibisch, Pierre L.
2017-01-01
Protected areas are arguably the most important instrument of biodiversity conservation. To keep them fit under climate change, their management needs to be adapted to address related direct and indirect changes. In our study we focus on the adaptation of conservation management planning, evaluating management plans of 60 protected areas throughout Germany with regard to their climate change-robustness. First, climate change-robust conservation management was defined using 11 principles and 44 criteria, which followed an approach similar to sustainability standards. We then evaluated the performance of individual management plans concerning the climate change-robustness framework. We found that climate change-robustness of protected areas hardly exceeded 50 percent of the potential performance, with most plans ranking in the lower quarter. Most Natura 2000 protected areas, established under conservation legislation of the European Union, belong to the sites with especially poor performance, with lower values in smaller areas. In general, the individual principles showed very different rates of accordance with our principles, but similarly low intensity. Principles with generally higher performance values included holistic knowledge management, public accountability and acceptance as well as systemic and strategic coherence. Deficiencies were connected to dealing with the future and uncertainty. Lastly, we recommended the presented principles and criteria as essential guideposts that can be used as a checklist for working towards more climate change-robust planning. PMID:28982187
Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances
NASA Astrophysics Data System (ADS)
Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng
2016-04-01
Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.
Nuclear Physics Meets the Sources of the Ultra-High Energy Cosmic Rays.
Boncioli, Denise; Fedynitch, Anatoli; Winter, Walter
2017-07-07
The determination of the injection composition of cosmic ray nuclei within astrophysical sources requires sufficiently accurate descriptions of the source physics and the propagation - apart from controlling astrophysical uncertainties. We therefore study the implications of nuclear data and models for cosmic ray astrophysics, which involves the photo-disintegration of nuclei up to iron in astrophysical environments. We demonstrate that the impact of nuclear model uncertainties is potentially larger in environments with non-thermal radiation fields than in the cosmic microwave background. We also study the impact of nuclear models on the nuclear cascade in a gamma-ray burst radiation field, simulated at a level of complexity comparable to the most precise cosmic ray propagation code. We conclude with an isotope chart describing which information is in principle necessary to describe nuclear interactions in cosmic ray sources and propagation.
Experimental Test of Heisenberg's Measurement Uncertainty Relation Based on Statistical Distances.
Ma, Wenchao; Ma, Zhihao; Wang, Hengyan; Chen, Zhihua; Liu, Ying; Kong, Fei; Li, Zhaokai; Peng, Xinhua; Shi, Mingjun; Shi, Fazhan; Fei, Shao-Ming; Du, Jiangfeng
2016-04-22
Incompatible observables can be approximated by compatible observables in joint measurement or measured sequentially, with constrained accuracy as implied by Heisenberg's original formulation of the uncertainty principle. Recently, Busch, Lahti, and Werner proposed inaccuracy trade-off relations based on statistical distances between probability distributions of measurement outcomes [P. Busch et al., Phys. Rev. Lett. 111, 160405 (2013); P. Busch et al., Phys. Rev. A 89, 012129 (2014)]. Here we reformulate their theoretical framework, derive an improved relation for qubit measurement, and perform an experimental test on a spin system. The relation reveals that the worst-case inaccuracy is tightly bounded from below by the incompatibility of target observables, and is verified by the experiment employing joint measurement in which two compatible observables designed to approximate two incompatible observables on one qubit are measured simultaneously.
A Review of the Battle of Britain in Context of AFM 1-1 Principles of War
1984-04-01
... consists of discussion questions and rationale as an aid for leading a guided discussion. ... the challenge facing England: in June of 1940, Hitler ruled a vast proportion of Europe after easy victories in Poland, Denmark, Norway, and Belgium. On ...
Pareto 80/20 Law: Derivation via Random Partitioning
ERIC Educational Resources Information Center
Lipovetsky, Stan
2009-01-01
The Pareto 80/20 Rule, also known as the Pareto principle or law, states that a small number of causes (20%) is responsible for a large percentage (80%) of the effect. Although widely recognized as a heuristic rule, this proportion has not been theoretically based. The article considers derivation of this 80/20 rule and some other standard…
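A quick empirical illustration of the rule; the Pareto tail index of about 1.16 is the textbook value for which the 80/20 split emerges, and the whole setup is an illustration rather than the article's random-partitioning derivation.

```python
import numpy as np

rng = np.random.default_rng(80)

# Draw "cause" sizes from a classical Pareto distribution and measure the
# share of the total captured by the largest 20% of causes.
alpha, n_parts, n_trials = 1.16, 1000, 200
shares = np.empty(n_trials)
for i in range(n_trials):
    parts = np.sort(1.0 + rng.pareto(alpha, n_parts))   # Pareto with x_min = 1
    shares[i] = parts[-n_parts // 5:].sum() / parts.sum()

print(f"top 20% of causes carry {shares.mean():.1%} of the effect on average")
```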
Sinking in Quicksand: An Applied Approach to the Archimedes Principle
ERIC Educational Resources Information Center
Evans, G. M.; Evans, S. C.; Moreno-Atanasio, R.
2015-01-01
The objective of this paper is to present a laboratory experiment that explains the phenomenon of sinking in quicksand simulated as a fluidized bed. The paper demonstrates experimentally and theoretically that the proportion of a body that sinks in quicksand depends on the volume fraction of solids and the density of the body relative to the…
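The underlying balance can be sketched in a few lines: the body sinks until the weight of displaced suspension equals its own weight, so the submerged fraction is the ratio of body density to suspension density. The densities and solids fraction below are assumed for illustration.

```python
# Archimedes in a fluidized bed: submerged fraction = rho_body / rho_bed.
rho_sand, rho_water = 2650.0, 1000.0      # kg/m^3
phi = 0.45                                # volume fraction of solids (assumed)
rho_bed = phi * rho_sand + (1 - phi) * rho_water

rho_body = 1050.0                         # kg/m^3, roughly a human body
submerged_fraction = rho_body / rho_bed
print(f"bed density {rho_bed:.0f} kg/m^3 -> "
      f"{submerged_fraction:.0%} of the body sinks")
```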
Sensitivities of seismic velocities to temperature, pressure and composition in the lower mantle
NASA Astrophysics Data System (ADS)
Trampert, Jeannot; Vacher, Pierre; Vlaar, Nico
2001-08-01
We calculated temperature, pressure and compositional sensitivities of seismic velocities in the lower mantle using the latest mineral physics data. The compositional variable refers to the volume proportion of perovskite in a simplified perovskite-magnesiowüstite mantle assemblage. The novelty of our approach is the exploration of a reasonable range of input parameters which enter the lower mantle extrapolations. This leads to realistic error bars on the sensitivities. Temperature variations can be inferred throughout the lower mantle with a good degree of precision. Contrary to the uppermost mantle, modest compositional changes in the lower mantle can be detected by seismic tomography, albeit with a larger uncertainty. A likely trade-off between temperature and composition will be largely determined by uncertainties in tomography itself. Given current sources of uncertainties in recent data, anelastic contributions to the temperature sensitivities (calculated using Karato's approach) appear less significant than previously thought. Recent seismological determinations of the ratio of relative S to P velocity heterogeneity can be entirely explained by thermal effects, although isolated spots beneath Africa and the Central Pacific in the lowermost mantle may call for a compositional origin.
NASA Astrophysics Data System (ADS)
Yang, I.-Sheng
2018-03-01
We derive the time scale for two initially pure subsystems to become entangled with each other through an arbitrary Hamiltonian that couples them. The entanglement timescale is inversely proportional to the "correlated uncertainty" between the two subsystems, a quantity which we will define and analyze in this paper. Our result is still applicable when one of the subsystems started in an arbitrarily mixed state, thus it generalizes the well-known "decoherence time scale" while coupled to a thermal state.
Zheng, Yelong; Lu, Hongyu; Yin, Wei; Tao, Dashuai; Shi, Lichun; Tian, Yu
2016-10-07
Forces acting on the legs of water-walking arthropods, whose weights are measured in dynes, are of great interest to entomologists, physicists, and engineers. While their floating mechanism has been recognized, the in vivo forces on the legs of a stationary insect had not previously been measured simultaneously. In this study, their elegant bright-edged leg shadows are used to make the tiny forces visible and measurable based on the updated Archimedes' principle. The force was approximately proportional to the shadow area, with a resolution from nanonewton to piconewton per pixel. The sum of the leg forces agreed well with the body weight measured with an accurate electronic balance, which verified the updated Archimedes' principle at the arthropod level. The slight changes of the vertical position of the centre of body weight and of the body pitch angle have also been revealed for the first time. The visualization of tiny forces by shadows is cost-effective and very sensitive and could be used in many other applications.
Heisenberg's uncertainty principle for simultaneous measurement of positive-operator-valued measures
NASA Astrophysics Data System (ADS)
Miyadera, Takayuki; Imai, Hideki
2008-11-01
A limitation on simultaneous measurement of two arbitrary positive-operator-valued measures is discussed. In general, simultaneous measurement of two noncommutative observables is only approximately possible. Following Werner’s formulation, we introduce a distance between observables to quantify an accuracy of measurement. We derive an inequality that relates the achievable accuracy with noncommutativity between two observables. As a byproduct a necessary condition for two positive-operator-valued measures to be simultaneously measurable is obtained.
2006-09-01
expected advancements in information technology and library science offer the best hope of resolving the above concerns. ... An EWA will be ... information technology and library science must be utilized to accomplish this. Some DOD research investment may be required to resolve DOD-specific ... distributed assessment process that exploits the documentation of all of the CEST issues, advances in information technology and library science, and the ...
Determination of the Unstable States of the Solid State Plasma in Semiconductor Devices
1988-05-01
... of the carrier moving through the lattice potentials, which alter the carrier's response to an external electromagnetic field. ... If the average ... see quantum mechanical effects from the lattice potentials and a spread in carrier momenta due to the Heisenberg uncertainty principle. We can ... us to account for the quantum mechanical source of the plasma. That source is the lattice. At values of the quantum compression parameter near unity ...
Entropy of Vaidya Black Hole on Apparent Horizon with Minimal Length Revisited
NASA Astrophysics Data System (ADS)
Tang, Hao; Wu, Bin; Sun, Cheng-yi; Song, Yu; Yue, Rui-hong
2018-03-01
By considering the generalized uncertainty principle, the degrees of freedom near the apparent horizon of the Vaidya black hole are calculated with the thin film model. The result shows that a cut-off can be introduced naturally rather than being imposed by hand. Furthermore, if the minimal length is chosen to be a specific value, the statistical entropy satisfies the conventional area law at the horizon, which may hint at a deeper significance of the minimal length.
High-performance multi-channel fiber-based absolute distance measuring interferometer system
NASA Astrophysics Data System (ADS)
Deck, Leslie L.
2009-08-01
I describe the principle of operation and the performance of a fiber-based absolute distance measuring interferometer system with 60 independent simultaneous channels. The system was designed for demanding applications requiring passive, electrically immune sensors with an extremely long MTTF. In addition to providing better than 0.3 nm measurement repeatability at 5 kHz for all channels, the system demonstrated an absolute distance uncertainty of less than 5 nm over a 500 µm measurement range.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooperstock, F.I., E-mail: cooperst@uvic.ca; Dupre, M.J., E-mail: mdupre@tulane.edu
We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy–momentum, in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum. Highlights: • We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. • Demand for the general expression to reduce to the Tolman integral for stationary systems supports the Ricci integral as energy–momentum. • Localized energy via the Ricci integral is consistent with the energy localization hypothesis. • New localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. • Suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy–momentum in the strong gravity extreme.
Chloride and bromide sources in water: Quantitative model use and uncertainty
NASA Astrophysics Data System (ADS)
Horner, Kyle N.; Short, Michael A.; McPhail, D. C.
2017-06-01
Dissolved chloride is a commonly used geochemical tracer in hydrological studies. Assumptions underlying many chloride-based tracer methods do not hold where processes such as halide-bearing mineral dissolution, fluid mixing, or diffusion modify dissolved Cl- concentrations. Failure to identify, quantify, or correct for such processes can introduce significant uncertainty into chloride-based tracer calculations. Mass balance and isotopic techniques offer a means to address this uncertainty; however, concurrent evaporation or transpiration can complicate the corrections. In this study, Cl/Br ratios are used to derive equations that correct a solution's total dissolved Cl- and Br- concentrations for inputs from mineral dissolution and/or binary mixing, and we demonstrate the equations' applicability to waters modified by evapotranspiration. The equations can be used to quickly determine the maximum proportion of dissolved Cl- and Br- from each end-member, provided no halide-bearing minerals have precipitated and the Cl/Br ratio of each end-member is known. This allows rapid evaluation of halite dissolution or binary mixing contributions to total dissolved Cl- and Br-. Equation sensitivity to heterogeneity and analytical uncertainty is demonstrated through bench-top experiments simulating halite dissolution and variable degrees of evapotranspiration, as commonly occur in arid environments. The predictions agree with the experimental results to within 6%, and typically much better, with the sensitivity of the predicted results varying as a function of end-member compositions and analytical uncertainty. Finally, we present a case study illustrating how the equations can be used to quantify Cl- and Br- sources and sinks in surface water and groundwater, and how they can be applied to constrain uncertainty in chloride-based tracer calculations.
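The paper's specific correction equations are not reproduced in the abstract, but the underlying two-end-member mass balance is straightforward. A minimal sketch, assuming two end members with known Cl/Br mass ratios (the ratio and concentration values below are illustrative, not from the study):

```python
import numpy as np

def endmember_split(cl_total, br_total, r1, r2):
    """
    Split measured dissolved Cl and Br between two end members with known
    Cl/Br mass ratios r1 and r2 (e.g. halite dissolution vs. rainfall recharge).
    Solves: cl_total = r1*br1 + r2*br2  and  br_total = br1 + br2.
    Returns (cl_from_1, cl_from_2, br_from_1, br_from_2).
    """
    A = np.array([[r1, r2], [1.0, 1.0]])
    br1, br2 = np.linalg.solve(A, [cl_total, br_total])
    return r1 * br1, r2 * br2, br1, br2

# Hypothetical numbers: halite has a very high Cl/Br mass ratio (~1e4),
# seawater-derived recharge ~290. Evapo(transpi)ration multiplies cl_total
# and br_total by the same factor, so the solved *proportions* are unchanged,
# consistent with the abstract's claim of applicability to such waters.
print(endmember_split(cl_total=500.0, br_total=0.9, r1=1e4, r2=290.0))
```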
Using analogues to quantify geological uncertainty in stochastic reserve modelling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wells, B.; Brown, I.
1995-08-01
The petroleum industry seeks to minimize exploration risk by employing the best possible expertise, methods, and tools. Is it possible to quantify the success of this process of risk reduction? Owing to the inherent uncertainty in predicting geological reality, and to changing environments for hydrocarbon exploration, it is not enough simply to record the proportion of successful wells drilled; in various parts of the world it has been noted that pseudo-random drilling would apparently have been as successful as the actual drilling programme. How, then, should we judge the success of risk reduction? For many years the E&P industry has routinely used Monte Carlo modelling to generate a probability distribution for prospect reserves. One aspect of Monte Carlo modelling which has received insufficient attention, but which is essential for quantifying risk reduction, is the consistency and repeatability with which predictions can be made. Reducing the subjective element inherent in the specification of geological uncertainty allows better quantification of uncertainty in the prediction of reserves, in both exploration and appraisal. Building on work reported at the AAPG annual conventions in 1994 and 1995, the present paper incorporates analogue information into uncertainty modelling. Analogues provide a major step forward in the quantification of risk, but their significance is potentially greater still. The two principal contributors to uncertainty in field and prospect analysis are the hydrocarbon life-cycle and the geometry of the trap. These are usually treated separately; combining them into a single model is a major contribution to the reduction of risk. This work is based in part on a joint project with Oryx Energy UK Ltd., and thanks are due in particular to Richard Benmore and Mike Cooper.
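For readers unfamiliar with the technique, here is a minimal volumetric Monte Carlo sketch of the kind the abstract refers to; the distributions and parameter values are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical input distributions for one prospect (m^3 rock, fractions)
grv      = rng.lognormal(mean=np.log(5e8), sigma=0.4, size=N)  # gross rock volume
ntg      = rng.triangular(0.4, 0.6, 0.8, size=N)               # net-to-gross
porosity = rng.triangular(0.10, 0.18, 0.25, size=N)
sat_hc   = rng.triangular(0.5, 0.7, 0.85, size=N)              # hydrocarbon saturation
recovery = rng.triangular(0.15, 0.30, 0.45, size=N)            # recovery factor

reserves = grv * ntg * porosity * sat_hc * recovery            # recoverable volume

# The usual P90/P50/P10 summary (P90 = 90% chance of exceeding)
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90={p90:.3g}  P50={p50:.3g}  P10={p10:.3g} m^3")
```

Constraining the input distributions with analogue data, as the paper advocates, is what makes repeated runs of such a model consistent and repeatable across assessors.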
Recruiting Minority Men Who Have Sex With Men for HIV Research: Results From a 4-City Campaign
Silvestre, Anthony J.; Hylton, John B.; Johnson, Lisette M.; Houston, Carmoncelia; Witt, Mallory; Jacobson, Lisa; Ostrow, David
2006-01-01
We describe the efforts of a 4-city campaign to recruit Black and Hispanic men who have sex with men into an established HIV epidemiological study. The campaign used community organizing principles and a social marketing model that focused on personnel, location, product, costs and benefits, and promotion. The campaign was developed at the community, group, and individual levels to both increase trust and reduce barriers. The proportion of Hispanic men recruited during the 2002–2003 campaign doubled compared with the 1987 campaign, and the proportion and number of White men decreased by 20%. The proportion of Black men decreased because of the large increase in Hispanic men, although the number of Black men increased by 56%. Successful recruitment included training recruitment specialists, involving knowledgeable minority community members during planning, and having an accessible site with convenient hours. PMID:16670218
NASA Technical Reports Server (NTRS)
Strahler, Alan H.; Li, Xiao-Wen; Jupp, David L. B.
1991-01-01
The bidirectional radiance or reflectance of a forest or woodland can be modeled using principles of geometric optics and Boolean models for random sets in three-dimensional space. The model may be defined at two levels. At the tree level, the scene includes four components: sunlit and shadowed canopy, and sunlit and shadowed background. The reflectance of the scene is modeled as the sum of the reflectances of the individual components, weighted by their areal proportions in the field of view. At the leaf level, the canopy envelope is an assemblage of leaves, and thus the reflectance is a function of the areal proportions of sunlit and shadowed leaf, and sunlit and shadowed background. Because the proportions of the scene components depend on the directions of irradiance and exitance, the model accounts for the hotspot that is well known in leaf and tree canopies.
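The area-weighted mixing at the heart of the model takes only a few lines; a minimal sketch with placeholder component reflectances and proportions (the geometric-optics machinery that actually predicts the proportions from sun and view angles is omitted):

```python
def scene_reflectance(proportions, reflectances):
    """Area-weighted reflectance of the four geometric-optics components:
    sunlit canopy, shadowed canopy, sunlit background, shadowed background."""
    assert abs(sum(proportions) - 1.0) < 1e-9, "areal proportions must sum to 1"
    return sum(k * r for k, r in zip(proportions, reflectances))

# Hypothetical red-band values; the proportions depend on sun/view geometry,
# which is where the hotspot enters (shadowed fractions -> 0 at the hotspot).
k = [0.35, 0.20, 0.30, 0.15]        # areal proportions in the field of view
r = [0.08, 0.03, 0.25, 0.10]        # component reflectances
print(scene_reflectance(k, r))
```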
Missing observations in multiyear rotation sampling designs
NASA Technical Reports Server (NTRS)
Gbur, E. E.; Sielken, R. L., Jr. (Principal Investigator)
1982-01-01
Because multiyear estimation of at-harvest stratum crop proportions is more efficient than single-year estimation, the behavior of multiyear estimators in the presence of missing acquisitions was studied. Only the worst case, in which a segment proportion cannot be estimated for an entire year, is considered. The effect of these missing segments on the variance of the at-harvest stratum crop proportion estimator is considered both when missing segments are not replaced and when they are replaced by segments not sampled in previous years. The principal recommendations are to replace missing segments according to some specified strategy, and to use a sequential procedure for selecting a sampling design; i.e., choose an optimal two-year design and then, based on the observed two-year design after segment losses have been taken into account, choose the best possible three-year design having the observed two-year parent design.
Psychotherapy for neurologists.
Hobday, Gabrielle S; Gabbard, Glen O
2009-07-01
Psychotherapy has traditionally been regarded as the purview of psychiatry rather than neurology. Yet the doctor-patient relationship is fundamental to both specialties, and the principles that derive from psychotherapy theory and practice apply to that relationship regardless of specialty. It is common knowledge that a large proportion of patients seen in the practice of medicine have some kind of emotional disturbance. Moreover, patients with organic disease may also have significant emotional difficulties that complicate both the primary illness and its treatment. This experience has inevitably drawn attention to the need for the nonpsychiatric physician to have an understanding of, and proficiency in, psychiatric diagnosis and psychotherapeutic principles. In this article, we consider basic psychotherapeutic principles that are useful in the everyday practice of neurologists and other nonpsychiatric physicians. These skills are important not only for practical reasons, but also because responsiveness to patients' emotional distress is essential to maintaining empathy and caring as cornerstones of the art of medicine. With the use of clinical examples to illustrate these principles, we hope that readers can apply them to their own clinical experiences.
Prediction and Validation of Mars Pathfinder Hypersonic Aerodynamic Data Base
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Braun, Robert D.; Weilmuenster, K. James; Mitcheltree, Robert A.; Engelund, Walter C.; Powell, Richard W.
1998-01-01
A postflight analysis of the Mars Pathfinder hypersonic, continuum aerodynamic database is presented. Measured data include accelerations along the body-axis and axis-normal directions. Comparisons of preflight simulation and measurements show good agreement. The prediction of two static instabilities associated with movement of the sonic line from the shoulder to the nose and back was confirmed by the measured normal accelerations. Reconstruction of atmospheric density during entry has an uncertainty directly proportional to the uncertainty in the predicted axial coefficient. The sensitivity of the moment coefficient to freestream density, kinetic models, and center-of-gravity location is examined to provide additional consistency checks of the simulation against flight data. The atmospheric density as derived from the axial coefficient and measured axial accelerations falls within the range required for the sonic-line shift and static-stability transition as independently determined from normal accelerations.
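The proportionality between density uncertainty and axial-coefficient uncertainty follows directly from the drag relation rho = 2 m a_x / (C_A S V^2). A minimal sketch with illustrative entry-like numbers (assumed for the example, not the actual flight data):

```python
def freestream_density(m, a_x, C_A, S, V):
    """Atmospheric density from measured axial deceleration:
    rho = 2 m a_x / (C_A S V^2), so d(rho)/rho = -d(C_A)/C_A."""
    return 2.0 * m * a_x / (C_A * S * V**2)

# Illustrative numbers for a Pathfinder-like entry capsule
m   = 585.0       # entry mass, kg
S   = 5.5         # aeroshell reference area, m^2 (~2.65 m diameter)
V   = 7000.0      # relative velocity, m/s
C_A = 1.63        # predicted axial-force coefficient
a_x = 120.0       # measured axial deceleration, m/s^2

rho = freestream_density(m, a_x, C_A, S, V)
# a 5% error in C_A maps one-to-one into a 5% error in rho
print(rho, freestream_density(m, a_x, 1.05 * C_A, S, V) / rho)
```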
Sterba, Sonya K; Rights, Jason D
2016-01-01
Item parceling remains widely used under conditions that can lead to parcel-allocation variability in results. Hence, researchers may be interested in quantifying and accounting for parcel-allocation variability within sample. To do so in practice, three key issues need to be addressed. First, how can we combine sources of uncertainty arising from sampling variability and parcel-allocation variability when drawing inferences about parameters in structural equation models? Second, on what basis can we choose the number of repeated item-to-parcel allocations within sample? Third, how can we diagnose and report proportions of total variability per estimate arising due to parcel-allocation variability versus sampling variability? This article addresses these three methodological issues. Developments are illustrated using simulated and empirical examples, and software for implementing them is provided.
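A minimal sketch of the kind of diagnostic the abstract describes: refit the model under R random item-to-parcel allocations, then split the total variability per estimate into a between-allocation part and a sampling part using a combining rule analogous to Rubin's rules for multiple imputation (the article's exact estimators may differ):

```python
import numpy as np

def pool_across_allocations(estimates, variances):
    """
    Combine one model estimate per random item-to-parcel allocation.
    estimates: point estimate from each allocation (length R)
    variances: squared standard error from each allocation (length R)
    Returns the pooled estimate, total variance, and the proportion of
    total variability attributable to parcel allocation (vs. sampling).
    """
    estimates, variances = np.asarray(estimates), np.asarray(variances)
    R = len(estimates)
    within  = variances.mean()            # sampling variability
    between = estimates.var(ddof=1)       # parcel-allocation variability
    total   = within + (1 + 1 / R) * between
    return estimates.mean(), total, (1 + 1 / R) * between / total

# Toy numbers: a factor loading refit under R = 20 random allocations
rng = np.random.default_rng(1)
est = 0.62 + 0.03 * rng.standard_normal(20)
se2 = np.full(20, 0.04**2)
print(pool_across_allocations(est, se2))
```

Monitoring how the pooled quantities stabilize as R grows is one pragmatic basis for choosing the number of repeated allocations.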
Top quark mass determination from the energy peaks of b-jets and B-hadrons at NLO QCD
Agashe, Kaustubh; Franceschini, Roberto; Kim, Doojin; ...
2016-11-21
Here, we analyze the energy spectra of single b-jets and B-hadrons resulting from the production and decay of top quarks within the SM at the LHC at NLO QCD. For both hadrons and jets, we calculate the correlation of the peak of the spectrum with the top quark mass, considering the "energy peak" as an observable with which to determine the top quark mass. Such a method is motivated by our previous work, where we argued that this approach can have reduced sensitivity to the details of the production mechanism of the top quark, whether it concerns higher-order QCD effects or new physics contributions. For a 1% jet energy scale uncertainty, the top quark mass can be extracted using the energy peak of b-jets with an error of ±(1.2(exp) + 0.6(th)) GeV. In view of the dominant jet energy scale uncertainty in the measurement using b-jets, we also investigate the extraction of the top quark mass from the energy peak of the corresponding B-hadrons, which, in principle, can be measured without this uncertainty. The calculation of the B-hadron energy spectrum is carried out using fragmentation functions at NLO. The dependence on the fragmentation scale turns out to be the largest theoretical uncertainty in this extraction of the top quark mass.
NASA Astrophysics Data System (ADS)
Donner, S. D.; Webber, S.
2011-12-01
Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of the U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world in responding to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remain highly idealized. For example, the efficacy of "no regrets" adaptation efforts or of "mainstreaming" adaptation into decision-making is rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social, and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be undertaken carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of maladaptation.
Quantum scattering in one-dimensional systems satisfying the minimal length uncertainty relation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernardo, Reginald Christian S., E-mail: rcbernardo@nip.upd.edu.ph; Esguerra, Jose Perico H., E-mail: jesguerra@nip.upd.edu.ph
In quantum gravity theories, when the scattering energy is comparable to the Planck energy the Heisenberg uncertainty principle breaks down and is replaced by the minimal length uncertainty relation. In this paper, the consequences of the minimal length uncertainty relation for one-dimensional quantum scattering are studied using an approach involving a recently proposed second-order differential equation. An exact analytical expression for the tunneling probability through a locally-periodic rectangular potential barrier system is obtained. Results show that the existence of a non-zero minimal length uncertainty tends to shift the resonant tunneling energies in the positive direction. Scattering through a locally-periodic potential composed of double-rectangular potential barriers shows that the first band of resonant tunneling energies widens for minimal length cases when the double-rectangular potential barrier is symmetric but narrows when it is asymmetric. A numerical solution which exploits the use of Wronskians is used to calculate the transmission probabilities through the Pöschl–Teller well, the Gaussian barrier, and the double-Gaussian barrier. Results show that the probability of passage through the Pöschl–Teller well and the Gaussian barrier is smaller in the minimal length cases than in the non-minimal length case. For the double-Gaussian barrier, the probability of passage at energies above the resonant tunneling energy is larger in the minimal length cases than in the non-minimal length case. The approach is exact and applicable to many types of scattering potential.
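As a baseline for comparison, the standard (zero minimal length) transmission through a single rectangular barrier has a closed form; the sketch below implements it in units hbar = 2m = 1. This is the quantity the paper modifies, with minimal-length corrections shifting the resonances upward:

```python
import numpy as np

def transmission(E, V0, a):
    """
    Transmission probability through a single rectangular barrier of
    height V0 and width a (units hbar = 2m = 1), standard QM result:
        T = [1 + V0^2 sinh^2(kappa a) / (4 E (V0 - E))]^(-1)   for E < V0,
    with sinh -> sin and (V0 - E) -> (E - V0) above the barrier.
    """
    E = np.asarray(E, dtype=float)
    kappa = np.sqrt(np.abs(V0 - E))
    s = np.where(E < V0, np.sinh(kappa * a), np.sin(kappa * a))
    return 1.0 / (1.0 + V0**2 * s**2 / (4.0 * E * np.abs(V0 - E)))

# Above-barrier resonances (T = 1) occur where sin(kappa a) = 0;
# per the paper, a minimal length shifts these resonant energies upward.
E = np.linspace(0.1, 30, 500)
T = transmission(E, V0=10.0, a=1.0)
print(E[np.argmax(T)], T.max())   # first resonance near E = V0 + pi^2
```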
NASA Astrophysics Data System (ADS)
Li, Chuan-Yao; Huang, Hai-Jun; Tang, Tie-Qiao
2017-03-01
This paper investigates traffic flow dynamics under the social optimum (SO) principle in a single-entry traffic corridor with staggered shifts, from both analytical and numerical perspectives. The LWR (Lighthill-Whitham-Richards) model and the Greenshields velocity-density function are utilized to describe the dynamic properties of traffic flow. The closed-form SO solution is derived analytically, and some numerical examples are used to further verify the analytical solution. The optimum proportion of the numbers of commuters with different desired arrival times is further discussed, where the analytical and numerical results both indicate that the cumulative outflow curve under the SO principle is piecewise smooth.
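For reference, the two model ingredients named in the abstract take only a few lines; the parameter values below are illustrative assumptions, not from the paper:

```python
import numpy as np

V_FREE = 25.0    # free-flow speed, m/s (assumed)
RHO_JAM = 0.15   # jam density, veh/m (assumed)

def speed(rho):
    """Greenshields relation: v falls linearly from v_f to 0 at jam density."""
    return V_FREE * (1.0 - rho / RHO_JAM)

def flux(rho):
    """LWR flux q = rho * v(rho); maximal at rho = rho_jam / 2."""
    return rho * speed(rho)

rho = np.linspace(0, RHO_JAM, 7)
print(np.c_[rho, speed(rho), flux(rho)])
print("capacity:", flux(RHO_JAM / 2), "veh/s at rho =", RHO_JAM / 2)
```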
Karbowski, Jan
2015-01-01
The structure and quantitative composition of the cerebral cortex are interrelated with its computational capacity. Empirical data analyzed here indicate a certain hierarchy in local cortical composition. Specifically, neural wire, i.e., axons and dendrites, each take about 1/3 of cortical space, spines and glia/astrocytes each occupy about (1/3)², and capillaries around (1/3)⁴. Moreover, data analysis across species reveals that these fractions are roughly independent of brain size, which suggests that they could be in some sense optimal and thus important for brain function. Is there any principle that sets them in this invariant way? This study first builds a model of a local circuit in which neural wire, spines, astrocytes, and capillaries are mutually coupled elements treated within a single mathematical framework. Next, various forms of the wire minimization rule (wire length, surface area, volume, or conduction delays) are analyzed, of which only minimization of wire volume provides realistic results that are very close to the empirical cortical fractions. As an alternative, a new principle called "spine economy maximization" is proposed and investigated; it is associated with maximization of the spine proportion in the cortex per spine size, and yields equally good but more robust results. Additionally, a combination of the wire cost and spine economy notions is considered as a meta-principle, and it is found that this proposition gives only marginally better results than either pure wire volume minimization or pure spine economy maximization, and only if the spine economy component dominates. However, such a combined meta-principle yields much better results than constraints related solely to minimization of wire length, wire surface area, or conduction delays. Interestingly, the type of spine size distribution also plays a role, and better agreement with the data is achieved for distributions with long tails. In sum, these results suggest that for the efficiency of local circuits wire volume may be a more primary variable than wire length or temporal delays, and moreover, the new spine economy principle may be important for brain evolutionary design in a broader context. PMID:26436731
Comment on "Inference with minimal Gibbs free energy in information field theory".
Iatsenko, D; Stefanovska, A; McClintock, P V E
2012-03-01
Enßlin and Weig [Phys. Rev. E 82, 051112 (2010)] have introduced a "minimum Gibbs free energy" (MGFE) approach for estimation of the mean signal and signal uncertainty in Bayesian inference problems: it aims to combine the maximum a posteriori (MAP) and maximum entropy (ME) principles. We point out, however, that there are some important questions to be clarified before the new approach can be considered fully justified, and therefore able to be used with confidence. In particular, after obtaining a Gaussian approximation to the posterior in terms of the MGFE at some temperature T, this approximation should always be raised to the power of T to yield a reliable estimate. In addition, we show explicitly that MGFE indeed incorporates the MAP principle, as well as the MDI (minimum discrimination information) approach, but not the well-known ME principle of Jaynes [E.T. Jaynes, Phys. Rev. 106, 620 (1957)]. We also illuminate some related issues and resolve apparent discrepancies. Finally, we investigate the performance of MGFE estimation for different values of T, and we discuss the advantages and shortcomings of the approach.
NASA Astrophysics Data System (ADS)
Aurora, Tarlok
2013-04-01
In introductory physics, students verify Archimedes' principle by immersing an object in water in a container with a side spout to collect the displaced water, which results in a large uncertainty due to surface tension. A modified procedure was introduced in which a plastic bucket is suspended from a force sensor and an object hangs underneath the bucket. The object is immersed in water in a glass beaker (without any side spout), and the weight loss is measured with a computer-controlled force sensor. Instead of collecting the water displaced by the object, tap water was added to the bucket to compensate for the weight loss, and Archimedes' principle was verified to within less than a percent. With this apparatus, the buoyant force was easily studied as a function of the volume of displaced water, as well as a function of the density of a saline solution. By graphing the buoyant force as a function of volume (or of liquid density), the value of g was obtained from the slope. The apparatus and sources of error will be discussed.
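The slope-to-g step is a one-line calculation: since the buoyant force is F = rho g V, a least-squares slope of F versus V equals rho g. A minimal sketch with invented sensor readings:

```python
import numpy as np

# Invented readings: displaced volume (m^3) vs. measured buoyant force (N)
V = np.array([10, 20, 30, 40, 50]) * 1e-6          # 10-50 mL
F = np.array([0.097, 0.197, 0.293, 0.392, 0.491])  # from the force sensor

rho_water = 998.0                      # kg/m^3 at room temperature
slope, intercept = np.polyfit(V, F, 1) # F = (rho * g) * V + offset
g = slope / rho_water
print(f"g = {g:.2f} m/s^2")            # ~9.8 with these numbers
```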
Pediatric disaster response in developed countries: ten guiding principles.
Brandenburg, Mark A; Arneson, Wendy L
2007-01-01
Mass casualty incidents and large-scale disasters involving children are likely to overwhelm a regional disaster response system. Children have unique vulnerabilities that require special considerations when developing pediatric response systems. Although medical and trauma strategies exist for the evaluation and treatment of children on a daily basis, the application of these strategies under conditions of resource-constrained triage and treatment has rarely been evaluated. A recent report by the Institute of Medicine, however, concluded that on a day-to-day basis the U.S. healthcare system does not adequately provide emergency medical services for children. The variability, scale, and uncertainty of disasters call for a set of guiding principles rather than rigid protocols when developing pediatric response plans. The authors propose the following guiding principles for addressing the well-recognized, unique vulnerabilities of children: (1) terrorism prevention and preparedness, (2) all-hazards preparedness, (3) postdisaster disease and injury prevention, (4) nutrition and hydration, (5) equipment and supplies, (6) pharmacology, (7) mental health, (8) identification and reunification of displaced children, (9) day care and school, and (10) perinatology. It is hoped that the 10 guiding principles discussed in this article will serve as a basic framework for developing pediatric response plans and teams in developed countries.
Quantum Cryptography II: How to re-use a one-time pad safely even if P=NP.
Bennett, Charles H; Brassard, Gilles; Breidbart, Seth
2014-01-01
When elementary quantum systems, such as polarized photons, are used to transmit digital information, the uncertainty principle gives rise to novel cryptographic phenomena unachievable with traditional transmission media, e.g. a communications channel on which it is impossible in principle to eavesdrop without a high probability of being detected. With such a channel, a one-time pad can safely be reused many times as long as no eavesdropping is detected, and, planning ahead, part of the capacity of these uncompromised transmissions can be used to send fresh random bits with which to replace the one-time pad when eavesdropping finally is detected. Unlike other schemes for stretching a one-time pad, this scheme does not depend on complexity-theoretic assumptions such as the difficulty of factoring.
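The detection mechanism can be illustrated with a classical simulation of the photon statistics. A minimal sketch of a BB84-style exchange (not the paper's full pad-reuse protocol): an intercept-resend eavesdropper disturbs about 25% of the sifted bits that the parties later compare, so eavesdropping is revealed with high probability.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

alice_bits  = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)          # 0 = rectilinear, 1 = diagonal

# Intercept-resend eavesdropper: measures in a random basis, resends.
# Wrong-basis measurement yields a random bit (simulated classically).
eve_bases = rng.integers(0, 2, n)
eve_bits = np.where(eve_bases == alice_bases, alice_bits, rng.integers(0, 2, n))

bob_bases = rng.integers(0, 2, n)
bob_bits = np.where(bob_bases == eve_bases, eve_bits, rng.integers(0, 2, n))

# Sift: keep positions where Alice's and Bob's bases agree, then compare
# a sample; a disturbance near 25% reveals the eavesdropper.
sift = alice_bases == bob_bases
errors = (alice_bits != bob_bits)[sift]
print(f"sifted bits: {sift.sum()}, error rate: {errors.mean():.3f}")  # ~0.25
```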
NASA Astrophysics Data System (ADS)
Goldhaber, Alfred; Requist, Ryan
2003-07-01
As a consequence of the Aharonov-Bohm effect, there is a quantum-induced attraction between a charged particle and a rigid, impenetrable hoop made from an arbitrarily thin tube containing a superconducting quantum of magnetic flux. This is remarkable because in classical physics there is no force between the two objects, and quantum-mechanical effects (associated with uncertainty-principle energy) are generally repulsive rather than attractive. For an incident spinless charged particle in a P wave (in a configuration with total angular momentum zero), we verify a resonance just above threshold using the Kohn variational principle in its S-matrix form. Even if optimistic choices of the parameters describing a model system with these properties were feasible, the temperature required to observe the resonance would be far lower than has yet been attained in the laboratory.
Innovative surgery and the precautionary principle.
Meyerson, Denise
2013-12-01
Surgical innovation involves practices, such as new devices, technologies, procedures, or applications, which are novel and untested. Although innovative practices are believed to offer an improvement on the standard surgical approach, they may prove to be inefficacious or even dangerous. This article considers how surgeons considering innovation should reason in the conditions of uncertainty that characterize innovative surgery. What attitude to the unknown risks of innovative surgery should they take? The answer to this question involves value judgments about the acceptability of risk taking when satisfactory scientific information is not available. This question has been confronted in legal contexts, where risk aversion in the form of the precautionary principle has become increasingly influential as a regulatory response to innovative technologies that pose uncertain future hazards. This article considers whether it is appropriate to apply a precautionary approach when making decisions about innovative surgery.
ERIC Educational Resources Information Center
Brennan, Jewel E.
Alcoholism is a problem of immense proportions. Views about alcoholism range from consideration of the problem as a moral weakness to the disease concept approach. Since the effects of alcoholic intake can be benevolent as well as toxic, the dilemma centers around alcohol usage. Various theories have been formulated, experimented with, and…
Code of Federal Regulations, 2012 CFR
2012-01-01
... plans. The Department of Health and Human Services, in consultation with the other Federal agencies ... equity in the asset will be refunded in the same proportion as Federal participation in its cost. In case ... Salaries and other expenses of the State legislature or similar local governmental bodies are unallowable ...