Sample records for gap uncertainty principles

  1. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
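
    The scaling stated above can be summarized compactly. A schematic form of the derived limit (symbols assumed here: sigma_v the velocity uncertainty, v the measured velocity, P_s the scattered light power):

```latex
\sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}}
```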

  2. Gamma-Ray Telescope and Uncertainty Principle

    ERIC Educational Resources Information Center

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  3. Extrapolation, uncertainty factors, and the precautionary principle.

    PubMed

    Steel, Daniel

    2011-09-01

    This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. What is the uncertainty principle of non-relativistic quantum mechanics?

    NASA Astrophysics Data System (ADS)

    Riggs, Peter J.

    2018-05-01

    After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
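
    For reference, the textbook Robertson relation mentioned here reads, for observables A and B with standard deviations sigma_A and sigma_B in a state psi:

```latex
\sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat A,\hat B] \rangle_\psi \bigr|
```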

  5. Disturbance, the uncertainty principle and quantum optics

    NASA Technical Reports Server (NTRS)

    Martens, Hans; Demuynck, Willem M.

    1993-01-01

    It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.

  6. Self-completeness and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero

    2014-03-01

    The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short-scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.
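
    A commonly used form of the generalized uncertainty principle referred to in this and several following records (with beta a dimensionless deformation parameter and l_P the Planck length) is:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left[\,1 + \beta\,\ell_P^2\,\frac{(\Delta p)^2}{\hbar^2}\right],
\qquad \Delta x_{\min} = \sqrt{\beta}\,\ell_P .
```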

  7. Self-completeness and the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero

    2013-11-01

    The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.

  8. Managing Uncertainty in Water Infrastructure Design Using Info-gap Robustness

    NASA Astrophysics Data System (ADS)

    Irias, X.; Cicala, D.

    2013-12-01

    Info-gap theory, a tool for managing deep uncertainty, can be of tremendous value for design of water systems in areas of high seismic risk. Maintaining reliable water service in those areas is subject to significant uncertainties including uncertainty of seismic loading, unknown seismic performance of infrastructure, uncertain costs of innovative seismic-resistant construction, unknown costs to repair seismic damage, unknown societal impacts from downtime, and more. Practically every major earthquake that strikes a population center reveals additional knowledge gaps. In situations of such deep uncertainty, info-gap can offer advantages over traditional approaches, whether deterministic approaches that use empirical safety factors to address the uncertainties involved, or probabilistic methods that attempt to characterize various stochastic properties and target a compromise between cost and reliability. The reason is that in situations of deep uncertainty, it may not be clear what safety factor would be reasonable, or even if any safety factor is sufficient to address the uncertainties, and we may lack data to characterize the situation probabilistically. Info-gap is a tool that recognizes up front that our best projection of the future may be wrong. Thus, rather than seeking a solution that is optimal for that projection, info-gap seeks a solution that works reasonably well for all plausible conditions. In other words, info-gap seeks solutions that are robust in the face of uncertainty. Info-gap has been used successfully across a wide range of disciplines including climate change science, project management, and structural design. EBMUD is currently using info-gap to help it gain insight into possible solutions for providing reliable water service to an island community within its service area. The island, containing about 75,000 customers, is particularly vulnerable to water supply disruption from earthquakes, since it has negligible water storage and is
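
    The info-gap idea of trading optimality for robustness can be sketched in a toy model. Everything below is hypothetical (the fractional-error uncertainty model, the normalized capacity and demand values, the function name); it is not EBMUD's actual analysis:

```python
# Toy info-gap robustness sketch: the robustness of a design is the largest
# fractional deviation alpha of the true seismic demand from its best estimate
# that the design still tolerates, i.e. max{alpha : capacity >= demand*(1+alpha)}.
def robustness(capacity: float, demand_estimate: float) -> float:
    return capacity / demand_estimate - 1.0

# Two hypothetical designs against a normalized demand estimate of 1.0.
# Design B costs more but keeps working under a much larger surprise.
alpha_a = robustness(capacity=1.2, demand_estimate=1.0)  # tolerates +20% demand surprise
alpha_b = robustness(capacity=1.5, demand_estimate=1.0)  # tolerates +50% demand surprise
```

    An info-gap analysis would plot such robustness curves against cost and prefer the design whose performance degrades gracefully as the uncertainty horizon grows, rather than the one optimal for the best-estimate future.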

  9. Gap Size Uncertainty Quantification in Advanced Gas Reactor TRISO Fuel Irradiation Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Binh T.; Einerson, Jeffrey J.; Hawkes, Grant L.

    The Advanced Gas Reactor (AGR)-3/4 experiment is the combination of the third and fourth tests conducted within the tristructural isotropic fuel development and qualification research program. The AGR-3/4 test consists of twelve independent capsules containing a fuel stack in the center surrounded by three graphite cylinders and shrouded by a stainless steel shell. This capsule design enables temperature control of both the fuel and the graphite rings by varying the neon/helium gas mixture flowing through the four resulting gaps. Knowledge of fuel and graphite temperatures is crucial for establishing the functional relationship between fission product release and irradiation thermal conditions. These temperatures are predicted for each capsule using the commercial finite-element heat transfer code ABAQUS. Uncertainty quantification reveals that the gap size uncertainties are among the dominant factors contributing to predicted temperature uncertainty due to high input sensitivity and uncertainty. Gap size uncertainty originates from the fact that all gap sizes vary with time due to dimensional changes of the fuel compacts and three graphite rings caused by extended exposure to high temperatures and fast neutron irradiation. Gap sizes are estimated using as-fabricated dimensional measurements at the start of irradiation and post-irradiation examination dimensional measurements at the end of irradiation. Uncertainties in these measurements provide a basis for quantifying gap size uncertainty. However, lack of gap size measurements during irradiation and lack of knowledge about the dimension change rates lead to gap size modeling assumptions, which could increase gap size uncertainty. In addition, the dimensional measurements are performed at room temperature, and must be corrected to account for thermal expansion of the materials at high irradiation temperatures. Uncertainty in the thermal expansion coefficients for the graphite materials used in the AGR-3
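
    The room-temperature-to-operating-temperature correction described above can be illustrated with a minimal sketch. The dimensions and linear expansion coefficients below are assumed round numbers for illustration, not AGR-3/4 data:

```python
# Correct a room-temperature gap measurement between two concentric parts
# for linear thermal expansion at the irradiation temperature.
def hot_gap(r_outer: float, r_inner: float,
            alpha_outer: float, alpha_inner: float, dT: float) -> float:
    """Gap (r_outer - r_inner) after both parts heat up by dT kelvin,
    given room-temperature radii and linear expansion coefficients."""
    return r_outer * (1.0 + alpha_outer * dT) - r_inner * (1.0 + alpha_inner * dT)

# Hypothetical example: a graphite ring (low expansion) inside a steel shell
# (higher expansion), heated by ~1000 K; radii in mm.
gap_cold = 25.40 - 25.00
gap_hot = hot_gap(25.40, 25.00, alpha_outer=17e-6, alpha_inner=4e-6, dT=1000.0)
# Here the outer shell expands more than the inner ring, so this particular
# gas gap widens at temperature; with the materials swapped it would shrink.
```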

  10. An uncertainty principle for unimodular quantum groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crann, Jason (Université Lille 1 - Sciences et Technologies, UFR de Mathématiques, Laboratoire de Mathématiques Paul Painlevé - UMR CNRS 8524, 59655 Villeneuve d'Ascq Cédex); Kalantar, Mehrdad

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
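
    For orientation, Hirschman's entropic uncertainty principle on the real line, with the sharp constant later supplied by Beckner, can be stated as follows (normalization assumed: the unitary Fourier transform with kernel e^{-2 pi i x xi}, and differential entropy h(rho) = -int rho log rho):

```latex
h\bigl(|f|^2\bigr) + h\bigl(|\hat f|^2\bigr) \;\ge\; \log\frac{e}{2},
\qquad \|f\|_2 = 1 .
```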

  11. The Uncertainty Principle in the Presence of Quantum Memory

    NASA Astrophysics Data System (ADS)

    Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato

    2010-03-01

    One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
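
    The relation described here is commonly written as follows (X and Z the two measurements on system A, B the quantum memory, H the conditional von Neumann entropy, and c the maximal overlap between the measurement eigenvectors):

```latex
H(X|B) + H(Z|B) \;\ge\; \log_2\frac{1}{c} + H(A|B),
\qquad c = \max_{x,z}\,\bigl|\langle \psi_x | \varphi_z \rangle\bigr|^2 .
```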

  12. Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?

    ERIC Educational Resources Information Center

    Robertson, Bill

    2016-01-01

    Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…

  13. Constraining the generalized uncertainty principle with the atomic weak-equivalence-principle test

    NASA Astrophysics Data System (ADS)

    Gao, Dongfeng; Wang, Jin; Zhan, Mingsheng

    2017-04-01

    Various models of quantum gravity imply Planck-scale modifications of Heisenberg's uncertainty principle into a so-called generalized uncertainty principle (GUP). The GUP effects on high-energy physics, cosmology, and astrophysics have been extensively studied. Here, we focus on the weak-equivalence-principle (WEP) violation induced by the GUP. Results from the WEP test with the 85Rb-87Rb dual-species atom interferometer are used to set upper bounds on parameters in two GUP proposals. A 10^45-level bound on the Kempf-Mangano-Mann proposal and a 10^27-level bound on Maggiore's proposal, which are consistent with bounds from other experiments, are obtained. All these bounds leave huge room for improvement in the future.

  14. Black hole complementarity with the generalized uncertainty principle in Gravity's Rainbow

    NASA Astrophysics Data System (ADS)

    Gim, Yongwan; Um, Hwajin; Kim, Wontae

    2018-02-01

    When gravitation is combined with quantum theory, the Heisenberg uncertainty principle could be extended to the generalized uncertainty principle accompanying a minimal length. To see how the generalized uncertainty principle works in the context of black hole complementarity, we calculate the energy required to duplicate information for the Schwarzschild black hole. It shows that the duplication of information is not allowed and black hole complementarity is still valid even assuming the generalized uncertainty principle. On the other hand, the generalized uncertainty principle with the minimal length could lead to a modification of the conventional dispersion relation in light of Gravity's Rainbow, where the minimal length, as well as the speed of light, is invariant. Revisiting the gedanken experiment, we show that the no-cloning theorem for black hole complementarity can be made valid in the regime of Gravity's Rainbow for a certain combination of parameters.

  15. The action uncertainty principle and quantum gravity

    NASA Astrophysics Data System (ADS)

    Mensky, Michael B.

    1992-02-01

    Results of the path-integral approach to the quantum theory of continuous measurements have been formulated in a preceding paper in the form of an inequality of the type of the uncertainty principle. The new inequality was called the action uncertainty principle, AUP. It was shown that the AUP allows one to find, in a simple way, which outputs of the continuous measurements will occur with high probability. Here a simpler form of the AUP will be formulated, δS ≳ ħ. When applied to quantum gravity, it leads in a very simple way to the Rosenfeld inequality for the measurability of the average curvature.

  16. Uncertainty principle in loop quantum cosmology by Moyal formalism

    NASA Astrophysics Data System (ADS)

    Perlov, Leonid

    2018-03-01

    In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle is between the variable c, with the meaning of a connection, and μ, with the meaning of the physical cell volume to the power 2/3, i.e., v^{2/3}, or a plaquette area. Since both μ and c are not operators but rather random variables, the Robertson uncertainty principle derivation that works for Hermitian operators cannot be used. Instead we use the Wigner-Moyal-Groenewold phase space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics. One can derive from it both canonical and path-integral quantum mechanics, as well as the uncertainty principle. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is the expression for the Wigner function on the space of cylindrical wave functions defined on R_b in the c variables rather than in the dual-space μ variables.

  17. Generalized uncertainty principle: implications for black hole complementarity

    NASA Astrophysics Data System (ADS)

    Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han

    2014-12-01

    At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg uncertainty principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string-theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large N rescaling if one assumes only the Heisenberg uncertainty principle, the application of the GUP may save complementarity, but only if certain N-dependence is also assumed. This raises two important questions beyond the scope of this work, i.e., whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.

  18. Quantum corrections to Newtonian potential and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Scardigli, Fabio; Lambiase, Gaetano; Vagenas, Elias

    2017-08-01

    We use the leading quantum corrections to the Newtonian potential to compute the deformation parameter of the generalized uncertainty principle. By assuming only general relativity as the theory of gravitation, and the thermal nature of the GUP corrections to the Hawking spectrum, our calculation gives, to first order, a specific numerical result. We briefly discuss the physical meaning of this value and compare it with previously obtained bounds on the generalized uncertainty principle deformation parameter.

  19. Polar Wavelet Transform and the Associated Uncertainty Principles

    NASA Astrophysics Data System (ADS)

    Shah, Firdous A.; Tantary, Azhar Y.

    2018-06-01

    The polar wavelet transform, a generalized form of the classical wavelet transform, has been extensively used in science and engineering for finding directional representations of signals in higher dimensions. The aim of this paper is to establish new uncertainty principles associated with the polar wavelet transform in L2(R2). Firstly, we study some basic properties of the polar wavelet transform and then derive the associated generalized version of the Heisenberg-Pauli-Weyl inequality. Finally, following the idea of Beckner (Proc. Amer. Math. Soc. 123, 1897-1905, 1995), we derive the logarithmic version of the uncertainty principle for the polar wavelet transform in L2(R2).
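
    The classical Heisenberg-Pauli-Weyl inequality that the paper generalizes reads, for f in L2(R2) and the Fourier transform normalized with kernel e^{-2 pi i x.xi} (normalization assumed; the constant changes with other conventions):

```latex
\left(\int_{\mathbb{R}^2} |x|^2\, |f(x)|^2 \,dx\right)
\left(\int_{\mathbb{R}^2} |\xi|^2\, |\hat f(\xi)|^2 \,d\xi\right)
\;\ge\; \frac{\|f\|_2^4}{4\pi^2} .
```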

  20. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principle, the universality of gravitational redshift and free fall, and the law of reciprocal action are addressed.

  1. Single-Slit Diffraction and the Uncertainty Principle

    ERIC Educational Resources Information Center

    Rioux, Frank

    2005-01-01

    A theoretical analysis of single-slit diffraction based on the Fourier transform between coordinate and momentum space is presented. The transform between position and momentum is used to illuminate the intimate relationship between single-slit diffraction and the uncertainty principle.

  2. On different types of uncertainties in the context of the precautionary principle.

    PubMed

    Aven, Terje

    2011-10-01

    Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.

  3. Entropy bound of local quantum field theory with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Wan; Lee, Hyung Won; Myung, Yun Soo

    2009-03-01

    We study the entropy bound for local quantum field theory (LQFT) with the generalized uncertainty principle. The generalized uncertainty principle naturally provides a UV cutoff to the LQFT through gravity effects. Imposing the non-gravitational collapse condition as the UV-IR relation, we find that the maximal entropy of a bosonic field is limited by the entropy bound A^{3/4} rather than A, with A the boundary area.

  4. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

    Generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation", which is about the size of the Planck scale (10^-35 m). Taking this basic scale of existence into account, we need to fix a new common form of Heisenberg's uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially for conditions of high temperature and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. And we designed three typical systems from micro to macro size to estimate the feasibility of our theoretical model and method, respectively in the chemical solution condition, crystal lattice condition and nuclear fission reactor condition.

  5. “Stringy” coherent states inspired by generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Ghosh, Subir; Roy, Pinaki

    2012-05-01

    Coherent States with Fractional Revival property, that explicitly satisfy the Generalized Uncertainty Principle (GUP), have been constructed in the context of Generalized Harmonic Oscillator. The existence of such states is essential in motivating the GUP based phenomenological results present in the literature which otherwise would be of purely academic interest. The effective phase space is Non-Canonical (or Non-Commutative in popular terminology). Our results have a smooth commutative limit, equivalent to Heisenberg Uncertainty Principle. The Fractional Revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP induced) minimum length scale. Mandel parameter analysis shows that the statistics is Sub-Poissonian. Correspondence Principle is deformed in an interesting way. Our computational scheme is very simple as it requires only first order corrected energy values and undeformed basis states.

  6. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  7. Verification of the Uncertainty Principle by Using Diffraction of Light Waves

    ERIC Educational Resources Information Center

    Nikolic, D.; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave-number uncertainty. We assumed that the uncertainty in position is the slit width. For the…
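
    The arithmetic behind such a verification can be sketched numerically. The laser wavelength and slit width below are assumed illustrative values, not those of the experiment:

```python
# Single-slit diffraction as an order-of-magnitude check of the uncertainty
# principle: take the slit width as the position uncertainty and the momentum
# spread implied by the first diffraction minimum as the momentum uncertainty.
import math

h = 6.62607015e-34            # Planck constant, J s
hbar = h / (2.0 * math.pi)

wavelength = 632.8e-9         # He-Ne laser line, m (assumed)
slit_width = 50e-6            # m (assumed)

dx = slit_width                       # position uncertainty: the slit width
sin_theta = wavelength / slit_width   # first minimum: sin(theta) = lambda / a
p = h / wavelength                    # photon momentum
dp = p * sin_theta                    # transverse momentum spread, = h / slit_width

product = dx * dp             # comes out to h, comfortably above the hbar/2 bound
```

    Note that the product is independent of the chosen slit width, because a narrower slit widens the diffraction pattern by exactly the compensating factor.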

  8. Lorentz invariance violation and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser; Magdy, H.; Ali, A. Farag

    2016-01-01

    There are several theoretical indications that quantum gravity approaches may predict a minimal measurable length and a maximal observable momentum, and thereby a generalization of the Heisenberg uncertainty principle. The generalized uncertainty principle (GUP) is based on a momentum-dependent modification of the standard dispersion relation which is conjectured to violate the principle of Lorentz invariance. From the resulting Hamiltonian, the velocity and time of flight of relativistic distant particles at Planck energy can be derived. A first comparison is made with recent observations of the Hubble parameter in redshift-dependence in early-type galaxies. We find that LIV has two types of contributions to the time-of-flight delay Δt comparable with those observations. Although the OPERA measurement of a faster-than-light muon-neutrino anomaly turned out to be wrong, we utilize its main features to estimate Δt and the relative change Δv in the speed of the muon neutrino in dependence on redshift z. Accordingly, the results could not be interpreted as LIV. A third comparison is made with ultra-high-energy cosmic rays (UHECR). It involves an essential ingredient both of the approach combining string theory, loop quantum gravity, black hole physics and doubly special relativity, and of the one assuming a perturbative departure from exact Lorentz invariance. Fixing the sensitivity factor and its energy dependence are essential inputs for reliably confronting our calculations with UHECR data. The sensitivity factor is related to the special time-of-flight delay and the time structure of the signal. Furthermore, the upper and lower bounds on the parameter a that characterizes the generalized uncertainty principle have to be fixed in related physical systems such as gamma-ray bursts.

  9. Continuous quantum measurements and the action uncertainty principle

    NASA Astrophysics Data System (ADS)

    Mensky, Michael B.

    1992-09-01

    The path-integral approach to quantum theory of continuous measurements has been developed in preceding works of the author. According to this approach, the measurement amplitude determining the probabilities of different outputs of the measurement can be evaluated in the form of a restricted path integral (a path integral "in finite limits"). With the help of the measurement amplitude, the maximum deviation of measurement outputs from the classical one can be easily determined. The aim of the present paper is to express this variance in the simpler and more transparent form of a specific uncertainty principle (called the action uncertainty principle, AUP). The simplest (but weak) form of the AUP is δS ≳ ℏ, where S is the action functional. It can be applied for a simple derivation of the Bohr-Rosenfeld inequality for the measurability of the gravitational field. A stronger form of the AUP (for ideal measurements performed in the quantum regime), with wider application, is |∫_{t'}^{t''} (δS[q]/δq(t)) Δq(t) dt| ≃ ℏ, where the paths [q] and [Δq] stand correspondingly for the measurement output and for the measurement error. It can also be presented in the symbolic form Δ(Equation) Δ(Path) ≃ ℏ. This means that the deviation of the observed (measured) motion from that obeying the classical equation of motion is inversely proportional to the uncertainty in the path (the latter uncertainty resulting from the measurement error). A consequence of the AUP is that improving the measurement precision beyond the threshold of the quantum regime leads to decreasing information resulting from the measurement.

  10. The Uncertainty Principle, Virtual Particles and Real Forces

    ERIC Educational Resources Information Center

    Jones, Goronwy Tudor

    2002-01-01

    This article provides a simple practical introduction to wave-particle duality, including the energy-time version of the Heisenberg Uncertainty Principle. It has been successful in leading students to an intuitive appreciation of "virtual particles" and the role they play in describing the way ordinary particles, like electrons and protons, exert…

  11. Decision Making Under Uncertainty - Bridging the Gap Between End User Needs and Science Capability

    NASA Astrophysics Data System (ADS)

    Verdon-Kidd, D. C.; Kiem, A.; Austin, E. K.

    2012-12-01

    Successful adaptation outcomes depend on decision making based on the best available climate science information. However, a fundamental barrier exists, namely the 'gap' between the information that climate science can currently provide and the information that is practically useful for end users and decision makers. This study identifies the major contributing factors to the 'gap' from an Australian perspective and provides recommendations as to ways in which the 'gap' may be narrowed. This was achieved via a literature review, an online survey (targeted at providers of climate information and end users of that information), a workshop (where both climate scientists and end users came together to discuss key issues) and a focus group. The study confirmed that uncertainty in climate science is a key barrier to adaptation. The issue of uncertainty was found to be multi-faceted, with problems identified in the communication of uncertainty, the misunderstanding of uncertainty and the lack of tools/methods to deal with uncertainty. There were also key differences in expectations for the future: most end users believed that the uncertainty associated with future climate projections would reduce within the next five to ten years; however, producers of climate science information were well aware that this would most likely not be the case. This is a concerning finding, as end users may delay taking action on adaptation and risk planning until the uncertainties are reduced - a situation which may never eventuate or may occur after the optimal time for action. Improved communication and packaging of climate information was another key theme highlighted in this study. Importantly, it was made clear that improved communication is not just about more glossy brochures and presentations by climate scientists; rather, there is a need for a program or group to fill this role (coined a 'knowledge broker' during the workshop and focus group). The role of the 'knowledge

  12. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The concept of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and different time scales. The ultimate origin of this universal quantum stability is the fundamental uncertainty principle, which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. A reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  13. Info-gap management of public health Policy for TB with HIV-prevalence and epidemiological uncertainty

    PubMed Central

    2012-01-01

    Background Formulation and evaluation of public health policy commonly employs science-based mathematical models. For instance, epidemiological dynamics of TB is dominated, in general, by flow between actively and latently infected populations. Thus modelling is central in planning public health intervention. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. Aims We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty when worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and analysis of public health decision-making. Methods Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty in formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse and HIV infection. Results We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on the robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but also reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. Conclusions The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme. When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals.

  14. Info-gap management of public health Policy for TB with HIV-prevalence and epidemiological uncertainty.

    PubMed

    Ben-Haim, Yakov; Dacso, Clifford C; Zetola, Nicola M

    2012-12-19

    Formulation and evaluation of public health policy commonly employs science-based mathematical models. For instance, epidemiological dynamics of TB is dominated, in general, by flow between actively and latently infected populations. Thus modelling is central in planning public health intervention. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty when worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and analysis of public health decision-making. Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty in formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse and HIV infection. We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on the robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but also reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme. When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals.

  15. Human Time-Frequency Acuity Beats the Fourier Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Oppenheim, Jacob N.; Magnasco, Marcelo O.

    2013-01-01

    The time-frequency uncertainty principle states that the product of the temporal and frequency extents of a signal cannot be smaller than 1/(4π). We study human ability to simultaneously judge the frequency and the timing of a sound. Our subjects often exceeded the uncertainty limit, sometimes by more than tenfold, mostly through remarkable timing acuity. Our results establish a lower bound for the nonlinearity and complexity of the algorithms employed by our brains in parsing transient sounds, rule out simple “linear filter” models of early auditory processing, and highlight timing acuity as a central feature in auditory object processing.
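    The Gabor limit quoted above is saturated only by Gaussian signals. As a minimal numerical sketch (assuming the standard-deviation definitions of Δt and Δf), one can verify that a Gaussian pulse attains Δt·Δf = 1/(4π):

```python
import numpy as np

def spread(x, weights):
    """Standard deviation of a (possibly unnormalized) density on grid x."""
    w = weights / weights.sum()
    mean = (x * w).sum()
    return np.sqrt(((x - mean) ** 2 * w).sum())

sigma = 1.0
t = np.linspace(-20, 20, 4001)
g = np.exp(-t ** 2 / (2 * sigma ** 2))          # Gaussian test signal

dt = spread(t, np.abs(g) ** 2)                  # temporal extent

# Frequency extent from the discrete Fourier transform (ordinary frequency f);
# the magnitude spectrum is insensitive to the time origin, so no phase fixup.
G = np.fft.fft(g)
f = np.fft.fftshift(np.fft.fftfreq(t.size, d=t[1] - t[0]))
df = spread(f, np.fft.fftshift(np.abs(G) ** 2))

print(dt * df, 1 / (4 * np.pi))                 # both ≈ 0.0796: bound saturated
```

    Any non-Gaussian waveform gives a strictly larger product, which is why the reported human performance below the bound rules out simple linear-filter models.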

  16. Uncertainty relations and topological-band insulator transitions in 2D gapped Dirac materials

    NASA Astrophysics Data System (ADS)

    Romera, E.; Calixto, M.

    2015-05-01

    Uncertainty relations are studied for a characterization of topological-band insulator transitions in 2D gapped Dirac materials isostructural with graphene. We show that the relative or Kullback-Leibler entropy in position and momentum spaces, and the standard variance-based uncertainty relation give sharp signatures of topological phase transitions in these systems.

  17. Generalized uncertainty principle and quantum gravity phenomenology

    NASA Astrophysics Data System (ADS)

    Bosso, Pasquale

    The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.

  18. Generalized uncertainty principles and quantum field theory

    NASA Astrophysics Data System (ADS)

    Husain, Viqar; Kothawala, Dawood; Seahra, Sanjeev S.

    2013-01-01

    Quantum mechanics with a generalized uncertainty principle arises through a representation of the commutator [x̂, p̂] = i f(p̂). We apply this deformed quantization to free scalar field theory for f± = 1 ± βp². The resulting quantum field theories have a rich fine-scale structure. For small-wavelength modes, the Green’s function for f+ exhibits a remarkable transition from Lorentz to Galilean invariance, whereas for f- such modes effectively do not propagate. In both cases Lorentz invariance is recovered at long wavelengths.
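    For the f+ deformation, a standard textbook manipulation (not spelled out in the abstract; units with ℏ = 1 and ⟨p̂⟩ = 0 assumed) shows how such a modified commutator produces a minimal length:

```latex
[\hat{x},\hat{p}] = i\,(1+\beta\hat{p}^{2})
\;\Longrightarrow\;
\Delta x\,\Delta p \ge \tfrac{1}{2}\bigl(1+\beta(\Delta p)^{2}\bigr)
\;\Longrightarrow\;
\Delta x \ge \frac{1}{2}\Bigl(\frac{1}{\Delta p}+\beta\,\Delta p\Bigr)
```

    The right-hand side is minimized at Δp = 1/√β, giving a minimal position uncertainty Δx_min = √β, the fine scale at which the field theories above develop their new structure.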

  19. Conditional uncertainty principle

    NASA Astrophysics Data System (ADS)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  20. A method to estimate the additional uncertainty in gap-filled NEE resulting from long gaps in the CO2 flux record

    Treesearch

    Andrew D. Richardson; David Y. Hollinger

    2007-01-01

    Missing values in any data set create problems for researchers. The process by which missing values are replaced, and the data set is made complete, is generally referred to as imputation. Within the eddy flux community, the term "gap filling" is more commonly applied. A major challenge is that random errors in measured data result in uncertainty in the gap-...

  1. Squeezed States, Uncertainty Relations and the Pauli Principle in Composite and Cosmological Models

    NASA Technical Reports Server (NTRS)

    Terazawa, Hidezumi

    1996-01-01

    The importance of not only uncertainty relations but also the Pauli exclusion principle is emphasized in discussing various 'squeezed states' existing in the universe. The contents of this paper include: (1) Introduction; (2) Nuclear Physics in the Quark-Shell Model; (3) Hadron Physics in the Standard Quark-Gluon Model; (4) Quark-Lepton-Gauge-Boson Physics in Composite Models; (5) Astrophysics and Space-Time Physics in Cosmological Models; and (6) Conclusion. Also, not only the possible breakdown of (or deviation from) uncertainty relations but also the superficial violation of the Pauli principle at short distances (or high energies) in composite (and string) models is discussed in some detail.

  2. The action uncertainty principle for continuous measurements

    NASA Astrophysics Data System (ADS)

    Mensky, Michael B.

    1996-02-01

    The action uncertainty principle (AUP) for the specification of the most probable readouts of continuous quantum measurements is proved, formulated in different forms and analyzed (for nonlinear as well as linear systems). Continuous monitoring of an observable A(p,q,t) with resolution Δa(t) is considered. The influence of the measurement process on the evolution of the measured system (quantum measurement noise) is represented by an additional term δF(t)A(p,q,t) in the Hamiltonian, where the function δF (a generalized fictitious force) is restricted by the AUP, ∫|δF(t)| Δa(t) dt ≲ ℏ, and arbitrary otherwise. Quantum-nondemolition (QND) measurements are analyzed with the help of the AUP. A simple uncertainty relation for continuous quantum measurements is derived. It states that the area of a certain band in phase space should be of the order of ℏ. The width of the band depends on the measurement resolution, while its length is determined by the deviation of the system, due to the measurement, from classical behavior.

  3. Using Uncertainty Principle to Find the Ground-State Energy of the Helium and a Helium-like Hookean Atom

    ERIC Educational Resources Information Center

    Harbola, Varun

    2011-01-01

    In this paper, we accurately estimate the ground-state energy and the atomic radius of the helium atom and a helium-like Hookean atom by employing the uncertainty principle in conjunction with the variational approach. We show that with the use of the uncertainty principle, electrons are found to be spread over a radial region, giving an electron…

  4. Research on the effects of geometrical and material uncertainties on the band gap of the undulated beam

    NASA Astrophysics Data System (ADS)

    Li, Yi; Xu, Yanlong

    2017-09-01

    Considering uncertain geometrical and material parameters, the lower and upper bounds of the band gap of an undulated beam with a periodically arched shape are studied by Monte Carlo simulation (MCS) and by interval analysis based on the Taylor series. Given random variations of all the uncertain variables, scatter plots from the MCS are used to analyze the qualitative sensitivities of the band gap with respect to these uncertainties. We find that the influence of uncertainty in the geometrical parameter on the band gap of the undulated beam is stronger than that of the material parameter. This conclusion is also confirmed by the interval analysis based on the Taylor series. Our methodology offers a strategy for reducing the errors between the designed and actual values of the band gaps by improving the accuracy of specially selected uncertain design variables of periodic structures.
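    The two uncertainty-propagation schemes named above can be sketched side by side. The surrogate model, nominal values and interval half-widths below are hypothetical placeholders, not the paper's beam model; they are merely chosen so that the geometric parameter dominates, mirroring the qualitative finding:

```python
import numpy as np

# Hypothetical surrogate for a band-gap edge frequency as a function of an
# uncertain material parameter E (Young's modulus) and geometric parameter h
# (arch height). The paper's true model is a periodic undulated beam.
def band_edge(E, h):
    return 100.0 * np.sqrt(E) * (1.0 + 2.0 * h)

E0, h0 = 1.0, 0.2            # nominal values (assumed)
dE, dh = 0.05, 0.02          # interval half-widths (assumed uncertainties)

# Monte Carlo simulation: sample uniformly inside the intervals
rng = np.random.default_rng(0)
samples = band_edge(rng.uniform(E0 - dE, E0 + dE, 100_000),
                    rng.uniform(h0 - dh, h0 + dh, 100_000))
mc_lo, mc_hi = samples.min(), samples.max()

# Interval analysis via first-order Taylor expansion about the nominal point
f0 = band_edge(E0, h0)
dfdE = (band_edge(E0 + 1e-6, h0) - f0) / 1e-6    # numerical sensitivities
dfdh = (band_edge(E0, h0 + 1e-6) - f0) / 1e-6
half_width = abs(dfdE) * dE + abs(dfdh) * dh
ta_lo, ta_hi = f0 - half_width, f0 + half_width

# Comparing |dfdh|*dh with |dfdE|*dE shows which uncertainty dominates;
# in this constructed toy, the geometric term is the larger contribution.
```

    The Taylor-based interval is cheap but first-order only; the MCS bounds converge to the true extremes as the sample count grows, so the two estimates should agree closely when the surrogate is nearly linear over the intervals.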

  5. The Generalized Uncertainty Principle and Harmonic Interaction in Three Spatial Dimensions

    NASA Astrophysics Data System (ADS)

    Hassanabadi, H.; Hooshmand, P.; Zarrinkamar, S.

    2015-01-01

    In three spatial dimensions, the generalized uncertainty principle is considered for an isotropic harmonic oscillator interaction in both the non-relativistic and relativistic regimes. By using novel transformations and separations of variables, exact analytical solutions for the energy eigenvalues as well as the wave functions are obtained. The time evolution in the non-relativistic regime is also reported.

  6. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  7. Generalized Uncertainty Principle and Parikh-Wilczek Tunneling

    NASA Astrophysics Data System (ADS)

    Mehdipour, S. Hamid

    We investigate the modifications of the Hawking radiation by the generalized uncertainty principle (GUP) and the tunneling process. By using the GUP-corrected de Broglie wavelength, the squeezing of the fundamental momentum cell and, consequently, a GUP-corrected energy, we find nonthermal effects which lead to a nonzero statistical correlation function between the probabilities of tunneling of two massive particles with different energies. The recovery of part of the information from the black hole radiation is then feasible. From another point of view, the inclusion of quantum gravity effects through the GUP expression can halt the evaporation process, so that a stable black hole remnant is left behind, containing the other part of the black hole information content. Therefore, these features of the Planck-scale corrections may solve the information problem in black hole evaporation.

  8. Uncertainty, imprecision, and the precautionary principle in climate change assessment.

    PubMed

    Borsuk, M E; Tomassini, L

    2005-01-01

    Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
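    The decision rule described above (bounds on expectations over a set of measures, then minimum upper expected cost) can be illustrated with a toy discrete example. The scenario set, candidate measures and cost table are invented for illustration only and stand in for the paper's classes of probability measures:

```python
import numpy as np

# Three candidate probability measures over a discrete climate outcome
# (a toy stand-in for a full class of measures)
measures = np.array([[0.6, 0.3, 0.1],     # scenarios: mild / moderate / severe
                     [0.4, 0.4, 0.2],
                     [0.2, 0.5, 0.3]])

# cost[a, s]: cost of action a under scenario s (hypothetical numbers)
cost = np.array([[10.0, 40.0, 90.0],      # low abatement: cheap now, risky
                 [25.0, 35.0, 50.0],      # moderate abatement
                 [45.0, 46.0, 48.0]])     # aggressive abatement

expected = measures @ cost.T              # expected[m, a] for measure m, action a
upper = expected.max(axis=0)              # upper expected cost per action
lower = expected.min(axis=0)              # lower expected cost per action

# Precautionary criterion: minimize the upper expected cost
precautionary_choice = int(np.argmin(upper))
```

    In this toy, minimizing expected cost under the first measure alone would favor low abatement, while the minimum-upper-expected-cost criterion selects moderate abatement, showing how imprecision in the measure can change the recommended emissions policy.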

  9. Setting the most robust effluent level under severe uncertainty: application of information-gap decision theory to chemical management.

    PubMed

    Yokomizo, Hiroyuki; Naito, Wataru; Tanaka, Yoshinari; Kamo, Masashi

    2013-11-01

    Decisions in ecological risk management for chemical substances must be made based on incomplete information due to uncertainties. To protect ecosystems from the adverse effects of chemicals, a precautionary approach is often taken. The precautionary approach, which is based on conservative assumptions about the risks of chemical substances, can be applied in selecting management models and data. This approach can provide an adequate margin of safety for ecosystems by reducing exposure to harmful substances, either by reducing the use of target chemicals or by putting in place strict water quality criteria. However, the reduction of chemical use or effluent concentrations typically entails a financial burden, and the cost effectiveness of the precautionary approach may be small. Hence, we need to develop a formulaic methodology for chemical risk management that can sufficiently protect ecosystems in a cost-effective way, even when we do not have sufficient information for chemical management. Information-gap decision theory can provide such a methodology. It determines which action is the most robust to uncertainty by guaranteeing an acceptable outcome under the largest degree of uncertainty, without requiring information about the extent of parameter uncertainty at the outset. In this paper, we illustrate the application of information-gap decision theory to derive a framework for setting effluent limits on pollutants from point sources under uncertainty. Our application incorporates a cost for reducing pollutant emission and a cost to wildlife species affected by the pollutant. Our framework enables us to settle upon actions to deal with severe uncertainty in the ecological risk management of chemicals. Copyright © 2013 Elsevier Ltd. All rights reserved.
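    The info-gap robustness function sketched below follows the generic recipe described in the abstract: the robustness of a decision is the largest uncertainty horizon under which the worst-case outcome still meets the performance requirement. The cost model and all numbers are hypothetical, not the paper's:

```python
import numpy as np

# Decision q: effluent limit. Uncertain parameter: toxicity slope k, known
# only to lie within a horizon h of a nominal estimate k0:
#   U(h) = { k : |k - k0| <= h * k0 }
# Requirement: total cost (abatement + ecological damage) must stay below a
# critical level Cc for every k in U(h).
k0, Cc = 1.0, 60.0

def total_cost(q, k):
    abatement = 50.0 / (q + 0.5)     # stricter limit (small q) costs more
    damage = k * 20.0 * q            # ecological damage grows with effluent
    return abatement + damage

def robustness(q):
    """Largest horizon h such that the worst-case cost over U(h) stays <= Cc."""
    h_hat = 0.0
    for h in np.linspace(0.0, 2.0, 2001):
        worst = total_cost(q, k0 * (1.0 + h))   # worst case: largest slope
        if worst <= Cc:
            h_hat = h
        else:
            break
    return h_hat

limits = [0.5, 1.0, 1.5]
h_hats = {q: robustness(q) for q in limits}
most_robust = max(h_hats, key=h_hats.get)
```

    Note that neither the strictest nor the loosest limit wins here: the intermediate limit tolerates the largest model error while still meeting the cost requirement, which is the kind of trade-off the info-gap analysis is designed to expose.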

  10. Using info-Gap Decision Theory for Water Resources Planning Under Severe Uncertainty

    NASA Astrophysics Data System (ADS)

    Korteling, B.; Brazier, R.; Kapelan, Z.; Dessai, S.

    2012-12-01

    Water resource managers are required to develop comprehensive water resource plans based on severely uncertain information about the effects of climate change on local hydrology and of future socio-economic changes on localised demand. In England and Wales, current water resource planning methodologies include a headroom estimation process that quantifies uncertainty based on only one point of an assumed range of deviations from the expected climate and projected demand 25 years into the future. There are many situations where there is not enough knowledge to estimate a representative probability of occurrence, or to be confident that the tails of an assumed probability distribution will not exhibit unexpected skewness, or that the kurtosis of a distribution differs from the norm. These situations can be considered severely uncertain. Information-gap decision theory offers a method to sample a wider range of uncertainty than traditional methods and, as a result, to compare the robustness of various water resource management options under conditions of severe uncertainty. A more robust management option is one that delivers the same level of performance as other options at higher levels of uncertainty. A case study is based on a Water Supply Area that encompasses the county of Cornwall in southwest England, containing 17 reservoirs and 19 demand nodes. The performance of management options is evaluated primarily by measures of water availability, including a reservoir risk measure that tests the probability and magnitude with which strategic reservoir storage levels fall below the drought management curve under adverse conditions, and also a safety margin deficit that tests how quickly reservoir levels can return to optimum operating levels in favourable conditions. Multi-Criteria Decision Analysis (MCDA) is used to test the effectiveness of different management options with different weightings for metrics other than water availability, including capital and

  11. Lyme disease ecology in a changing world: Consensus, uncertainty and critical gaps for improving control

    USGS Publications Warehouse

    Kilpatrick, A. Marm; Dobson, Andrew D.M.; Levi, Taal; Salkeld, Daniel J.; Swei, Andrea; Ginsberg, Howard; Kjemtrup, Anne; Padgett, Kerry A.; Jensen, Per A.; Fish, Durland; Ogden, Nick H.; Diuk-Wasser, Maria A.

    2017-01-01

    Lyme disease is the most common tick-borne disease in temperate regions of North America, Europe and Asia, and the number of reported cases has increased in many regions as landscapes have been altered. Although there has been extensive work on the ecology and epidemiology of this disease in both Europe and North America, substantial uncertainty exists about fundamental aspects that determine spatial and temporal variation in both disease risk and human incidence, which hamper effective and efficient prevention and control. Here we describe areas of consensus that can be built on, identify areas of uncertainty and outline research needed to fill these gaps to facilitate predictive models of disease risk and the development of novel disease control strategies. Key areas of uncertainty include: (i) the precise influence of deer abundance on tick abundance, (ii) how tick populations are regulated, (iii) assembly of host communities and tick-feeding patterns across different habitats, (iv) reservoir competence of host species, and (v) pathogenicity for humans of different genotypes of Borrelia burgdorferi. Filling these knowledge gaps will improve Lyme disease prevention and control and provide general insights into the drivers and dynamics of this emblematic multi-host–vector-borne zoonotic disease.

  12. "Citizen Jane": Rethinking Design Principles for Closing the Gender Gap in Computing.

    ERIC Educational Resources Information Center

    Raphael, Chad

    This paper identifies three rationales in the relevant literature for closing the gender gap in computing: economic, cultural and political. Each rationale implies a different set of indicators of present inequalities, disparate goals for creating equality, and distinct principles for software and web site design that aims to help girls overcome…

  13. Generalized uncertainty principle and the maximum mass of ideal white dwarfs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rashidi, Reza, E-mail: reza.rashidi@srttu.edu

    The effects of a generalized uncertainty principle on the structure of an ideal white dwarf star are investigated. The equation describing the equilibrium configuration of the star is a generalized form of the Lane–Emden equation. It is proved that the star always has a finite size. It is then argued that the maximum mass of such an ideal white dwarf tends to infinity, as opposed to the conventional case where it has a finite value.

  14. Robust climate policies under uncertainty: a comparison of robust decision making and info-gap methods.

    PubMed

    Hall, Jim W; Lempert, Robert J; Keller, Klaus; Hackbarth, Andrew; Mijere, Christophe; McInerney, David J

    2012-10-01

    This study compares two widely used approaches for robustness analysis of decision problems: the info-gap method originally developed by Ben-Haim and the robust decision making (RDM) approach originally developed by Lempert, Popper, and Bankes. The study uses each approach to evaluate alternative paths for climate-altering greenhouse gas emissions given the potential for nonlinear threshold responses in the climate system, significant uncertainty about such a threshold response and a variety of other key parameters, as well as the ability to learn about any threshold responses over time. Info-gap and RDM share many similarities. Both represent uncertainty as sets of multiple plausible futures, and both seek to identify robust strategies whose performance is insensitive to uncertainties. Yet they also exhibit important differences, as they arrange their analyses in different orders, treat losses and gains in different ways, and take different approaches to imprecise probabilistic information. The study finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications. The comparison not only improves understanding of these specific methods, it also suggests some broader insights into robustness approaches and a framework for comparing them. © 2012 RAND Corporation.

  15. The statistical fluctuation study of quantum key distribution in means of uncertainty principle

    NASA Astrophysics Data System (ADS)

    Liu, Dunwei; An, Huiyao; Zhang, Xiaoyu; Shi, Xuemei

    2018-03-01

    Imperfect single-photon emission by lasers, photon signal attenuation and error propagation have long caused serious difficulties in practical long-distance quantum key distribution (QKD) experiments. In this paper, we study the uncertainty principle in metrology and use this tool to analyze the statistical fluctuations of the number of received single photons, the yield of single photons and the quantum bit error rate (QBER). We then calculate the error between the measured and true value of each parameter and account for the propagation of error among all the measured values. We recast the Gottesman-Lo-Lutkenhaus-Preskill (GLLP) formula in consideration of these parameters and generate QKD simulation results. In this study, as the coding photon length increases, the safe distribution distance grows. When the coding photon length is N = 10^11, the safe distribution distance can be almost 118 km, a lower bound on the safe transmission distance compared with the 127 km obtained without the uncertainty principle. Our study is thus in line with established theory, but we make it more realistic.

  16. Lorentz violation and generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Lambiase, Gaetano; Scardigli, Fabio

    2018-04-01

    Investigations of possible violations of Lorentz invariance have been widely pursued in the last decades, from both the theoretical and experimental sides. A comprehensive framework to formulate the problem is the standard model extension (SME) proposed by A. Kostelecky, where violation of Lorentz invariance is encoded into specific coefficients. Here we present a procedure to link the deformation parameter β of the generalized uncertainty principle to the SME coefficients of the gravity sector. The idea is to compute the Hawking temperature of a black hole in two different ways. The first way involves the deformation parameter β, and therefore we get a deformed Hawking temperature containing the parameter β. The second way involves a deformed Schwarzschild metric containing the Lorentz-violating terms s̄^{μν} of the gravity sector of the SME. The comparison between the two techniques yields a relation between β and s̄^{μν}. In this way, bounds on β transferred from s̄^{μν} are improved by many orders of magnitude when compared with those derived in other gravitational frameworks. The opposite possibility of bounds transferred from β to s̄^{μν} is also briefly discussed.

  17. Cosmological horizons, uncertainty principle, and maximum length quantum mechanics

    NASA Astrophysics Data System (ADS)

    Perivolaropoulos, L.

    2017-05-01

    The cosmological particle horizon is the maximum measurable length in the Universe. The existence of such a maximum observable length scale implies a modification of the quantum uncertainty principle. Thus, due to the nonlocality of quantum mechanics, the global properties of the Universe could produce a signature on the behavior of local quantum systems. A generalized uncertainty principle (GUP) that is consistent with the existence of such a maximum observable length scale l_max is Δx Δp ≥ (ℏ/2) · 1/(1 − α Δx²), where α = l_max⁻² ≃ (H₀/c)² (H₀ is the Hubble parameter and c is the speed of light). In addition to the existence of a maximum measurable length l_max = 1/√α, this form of GUP also implies the existence of a minimum measurable momentum p_min = (3√3/4) ℏ√α. Using an appropriate representation of the position and momentum quantum operators, we show that the spectrum of the one-dimensional harmonic oscillator becomes Ē_n = 2n + 1 + λ_n ᾱ, where Ē_n ≡ 2E_n/(ℏω) is the dimensionless properly normalized n-th energy level, ᾱ ≡ α ℏ/(mω) is a dimensionless parameter, and λ_n ∼ n² for n ≫ 1 (we show the full form of λ_n in the text). For a typical vibrating diatomic molecule and l_max = c/H₀ we find ᾱ ∼ 10⁻⁷⁷, and therefore for such a system this effect is beyond the reach of current experiments. However, this effect could be more important in the early Universe and could produce signatures in the primordial perturbation spectrum induced by quantum fluctuations of the inflaton field.
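The claimed tininess of the effect can be checked with a quick order-of-magnitude computation; all input values below are rough textbook-style estimates, not taken from the paper:

```python
# Order-of-magnitude check of the dimensionless GUP parameter
# alpha_bar = alpha * hbar / (m * omega) for a typical diatomic molecule.
hbar = 1.054e-34      # reduced Planck constant, J s
c = 2.998e8           # speed of light, m/s
H0 = 2.2e-18          # Hubble parameter, 1/s (~70 km/s/Mpc)
m = 1.1e-26           # reduced mass of a light diatomic molecule, kg
omega = 4e14          # vibrational angular frequency, rad/s

alpha = (H0 / c) ** 2                  # alpha = 1 / lmax^2
alpha_bar = alpha * hbar / (m * omega)
# alpha_bar is vastly smaller than 1 (~1e-75 with these inputs), in line
# with the abstract's conclusion that the effect is unobservable today.
```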

  18. Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.

    PubMed

    Rogers, Michael D

    2003-06-01

    Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which the PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.

  19. Lyme disease ecology in a changing world: consensus, uncertainty and critical gaps for improving control

    PubMed Central

    Dobson, Andrew D. M.; Levi, Taal; Salkeld, Daniel J.; Swei, Andrea; Ginsberg, Howard S.; Kjemtrup, Anne; Padgett, Kerry A.; Jensen, Per M.; Fish, Durland; Ogden, Nick H.

    2017-01-01

    Lyme disease is the most common tick-borne disease in temperate regions of North America, Europe and Asia, and the number of reported cases has increased in many regions as landscapes have been altered. Although there has been extensive work on the ecology and epidemiology of this disease in both Europe and North America, substantial uncertainty exists about fundamental aspects that determine spatial and temporal variation in both disease risk and human incidence, which hamper effective and efficient prevention and control. Here we describe areas of consensus that can be built on, identify areas of uncertainty and outline research needed to fill these gaps to facilitate predictive models of disease risk and the development of novel disease control strategies. Key areas of uncertainty include: (i) the precise influence of deer abundance on tick abundance, (ii) how tick populations are regulated, (iii) assembly of host communities and tick-feeding patterns across different habitats, (iv) reservoir competence of host species, and (v) pathogenicity for humans of different genotypes of Borrelia burgdorferi. Filling these knowledge gaps will improve Lyme disease prevention and control and provide general insights into the drivers and dynamics of this emblematic multi-host–vector-borne zoonotic disease. This article is part of the themed issue ‘Conservation, biodiversity and infectious disease: scientific evidence and policy implications'. PMID:28438910

  20. Uncertainty and Clinical Psychology: Therapists' Responses.

    ERIC Educational Resources Information Center

    Bienenfeld, Sheila

    Three sources of professional uncertainty have been described: uncertainty about the practitioner's mastery of knowledge; uncertainty due to gaps in the knowledge base itself; and uncertainty about the source of the uncertainty, i.e., the practitioner does not know whether his uncertainty is due to gaps in the knowledge base or to personal…

  1. The strain induced band gap modulation from narrow gap semiconductor to half-metal on Ti{sub 2}CrGe: A first principles study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jia, E-mail: jiali@hebut.edu.cn; Research Institute for Energy Equipment Materials, Hebei University of Technology, Tianjin 300401; Zhang, Zhidong

    The Heusler alloy Ti{sub 2}CrGe is a stable L2{sub 1} phase with antiferromagnetic ordering. With a band-gap energy (∼0.18 eV) obtained from a first-principles calculation, it belongs to the group of narrow band gap semiconductors. The band-gap energy decreases with increasing lattice compression and vanishes at a strain of −5%; moreover, the gap contraction occurs only in the spin-down states, leading to half-metallic character at −5% strain. The Ti{sub 1}, Ti{sub 2}, and Cr moments all change linearly within strains of −5% to +5%. Nevertheless, the total zero moment is robust over these strains. The imaginary part of the dielectric function for both spin-up and spin-down states shows a clear onset energy, indicating a corresponding electronic gap for the two spin channels.

  2. Weak values, 'negative probability', and the uncertainty principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokolovski, D.

    2007-10-15

    A quantum transition can be seen as a result of interference between various pathways (e.g., Feynman paths), which can be labeled by a variable f. An attempt to determine the value of f without destroying the coherence between the pathways produces a weak value of f. We show f to be an average obtained with an amplitude distribution which can, in general, take negative values, which, in accordance with the uncertainty principle, need not contain information about the actual range of f which contributes to the transition. It is also demonstrated that the moments of such alternating distributions have a number of unusual properties which may lead to a misinterpretation of the weak-measurement results. We provide a detailed analysis of weak measurements with and without post-selection. Examples include the double-slit diffraction experiment, weak von Neumann and von Neumann-like measurements, traversal time for an elastic collision, phase time, and local angular momentum.

  3. An info-gap application to robust design of a prestressed space structure under epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Hot, Aurélien; Weisser, Thomas; Cogan, Scott

    2017-07-01

    Uncertainty quantification is an integral part of the model validation process and is important to take into account during the design of mechanical systems. Sources of uncertainty are diverse but generally fall into two categories: aleatory, due to random processes, and epistemic, resulting from a lack of knowledge. This work focuses on the behavior of solar arrays in their stowed configuration. To avoid impacts during launch, snubbers are used to prestress the panels. Since the mechanical properties of the snubbers and the associated preload configurations are difficult to characterize precisely, an info-gap approach is proposed to investigate the influence of such uncertainties on design configurations obtained for different values of safety factors. This eventually allows the typical values of these factors to be revised and reevaluated with respect to a targeted robustness level. The proposed methodology is illustrated using a simplified finite element model of a solar array.
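As a sketch of the info-gap idea used here: the robustness of a design is the largest horizon of uncertainty over which the worst-case performance still meets the requirement. The function and toy performance model below are hypothetical illustrations, not the authors' finite element model:

```python
def robustness(worst_case_performance, requirement, horizons):
    """Info-gap robustness (sketch): the largest uncertainty horizon alpha
    at which the worst-case performance still meets the requirement.
    `worst_case_performance(alpha)` is assumed monotone non-increasing."""
    best = 0.0
    for alpha in horizons:
        if worst_case_performance(alpha) >= requirement:
            best = alpha
        else:
            break
    return best

# Toy model: the worst-case safety margin degrades linearly with the
# uncertainty horizon (purely illustrative numbers).
horizons = [0.1 * k for k in range(21)]
alpha_hat = robustness(lambda a: 1.0 - 0.4 * a, requirement=0.5,
                       horizons=horizons)
# alpha_hat ~ 1.2: the design tolerates that much parameter uncertainty
# before the requirement is violated.
```

Plotting such robustness curves for designs obtained with different safety factors is what allows those factors to be traded against a targeted robustness level.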

  4. Wave-Particle Duality and Uncertainty Principle: Phenomenographic Categories of Description of Tertiary Physics Students' Depictions

    ERIC Educational Resources Information Center

    Ayene, Mengesha; Kriek, Jeanne; Damtie, Baylie

    2011-01-01

    Quantum mechanics is often thought to be a difficult subject to understand, not only in the complexity of its mathematics but also in its conceptual foundation. In this paper we emphasize students' depictions of the uncertainty principle and wave-particle duality of quantum events, phenomena that could serve as a foundation in building an…

  5. The energy-time uncertainty principle and the EPR paradox: Experiments involving correlated two-photon emission in parametric down-conversion

    NASA Technical Reports Server (NTRS)

    Chiao, Raymond Y.; Kwiat, Paul G.; Steinberg, Aephraim M.

    1992-01-01

    The energy-time uncertainty principle is on a different footing from the momentum-position uncertainty principle: in contrast to position, time is a c-number parameter, not an operator. As Aharonov and Bohm have pointed out, this leads to different interpretations of the two uncertainty principles. In particular, one must distinguish between an inner and an outer time in the definition of the spread in time, delta t. It is the inner time which enters the energy-time uncertainty principle. We have checked this by means of a correlated two-photon light source in which the individual energies of the two photons are broad in spectrum, but in which their sum is sharp. In other words, the pair of photons is in an entangled state of energy. By passing one member of the photon pair through a filter with width delta E, it is observed that the other member's wave packet collapses upon coincidence detection to a duration delta t, such that delta E times delta t is approximately equal to Planck's constant divided by 2 pi, where this duration delta t is an inner time, in the sense of Aharonov and Bohm. We have measured delta t by means of a Michelson interferometer by monitoring the visibility of the fringes seen in coincidence detection. This is a nonlocal effect, in the sense that the two photons are far away from each other when the collapse occurs. We have excluded classical-wave explanations of this effect by means of triple coincidence measurements in conjunction with a beam splitter which follows the Michelson interferometer. Since Bell's inequalities are known to be violated, we believe that it is also incorrect to interpret this experimental outcome as if energy were a local hidden variable, i.e., as if each photon, viewed as a particle, possessed some definite but unknown energy before its detection.
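The quoted relation delta E · delta t ≈ h/2π can be put to numbers; the filter bandwidth and wavelength below are illustrative values, not the experiment's:

```python
# Putting numbers to delta_E * delta_t ~ h / (2*pi): an interference
# filter of width delta_lambda at wavelength lam implies a collapsed
# wave-packet duration delta_t.
h = 6.626e-34             # Planck constant, J s
hbar = 1.054e-34          # h / (2*pi), J s
c = 2.998e8               # speed of light, m/s
lam = 700e-9              # center wavelength, m
delta_lambda = 1e-9       # filter bandwidth, m

delta_E = h * c * delta_lambda / lam**2   # energy width of the filter, J
delta_t = hbar / delta_E                  # implied duration, s
# delta_t comes out on the order of a few hundred femtoseconds.
```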

  6. Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?

    NASA Astrophysics Data System (ADS)

    Majumder, Barun; Sen, Sourav

    2012-10-01

    In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [5] in simple quantum mechanical systems and study its thermodynamic properties. We have assumed that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with those found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic quantities are exactly the same as the polymer results, although the length scale considered has a theoretically different origin. Hence we note the need for further study to investigate whether these two approaches are conceptually connected at the fundamental level.

  7. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE PAGES

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
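A minimal sketch of the "random plus systematic" budget the abstract describes, combining the two components in quadrature as in a basic GUM-style evaluation (the percentages are invented for illustration):

```python
import math

def combined_standard_uncertainty(u_random, u_systematic):
    """Combine independent 'random' and 'systematic' standard uncertainty
    components in quadrature, as in a basic GUM-style budget (sketch)."""
    return math.sqrt(u_random ** 2 + u_systematic ** 2)

# Illustrative relative uncertainties (percent), not real assay data:
u_c = combined_standard_uncertainty(0.8, 1.5)   # combined standard uncertainty
U = 2.0 * u_c                                   # expanded uncertainty, k = 2
```

The UQ challenges the authors raise (errors in predictors, model error, item-specific biases) are precisely the effects this simple quadrature budget does not capture.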

  8. On the action of Heisenberg's uncertainty principle in discrete linear methods for calculating the components of the deflection of the vertical

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Lapshin, Aleksey

    2013-04-01

    The method of discrete linear transformations that can be implemented through the algorithms of the Standard Fourier Transform (SFT), Short-Time Fourier Transform (STFT) or Wavelet Transform (WT) is effective for calculating the components of the deflection of the vertical from discrete values of gravity anomaly. Due to the action of Heisenberg's uncertainty principle, the SFT exhibits weak spatial localization, which manifests in the following: firstly, to find the SFT it is necessary to know the initial digital signal on the complete number line (in the case of a one-dimensional transform) or in the whole two-dimensional space (if a two-dimensional transform is performed). Secondly, the localization and values of the "peaks" of the initial function cannot be derived from its Fourier transform, as the coefficients of the Fourier transform are formed by taking into account all the values of the initial function. Thus, the SFT gives global information on all frequencies available in the digital signal throughout the whole time period. To overcome this peculiarity it is necessary to localize the signal in time and apply the Fourier transform only to a small portion of the signal; the STFT, which differs from the SFT only by the presence of an additional factor (window), is used for this purpose. A narrow enough window is chosen to localize the signal in time and, according to Heisenberg's uncertainty principle, this results in significant uncertainty in frequency. If one chooses a wide enough window, the same principle implies increased time uncertainty. Thus, if the signal is narrowly localized in time, its spectrum, on the contrary, is spread over the complete axis of frequencies, and vice versa. The STFT makes it possible to improve spatial localization, that is, it allows one to define the presence of any frequency in the signal and the interval of its presence. However, owing to Heisenberg's uncertainty principle, it is impossible to tell
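The time-frequency tradeoff described above can be verified numerically: for a Gaussian window, the product of the time spread and frequency spread of the window energy equals 1/(4π), the Heisenberg-type minimum. A small self-contained check (brute-force DFT; all parameters are illustrative):

```python
import cmath
import math

def spread(xs, weights):
    """Weighted standard deviation of coordinates xs (weights need not sum to 1)."""
    total = sum(weights)
    mean = sum(x * w for x, w in zip(xs, weights)) / total
    return math.sqrt(sum((x - mean) ** 2 * w
                         for x, w in zip(xs, weights)) / total)

n, dt = 512, 4e-3
sigma_t = 0.05                                   # window width, s
t = [(k - n // 2) * dt for k in range(n)]
w = [math.exp(-tk ** 2 / (2 * sigma_t ** 2)) for tk in t]

# Brute-force DFT of the window (O(n^2), fine for a demonstration)
f = [(k - n // 2) / (n * dt) for k in range(n)]
W2 = [abs(sum(wk * cmath.exp(-2j * math.pi * fk * tk)
              for wk, tk in zip(w, t))) ** 2 for fk in f]

st = spread(t, [wk ** 2 for wk in w])   # time spread of window energy
sf = spread(f, W2)                      # frequency spread of its spectrum
# st * sf is ~ 1/(4*pi) ~ 0.0796: narrowing the window (smaller sigma_t)
# shrinks st but inflates sf by the same factor, and vice versa.
```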

  9. Solving Autonomy Technology Gaps through Wireless Technology and Orion Avionics Architectural Principles

    NASA Astrophysics Data System (ADS)

    Black, Randy; Bai, Haowei; Michalicek, Andrew; Shelton, Blaine; Villela, Mark

    2008-01-01

    Currently, autonomy in space applications is limited by a variety of technology gaps. Innovative application of wireless technology and avionics architectural principles drawn from the Orion crew exploration vehicle provide solutions for several of these gaps. The Vision for Space Exploration envisions extensive use of autonomous systems. Economic realities preclude continuing the level of operator support currently required of autonomous systems in space. In order to decrease the number of operators, more autonomy must be afforded to automated systems. However, certification authorities have been notoriously reluctant to certify autonomous software in the presence of humans or when costly missions may be jeopardized. The Orion avionics architecture, drawn from advanced commercial aircraft avionics, is based upon several architectural principles including partitioning in software. Robust software partitioning provides "brick wall" separation between software applications executing on a single processor, along with controlled data movement between applications. Taking advantage of these attributes, non-deterministic applications can be placed in one partition and a "Safety" application created in a separate partition. This "Safety" partition can track the position of astronauts or critical equipment and prevent any unsafe command from executing. Only the Safety partition need be certified to a human rated level. As a proof-of-concept demonstration, Honeywell has teamed with the Ultra WideBand (UWB) Working Group at NASA Johnson Space Center to provide tracking of humans, autonomous systems, and critical equipment. Using UWB the NASA team can determine positioning to within less than one inch resolution, allowing a Safety partition to halt operation of autonomous systems in the event that an unplanned collision is imminent. Another challenge facing autonomous systems is the coordination of multiple autonomous agents. Current approaches address the issue as one of

  10. Corrected black hole thermodynamics in Damour-Ruffini’s method with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Chen, Ge-Rui

    Recently, some approaches to quantum gravity indicate that a minimal measurable length l_p ∼ 10⁻³⁵ m should be considered; a direct implication of the minimal measurable length is the generalized uncertainty principle (GUP). Taking the effect of the GUP into account, Hawking radiation of massless scalar particles from a Schwarzschild black hole is investigated by the use of Damour-Ruffini's method. The original Klein-Gordon equation is modified. It is found that the corrected Hawking temperature is related to the energy of the emitted particles. Some discussion appears in the last section.

  11. First-principles study of direct and narrow band gap semiconducting β -CuGaO 2

    DOE PAGES

    Nguyen, Manh Cuong; Zhao, Xin; Wang, Cai-Zhuang; ...

    2015-04-16

    Semiconducting oxides have attracted much attention due to their great stability in air or water and the abundance of oxygen. Recent success in synthesizing a metastable phase of CuGaO 2 with a direct narrow band gap opens up new applications of semiconducting oxides as an absorber layer for photovoltaics. Using first-principles density functional theory calculations, we investigate the thermodynamic and mechanical stabilities as well as the structural and electronic properties of the β-CuGaO 2 phase. Our calculations show that the β-CuGaO 2 structure is dynamically and mechanically stable. The energy band gap is confirmed to be direct at the Γ point of the Brillouin zone. In conclusion, the optical absorption occurs right at the band gap edge and the density of states near the valence band maximum is large, inducing an intense absorption of light as observed in experiment.

  12. Planning water supply under uncertainty - benefits and limitations of RDM, Info-Gap, economic optimization and many-objective optimization

    NASA Astrophysics Data System (ADS)

    Matrosov, E.; Padula, S.; Huskova, I.; Harou, J. J.

    2012-12-01

    Population growth and the threat of drier or changed climates are likely to increase water scarcity world-wide. A combination of demand management (water conservation) and new supply infrastructure is often needed to meet future projected demands. In this case system planners must decide what to implement, when and at what capacity. Choices can range from infrastructure to policies or a mix of the two, culminating in a complex planning problem. Decision making under uncertainty frameworks can be used to help planners with this planning problem. This presentation introduces, applies and compares four decision making under uncertainty frameworks. The application is to the Thames basin water resource system which includes the city of London. The approaches covered here include least-economic cost capacity expansion optimization (EO), Robust Decision Making (RDM), Info-Gap Decision Theory (Info-gap) and many-objective evolutionary optimization (MOEO). EO searches for the least-economic cost program, i.e. the timing, sizing, and choice of supply-demand management actions/upgrades which meet projected water demands. Instead of striving for optimality, the RDM and Info-gap approaches help build plans that are robust to 'deep' uncertainty in future conditions. The MOEO framework considers multiple performance criteria and uses water systems simulators as a function evaluator for the evolutionary algorithm. Visualizations show Pareto approximate tradeoffs between multiple objectives. In this presentation we detail the application of each framework to the Thames basin (including London) water resource planning problem. Supply and demand options are proposed by the major water companies in the basin. We apply the EO method using a 29 year time horizon and an annual time step considering capital, operating (fixed and variable), social and environmental costs. The method considers all plausible combinations of supply and conservation schemes and capacities proposed by water

  13. Quantification of uncertainty in first-principles predicted mechanical properties of solids: Application to solid ion conductors

    NASA Astrophysics Data System (ADS)

    Ahmad, Zeeshan; Viswanathan, Venkatasubramanian

    2016-08-01

    Computationally guided material discovery is being increasingly employed using descriptor-based screening through the calculation of a few properties of interest. A precise understanding of the uncertainty associated with first-principles density functional theory calculated property values is important for the success of descriptor-based screening. The Bayesian error estimation approach has been built into several recently developed exchange-correlation functionals, which allows an estimate of the uncertainty associated with properties related to the ground state energy, for example, adsorption energies. Here, we propose a robust and computationally efficient method for quantifying uncertainty in mechanical properties, which depend on the derivatives of the energy. The procedure involves calculating energies around the equilibrium cell volume with different strains and fitting the obtained energies to the corresponding energy-strain relationship. At each strain, we use, instead of a single energy, an ensemble of energies, giving us an ensemble of fits and thereby an ensemble of mechanical properties associated with each fit, whose spread can be used to quantify the uncertainty. The generation of the ensemble of energies is only a post-processing step involving a perturbation of the parameters of the exchange-correlation functional and solving for the energy non-self-consistently. The proposed method is computationally efficient and provides a more robust uncertainty estimate compared to the approach of self-consistent calculations employing several different exchange-correlation functionals. We demonstrate the method by calculating the uncertainty bounds for several materials belonging to different classes and having different structures. We show that the calculated uncertainty bounds the property values obtained using three different GGA functionals: PBE, PBEsol, and RPBE. Finally, we apply the approach to calculate the uncertainty
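A toy version of the proposed procedure: fit an energy-strain curve for each member of an ensemble of perturbed energies and take the spread of the fitted curvatures as the uncertainty. Everything below (noise scale, "true" curvature, three-point fit) is a made-up illustration of the idea, not the paper's DFT workflow:

```python
import random

def curvature_three_point(e, y_minus, y_zero, y_plus):
    """Curvature a of the parabola through (-e, y_minus), (0, y_zero),
    (+e, y_plus); for E(s) ~ E0 + a*s^2, a is proportional to a modulus."""
    return (y_plus + y_minus - 2.0 * y_zero) / (2.0 * e ** 2)

# Hypothetical ensemble: perturb the three energies to mimic an ensemble
# of exchange-correlation parameters (noise scale and 'true' values are
# invented for illustration).
random.seed(0)
e = 0.01                      # strain
a_true, E0 = 1.2, -100.0      # curvature (eV/strain^2) and equilibrium energy (eV)
fits = []
for _ in range(200):
    n0, n1, n2 = (random.gauss(0.0, 1e-5) for _ in range(3))
    fits.append(curvature_three_point(
        e,
        E0 + a_true * e ** 2 + n0,   # E(-e) plus ensemble noise
        E0 + n1,                     # E(0)  plus ensemble noise
        E0 + a_true * e ** 2 + n2))  # E(+e) plus ensemble noise

mean_a = sum(fits) / len(fits)
spread_a = (sum((a - mean_a) ** 2 for a in fits) / len(fits)) ** 0.5
# spread_a is the ensemble-based uncertainty estimate on the modulus.
```

Because the perturbed energies are evaluated non-self-consistently, generating such an ensemble costs little more than the single baseline calculation.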

  14. Zn(x)Cd(1-x)Se nanomultipods with tunable band gaps: synthesis and first-principles calculations.

    PubMed

    Wei, Hao; Su, Yanjie; Han, Ziyi; Li, Tongtong; Ren, Xinglong; Yang, Zhi; Wei, Liangming; Cong, Fengsong; Zhang, Yafei

    2013-06-14

    In this paper, we demonstrate that ZnxCd1-xSe nanomultipods can be synthesized via a facile and nontoxic solution-based method. Their composition, morphology, and optical properties were explored in depth. The value of Zn/(Zn+Cd) could be varied across the entire range from 0.08 to 0.86 by adjusting the ratio of cation precursor contents. The band gap energy could be linearly tuned from 1.88 to 2.48 eV with respect to the value of Zn/(Zn+Cd). The experiments also showed that oleylamine played a dominant role in the formation of the multipod structure, and a possible growth mechanism is suggested. First-principles calculations of the band gap energy and density of states using the Vienna ab initio simulation package were performed to verify the experimental trend of the band gap. Computational results indicated that differences in electronic band structures and orbital constitutions determine the tunable band gap of the as-synthesized nanomultipods, which may be promising for versatile applications in solar cells, biomedicine, sensors, catalysts, and related areas.

  15. DCS: A Case Study of Identification of Knowledge and Disposition Gaps Using Principles of Continuous Risk Management

    NASA Technical Reports Server (NTRS)

    Norcross, Jason; Steinberg, Susan; Kundrot, Craig; Charles, John

    2011-01-01

    The Human Research Program (HRP) is formulated around the program architecture of Evidence-Risk-Gap-Task-Deliverable. Review of accumulated evidence forms the basis for identification of high-priority risks to human health and performance in space exploration. Gaps in knowledge or disposition are identified for each risk, and a portfolio of research tasks is developed to fill them. Deliverables from the tasks inform the evidence base, with the ultimate goal of defining the level of risk and reducing it to an acceptable level. A comprehensive framework for gap identification, focus, and metrics has been developed based on principles of continuous risk management and clinical care. Research on knowledge gaps improves understanding of the likelihood, consequence, or timeframe of the risk. Disposition gaps include development of standards or requirements for risk acceptance, development of countermeasures or technology to mitigate the risk, and a yearly technology assessment to watch developments related to the risk. Standard concepts from clinical care (prevention, diagnosis, treatment, monitoring, rehabilitation, and surveillance) can be used to focus gaps dealing with risk mitigation. The research plan for the new HRP Risk of Decompression Sickness (DCS) used the framework to identify one disposition gap related to establishment of a DCS standard for acceptable risk, two knowledge gaps related to DCS phenomena and mission attributes, and three mitigation gaps focused on prediction, prevention, and new technology watch. These gaps were organized in this manner primarily based on target for closure and ease of organizing interim metrics, so that gap status could be quantified. Additional considerations for the knowledge gaps were that one was highly specific to a design reference mission while the other was focused on DCS phenomena.

  16. Exploring the charge localization and band gap opening of borophene: a first-principles study.

    PubMed

    Kistanov, Andrey A; Cai, Yongqing; Zhou, Kun; Srikanth, Narasimalu; Dmitriev, Sergey V; Zhang, Yong-Wei

    2018-01-18

    Recently synthesized two-dimensional (2D) boron, borophene, exhibits a novel metallic behavior rooted in the s-p orbital hybridization, distinctively different from other 2D materials such as sulfides/selenides and semi-metallic graphene. This unique feature of borophene implies new routes for charge delocalization and band gap opening. Herein, using first-principles calculations, we explore the routes to localize the carriers and open the band gap of borophene via chemical functionalization, ribbon construction, and defect engineering. The metallicity of borophene is found to be remarkably robust against H- and F-functionalization and the presence of vacancies. Interestingly, a strong odd-even oscillation of the electronic structure with width is revealed for H-functionalized borophene nanoribbons, while an ultra-high work function (∼7.83 eV) is found for the F-functionalized borophene due to its strong charge transfer to the atomic adsorbates.

  17. Thermodynamics of a class of regular black holes with a generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Maluf, R. V.; Neves, Juliano C. S.

    2018-05-01

    In this article, we present a study of the thermodynamics of a class of regular black holes that includes the Bardeen and Hayward solutions. We obtain thermodynamic quantities such as the Hawking temperature, entropy, and heat capacity for the entire class. As part of an effort to identify a physical observable that distinguishes regular black holes from singular black holes, we suggest that regular black holes are colder than singular black holes. Moreover, in contrast to the Schwarzschild black hole, this class of regular black holes may be thermodynamically stable. From a generalized uncertainty principle, we also obtain the quantum-corrected thermodynamics for the studied class. These quantum corrections provide a logarithmic term in the quantum-corrected entropy.

  18. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    NASA Astrophysics Data System (ADS)

    Deffner, Sebastian; Campbell, Steve

    2017-11-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.

  19. Band-gap bowing and p-type doping of (Zn, Mg, Be)O wide-gap semiconductor alloys: a first-principles study

    NASA Astrophysics Data System (ADS)

    Shi, H.-L.; Duan, Y.

    2008-12-01

    Using a first-principles band-structure method and a special quasirandom structure (SQS) approach, we systematically calculate the band-gap bowing parameters and p-type doping properties of (Zn, Mg, Be)O-related random ternary and quaternary alloys. We show that the bowing parameters for ZnBeO and MgBeO alloys are large and composition dependent, owing to the size difference and chemical mismatch between Be and Zn(Mg) atoms. We also demonstrate that adding a small amount of Be into MgO reduces the band gap, indicating that the bowing parameter is larger than the band-gap difference. We select the N atom, with its lower p atomic energy level, as the dopant for p-type doping of ZnBeO and ZnMgBeO alloys. For N doped in ZnBeO alloy, we show that the acceptor transition energies become shallower as the number of nearest-neighbor Be atoms increases, which is attributed to the reduction of p-d repulsion. The N_O acceptor transition energies are deep in the ZnMgBeO quaternary alloy lattice-matched to a GaN substrate due to the lower valence-band maximum; they decrease slightly as more nearest-neighbor Mg atoms surround the N dopant. The natural valence-band alignment between ZnO, MgO, BeO, ZnBeO, and the ZnMgBeO quaternary alloy is also investigated.

  20. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    PubMed

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This caters for important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and relative terms (%). An expanded measurement uncertainty of 12 μmol/L associated with concentrations of creatinine below 120 μmol/L and of 10% associated with concentrations above 120 μmol/L was estimated. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty of the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
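    The two-component evaluation described above can be sketched in a few lines. The combination rule (u_c = sqrt(u_Rw² + u_bias²), expanded with coverage factor k = 2) follows the Nordtest approach; the component values below are hypothetical, chosen only so that the combined results land near the expanded uncertainties reported in the abstract (about 12 μmol/L and 10%):

```python
import math

def expanded_uncertainty(u_rw, u_bias, k=2.0):
    """Combine within-laboratory reproducibility and bias uncertainty,
    then expand with coverage factor k (Nordtest-style evaluation)."""
    u_c = math.sqrt(u_rw**2 + u_bias**2)
    return k * u_c

# Hypothetical components for serum creatinine (illustrative values only)
U_abs = expanded_uncertainty(u_rw=4.5, u_bias=4.0)   # μmol/L, low-concentration range
U_rel = expanded_uncertainty(u_rw=3.8, u_bias=3.2)   # %, high-concentration range
```
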

  1. Comparison of Classical and Quantum Mechanical Uncertainties.

    ERIC Educational Resources Information Center

    Peslak, John, Jr.

    1979-01-01

    Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)

  2. Organizing principles as tools for bridging the gap between system theory and biological experimentation.

    PubMed

    Mekios, Constantinos

    2016-04-01

    Twentieth-century theoretical efforts towards the articulation of general system properties fell short of having the significant impact on biological practice that their proponents envisioned. Although the latter did arrive at preliminary mathematical formulations of such properties, they had little success in showing how these could be productively incorporated into the research agenda of biologists. Consequently, the gap that kept system-theoretic principles cut off from biological experimentation persisted. More recently, however, simple theoretical tools have proved readily applicable within the context of systems biology. In particular, examples reviewed in this paper suggest that rigorous mathematical expressions of design principles, imported primarily from engineering, can produce experimentally confirmable predictions of the regulatory properties of small biological networks. But this is not enough for contemporary systems biologists who adopt the holistic aspirations of early systemologists, seeking high-level organizing principles that could provide insights into problems of biological complexity at the whole-system level. While the presented evidence is not conclusive about whether this strategy could lead to the realization of the lofty goal of a comprehensive explanatory integration, it suggests that the ongoing quest for organizing principles is pragmatically advantageous for systems biologists. The formalisms postulated in the course of this process can serve as bridges between system-theoretic concepts and the results of molecular experimentation: they constitute theoretical tools for generalizing molecular data, thus producing increasingly accurate explanations of system-wide phenomena.

  3. Quantum theory of the generalised uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bruneton, Jean-Philippe; Larena, Julien

    2017-04-01

    We significantly extend previous work on the Hilbert space representations of the generalized uncertainty principle (GUP) in 3+1 dimensions of the form [X_i, P_j] = i F_ij, where F_ij = f(P²) δ_ij + g(P²) P_i P_j for arbitrary functions f and g, restricting our study to the case of commuting X's. We focus in particular on the symmetries of the theory, and on the minimal length that emerges in some cases. We first show that, at the algebraic level, there exists an unambiguous mapping between the GUP with a deformed quantum algebra and a quadratic Hamiltonian, and a standard Heisenberg algebra of operators with an aquadratic Hamiltonian, provided the boost sector of the symmetries is modified accordingly. The theory can also be mapped to a completely standard quantum mechanics with standard symmetries, but with momentum-dependent position operators. Next, we investigate the Hilbert space representations of these algebraically equivalent models, focusing specifically on whether they exhibit a minimal length. We carry out the functional analysis of the various operators involved, and show that the appearance of a minimal length critically depends on the relationship between the generators of translations and the physical momenta. In particular, because this relationship is preserved by the algebraic mapping presented in this paper, when a minimal length is present in the standard GUP, it is also present in the corresponding aquadratic-Hamiltonian formulation, despite the perfectly standard algebra of this model. In general, a minimal length requires bounded generators of translations, i.e. a specific kind of quantization of space, which depends on the precise shape of the function f defined above. This result provides an elegant and unambiguous classification of which universal quantum gravity corrections lead to the emergence of a minimal length.
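    As a hedged numerical aside (a single special case, not the general f, g framework of the paper): the simplest GUP, [X, P] = iħ(1 + βP²), implies ΔX ≥ (ħ/2)(1/ΔP + βΔP) for states with ⟨P⟩ = 0, and minimizing over ΔP yields the minimal length ħ√β. The value of β below is an arbitrary illustrative choice:

```python
import numpy as np

hbar, beta = 1.0, 0.04   # beta: hypothetical GUP deformation parameter

def dx_lower_bound(dp):
    """Lower bound on ΔX from ΔXΔP >= (hbar/2)(1 + beta ΔP²), taking <P> = 0."""
    return (hbar / 2) * (1.0 / dp + beta * dp)

# Scan ΔP and locate the minimum of the bound numerically
dps = np.linspace(0.1, 50.0, 20000)
dx_min = dx_lower_bound(dps).min()
# Analytic result: ΔX_min = hbar * sqrt(beta), attained at ΔP = 1/sqrt(beta)
```
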

  4. Experimental and first-principles calculation study of the pressure-induced transitions to a metastable phase in GaPO4 and in the solid solution AlPO4-GaPO4

    NASA Astrophysics Data System (ADS)

    Angot, E.; Huang, B.; Levelut, C.; Le Parc, R.; Hermet, P.; Pereira, A. S.; Aquilanti, G.; Frapper, G.; Cambon, O.; Haines, J.

    2017-08-01

    α-Quartz-type gallium phosphate and representative compositions in the AlPO4-GaPO4 solid solution were studied by x-ray powder diffraction and absorption spectroscopy, Raman scattering, and first-principles calculations up to pressures of close to 30 GPa. A phase transition to a metastable orthorhombic high-pressure phase, along with some of the stable orthorhombic Cmcm CrVO4-type material, is found to occur beginning at 9 GPa at 320 °C in GaPO4. In the case of the AlPO4-GaPO4 solid solution at room temperature, only the metastable orthorhombic phase was obtained above 10 GPa. The possible crystal structures of the high-pressure forms of GaPO4 were predicted from first-principles calculations and the evolutionary algorithm USPEX. A predicted orthorhombic structure with space group Pmn2_1, with gallium in sixfold and phosphorus in fourfold coordination, was found to be in the best agreement with the combined experimental data from x-ray diffraction and absorption and Raman spectroscopy. This method is found to be very powerful for better understanding the competition between different phase-transition pathways at high pressure.

  5. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation, in order to include all of the possible sources of uncertainty related to the CI model, together with the assumptions and gaps of the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model-factor hypotheses. The results are discussed in terms of the level and type of reliable information and insights they provide to decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; therefore, it constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.

  6. Two new kinds of uncertainty relations

    NASA Technical Reports Server (NTRS)

    Uffink, Jos

    1994-01-01

    We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.
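    One widely used generalized entropic relation, the Maassen-Uffink bound H(X) + H(Z) ≥ -2 log₂ c (stated here as standard background, not quoted from the abstract), can be verified for a qubit measured in the σz and σx bases, where the maximal basis overlap is c = 1/√2 and the bound equals one bit. The state below is an arbitrary example:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Arbitrary pure qubit state
psi = np.array([np.cos(0.3), np.sin(0.3) * np.exp(0.7j)])

z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

pz = np.array([abs(np.vdot(b, psi))**2 for b in z_basis])
px = np.array([abs(np.vdot(b, psi))**2 for b in x_basis])

c = max(abs(np.vdot(zb, xb)) for zb in z_basis for xb in x_basis)  # = 1/sqrt(2)
bound = -2 * np.log2(c)                                            # = 1 bit

total = shannon(pz) + shannon(px)   # always >= bound
```
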

  7. An uncertainty budget for VHF and UHF reflectometers

    NASA Astrophysics Data System (ADS)

    Ridler, N. M.; Medley, C. J.

    1992-05-01

    Details of the derivation of an uncertainty budget for one-port immittance or complex voltage reflection coefficient measuring instruments, operating at VHF and UHF in the 14 mm 50 ohm coaxial line size, are reported. The principles of the uncertainty budget are given along with experimental results obtained using six-port reflectometers and a network analyzer as the measuring instruments. Details of the types of calibration for which the uncertainty budget is suitable are reported. Various aspects of the uncertainty budget are considered, and general principles and the treatment of the type A and type B contributions are discussed. Experimental results obtained using the uncertainty budget are given. A summary of uncertainties for the six-port reflectometers and the HP8753B automatic network analyzer is also given.

  8. Generalized uncertainty principle impact onto the black holes information flux and the sparsity of Hawking radiation

    NASA Astrophysics Data System (ADS)

    Alonso-Serrano, Ana; Dąbrowski, Mariusz P.; Gohar, Hussain

    2018-02-01

    We investigate the generalized uncertainty principle (GUP) corrections to the entropy content and the information flux of black holes, as well as the corrections to the sparsity of the Hawking radiation at the late stages of evaporation. We find that due to these quantum gravity motivated corrections, the entropy flow per particle reduces its value on the approach to the Planck scale due to a better accuracy in counting the number of microstates. We also show that the radiation flow is no longer sparse when the mass of a black hole approaches Planck mass which is not the case for non-GUP calculations.

  9. Visible-light absorption and large band-gap bowing of GaN1-xSbx from first principles

    DOE PAGES

    Sheetz, R. Michael; Richter, Ernst; Andriotis, Antonis N.; ...

    2011-08-01

    Applicability of the Ga(Sbx)N1-x alloys for practical realization of photoelectrochemical water splitting is investigated using first-principles density functional theory incorporating the local density approximation and the generalized gradient approximation plus the Hubbard U parameter formalism. Our calculations reveal that a relatively small concentration of Sb impurities is sufficient to achieve a significant narrowing of the band gap, enabling absorption of visible light. Theoretical results predict that Ga(Sbx)N1-x alloys with 2-eV band gaps straddle the potential window at moderate to low pH values, thus indicating that dilute Ga(Sbx)N1-x alloys could be potential candidates for splitting water under visible light irradiation.

  10. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data, cross-section uncertainty data, how k-eff sensitivity data and k-eff uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.
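    The combination of k-eff sensitivities with cross-section covariance data is commonly summarized by the first-order "sandwich rule", (Δk/k)² = S C Sᵀ. The sketch below uses entirely hypothetical sensitivity and covariance values for illustration:

```python
import numpy as np

# Hypothetical relative sensitivities of k-eff to three cross sections
S = np.array([0.35, -0.12, 0.08])

# Hypothetical relative covariance matrix of those cross sections
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])

var_k = S @ C @ S          # sandwich rule: (Δk/k)² = S C Sᵀ
dk_over_k = np.sqrt(var_k)  # relative uncertainty on k-eff
```

    Note how the negative sensitivity and the positive cross-covariance partially cancel, reducing the propagated variance below the sum of the uncorrelated terms.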

  11. Diversifying the composition and structure of managed late-successional forests with harvest gaps: What is the optimal gap size?

    Treesearch

    Christel C. Kern; Anthony W. D’Amato; Terry F. Strong

    2013-01-01

    Managing forests for resilience is crucial in the face of uncertain future environmental conditions. Because harvest gap size alters the species diversity and vertical and horizontal structural heterogeneity, there may be an optimum range of gap sizes for conferring resilience to environmental uncertainty. We examined the impacts of different harvest gap sizes on...

  12. Orbiter Gap Filler Bending Model for Re-entry

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.

    2007-01-01

    Pressure loads on a protruding gap filler during an Orbiter re-entry are investigated to evaluate the likelihood of extraction due to pressure loads and to ascertain how much bending will be induced by re-entry pressure loads. Oblique shock wave theory is utilized to develop a representation of the pressure loads induced on a gap filler for the ISSHVFW trajectory, representative of a heavy-weight ISS return. A free-body diagram is utilized to react the forces induced by the pressure loads. Preliminary results developed using these methods demonstrate that pressure loads alone are not likely causes of gap filler extraction during re-entry. An assessment of how far a gap filler will bend over is presented. Implications of gap filler bending during re-entry include possible mitigation of early boundary-layer transition concerns, uncertainty in ground-based measurement of protruding gap fillers from historical Orbiter flight history, and uncertainty in the use of Orbiter gap fillers for boundary-layer prediction calibration.

  13. The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.

    PubMed

    Antonopoulou, Lila; van Meurs, Philip

    2003-11-01

    The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adoption of preventive measures, but, as is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty, as the preventive principle does, but is itself generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof based on the "cause-effect" model has been produced. The main part of the study examines three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges: the case of "mad cow" disease and the case of the production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health.

  14. Info-Gap robustness pathway method for transitioning of urban drainage systems under deep uncertainties.

    PubMed

    Zischg, Jonatan; Goncalves, Mariana L R; Bacchin, Taneha Kuzniecow; Leonhardt, Günther; Viklander, Maria; van Timmeren, Arjan; Rauch, Wolfgang; Sitzenfrei, Robert

    2017-09-01

    In the urban water cycle, there are different ways of handling stormwater runoff. Traditional systems mainly rely on underground piped, sometimes named 'gray' infrastructure. New and so-called 'green/blue' ambitions aim for treating and conveying the runoff at the surface. Such concepts are mainly based on ground infiltration and temporal storage. In this work a methodology to create and compare different planning alternatives for stormwater handling on their pathways to a desired system state is presented. Investigations are made to assess the system performance and robustness when facing the deeply uncertain spatial and temporal developments in the future urban fabric, including impacts caused by climate change, urbanization and other disruptive events, like shifts in the network layout and interactions of 'gray' and 'green/blue' structures. With the Info-Gap robustness pathway method, three planning alternatives are evaluated to identify critical performance levels at different stages over time. This novel methodology is applied to a real case study problem where a city relocation process takes place during the upcoming decades. In this case study it is shown that hybrid systems including green infrastructures are more robust with respect to future uncertainties, compared to traditional network design.
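    A minimal sketch of the Info-Gap robustness idea (a toy model, not the study's urban-drainage model): for each planning alternative, find the largest uncertainty horizon α whose worst case still meets a required performance level. All capacities, rainfall values, and thresholds below are invented for illustration:

```python
def performance(design_capacity, rainfall):
    """Fraction of runoff the system conveys (simple illustrative model)."""
    return min(1.0, design_capacity / rainfall)

def robustness(design_capacity, nominal_rain, threshold, alphas):
    """Info-Gap robustness: largest horizon alpha whose worst case still meets the threshold."""
    h_hat = 0.0
    for a in alphas:
        worst_rain = nominal_rain * (1 + a)   # worst case within the horizon-alpha set
        if performance(design_capacity, worst_rain) >= threshold:
            h_hat = a
        else:
            break
    return h_hat

alphas = [i / 100 for i in range(0, 101)]
h_gray = robustness(design_capacity=60, nominal_rain=50, threshold=0.9, alphas=alphas)
h_green = robustness(design_capacity=75, nominal_rain=50, threshold=0.9, alphas=alphas)
```

    In this toy setting the higher-capacity alternative tolerates a larger horizon of rainfall uncertainty, mirroring the kind of robustness comparison between planning alternatives made in the study.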

  15. Uncertainty relations with the generalized Wigner-Yanase-Dyson skew information

    NASA Astrophysics Data System (ADS)

    Fan, Yajing; Cao, Huaixin; Wang, Wenhua; Meng, Huixian; Chen, Liang

    2018-07-01

    The uncertainty principle in quantum mechanics is a fundamental relation with different forms, including Heisenberg's uncertainty relation and Schrödinger's uncertainty relation. We introduce the generalized Wigner-Yanase-Dyson correlation and the related quantities. Various properties of them are discussed. Finally, we establish several generalizations of uncertainty relation expressed in terms of the generalized Wigner-Yanase-Dyson skew information.
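    For the special case λ = 1/2, i.e. the original Wigner-Yanase skew information I(ρ, A) = -½ Tr([√ρ, A]²) rather than the generalized Dyson form studied in the paper, the quantity is easy to compute directly; the qubit states below are arbitrary examples:

```python
import numpy as np

def psd_sqrt(rho):
    """Matrix square root of a positive semi-definite density matrix."""
    w, v = np.linalg.eigh(rho)
    return (v * np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def skew_information(rho, A):
    """Wigner-Yanase skew information I(rho, A) = -1/2 Tr([sqrt(rho), A]^2)."""
    s = psd_sqrt(rho)
    comm = s @ A - A @ s
    return float(np.real(-0.5 * np.trace(comm @ comm)))

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

rho_mixed = np.diag([0.75, 0.25]).astype(complex)   # mixed state, diagonal in sz
rho_pure = np.diag([1.0, 0.0]).astype(complex)      # pure state |0><0|

I_comm = skew_information(rho_mixed, sz)  # vanishes: rho commutes with sz
I_x = skew_information(rho_mixed, sx)     # positive: sx does not commute with rho
I_pure = skew_information(rho_pure, sx)   # for pure states equals the variance of A
```
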

  16. The Irrelevance of the Risk-Uncertainty Distinction.

    PubMed

    Roser, Dominic

    2017-10-01

    Precautionary Principles are often said to be appropriate for decision-making in contexts of uncertainty such as climate policy. Contexts of uncertainty are contrasted to contexts of risk depending on whether we have probabilities or not. Against this view, I argue that the risk-uncertainty distinction is practically irrelevant. I start by noting that the history of the distinction between risk and uncertainty is more varied than is sometimes assumed. In order to examine the distinction, I unpack the idea of having probabilities, in particular by distinguishing three interpretations of probability: objective, epistemic, and subjective probability. I then claim that if we are concerned with whether we have probabilities at all, regardless of how low their epistemic credentials are, then we almost always have probabilities for policy-making. The reason is that subjective and epistemic probability are the relevant interpretations of probability and we almost always have subjective and epistemic probabilities. In contrast, if we are only concerned with probabilities that have sufficiently high epistemic credentials, then we obviously do not always have probabilities. Climate policy, for example, would then be a case of decision-making under uncertainty. But, so I argue, we should not dismiss probabilities with low epistemic credentials. Rather, when they are the best available probabilities our decision principles should make use of them. And, since they are almost always available, the risk-uncertainty distinction remains irrelevant.

  17. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    PubMed

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction.
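    Gabor's tradeoff can be made concrete: for a Gaussian pulse, the product of RMS time width and RMS bandwidth attains the limit 1/(4π), and no signal can do better. The sketch below (sampling rate and pulse width are arbitrary choices) estimates both widths numerically:

```python
import numpy as np

fs = 2000.0                                   # sampling rate (Hz), arbitrary
t = np.arange(-2.0, 2.0, 1.0 / fs)
sig_t_target = 0.05                           # target RMS duration (s), arbitrary
g = np.exp(-t**2 / (4 * sig_t_target**2))     # Gaussian pulse: |g|^2 has RMS width 0.05 s

def rms_width(x, power):
    """RMS width of a distribution sampled on grid x with weights `power`."""
    p = power / power.sum()
    mean = (x * p).sum()
    return np.sqrt((((x - mean)**2) * p).sum())

sigma_t = rms_width(t, np.abs(g)**2)

# Spectrum and its RMS bandwidth
G = np.fft.fftshift(np.fft.fft(g))
f = np.fft.fftshift(np.fft.fftfreq(t.size, 1.0 / fs))
sigma_f = rms_width(f, np.abs(G)**2)

product = sigma_t * sigma_f   # Gabor limit: >= 1/(4*pi); Gaussians saturate it
```
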

  18. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of its results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Methods for Estimating the Uncertainty in Emergy Table-Form Models

    EPA Science Inventory

    Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...

  20. Practical Doping Principles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zunger, A.

    2003-05-01

    Theoretical investigations of doping of several wide-gap materials suggest a number of rather general, practical "doping principles" that may help guide experimental strategies for overcoming doping bottlenecks. This paper will be published as a journal article in the future.

  1. Generalized Entropic Uncertainty Relations with Tsallis' Entropy

    NASA Technical Reports Server (NTRS)

    Portesi, M.; Plastino, A.

    1996-01-01

    A generalization of the entropic formulation of the Uncertainty Principle of Quantum Mechanics is considered with the introduction of the q-entropies recently proposed by Tsallis. The concomitant generalized measure is illustrated for the case of phase and number operators in quantum optics. Interesting results are obtained when making use of q-entropies as the basis for constructing generalized entropic uncertainty measures.
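    A minimal sketch of the q-entropy itself (the definition, not the paper's phase/number analysis): S_q = (1 - Σᵢ pᵢ^q)/(q - 1), which recovers the Shannon entropy in the limit q → 1. The distribution below is an arbitrary example:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy S_q = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())   # Shannon limit q -> 1
    return float((1.0 - (p**q).sum()) / (q - 1.0))

p = np.array([0.5, 0.3, 0.2])
S2 = tsallis_entropy(p, 2.0)          # 1 - sum(p^2) = 0.62
S_shannon = tsallis_entropy(p, 1.0)
S_near1 = tsallis_entropy(p, 1.0001)  # approaches the Shannon value
```
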

  2. Principles of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Landé, Alfred

    2013-10-01

    Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ(x) and σ(p); 11. Complementarity; 12. Mathematical relation between ρ(x) and σ(p) for free particles; 13. General relation between ρ(q) and σ(p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ(t) and σ(ε); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψp(q) and Xq(p); 39. Differential equation for φβ(q); 40. The general probability amplitude Φβ'(Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schr

  3. First principles study of the electronic properties and band gap modulation of two-dimensional phosphorene monolayer: Effect of strain engineering

    NASA Astrophysics Data System (ADS)

    Phuc, Huynh V.; Hieu, Nguyen N.; Ilyasov, Victor V.; Phuong, Le T. T.; Nguyen, Chuong V.

    2018-06-01

    The effect of strain on the structural and electronic properties of monolayer phosphorene is studied using first-principles calculations based on density functional theory. The intra- and inter-bond lengths and the bond angle of monolayer phosphorene are also evaluated; they show opposite tendencies under different directions of the applied strain. At the equilibrium state, monolayer phosphorene is a semiconductor with a direct band gap of 0.91 eV at the Γ-point. A direct-indirect band-gap transition is found in monolayer phosphorene when compressive or tensile strain is applied along both the zigzag and armchair directions. Under compressive strain, a semiconductor-metal transition for monolayer phosphorene is observed at -13% and -10% along the armchair and zigzag directions, respectively. The direct-indirect and semiconductor-metal transitions will strongly affect the application of monolayer phosphorene in electronic and optical devices.

  4. Making Invasion models useful for decision makers; incorporating uncertainty, knowledge gaps, and decision-making preferences

    Treesearch

    Denys Yemshanov; Frank H Koch; Mark Ducey

    2015-01-01

    Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker’s perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...

  5. First Principles Electronic Structure of Mn doped GaAs, GaP, and GaN Semiconductors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulthess, Thomas C; Temmerman, Walter M; Szotek, Zdzislawa

    We present first-principles electronic structure calculations of Mn-doped III-V semiconductors based on the local spin-density approximation (LSDA) as well as the self-interaction corrected local spin density method (SIC-LSD). We find that it is crucial to use a self-interaction free approach to properly describe the electronic ground state. The SIC-LSD calculations predict the proper electronic ground state configuration for Mn in GaAs, GaP, and GaN. Excellent quantitative agreement with experiment is found for the magnetic moment and p-d exchange in (GaMn)As. These results allow us to validate commonly used models for magnetic semiconductors. Furthermore, we discuss the delicate problem of extracting binding energies of localized levels from density functional theory calculations. We propose three approaches to take into account final state effects to estimate the binding energies of the Mn-d levels in GaAs. We find good agreement between computed values and estimates from photoemission experiments.

  6. Niels Bohr's discussions with Albert Einstein, Werner Heisenberg, and Erwin Schroedinger: the origins of the principles of uncertainty and complementarity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehra, J.

    1987-05-01

    In this paper, the main outlines of the discussions between Niels Bohr and Albert Einstein, Werner Heisenberg, and Erwin Schroedinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory, formulated in fall 1926 by Dirac, London, and Jordan, Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such, formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930, were continued during the next decades. All these aspects are briefly summarized.

  7. A Strategy for Uncertainty Visualization Design

    DTIC Science & Technology

    2009-10-01

    143–156, Magdeburg, Germany. [11] Thomson, J., Hetzler, E., MacEachren, A., Gahegan, M. and Pavel, M. (2005), A Typology for Visualizing Uncertainty... and Stasko [20] to bridge analytic gaps in visualization design, when tasks in the strategy overlap (and therefore complement) design frameworks

  8. On entropic uncertainty relations in the presence of a minimal length

    NASA Astrophysics Data System (ADS)

    Rastegin, Alexey E.

    2017-07-01

    Entropic uncertainty relations for the position and momentum within the generalized uncertainty principle are examined. Studies of this principle are motivated by the existence of a minimal observable length. The position and momentum operators then satisfy a modified commutation relation, for which more than one algebraic representation is known. One of them is described by an auxiliary momentum such that the momentum and coordinate wave functions are connected by the Fourier transform. However, the probability density functions of the physically true and auxiliary momenta are different. As the corresponding entropies differ, known entropic uncertainty relations are changed. Using differential Shannon entropies, we give a state-dependent formulation with a correction term. State-independent uncertainty relations are obtained in terms of the Rényi entropies and the Tsallis entropies with binning. Such relations allow one to take into account the finiteness of the measurement resolution.
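For reference, a minimal sketch of the relations being modified. The entropic relation is the standard Bialynicki-Birula and Mycielski inequality; the one-parameter commutator below is the commonly used form of the generalized uncertainty principle and is an assumption of this sketch, not taken from the abstract:

```latex
% Standard entropic uncertainty relation for position and momentum,
% in terms of differential Shannon entropies h(x) and h(p):
h(x) + h(p) \ge \ln(e\pi\hbar)
% One-parameter generalized uncertainty principle (beta > 0 is the
% deformation parameter):
[\hat{X}, \hat{P}] = i\hbar\,\bigl(1 + \beta \hat{P}^2\bigr)
% which implies a minimal observable length of order
\Delta X_{\min} = \hbar\sqrt{\beta}
```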

  9. A Hierarchical Multi-Model Approach for Uncertainty Segregation, Prioritization and Comparative Evaluation of Competing Modeling Propositions

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Elshall, A. S.; Hanor, J. S.

    2012-12-01

    Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental constructs such as mathematical expressions on the one hand, and empirical observations such as observation data on the other, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied the HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainties. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures with respect to the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models. 
The results showed that by segregating different sources of uncertainty, HBMA analysis
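The count of 24 calibrated models follows directly from the cross product of the competing propositions (3 variograms × 2 stationarity assumptions × 2 fault structures × 2 calibration data sets). A minimal sketch; the proposition labels are hypothetical, since the abstract gives only the counts:

```python
from itertools import product

# Hypothetical labels for the four uncertain model components; the abstract
# reports only how many propositions each component has.
variograms = ["variogram_1", "variogram_2", "variogram_3"]   # 3 mathematical structures
stationarity = ["stationary", "non_stationary"]              # 2 geological assumptions
fault_models = ["with_fault", "without_fault"]               # 2 structures (Denham Springs-Scotlandville fault)
calib_data = ["dataset_A", "dataset_B"]                      # 2 calibration data sets

# Every combination of propositions yields one candidate model to calibrate.
models = list(product(variograms, stationarity, fault_models, calib_data))
print(len(models))  # 3 * 2 * 2 * 2 = 24 calibrated models
```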

  10. Insights into water managers' perception and handling of uncertainties - a study of the role of uncertainty in practitioners' planning and decision-making

    NASA Astrophysics Data System (ADS)

    Höllermann, Britta; Evers, Mariele

    2017-04-01

    Planning and decision-making under uncertainty is common in water management due to climate variability, simplified models, societal developments, and planning restrictions, to name just a few factors. Dealing with uncertainty can be approached from two sides, each affecting the process and form of communication: either improve the knowledge base by reducing uncertainties, or apply risk-based approaches to acknowledge uncertainties throughout the management process. Current understanding is that science focusses more strongly on the former approach, while policy and practice more actively apply a risk-based approach to handle incomplete and/or ambiguous information. The focus of this study is on how water managers perceive and handle uncertainties at the knowledge/decision interface in their daily planning and decision-making routines, how they evaluate the role of uncertainties for their decisions, and how they integrate this information into the decision-making process. Expert interviews and questionnaires among practitioners and scientists provided an insight into their perspectives on uncertainty handling, allowing a comparison of strategies between science and practice as well as between different types of practitioners. Our results confirmed the practitioners' bottom-up approach, working upwards from potential measures rather than downwards from impact assessment as is common in science-based approaches. This science-practice gap may hinder effective integration and acknowledgement of uncertainty in final decisions. Additionally, the implementation of an adaptive and flexible management approach acknowledging uncertainties is often stalled by rigid regulations favouring a predict-and-control attitude. However, the study showed that practitioners' level of uncertainty recognition varies with their affiliation to type of employer and business unit, affecting the degree of the science-practice gap with respect to uncertainty recognition. The level of working

  11. Mechanics of Fluid-Filled Interstitial Gaps. I. Modeling Gaps in a Compact Tissue.

    PubMed

    Parent, Serge E; Barua, Debanjan; Winklbauer, Rudolf

    2017-08-22

    Fluid-filled interstitial gaps are a common feature of compact tissues held together by cell-cell adhesion. Although such gaps can in principle be the result of weak, incomplete cell attachment, adhesion is usually too strong for this to occur. Using a mechanical model of tissue cohesion, we show that, instead, a combination of local prevention of cell adhesion at three-cell junctions by fluidlike extracellular material and a reduction of cortical tension at the gap surface are sufficient to generate stable gaps. The size and shape of these interstitial gaps depend on the mechanical tensions between cells and at gap surfaces, and on the difference between intracellular and interstitial pressures, which is related to the volume of the interstitial fluid. As a consequence of the dependence on tension/tension ratios, the presence of gaps does not depend on the absolute strength of cell adhesion, and similar gaps are predicted to occur in tissues of widely differing cohesion. Tissue mechanical parameters can also vary within and between cells of a given tissue, generating asymmetrical gaps. Within limits, these can be approximated by symmetrical gaps. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  12. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    USGS Publications Warehouse

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
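The core idea, probabilistic inputs pushed through a stochastic transport step to yield per-cell output distributions, can be sketched as follows. This is a toy model, not the SPAN algorithms: the grid, decay factor, and distributions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 study area: per-cell service source strength is given as a
# mean and standard deviation instead of a single deterministic value.
source_mean = np.array([[1.0, 0.5], [0.2, 0.0]])
source_sd = np.array([[0.10, 0.30], [0.05, 0.0]])
decay = 0.7  # assumed fraction of the flow surviving transport to beneficiaries

# Monte Carlo propagation: sample the uncertain inputs many times and run the
# (here trivial) transport model on each draw.
n_draws = 5000
draws = rng.normal(source_mean, source_sd, size=(n_draws, 2, 2))
delivered = decay * draws.clip(min=0.0)

# Per-cell mean and spread of the delivered service: the spread map is what
# lets a decision maker see WHERE the model is uncertain.
mean_map = delivered.mean(axis=0)
sd_map = delivered.std(axis=0)
```

Cells with more uncertain inputs (here cell [0, 1]) show a wider output distribution, which is exactly the spatially visualized uncertainty the abstract describes.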

  13. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is
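The claim that error effects negligible in a single pixel can dominate at large scales is easy to demonstrate numerically: when pixels are averaged, independent (noise-like) errors shrink as 1/sqrt(N) while a fully correlated (systematic) error does not shrink at all. A minimal simulation, with invented uncertainty values:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels = 10_000
n_trials = 2_000

u_indep = 0.5   # per-pixel independent error (assumed value)
u_common = 0.1  # fully correlated error shared by all pixels (assumed value)

means = np.empty(n_trials)
for t in range(n_trials):
    noise = rng.normal(0.0, u_indep, n_pixels)  # fresh draw per pixel
    bias = rng.normal(0.0, u_common)            # one shared draw per trial
    means[t] = (noise + bias).mean()

# Uncertainty of the large-scale average: the independent part contributes
# u_indep / sqrt(n_pixels) = 0.005, so the 0.1 correlated error dominates.
print(means.std())  # ~ sqrt(u_common**2 + u_indep**2 / n_pixels) ~ 0.1
```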

  14. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. So too, robust population management methods were developed to deal with uncertainties in multiple-model parameters. However, the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our results showed that matrix sensitivities suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision to maintain a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
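The matrix-sensitivity step mentioned above follows standard demographic analysis: the asymptotic growth rate lambda is the dominant eigenvalue of the projection matrix, and the sensitivities of lambda to the matrix entries come from the left and right eigenvectors. A sketch with an invented two-stage matrix (the entries are illustrative, not the Mountain Plover estimates from the study):

```python
import numpy as np

# Hypothetical two-stage (juvenile, adult) projection matrix.
A = np.array([[0.0, 1.2],    # fecundity row (depends partly on nest survival)
              [0.4, 0.8]])   # juvenile-to-adult and adult survival

# Asymptotic growth rate lambda = dominant eigenvalue; right eigenvector w
# gives the stable stage distribution.
eigvals, right = np.linalg.eig(A)
i = int(np.argmax(eigvals.real))
lam = eigvals.real[i]
w = right[:, i].real

# Left eigenvector v gives reproductive values.
eigvals_t, left = np.linalg.eig(A.T)
j = int(np.argmax(eigvals_t.real))
v = left[:, j].real

# Sensitivity matrix: s_ij = v_i * w_j / <v, w>  (scale-invariant).
S = np.outer(v, w) / v.dot(w)
```

For this matrix lambda is 1.2, and lambda is three times more sensitive to adult survival (S[1, 1]) than to fecundity (S[0, 1]), mirroring the abstract's point that managing nest success alone may have a weak effect on population growth.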

  15. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the "distance," or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with "risk", which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed in the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent of a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models

  16. Addressing global uncertainty and sensitivity in first-principles based microkinetic models by an adaptive sparse grid approach

    NASA Astrophysics Data System (ADS)

    Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian

    2018-01-01

    In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.

  17. Momentum dependence of the superconducting gap and in-gap states in MgB 2 multiband superconductor

    DOE PAGES

    Mou, Daixiang; Jiang, Rui; Taufour, Valentin; ...

    2015-06-29

    We use tunable laser-based angle-resolved photoemission spectroscopy to study the electronic structure of the multiband superconductor MgB2. These results form the baseline for detailed studies of superconductivity in multiband systems. We find that the magnitude of the superconducting gap on both σ bands follows a BCS-like variation with temperature, with Δ0 ~ 7 meV. Furthermore, the value of the gap is isotropic within experimental uncertainty and in agreement with a pure s-wave pairing symmetry. We observe in-gap states confined to kF of the σ band that occur at some locations of the sample surface. The energy of this excitation, ~3 meV, was found to be somewhat larger than the previously reported gap on the π Fermi sheet, and therefore we cannot exclude the possibility of interband scattering as its origin.

  18. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
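Because PPD is a closed-form function of PMV, the kind of propagation the paper describes can be sketched with a simple Monte Carlo. The PPD expression below is the commonly quoted ISO 7730 formula; the PMV value and its standard uncertainty are invented for illustration:

```python
import math
import random

def ppd(pmv: float) -> float:
    # Predicted percentage dissatisfied as a function of predicted mean vote,
    # per the commonly quoted ISO 7730 expression.
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

# Hypothetical measurement result: PMV = 0.5 with standard uncertainty 0.1.
random.seed(42)
samples = [ppd(random.gauss(0.5, 0.1)) for _ in range(20_000)]
mean_ppd = sum(samples) / len(samples)
u_ppd = (sum((s - mean_ppd) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
```

At PMV = 0 the formula gives its minimum PPD of 5%, and the Monte Carlo spread `u_ppd` is the propagated standard uncertainty of the index.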

  19. Identifying gaps in conservation networks: of indicators and uncertainty in geographic-based analyses

    Treesearch

    Curtis H. Flather; Kenneth R. Wilson; Denis J. Dean; William C. McComb

    1997-01-01

    Mapping of biodiversity elements to expose gaps in conservation networks has become a common strategy in nature-reserve design. We review a set of critical assumptions and issues that influence the interpretation and implementation of gap analysis, including: (1) the assumption that a subset of taxa can be used to indicate overall diversity patterns, and (2) the...

  20. Role of information theoretic uncertainty relations in quantum theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz; ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin; Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk

    2015-04-15

    Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.

  1. Uncertainty for Part Density Determination: An Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Mario Orlando

    2016-12-14

    Accurate and precise density measurement by hydrostatic weighing requires the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
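The Monte Carlo alternative mentioned above can be sketched against the basic hydrostatic-weighing relation, which (neglecting the small buoyancy of air) gives the part density from its apparent weights in air and in water. All readings and uncertainty values below are invented for illustration:

```python
import random

RHO_WATER = 0.99820  # g/cm^3 near 20 degrees C (assumed value)

def part_density(w_air: float, w_water: float, rho_water: float = RHO_WATER) -> float:
    # Archimedes' principle: rho_part = rho_water * W_air / (W_air - W_water),
    # neglecting air buoyancy for simplicity.
    return rho_water * w_air / (w_air - w_water)

# Hypothetical balance readings (g) with standard uncertainty 0.001 g each;
# propagate by sampling the inputs and re-evaluating the measurement equation.
random.seed(0)
densities = [
    part_density(random.gauss(10.000, 0.001), random.gauss(6.000, 0.001))
    for _ in range(10_000)
]
mean_rho = sum(densities) / len(densities)
u_rho = (sum((d - mean_rho) ** 2 for d in densities) / (len(densities) - 1)) ** 0.5
```

The spread `u_rho` plays the role of the combined standard uncertainty that a GUM sensitivity analysis would assemble from the partial derivatives of the same equation.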

  2. The physical origins of the uncertainty theorem

    NASA Astrophysics Data System (ADS)

    Giese, Albrecht

    2013-10-01

    The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Albert Einstein's interpretation.

  3. On the relativity and uncertainty of distance, time, and energy measurements by man. (1) Derivation of the Weber psychophysical law from the Heisenberg uncertainty principle applied to a superconductive biological detector. (2) The reverse derivation. (3) A human theory of relativity.

    PubMed

    Cope, F W

    1981-01-01

    The Weber psychophysical law, which describes much experimental data on perception by man, is derived from the Heisenberg uncertainty principle on the assumption that human perception occurs by energy detection in superconductive microregions within man. This suggests that psychophysical perception by man might be considered merely a special case of physical measurement in general. The reverse derivation, i.e., derivation of the Heisenberg principle from the Weber law, may be of even greater interest. It suggests that physical measurements could be regarded as relative to the perceptions by the detectors within man. Thus one may develop a "human" theory of relativity that could have the advantage of eliminating hidden assumptions by forcing physical theories to conform more completely to the measurements made by man rather than to concepts that might not accurately describe nature.

  4. Estimated stocks of circumpolar permafrost carbon with quantified uncertainty ranges and identified data gaps

    DOE PAGES

    Hugelius, Gustaf; Strauss, J.; Zubrzycki, S.; ...

    2014-12-01

    Soils and other unconsolidated deposits in the northern circumpolar permafrost region store large amounts of soil organic carbon (SOC). This SOC is potentially vulnerable to remobilization following soil warming and permafrost thaw, but SOC stock estimates were poorly constrained and quantitative error estimates were lacking. This study presents revised estimates of permafrost SOC stocks, including quantitative uncertainty estimates, in the 0–3 m depth range in soils as well as for sediments deeper than 3 m in deltaic deposits of major rivers and in the Yedoma region of Siberia and Alaska. Revised estimates are based on significantly larger databases compared to previous studies. Despite this there is evidence of significant remaining regional data gaps. Estimates remain particularly poorly constrained for soils in the High Arctic region and physiographic regions with thin sedimentary overburden (mountains, highlands and plateaus) as well as for deposits below 3 m depth in deltas and the Yedoma region. While some components of the revised SOC stocks are similar in magnitude to those previously reported for this region, there are substantial differences in other components, including the fraction of perennially frozen SOC. Upscaled based on regional soil maps, estimated permafrost region SOC stocks are 217 ± 12 and 472 ± 27 Pg for the 0–0.3 and 0–1 m soil depths, respectively (±95% confidence intervals). Storage of SOC in 0–3 m of soils is estimated to 1035 ± 150 Pg. Of this, 34 ± 16 Pg C is stored in poorly developed soils of the High Arctic. Based on generalized calculations, storage of SOC below 3 m of surface soils in deltaic alluvium of major Arctic rivers is estimated as 91 ± 52 Pg. In the Yedoma region, estimated SOC stocks below 3 m depth are 181 ± 54 Pg, of which 74 ± 20 Pg is stored in intact Yedoma (late Pleistocene ice- and organic-rich silty sediments) with the remainder in refrozen thermokarst deposits. Total estimated SOC
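If the soil, deltaic, and Yedoma components were treated as independent, their confidence half-widths would combine in quadrature. This is an illustrative calculation using the component figures quoted above, not the paper's upscaling method:

```python
import math

# Component stocks from the abstract (Pg C, with 95% confidence half-widths):
# 0-3 m soils, deltaic alluvium below 3 m, and the Yedoma region below 3 m.
components = [(1035.0, 150.0), (91.0, 52.0), (181.0, 54.0)]

total = sum(stock for stock, _ in components)
# Independent uncertainties add in quadrature: u = sqrt(sum of u_i squared).
u_total = math.sqrt(sum(u**2 for _, u in components))
print(total, round(u_total, 1))  # 1307.0 167.7
```

Note that the largest single half-width (150 Pg) dominates the combined value; the smaller components barely widen it.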

  5. A Practical Approach to Address Uncertainty in Stakeholder Deliberations.

    PubMed

    Gregory, Robin; Keeney, Ralph L

    2017-03-01

    This article addresses the difficulties of incorporating uncertainty about consequence estimates as part of stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decision makers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders to incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
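A certainty equivalent is the guaranteed amount a decision maker values equally to an uncertain prospect. Under the common exponential-utility assumption it has a closed form; a minimal sketch (the utility form, risk-aversion coefficient, and lottery are assumptions for illustration, not the authors' model):

```python
import math

def certainty_equivalent(outcomes, probs, risk_aversion):
    # Under exponential utility u(x) = -exp(-a*x), the certainty equivalent of
    # an uncertain prospect X is CE = -(1/a) * ln(E[exp(-a*X)]).
    eu = sum(p * math.exp(-risk_aversion * x) for x, p in zip(outcomes, probs))
    return -math.log(eu) / risk_aversion

# Hypothetical 50/50 lottery over consequence scores 0 and 100.
ce = certainty_equivalent([0.0, 100.0], [0.5, 0.5], risk_aversion=0.02)
```

For a risk-averse decision maker the certainty equivalent (about 28.3 here) sits below the expected value of 50, which is exactly the information a deliberation can use to rank uncertain alternatives on a common sure-thing scale.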

  6. Global ethics and principlism.

    PubMed

    Gordon, John-Stewart

    2011-09-01

    This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together.

  7. First-principles study of band gap engineering via oxygen vacancy doping in perovskite ABB'O₃ solid solutions

    DOE PAGES

    Qi, Tingting; Curnan, Matthew T.; Kim, Seungchul; ...

    2011-12-15

    Oxygen vacancies in perovskite oxide solid solutions are fundamentally interesting and technologically important. However, experimental characterization of the vacancy locations and their impact on electronic structure is challenging. We have carried out first-principles calculations on two Zr-modified solid solutions, Pb(Zn₁/₃Nb₂/₃)O₃ and Pb(Mg₁/₃Nb₂/₃)O₃, in which vacancies are present. We find that the vacancies are more likely to reside between low-valent cation-cation pairs than high-valent cation-cation pairs. Based on the analysis of our results, we formulate guidelines that can be used to predict the location of oxygen vacancies in perovskite solid solutions. Our results show that vacancies can have a significant impact on both the conduction and valence band energies, in some cases lowering the band gap by ≈0.5 eV. The effects of vacancies on the electronic band structure can be understood within the framework of crystal field theory.

  8. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Given two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.
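The info-gap robustness calculation underlying such analyses can be sketched in a few lines. This is a toy illustration with a scalar uncertain parameter and an interval uncertainty model; the designs, scores, and threshold are invented for illustration and are not from the paper:

```python
def worst_case_value(value_fn, nominal, alpha):
    """Worst value over the interval info-gap model
    U(alpha) = [nominal - alpha, nominal + alpha] (scalar parameter;
    endpoint evaluation suffices for monotone score functions)."""
    return min(value_fn(u) for u in (nominal - alpha, nominal, nominal + alpha))

def robustness(value_fn, nominal, threshold, alphas):
    """Info-gap robustness: the largest uncertainty horizon alpha at
    which the worst-case value still meets the performance threshold."""
    best = 0.0
    for a in sorted(alphas):
        if worst_case_value(value_fn, nominal, a) >= threshold:
            best = a
        else:
            break
    return best

# Two hypothetical reserve designs scored by expected species coverage
# as a function of an uncertain occupancy probability p (nominal 0.6):
design_a = lambda p: 100 * p       # higher nominal score, steeper in p
design_b = lambda p: 40 * p + 33   # lower nominal score, flatter in p

alphas = [i * 0.05 for i in range(1, 12)]
rob_a = robustness(design_a, 0.6, threshold=48, alphas=alphas)
rob_b = robustness(design_b, 0.6, threshold=48, alphas=alphas)
# The flatter design tolerates a larger error in p before its
# guaranteed score drops below the threshold.
```

This is the sense in which a robust-optimal design can differ from the design with the best nominal score: the option less sensitive to input errors wins under uncertainty.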

  9. Tightening the entropic uncertainty bound in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Adabi, F.; Salimi, S.; Haseli, S.

    2016-06-01

    The uncertainty principle is a fundamental principle in quantum physics. It implies that the measurement outcomes of two incompatible observables cannot be predicted simultaneously. In quantum information theory, this principle can be expressed in terms of entropic measures. M. Berta et al. [Nat. Phys. 6, 659 (2010), 10.1038/nphys1734] showed that the uncertainty bound can be altered by considering an additional particle as a quantum memory correlated with the primary particle. In this article, we obtain a lower bound for the entropic uncertainty in the presence of a quantum memory by adding a term depending on the Holevo quantity and the mutual information. We conclude that our lower bound is tighter than that of Berta et al. whenever the accessible information about the measurement outcomes is less than the mutual information of the joint state. Several examples are investigated for which our lower bound is tighter than Berta et al.'s. Using our lower bound, we obtain a lower bound for the entanglement of formation of bipartite quantum states, as well as an upper bound for the regularized distillable common randomness.
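The two bounds being compared can be sketched as follows (notation assumed rather than quoted: S(·|·) is conditional von Neumann entropy, c the maximal overlap of the two measurement bases; the precise statements are in the cited papers):

```latex
% Berta et al.'s memory-assisted bound for measurements X, Z on system A
% held jointly with a quantum memory B:
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad c = \max_{i,j}\, \bigl|\langle x_i | z_j \rangle\bigr|^2 .

% The tightened bound adds a nonnegative term comparing the mutual
% information I(A;B) with the Holevo quantities I(X;B), I(Z;B) of the
% measurement outcomes (accessible information):
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B) + \max\{0,\,\delta\},
\qquad \delta = I(A;B) - \bigl[\,I(X;B) + I(Z;B)\,\bigr].
```

When the accessible information falls short of the mutual information, δ > 0 and the new bound is strictly tighter.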

  10. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. 
The book goes

  11. Robustness of risk maps and survey networks to knowledge gaps about a new invasive pest.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Smith, William D

    2010-02-01

    In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how they may change the risk estimates. Failure to recognize uncertainty leads to risk-ignorant decisions and miscalculation of expected impacts as well as the costs required to minimize these impacts. Here we use the information gap concept to evaluate the robustness of risk maps to uncertainties in key assumptions about an invading organism. We generate risk maps with a spatial model of invasion that simulates potential entries of an invasive pest via international marine shipments, their spread through a landscape, and establishment on a susceptible host. In particular, we focus on the question of how much uncertainty in risk model assumptions can be tolerated before the risk map loses its value. We outline this approach with an example of a forest pest recently detected in North America, Sirex noctilio Fabricius. The results provide a spatial representation of the robustness of predictions of S. noctilio invasion risk to uncertainty and show major geographic hotspots where the consideration of uncertainty in model parameters may change management decisions about a new invasive pest. We then illustrate how the dependency between the extent of uncertainties and the degree of robustness of a risk map can be used to select a surveillance network design that is most robust to knowledge gaps about the pest.

  12. Design of crusher liner based on time-varying uncertainty theory

    NASA Astrophysics Data System (ADS)

    Tang, J. C.; Shi, B. Q.; Yu, H. J.; Wang, R. J.; Zhang, W. Y.

    2017-09-01

    This article puts forward a time-dependent design method for the crusher liner that accounts for load fluctuation, based on time-varying uncertainty theory. In this method, a time-varying uncertainty design model of the liner is constructed by introducing parameters that affect the wear rate, the volatility and the drift rate. From a design example, the time-varying design outline of the moving-cone liner is obtained. Based on the theory of minimum wear, the gap curve of the wear-resistant cavity is designed, and the optimized cavity is obtained by combining the thickness of the cone with the cavity gap. Taking the PYGB1821 multi-cylinder hydraulic cone crusher as an example, it is shown that the service life of the new liner improves by more than 14.3%.

  13. The Effect of High N-DOPED Anatase TiO2 on the Band Gap Narrowing and Redshift by First-Principles

    NASA Astrophysics Data System (ADS)

    Hou, Qingyu; Jin, Yongjun; Ying, Chun; Zhao, Erjun; Zhang, Yue; Dong, Hongying

    2012-10-01

    Anatase TiO2 supercells were studied by first-principles methods: one undoped and three with high N-doping. Partial densities of states, band structures, populations and absorption spectra were calculated. The results indicate that for TiO2-xNx (x = 0.0625, 0.125, 0.25), the higher the doping concentration, the shorter the lattice parameter along the c-axis. The strength of the covalent bonds varied significantly. The formation energy first increases and then decreases, and the doped models become less stable as the N-doping concentration increases. Meanwhile, the narrower the band gap, the more significant the redshift, in agreement with experimental results.

  14. Is the Precautionary Principle Really Incoherent?

    PubMed

    Boyer-Kassem, Thomas

    2017-11-01

    The Precautionary Principle has been an increasingly important principle in international treaties since the 1980s. Through varying formulations, it states that when an activity can lead to a catastrophe for human health or the environment, measures should be taken to prevent it even if the cause-and-effect relationship is not fully established scientifically. The Precautionary Principle has been critically discussed from many sides. This article concentrates on a theoretical argument by Peterson (2006) according to which the Precautionary Principle is incoherent with other desiderata of rational decision making, and thus cannot be used as a decision rule that selects an action among several ones. I claim here that Peterson's argument fails to establish the incoherence of the Precautionary Principle, by attacking three of its premises. I argue (i) that Peterson's treatment of uncertainties lacks generality, (ii) that his Archimedean condition is problematic for incommensurability reasons, and (iii) that his explication of the Precautionary Principle is not adequate. This leads me to conjecture that the Precautionary Principle can be envisaged as a coherent decision rule, again. © 2017 Society for Risk Analysis.

  15. Uncertainty during breast diagnostic evaluation: state of the science.

    PubMed

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relationship to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship of inner strength and uncertainty. Nurses can be invaluable in assisting women in coping with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on uncertainty experienced by women undergoing a breast diagnostic evaluation.

  16. Plant canopy gap-size analysis theory for improving optical measurements of leaf-area index

    NASA Astrophysics Data System (ADS)

    Chen, Jing M.; Cihlar, Josef

    1995-09-01

    Optical instruments currently available for measuring the leaf-area index (LAI) of a plant canopy all utilize only the canopy gap-fraction information. These instruments include the Li-Cor LAI-2000 Plant Canopy Analyzer, Decagon, and Demon. The advantages of utilizing both the canopy gap-fraction and gap-size information are shown. For the purpose of measuring the canopy gap size, a prototype sunfleck-LAI instrument named Tracing Radiation and Architecture of Canopies (TRAC), has been developed and tested in two pure conifer plantations, red pine (Pinus resinosa Ait.) and jack pine (Pinus banksiana Lamb). A new gap-size-analysis theory is presented to quantify the effect of canopy architecture on optical measurements of LAI based on the gap-fraction principle. The theory is an improvement on that of Lang and Xiang [Agric. For. Meteorol. 37, 229 (1986)]. In principle, this theory can be used for any heterogeneous canopies.
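The gap-fraction principle that these instruments build on can be written compactly (a standard Beer's-law form; the symbols here are the conventional ones and are not taken verbatim from the paper):

```latex
% Gap fraction P(\theta) of a canopy viewed at zenith angle \theta,
% with foliage projection coefficient G(\theta) and leaf-area index L:
P(\theta) = \exp\!\left(-\,\frac{G(\theta)\,\Omega\,L}{\cos\theta}\right)
\quad\Longrightarrow\quad
L = -\,\frac{\cos\theta \,\ln P(\theta)}{G(\theta)\,\Omega}.

% The clumping index \Omega (\Omega = 1 for randomly distributed
% foliage, \Omega < 1 for clumped foliage) is the quantity that the
% measured gap-size distribution supplies beyond the gap fraction alone.
```

Instruments that use only the gap fraction must implicitly assume Ω = 1; measuring the gap-size distribution, as TRAC does, lets Ω be estimated so that L is not underestimated in clumped canopies.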

  17. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM Quantum Experience and find reasonable agreement between our predictions and experimental outcomes.

  18. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    PubMed

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.
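One common way to carry data-gap uncertainty into a single aggregate score is Monte Carlo sampling over the plausible range of each unknown endpoint. The sketch below is an illustrative stand-in for the paper's (unspecified) aggregation, with invented endpoint values:

```python
import random

def score_with_uncertainty(endpoints, weights=None, n_samples=2000, seed=0):
    """Aggregate per-endpoint hazard scores into one numeric score with
    quantified uncertainty.  Each endpoint is (low, high): a known value
    has low == high; a data gap spans its plausible range, sampled
    uniformly.  Returns (mean, (5th percentile, 95th percentile)).
    """
    rng = random.Random(seed)
    weights = weights or [1.0] * len(endpoints)
    total_w = sum(weights)
    samples = []
    for _ in range(n_samples):
        s = sum(w * rng.uniform(lo, hi)
                for w, (lo, hi) in zip(weights, endpoints))
        samples.append(s / total_w)
    samples.sort()
    mean = sum(samples) / n_samples
    return mean, (samples[int(0.05 * n_samples)],
                  samples[int(0.95 * n_samples)])

# Hypothetical product: three endpoints known, one complete data gap
# (all scores on a 0-100 hazard scale).
mean, (p5, p95) = score_with_uncertainty([(20, 20), (35, 35),
                                          (50, 50), (0, 100)])
```

The width of the (p5, p95) interval makes visible how much of the final score rests on data gaps rather than measured hazard, which is the comparison the sensitivity analysis in the abstract turns on.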

  19. Energy and Uncertainty in General Relativity

    NASA Astrophysics Data System (ADS)

    Cooperstock, F. I.; Dupre, M. J.

    2018-03-01

    The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured as well as their detectors are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector in a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We also address certain authors' misconceptions of our approach.

  20. A study of the influence of forest gaps on fire–atmosphere interactions

    Treesearch

    Michael T. Kiefer; Warren E. Heilman; Shiyuan Zhong; Joseph J. (Jay) Charney; Xindi (Randy) Bian

    2016-01-01

    Much uncertainty exists regarding the possible role that gaps in forest canopies play in modulating fire–atmosphere interactions in otherwise horizontally homogeneous forests. This study examines the influence of gaps in forest canopies on atmospheric perturbations induced by a low-intensity fire using the ARPS-CANOPY model, a version of the Advanced Regional...

  1. Climate change adaptation under uncertainty in the developing world: A case study of sea level rise in Kiribati

    NASA Astrophysics Data System (ADS)

    Donner, S. D.; Webber, S.

    2011-12-01

    Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world respond to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remains highly idealized. For example, the efficacy of "no regrets" adaptation efforts or "mainstreaming" adaptation into decision-making are rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be done carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of "mal"-adaptation.

  2. Line-averaging measurement methods to estimate the gap in the CO2 balance closure - possibilities, challenges, and uncertainties

    NASA Astrophysics Data System (ADS)

    Ziemann, Astrid; Starke, Manuela; Schütze, Claudia

    2017-11-01

    An imbalance of surface energy fluxes using the eddy covariance (EC) method is observed in global measurement networks although all necessary corrections and conversions are applied to the raw data. Mainly during nighttime, advection can occur, resulting in a closing gap that consequently should also affect the CO2 balances. There is a crucial need for representative concentration and wind data to measure advective fluxes. Ground-based remote sensing techniques are an ideal tool as they provide the spatially representative CO2 concentration together with wind components within the same voxel structure. For this purpose, the presented SQuAd (Spatially resolved Quantification of the Advection influence on the balance closure of greenhouse gases) approach applies an integrated method combination of acoustic and optical remote sensing. The innovative combination of acoustic travel-time tomography (A-TOM) and open-path Fourier-transform infrared spectroscopy (OP-FTIR) will enable an upscaling and enhancement of EC measurements. OP-FTIR instrumentation offers the significant advantage of real-time simultaneous measurements of line-averaged concentrations for CO2 and other greenhouse gases (GHGs). A-TOM is a scalable method to remotely resolve 3-D wind and temperature fields. The paper will give an overview about the proposed SQuAd approach and first results of experimental tests at the FLUXNET site Grillenburg in Germany. Preliminary results of the comprehensive experiments reveal a mean nighttime horizontal advection of CO2 of about 10 µmol m-2 s-1 estimated by the spatially integrating and representative SQuAd method. Additionally, uncertainties in determining CO2 concentrations using passive OP-FTIR and wind speed applying A-TOM are systematically quantified. The maximum uncertainty for CO2 concentration was estimated due to environmental parameters, instrumental characteristics, and retrieval procedure with a total amount of approximately 30 % for a single

  3. Band gap modulation of graphene by metal substrate: A first principles study

    NASA Astrophysics Data System (ADS)

    Sahoo, Mihir Ranjan; Sahu, Sivabrata; Kushwaha, Anoop Kumar; Nayak, S. K.

    2018-04-01

    Owing to its high in-plane charge-carrier mobility, high electron velocity, and long spin diffusion length, graphene is a promising and unique material for devices in various applications. The otherwise unaffected 2pz orbitals of the carbon atoms in graphene can be strongly influenced by substrates, leading to tunable electronic properties. We report a density functional calculation of a graphene monolayer on a metallic substrate, namely nickel surfaces. A band gap opens in graphene near the K points due to interactions between the 2pz orbitals and the d-orbitals of the nickel atoms, and the gap can be modulated by increasing the number of substrate layers.

  4. Visualization of Uncertainty

    NASA Astrophysics Data System (ADS)

    Jones, P. W.; Strelitz, R. A.

    2012-12-01

    The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty, without sacrificing the clarity and power of the underlying visualization, that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty, reducing the problem to a minimization. Computing such a spatial decomposition is O(N²), and it can be done iteratively, making it both fast and easy to update over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartogram familiar to the information visualization community in the depiction of quantities such as voting results per state. Furthermore, one can dispense with the mesh or edges entirely and replace them with symbols or glyphs.
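The tessellation step can be sketched with a density-weighted Lloyd iteration, a standard route to weighted Voronoi/centroidal tessellations. This only approximately equalizes cell content and the abstract does not specify the authors' exact algorithm, so treat it as an illustrative approximation; the grid, density field, and parameters are invented:

```python
import numpy as np

def equal_content_cells(points, density, n_cells=8, iters=50, seed=0):
    """Density-weighted Lloyd iteration: seed sites by sampling from the
    uncertainty density, then repeatedly move each site to the
    density-weighted centroid of its Voronoi cell.  High-density regions
    attract more, smaller cells, so cell content (summed density)
    becomes roughly equal; a small cell marks concentrated uncertainty.

    points: (M, 2) grid locations; density: (M,) nonnegative values.
    """
    rng = np.random.default_rng(seed)
    p = density / density.sum()
    sites = points[rng.choice(len(points), size=n_cells, replace=False, p=p)]
    label = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest site (a Voronoi partition).
        d2 = ((points[:, None, :] - sites[None, :, :]) ** 2).sum(-1)
        label = d2.argmin(axis=1)
        for k in range(n_cells):
            mask = label == k
            if mask.any():
                w = density[mask]
                sites[k] = (points[mask] * w[:, None]).sum(0) / w.sum()
    content = np.array([density[label == k].sum() for k in range(n_cells)])
    return sites, label, content

# A 2-D grid with an uncertainty hot spot near one corner:
xs, ys = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
pts = np.column_stack([xs.ravel(), ys.ravel()])
dens = 0.1 + np.exp(-((pts - [0.8, 0.8]) ** 2).sum(axis=1) / 0.01)
sites, label, content = equal_content_cells(pts, dens, n_cells=8)
```

Drawing only the cell boundaries (or a glyph per site) over the existing visualization conveys where uncertainty concentrates without consuming any of the color or opacity channels.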

  5. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  6. Quasiparticle Energies and Band Gaps in Graphene Nanoribbons

    NASA Astrophysics Data System (ADS)

    Yang, Li; Park, Cheol-Hwan; Son, Young-Woo; Cohen, Marvin L.; Louie, Steven G.

    2007-11-01

    We present calculations of the quasiparticle energies and band gaps of graphene nanoribbons (GNRs) carried out using a first-principles many-electron Green’s function approach within the GW approximation. Because of the quasi-one-dimensional nature of a GNR, electron-electron interaction effects due to the enhanced screened Coulomb interaction and confinement geometry greatly influence the quasiparticle band gap. Compared with previous tight-binding and density functional theory studies, our calculated quasiparticle band gaps show significant self-energy corrections for both armchair and zigzag GNRs, in the range of 0.5–3.0 eV for ribbons of width 2.4–0.4 nm. The quasiparticle band gaps found here suggest that use of GNRs for electronic device components in ambient conditions may be viable.

  7. Risk assessment principle for engineered nanotechnology in food and drug.

    PubMed

    Hwang, Myungsil; Lee, Eun Ji; Kweon, Se Young; Park, Mi Sun; Jeong, Ji Yoon; Um, Jun Ho; Kim, Sun Ah; Han, Bum Suk; Lee, Kwang Ho; Yoon, Hae Jung

    2012-06-01

    While the ability to develop nanomaterials and incorporate them into products is advancing rapidly worldwide, understanding of the potential health and safety effects of nanomaterials has proceeded at a much slower pace. In 2008, the Korea Food and Drug Administration (KFDA) started an investigation to prepare a "Strategic Action Plan" to evaluate safety and manage nano risk associated with foods, drugs, medical devices and cosmetics using nano-scale materials. Although there are some studies of the potential risks of nanomaterials, their physical-chemical characterization is not yet clear, and the available data do not offer enough information because of their limitations. These uncertainties make it impossible to determine whether nanomaterials are actually hazardous to humans, so human exposure risk assessment currently cannot be conducted with confidence. At the same time, uncertainty about safety may lead to polarized public debate and to businesses' unwillingness to invest further in nanotechnology. Therefore, criteria and methods to assess possible adverse effects of nanomaterials have been vigorously taken into consideration by many international organizations: the World Health Organization, the Organisation for Economic Co-operation and Development and the European Commission. The objective of this study was to develop risk assessment principles for the safety management of future nanoproducts and to identify areas of research to strengthen risk assessment for nanomaterials. The research roadmaps proposed in this study should help fill the current gaps in knowledge relevant to nano risk assessment.

  8. Uncertainty in Bohr's response to the Heisenberg microscope

    NASA Astrophysics Data System (ADS)

    Tanona, Scott

    2004-09-01

    In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.

  9. Risk assessment of pesticides and other stressors in bees: Principles, data gaps and perspectives from the European Food Safety Authority.

    PubMed

    Rortais, Agnès; Arnold, Gérard; Dorne, Jean-Lou; More, Simon J; Sperandio, Giorgio; Streissl, Franz; Szentes, Csaba; Verdonck, Frank

    2017-06-01

    Current approaches to risk assessment in bees do not take into account co-exposures from multiple stressors. The European Food Safety Authority (EFSA) is deploying resources and efforts to move towards a holistic risk assessment approach of multiple stressors in bees. This paper describes the general principles of pesticide risk assessment in bees, including recent developments at EFSA dealing with risk assessment of single and multiple pesticide residues and biological hazards. The EFSA Guidance Document on the risk assessment of plant protection products in bees highlights the need for the inclusion of an uncertainty analysis, other routes of exposure and multiple stressors such as chemical mixtures and biological agents. The EFSA risk assessment on the survival, spread and establishment of the small hive beetle, Aethina tumida, an invasive alien species, is provided with potential insights for other bee pests such as the Asian hornet, Vespa velutina. Furthermore, data gaps are identified at each step of the risk assessment, and recommendations are made for future research that could be supported under the framework of Horizon 2020. In addition, the recent work conducted at EFSA is presented under the overarching MUST-B project ("EU efforts towards the development of a holistic approach for the risk assessment on MUltiple STressors in Bees"), comprising a toolbox for harmonised data collection under field conditions and a mechanistic model to assess effects from pesticides and other stressors, such as biological agents and beekeeping management practices, at the colony level and in a spatially complex landscape. Future perspectives at EFSA include the development of a data model to collate high-quality data to calibrate and validate the model to be used as a regulatory tool. Finally, the evidence collected within the framework of MUST-B will support EFSA's activities on the development of a holistic approach to the risk assessment of multiple stressors in bees.

  10. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are or are not willing to spend on climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution.

  11. Band-Gap and Band-Edge Engineering of Multicomponent Garnet Scintillators from First Principles

    NASA Astrophysics Data System (ADS)

    Yadav, Satyesh K.; Uberuaga, Blas P.; Nikl, Martin; Jiang, Chao; Stanek, Christopher R.

    2015-11-01

    Complex doping schemes in R3Al5O12 (where R is the rare-earth element) garnet compounds have recently led to pronounced improvements in scintillator performance. Specifically, by admixing lutetium and yttrium aluminate garnets with gallium and gadolinium, the band gap is altered in a manner that facilitates the removal of deleterious electron trapping associated with cation antisite defects. Here, we expand upon this initial work to systematically investigate the effect of substitutional admixing on the energy levels of band edges. Density-functional theory and hybrid density-functional theory (HDFT) are used to survey potential admixing candidates that modify either the conduction-band minimum (CBM) or valence-band maximum (VBM). We consider two sets of compositions based on Lu3B5O12, where B is Al, Ga, In, As, and Sb, and R3Al5O12, where R is Lu, Gd, Dy, and Er. We find that admixing with various R cations does not appreciably affect the band gap or band edges. In contrast, substituting Al with cations of dissimilar ionic radii has a profound impact on the band structure. We further show that certain dopants can be used to selectively modify only the CBM or the VBM. Specifically, Ga and In decrease the band gap by lowering the CBM, while As and Sb decrease the band gap by raising the VBM; the relative change in band gap is quantitatively validated by HDFT. These results demonstrate a powerful approach to quickly screen the impact of dopants on the electronic structure of scintillator compounds, identifying those dopants which alter the band edges in very specific ways to eliminate both electron and hole traps responsible for performance limitations. This approach should be broadly applicable for the optimization of electronic and optical performance for a wide range of compounds by tuning the VBM and CBM.

  12. Measurement of optical to electrical and electrical to optical delays with ps-level uncertainty.

    PubMed

    Peek, H Z; Pinkert, T J; Jansweijer, P P M; Koelemeij, J C J

    2018-05-28

    We present a new measurement principle to determine the absolute time delay of a waveform from an optical reference plane to an electrical reference plane and vice versa. We demonstrate a method based on this principle with 2 ps uncertainty. This method can be used to perform accurate time delay determinations of optical transceivers used in fiber-optic time-dissemination equipment. As a result, the time scales in the optical and electrical domains can be related to each other with the same uncertainty. We expect this method to be a breakthrough in high-accuracy time transfer and in the absolute calibration of time-transfer equipment.

  13. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M

    2009-01-01

    This primer presents examples of the application of the SCALE/TSUNAMI tools to generate k_eff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D, and to examine uncertainties in the computed k_eff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and the need to confirm the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system, and of the TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria, are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
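
    The first-order propagation of cross-section covariance data to a k_eff uncertainty that such sensitivity/uncertainty tools implement can be sketched as follows. This is an illustrative stand-in, not SCALE/TSUNAMI code; the sensitivity vector and covariance values are hypothetical.

```python
import numpy as np

# Sketch of first-order uncertainty propagation (the "sandwich rule"):
# relative variance of k_eff = S^T C S, where S holds relative
# sensitivities (dk/k per fractional cross-section change) and C is the
# relative covariance matrix of the nuclear data. Numbers are hypothetical.
S = np.array([0.35, -0.12, 0.08])
C = np.array([[4.0e-4, 1.0e-5, 0.0],
              [1.0e-5, 9.0e-4, 2.0e-5],
              [0.0,    2.0e-5, 2.5e-4]])

rel_var = S @ C @ S                # relative variance of k_eff
rel_unc = float(np.sqrt(rel_var))  # relative standard deviation
print(f"relative k_eff uncertainty: {rel_unc:.5f}")
```

    Ranking the individual contributions S_i C_ij S_j is what lets such tools identify which nuclide-reaction pairs dominate the computed bias and its uncertainty.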

  14. Quantifying uncertainty in read-across assessment – an algorithmic approach - (SOT)

    EPA Science Inventory

    Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithm...

  15. The equivalence principle in a quantum world

    NASA Astrophysics Data System (ADS)

    Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre

    2015-09-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).

  16. Bridging the gap between uncertainty analysis for complex watershed models and decision-making for watershed-scale water management

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Han, F.; Wu, B.

    2013-12-01

    Process-based, spatially distributed and dynamic models provide desirable resolutions for watershed-scale water management. However, their reliability in solving real management problems has been seriously questioned, since the model simulation usually involves significant uncertainty with complicated origins. Uncertainty analysis (UA) for complex hydrological models has been a hot topic in the past decade, and a variety of UA approaches have been developed, but mostly in a theoretical setting. Whether and how a UA can benefit real management decisions remain critical questions. We have conducted a series of studies to investigate the applicability of classic approaches, such as GLUE and Markov Chain Monte Carlo (MCMC) methods, in real management settings, unravel the difficulties encountered by such methods, and tailor the methods to better serve the management. Frameworks and new algorithms, such as Probabilistic Collocation Method (PCM)-based approaches, were also proposed for specific management issues. This presentation summarizes our past and ongoing studies on the role of UA in real water management. Challenges and potential strategies to bridge the gap between UA for complex models and decision-making for management will be discussed, and future directions for research in this field will be suggested. Two common water management settings were examined. One is Total Maximum Daily Load (TMDL) management for surface water quality protection. The other is integrated water resources management for watershed sustainability. For the first setting, nutrient and pesticide TMDLs in the Newport Bay Watershed (Orange County, California, USA) were discussed. It is a highly urbanized region with a semi-arid Mediterranean climate, typical of the western U.S. For the second setting, the water resources management in the Zhangye Basin (the midstream part of the Heihe Basin, China), through which the famous Silk Road passed, was investigated.
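
    The GLUE approach mentioned in the abstract can be illustrated with a minimal sketch. The toy model, synthetic observation, informal likelihood measure and behavioral threshold below are all hypothetical choices (not the Newport Bay or Zhangye applications); they only show the mechanics of behavioral sampling.

```python
import numpy as np

# Minimal GLUE-style sketch: sample parameters from a prior, score each run
# against observations with an informal likelihood measure, keep the
# "behavioral" runs above a threshold, and weight predictions accordingly.
rng = np.random.default_rng(1)

def model(theta, t):
    return theta * t                          # toy watershed response

t_obs, y_obs = 2.0, 4.1                       # synthetic observation
thetas = rng.uniform(0.0, 5.0, size=5000)     # prior parameter samples
likelihood = np.exp(-(model(thetas, t_obs) - y_obs) ** 2)

behavioral = likelihood > 0.5                 # acceptability threshold
weights = likelihood[behavioral] / likelihood[behavioral].sum()
prediction = model(thetas[behavioral], 3.0)   # predictive ensemble at t = 3
print(f"weighted prediction: {np.sum(weights * prediction):.2f}")
```

    The spread of the behavioral ensemble, not a single calibrated run, is what a manager would then carry into the decision analysis.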

  17. Close Early Learning Gaps with Rigorous DAP

    ERIC Educational Resources Information Center

    Brown, Christopher P.; Mowry, Brian

    2015-01-01

    Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…

  18. Fundamental uncertainty limit for speckle displacement measurements.

    PubMed

    Fischer, Andreas

    2017-09-01

    The basic metrological task in speckle photography is to quantify displacements of speckle patterns, allowing, for instance, the investigation of the mechanical load and modification of objects with rough surfaces. However, the fundamental limit of the measurement uncertainty due to photon shot noise is unknown. For this reason, the Cramér-Rao bound (CRB) is derived for speckle displacement measurements, representing the squared minimal achievable measurement uncertainty. As a result, the CRB for speckle patterns is only two times the CRB for an ideal point light source. Hence, speckle photography is an optimal measurement approach for contactless displacement measurements on rough surfaces. In agreement with a derivation from Heisenberg's uncertainty principle, the CRB depends on the number of detected photons and the diffraction limit of the imaging system described by the speckle size. The theoretical results are verified and validated, demonstrating the capability for displacement measurements with nanometer resolution.
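
    The scaling stated in the abstract (minimal uncertainty proportional to the speckle size and inversely proportional to the square root of the detected photon number) can be sketched numerically. The unit prefactor is an assumption for illustration, not the paper's exact CRB constant.

```python
import numpy as np

# Hedged sketch of the shot-noise scaling: the minimal displacement
# uncertainty grows with the speckle size (the diffraction limit of the
# imaging system) and falls with sqrt(N) for N detected photons.
# The prefactor of 1.0 is an assumption, not the derived CRB constant.
def min_displacement_uncertainty(speckle_size_m, n_photons, prefactor=1.0):
    return prefactor * speckle_size_m / np.sqrt(n_photons)

sigma = min_displacement_uncertainty(5e-6, 1e6)   # 5 um speckles, 1e6 photons
print(f"minimal uncertainty: {sigma:.1e} m")      # nanometre scale
```

    Quadrupling the photon count halves the bound, which is the practical route to the nanometer resolution mentioned at the end of the abstract.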

  19. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes, by G. Box and M. Gunzburger respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them.
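
    The point that all uncertainty estimates are model dependent can be made concrete with a toy example of my own (hypothetical forward models and datum, not from the abstract): the same observation inverted under two different forward models yields posteriors of different width.

```python
import numpy as np

# Toy illustration of model-dependent uncertainty: one synthetic datum,
# two hypothetical forward models, a flat prior on a grid. The posterior
# standard deviation differs substantially between the two models.
theta = np.linspace(0.0, 4.0, 2001)           # parameter grid
y_obs, noise = 2.0, 0.2                       # synthetic datum and its std

def posterior(forward):
    like = np.exp(-0.5 * ((forward(theta) - y_obs) / noise) ** 2)
    return like / like.sum()                  # normalized on the grid

p_a = posterior(lambda t: t)                  # model A: y = theta
p_b = posterior(lambda t: t ** 2 / 2)         # model B: y = theta^2 / 2

def grid_std(p):
    m = np.sum(theta * p)
    return float(np.sqrt(np.sum((theta - m) ** 2 * p)))

print(grid_std(p_a), grid_std(p_b))           # different posterior widths
```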

  20. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    …an additional administrative margin to account for gaps in the validation data, or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.

  1. Robustness of risk maps and survey networks to knowledge gaps about a new invasive pest

    Treesearch

    Denys Yemshanov; Frank H. Koch; Yakov Ben-Haim; William D. Smith

    2010-01-01

    In pest risk assessment it is frequently necessary to make management decisions regarding emerging threats under severe uncertainty. Although risk maps provide useful decision support for invasive alien species, they rarely address knowledge gaps associated with the underlying risk model or how they may change the risk estimates. Failure to recognize uncertainty leads...

  2. Band-gap and band-edge engineering of multicomponent garnet scintillators from first principles

    DOE PAGES

    Yadav, Satyesh K.; Uberuaga, Blas P.; Nikl, Martin; ...

    2015-11-24

    Complex doping schemes in R3Al5O12 (where R is the rare-earth element) garnet compounds have recently led to pronounced improvements in scintillator performance. Specifically, by admixing lutetium and yttrium aluminate garnets with gallium and gadolinium, the band gap is altered in a manner that facilitates the removal of deleterious electron trapping associated with cation antisite defects. Here, we expand upon this initial work to systematically investigate the effect of substitutional admixing on the energy levels of band edges. Density-functional theory and hybrid density-functional theory (HDFT) are used to survey potential admixing candidates that modify either the conduction-band minimum (CBM) or valence-band maximum (VBM). We consider two sets of compositions based on Lu3B5O12, where B is Al, Ga, In, As, and Sb, and R3Al5O12, where R is Lu, Gd, Dy, and Er. We find that admixing with various R cations does not appreciably affect the band gap or band edges. In contrast, substituting Al with cations of dissimilar ionic radii has a profound impact on the band structure. We further show that certain dopants can be used to selectively modify only the CBM or the VBM. Specifically, Ga and In decrease the band gap by lowering the CBM, while As and Sb decrease the band gap by raising the VBM; the relative change in band gap is quantitatively validated by HDFT. These results demonstrate a powerful approach to quickly screen the impact of dopants on the electronic structure of scintillator compounds, identifying those dopants which alter the band edges in very specific ways to eliminate both electron and hole traps responsible for performance limitations. Furthermore, this approach should be broadly applicable for the optimization of electronic and optical performance for a wide range of compounds by tuning the VBM and CBM.

  3. Band gap scaling laws in group IV nanotubes.

    PubMed

    Wang, Chongze; Fu, Xiaonan; Guo, Yangyang; Guo, Zhengxiao; Xia, Congxin; Jia, Yu

    2017-03-17

    Using first-principles calculations, we have systematically investigated the band gap properties of nanotubes formed by group IV elements. Our results reveal that for armchair nanotubes, the energy gaps at the K points in the Brillouin zone decrease following a 1/r scaling law as the radius r increases, while at the Γ points they scale as -1/r^2 + C, where C is a constant. Further studies show that this scaling law at the K points is independent of both the chiral vector and the type of element. Therefore, the band gap of a nanotube of given radius can easily be determined from these scaling laws. Interestingly, we also predict the existence of an indirect band gap for both germanium and tin nanotubes. Our findings provide an efficient way to determine the band gaps of group IV element nanotubes from their radii, and facilitate the design of functional nanodevices.
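
    The two scaling laws stated in the abstract can be sketched directly; the coefficients A, B and the constant C below are hypothetical fit parameters for illustration, not values from the paper.

```python
import numpy as np

# Sketch of the reported scaling laws for armchair group IV nanotubes:
# gap at the K points ~ A / r, gap at the Gamma points ~ -B / r**2 + C.
# A, B, C are hypothetical fit parameters, not the paper's values.
def gap_at_K(r, A=1.2):
    return A / r

def gap_at_Gamma(r, B=0.8, C=1.5):
    return -B / r ** 2 + C

radii = np.array([2.0, 4.0, 8.0])      # nanotube radii (arbitrary units)
print(gap_at_K(radii))                 # halves each time the radius doubles
print(gap_at_Gamma(radii))             # approaches the constant C from below
```

    Once A, B and C are fitted for a given element and chirality family, the gap at any radius follows without a new electronic-structure calculation, which is the efficiency the abstract claims.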

  4. Steering the measured uncertainty under decoherence through local PT -symmetric operations

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Nan; Wang, Dong; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Ye, Liu

    2018-07-01

    The uncertainty principle is viewed as one of the appealing properties in the context of quantum mechanics, as it intrinsically offers a lower bound on the measurement outcomes of a pair of incompatible observables within a given system. In this letter, we observe entropic uncertainty in the presence of quantum memory under different local noisy channels. To be specific, we develop the dynamics of the measured uncertainty under local bit-phase-flipping (unital) and depolarization (nonunital) noise, respectively, and put forward an effective strategy to manipulate the magnitude of the uncertainty by means of parity-time-symmetric (PT-symmetric) operations on the subsystem to be measured. It is interesting to find that the uncertainty evolves differently in the channels considered here: monotonically in the nonunital channels and non-monotonically in the unital channels. Moreover, the amount of the measured uncertainty can be reduced to some degree by properly modulating the PT-symmetric operations.
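
    The lower bound referred to above is, in the finite-dimensional memoryless case, commonly written in the Maassen-Uffink form H(Q) + H(R) >= -log2 max |<q_i|r_j>|^2, on which entropic uncertainty relations with quantum memory build. A minimal numerical check (the qubit bases are my own illustrative choice, not from the letter):

```python
import numpy as np

# Maassen-Uffink entropic bound for two observables given by orthonormal
# eigenbases (stored as matrix columns): the bound is set by the largest
# squared overlap between any pair of eigenvectors.
def mu_bound(basis_Q, basis_R):
    overlaps = np.abs(basis_Q.conj().T @ basis_R) ** 2
    return float(-np.log2(overlaps.max()) + 0.0)   # +0.0 avoids -0.0

Z = np.eye(2)                                  # Pauli-Z eigenbasis (columns)
X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Pauli-X eigenbasis (columns)
print(round(mu_bound(Z, X), 12))               # 1 bit: mutually unbiased bases
print(round(mu_bound(Z, Z), 12))               # 0 bits: compatible observables
```

    The quantum-memory versions studied in this line of work add a conditional-entropy term to such bounds, which is how entanglement can reduce the measured uncertainty.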

  5. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGES

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; ...

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.

  6. Position-momentum uncertainty relations in the presence of quantum memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp; Berta, Mario; Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies, providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes, not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  7. Casuistry and principlism: the convergence of method in biomedical ethics.

    PubMed

    Kuczewski, M

    1998-12-01

    Casuistry and principlism are two of the leading contenders to be considered the methodology of bioethics. These methods may be incommensurable since the former emphasizes the examination of cases while the latter focuses on moral principles. Conversely, since both analyze cases in terms of mid-level principles, there is hope that these methods may be reconcilable or complementary. I analyze the role of principles in each and thereby show that these theories are virtually identical when interpreted in a certain light. That is, if the gaps in each method are filled by a concept of judgment or Aristotelian practical wisdom, these methods converge.

  8. Tuning band gap of monolayer and bilayer SnS2 by strain effect and external electric field: A first principles calculations

    NASA Astrophysics Data System (ADS)

    Rahman, Abeera; Shin, Young-Han

    Recently, much effort has been devoted to two-dimensional layered metal dichalcogenides (LMDs). Among them, MoS2 has become a prototype LMD, and recent studies show surprising and rich new physics emerging in other van der Waals materials such as layered SnS2 [1-4]. SnS2 is a semiconducting earth-abundant material, and Sn is a group IV element replacing the transition metal in MoS2. SnS2 opens new possibilities in various potential applications; however, the basic properties of layered SnS2 are still not well understood. In this study, we consider two types of structures: 1T with the P-3m1 (164) space group and 1H with the P63/mmc (194) space group. Our first-principles calculations show that the 1T structure of SnS2 is more stable than the 1H structure, whereas the latter is more stable for MoS2. Moreover, in contrast to MoS2, SnS2 shows an indirect band gap for both the 1T and 1H structures, while 1T MoS2 is metallic and 1H MoS2 has a direct band gap. We also study the effect of strain in the range of 0-10% on the band structure of monolayer and bilayer SnS2 (for both the 1T and 1H structures), and find significant changes in their band gaps. We also investigate bilayer SnS2 with and without out-of-plane stress. This research was supported by the Brain Korea 21 Plus Program and the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (NRF-2014M3A7B4049367, NRF-2014R1A2A1A1105089).

  9. Dipole-allowed direct band gap silicon superlattices

    PubMed Central

    Oh, Young Jun; Lee, In-Ho; Kim, Sunghyun; Lee, Jooyoung; Chang, Kee Joo

    2015-01-01

    Silicon is the most popular material used in electronic devices. However, its poor optical properties owing to its indirect band gap nature limit its usage in optoelectronic devices. Here we present the discovery of super-stable pure-silicon superlattice structures that can serve as promising materials for solar cell applications and can lead to the realization of pure Si-based optoelectronic devices. The structures are almost identical to that of bulk Si except that defective layers are intercalated in the diamond lattice. The superlattices exhibit dipole-allowed direct band gaps as well as indirect band gaps, providing ideal conditions for the investigation of a direct-to-indirect band gap transition. The fact that almost all structural portions of the superlattices originate from bulk Si warrants their stability and good lattice matching with bulk Si. Through first-principles molecular dynamics simulations, we confirmed their thermal stability and propose a possible method to synthesize the defective layer through wafer bonding. PMID:26656482

  10. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.

  11. Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.

    PubMed

    Peters, Achim; McEwen, Bruce S; Friston, Karl

    2017-09-01

    The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected - and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: reducing uncertainty requires cerebral energy. The characteristic of the vertebrate brain to prioritize its own high energy need is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual with 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
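
    The definition used above, uncertainty as entropy or 'expected surprise', can be computed directly; the two example distributions below are illustrative, not from the paper.

```python
import numpy as np

# Uncertainty as entropy: the expected surprise -log2 p(o), averaged over
# outcomes o of a discrete distribution p. Example distributions are ours.
def expected_surprise(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum() + 0.0)   # +0.0 avoids -0.0

certain = [1.0, 0.0, 0.0, 0.0]         # fully predictable environment
uniform = [0.25, 0.25, 0.25, 0.25]     # maximally uncertain environment
print(expected_surprise(certain))      # -> 0.0 (bits): no expected surprise
print(expected_surprise(uniform))      # -> 2.0 (bits): maximal uncertainty
```

    An agent in the second environment must, in the paper's terms, expend cerebral energy to reduce those two bits of expected surprise.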

  12. Application of Risk Management and Uncertainty Concepts and Methods for Ecosystem Restoration: Principles and Best Practice

    DTIC Science & Technology

    2012-08-01

    habitats for specific species of trout. The report noted that these uncertainties, and the SMEs who had past experience in such topic areas, were... reduce uncertainty in HREP projects is reflected in the completion of the Pool 11 Islands (UMRS RM 583-593) HREP in 2003. In 1989 the Browns Lake...

  13. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    NASA Astrophysics Data System (ADS)

    Grassi, Giacomo; Monni, Suvi; Federici, Sandro; Achard, Frederic; Mollicone, Danilo

    2008-07-01

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in the input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates, and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help in dealing with the uncertainties of the estimates of reduced emissions from deforestation.
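
    The conservativeness idea the paper builds on can be sketched in its simplest form; the discount rule and numbers below are a hypothetical illustration, not the actual IPCC/UNFCCC machinery, which ties discounts to confidence intervals.

```python
# Hedged sketch of conservativeness: credit only the part of an estimated
# emission reduction that survives its uncertainty, so that crediting errs
# on the side of under- rather than over-estimation. Numbers are made up.
def conservative_estimate(mean_reduction, rel_uncertainty):
    """Conservatively discounted reduction for a fractional uncertainty."""
    return mean_reduction * (1.0 - rel_uncertainty)

# Hypothetical case: 10 MtCO2 estimated reduction with 30% uncertainty.
print(f"{conservative_estimate(10.0, 0.30):.1f} MtCO2 creditable")
```

    Under such a rule, reducing the uncertainty of the estimate directly increases the creditable reduction, which is the incentive structure the paper discusses.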

  14. Mind the Gap: Exploring the Underground of the NASA Space Cancer Risk Model

    NASA Technical Reports Server (NTRS)

    Chappell, L. J.; Elgart, S. R.; Milder, C. M.; Shavers, M. R.; Semones, E. J.; Huff, J. L.

    2017-01-01

    The REID quantifies the lifetime risk of death from radiation-induced cancer in an exposed astronaut. The NASA Space Cancer Risk (NSCR) 2012 model incorporates elements from physics, biology, epidemiology, and statistics to generate the REID distribution. The current model quantifies the space radiation environment, radiation quality, and dose-rate effects to estimate a NASA-weighted dose. This weighted dose is mapped to the excess risk of radiation-induced cancer mortality from acute exposures to gamma rays and then transferred to an astronaut population. Finally, the REID is determined by integrating this risk over the individual's lifetime. The calculated upper 95% confidence limit of the REID is used to restrict an astronaut's permissible mission duration (PMD) for a proposed mission. As a statistical quantity characterized by broad, subjective uncertainties, REID estimates for space missions result in wide distributions. Currently, the upper 95% confidence level is over 350% larger than the mean REID value, which can severely limit an astronaut's PMD. The model incorporates inputs from multiple scientific disciplines in the risk estimation process. Physics and particle transport models calculate how radiation moves through space, penetrates spacecraft, and makes its way to the human beings onboard. Epidemiological studies of exposures from atomic bombings, medical treatments, and power plants are used to quantify health risks from acute and chronic low linear energy transfer (LET) ionizing radiation. Biological studies in cellular and animal models using radiation at various LETs and energies inform quality metrics for ions present in space radiation. Statistical methodologies unite these elements, controlling for mathematical and scientific uncertainty and variability. Despite current progress, these research platforms contain knowledge gaps contributing to the large uncertainties still present in the model. The NASA Space Radiation Program Element (SRPE
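
    The gap between a mean REID and its upper 95% confidence limit arises whenever the risk distribution is strongly right-skewed. The Monte Carlo sketch below illustrates the effect with a made-up lognormal distribution; the actual NSCR uncertainty propagation is far more involved, so the numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical skewed REID distribution (lognormal chosen only to
# illustrate skew; parameters are invented, not from the NSCR model).
reid = rng.lognormal(mean=np.log(0.02), sigma=0.8, size=100_000)

mean = reid.mean()
upper95 = np.percentile(reid, 95)
print(f"mean REID       : {mean:.4f}")
print(f"95th pct REID   : {upper95:.4f}")
print(f"upper limit exceeds mean by {100 * (upper95 - mean) / mean:.0f}%")
```

    Because mission limits are set against the upper percentile rather than the mean, narrowing the distribution (uncertainty reduction) directly extends permissible mission duration.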

  15. Optimal Wind Power Uncertainty Intervals for Electricity Market Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ying; Zhou, Zhi; Botterud, Audun

    It is important to select an appropriate uncertainty level of the wind power forecast for power system scheduling and electricity market operation. Traditional methods hedge against a predefined level of wind power uncertainty, such as a specific confidence interval or uncertainty set, which leaves open the question of how best to select the appropriate uncertainty level. To bridge this gap, this paper proposes a model to optimize the forecast uncertainty intervals of wind power for power system scheduling problems, with the aim of achieving the best trade-off between economics and reliability. We then reformulate and linearize the models into a mixed-integer linear program (MILP) without strong assumptions on the shape of the probability distribution. In order to investigate the impacts on cost, reliability, and prices in an electricity market, we apply the proposed model to a two-settlement electricity market based on a six-bus test system and on a power system representing the U.S. state of Illinois. The results show that the proposed method can not only help to balance the economics and reliability of power system scheduling, but also help to stabilize the energy prices in electricity market operation.
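
    The economics-reliability trade-off behind choosing an interval width can be sketched with a toy one-variable version of the problem (this is not the paper's MILP; the cost figures and the Gaussian error model are assumptions for illustration):

```python
import numpy as np
from scipy import stats

# Pick the half-width w of a symmetric forecast interval that minimizes
# expected cost = cost of reserving capacity over the interval
#               + expected penalty for deviations falling outside it.
sigma = 10.0      # std dev of wind forecast error (MW), assumed Gaussian
c_reserve = 1.0   # $/MW of reserved capacity (invented)
c_penalty = 20.0  # $/MW of uncovered deviation (invented)

def expected_cost(w):
    z = w / sigma
    # E[(|e| - w)+] for e ~ N(0, sigma), by symmetry of the normal:
    shortfall = 2.0 * (sigma * stats.norm.pdf(z) - w * stats.norm.sf(z))
    return c_reserve * 2.0 * w + c_penalty * shortfall

ws = np.linspace(0.0, 4.0 * sigma, 401)
best = ws[int(np.argmin([expected_cost(w) for w in ws]))]
coverage = stats.norm.cdf(best / sigma) - stats.norm.cdf(-best / sigma)
print(f"optimal half-width: {best:.1f} MW ({coverage:.0%} coverage)")
```

    The optimum satisfies P(|e| > w) = c_reserve / c_penalty, so the "right" confidence level falls out of the cost ratio rather than being fixed in advance, which is the intuition the paper formalizes in a scheduling context.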

  16. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Treesearch

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...
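
    The Maximum-Entropy Principle in this setting picks, among all distributions consistent with the scarce information, the least prejudiced one. A minimal sketch (the classic dice example, not the forest self-thinning model itself): given only a known mean, the maximum-entropy distribution has the Gibbs form p_k ∝ exp(λk), and λ can be found by bisection:

```python
import numpy as np

faces = np.arange(1, 7)   # support of the distribution
target_mean = 4.5         # the only available information

def mean_for(lam):
    w = np.exp(lam * faces)
    return (faces * w).sum() / w.sum()

lo, hi = -5.0, 5.0
for _ in range(100):              # bisection on the Lagrange multiplier
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

p = np.exp(lo * faces)
p /= p.sum()
print(np.round(p, 4), "mean =", round((faces * p).sum(), 3))
```

    With richer constraints (as in the self-thinning study) the same machinery yields parameter distributions from which confidence intervals can be derived.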

  17. Information theoretic quantification of diagnostic uncertainty.

    PubMed

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
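
    The two ingredients the essay combines, Bayes' rule for the post-test probability and entropy as a measure of diagnostic uncertainty, can be sketched as follows (the sensitivity, specificity, and pre-test probability are invented for illustration):

```python
import math

def post_test_prob(pre, sens, spec, positive=True):
    """Bayes' rule for a binary diagnostic test result."""
    if positive:
        return sens * pre / (sens * pre + (1 - spec) * (1 - pre))
    return (1 - sens) * pre / ((1 - sens) * pre + spec * (1 - pre))

def binary_entropy(p):
    """Diagnostic uncertainty in bits: 0 = certain, 1 = maximal."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

pre = 0.30   # assumed pre-test probability of disease
post = post_test_prob(pre, sens=0.90, spec=0.85)
print(f"post-test probability: {post:.3f}")
print(f"uncertainty before: {binary_entropy(pre):.3f} bits, "
      f"after: {binary_entropy(post):.3f} bits")
```

    Sweeping `pre` over a range instead of a point estimate, as the essay proposes, turns the second line of output into a band, making the sensitivity of the conclusion to the pre-test estimate explicit.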

  18. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    PubMed

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  19. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-Shui

    2015-06-01

    The Heisenberg uncertainty principle shows that no one can specify the values of the non-commuting canonically conjugated variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail.
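
    For the two-measurement case without quantum memory that the abstract takes as its starting point, the standard entropic bound is the Maassen-Uffink relation H(X) + H(Z) ≥ -log2 max|⟨x|z⟩|², which a few lines of NumPy can verify for a single qubit measured in the Pauli Z and X bases (an illustrative check, simpler than the multi-measurement setting of the paper):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 1e-12]
    return -(p * np.log2(p)).sum()

psi = np.array([np.cos(np.pi / 8), np.sin(np.pi / 8)])  # example pure state
z_basis = np.eye(2)
x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

pz = np.abs(z_basis @ psi) ** 2          # Z-measurement outcome probabilities
px = np.abs(x_basis @ psi) ** 2          # X-measurement outcome probabilities
overlap = max(np.abs(z_basis @ x_basis.T).ravel()) ** 2   # max |<x|z>|^2 = 1/2
bound = -np.log2(overlap)                                 # = 1 bit
print(f"H(Z) + H(X) = {shannon(pz) + shannon(px):.3f} >= bound {bound:.3f}")
```

    The paper's contribution is tightening bounds of this kind when more than two measurements are involved and the observer holds a quantum memory correlated with the system.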

  20. Understanding and applying principles of social cognition and ...

    EPA Pesticide Factsheets

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people’s decision making that cloud their judgment and create conflict. These systems must also satisfy people’s fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance. Social-ecological stressors place significant pressure on major societal systems, triggering adaptive reforms in human governance and environmental law. Though potentially benefici

  1. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  2. Energy band gaps in graphene nanoribbons with corners

    NASA Astrophysics Data System (ADS)

    Szczȩśniak, Dominik; Durajski, Artur P.; Khater, Antoine; Ghader, Doried

    2016-05-01

    In the present paper, we study the relation between the band gap size and the corner-corner length in representative chevron-shaped graphene nanoribbons (CGNRs) with 120° and 150° corner edges. The direct physical insight into the electronic properties of CGNRs is provided within the tight-binding model with phenomenological edge parameters, developed against recent first-principles results. We show that the analyzed CGNRs exhibit an inverse relation between their band gaps and corner-corner lengths, and that they do not present a metal-insulator transition when the chemical edge modifications are introduced. Our results also suggest that the band gap width for the CGNRs is predominantly governed by the armchair edge effects, and is tunable through edge modification with foreign atoms.

  3. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze each source separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model and represents the combined impact of all uncertain factors on its spatial structure. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the mechanism of uncertainty propagation in geological modeling.
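
    The core Bayesian step of folding a new uncertainty source into a posterior can be sketched with the simplest conjugate case (a made-up example: a normal prior on a layer depth, representing cognitive/geological knowledge, updated with noisy borehole measurements representing data error; the paper's framework chains such updates across sources):

```python
import numpy as np

mu0, var0 = 50.0, 25.0                 # prior: depth ~ N(50 m, 5^2)
obs = np.array([46.0, 47.5, 48.2])     # borehole measurements (m)
var_e = 4.0                            # measurement noise variance (2^2)

# Conjugate normal-normal update: precisions add, means are precision-weighted.
n = len(obs)
post_var = 1.0 / (1.0 / var0 + n / var_e)
post_mu = post_var * (mu0 / var0 + obs.sum() / var_e)
print(f"posterior: N({post_mu:.2f}, {post_var:.2f})")
```

    Each integration step shrinks the variance, which is the sense in which the gradual integration of sources "simulates" uncertainty propagation through the model.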

  4. The precautionary principle and pharmaceutical risk management.

    PubMed

    Callréus, Torbjörn

    2005-01-01

    Although it is often vigorously contested and has several different formulations, the precautionary principle has in recent decades guided environmental policy making in the face of scientific uncertainty. Originating from a criticism of traditional risk assessment, the key element of the precautionary principle is the justification for acting in the face of uncertain knowledge about risks. In the light of its growing invocation in various areas that are related to public health and recently in relation to drug safety issues, this article presents an introductory review of the main elements of the precautionary principle and some arguments conveyed by its advocates and opponents. A comparison of the characteristics of pharmaceutical risk management and environmental policy making (i.e. the setting within which the precautionary principle evolved), indicates that several important differences exist. If believed to be of relevance, in order to avoid arbitrary and unpredictable decision making, both the interpretation and possible application of the precautionary principle need to be adapted to the conditions of pharmaceutical risk management.

  5. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art), and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  6. Enhancing the Therapy Experience Using Principles of Video Game Design.

    PubMed

    Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison

    2016-02-01

    This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.

  7. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is composed of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits, therefore the uncertainties in risk projections become a major safety concern and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age and gender specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  8. Sizable band gap in organometallic topological insulator

    NASA Astrophysics Data System (ADS)

    Derakhshan, V.; Ketabi, S. A.

    2017-01-01

    Based on first-principles calculations adopting Ceperley-Alder (LSDA) and Perdew-Burke-Ernzerhof (GGA) type exchange-correlation energy functionals, the electronic properties of an organometallic honeycomb lattice as a two-dimensional topological insulator were calculated. In the presence of spin-orbit interaction, the bulk band gaps of organometallic lattices with heavy metals such as Au, Hg, Pt and Tl atoms were investigated. Our results show that the organometallic topological insulator made with mercury atoms exhibits a wide bulk band gap of about 120 meV. Moreover, by fitting the conduction and valence bands to the band structure produced by Density Functional Theory, spin-orbit interaction parameters were extracted. Based on the calculated parameters, gapless edge states within the bulk insulating gap are indeed found for finite-width strips of two-dimensional organometallic topological insulators.

  9. Chemical carcinogens: a review of the science and its associated principles. U.S. Interagency Staff Group on Carcinogens.

    PubMed Central

    1986-01-01

    In order to articulate a view of chemical carcinogenesis that scientists generally hold in common today, and to draw upon this understanding to compose guiding principles that the regulatory agencies can use as a basis for guidelines for assessing carcinogenic risk under the specific requirements of the legislative acts they are charged to implement, the Office of Science and Technology Policy (Executive Office, the White House) drew on the expertise of a number of regulatory agencies to elucidate present scientific views in critical areas of the major disciplines important to the process of risk assessment. The document is composed of two major sections, Principles and the State-of-the-Science. The latter consists of subsections on the mechanisms of carcinogenesis, short-term and long-term testing, and epidemiology, which are important components in the risk assessment step of hazard identification. These subsections are followed by one on exposure assessment, and a final section which includes analyses of dose-response (hazard) assessment and risk characterization. The principles are derived from considerations in each of the subsections. Because of present gaps in understanding, the principles contain judgmental (science policy) decisions on major unresolved issues as well as statements of what is generally accepted as fact. These judgments are basically assumptions which are responsible for much of the uncertainty in the process of risk assessment. There was an attempt to clearly distinguish policy and fact. The subsections of the State-of-the-Science portion provide the underlying support to the principles articulated, and to read the "Principles" section without a full appreciation of the State-of-the-Science section is to invite oversimplification and misinterpretation. Finally, suggestions are made for future research efforts which will improve the process of risk assessment. PMID:3530737

  10. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    PubMed Central

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of the non-commuting canonically conjugated variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail. PMID:26118488

  11. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement of many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten

  12. Measuring uncertainty by extracting fuzzy rules using rough sets and extracting fuzzy rules under uncertainty and measuring definability using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.; Culas, Donald E.

    1991-01-01

    Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide the possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much we believe each rule is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
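
    The rough-set machinery behind "certain" versus "possible" rules can be sketched in a few lines: certain rules come from the lower approximation of a target set (equivalence classes entirely inside it), possible rules from the upper approximation (classes that merely intersect it). The universe, attribute, and target below are invented for illustration:

```python
def partition_blocks(universe, equiv):
    """Group objects into equivalence (indiscernibility) classes."""
    blocks = {}
    for x in universe:
        blocks.setdefault(equiv(x), set()).add(x)
    return list(blocks.values())

def approximations(universe, equiv, target):
    blocks = partition_blocks(universe, equiv)
    lower = set().union(*([b for b in blocks if b <= target] or [set()]))
    upper = set().union(*([b for b in blocks if b & target] or [set()]))
    return lower, upper

U = {1, 2, 3, 4, 5, 6}
target = {1, 2, 4}                 # objects carrying the diagnosis
equiv = lambda x: x % 3            # indiscernibility by an observed attribute
lower, upper = approximations(U, equiv, target)
print("certainly in:", lower, " possibly in:", upper)
print("definability:", len(lower) / len(upper))   # accuracy of approximation
```

    The ratio |lower| / |upper| is one standard rough-set measure of how definable the diagnosis is in terms of the observed attributes, mirroring the definability measure studied in the paper.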

  13. Fundamental gaps with approximate density functionals: The derivative discontinuity revealed from ensemble considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraisler, Eli; Kronik, Leeor

    2014-05-14

    The fundamental gap is a central quantity in the electronic structure of matter. Unfortunately, the fundamental gap is not generally equal to the Kohn-Sham gap of density functional theory (DFT), even in principle. The two gaps differ precisely by the derivative discontinuity, namely, an abrupt change in slope of the exchange-correlation energy as a function of electron number, expected across an integer-electron point. Popular approximate functionals are thought to be devoid of a derivative discontinuity, strongly compromising their performance for prediction of spectroscopic properties. Here we show that, in fact, all exchange-correlation functionals possess a derivative discontinuity, which arises naturally from the application of ensemble considerations within DFT, without any empiricism. This derivative discontinuity can be expressed in closed form using only quantities obtained in the course of a standard DFT calculation of the neutral system. For small, finite systems, addition of this derivative discontinuity indeed results in a greatly improved prediction for the fundamental gap, even when based on the most simple approximate exchange-correlation density functional – the local density approximation (LDA). For solids, the same scheme is exact in principle, but when applied to LDA it results in a vanishing derivative discontinuity correction. This failure is shown to be directly related to the failure of LDA in predicting fundamental gaps from total energy differences in extended systems.

  14. [Theoretical risk management and legitimacy of the precautionary principle in medicine. Look back at HIV contamination through blood transfusion in France, twenty years ago].

    PubMed

    Moutel, G; Hergon, E; Duchange, N; Bellier, L; Rouger, P; Hervé, C

    2005-02-01

    The precautionary principle first appeared in France during the health crisis following the contamination of patients with HIV via blood transfusion. This study analyses whether the risk associated with blood transfusion was taken into account early enough considering the context of scientific uncertainty between 1982 and 1985. The aim was to evaluate whether a precautionary principle was applied and whether it was relevant. First, we investigated the context of scientific uncertainty and controversies prevailing between 1982 and 1985. Then we analysed the attitude and decisions of the French authorities in this situation to determine whether a principle of precaution was applied. Finally, we explored the reasons at the origin of the delay in controlling the risk. Despite the scientific uncertainties associated with the potential risk of HIV contamination by transfusion in 1983, we found that a list of recommendations aiming to reduce this risk was published in June of that year. In the prevailing climate of uncertainty, these measures could be seen as precautionary. However, the recommended measures were not widely applied. Cultural, structural and economic factors hindered their implementation. Our analysis provides insight into the use of precautionary principle in the domain of blood transfusion and, more generally, medicine. It also sheds light on the expectations that health professionals should have of this principle. The aim of the precautionary principle is to manage rather than to reduce scientific uncertainty. The principle is not a futile search for zero risk. Rather, it is a principle for action allowing precautionary measures to be taken. However, we show that these measures must appear legitimate to be applied. This legitimacy requires an adapted decision-making process, involving all those concerned in the management of collective risks.

  15. Uncertainty relations as Hilbert space geometry

    NASA Technical Reports Server (NTRS)

    Braunstein, Samuel L.

    1994-01-01

Precision measurements involve the accurate determination of parameters through repeated measurements of identically prepared experimental setups. For many parameters there is a 'natural' choice for the quantum observable which is expected to give optimal information; and from this observable one can construct a Heisenberg uncertainty principle (HUP) bound on the precision attainable for the parameter. However, the classical statistics of multiple sampling directly gives us tools to construct bounds for the precision available for the parameters of interest (even when no obvious natural quantum observable exists, such as for phase, or time); it is found that these direct bounds are more restrictive than those of the HUP. The implication is that the natural quantum observables typically do not encode the optimal information (even for observables such as position, and momentum); we show how this can be understood simply in terms of the Hilbert space geometry. Another striking feature of these bounds to parameter uncertainty is that for a large enough number of repetitions of the measurements all quantum states are 'minimum uncertainty' states - not just Gaussian wave-packets. Thus, these bounds tell us what precision is achievable, not merely what is allowed.
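
    The multiple-sampling statistics invoked above can be illustrated with a short numerical sketch: averaging N repeated noisy measurements shrinks the estimator's spread as 1/sqrt(N), the classical sampling behavior that the direct bounds build on. This is a generic Gaussian toy model, not the paper's derivation:

    ```python
    import numpy as np

    # Toy model: each repetition measures a parameter (true value 0.0)
    # with Gaussian single-shot spread sigma. Averaging N repetitions
    # shrinks the estimator's spread as sigma / sqrt(N).
    rng = np.random.default_rng(0)
    sigma = 1.0

    for N in (10, 100, 1000):
        # 5000 simulated experiments, each averaging N samples
        estimates = rng.normal(0.0, sigma, size=(5000, N)).mean(axis=1)
        print(f"N={N:5d}  empirical spread={estimates.std():.4f}  "
              f"predicted={sigma / np.sqrt(N):.4f}")
    ```

    The empirical spread tracks the sigma/sqrt(N) prediction, which is the scaling that parameter-estimation bounds such as the Cramér-Rao bound formalize.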

  16. Air Pollution and Health: Bridging the Gap from Health Outcomes: Conference Summary

    EPA Science Inventory

    “Air Pollution and Health: Bridging the Gap from Sources to Health Outcomes,” an international specialty conference sponsored by the American Association for Aerosol Research, was held to address key uncertainties in our understanding of adverse health effects related to air po...

  17. Defense Resource Planning Under Uncertainty: An Application of Robust Decision Making to Munitions Mix Planning

    DTIC Science & Technology

    2016-02-01

In addition, the parser updates some parameters based on uncertainties. For example, Analytica was very slow to update Pk values based on ... moderate range. The additional security environments helped to fill gaps in lower severity. Weapons Effectiveness Pk values were modified to account for two ... project is to help improve the value and character of defense resource planning in an era of growing uncertainty and complex strategic challenges.

  18. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.

  19. Theoretical study of band gap in CuAlO2: Pressure dependence and self-interaction correction

    NASA Astrophysics Data System (ADS)

    Nakanishi, Akitaka; Katayama-Yoshida, Hiroshi

    2012-08-01

By using first-principles calculations, we studied the energy gaps of delafossite CuAlO2: (1) pressure dependence and (2) self-interaction correction (SIC). Our simulation shows that CuAlO2 transforms from a delafossite structure to a leaning delafossite structure at 60 GPa. The energy gap of CuAlO2 increases through the structural transition due to the enhanced covalency of the Cu 3d and O 2p states. We implemented a self-interaction correction into our first-principles calculation code to go beyond the local density approximation and applied it to CuAlO2. The energy gap calculated with the SIC is close to the experimental data, while the one calculated without the SIC is about 1 eV smaller.

  20. Assembling GHERG: Could "academic crowd-sourcing" address gaps in global health estimates?

    PubMed

    Rudan, Igor; Campbell, Harry; Marušić, Ana; Sridhar, Devi; Nair, Harish; Adeloye, Davies; Theodoratou, Evropi; Chan, Kit Yee

    2015-06-01

In recent months, the World Health Organization (WHO), independent academic researchers, and the Lancet and PLoS Medicine journals worked together to improve the reporting of population health estimates. The new guidelines for accurate and transparent health estimates reporting (likely to be named GATHER), which are eagerly awaited, represent a helpful move that should benefit the field of global health metrics. Building on this progress and drawing on the successful work model of the Child Health Epidemiology Reference Group (CHERG), we would like to propose a new initiative - the "Global Health Epidemiology Reference Group" (GHERG). We see GHERG as an informal and entirely voluntary international collaboration of academic groups who are willing to contribute to improving disease burden estimates and to respect the principles of the new guidelines - a form of "academic crowd-sourcing". The main focus of GHERG will be to identify the "gap areas" where little information is available and/or where there is considerable uncertainty about the accuracy of the existing estimates. This approach should serve to complement the existing WHO and IHME estimates and to represent added value to both efforts.

  1. Global Surface Temperature Change and Uncertainties Since 1861

    NASA Technical Reports Server (NTRS)

    Shen, Samuel S. P.; Lau, William K. M. (Technical Monitor)

    2002-01-01

The objective of this talk is to analyze the warming trend of the global and hemispheric surface temperatures and its uncertainties. Using a statistical optimal averaging scheme, the land surface air temperature and sea surface temperature observational data are used to compute the spatial-average annual mean surface air temperature. The optimal averaging method is derived from the minimization of the mean square error between the true and estimated averages and uses the empirical orthogonal functions. The method can accurately estimate the errors of the spatial average due to observational gaps and random measurement errors. In addition, three independent uncertainty factors are quantified: urbanization, changes in the in situ observational practices, and sea surface temperature data corrections. Based on these uncertainties, the best linear fit to annual global surface temperature gives an increase of 0.61 +/- 0.16 C between 1861 and 2000. This lecture will also touch on the impact of global change on nature and the environment, as well as the latest assessment methods for the attribution of global change.
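
    The quoted trend and error bar come from a least-squares linear fit; a minimal sketch of such a fit on synthetic anomaly data follows. The numbers are illustrative assumptions, not the actual temperature record or the optimal averaging method:

    ```python
    import numpy as np

    # Synthetic annual anomalies: a 0.005 C/yr trend plus noise, standing in
    # for the observational record (illustration only, not the real data).
    rng = np.random.default_rng(1)
    years = np.arange(1861, 2001)
    anomalies = 0.005 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

    # Ordinary least-squares fit; np.polyfit returns [slope, intercept].
    slope, intercept = np.polyfit(years, anomalies, 1)

    # Standard error of the slope: residual spread / sqrt(sum((x - mean)^2))
    resid = anomalies - (slope * years + intercept)
    se = resid.std(ddof=2) / (np.sqrt(years.size) * years.std())
    total_rise = slope * (years[-1] - years[0])
    print(f"rise: {total_rise:.2f} C over {years[-1] - years[0]} yr "
          f"(slope standard error {se:.1e} C/yr)")
    ```

    Multiplying the fitted slope and its standard error by the record length gives a total rise with an error bar, the same form as the 0.61 +/- 0.16 C figure above.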

  2. Thermal Conductivity and Large Isotope Effect in GaN from First Principles

    DTIC Science & Technology

    2012-08-28

August 2012) We present atomistic first principles results for the lattice thermal conductivity of GaN and compare them to those for GaP, GaAs, and GaSb ... weak scattering results from stiff atomic bonds and the large Ga to N mass ratio, which give phonons high frequencies and also a pronounced energy gap ... 66.70.f, 63.20.kg, 71.15.m Introduction.—Gallium nitride (GaN) is a wide band gap semiconductor and a promising candidate for use in optoelectronic

  3. Socialization of Young Children: Successful Principles and Models.

    ERIC Educational Resources Information Center

    Schieser, Hans A.

    This discussion focuses on several principles and models of early childhood education which have been used to improve children's ability to live and work with others. Modern problems of socialization in early childhood are discussed in terms of the generation gap in industrial societies, the development of the "private" family with…

  4. Optimizations for optical velocity measurements in narrow gaps

    NASA Astrophysics Data System (ADS)

    Schlüßler, Raimund; Blechschmidt, Christian; Czarske, Jürgen; Fischer, Andreas

    2013-09-01

    Measuring the flow velocity in small gaps or near a surface with a nonintrusive optical measurement technique is a challenging measurement task, as disturbing light reflections from the surface appear. However, these measurements are important, e.g., in order to understand and to design the leakage flow in the tip gap between the rotor blade end face and the housing of a turbomachine. Hence, methods to reduce the interfering light power and to correct measurement errors caused by it need to be developed and verified. Different alternatives of minimizing the interfering light power for optical flow measurements in small gaps are presented. By optimizing the beam shape of the applied illumination beam using a numerical diffraction simulation, the interfering light power is reduced by up to a factor of 100. In combination with a decrease of the reflection coefficient of the rotor blade surface, an additional reduction of the interfering light power below the used scattered light power is possible. Furthermore, a correction algorithm to decrease the measurement uncertainty of disturbed measurements is derived. These improvements enable optical three-dimensional three-component flow velocity measurements in submillimeter gaps or near a surface.

  5. Capturing total chronological and spatial uncertainties in palaeo-ice sheet reconstructions: the DATED example

    NASA Astrophysics Data System (ADS)

    Hughes, Anna; Gyllencreutz, Richard; Mangerud, Jan; Svendsen, John Inge

    2017-04-01

Glacial geologists generate empirical reconstructions of former ice-sheet dynamics by combining evidence from the preserved record of glacial landforms (e.g. end moraines, lineations) and sediments with chronological evidence (mainly numerical dates derived from radiocarbon, exposure and luminescence techniques). However, the geomorphological and sedimentological footprints and the chronological data are incomplete records in both space and time, and all have multiple types of uncertainty associated with them. To understand ice sheets' response to climate we need numerical models of ice-sheet dynamics based on physical principles. To test and/or constrain such models, empirical reconstructions of past ice sheets that capture and acknowledge all uncertainties are required. In 2005 we started a project (Database of the Eurasian Deglaciation, DATED) to produce an empirical reconstruction of the evolution of the last Eurasian ice sheets (including the British-Irish, Scandinavian and Svalbard-Barents-Kara Seas ice sheets) that is fully documented, specified in time, and includes uncertainty estimates. Over 5000 dates relevant to constraining ice build-up and retreat were assessed for reliability and used together with published ice-sheet margin positions based on glacial geomorphology to reconstruct time-slice maps of the ice sheets' extent. The DATED maps show synchronous ice margins with maximum-minimum uncertainty bounds for every 1000 years between 25 and 10 kyr ago. In the first version of results (DATED-1; Hughes et al. 2016) all uncertainties (both quantitative and qualitative, e.g. precision and accuracy of numerical dates, correlation of moraines, stratigraphic interpretations) were combined based on our best glaciological-geological assessment and expressed in terms of distance as a 'fuzzy' margin. Large uncertainties (>100 km) exist, predominantly across marine sectors and other locations where there are spatial gaps in the dating record (e.g. the ...

  6. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.
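
    The Bayesian estimation step referred to above combines a prior belief about the target with noisy sensory evidence by precision weighting. The sketch below shows the standard Gaussian update, a textbook formulation rather than the authors' specific model:

    ```python
    def bayes_update(mu_prior, var_prior, obs, var_obs):
        """Posterior mean and variance for a Gaussian prior and likelihood.

        The posterior mean is a precision-weighted average: the noisier the
        observation, the more the estimate stays with the prior.
        """
        w = var_obs / (var_prior + var_obs)            # weight on the prior
        mu_post = w * mu_prior + (1.0 - w) * obs
        var_post = 1.0 / (1.0 / var_prior + 1.0 / var_obs)
        return mu_post, var_post

    # High prior (target) uncertainty: the observation dominates the estimate.
    print(bayes_update(0.0, 4.0, 1.0, 1.0))    # mean pulled toward the obs
    # Low prior uncertainty: the estimate stays near the prior.
    print(bayes_update(0.0, 0.25, 1.0, 1.0))
    ```

    The weighting illustrates why estimates, and hence corrections built on them, depend systematically on how uncertain the target is.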

  7. Participatory Development Principles and Practice: Reflections of a Western Development Worker.

    ERIC Educational Resources Information Center

    Keough, Noel

    1998-01-01

    Principles for participatory community development are as follows: humility and respect; power of local knowledge; democratic practice; diverse ways of knowing; sustainability; reality before theory; uncertainty; relativity of time and efficiency; holistic approach; and decisions rooted in the community. (SK)

  8. Quantitative Feedback Technique (QFT): Bridging the Gap

    DTIC Science & Technology

    2003-05-01

with Eq. (2) illustrates: (a) the effect of changes of the uncertainty set P(s) upon the output of the closed-loop control system is reduced by the ... Bridging the Gap root-locus technique the dominant closed-loop poles are determined for a ζ = 0.45. Table 3 presents the required value of Kx and ... degree of decoupling will have been enhanced. Method 1 is then more readily applicable, with the additional benefit of reduced closed-loop BW. E.R.2

  9. Strain-induced band-gap engineering of graphene monoxide and its effect on graphene

    NASA Astrophysics Data System (ADS)

    Pu, H. H.; Rhim, S. H.; Hirschmugl, C. J.; Gajdardziska-Josifovska, M.; Weinert, M.; Chen, J. H.

    2013-02-01

Using first-principles calculations we demonstrate the feasibility of band-gap engineering in two-dimensional crystalline graphene monoxide (GMO), a recently reported graphene-based material with a 1:1 carbon/oxygen ratio. The band gap of GMO, which can be switched between direct and indirect, is tunable over a large range (0-1.35 eV) for accessible strains. Electron and hole transport occurs predominantly along the zigzag and armchair directions (armchair for both) when GMO is a direct- (indirect-) gap semiconductor. A band gap of ~0.5 eV is also induced in graphene at the K' points for GMO/graphene hybrid systems.

  10. Gapped electronic structure of epitaxial stanene on InSb(111)

    DOE PAGES

    Xu, Cai-Zhi; Chan, Yang-Hao; Chen, Peng; ...

    2018-01-11

We report that stanene (single-layer gray tin), with an electronic structure akin to that of graphene but exhibiting a much larger spin-orbit gap, offers a promising platform for room-temperature electronics based on the quantum spin Hall (QSH) effect. This material has received much theoretical attention, but a suitable substrate for stanene growth that results in an overall gapped electronic structure has been elusive; a sizable gap is necessary for room-temperature applications. Here, we report a study of stanene, epitaxially grown on the (111)B-face of indium antimonide (InSb). Angle-resolved photoemission spectroscopy measurements reveal a gap of 0.44 eV, in agreement with our first-principles calculations. Lastly, the results indicate that stanene on InSb(111) is a strong contender for electronic QSH applications.

  11. Gapped electronic structure of epitaxial stanene on InSb(111)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Cai-Zhi; Chan, Yang-Hao; Chen, Peng

We report that stanene (single-layer gray tin), with an electronic structure akin to that of graphene but exhibiting a much larger spin-orbit gap, offers a promising platform for room-temperature electronics based on the quantum spin Hall (QSH) effect. This material has received much theoretical attention, but a suitable substrate for stanene growth that results in an overall gapped electronic structure has been elusive; a sizable gap is necessary for room-temperature applications. Here, we report a study of stanene, epitaxially grown on the (111)B-face of indium antimonide (InSb). Angle-resolved photoemission spectroscopy measurements reveal a gap of 0.44 eV, in agreement with our first-principles calculations. Lastly, the results indicate that stanene on InSb(111) is a strong contender for electronic QSH applications.

  12. The optimisation approach of ALARA in nuclear practice: an early application of the precautionary principle. Scientific uncertainty versus legal uncertainty.

    PubMed

    Lierman, S; Veuchelen, L

    2005-01-01

The late health effects of exposure to low doses of ionising radiation are subject to scientific controversy: one view finds threats of high cancer incidence exaggerated, while the other view thinks the effects are underestimated. Both views have good scientific arguments in their favour. The nuclear field, in both industry and medicine, has had to deal with this controversy for many decades. One can argue that the optimisation approach of keeping effective doses as low as reasonably achievable, taking economic and social factors into account (ALARA), is a precautionary approach. However, because of these stochastic effects, no scientific proof can be provided. This paper explores how ALARA and the Precautionary Principle are influential in the legal field, and in particular in tort law, because liability should be a strong incentive for safer behaviour. This so-called "deterrence effect" of liability seems to evaporate in today's technical and highly complex society, in particular when dealing with the late health effects of low doses of ionising radiation. Two main issues will be dealt with in the paper: 1. How are the health risks attributable to "low doses" of radiation regulated in nuclear law, and what lessons can be learned from the field of radiation protection? 2. What can ALARA contribute to the discussion of the Precautionary Principle, and vice versa, in particular as far as legal sanctions and liability are concerned? It will be shown that the Precautionary Principle has not yet been sufficiently implemented into nuclear law.

  13. Revealing a quantum feature of dimensionless uncertainty in linear and quadratic potentials by changing potential intervals

    NASA Astrophysics Data System (ADS)

    Kheiri, R.

    2016-09-01

As an undergraduate exercise, in an article (2012 Am. J. Phys. 80 780-14), quantum and classical uncertainties for dimensionless variables of position and momentum were evaluated in three potentials: infinite well, bouncing ball, and harmonic oscillator. While the original quantum uncertainty products depend on ℏ and the number of states (n), a dimensionless approach makes the comparison between quantum uncertainty and classical dispersion possible by excluding ℏ. But the question is whether the uncertainty still remains dependent on the quantum number n. In the above-mentioned article there lies a contrast: on the one hand, the dimensionless quantum uncertainty of the potential box approaches classical dispersion only in the limit of large quantum numbers (n → ∞), consistent with the correspondence principle. On the other hand, similar evaluations for the bouncing ball and harmonic oscillator potentials are equal to their classical counterparts independent of n. This equality may hide the quantum feature of low energy levels. In the current study, we change the potential intervals in order to make them symmetric for the linear potential and non-symmetric for the quadratic potential. As a result, it is shown in this paper that the dimensionless quantum uncertainty of these potentials in the new potential intervals is expressed in terms of the quantum number n. In other words, the uncertainty requires the correspondence principle in order to approach the classical limit. Therefore, it can be concluded that the dimensionless analysis, as a useful pedagogical method, does not take away the quantum feature of the n-dependence of quantum uncertainty in general. Moreover, our numerical calculations include the higher powers of the position for the potentials.
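
    For the infinite-well (potential box) case, the n-dependence mentioned above is easy to check numerically. In the dimensionless position ξ = x/L, the quantum variance in state n is 1/12 - 1/(2π²n²), a standard textbook result assumed here rather than taken from the paper; it approaches the classical uniform-distribution value 1/12 only as n → ∞:

    ```python
    import math

    def var_xi(n):
        """Dimensionless position variance <xi^2> - <xi>^2 for state n
        of an infinite square well, with xi = x/L."""
        return 1.0 / 12.0 - 1.0 / (2.0 * math.pi**2 * n**2)

    CLASSICAL = 1.0 / 12.0  # variance of a uniform distribution on [0, 1]

    for n in (1, 2, 10, 100):
        print(f"n={n:4d}  quantum={var_xi(n):.6f}  "
              f"deficit={CLASSICAL - var_xi(n):.2e}")
    ```

    The deficit falls off as 1/n², so the classical dispersion is recovered only in the correspondence limit, which is the behavior the paper contrasts with the bouncing-ball and harmonic-oscillator cases.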

  14. Band Gap Engineering of Titania Systems Purposed for Photocatalytic Activity

    NASA Astrophysics Data System (ADS)

    Thurston, Cameron

Ab initio computer-aided design drastically increases the candidate population for highly specified material discovery and selection. These simulations, carried out through a first-principles computational approach, accurately extrapolate material properties and behavior. Titanium dioxide (TiO2) is one such material that stands to gain a great deal from the use of these simulations. In its anatase form, titania (TiO2) has been found to exhibit a band gap nearing 3.2 eV. If titania is to become a viable alternative to other contemporary photoactive materials exhibiting band gaps better suited for the solar spectrum, then the band gap must be subsequently reduced. To lower the energy needed for electronic excitation, both transition metals and non-metals have been extensively researched and are currently viable candidates for the continued reduction of titania's band gap. The introduction of multicomponent atomic doping introduces new energy bands which tend to both reduce the band gap and recombination loss. Ta-N, Nb-N, V-N, Cr-N, Mo-N, and W-N substitutions were studied in titania and subsequent energy and band gap calculations show a favorable band gap reduction in the case of passivated systems.

  15. Irreproducibility in Preclinical Biomedical Research: Perceptions, Uncertainties, and Knowledge Gaps.

    PubMed

    Jarvis, Michael F; Williams, Michael

    2016-04-01

Concerns regarding the reliability of biomedical research outcomes were precipitated by two independent reports from the pharmaceutical industry that documented a lack of reproducibility in preclinical research in the areas of oncology, endocrinology, and hematology. Given their potential impact on public health, these concerns have been extensively covered in the media. Assessing the magnitude and scope of irreproducibility is limited by the anecdotal nature of the initial reports and a lack of quantitative data on specific failures to reproduce published research. Nevertheless, remediation activities have focused on needed enhancements in transparency and consistency in the reporting of experimental methodologies and results. While such initiatives can effectively bridge knowledge gaps and facilitate best practices across established and emerging research disciplines and therapeutic areas, concerns remain on how these improve on the historical process of independent replication in validating research findings and their potential to inhibit scientific innovation.

  16. Theoretical investigations on diamondoids (CnHm, n = 10-41): Nomenclature, structural stabilities, and gap distributions

    NASA Astrophysics Data System (ADS)

    Wang, Ya-Ting; Zhao, Yu-Jun; Liao, Ji-Hai; Yang, Xiao-Bao

    2018-01-01

    Combining the congruence check and the first-principles calculations, we have systematically investigated the structural stabilities and gap distributions of possible diamondoids (CnHm) with the carbon numbers (n) from 10 to 41. A simple method for the nomenclature is proposed, which can be used to distinguish and screen the candidates with high efficiency. Different from previous theoretical studies, the possible diamondoids can be enumerated according to our nomenclature, without any pre-determination from experiments. The structural stabilities and electronic properties have been studied by density functional based tight binding and first-principles methods, where a nearly linear correlation is found between the energy gaps obtained by these two methods. According to the formation energy of structures, we have determined the stable configurations as a function of chemical potential. The maximum and minimum energy gaps are found to be dominated by the shape of diamondoids for clusters with a given number of carbon atoms, while the gap decreases in general as the size increases due to the quantum confinement.

  17. The inconstant "principle of constancy".

    PubMed

    Kanzer, M

    1983-01-01

The antecedents of the principle of constancy, especially in relation to the teachings and influence of J. F. Herbart (1776-1841), do much to bridge the gap between psychological and neurophysiological aspects of Freud's ideas about constancy and its associated doctrine, psychic determinism. Freud's later teachings about the Nirvana principle and Eros suggest a continuum of "constancies" embodied in the structural and functional development of the mental apparatus as it evolves from primal unity with the environment (e.g., the mother-child unit) and differentiates in patterns that organize the inner and outer worlds in relation to each other.

  18. Designing Phononic Crystals with Wide and Robust Band Gaps

    NASA Astrophysics Data System (ADS)

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang; Wang, Lifeng

    2018-04-01

Phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter being dominant), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the randomness effect of each design parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offer great potential in designing flexible and deformable phononic devices.

  19. Principles for high-quality, high-value testing.

    PubMed

    Power, Michael; Fell, Greg; Wright, Michael

    2013-02-01

A survey of doctors working in two large NHS hospitals identified over 120 laboratory tests, imaging investigations and investigational procedures that they considered to be overused. A common suggestion in this survey was that more training was required, and this prompted the development of a list of core principles for high-quality, high-value testing. The list can be used as a framework for training and as a reference source. The core principles are: (1) Base testing practices on the best available evidence. (2) Apply the evidence on test performance with careful judgement. (3) Test efficiently. (4) Consider the value (and affordability) of a test before requesting it. (5) Be aware of the downsides and drivers of overdiagnosis. (6) Confront uncertainties. (7) Be patient-centred in your approach. (8) Consider ethical issues. (9) Be aware of normal cognitive limitations and biases when testing. (10) Follow the 'knowledge journey' when teaching and learning these core principles.

  20. Three Principles to REVISE People's Unethical Behavior.

    PubMed

    Ayal, Shahar; Gino, Francesca; Barkan, Rachel; Ariely, Dan

    2015-11-01

Dishonesty and unethical behavior are widespread in the public and private sectors and cause immense annual losses. For instance, estimates of U.S. annual losses indicate $1 trillion paid in bribes, $270 billion lost due to unreported income, and $42 billion lost in retail due to shoplifting and employee theft. In this article, we draw on insights from the growing fields of moral psychology and behavioral ethics to present a three-principle framework we call REVISE. This framework classifies forces that affect dishonesty into three main categories and then redirects those forces to encourage moral behavior. The first principle, reminding, emphasizes the effectiveness of subtle cues that increase the salience of morality and decrease people's ability to justify dishonesty. The second principle, visibility, aims to restrict anonymity, prompt peer monitoring, and elicit responsible norms. The third principle, self-engagement, increases people's motivation to maintain a positive self-perception as a moral person and helps bridge the gap between moral values and actual behavior. The REVISE framework can guide the design of policy interventions to defeat dishonesty.

  1. Bayesian-information-gap decision theory with an application to CO2 sequestration

    DOE PAGES

    O'Malley, D.; Vesselinov, V. V.

    2015-09-04

Decisions related to subsurface engineering problems such as groundwater management, fossil fuel production, and geologic carbon sequestration are frequently challenging because of an overabundance of uncertainties (related to conceptualizations, parameters, observations, etc.). Because of the importance of these problems to agriculture, energy, and the climate (respectively), good decisions that are scientifically defensible must be made despite the uncertainties. We describe a general approach to making decisions for challenging problems such as these in the presence of severe uncertainties that combines probabilistic and non-probabilistic methods. The approach uses Bayesian sampling to assess parametric uncertainty and Information-Gap Decision Theory (IGDT) to address model inadequacy. The combined approach also resolves an issue that frequently arises when applying Bayesian methods to real-world engineering problems related to the enumeration of possible outcomes. In the case of zero non-probabilistic uncertainty, the method reduces to a Bayesian method. Lastly, to illustrate the approach, we apply it to a site-selection decision for geologic CO2 sequestration.
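
    The info-gap component of the approach can be sketched in a few lines: the robustness of a decision is the largest uncertainty horizon alpha for which the worst-case outcome still meets a critical requirement. This is a toy linear reward model with made-up numbers, not the paper's CO2 application:

    ```python
    def robustness(q, u_nominal, r_crit, alpha_max=10.0, steps=10001):
        """Largest uncertainty horizon alpha such that the worst-case reward
        over |u - u_nominal| <= alpha still satisfies reward >= r_crit.

        Toy model: reward(q, u) = q * u is increasing in u, so the worst
        case at horizon alpha is u = u_nominal - alpha.
        """
        best = 0.0
        for i in range(steps):
            alpha = alpha_max * i / (steps - 1)
            worst_reward = q * (u_nominal - alpha)
            if worst_reward >= r_crit:
                best = alpha
            else:
                break
        return best

    # Analytically, robustness = u_nominal - r_crit / q = 3.0 - 4.0/2.0 = 1.0
    print(robustness(q=2.0, u_nominal=3.0, r_crit=4.0))
    ```

    Decisions with more headroom over the requirement tolerate a larger uncertainty horizon, which is the sense in which IGDT ranks options under severe, non-probabilistic uncertainty.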

  2. Unifying decoherence and the Heisenberg Principle

    NASA Astrophysics Data System (ADS)

    Janssens, Bas

    2017-08-01

    We exhibit three inequalities involving quantum measurement, all of which are sharp and state independent. The first inequality bounds the performance of joint measurement. The second quantifies the trade-off between the measurement quality and the disturbance caused to the measured system. Finally, the third inequality provides a sharp lower bound on the amount of decoherence in terms of the measurement quality. This gives a unified description of both the Heisenberg uncertainty principle and the collapse of the wave function.

  3. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  4. Differentiating intolerance of uncertainty from three related but distinct constructs.

    PubMed

    Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel

    2014-01-01

    Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.

  5. Gap state charge induced spin-dependent negative differential resistance in tunnel junctions

    NASA Astrophysics Data System (ADS)

    Jiang, Jun; Zhang, X.-G.; Han, X. F.

    2016-04-01

    We propose and demonstrate through first-principles calculation a new spin-dependent negative differential resistance (NDR) mechanism in magnetic tunnel junctions (MTJ) with cubic cation disordered crystals (CCDC) AlO x or Mg1-x Al x O as barrier materials. The CCDC is a class of insulators whose band gap can be changed by cation doping. The gap becomes arched in an ultrathin layer due to the space charge formed from metal-induced gap states. With an appropriate combination of an arched gap and a bias voltage, NDR can be produced in either spin channel. This mechanism is applicable to 2D and 3D ultrathin junctions with a sufficiently small band gap that forms a large space charge. It provides a new way of controlling the spin-dependent transport in spintronic devices by an electric field. A generalized Simmons formula for tunneling current through junction with an arched gap is derived to show the general conditions under which ultrathin junctions may exhibit NDR.

  6. Uncertainty of streamwater solute fluxes in five contrasting headwater catchments including model uncertainty and natural variability (Invited)

    NASA Astrophysics Data System (ADS)

    Aulenbach, B. T.; Burns, D. A.; Shanley, J. B.; Yanai, R. D.; Bae, K.; Wild, A.; Yang, Y.; Dong, Y.

    2013-12-01

    There are many sources of uncertainty in estimates of streamwater solute flux. Flux is the product of discharge and concentration (summed over time), each of which has measurement uncertainty of its own. Discharge can be measured almost continuously, but concentrations are usually determined from discrete samples, which increases uncertainty dependent on sampling frequency and how concentrations are assigned for the periods between samples. Gaps between samples can be estimated by linear interpolation or by models that use the relations between concentration and continuously measured or known variables such as discharge, season, temperature, and time. For this project, developed in cooperation with QUEST (Quantifying Uncertainty in Ecosystem Studies), we evaluated uncertainty for three flux estimation methods and three different sampling frequencies (monthly, weekly, and weekly plus event). The constituents investigated were dissolved NO3, Si, SO4, and dissolved organic carbon (DOC), solutes whose concentration dynamics exhibit strongly contrasting behavior. The evaluation was completed for a 10-year period at five small, forested watersheds in Georgia, New Hampshire, New York, Puerto Rico, and Vermont. Concentration regression models were developed for each solute at each of the three sampling frequencies for all five watersheds. Fluxes were then calculated using (1) a linear interpolation approach, (2) a regression-model method, and (3) the composite method - which combines the regression-model method for estimating concentrations and the linear interpolation method for correcting model residuals to the observed sample concentrations. We considered the best estimates of flux to be derived using the composite method at the highest sampling frequencies. We also evaluated the importance of sampling frequency and estimation method on flux estimate uncertainty; flux uncertainty was dependent on the variability characteristics of each solute and varied for
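    As a rough sketch of the composite method described above (regression-model concentrations plus linearly interpolated residual corrections at the sample times), with invented discharge and concentration data rather than any of the study's records:

```python
import numpy as np

t = np.arange(10.0)                  # daily time steps (invented)
q = 1.0 + 0.1 * t                    # discharge series (invented)
conc_true = 2.0 * q                  # hypothetical "true" concentration

sample_idx = np.array([0, 3, 6, 9])  # discrete sampling times
c_samples = conc_true[sample_idx]

# (1) Regression model: concentration ~ a + b * discharge, fit to samples.
b, a = np.polyfit(q[sample_idx], c_samples, 1)
c_model = a + b * q

# (2) Composite method: linearly interpolate the model residuals between
# sample times and add them back, so estimates match observed samples.
resid = c_samples - c_model[sample_idx]
c_composite = c_model + np.interp(t, t[sample_idx], resid)

# Flux is the product of concentration and discharge, summed over time.
flux = np.sum(c_composite * q)
```

    Because the invented data are exactly linear in discharge, the residual correction is near zero here; with real data it is the step that pins the estimate to the observed sample concentrations.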

  7. Principled Practical Knowledge: Not a Bridge but a Ladder

    ERIC Educational Resources Information Center

    Bereiter, Carl

    2014-01-01

    The much-lamented gap between theory and practice in education cannot be filled by practical knowledge alone or by explanatory knowledge alone. Principled practical knowledge (PPK) is a type of knowledge that has characteristics of both practical know-how and scientific theory. Like basic scientific theory, PPK meets standards of explanatory…

  8. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  9. Application of fuzzy system theory in addressing the presence of uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.

    In this paper, combinations of fuzzy system theory with the finite element method are presented and discussed as a way to deal with uncertainties. Addressing the presence of uncertainties is necessary to prevent material failure in engineering. There are three types of uncertainties: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory involves a number of processes, starting with converting crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows conversion of the fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results compared with the conventional finite element method.

  10. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, combinations of fuzzy system theory with the finite element method are presented and discussed as a way to deal with uncertainties. Addressing the presence of uncertainties is necessary to prevent material failure in engineering. There are three types of uncertainties: stochastic, epistemic, and error uncertainties. In this paper, epistemic uncertainties are considered. Epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory involves a number of processes, starting with converting crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping. The term mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, the defuzzification process is implemented. Defuzzification is an important process that allows conversion of the fuzzy outputs to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results compared with the conventional finite element method.
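    The extension-principle integration mentioned above is commonly implemented via alpha-cuts: each membership level yields an interval, and the function is propagated by interval arithmetic. A minimal sketch for a triangular fuzzy number and a monotone function (the fuzzy number, function, and level count are illustrative, not from the paper):

```python
def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def extend(f, tri, levels=5):
    """Propagate a function, monotone on each cut, through a fuzzy input
    via alpha-cuts (a discretized form of the extension principle)."""
    cuts = []
    for k in range(levels + 1):
        alpha = k / levels
        lo, hi = alpha_cut(tri, alpha)
        # For a monotone function the cut image is spanned by the endpoints.
        cuts.append((alpha, min(f(lo), f(hi)), max(f(lo), f(hi))))
    return cuts

# Fuzzy input "about 2", mapped through x -> x^2.
cuts = extend(lambda x: x * x, (1.0, 2.0, 3.0))
# At alpha = 1 the interval collapses to the squared modal value.
```

    Defuzzification would then reduce the resulting fuzzy output (the stack of cut intervals) back to a single crisp value, e.g. by a centroid rule.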

  11. COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS

    PubMed Central

    Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas

    2015-01-01

    The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819

  12. The Sapir-Whorf hypothesis and inference under uncertainty.

    PubMed

    Regier, Terry; Xu, Yang

    2017-11-01

    The Sapir-Whorf hypothesis holds that human thought is shaped by language, leading speakers of different languages to think differently. This hypothesis has sparked both enthusiasm and controversy, but despite its prominence it has only occasionally been addressed in computational terms. Recent developments support a view of the Sapir-Whorf hypothesis in terms of probabilistic inference. This view may resolve some of the controversy surrounding the Sapir-Whorf hypothesis, and may help to normalize the hypothesis by linking it to established principles that also explain other phenomena. On this view, effects of language on nonlinguistic cognition or perception reflect standard principles of inference under uncertainty. WIREs Cogn Sci 2017, 8:e1440. doi: 10.1002/wcs.1440 For further resources related to this article, please visit the WIREs website. © 2017 Wiley Periodicals, Inc.

  13. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.

  14. Uncertainty evaluation of thickness and warp of a silicon wafer measured by a spectrally resolved interferometer

    NASA Astrophysics Data System (ADS)

    Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik

    2018-06-01

    Evaluation of uncertainty of thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. In the evaluation, correlation between input quantities as well as uncertainty attributed to thermal effect, which were not included in earlier publications, are taken into account. The temperature dependence of the group refractive index of silicon was found to be nonlinear and varies widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.
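    The GUM propagation this abstract refers to combines input standard uncertainties through sensitivity coefficients, including a correlation term when inputs are correlated. A minimal two-input sketch (the coefficients, uncertainties, and correlation value are invented for illustration):

```python
import math

def combined_uncertainty(c1, c2, u1, u2, r12):
    """Combined standard uncertainty for y = f(x1, x2) per the GUM:
    u_c^2 = c1^2 u1^2 + c2^2 u2^2 + 2 c1 c2 r12 u1 u2,
    where c_i are sensitivity coefficients (partial derivatives of f)
    and r12 is the correlation coefficient of the inputs."""
    return math.sqrt(c1**2 * u1**2 + c2**2 * u2**2
                     + 2 * c1 * c2 * r12 * u1 * u2)

# Invented example values; with r12 = 0 this reduces to root-sum-square.
uc = combined_uncertainty(c1=1.0, c2=2.0, u1=0.3, u2=0.1, r12=0.5)
```

    A positive correlation between inputs enlarges the combined uncertainty relative to the uncorrelated root-sum-square, which is why the paper's inclusion of input correlations matters.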

  15. Decoherence effect on quantum-memory-assisted entropic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-01-01

    The uncertainty principle provides a bound on the precision with which the outcomes of measurements of any two incompatible observables can be predicted, and thereby plays a nontrivial role in quantum precision measurement. In this work, we observe the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noises. Herein, we derive the dynamical evolution of the entropic uncertainty with respect to measurements affected by the canonical GAD noises when particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in three realistic scenarios: in the first, particle A is affected by environmental (GAD) noise while particle B, serving as quantum memory, is free from any noise; in the second, particle B is affected by the external noise while particle A is not; and in the third, both particles suffer from the noises. By analytical methods, it turns out that the uncertainty is not fully determined by the evolution of quantum correlations in the composite system consisting of A and B, but rather by the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation of the behavior of the uncertainty evolution in terms of the mixedness of the observed system; we argue that the uncertainty may be strongly correlated with the system's mixedness. We also put forward a simple and effective strategy to reduce the measurement uncertainty of interest using quantum partially collapsed measurements. Therefore, our explorations might offer an insight into the dynamics of the entropic uncertainty relation in a realistic system, and be of importance to quantum precision measurement during quantum information processing.
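    For context, the quantum-memory-assisted entropic uncertainty relation has the form S(X|B) + S(Z|B) >= log2(1/c) + S(A|B), where c is the maximal overlap between the two measurement bases. A small numerical check of the complementarity term log2(1/c) for the Pauli X and Z eigenbases (a standard textbook case, not the paper's GAD computation):

```python
import numpy as np

# Eigenbases of Pauli X and Pauli Z for a single qubit.
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# Maximal overlap c = max_{i,j} |<x_i|z_j>|^2.
c = max(abs(np.dot(xi, zj)) ** 2 for xi in x_basis for zj in z_basis)

# Complementarity term of the entropic uncertainty bound.
bound = np.log2(1.0 / c)  # 1 bit for mutually unbiased qubit bases
```

    The memory term S(A|B) can be negative for entangled A and B, which is how a quantum memory lowers the total bound, as the abstract's scenarios explore.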

  16. Investment, regulation, and uncertainty

    PubMed Central

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  17. Tin monochalcogenide heterostructures as mechanically rigid infrared band gap semiconductors

    NASA Astrophysics Data System (ADS)

    Özçelik, V. Ongun; Fathi, Mohammad; Azadani, Javad G.; Low, Tony

    2018-05-01

    Based on first-principles density functional calculations, we show that SnS and SnSe layers can form mechanically rigid heterostructures with the constituent puckered or buckled monolayers. Due to the strong interlayer coupling, the electronic wave functions of the conduction and valence band edges are delocalized across the heterostructure. The resultant band gaps of the heterostructures reside in the infrared region. With strain engineering, the heterostructure band gap undergoes a transition from indirect to direct in the puckered phase. Our results show that there is a direct correlation between the electronic wave function and the mechanical rigidity of the layered heterostructure.

  18. Turnaround in Texas: How One District Closed Its Minority Achievement Gap.

    ERIC Educational Resources Information Center

    Cook, Glenn

    2003-01-01

    Describes how one district closed its minority achievement gap by employing the Brazosport model, a combination of effective-schools research and Total Quality Management principles. The model includes an eight-step "plan-do-check" instructional cycle for teachers and emphasizes teacher training and tracking data. Student-achievement…

  19. Uncertainties in historical pollution data from sedimentary records from an Australian urban floodplain lake

    NASA Astrophysics Data System (ADS)

    Lintern, A.; Leahy, P.; Deletic, A.; Heijnis, H.; Zawadzki, A.; Gadd, P.; McCarthy, D.

    2018-05-01

    Sediment cores from aquatic environments can provide valuable information about historical pollution levels and sources. However, there is little understanding of the uncertainties associated with these findings. The aim of this study is to fill this knowledge gap by proposing a framework for quantifying the uncertainties in historical heavy metal pollution records reconstructed from sediment cores. This uncertainty framework consists of six sources of uncertainty: uncertainties in (1) metals analysis methods, (2) spatial variability of sediment core heavy metal profiles, (3) sub-sampling intervals, (4) the sediment chronology, (5) the assumption that metal levels in bed sediments reflect the magnitude of metal inputs into the aquatic system, and (6) post-depositional transformation of metals. We apply this uncertainty framework to an urban floodplain lake in South-East Australia (Willsmere Billabong). We find that for this site, uncertainties in historical dated heavy metal profiles can be up to 176%, largely due to uncertainties in the sediment chronology, and in the assumption that the settled heavy metal mass is equivalent to the heavy metal mass entering the aquatic system. As such, we recommend that future studies reconstructing historical pollution records using sediment cores from aquatic systems undertake an investigation of the uncertainties in the reconstructed pollution record, using the uncertainty framework provided in this study. We envisage that quantifying and understanding the uncertainties associated with the reconstructed pollution records will facilitate the practical application of sediment core heavy metal profiles in environmental management projects.

  20. Sulfur-doped Graphene Nanoribbons with a Sequence of Distinct Band Gaps

    NASA Astrophysics Data System (ADS)

    Du, Shi-Xuan; Zhang, Yan-Fang; Zhang, Yi; Berger, Reinhard; Feng, Xinliang; Mullen, Klaus; Lin, Xiao; Zhang, Yu-Yang; Pantelides, Sokrates T.; Gao, Hong-Jun

    Unlike free-standing graphene, graphene nanoribbons (GNRs) can possess a semiconducting band gap. However, achieving such control has been a major challenge in the fabrication of GNRs. Chevron-type GNRs were recently achieved by surface-assisted polymerization of pristine or N-substituted oligophenylene monomers. By mixing two different monomers, GNR heterojunctions can in principle be fabricated. Here we report the fabrication and characterization of chevron-type GNRs using sulfur-substituted oligophenylene monomers to achieve GNRs and related heterostructures for the first time. Importantly, our first-principles calculations show that the band gaps of the GNRs can be tailored by different S configurations in cyclodehydrogenated isomers obtained through debromination and intramolecular cyclodehydrogenation. This feature should open up new avenues for creating multiple GNR heterojunctions by engineering the sulfur configurations. These predictions have been confirmed by Scanning Tunneling Microscopy (STM) and Scanning Tunneling Spectroscopy (STS). The unusual sequence of intraribbon heterojunctions may be useful for nanoscale optoelectronic applications based on quantum dots

  1. Info-gap theory and robust design of surveillance for invasive species: the case study of Barrow Island.

    PubMed

    Davidovitch, Lior; Stoklosa, Richard; Majer, Jonathan; Nietrzeba, Alex; Whittle, Peter; Mengersen, Kerrie; Ben-Haim, Yakov

    2009-06-01

    Surveillance for invasive non-indigenous species (NIS) is an integral part of a quarantine system. Estimating the efficiency of a surveillance strategy relies on many uncertain parameters estimated by experts, such as the efficiency of its components in the face of the specific NIS, the ability of the NIS to inhabit different environments, and so on. Due to the importance of detecting an invasive NIS within a critical period of time, it is crucial that these uncertainties be accounted for in the design of the surveillance system. We formulate a detection model that takes into account, in addition to structured sampling for incursive NIS, incidental detection by untrained workers. We use info-gap theory for satisficing (not maximizing) the probability of detection, while at the same time maximizing the robustness to uncertainty. We demonstrate the trade-off between robustness to uncertainty and an increase in the required probability of detection. An empirical example based on the detection of Pheidole megacephala on Barrow Island demonstrates the use of info-gap analysis to select a surveillance strategy.

  2. Designing Phononic Crystals with Wide and Robust Band Gaps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang

    Here, phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneous wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter one is dominating), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps are then discussed from two aspects: robustness to geometric randomness (manufacture defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the randomness effect of each designing parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offer great potential in designing flexible and deformable phononic devices.

  3. Designing Phononic Crystals with Wide and Robust Band Gaps

    DOE PAGES

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang; ...

    2018-04-16

    Here, phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneous wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter one is dominating), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps are then discussed from two aspects: robustness to geometric randomness (manufacture defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the randomness effect of each designing parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offer great potential in designing flexible and deformable phononic devices.

  4. Understanding and applying principles of social cognition and decision making in adaptive environmental governance.

    PubMed

    DeCaro, Daniel A; Arnol, Craig Anthony Tony; Boama, Emmanuel Frimpong; Garmestani, Ahjond S

    2017-03-01

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people's decision making that cloud their judgment and create conflict. These systems must also satisfy people's fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance.

  5. Understanding and applying principles of social cognition and decision making in adaptive environmental governance

    PubMed Central

    DeCaro, Daniel A.; Arnold, Craig Anthony (Tony); Boamah, Emmanuel Frimpong; Garmestani, Ahjond S.

    2018-01-01

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance systems adapt. We focus primarily on the interplay between key decision makers in society and legal systems. We argue that adaptive governance must overcome three cooperative dilemmas to facilitate adaptation: (1) encouraging collaborative problem solving, (2) garnering social acceptance and commitment, and (3) cultivating a culture of trust and tolerance for change and uncertainty. However, to do so governance systems must cope with biases in people’s decision making that cloud their judgment and create conflict. These systems must also satisfy people’s fundamental needs for self-determination, fairness, and security, ensuring that changes to environmental governance are perceived as legitimate, trustworthy, and acceptable. We discuss the implications of these principles for common governance solutions (e.g., public participation, enforcement) and conclude with methodological recommendations. We outline how scholars can investigate the social cognitive principles involved in cases of adaptive governance. PMID:29780425

  6. Exploration of quantum-memory-assisted entropic uncertainty relations in a noninertial frame

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Shi, Jia-Dong; Ye, Liu

    2017-05-01

    The uncertainty principle bounds the accuracy of simultaneous measurement outcomes for two incompatible observables. In this letter, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) when the particle to be measured is in an open system and another particle serves as quantum memory in a noninertial frame. In such a scenario, the collective influence of unital and nonunital noise environments, and of the relativistic motion of the system, on the QMA-EUR is examined. By numerical analysis, we conclude that, firstly, both the noises and the Unruh effect can increase the uncertainty, due to the decoherence of the bipartite system that either induces; secondly, the uncertainty is more affected by the noises than by the Unruh effect from the acceleration; thirdly, unital noises can reduce the uncertainty in the long-time regime. We offer a possible physical interpretation of these results: the information of interest is redistributed among the bipartite system, the noisy environment, and the physically inaccessible region of the noninertial frame. We therefore claim that our observations provide insight into the dynamics of the entropic uncertainty in a noninertial frame and might be important for quantum precision measurement under relativistic motion.
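
For context, the QMA-EUR investigated in this record is usually stated in the following standard form (the Berta et al. bound, given here as a reference sketch, not quoted from the paper itself):

```latex
S(X|B) + S(Z|B) \;\geq\; \log_2\frac{1}{c} + S(A|B),
\qquad c = \max_{i,j}\,\bigl|\langle x_i | z_j \rangle\bigr|^2
```

Here S(X|B) is the conditional von Neumann entropy of the measurement outcome given the quantum memory B, c measures the complementarity of the two observables, and a larger S(A|B) (weaker correlation with the memory) raises the lower bound on the total uncertainty, which is why decoherence from noise or the Unruh effect increases it.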

  7. Heisenberg Uncertainty and the Allowable Masses of the Up Quark and Down Quark

    NASA Astrophysics Data System (ADS)

    Orr, Brian

    2004-05-01

    A possible explanation for the inability to attain deterministic measurements of an elementary particle's energy, as given by the Heisenberg Uncertainty Principle, manifests itself in an interesting anthropic consequence of Andrei Linde's Self-reproducing Inflationary Multiverse model. In Linde's model, the physical laws and constants that govern our universe adopt other values in other universes, due to variable Higgs fields. While the physics of our universe allows for the advent of life and consciousness, the physics necessary for life is not likely to exist in other universes; Linde demonstrates this through a kind of Darwinism for universes. Our universe, then, is unique. But what are the physical laws and constants that make our universe what it is? Craig Hogan identifies five physical constants that are not bound by symmetry. Fine-tuning these constants gives rise to the basic behavior and structures of the universe. Three of the non-symmetric constants are fermion masses: the up quark mass, the down quark mass, and the electron mass. I will explore Linde's and Hogan's works by comparing the amount of uncertainty in quark masses, as calculated from the Heisenberg Uncertainty Principle, to the range of quark mass values consistent with our observed universe. Should the fine-tuning of the up quark and down quark masses be greater than the range of Heisenberg uncertainties in their respective masses (as I predict, due to quantum tunneling), then perhaps there is a correlation between the measured Heisenberg uncertainty in quark masses and the fine-tuning of masses required for our universe to be as it is. Hogan, "Why the Universe Is Just So," Reviews of Modern Physics 72(4), 1149-1161, Oct. 2000; Linde, "The Self-Reproducing Inflationary Universe," Scientific American 271(5), 48-55, Nov. 1994.

  8. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hub, Martina; Thieke, Christian; Kessler, Marc L.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in context of other mapping tasks as well.

  9. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    PubMed Central

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-01-01

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in context of other mapping tasks as well. PMID:22482640

  10. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    PubMed

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    R(2), 0.29). The relatively low levels of uncertainty among orthopaedic surgeons and confidence bias seem inconsistent with the paucity of definitive evidence. If patients want to be informed of the areas of uncertainty and surgeon-to-surgeon variation relevant to their care, it seems possible that a low recognition of uncertainty and surgeon confidence bias might hinder adequately informing patients, informed decisions, and consent. Moreover, limited recognition of uncertainty is associated with modifiable factors such as confidence bias, trust in orthopaedic evidence base, and statistical understanding. Perhaps improved statistical teaching in residency, journal clubs to improve the critique of evidence and awareness of bias, and acknowledgment of knowledge gaps at courses and conferences might create awareness about existing uncertainties. Level 1, prognostic study.

  11. Steric engineering of metal-halide perovskites with tunable optical band gaps

    NASA Astrophysics Data System (ADS)

    Filip, Marina R.; Eperon, Giles E.; Snaith, Henry J.; Giustino, Feliciano

    2014-12-01

    Owing to their high energy-conversion efficiency and inexpensive fabrication routes, solar cells based on metal-organic halide perovskites have rapidly gained prominence as a disruptive technology. An attractive feature of perovskite absorbers is the possibility of tailoring their properties by changing the elemental composition through the chemical precursors. In this context, rational in silico design represents a powerful tool for mapping the vast materials landscape and accelerating discovery. Here we show that the optical band gap of metal-halide perovskites, a key design parameter for solar cells, strongly correlates with a simple structural feature, the largest metal-halide-metal bond angle. Using this descriptor we suggest continuous tunability of the optical gap from the mid-infrared to the visible. Precise band gap engineering is achieved by controlling the bond angles through the steric size of the molecular cation. On the basis of these design principles we predict novel low-gap perovskites for optimum photovoltaic efficiency, and we demonstrate the concept of band gap modulation by synthesising and characterising novel mixed-cation perovskites.

  12. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  13. Optimism in the face of uncertainty supported by a statistically-designed multi-armed bandit algorithm.

    PubMed

    Kamiura, Moto; Sano, Kohei

    2017-10-01

    The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, based on this principle, is an effective algorithm for solving multi-armed bandit problems; in a previous study it was defined by a set of heuristic variants of its formulation. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with statistical upper bounds on confidence intervals of expected rewards. Unifying the formulation enhances the universality of the Overtaking method. Consequently, we obtain a new Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that, in the context of multi-armed bandit problems, the principle of optimism in the face of uncertainty should be regarded not as a heuristic but as a statistical consequence of the law of large numbers for the sample mean of rewards and of the estimation of upper bounds on expected rewards. Copyright © 2017 Elsevier B.V. All rights reserved.
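
For contrast with the paper's Overtaking method, the UCB algorithm it benchmarks against is easy to sketch. Below is a minimal UCB1 implementation (the standard algorithm, not the paper's method; the Bernoulli arm means, horizon, and random seed are invented for illustration):

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """UCB1: play each arm once, then always pull the arm maximizing
    empirical mean + sqrt(2 ln t / n_i) -- the optimism bonus."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for arm in range(n_arms):                 # initialization round
        sums[arm] += pull(arm)
        counts[arm] += 1
    for t in range(n_arms + 1, horizon + 1):  # remaining pulls
        scores = [sums[i] / counts[i] + math.sqrt(2.0 * math.log(t) / counts[i])
                  for i in range(n_arms)]
        arm = max(range(n_arms), key=lambda i: scores[i])
        sums[arm] += pull(arm)
        counts[arm] += 1
    return counts

random.seed(0)
means = [0.2, 0.5, 0.8]                       # hypothetical Bernoulli arms
counts = ucb1(lambda a: float(random.random() < means[a]), 3, 2000)
```

As the pull counts grow, the optimism bonus shrinks for well-sampled arms, so play concentrates on the best arm (mean 0.8) while suboptimal arms are pulled only logarithmically often.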

  14. Supporting Fisheries Management by Means of Complex Models: Can We Point out Isles of Robustness in a Sea of Uncertainty?

    PubMed Central

    Gasche, Loïc; Mahévas, Stéphanie; Marchal, Paul

    2013-01-01

    Ecosystems are usually complex, nonlinear and strongly influenced by poorly known environmental variables. Among these systems, marine ecosystems have high uncertainties: marine populations in general are known to exhibit large levels of natural variability and the intensity of fishing efforts can change rapidly. These uncertainties are a source of risks that threaten the sustainability of both fish populations and fishing fleets targeting them. Appropriate management measures have to be found in order to reduce these risks and decrease sensitivity to uncertainties. Methods have been developed within decision theory that aim at allowing decision making under severe uncertainty. One of these methods is the information-gap decision theory. The info-gap method has started to permeate ecological modelling, with recent applications to conservation. However, these practical applications have so far been restricted to simple models with analytical solutions. Here we implement a deterministic approach based on decision theory in a complex model of the Eastern English Channel. Using the ISIS-Fish modelling platform, we model populations of sole and plaice in this area. We test a wide range of values for ecosystem, fleet and management parameters. From these simulations, we identify management rules controlling fish harvesting that allow reaching management goals recommended by ICES (International Council for the Exploration of the Sea) working groups while providing the highest robustness to uncertainties on ecosystem parameters. PMID:24204873

  15. Supporting fisheries management by means of complex models: can we point out isles of robustness in a sea of uncertainty?

    PubMed

    Gasche, Loïc; Mahévas, Stéphanie; Marchal, Paul

    2013-01-01

    Ecosystems are usually complex, nonlinear and strongly influenced by poorly known environmental variables. Among these systems, marine ecosystems have high uncertainties: marine populations in general are known to exhibit large levels of natural variability and the intensity of fishing efforts can change rapidly. These uncertainties are a source of risks that threaten the sustainability of both fish populations and fishing fleets targeting them. Appropriate management measures have to be found in order to reduce these risks and decrease sensitivity to uncertainties. Methods have been developed within decision theory that aim at allowing decision making under severe uncertainty. One of these methods is the information-gap decision theory. The info-gap method has started to permeate ecological modelling, with recent applications to conservation. However, these practical applications have so far been restricted to simple models with analytical solutions. Here we implement a deterministic approach based on decision theory in a complex model of the Eastern English Channel. Using the ISIS-Fish modelling platform, we model populations of sole and plaice in this area. We test a wide range of values for ecosystem, fleet and management parameters. From these simulations, we identify management rules controlling fish harvesting that allow reaching management goals recommended by ICES (International Council for the Exploration of the Sea) working groups while providing the highest robustness to uncertainties on ecosystem parameters.

  16. Understanding and applying principles of social cognition and decision making in adaptive environmental governance

    EPA Science Inventory

    Environmental governance systems are under greater pressure to adapt and to cope with increased social and ecological uncertainty from stressors like climate change. We review principles of social cognition and decision making that shape and constrain how environmental governance...

  17. Soil moisture in sessile oak forest gaps

    NASA Astrophysics Data System (ADS)

    Zagyvainé Kiss, Katalin Anita; Vastag, Viktor; Gribovszki, Zoltán; Kalicz, Péter

    2015-04-01

    Social demands are promoting natural approaches to forest management. In forestry, the concept of the continuous forest has been an accepted principle, in Hungary as well, for the last few decades. The first step from an even-aged stand towards a continuous forest can be regeneration based on gap cutting, in which small openings are formed in the forest by forestry interventions. This new stand structure modifies the hydrological conditions for the regrowth. Without a canopy, and with decreasing amounts of forest litter, interception is less significant, so a higher share of precipitation reaches the soil. This research focuses on soil moisture patterns caused by gaps. The spatio-temporal variability of soil water content was measured in gaps and in the surrounding sessile oak (Quercus petraea) stand. Soil moisture was determined with a manual soil moisture meter based on Time-Domain Reflectometry (TDR) technology. Three gaps of different sizes (G1: 10 m, G2: 20 m, G3: 30 m) were opened near Sopron on the Dalos Hill in Hungary. First, it was established that soil moisture differs between the forest stand and the gaps. Second, it was determined how gap size influences soil moisture content. To explore the short-term variability of soil moisture, two 24-hour campaigns (growing season) and one 48-hour campaign (dormant season) were also performed in the medium-sized G2 gap along two and four transects, respectively. Subdaily changes of soil moisture were observed, and the measured soil moisture pattern was compared with the radiation pattern. The non-illuminated areas were wetter, and in the dormant season the subdaily changes ceased. According to our measurements, more water is available in the gap than under the forest stand, due to lower evaporation and interception losses. Acknowledgements: The research was supported by TÁMOP-4.2.2.A-11/1/KONV-2012-0004 and AGRARKLIMA.2 VKSZ_12-1-2013-0034.

  18. Band-Gap Engineering at a Semiconductor-Crystalline Oxide Interface

    DOE PAGES

    Jahangir-Moghadam, Mohammadreza; Ahmadi-Majlan, Kamyar; Shen, Xuan; ...

    2015-02-09

    The epitaxial growth of crystalline oxides on semiconductors provides a pathway to introduce new functionalities to semiconductor devices. Key to integrating the functionalities of oxides onto semiconductors is controlling the band alignment at interfaces between the two materials. Here we apply principles of band gap engineering traditionally used at heterojunctions between conventional semiconductors to control the band offset between a single crystalline oxide and a semiconductor. Reactive molecular beam epitaxy is used to realize atomically abrupt and structurally coherent interfaces between SrZrₓTi₁₋ₓO₃ and Ge, in which the band gap of the former is enhanced with Zr content x. We present structural and electrical characterization of SrZrₓTi₁₋ₓO₃-Ge heterojunctions and demonstrate that a type-I band offset can be achieved. These results demonstrate that band gap engineering can be exploited to realize functional semiconductor-crystalline oxide heterojunctions.

  19. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry over the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly they are believed is constructed. From this, the idea of how well a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
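
The "certain rules" and "possible rules" extracted in this record correspond to the lower and upper approximations of rough set theory. A minimal sketch of those two approximations (the universe, equivalence classes, and target concept below are invented for illustration):

```python
def approximations(universe, equiv_class_of, target):
    """Rough-set approximations of a target concept.

    Lower approximation: objects whose whole equivalence class lies inside
    the target (the basis for *certain* rules).  Upper approximation:
    objects whose class merely intersects the target (the basis for
    *possible* rules)."""
    lower, upper = set(), set()
    for x in universe:
        cls = equiv_class_of(x)
        if cls <= target:          # class entirely inside the concept
            lower.add(x)
        if cls & target:           # class overlaps the concept
            upper.add(x)
    return lower, upper

# Hypothetical example: five objects grouped by one indiscernibility relation.
groups = {1: {1, 2}, 2: {1, 2}, 3: {3}, 4: {4, 5}, 5: {4, 5}}
target = {1, 2, 4}                 # concept we try to describe
lower, upper = approximations(set(groups), lambda x: groups[x], target)
```

Objects in `lower` support certain rules; objects in `upper - lower` form the boundary region where only possible rules can be stated, which is exactly where the uncertainty measure lives.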

  20. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is an NTE (not-to-exceed), MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE (e.g., nominal), then MGA values are reduced by A5 values, and A5 is representative of the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
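
The mass bookkeeping in this record reduces to simple arithmetic, sketched below (the item names, basic masses, and MGA percentages are invented; a real MGA depletion schedule would supply the assessed percentages):

```python
# Basic mass: engineering estimate with no embedded margin.
# MGA mass:   basic * assessed percentage from the approved MGA schedule.
# Predicted:  basic + MGA.
items = [
    ("structure", 120.0, 0.12),   # (name, basic mass in kg, MGA fraction)
    ("avionics",   45.0, 0.20),
    ("harness",    18.0, 0.30),
]
predicted = [(name, basic * (1 + mga)) for name, basic, mga in items]
agg_basic = sum(basic for _, basic, _ in items)
agg_pred = sum(p for _, p in predicted)
# Aggregate MGA % = (aggregate predicted - aggregate basic) / aggregate basic
aggregate_mga_pct = (agg_pred - agg_basic) / agg_basic
```

Note that the aggregate MGA percentage is a mass-weighted average of the per-item percentages, so heavy items dominate it.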

  1. Tuning the band gap in silicene by oxidation.

    PubMed

    Du, Yi; Zhuang, Jincheng; Liu, Hongsheng; Xu, Xun; Eilers, Stefan; Wu, Kehui; Cheng, Peng; Zhao, Jijun; Pi, Xiaodong; See, Khay Wai; Peleckis, Germanas; Wang, Xiaolin; Dou, Shi Xue

    2014-10-28

    Silicene monolayers grown on Ag(111) surfaces demonstrate a band gap that is tunable by oxygen adatoms from semimetallic to semiconducting type. With the use of low-temperature scanning tunneling microscopy, we find that the adsorption configurations and amounts of oxygen adatoms on the silicene surface are critical for band gap engineering, which is dominated by different buckled structures in √13 × √13, 4 × 4, and 2√3 × 2√3 silicene layers. The Si-O-Si bonds are the most energy-favored species formed on √13 × √13, 4 × 4, and 2√3 × 2√3 structures under oxidation, which is verified by in situ Raman spectroscopy as well as first-principles calculations. The silicene monolayers retain their structures when fully covered by oxygen adatoms. Our work demonstrates the feasibility of tuning the band gap of silicene with oxygen adatoms, which, in turn, expands the base of available two-dimensional electronic materials for devices with properties that are hard to achieve with graphene oxide.

  2. Encapsulated silicene: A robust large-gap topological insulator

    DOE PAGES

    Kou, Liangzhi; Ma, Yandong; Yan, Binghai; ...

    2015-08-20

    The quantum spin Hall (QSH) effect predicted in silicene has raised exciting prospects of new device applications compatible with current microelectronic technology. Efforts to explore this novel phenomenon, however, have been impeded by fundamental challenges imposed by silicene's small topologically nontrivial band gap and fragile electronic properties susceptible to environmental degradation effects. Here we propose a strategy to circumvent these challenges by encapsulating silicene between transition-metal dichalcogenides (TMDCs) layers. First-principles calculations show that such encapsulated silicene exhibits a two-orders-of-magnitude enhancement in its nontrivial band gap, which is driven by the strong spin-orbit coupling effect in TMDCs via the proximity effect. Moreover, the cladding TMDCs layers also shield silicene from environmental gases that are detrimental to the QSH state in free-standing silicene. In conclusion, the encapsulated silicene represents a novel two-dimensional topological insulator with a robust nontrivial band gap suitable for room-temperature applications, which has significant implications for innovative QSH device design and fabrication.

  3. A precautionary principle for dual use research in the life sciences.

    PubMed

    Kuhlau, Frida; Höglund, Anna T; Evers, Kathinka; Eriksson, Stefan

    2011-01-01

    Most life science research entails dual-use complexity and may be misused for harmful purposes, e.g. biological weapons. The Precautionary Principle applies to special problems characterized by complexity in the relationship between human activities and their consequences. This article examines whether the principle, so far mainly used in environmental and public health issues, is applicable and suitable to the field of dual-use life science research. Four central elements of the principle are examined: threat, uncertainty, prescription and action. Although charges against the principle exist - for example that it stifles scientific development, lacks practical applicability and is poorly defined and vague - the analysis concludes that a Precautionary Principle is applicable to the field. Certain factors such as credibility of the threat, availability of information, clear prescriptive demands on responsibility and directives on how to act, determine the suitability and success of a Precautionary Principle. Moreover, policy-makers and researchers share a responsibility for providing and seeking information about potential sources of harm. A central conclusion is that the principle is meaningful and useful if applied as a context-dependent moral principle and allowed flexibility in its practical use. The principle may then inspire awareness-raising and the establishment of practical routines which appropriately reflect the fact that life science research may be misused for harmful purposes. © 2009 Blackwell Publishing Ltd.

  4. The research-practice gap: bridging the schism between eating disorder researchers and practitioners.

    PubMed

    Lilienfeld, Scott O; Ritschel, Lorie A; Lynn, Steven Jay; Brown, Amanda P; Cautin, Robin L; Latzman, Robert D

    2013-07-01

    The field of eating disorders (EDs) treatment has been beset by a marked disjunction between scientific evidence and clinical application. We describe the nature and scope of the research-practice gap in the ED field. We draw on surveys and broader literature to better understand the research-practice gap in ED treatment and reasons for resistance to evidence-based practice. We identify three sources of the research-practice gap: (1) attitudinal factors, (2) differences in the definition of "evidence," and (3) cognitive factors, especially naïve realism and confirmation bias. We affirm the role of science as a safeguard against human fallibility and as a means of bridging the research-practice gap, and delineate key principles of scientific thinking for ED researchers and practitioners. We conclude with proposals for narrowing the research-practice gap in ED treatment and enhancing the quality of interventions for ED clients. Copyright © 2013 Wiley Periodicals, Inc.

  5. The Zika Virus Outbreak in Brazil: Knowledge Gaps and Challenges for Risk Reduction.

    PubMed

    Garcia Serpa Osorio-de-Castro, Claudia; Silva Miranda, Elaine; Machado de Freitas, Carlos; Rochel de Camargo, Kenneth; Cranmer, Hilarie Hartel

    2017-06-01

    We analyzed uncertainties and complexities of the Zika virus outbreak in Brazil, and we discuss risk reduction for future emergencies. We present the public health situation in Brazil and concurrent determinants of the epidemic and the knowledge gaps that persist despite building evidence from research, making public health decisions difficult. Brazil has adopted active measures, but producing desired outcomes may be uncertain because of partial or unavailable information. Reducing population group vulnerabilities and acting on environmental issues are medium- to long-term measures. Simultaneously dealing with information gaps, uncontrolled disease spread, and vulnerabilities is a new risk scenario and must be approached decisively to face emerging biothreats.

  6. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or before and after the implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contribute to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposure of adult organisms

  7. Detection capacity, information gaps and the design of surveillance programs for invasive forest pests

    Treesearch

    Denys Yemshanov; Frank Koch; Yakov Ben-Haim; William Smith

    2010-01-01

    Integrated pest risk maps and their underlying assessments provide broad guidance for establishing surveillance programs for invasive species, but they rarely account for knowledge gaps regarding the pest of interest or how these can be reduced. In this study we demonstrate how the somewhat competing notions of robustness to uncertainty and potential knowledge gains...

  8. Interpreting null results from measurements with uncertain correlations: an info-gap approach.

    PubMed

    Ben-Haim, Yakov

    2011-01-01

    Null events—not detecting a pernicious agent—are the basis for declaring the agent is absent. Repeated nulls strengthen confidence in the declaration. However, correlations between observations are difficult to assess in many situations and introduce uncertainty in interpreting repeated nulls. We quantify uncertain correlations using an info-gap model, which is an unbounded family of nested sets of possible probabilities. An info-gap model is nonprobabilistic and entails no assumption about a worst case. We then evaluate the robustness, to uncertain correlations, of estimates of the probability of a null event. This is then the basis for evaluating a nonprobabilistic robustness-based confidence interval for the probability of a null. © 2010 Society for Risk Analysis.
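
    The robustness calculation described above can be sketched numerically. The model below is an illustrative assumption, not the paper's formulation: correlation between successive observations is collapsed into a single Markov-style parameter rho, and robustness is taken as the largest horizon of uncertainty around a nominal rho0 for which the estimated probability of an all-null record stays within a stated tolerance.

```python
import numpy as np

def p_all_null(p, rho, n):
    """P(all n observations are null) under a simple Markov-style
    correlation model: after a null, the next observation is null with
    probability rho + (1 - rho) * p (rho = 0 recovers independence)."""
    return p * (rho + (1.0 - rho) * p) ** (n - 1)

def robustness(p, rho0, n, tol, dh=1e-3):
    """Largest uncertainty horizon h such that, for every correlation
    within h of the nominal rho0 (clipped to [0, 1]), the estimate of
    P(all null) stays within tol of its nominal value."""
    nominal = p_all_null(p, rho0, n)
    h = 0.0
    while h + dh <= 1.0:
        lo = max(0.0, rho0 - (h + dh))
        hi = min(1.0, rho0 + (h + dh))
        worst = max(abs(p_all_null(p, r, n) - nominal)
                    for r in np.linspace(lo, hi, 201))
        if worst > tol:
            break
        h += dh
    return h
```

    As in info-gap analysis generally, a larger tolerance on the estimate buys a larger robustness to correlation uncertainty, and the trade-off can be traced by varying `tol`.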

  9. First-principles spin-transfer torque in CuMnAs |GaP |CuMnAs junctions

    NASA Astrophysics Data System (ADS)

    Stamenova, Maria; Mohebbi, Razie; Seyed-Yazdi, Jamileh; Rungger, Ivan; Sanvito, Stefano

    2017-02-01

    We demonstrate that an all-antiferromagnetic tunnel junction with current perpendicular to the plane geometry can be used as an efficient spintronic device with potential high-frequency operation. By using state-of-the-art density functional theory combined with quantum transport, we show that the Néel vector of the electrodes can be manipulated by spin-transfer torque. This is staggered over the two different magnetic sublattices and can generate dynamics and switching. At the same time the different magnetization states of the junction can be read by standard tunneling magnetoresistance. Calculations are performed for CuMnAs |GaP |CuMnAs junctions with different surface terminations between the antiferromagnetic CuMnAs electrodes and the insulating GaP spacer. We find that the torque remains staggered regardless of the termination, while the magnetoresistance depends on the microscopic details of the interface.

  10. A first-principles study of the electrically tunable band gap in few-layer penta-graphene.

    PubMed

    Wang, Jinjin; Wang, Zhanyu; Zhang, R J; Zheng, Y X; Chen, L Y; Wang, S Y; Tsoo, Chia-Chin; Huang, Hung-Ji; Su, Wan-Sheng

    2018-06-25

    The structural and electronic properties of bilayer (AA- and AB-stacked) and tri-layer (AAA-, ABA- and AAB-stacked) penta-graphene (PG) have been investigated in the framework of density functional theory. The present results demonstrate that the ground state energy in AB stacking is lower than that in AA stacking, whereas ABA stacking is found to be the most energetically favorable, followed by AAB and AAA stackings. All considered model configurations are found to be semiconducting, independent of the stacking sequence. In the presence of a perpendicular electric field, their band gaps can be significantly reduced and completely closed at a specific critical electric field strength, demonstrating a Stark effect. These findings show that few-layer PG will have tremendous opportunities to be applied in nanoscale electronic and optoelectronic devices owing to its tunable band gap.

  11. Development of machine-vision system for gap inspection of muskmelon grafted seedlings.

    PubMed

    Liu, Siyao; Xing, Zuochang; Wang, Zifan; Tian, Subo; Jahun, Falalu Rabiu

    2017-01-01

    Grafting robots have been developed worldwide, but some auxiliary tasks, such as gap inspection for grafted seedlings, still need to be done by humans. A machine-vision system for gap inspection of grafted muskmelon seedlings was developed in this study. The image acquisition system consists of a CCD camera, a lens and a front white lighting source. The image of the inspected gap was processed and analyzed with HALCON 12.0 software. The recognition algorithm is based on the principle of deformable template matching. First, a template is created from an image of a qualified grafted-seedling gap. Then the gap image of a grafted seedling is compared with the created template to determine their matching degree, which ranges from 0 to 1 according to the similarity between the gap image and the template: the less similar the gap is to the template, the smaller the matching degree. Finally, the gap is classified as qualified or unqualified: if the matching degree is less than 0.58, or no match is found, the gap is judged unqualified; otherwise it is judged qualified. To test the system, 100 muskmelon seedlings were grafted and inspected. Results showed that the machine-vision system agreed with human visual inspection on gap qualification in 98% of cases, and its inspection speed can reach 15 seedlings·min-1. With this system, the gap inspection process in grafting can be fully automated, making it a key component of fully automatic grafting robots.
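
    The matching-degree threshold described above can be illustrated in code. HALCON's deformable template matching is proprietary, so the sketch below substitutes a plain normalized cross-correlation score as a much simpler stand-in and applies the study's 0.58 qualification threshold; the function names are hypothetical.

```python
import numpy as np

def matching_degree(template, image):
    """Normalized cross-correlation between a template and a same-sized
    gap image, clamped to [0, 1]. A simplified stand-in for HALCON's
    deformable template matching score."""
    t = template - template.mean()
    i = image - image.mean()
    denom = np.sqrt((t ** 2).sum() * (i ** 2).sum())
    if denom == 0:
        return 0.0
    ncc = float((t * i).sum() / denom)   # in [-1, 1]
    return max(0.0, ncc)                 # treat anti-correlation as no match

def classify_gap(template, image, threshold=0.58):
    """Qualified if the matching degree reaches the 0.58 threshold
    reported in the study; unqualified otherwise."""
    return "qualified" if matching_degree(template, image) >= threshold else "unqualified"
```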

  12. Non-Dirac Chern insulators with large band gaps and spin-polarized edge states.

    PubMed

    Xue, Y; Zhang, J Y; Zhao, B; Wei, X Y; Yang, Z Q

    2018-05-10

    Based on first-principles calculations and k·p models, we demonstrate that PbC/MnSe heterostructures are a non-Dirac type of Chern insulator with very large band gaps (244 meV) and exotically half-metallic edge states, providing the possibilities of realizing very robust, completely spin polarized, and dissipationless spintronic devices from the heterostructures. The achieved extraordinarily large nontrivial band gap can be ascribed to the contribution of the non-Dirac type electrons (composed of px and py) and the very strong atomic spin-orbit coupling (SOC) interaction of the heavy Pb element in the system. Surprisingly, the band structures are found to be sensitive to the different exchange and correlation functionals adopted in the first-principles calculations. Chern insulators with various mechanisms are acquired from them. These discoveries show that the predicted nontrivial topology in PbC/MnSe heterostructures is robust and can be observed in experiments at high temperatures. The system has great potential to have attractive applications in future spintronics.

  13. Multi-Case Review of the Application of the Precautionary Principle in European Union Law and Case Law.

    PubMed

    Garnett, Kenisha; Parsons, David J

    2017-03-01

    The precautionary principle was formulated to provide a basis for political action to protect the environment from potentially severe or irreversible harm in circumstances of scientific uncertainty that prevent a full risk or cost-benefit analysis. It underpins environmental law in the European Union and has been extended to include public health and consumer safety. The aim of this study was to examine how the precautionary principle has been interpreted and subsequently applied in practice, whether these applications were consistent, and whether they followed the guidance from the Commission. A review of the literature was used to develop a framework for analysis, based on three attributes: severity of potential harm, standard of evidence (or degree of uncertainty), and nature of the regulatory action. This was used to examine 15 pieces of legislation or judicial decisions. The decision whether or not to apply the precautionary principle appears to be poorly defined, with ambiguities inherent in determining what level of uncertainty and significance of hazard justifies invoking it. The cases reviewed suggest that the Commission's guidance was not followed consistently in forming legislation, although judicial decisions tended to be more consistent and to follow the guidance by requiring plausible evidence of potential hazard in order to invoke precaution. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  14. A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories

    PubMed Central

    Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.

    2012-01-01

    Summary We propose a regression-based hot deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event, and hence, the duration from onset to the final event. Simple approaches such as jumping gap times or dropping cases with gaps have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information in the gap from a matched individual with a completely recorded history in the corresponding interval. Predictive Mean Matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886
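
    Predictive mean matching, the donor-selection step described above, can be sketched as follows. This is a generic illustration of the technique, not the authors' exact procedure; the linear predictor, the donor-pool size `k` and the function name are assumptions.

```python
import numpy as np

def pmm_impute(x_obs, y_obs, x_mis, k=3, seed=0):
    """Hot-deck imputation by predictive mean matching: fit a linear
    regression of y on x over complete cases, predict the mean for each
    incomplete case, and impute by copying the observed y of one of the
    k donors whose predicted means are closest."""
    rng = np.random.default_rng(seed)
    x_obs, y_obs = np.asarray(x_obs, float), np.asarray(y_obs, float)
    X = np.column_stack([np.ones_like(x_obs), x_obs])
    beta, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
    yhat_obs = X @ beta
    imputed = []
    for x in np.atleast_1d(x_mis):
        yhat = beta[0] + beta[1] * x
        donors = np.argsort(np.abs(yhat_obs - yhat))[:k]  # k closest predicted means
        imputed.append(y_obs[rng.choice(donors)])         # copy a donor's real value
    return np.array(imputed)
```

    Because the imputed value is always a genuinely observed value rather than a model prediction, implausible fills are avoided; repeating the draw with different seeds gives the multiple imputations used to propagate uncertainty.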

  15. Management of the Israeli National Water System under Uncertainty

    NASA Astrophysics Data System (ADS)

    Shamir, U.; Housh, M.; Ostfeld, A.; Zaide, M.

    2009-12-01

    Uncertainty in our region is due to the natural variability of hydrological patterns, with recurring extended droughts, reduced average and broadening variability of recharge that seem to indicate the effect of climate change, as well as to deterioration of water quality in the natural sources, to population growth and distribution, to shifting demand patterns among consumer sectors, and to expected future regional water agreements. These factors combine to create a challenging environment in which highly stressed water resources and water systems have to be developed, operated and managed. The natural sources have been used to their sustainable capacity and often beyond. The main policy responses are a shift of fresh water from agriculture to the cities, replacing it with treated wastewater for irrigation, and a major program for construction of sea-water desalination plants and the associated infrastructure needed for its integration into the supply systems. Organizational reforms, regulation, and demand management options are also being developed, including full-cost pricing. Management of the water resources and systems under these conditions requires a long-term perspective. The methodologies for supporting management decisions that have been used to date by the Israeli Water Authority include evaluation by scenarios, simulation, and optimization with sensitivity analysis. We review existing approaches and models for management of the Israeli water system (Zaide 2006) and then present some new methodologies for addressing operational decisions under hydrological uncertainty, which include generation of tradeoffs between the expected value and variability of the outcomes, and an Info-Gap (Ben-Haim 2006) based approach. These methodologies are demonstrated on examples that emulate portions of a regional water system and are then applied to the Israeli National Water System. Ben-Haim, Y. (2006) Info-Gap Theory: Decisions under Severe Uncertainty, 2nd Ed

  16. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage it. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies for and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty and to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  17. A science confidence gap: Education, trust in scientific methods, and trust in scientific institutions in the United States, 2014.

    PubMed

    Achterberg, Peter; de Koster, Willem; van der Waal, Jeroen

    2017-08-01

    Following up on suggestions that attitudes toward science are multi-dimensional, we analyze nationally representative survey data collected in the United States in 2014 ( N = 2006), and demonstrate the existence of a science confidence gap: some people place great trust in scientific methods and principles, but simultaneously distrust scientific institutions. This science confidence gap is strongly associated with level of education: it is larger among the less educated than among the more educated. We investigate explanations for these educational differences. Whereas hypotheses deduced from reflexive-modernization theory do not pass the test, those derived from theorizing on the role of anomie are corroborated. The less educated are more anomic (they have more modernity-induced cultural discontents), which not only underlies their distrust in scientific institutions, but also fuels their trust in scientific methods and principles. This explains why this science confidence gap is most pronounced among the less educated.

  18. Precaution or Integrated Responsibility Approach to Nanovaccines in Fish Farming? A Critical Appraisal of the UNESCO Precautionary Principle.

    PubMed

    Myhr, Anne Ingeborg; Myskja, Bjørn K

    2011-04-01

    Nanoparticles have multifaceted advantages in drug administration, such as vaccine delivery, and hence hold promise for improving the protection of farmed fish against diseases caused by pathogens. However, there are concerns that the benefits associated with the distribution of nanoparticles may be accompanied by risks to the environment and health. The complexity of the natural and social systems involved implies that the information acquired in quantified risk assessments may be inadequate for evidence-based decisions. One controversial strategy for dealing with this kind of uncertainty is the precautionary principle. A few years ago, a UNESCO expert group suggested a new approach for implementation of the principle. Here we compare the UNESCO principle with earlier versions and explore its advantages and disadvantages by applying the UNESCO version to the use of PLGA nanoparticles for delivery of vaccines in aquaculture. Finally, we discuss whether a combined scientific and ethical analysis that involves the concept of responsibility will enable approaches that can supplement the precautionary principle as a basis for decision-making in areas of scientific uncertainty, such as the application of nanoparticles in the vaccination of farmed fish.

  19. An effective approach for gap-filling continental scale remotely sensed time-series

    PubMed Central

    Weiss, Daniel J.; Atkinson, Peter M.; Bhatt, Samir; Mappin, Bonnie; Hay, Simon I.; Gething, Peter W.

    2014-01-01

    The archives of imagery and modeled data products derived from remote sensing programs with high temporal resolution provide powerful resources for characterizing inter- and intra-annual environmental dynamics. The impressive depth of available time-series from such missions (e.g., MODIS and AVHRR) affords new opportunities for improving data usability by leveraging spatial and temporal information inherent to longitudinal geospatial datasets. In this research we develop an approach for filling gaps in imagery time-series that result primarily from cloud cover, which is particularly problematic in forested equatorial regions. Our approach consists of two, complementary gap-filling algorithms and a variety of run-time options that allow users to balance competing demands of model accuracy and processing time. We applied the gap-filling methodology to MODIS Enhanced Vegetation Index (EVI) and daytime and nighttime Land Surface Temperature (LST) datasets for the African continent for 2000–2012, with a 1 km spatial resolution, and an 8-day temporal resolution. We validated the method by introducing and filling artificial gaps, and then comparing the original data with model predictions. Our approach achieved R2 values above 0.87 even for pixels within 500 km wide introduced gaps. Furthermore, the structure of our approach allows estimation of the error associated with each gap-filled pixel based on the distance to the non-gap pixels used to model its fill value, thus providing a mechanism for including uncertainty associated with the gap-filling process in downstream applications of the resulting datasets. PMID:25642100
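
    A toy one-dimensional analogue of the idea above: fill each missing value from its nearest valid neighbours by inverse-distance weighting, and report the distance to the nearest valid point as a per-pixel uncertainty proxy. This mirrors only the error-estimation principle, not the authors' two-algorithm method.

```python
import numpy as np

def fill_gaps(series):
    """Fill NaN gaps in a 1-D series by inverse-distance weighting of
    the nearest valid observation on each side; return the filled series
    and the distance to the nearest valid point as an uncertainty proxy."""
    s = np.asarray(series, float).copy()
    dist = np.zeros_like(s)
    valid = np.flatnonzero(~np.isnan(s))
    for i in np.flatnonzero(np.isnan(s)):
        left = valid[valid < i]
        right = valid[valid > i]
        if len(left) and len(right):
            l, r = left[-1], right[0]
            wl, wr = 1.0 / (i - l), 1.0 / (r - i)
            s[i] = (wl * s[l] + wr * s[r]) / (wl + wr)
            dist[i] = min(i - l, r - i)
        elif len(left):                       # gap at the end of the record
            s[i], dist[i] = s[left[-1]], i - left[-1]
        else:                                 # gap at the start of the record
            s[i], dist[i] = s[right[0]], right[0] - i
    return s, dist
```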

  20. Design principles for shift current photovoltaics

    DOE PAGES

    Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando; ...

    2017-01-25

    While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. This method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells.

  1. Design principles for shift current photovoltaics

    PubMed Central

    Cook, Ashley M.; M. Fregoso, Benjamin; de Juan, Fernando; Coh, Sinisa; Moore, Joel E.

    2017-01-01

    While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. Our method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells. PMID:28120823

  3. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound on predicting the outcomes of measurements of a pair of incompatible observables. In this work, we investigate the dynamical features of quantum-memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolution of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It is found that a larger coupling strength J of the ferromagnetic (J < 0) and anti-ferromagnetic (J > 0) chains can effectively reduce the measurement uncertainty. Besides, it turns out that a higher temperature can induce inflation of the uncertainty, because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field | B |, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that the result of Adabi et al. is optimal. Moreover, we also investigate the mixedness of the system of interest, which is closely associated with the uncertainty. Remarkably, we put forward a possible physical interpretation of the evolution of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Our explorations may shed light on entropic uncertainty in the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state quantum information processing.
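
    The initial bound of Berta et al. mentioned above, S(X|B) + S(Z|B) ≥ log2(1/c) + S(A|B), can be checked numerically for any two-qubit state. The sketch below computes both sides for complementary X and Z measurements on qubit A (overlap c = 1/2); for a maximally entangled Bell pair the bound holds with equality, both sides vanishing.

```python
import numpy as np

def entropy(evals):
    """Base-2 entropy of a set of eigenvalues (zeros are dropped)."""
    w = np.asarray(evals, float)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def trace_out_A(rho):
    """Reduced 2x2 state of qubit B from a two-qubit density matrix."""
    return np.einsum('abac->bc', rho.reshape(2, 2, 2, 2))

def cond_entropy(rho):
    """S(A|B) = S(AB) - S(B)."""
    return (entropy(np.linalg.eigvalsh(rho))
            - entropy(np.linalg.eigvalsh(trace_out_A(rho))))

def meas_cond_entropy(rho, basis):
    """S(X|B) after measuring qubit A in the given orthonormal basis:
    the classical-quantum state's spectrum is the union of eigenvalues
    of the conditional blocks Tr_A[(P_i x I) rho (P_i x I)]."""
    evals = []
    for v in basis:
        P = np.kron(np.outer(v, np.conj(v)), np.eye(2))
        evals.extend(np.linalg.eigvalsh(trace_out_A(P @ rho @ P)))
    return entropy(evals) - entropy(np.linalg.eigvalsh(trace_out_A(rho)))
```

    For a Bell pair, S(A|B) = -1 exactly offsets log2(1/c) = 1, and both measured conditional entropies are zero, which is the memory-assisted tightening of the uncertainty relation.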

  4. Uniaxial strain on graphene: Raman spectroscopy study and band-gap opening.

    PubMed

    Ni, Zhen Hua; Yu, Ting; Lu, Yun Hao; Wang, Ying Ying; Feng, Yuan Ping; Shen, Ze Xiang

    2008-11-25

    Graphene was deposited on a transparent and flexible substrate, and tensile strain up to approximately 0.8% was loaded by stretching the substrate in one direction. Raman spectra of strained graphene show significant red shifts of 2D and G band (-27.8 and -14.2 cm(-1) per 1% strain, respectively) because of the elongation of the carbon-carbon bonds. This indicates that uniaxial strain has been successfully applied on graphene. We also proposed that, by applying uniaxial strain on graphene, tunable band gap at K point can be realized. First-principle calculations predicted a band-gap opening of approximately 300 meV for graphene under 1% uniaxial tensile strain. The strained graphene provides an alternative way to experimentally tune the band gap of graphene, which would be more efficient and more controllable than other methods that are used to open the band gap in graphene. Moreover, our results suggest that the flexible substrate is ready for such a strain process, and Raman spectroscopy can be used as an ultrasensitive method to determine the strain.
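
    The reported numbers imply simple linear rules of thumb. Treating the Raman shift rates and the ~300 meV per 1% gap opening as linear coefficients is an extrapolation assumed here for illustration, not a claim of the paper:

```python
# Coefficients taken from the abstract; the linear scaling below is an
# illustrative extrapolation, not a claim of the paper.
SHIFT_2D = -27.8          # cm^-1 per 1% uniaxial strain (2D band)
SHIFT_G = -14.2           # cm^-1 per 1% uniaxial strain (G band)
GAP_MEV_PER_PCT = 300.0   # predicted gap opening, meV per 1% strain

def strain_from_2d_shift(delta_cm1):
    """Infer % strain from a measured 2D-band shift (red shift < 0)."""
    return delta_cm1 / SHIFT_2D

def expected_gap_mev(strain_pct):
    """Linearly extrapolated band-gap opening at the K point, in meV."""
    return GAP_MEV_PER_PCT * strain_pct
```

    Under this assumed linearity, the ~0.8% strain reached in the experiment would correspond to a gap opening of roughly 240 meV.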

  5. Understanding the Uncertainty of an Effectiveness-Cost Ratio in Educational Resource Allocation: A Bayesian Approach

    ERIC Educational Resources Information Center

    Pan, Yilin

    2016-01-01

    Given the necessity to bridge the gap between what happened and what is likely to happen, this paper aims to explore how to apply Bayesian inference to cost-effectiveness analysis so as to capture the uncertainty of a ratio-type efficiency measure. The first part of the paper summarizes the characteristics of the evaluation data that are commonly…

  6. Practice gaps in the care of mitral valve regurgitation: Insights from the American College of Cardiology mitral regurgitation gap analysis and advisory panel.

    PubMed

    Wang, Andrew; Grayburn, Paul; Foster, Jill A; McCulloch, Marti L; Badhwar, Vinay; Gammie, James S; Costa, Salvatore P; Benitez, Robert Michael; Rinaldi, Michael J; Thourani, Vinod H; Martin, Randolph P

    2016-02-01

    The revised 2014 American College of Cardiology (ACC)/American Heart Association valvular heart disease guidelines provide evidence-based recommendations for the management of mitral regurgitation (MR). However, knowledge gaps related to our evolving understanding of critical MR concepts may impede their implementation. The ACC conducted a multifaceted needs assessment to characterize gaps, practice patterns, and perceptions related to the diagnosis and treatment of MR. A key project element was a set of surveys distributed to primary care and cardiovascular physicians (cardiologists and cardiothoracic surgeons). Survey and other gap-analysis findings were presented to a panel of 10 expert advisors from the specialties of general cardiology, cardiac imaging, interventional cardiology, and cardiac surgery with expertise in valvular heart disease, especially MR, and cardiovascular education. The panel was charged with assessing the relative importance and potential means of remedying identified gaps to improve care for patients with MR. The survey results identified several knowledge and practice gaps that may limit implementation of evidence-based recommendations for MR care. Specifically, half of primary care physicians reported uncertainty regarding timing of intervention for patients with severe primary or functional MR. Physicians in all groups reported that quantitative indices of MR severity were frequently not reported in clinical echocardiographic interpretations, and that these measurements were not consistently reviewed when provided in reports. In the treatment of MR, nearly 30% of primary care physicians and general cardiologists did not know the volume of mitral valve repair surgeries performed by their reference cardiac surgeons and did not have a standard source for this information. After review of the survey results, the expert panel summarized practice gaps into 4 thematic areas and offered proposals to address deficiencies and promote better alignment

  7. Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.

    2013-12-01

    The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used, but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, through the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15% in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20%. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst violations of its assumptions, CILTS should be considered as having something like a "factor of two" uncertainty for the broad field trials in which it is typically used. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.
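
    The first-principles error analysis referred to above starts from the steady-state single-zone tracer balance. A minimal sketch, assuming airflow Q = F/C with independent relative errors in injection rate F and measured concentration C (a simplification of the full CILTS analysis):

```python
import math

def airflow(F, C):
    """Steady-state single-zone tracer balance: airflow Q = F / C, with
    F the tracer injection rate and C the measured tracer concentration
    (consistent units)."""
    return F / C

def relative_uncertainty(rel_F, rel_C):
    """First-order propagation for Q = F / C with independent errors:
    (dQ/Q)^2 = (dF/F)^2 + (dC/C)^2."""
    return math.sqrt(rel_F ** 2 + rel_C ** 2)
```

    For example, 6% injection-rate error combined with 8% concentration error gives a 10% airflow uncertainty; the much larger field uncertainties quoted above come from violations of the steady-state, single-zone assumptions rather than from this instrumental term.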

  8. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.

  9. Improving the Pharmacologic Management of Pain in Older Adults: Identifying the Research Gaps and Methods to Address Them

    PubMed Central

    Reid, M. C.; Bennett, David A.; Chen, Wen G.; Eldadah, Basil A.; Farrar, John T.; Ferrell, Bruce; Gallagher, Rollin M.; Hanlon, Joseph T.; Herr, Keela; Horn, Susan D.; Inturrisi, Charles E.; Lemtouni, Salma; Lin, Yu Woody; Michaud, Kaleb; Morrison, R. Sean; Neogi, Tuhina; Porter, Linda L.; Solomon, Daniel H.; Von Korff, Michael; Weiss, Karen; Witter, James; Zacharoff, Kevin L.

    2011-01-01

    Objective There has been a growing recognition of the need for better pharmacologic management of chronic pain among older adults. To address this need, the National Institutes of Health Pain Consortium sponsored an “Expert Panel Discussion on the Pharmacological Management of Chronic Pain in Older Adults” conference in September 2010, to identify research gaps and strategies to address them. Specific emphasis was placed on ascertaining gaps regarding use of opioid and non-steroidal anti-inflammatory medications because of continued uncertainties regarding their risks and benefits. Design Eighteen panel members provided oral presentations; each was followed by a multidisciplinary panel discussion. Meeting transcripts and panelists’ slide presentations were reviewed to identify the gaps and the types of studies and research methods panelists suggested could best address them. Results Fifteen gaps were identified in the areas of treatment (e.g., uncertainty regarding the long-term safety and efficacy of commonly prescribed analgesics), epidemiology (e.g., lack of knowledge regarding the course of common pain syndromes), and implementation (e.g., limited understanding of optimal strategies to translate evidence-based pain treatments into practice). Analyses of data from electronic health care databases, observational cohort studies, and ongoing cohort studies (augmented with pain and other relevant outcome measures) were felt to be practical methods for building an age-appropriate evidence base to improve the pharmacologic management of pain in later life. Conclusions Addressing the gaps presented in the current report was judged by the panel to have substantial potential to improve the health and well-being of older adults with chronic pain. PMID:21834914

  10. Quantitative risk assessment of CO2 transport by pipelines--a review of uncertainties and their impacts.

    PubMed

    Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André

    2010-05-15

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainties that have been assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release and the dose-effect relationship for CO2. A sensitivity analysis with these parameters is performed using release, dispersion and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO2 pipelines. In this study it is found that the individual risk contour can vary between 0 and 204 m from the pipeline depending on assumptions made. In existing studies this range is found to be between <1 m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO2 pipelines can be improved by validation of release and dispersion models for high-pressure CO2 releases, definition and adoption of a universal dose-effect relationship and development of a good practice guide for QRAs for CO2 pipelines. Copyright (c) 2009 Elsevier B.V. All rights reserved.
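    The kind of parameter sensitivity analysis described above can be sketched as a one-at-a-time sweep: vary each uncertain input over its range while holding the others at baseline, and record the resulting spread in the risk metric. The toy "risk distance" model and all numbers below are purely illustrative assumptions, not a validated release/dispersion model from the study:

```python
# Toy risk model: distance (m) at which individual risk falls below a
# threshold, as an illustrative linear function of three uncertain QRA inputs.
def risk_distance(pressure_bar, orifice_mm, failure_rate):
    return 0.5 * pressure_bar + 8.0 * orifice_mm + 1e6 * failure_rate

baseline = {"pressure_bar": 100, "orifice_mm": 10, "failure_rate": 2e-5}
ranges = {
    "pressure_bar": (60, 140),
    "orifice_mm": (2, 25),
    "failure_rate": (1e-6, 1e-4),
}

# One-at-a-time sensitivity: output spread attributable to each input alone.
spread = {}
for name, (lo, hi) in ranges.items():
    outs = []
    for value in (lo, hi):
        args = dict(baseline, **{name: value})  # perturb one input only
        outs.append(risk_distance(**args))
    spread[name] = max(outs) - min(outs)

most_influential = max(spread, key=spread.get)
```

    Ranking the inputs by `spread` identifies which knowledge gaps dominate the variability of the assessed risk, which is the logic behind the review's conclusion that model validation should be prioritised.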

  11. Atomic scale origins of sub-band gap optical absorption in gold-hyperdoped silicon

    NASA Astrophysics Data System (ADS)

    Ferdous, Naheed; Ertekin, Elif

    2018-05-01

    Gold-hyperdoped silicon exhibits room-temperature sub-band gap optical absorption, with potential applications as infrared absorbers/detectors and impurity band photovoltaics. We use first-principles density functional theory to establish the origins of the sub-band gap response. Substitutional gold AuSi and substitutional dimers AuSi-AuSi are found to be the energetically preferred defect configurations, and AuSi gives rise to partially filled mid-gap defect bands well offset from the band edges. AuSi is predicted to offer substantial sub-band gap absorption, exceeding that measured in prior experiments by two orders of magnitude for similar Au concentration. This suggests that in experimentally realized systems, in addition to AuSi, the implanted gold is accommodated by the lattice in other ways, including other defect complexes and gold precipitates. We further identify that it is energetically favorable for isolated AuSi to form AuSi-AuSi dimers, which by contrast do not exhibit mid-gap states. The formation of dimers and other complexes could serve as nuclei in the earliest stages of Au precipitation, which may be responsible for the observed rapid deactivation of the sub-band gap response upon annealing.

  12. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  13. The developmental origins of fairness: the knowledge-behavior gap.

    PubMed

    Blake, Peter R; McAuliffe, Katherine; Warneken, Felix

    2014-11-01

    Recent research in developmental psychology shows that children understand several principles of fairness by 3 years of age, much earlier than previously believed. However, children's knowledge of fairness does not always align with their behavior, and immediate self-interest alone cannot explain this gap. In this forum paper, we consider two factors that influence the relation between fairness knowledge and behavior: relative advantage and how rewards are acquired. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Pressure-Induced Structural Transition and Enhancement of Energy Gap of CuAlO2

    NASA Astrophysics Data System (ADS)

    Nakanishi, Akitaka

    2011-02-01

    By using first-principles calculations, we studied the stable crystal structures and energy gaps of CuAlO2 under high pressure. Our simulation shows that CuAlO2 transforms from a delafossite structure to a leaning delafossite structure. The critical pressure of the transition was determined to be 60 GPa. The energy gap of CuAlO2 increases through the structural transition due to the enhanced covalency of Cu 3d and O 2p states. We found that a chalcopyrite structure does not appear as a stable structure under high pressure.

  15. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  16. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers need to know the quantified credibility of a simulation to make simulation-based critical decisions, and other users need it to use simulations effectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility for establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  17. Gap-state engineering of visible-light-active ferroelectrics for photovoltaic applications.

    PubMed

    Matsuo, Hiroki; Noguchi, Yuji; Miyayama, Masaru

    2017-08-08

    Photoferroelectrics offer unique opportunities to explore light energy conversion based on their polarization-driven carrier separation and above-bandgap voltages. The problem associated with the wide bandgap of ferroelectric oxides, i.e., the vanishingly small photoresponse under visible light, has been overcome partly by bandgap tuning, but the narrowing of the bandgap is, in principle, accompanied by a substantial loss of ferroelectric polarization. In this article, we report an approach, 'gap-state' engineering, to produce photoferroelectrics, in which defect states within the bandgap act as a scaffold for photogeneration. Our first-principles calculations and single-domain thin-film experiments on BiFeO3 demonstrate that gap states half-filled with electrons can enhance not only photocurrents but also photovoltages over a broad photon-energy range, in a manner different from intermediate bands in present semiconductor-based solar cells. Our approach opens a promising route to the material design of visible-light-active ferroelectrics without sacrificing spontaneous polarization. Overcoming the optical transparency caused by the wide bandgap of ferroelectric oxides by narrowing that bandgap tends to result in a loss of polarization. By utilizing defect states within the bandgap, Matsuo et al. report visible-light-active ferroelectrics without sacrificing polarization.

  18. Quantum spin Hall insulator in halogenated arsenene films with sizable energy gaps

    PubMed Central

    Wang, Dongchao; Chen, Li; Shi, Changmin; Wang, Xiaoli; Cui, Guangliang; Zhang, Pinhua; Chen, Yeqing

    2016-01-01

    Based on first-principles calculations, the electronic and topological properties of halogenated (F-, Cl-, Br- and I-) arsenene are investigated in detail. It is found that the halogenated arsenene sheets show a Dirac-type characteristic in the absence of spin-orbit coupling (SOC), whereas an energy gap is induced by SOC, with values ranging from 0.194 eV for F-arsenene to 0.255 eV for I-arsenene. Noticeably, these four newly proposed two-dimensional (2D) systems are verified to be quantum spin Hall (QSH) insulators by calculating the edge states, which show a clear linear crossing inside the bulk energy gap. It should be pointed out that the large energy gap in these 2D materials consisting of commonly used elements is quite promising for practical applications of QSH insulators at room temperature. PMID:27340091

  19. New horizons in the implementation and research of comprehensive geriatric assessment: knowing, doing and the 'know-do' gap.

    PubMed

    Gladman, John R F; Conroy, Simon Paul; Ranhoff, Anette Hylen; Gordon, Adam Lee

    2016-03-01

    In this paper, we outline the relationship between the need to put existing applied health research knowledge into practice (the 'know-do gap') and the need to improve the evidence base (the 'know gap') with respect to the healthcare process used for older people with frailty known as comprehensive geriatric assessment (CGA). We explore the reasons for the know-do gap and the principles of how these barriers to implementation might be overcome. We explore how these principles should affect the conduct of applied health research to close the know gap. We propose that impaired flow of knowledge is an important contributory factor in the failure to implement evidence-based practice in CGA; this could be addressed through specific knowledge mobilisation techniques. We describe that implementation failures are also produced by an inadequate evidence base that requires the co-production of research, addressing not only effectiveness but also the feasibility and acceptability of new services, the educational needs of practitioners, the organisational requirements of services, and the contribution made by policy. Only by tackling these issues in concert and appropriate proportion, will the know and know-do gaps for CGA be closed. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. To be or not to be: How do we speak about uncertainty in public?

    NASA Astrophysics Data System (ADS)

    Todesco, Micol; Lolli, Barbara; Sheldrake, Tom; Odbert, Henry

    2016-04-01

    One of the challenges related to hazard communication concerns the public perception and understanding of scientific uncertainties, and of their implications in terms of hazard assessment and mitigation. Often science is perceived as an effective dispenser of resolving answers to the main issues posed by the complexities of life and nature. In this perspective, uncertainty is seen as a pernicious lack of knowledge that hinders our ability to face complex problems. From a scientific perspective, however, the definition of uncertainty is the only valuable tool we have to handle errors affecting our data and propagating through the increasingly complex models we develop to describe reality. Through uncertainty, scientists acknowledge the great variability that characterises natural systems and account for it in their assessment of possible scenarios. From this point of view, uncertainty is not ignorance; rather, it provides a great deal of information that is needed to inform decision making. To find effective ways to bridge the gap between these different meanings of uncertainty, we asked high-school students for assistance. With their help, we gathered definitions of the term 'uncertainty' by interviewing different categories of people, including schoolmates and professors, neighbours, families and friends. These definitions will be compared with those provided by scientists, to find differences and similarities. To understand the role of uncertainty in judgment, a hands-on experiment is performed in which students have to estimate the exact time of explosion of party poppers subjected to a variable degree of pull. At the end of the project, the students will express their own understanding of uncertainty in a video, which will be made available for sharing. Materials collected during all the activities will contribute to our understanding of how uncertainty is portrayed and can be better expressed to improve our hazard communication.

  1. Free surfaces recast superconductivity in few-monolayer MgB2: Combined first-principles and ARPES demonstration.

    PubMed

    Bekaert, J; Bignardi, L; Aperis, A; van Abswoude, P; Mattevi, C; Gorovikov, S; Petaccia, L; Goldoni, A; Partoens, B; Oppeneer, P M; Peeters, F M; Milošević, M V; Rudolf, P; Cepek, C

    2017-10-31

    Two-dimensional materials are known to harbour properties very different from those of their bulk counterparts. Recent years have seen the rise of atomically thin superconductors, with a caveat that superconductivity is strongly depleted unless enhanced by specific substrates, intercalants or adatoms. Surprisingly, the role in superconductivity of electronic states originating from simple free surfaces of two-dimensional materials has remained elusive to date. Here, based on first-principles calculations, anisotropic Eliashberg theory, and angle-resolved photoemission spectroscopy (ARPES), we show that surface states in few-monolayer MgB2 make a major contribution to the superconducting gap spectrum and density of states, clearly distinct from the widely known, bulk-like σ- and π-gaps. As a proof of principle, we predict and measure the gap opening on the magnesium-based surface band up to a critical temperature as high as ~30 K for merely six-monolayer-thick MgB2. These findings establish free surfaces as an unavoidable ingredient in understanding and further tailoring of superconductivity in atomically thin materials.

  2. Direct band gap silicon crystals predicted by an inverse design method

    NASA Astrophysics Data System (ADS)

    Oh, Young Jun; Lee, In-Ho; Lee, Jooyoung; Kim, Sunghyun; Chang, Kee Joo

    2015-03-01

    Cubic diamond silicon has an indirect band gap and does not absorb or emit light as efficiently as other semiconductors with direct band gaps. Thus, searching for Si crystals with direct band gaps around 1.3 eV is important to realize efficient thin-film solar cells. In this work, we report various crystalline silicon allotropes with direct and quasi-direct band gaps, which are predicted by the inverse design method which combines a conformation space annealing algorithm for global optimization and first-principles density functional calculations. The predicted allotropes exhibit energies less than 0.3 eV per atom and good lattice matches, compared with the diamond structure. The structural stability is examined by performing finite-temperature ab initio molecular dynamics simulations and calculating the phonon spectra. The absorption spectra are obtained by solving the Bethe-Salpeter equation together with the quasiparticle G0W0 approximation. For several allotropes with the band gaps around 1 eV, photovoltaic efficiencies are comparable to those of best-known photovoltaic absorbers such as CuInSe2. This work is supported by the National Research Foundation of Korea (2005-0093845 and 2008-0061987), Samsung Science and Technology Foundation (SSTF-BA1401-08), KIAS Center for Advanced Computation, and KISTI (KSC-2013-C2-040).

  3. First-principles study of codoping in lanthanum bromide

    NASA Astrophysics Data System (ADS)

    Erhart, Paul; Sadigh, Babak; Schleife, André; Åberg, Daniel

    2015-04-01

    Codoping of Ce-doped LaBr3 with Ba, Ca, or Sr improves the energy resolution that can be achieved by radiation detectors based on these materials. Here, we present a mechanism that rationalizes this enhancement on the basis of first-principles electronic structure calculations and point defect thermodynamics. It is shown that incorporation of Sr creates neutral VBr-SrLa complexes that can temporarily trap electrons. As a result, Auger quenching of free carriers is reduced, allowing for a more linear, albeit slower, scintillation light yield response. Experimental Stokes shifts can be related to different CeLa-SrLa-VBr triple complex configurations. Codoping with other alkaline as well as alkaline-earth metals is considered as well. Alkaline elements are found to have extremely small solubilities on the order of 0.1 ppm and below at 1000 K. Among the alkaline-earth metals the lighter dopant atoms prefer interstitial-like positions and create strong scattering centers, which has a detrimental impact on carrier mobilities. Only the heavier alkaline-earth elements (Ca, Sr, Ba) combine matching ionic radii with sufficiently high solubilities. This provides a rationale for the experimental finding that improved scintillator performance is exclusively achieved using Sr, Ca, or Ba. The present mechanism demonstrates that codoping of wide-gap materials can provide an efficient means for managing charge carrier populations under out-of-equilibrium conditions. In the present case dopants are introduced that manipulate not only the concentrations but also the electronic properties of intrinsic defects without introducing additional gap levels. This leads to the availability of shallow electron traps that can temporarily localize charge carriers, effectively deactivating carrier-carrier recombination channels. The principles of this mechanism are therefore not specific to the material considered here but can be adapted for controlling charge carrier populations and

  4. Stability of direct band gap under mechanical strains for monolayer MoS2, MoSe2, WS2 and WSe2

    NASA Astrophysics Data System (ADS)

    Deng, Shuo; Li, Lijie; Li, Min

    2018-07-01

    Single-layer transition-metal dichalcogenide materials (MoS2, MoSe2, WS2 and WSe2) are investigated using the first-principles method with emphasis on their responses to mechanical strains. All these materials display a direct band gap over a certain range of strains from compressive to tensile (the stable range). We have found that this stable range differs among these materials. By also studying their mechanical properties with the first-principles approach, we show that the stable strain range is determined by the Young's modulus. Further analysis of strain-induced electronic band gap properties has also been conducted.

  5. An Adaptation Dilemma Caused by Impacts-Modeling Uncertainty

    NASA Astrophysics Data System (ADS)

    Frieler, K.; Müller, C.; Elliott, J. W.; Heinke, J.; Arneth, A.; Bierkens, M. F.; Ciais, P.; Clark, D. H.; Deryng, D.; Doll, P. M.; Falloon, P.; Fekete, B. M.; Folberth, C.; Friend, A. D.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M. R.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.

    2013-12-01

    Ensuring future well-being for a growing population under either strong climate change or an aggressive mitigation strategy requires a subtle balance of potentially conflicting response measures. In the case of competing goals, uncertainty in impact estimates plays a central role when high confidence in achieving a primary objective (such as food security) directly implies an increased probability of uncertainty-induced failure with regard to a competing target (such as climate protection). We use cross-sectorally consistent multi-impact model simulations from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP, www.isi-mip.org) to illustrate this uncertainty dilemma: RCP projections from 7 global crop, 11 hydrological, and 7 biomes models are combined to analyze irrigation and land use changes as possible responses to climate change and increasing crop demand due to population growth and economic development. We show that - while a no-regrets option with regard to climate protection - additional irrigation alone is not expected to balance the demand increase by 2050. In contrast, a strong expansion of cultivated land closes the projected production-demand gap in some crop models. However, it comes at the expense of a loss of natural carbon sinks of order 50%. Given the large uncertainty of state-of-the-art crop model projections, even these strong land use changes would not bring us 'on the safe side' with respect to food supply. In a world where increasing carbon emissions continue to shrink the overall solution space, we demonstrate that current impacts-modeling uncertainty is a luxury we cannot afford. ISI-MIP is intended to provide cross-sectorally consistent impact projections for model intercomparison and improvement as well as cross-sectoral integration. The results presented here were generated within the first Fast-Track phase of the project covering global impact projections. The second phase will also include regional projections.

  6. First-principles study of direct and indirect optical absorption in BaSnO3

    NASA Astrophysics Data System (ADS)

    Kang, Youngho; Peelaers, Hartwin; Krishnaswamy, Karthik; Van de Walle, Chris G.

    2018-02-01

    We report first-principles results for the electronic structure and the optical absorption of perovskite BaSnO3 (BSO). BSO has an indirect fundamental gap, and hence, both direct and indirect transitions need to be examined. We assess direct absorption by calculations of the dipole matrix elements. The phonon-assisted indirect absorption spectrum at room temperature is calculated using a quasiclassical approach. Our analysis provides important insights into the optical properties of BSO and addresses several inconsistencies in the results of optical absorption experiments. We shed light on the variety of bandgap values that have been previously reported, concluding that the indirect gap is 2.98 eV and the direct gap is 3.46 eV.

  7. Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunyuan; Stevens, Andrew J.; Chen, Changyou

    2016-08-10

    Learning the representation of shape cues in 2D & 3D objects for recognition is a fundamental task in computer vision. Deep neural networks (DNNs) have shown promising performance on this task. Due to the large variability of shapes, accurate recognition relies on good estimates of model uncertainty, ignored in traditional training of DNNs, typically learned via stochastic optimization. This paper leverages recent advances in stochastic gradient Markov Chain Monte Carlo (SG-MCMC) to learn weight uncertainty in DNNs. It yields principled Bayesian interpretations for the commonly used Dropout/DropConnect techniques and incorporates them into the SG-MCMC framework. Extensive experiments on 2D & 3D shape datasets and various DNN models demonstrate the superiority of the proposed approach over stochastic optimization. Our approach yields higher recognition accuracy when used in conjunction with Dropout and Batch-Normalization.
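    The SG-MCMC idea the abstract describes can be illustrated with its simplest member, stochastic gradient Langevin dynamics (SGLD): each mini-batch gradient step adds Gaussian noise scaled to the step size, so the iterates sample an approximate posterior over parameters rather than a point estimate. The toy model below (inferring a Gaussian mean under a flat prior; all constants are illustrative, not from the paper) is a minimal sketch of that update rule:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=500)   # toy dataset, known sigma = 1

def grad_log_post(theta, batch):
    # Mini-batch estimate of d/dtheta log p(theta | data)
    # (Gaussian likelihood, flat prior): n * (mean(batch) - theta)
    return len(data) * np.mean(batch - theta)

theta, eps, samples = 0.0, 1e-3, []
for t in range(5000):
    batch = rng.choice(data, size=50)        # stochastic gradient
    noise = rng.normal(0.0, np.sqrt(eps))    # injected Langevin noise
    theta += 0.5 * eps * grad_log_post(theta, batch) + noise
    if t > 1000:                             # discard burn-in
        samples.append(theta)

post_mean, post_std = np.mean(samples), np.std(samples)
```

    The spread of the retained iterates is the weight uncertainty that a plain SGD run would discard; without the `noise` term the loop reduces to ordinary stochastic optimization.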

  8. Scenario-based fitted Q-iteration for adaptive control of water reservoir systems under uncertainty

    NASA Astrophysics Data System (ADS)

    Bertoni, Federica; Giuliani, Matteo; Castelletti, Andrea

    2017-04-01

    Over recent years, mathematical models have largely been used to support planning and management of water resources systems. Yet, the increasing uncertainties in their inputs - due to increased variability in the hydrological regimes - are a major challenge to the optimal operations of these systems. Such uncertainty, boosted by projected changing climate, violates the stationarity principle generally used for describing hydro-meteorological processes, which assumes time-persistent statistical characteristics of a given variable as inferred from historical data. As this principle is unlikely to be valid in the future, the probability density function used for modeling stochastic disturbances (e.g., inflows) becomes an additional uncertain parameter of the problem, which can be described in a deterministic, set-membership fashion. This study contributes a novel method for designing optimal, adaptive policies for controlling water reservoir systems under climate-related uncertainty. The proposed method, called scenario-based Fitted Q-Iteration (sFQI), extends the original Fitted Q-Iteration algorithm by enlarging the state space to include the space of the uncertain system's parameters (i.e., the uncertain climate scenarios). As a result, sFQI embeds the set-membership uncertainty of the future inflow scenarios in the action-value function and is able to approximate, with a single learning process, the optimal control policy associated with any scenario included in the uncertainty set. The method is demonstrated on a synthetic water system, consisting of a regulated lake operated for ensuring reliable water supply to downstream users. Numerical results show that the sFQI algorithm successfully identifies adaptive solutions to operate the system under different inflow scenarios, which outperform the control policy designed under historical conditions. Moreover, the sFQI policy generalizes over inflow scenarios not directly experienced during the policy design.
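    The core idea of sFQI - augmenting the state with the uncertain scenario parameter so that a single learning process covers every scenario in the set - can be sketched with a toy tabular Q-iteration. The reservoir dynamics, scenario set, and reward below are illustrative assumptions, not the system from the study:

```python
import numpy as np

STORAGE = range(5)          # discretised storage levels (capacity = 4)
ACTIONS = range(3)          # possible releases
SCENARIOS = {0: 0, 1: 2}    # scenario id -> deterministic inflow (dry, wet)
DEMAND, GAMMA = 1, 0.9

def step(s, a, inflow):
    release = min(a, s)                      # cannot release more than stored
    s_next = min(s - release + inflow, 4)    # mass balance, clipped at capacity
    reward = -(DEMAND - release) ** 2        # penalise supply deficit/surplus
    return s_next, reward

# Q-function over the augmented state (storage, scenario): one learning
# process approximates the optimal policy for every scenario in the set.
Q = np.zeros((5, len(SCENARIOS), 3))
for _ in range(200):                         # Q-iteration sweeps
    Q_new = np.zeros_like(Q)
    for s in STORAGE:
        for k, inflow in SCENARIOS.items():
            for a in ACTIONS:
                s2, r = step(s, a, inflow)
                Q_new[s, k, a] = r + GAMMA * Q[s2, k].max()
    Q = Q_new

policy = Q.argmax(axis=-1)   # greedy release per (storage, scenario)
```

    Because the scenario index is part of the state, the greedy policy read off the final Q-function adapts its release to whichever inflow scenario is active, which is the mechanism sFQI exploits at scale.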

  9. Fabrication and characteristics of thin disc piezoelectric transformers based on piezoelectric buzzers with gap circles.

    PubMed

    Chang, Kuo-Tsai; Lee, Chun-Wei

    2008-04-01

    This paper investigates the design, fabrication and testing of thin disc piezoelectric transformers (PTs) based on piezoelectric buzzers with gap circles, at different diameters of the gap circles. The performance test focuses on characteristics of voltage gains, including maximum voltage gains and maximum-gain frequencies, for each piezoelectric transformer under different load conditions. Both a piezoelectric buzzer and a gap circle on a silver electrode of the buzzer are needed to build any type of PT. Here, the gap circle is used to form a ring-shaped input electrode and a circle-shaped output electrode for each piezoelectric transformer. To do so, the structure and connection of a PT are first described. Then, the operating principle of a PT and its related vibration mode, observed by a carbon-powder imaging technique, are explained. Moreover, an experimental setup for characterizing each piezoelectric transformer is constructed. Finally, the effects of the diameter of the gap circle on the characteristics of voltage gains at different load resistances are discussed.

  10. Two additional principles for determining which species to monitor.

    PubMed

    Wilson, Howard B; Rhodes, Jonathan R; Possingham, Hugh P

    2015-11-01

    Monitoring to detect population declines is widespread, but also costly. There is, consequently, a need to optimize monitoring to maximize cost-effectiveness. Here we develop a quantitative decision analysis framework for how to optimally allocate resources for monitoring among species. By keeping the framework simple, we analytically establish two new principles about which species are optimal to monitor for detecting declines: (1) those that lie on the boundary between species being allocated resources for conservation action and species that are not and (2) those with the greatest uncertainty in whether they are declining. These two principles are in addition to other factors that are also important in monitoring decisions, such as complementarity. We demonstrate the efficacy of these principles when other factors are not present, and show how the two principles can be combined. This analysis demonstrates that the most cost-effective species to monitor are ones where the information gained from monitoring is most likely to change the allocation of funds for action, not necessarily the most vulnerable or endangered. We suggest these results are general and apply to all ecological monitoring, not just of biological species: monitoring and information are only valuable when they are likely to change how people act.
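    Principle (2) above - monitor the species whose decline status is most uncertain - can be illustrated with a simple information-theoretic proxy: for a binary decline/no-decline judgment, the expected information gained by monitoring a species peaks where the prior probability of decline is closest to 0.5. The species probabilities below are hypothetical, and binary entropy is our stand-in for the authors' decision-analysis framework, not their actual objective:

```python
import numpy as np

# Hypothetical prior P(decline) for five species, e.g. from past surveys
p_decline = np.array([0.05, 0.3, 0.5, 0.8, 0.95])

# Monitoring is most informative where the outcome is least predictable:
# binary entropy peaks at p = 0.5 (greatest uncertainty about decline).
def entropy(p):
    q = 1 - p
    return -(p * np.log2(p) + q * np.log2(q))

best = entropy(p_decline).argmax()   # index of the species to monitor
```

    Note that the highest-entropy species is not the most endangered one (p = 0.95); it is the one where new data are most likely to change the allocation of conservation funds, mirroring the paper's conclusion.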

  11. Carcinoma-astrocyte gap junctions promote brain metastasis by cGAMP transfer

    PubMed Central

    Jin, Xin; Valiente, Manuel; Er, Ekrem Emrah; Lopez-Soto, Alejandro; Jacob, Leni; Patwa, Ruzeen; Shah, Hardik; Xu, Ke; Cross, Justin R.; Massagué, Joan

    2016-01-01

    SUMMARY Brain metastasis represents a substantial source of morbidity and mortality in various cancers, and is characterized by high resistance to chemotherapy. Here we define the role of the most abundant cell type in the brain, the astrocyte, in promoting brain metastasis. Breast and lung cancer cells express protocadherin 7 (PCDH7) to favor the assembly of carcinoma-astrocyte gap junctions composed of connexin 43 (Cx43). Once engaged with the astrocyte gap-junctional network, brain metastatic cancer cells employ these channels to transfer the second messenger cGAMP to astrocytes, activating the STING pathway and production of inflammatory cytokines IFNα and TNFα. As paracrine signals, these factors activate the STAT1 and NF-κB pathways in brain metastatic cells, which support tumour growth and chemoresistance. The orally bioavailable modulators of gap junctions meclofenamate and tonabersat break this paracrine loop, and we provide proof-of-principle for the applicability of this therapeutic strategy to treat established brain metastasis. PMID:27225120

  12. Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    DOE PAGES

    Carlsson, Boris; Forssen, Christian; Fahlin Strömberg, D.; ...

    2016-02-24

    Chiral effective field theory (χEFT) provides a systematic approach to describe low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties, although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT, and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are in general small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to get substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.

  13. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    PubMed

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
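The quadrature combination the abstract describes — purity, mass, and volume contributions rolled into one combined uncertainty — can be sketched as a GUM-style calculation. The numeric values below are hypothetical placeholders, not figures from the article:

```python
import math

# Hypothetical relative standard uncertainties for a certified solution
# standard prepared from a neat material (illustrative values only):
u_purity = 0.0015  # purity assay (incl. residual water/solvent, inorganics)
u_mass   = 0.0005  # mass measurement / weighing technique
u_volume = 0.0008  # solvent addition via solution density

# For a concentration c = m * P / V, the relative standard uncertainties
# combine in quadrature (root sum of squares):
u_combined = math.sqrt(u_purity**2 + u_mass**2 + u_volume**2)

# Expanded uncertainty at roughly 95% confidence (coverage factor k = 2):
U_expanded = 2 * u_combined
print(f"combined relative u: {u_combined:.4%}, expanded U (k=2): {U_expanded:.4%}")
```

Note how the largest single contribution (here, purity) dominates the combined value — which is why vendors' treatment of purity uncertainty matters most when comparing Certificates of Analysis.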

  14. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
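The compliance decision mentioned above — whether a specification limit is exceeded — is often handled with a guard band around the limit. A minimal sketch of one common decision rule (an illustrative convention, not necessarily the one this chapter prescribes):

```python
# Guard-banded compliance check: a result is accepted as compliant only
# if the entire expanded-uncertainty interval lies below the limit.
def complies(result, expanded_u, limit):
    """True if result + U is still at or below the specification limit."""
    return result + expanded_u <= limit

print(complies(48.0, 1.5, 50.0))  # interval entirely below the limit
print(complies(49.2, 1.5, 50.0))  # interval straddles the limit
```

Other conventions exist (shared risk, guard bands sized to a target consumer risk); the point is that the measurement uncertainty, not the bare result, drives the conformity statement.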

  15. Consolidated principles for screening based on a systematic review and consensus process.

    PubMed

    Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda

    2018-04-09

    In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a

  16. Consolidated principles for screening based on a systematic review and consensus process

    PubMed Central

    Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda

    2018-01-01

    BACKGROUND: In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner’s seminal publication, and to conduct a Delphi consensus process to assess the review results. METHODS: We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. RESULTS: We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner’s 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. INTERPRETATION: Wilson and Jungner’s principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles

  17. Functionalized Thallium Antimony Films as Excellent Candidates for Large-Gap Quantum Spin Hall Insulator.

    PubMed

    Zhang, Run-wu; Zhang, Chang-wen; Ji, Wei-xiao; Li, Sheng-shi; Yan, Shi-shen; Li, Ping; Wang, Pei-ji

    2016-02-17

    Group III-V films are of great importance for their potential applications in spintronics and quantum computing. The search for two-dimensional III-V films with a nontrivial large gap is crucial for realizing dissipationless transport edge channels via the quantum spin Hall (QSH) effect. Here we use first-principles calculations to predict a class of large-gap QSH insulators in functionalized TlSb monolayers (TlSbX2; X = H, F, Cl, Br, I), with sizable bulk gaps as large as 0.22–0.40 eV. The QSH state is identified by the Z2 topological invariant together with helical edge states induced by spin-orbit coupling (SOC). Notably, the inverted band gap in the nontrivial states can be effectively tuned by electric field and strain. Additionally, these films on a BN substrate also maintain a nontrivial QSH state, which harbors a Dirac cone lying within the band gap. These findings may shed new light on the future design and fabrication of QSH insulators based on two-dimensional honeycomb lattices in spintronics.

  18. Functionalized Thallium Antimony Films as Excellent Candidates for Large-Gap Quantum Spin Hall Insulator

    PubMed Central

    Zhang, Run-wu; Zhang, Chang-wen; Ji, Wei-xiao; Li, Sheng-shi; Yan, Shi-shen; Li, Ping; Wang, Pei-ji

    2016-01-01

    Group III-V films are of great importance for their potential applications in spintronics and quantum computing. The search for two-dimensional III-V films with a nontrivial large gap is crucial for realizing dissipationless transport edge channels via the quantum spin Hall (QSH) effect. Here we use first-principles calculations to predict a class of large-gap QSH insulators in functionalized TlSb monolayers (TlSbX2; X = H, F, Cl, Br, I), with sizable bulk gaps as large as 0.22–0.40 eV. The QSH state is identified by the Z2 topological invariant together with helical edge states induced by spin-orbit coupling (SOC). Notably, the inverted band gap in the nontrivial states can be effectively tuned by electric field and strain. Additionally, these films on a BN substrate also maintain a nontrivial QSH state, which harbors a Dirac cone lying within the band gap. These findings may shed new light on the future design and fabrication of QSH insulators based on two-dimensional honeycomb lattices in spintronics. PMID:26882865

  19. Band gaps and the possible effect on impact sensitivity for some nitro aromatic explosive materials

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Cheung, Frankie; Zhao, Feng; Cheng, Xin-Lu

    The first-principles density functional theory code SIESTA has been used to compute the band gaps of several polynitroaromatic explosives, such as TATB, DATB, TNT, and picric acid. In these systems, the weakest bond is the one between an NO2 group and the aromatic ring. The bond dissociation energy (BDE) alone cannot predict the relative impact sensitivity of these four systems correctly. It was found that their relative impact sensitivity could be explained by considering the BDE and the band-gap value of the crystal state together.

  20. Intelligent Information Retrieval: Diagnosing Information Need. Part II. Uncertainty Expansion in a Prototype of a Diagnostic IR Tool.

    ERIC Educational Resources Information Center

    Cole, Charles; Cantero, Pablo; Sauve, Diane

    1998-01-01

    Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…

  1. Physical insight into the thermodynamic uncertainty relation using Brownian motion in tilted periodic potentials

    NASA Astrophysics Data System (ADS)

    Hyeon, Changbong; Hwang, Wonseok

    2017-07-01

    Using Brownian motion in periodic potentials V(x) tilted by a force f, we provide physical insight into the thermodynamic uncertainty relation, a recently conjectured principle for statistical errors and irreversible heat dissipation in nonequilibrium steady states. According to the relation, nonequilibrium output generated from dissipative processes necessarily incurs an energetic cost or heat dissipation q, and in order to limit the output fluctuation within a relative uncertainty ε, at least 2kBT/ε² of heat must be dissipated. Our model shows that this bound is attained not only at near-equilibrium [f ≪ V'(x)] but also at far-from-equilibrium [f ≫ V'(x)], and more generally when the dissipated heat is normally distributed. Furthermore, the energetic cost is maximized near the critical force when the barrier separating the potential wells is about to vanish and the fluctuation of Brownian particles is maximized. These findings indicate that the deviation of the heat distribution from Gaussianity gives rise to the inequality of the uncertainty relation, further clarifying the meaning of the uncertainty relation. Our derivation of the uncertainty relation also recognizes a bound on nonequilibrium fluctuations: the variance of the dissipated heat (σq²) increases with its mean (μq), and it cannot be smaller than 2kBTμq.

  2. Physical insight into the thermodynamic uncertainty relation using Brownian motion in tilted periodic potentials.

    PubMed

    Hyeon, Changbong; Hwang, Wonseok

    2017-07-01

    Using Brownian motion in periodic potentials V(x) tilted by a force f, we provide physical insight into the thermodynamic uncertainty relation, a recently conjectured principle for statistical errors and irreversible heat dissipation in nonequilibrium steady states. According to the relation, nonequilibrium output generated from dissipative processes necessarily incurs an energetic cost or heat dissipation q, and in order to limit the output fluctuation within a relative uncertainty ε, at least 2k_{B}T/ε^{2} of heat must be dissipated. Our model shows that this bound is attained not only at near-equilibrium [f≪V^{'}(x)] but also at far-from-equilibrium [f≫V^{'}(x)], more generally when the dissipated heat is normally distributed. Furthermore, the energetic cost is maximized near the critical force when the barrier separating the potential wells is about to vanish and the fluctuation of Brownian particles is maximized. These findings indicate that the deviation of heat distribution from Gaussianity gives rise to the inequality of the uncertainty relation, further clarifying the meaning of the uncertainty relation. Our derivation of the uncertainty relation also recognizes a bound of nonequilibrium fluctuations that the variance of dissipated heat (σ_{q}^{2}) increases with its mean (μ_{q}), and it cannot be smaller than 2k_{B}Tμ_{q}.
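The Gaussian-heat case in which the bound ⟨q⟩ε² ≥ 2kBT is saturated can be checked numerically. The sketch below is my own construction, not the authors' simulation: it uses free drift-diffusion (V = 0), for which the endpoint displacement is exactly Gaussian, so the product ⟨q⟩ε² should land on the bound:

```python
import numpy as np

rng = np.random.default_rng(0)

# Free drift-diffusion (V = 0): the Gaussian-heat case in which the
# thermodynamic uncertainty bound <q> * eps^2 >= 2 k_B T is saturated.
# Units chosen so that gamma = k_B T = 1, hence D = k_B T / gamma = 1.
kT, gamma, f = 1.0, 1.0, 0.5   # temperature, friction, applied force
D, v = kT / gamma, f / gamma   # Einstein relation; mean drift velocity
t, n = 50.0, 200_000           # observation time, number of trajectories

# Endpoint displacement X(t) ~ Normal(v*t, 2*D*t) for free diffusion,
# so we can sample endpoints directly instead of integrating paths.
x = rng.normal(v * t, np.sqrt(2 * D * t), size=n)

q_mean = f * np.mean(x)               # mean dissipated heat <q> = f <X>
eps_sq = np.var(x) / np.mean(x) ** 2  # squared relative uncertainty
product = q_mean * eps_sq             # should be close to 2 k_B T

print(f"<q> * eps^2 = {product:.3f}  (bound: {2 * kT})")
```

Analytically, ⟨q⟩ = f·v·t and ε² = 2Dt/(vt)², so the product is exactly 2kBT for any f and t here; replacing the Gaussian endpoints with a heat distribution that deviates from Gaussianity would push the product above the bound, as the abstract describes.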

  3. Vibrational renormalisation of the electronic band gap in hexagonal and cubic ice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, Edgar A., E-mail: eae32@cam.ac.uk; Needs, Richard J.; Monserrat, Bartomeu

    2015-12-28

    Electron-phonon coupling in hexagonal and cubic water ice is studied using first-principles quantum mechanical methods. We consider 29 distinct hexagonal and cubic ice proton-orderings with up to 192 molecules in the simulation cell to account for proton-disorder. We find quantum zero-point vibrational corrections to the minimum electronic band gaps ranging from −1.5 to −1.7 eV, which leads to improved agreement between calculated and experimental band gaps. Anharmonic nuclear vibrations play a negligible role in determining the gaps. Deuterated ice has a smaller band-gap correction at zero-temperature of −1.2 to −1.4 eV. Vibrations reduce the differences between the electronic band gapsmore » of different proton-orderings from around 0.17 eV to less than 0.05 eV, so that the electronic band gaps of hexagonal and cubic ice are almost independent of the proton-ordering when quantum nuclear vibrations are taken into account. The comparatively small reduction in the band gap over the temperature range 0 − 240 K of around 0.1 eV does not depend on the proton ordering, or whether the ice is protiated or deuterated, or hexagonal, or cubic. We explain this in terms of the atomistic origin of the strong electron-phonon coupling in ice.« less

  4. Age-related differences in gap detection: effects of task difficulty and cognitive ability.

    PubMed

    Harris, Kelly C; Eckert, Mark A; Ahlstrom, Jayne B; Dubno, Judy R

    2010-06-01

    Differences in gap detection for younger and older adults have been shown to vary with the complexity of the task or stimuli, but the factors that contribute to these differences remain unknown. To address this question, we examined the extent to which age-related differences in processing speed and workload predicted age-related differences in gap detection. Gap detection thresholds were measured for 10 younger and 11 older adults in two conditions that varied in task complexity but used identical stimuli: (1) gap location fixed at the beginning, middle, or end of a noise burst and (2) gap location varied randomly from trial to trial from the beginning, middle, or end of the noise. We hypothesized that gap location uncertainty would place increased demands on cognitive and attentional resources and result in significantly higher gap detection thresholds for older but not younger adults. Overall, gap detection thresholds were lower for the middle location as compared to beginning and end locations and were lower for the fixed than the random condition. In general, larger age-related differences in gap detection were observed for more challenging conditions. That is, gap detection thresholds for older adults were significantly larger for the random condition than for the fixed condition when the gap was at the beginning and end locations but not the middle. In contrast, gap detection thresholds for younger adults were not significantly different for the random and fixed condition at any location. Subjective ratings of workload indicated that older adults found the gap detection task more mentally demanding than younger adults. Consistent with these findings, results of the Purdue Pegboard and Connections tests revealed age-related slowing of processing speed. 
Moreover, age group differences in workload and processing speed predicted gap detection in younger and older adults when gap location varied from trial to trial; these associations were not observed when gap

  5. Age-related differences in gap detection: Effects of task difficulty and cognitive ability

    PubMed Central

    Harris, Kelly C.; Eckert, Mark A.; Ahlstrom, Jayne B.; Dubno, Judy R.

    2009-01-01

    Differences in gap detection for younger and older adults have been shown to vary with the complexity of the task or stimuli, but the factors that contribute to these differences remain unknown. To address this question, we examined the extent to which age-related differences in processing speed and workload predicted age-related differences in gap detection. Gap detection thresholds were measured for 10 younger and 11 older adults in two conditions that varied in task complexity but used identical stimuli: (1) gap location fixed at the beginning, middle, or end of a noise burst and (2) gap location varied randomly from trial to trial from the beginning, middle, or end of the noise. We hypothesized that gap location uncertainty would place increased demands on cognitive and attentional resources and result in significantly higher gap detection thresholds for older but not younger adults. Overall, gap detection thresholds were lower for the middle location as compared to beginning and end locations and were lower for the fixed than the random condition. In general, larger age-related differences in gap detection were observed for more challenging conditions. That is, gap detection thresholds for older adults were significantly larger for the random condition than for the fixed condition when the gap was at the beginning and end locations but not the middle. In contrast, gap detection thresholds for younger adults were not significantly different for the random and fixed condition at any location. Subjective ratings of workload indicated that older adults found the gap-detection task more mentally demanding than younger adults. Consistent with these findings, results of the Purdue Pegboard and Connections tests revealed age-related slowing of processing speed. 
Moreover, age group differences in workload and processing speed predicted gap detection in younger and older adults when gap location varied from trial to trial; these associations were not observed when gap

  6. GapBlaster-A Graphical Gap Filler for Prokaryote Genomes.

    PubMed

    de Sá, Pablo H C G; Miranda, Fábio; Veras, Adonney; de Melo, Diego Magalhães; Soares, Siomar; Pinheiro, Kenny; Guimarães, Luis; Azevedo, Vasco; Silva, Artur; Ramos, Rommel T J

    2016-01-01

    The advent of NGS (Next Generation Sequencing) technologies has resulted in an exponential increase in the number of complete genomes available in biological databases. This advance has allowed the development of several computational tools enabling analyses of large amounts of data in each of the various steps, from processing and quality filtering to gap filling and manual curation. The tools developed for gap closure are very useful as they result in more complete genomes, which will influence downstream analyses of genomic plasticity and comparative genomics. However, the gap filling step remains a challenge for genome assembly, often requiring manual intervention. Here, we present GapBlaster, a graphical application to evaluate and close gaps. GapBlaster was developed via Java programming language. The software uses contigs obtained in the assembly of the genome to perform an alignment against a draft of the genome/scaffold, using BLAST or Mummer to close gaps. Then, all identified alignments of contigs that extend through the gaps in the draft sequence are presented to the user for further evaluation via the GapBlaster graphical interface. GapBlaster presents significant results compared to other similar software and has the advantage of offering a graphical interface for manual curation of the gaps. GapBlaster program, the user guide and the test datasets are freely available at https://sourceforge.net/projects/gapblaster2015/. It requires Sun JDK 8 and Blast or Mummer.

  7. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    PubMed

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  8. MICROSCOPE Mission: First Results of a Space Test of the Equivalence Principle.

    PubMed

    Touboul, Pierre; Métris, Gilles; Rodrigues, Manuel; André, Yves; Baghi, Quentin; Bergé, Joël; Boulanger, Damien; Bremer, Stefanie; Carle, Patrice; Chhun, Ratana; Christophe, Bruno; Cipolla, Valerio; Damour, Thibault; Danto, Pascale; Dittus, Hansjoerg; Fayet, Pierre; Foulon, Bernard; Gageant, Claude; Guidotti, Pierre-Yves; Hagedorn, Daniel; Hardy, Emilie; Huynh, Phuong-Anh; Inchauspe, Henri; Kayser, Patrick; Lala, Stéphanie; Lämmerzahl, Claus; Lebat, Vincent; Leseur, Pierre; Liorzou, Françoise; List, Meike; Löffler, Frank; Panet, Isabelle; Pouilloux, Benjamin; Prieur, Pascal; Rebray, Alexandre; Reynaud, Serge; Rievers, Benny; Robert, Alain; Selig, Hanns; Serron, Laura; Sumner, Timothy; Tanguy, Nicolas; Visser, Pieter

    2017-12-08

    According to the weak equivalence principle, all bodies should fall at the same rate in a gravitational field. The MICROSCOPE satellite, launched in April 2016, aims to test its validity at the 10^{-15} precision level, by measuring the force required to maintain two test masses (of titanium and platinum alloys) exactly in the same orbit. A nonvanishing result would correspond to a violation of the equivalence principle, or to the discovery of a new long-range force. Analysis of the first data gives δ(Ti,Pt)=[-1±9(stat)±9(syst)]×10^{-15} (1σ statistical uncertainty) for the titanium-platinum Eötvös parameter characterizing the relative difference in their free-fall accelerations.
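A quick way to read the quoted result is to combine the statistical and systematic contributions. Combining them in quadrature is a common convention (an assumption here, not the collaboration's own error budget):

```python
import math

# Eotvos parameter from the first MICROSCOPE data release, in units of 1e-15:
delta = -1.0
u_stat = 9.0  # 1-sigma statistical uncertainty
u_syst = 9.0  # systematic uncertainty

# Combine the two contributions in quadrature (a common convention,
# assumed here for illustration):
u_total = math.hypot(u_stat, u_syst)
sigmas = abs(delta) / u_total  # combined sigmas away from zero

print(f"delta(Ti,Pt) = {delta} +/- {u_total:.1f} (x1e-15), {sigmas:.2f} sigma from 0")
```

The result sits well within one combined sigma of zero, i.e. it is consistent with no equivalence-principle violation at roughly the 1e-14 level of this first analysis.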

  9. MICROSCOPE Mission: First Results of a Space Test of the Equivalence Principle

    NASA Astrophysics Data System (ADS)

    Touboul, Pierre; Métris, Gilles; Rodrigues, Manuel; André, Yves; Baghi, Quentin; Bergé, Joël; Boulanger, Damien; Bremer, Stefanie; Carle, Patrice; Chhun, Ratana; Christophe, Bruno; Cipolla, Valerio; Damour, Thibault; Danto, Pascale; Dittus, Hansjoerg; Fayet, Pierre; Foulon, Bernard; Gageant, Claude; Guidotti, Pierre-Yves; Hagedorn, Daniel; Hardy, Emilie; Huynh, Phuong-Anh; Inchauspe, Henri; Kayser, Patrick; Lala, Stéphanie; Lämmerzahl, Claus; Lebat, Vincent; Leseur, Pierre; Liorzou, Françoise; List, Meike; Löffler, Frank; Panet, Isabelle; Pouilloux, Benjamin; Prieur, Pascal; Rebray, Alexandre; Reynaud, Serge; Rievers, Benny; Robert, Alain; Selig, Hanns; Serron, Laura; Sumner, Timothy; Tanguy, Nicolas; Visser, Pieter

    2017-12-01

    According to the weak equivalence principle, all bodies should fall at the same rate in a gravitational field. The MICROSCOPE satellite, launched in April 2016, aims to test its validity at the 10^{-15} precision level, by measuring the force required to maintain two test masses (of titanium and platinum alloys) exactly in the same orbit. A nonvanishing result would correspond to a violation of the equivalence principle, or to the discovery of a new long-range force. Analysis of the first data gives δ(Ti,Pt)=[-1±9(stat)±9(syst)]×10^{-15} (1σ statistical uncertainty) for the titanium-platinum Eötvös parameter characterizing the relative difference in their free-fall accelerations.

  10. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

    2014-11-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data as well as experimental measurements. In this work, U_68.5 uncertainties are estimated at the 68.5% confidence level while U_95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
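One widely used correlation SNR metric of the kind discussed above is the primary peak ratio (PPR): the height of the tallest correlation peak divided by the height of the next-tallest. The sketch below builds a synthetic correlation plane and computes a PPR after subtracting the minimum correlation value, as the abstract describes; the plane, peak positions, and mask size are all illustrative assumptions, not the authors' exact definitions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic PIV-style correlation plane: a dominant correlation peak plus
# a spurious secondary peak on a noisy background (illustrative only).
plane = rng.random((64, 64)) * 0.2
plane[40, 25] = 1.0   # primary (true displacement) peak
plane[10, 50] = 0.45  # spurious secondary peak

# Subtract the minimum correlation value to remove the mean background
# contribution before forming the SNR metric.
plane = plane - plane.min()

# Primary peak ratio: primary peak height over the tallest value found
# after masking a small neighbourhood around the primary peak.
iy, ix = np.unravel_index(np.argmax(plane), plane.shape)
masked = plane.copy()
masked[max(iy - 3, 0):iy + 4, max(ix - 3, 0):ix + 4] = 0.0
ppr = plane[iy, ix] / masked.max()

print(f"primary peak at ({iy}, {ix}), PPR = {ppr:.2f}")
```

A high PPR indicates an unambiguous displacement estimate; as the secondary peak approaches the primary in height, the PPR falls toward 1 and the measurement uncertainty grows, which is the relationship the uncertainty models exploit.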

  11. Forest management under uncertainty for multiple bird population objectives

    USGS Publications Warehouse

    Moore, C.T.; Plummer, W.T.; Conroy, M.J.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.

  12. Solving Navigational Uncertainty Using Grid Cells on Robots

    PubMed Central

    Milford, Michael J.; Wiles, Janet; Wyeth, Gordon F.

    2010-01-01

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. 
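
    The idea of maintaining multiple pose hypotheses under perceptual ambiguity can be sketched with a discrete Bayes filter; this is a toy illustration, not the RatSLAM model. The corridor layout, observation model, and deterministic motion step are all invented: two identical-looking corridors are ambiguous until a cue unique to one of them is observed.

```python
import numpy as np

# Ten discrete poses: positions 0-4 lie in corridor A, 5-9 in corridor B.
n_pos = 10
belief = np.full(n_pos, 1.0 / n_pos)  # start fully uncertain

def update(belief, likelihood):
    """Bayes observation update over the discrete pose belief."""
    b = belief * likelihood
    return b / b.sum()

# Ambiguous observation: equally consistent with position 2 in either
# corridor, so two hypotheses survive (small floor avoids zero weights).
amb = np.zeros(n_pos)
amb[[2, 7]] = 1.0
belief = update(belief, amb + 1e-6)

# Motion update: shift the belief forward two steps (deterministic here).
belief = np.roll(belief, 2)

# Distinctive cue seen only at position 4 (corridor A) resolves the
# ambiguity: the correct hypothesis wins over time, as in the abstract.
cue = np.zeros(n_pos)
cue[4] = 1.0
belief = update(belief, cue + 1e-6)

print(np.argmax(belief))
```

    Until the distinctive cue arrives, the belief carries two near-equal hypotheses, which is the population-level behavior attributed to the conjunctive grid cells.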

  13. Detection thresholds for gaps, overlaps, and no-gap-no-overlaps.

    PubMed

    Heldner, Mattias

    2011-07-01

    Detection thresholds for gaps and overlaps, that is, acoustic and perceived silences and stretches of overlapping speech in speaker changes, were determined. Subliminal gaps and overlaps were categorized as no-gap-no-overlaps. The established gap and overlap detection thresholds both corresponded to the duration of a long vowel, or about 120 ms. These detection thresholds are valuable for mapping the perceptual speaker-change categories gaps, overlaps, and no-gap-no-overlaps into the acoustic domain. Furthermore, the detection thresholds allow generation and understanding of gaps, overlaps, and no-gap-no-overlaps in human-like spoken dialogue systems. © 2011 Acoustical Society of America

  14. Strong interplay between structure and electronic properties in CuIn(S,Se){2}: a first-principles study.

    PubMed

    Vidal, Julien; Botti, Silvana; Olsson, Pär; Guillemoles, Jean-François; Reining, Lucia

    2010-02-05

    We present a first-principles study of the electronic properties of CuIn(S,Se){2} (CIS) using state-of-the-art self-consistent GW and hybrid functionals. The calculated band gap depends strongly on the anion displacement u, an internal structural parameter that measures lattice distortion. This contrasts with the observed stability of the band gap of CIS solar panels under operating conditions, where a relatively large dispersion of values for u occurs. We solve this apparent paradox considering the coupled effect on the band gap of copper vacancies and lattice distortions. The correct treatment of d electrons in these materials requires going beyond density functional theory, and GW self-consistency is critical to evaluate the quasiparticle gap and the valence band maximum.

  15. Synthetic principles directing charge transport in low-band-gap dithienosilole-benzothiadiazole copolymers.

    PubMed

    Beaujuge, Pierre M; Tsao, Hoi Nok; Hansen, Michael Ryan; Amb, Chad M; Risko, Chad; Subbiah, Jegadesan; Choudhury, Kaushik Roy; Mavrinskiy, Alexei; Pisula, Wojciech; Brédas, Jean-Luc; So, Franky; Müllen, Klaus; Reynolds, John R

    2012-05-30

    Given the fundamental differences in carrier generation and device operation in organic thin-film transistors (OTFTs) and organic photovoltaic (OPV) devices, the material design principles to apply may be expected to differ. In this respect, designing organic semiconductors that perform effectively in multiple device configurations remains a challenge. Following "donor-acceptor" principles, we designed and synthesized an analogous series of solution-processable π-conjugated polymers that combine the electron-rich dithienosilole (DTS) moiety, unsubstituted thiophene spacers, and the electron-deficient core 2,1,3-benzothiadiazole (BTD). Insights into backbone geometry and wave function delocalization as a function of molecular structure are provided by density functional theory (DFT) calculations at the B3LYP/6-31G(d,p) level. Using a combination of X-ray techniques (2D-WAXS and XRD) supported by solid-state NMR (SS-NMR) and atomic force microscopy (AFM), we demonstrate fundamental correlations between the polymer repeat-unit structure, molecular weight distribution, nature of the solubilizing side-chains appended to the backbones, and extent of structural order attainable in p-channel OTFTs. In particular, it is shown that the degree of microstructural order achievable in the self-assembled organic semiconductors increases markedly with (i) increasing molecular weight and (ii) appropriate solubilizing-group substitution. The corresponding field-effect hole mobilities are enhanced by several orders of magnitude, reaching up to 0.1 cm(2) V(-1) s(-1) with the highest molecular weight fraction of the branched alkyl-substituted polymer derivative in this series. This trend is reflected in conventional bulk-heterojunction OPV devices using PC(71)BM, whereby the active layers exhibit space-charge-limited (SCL) hole mobilities approaching 10(-3) cm(2) V(-1) s(-1), and yield improved power conversion efficiencies on the order of 4.6% under AM1.5G solar illumination.

  16. Gaps in knowledge and data driving uncertainty in models of photosynthesis.

    PubMed

    Dietze, Michael C

    2014-02-01

    Regional and global models of the terrestrial biosphere depend critically on models of photosynthesis when predicting impacts of global change. This paper focuses on identifying the primary data needs of these models, what scales drive uncertainty, and how to improve measurements. Overall, there is a need for an open, cross-discipline database on leaf-level photosynthesis in general, and response curves in particular. The parameters in photosynthetic models are not constant through time, space, or canopy position, and there is a need for a better understanding of whether relationships with drivers, such as leaf nitrogen, are themselves scale dependent. Across time scales, as ecosystem models become more sophisticated in their representations of succession they need to be able to approximate sunfleck responses to capture understory growth and survival. At both high and low latitudes, photosynthetic data are inadequate in general and there is a particular need to better understand thermal acclimation. Simple models of acclimation suggest that shifts in optimal temperature are important. However, there is little advantage to synoptic-scale responses and circadian rhythms may be more beneficial than acclimation over shorter timescales. At high latitudes, there is a need for a better understanding of low-temperature photosynthetic limits, while at low latitudes the need is for a better understanding of phosphorus limitations on photosynthesis. In terms of sampling, measuring multivariate photosynthetic response surfaces is potentially more efficient and more accurate than traditional univariate response curves. Finally, there is a need for greater community involvement in model validation and model-data synthesis.

  17. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty.
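
    A Monte Carlo scheme of this kind can be sketched as follows. The flow and rainfall series are synthetic, the runoff ratio is used as the example signature, and the 10% multiplicative error model is a crude stand-in for rating-curve and rain-gauge uncertainty; none of these choices reproduce the study's data or error models.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily flow and rainfall records (mm/day), standing in
# for observed series such as those from the Mahurangi or Brue.
flow = rng.gamma(shape=2.0, scale=1.5, size=365)
rain = rng.gamma(shape=2.0, scale=4.0, size=365)

def runoff_ratio(q, p):
    """Signature: total runoff divided by total rainfall."""
    return q.sum() / p.sum()

# Monte Carlo sampling: perturb both series with multiplicative
# measurement errors (assumed 10% at one standard deviation) and
# recompute the signature for each realisation.
n_samples = 2000
samples = np.empty(n_samples)
for i in range(n_samples):
    q_pert = flow * rng.normal(1.0, 0.10, size=flow.size)
    p_pert = rain * rng.normal(1.0, 0.10, size=rain.size)
    samples[i] = runoff_ratio(q_pert, p_pert)

lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"runoff-ratio 95% interval: [{lo:.3f}, {hi:.3f}]")
```

    The spread of the sampled signature values then quantifies how data uncertainty propagates into the signature; the same loop works for any signature function.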

  18. Modulation of band gap by an applied electric field in BN-based heterostructures

    NASA Astrophysics Data System (ADS)

    Luo, M.; Xu, Y. E.; Zhang, Q. X.

    2018-05-01

    First-principles density functional theory (DFT) calculations are performed on the structural and electronic properties of the SiC/BN van der Waals (vdW) heterostructures under an external electric field (E-field). Our results reveal that the SiC/BN vdW heterostructure has a direct band gap of 2.41 eV in the absence of an external field. The results also imply that electrons are likely to transfer from the BN to the SiC monolayer due to the deeper potential of the BN monolayer. It is also observed that, by applying an E-field ranging from -0.50 to +0.65 V/Å, the band gap decreases from 2.41 eV to zero, following a parabola-like relationship centered near 0.0 V/Å. Partial density of states (PDOS) plots reveal that the p orbitals of the Si, C, B, and N atoms are responsible for the significant variations of the band gap. These results suggest that the electric-field-tunable band gap of the SiC/BN vdW heterostructures holds promise for nanoelectronic and spintronic devices.
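
    The reported parabola-like gap-versus-field relationship can be illustrated with a quadratic fit. The data points below are synthetic, not the DFT results: the curvature coefficient 5.7 is invented so that the gap closes near the quoted field limits, and only the zero-field gap of 2.41 eV is taken from the abstract.

```python
import numpy as np

# Hypothetical band-gap values (eV) versus applied E-field (V/Angstrom),
# mimicking a parabola-like closing of the 2.41 eV zero-field gap.
field = np.array([-0.50, -0.30, -0.10, 0.00, 0.10, 0.30, 0.50, 0.65])
gap = np.clip(2.41 - 5.7 * field**2, 0.0, None)  # synthetic data

# Quadratic least-squares fit: the constant term recovers the
# zero-field gap, the quadratic term the curvature.
a, b, c = np.polyfit(field, gap, deg=2)
print(f"gap(E) ~= {a:.2f} E^2 {b:+.2f} E + {c:.2f} eV")
```

    A near-zero linear term and a negative quadratic term express the symmetry of the gap closing around zero field.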

  19. First-Principles Study of Impurities in TlBr

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Mao-Hua

    2012-01-01

    TlBr is a promising semiconductor material for room-temperature radiation detection. Material purification has been the driver for the recent improvement in the TlBr detector performance, mainly reflected by the significant increase in the carrier mobility-lifetime product. This suggests that impurities have significant impact on the carrier transport in TlBr. In this paper, first-principles calculations are used to study the properties of a number of commonly observed impurities in TlBr. The impurity-induced gap states are presented and their effects on the carrier trapping are discussed.

  20. First-principles study of impurities in TlBr

    NASA Astrophysics Data System (ADS)

    Du, Mao-Hua

    2012-04-01

    TlBr is a promising semiconductor material for room-temperature radiation detection. Material purification has been the driver for the recent improvement in the TlBr detector performance, mainly reflected by the significant increase in the carrier mobility-lifetime product. This suggests that impurities have significant impact on the carrier transport in TlBr. In this paper, first-principles calculations are used to study the properties of a number of commonly observed impurities in TlBr. The impurity-induced gap states are presented and their effects on the carrier trapping are discussed.

  1. New insights into the opening band gap of graphene oxides

    NASA Astrophysics Data System (ADS)

    Tran, Ngoc Thanh Thuy; Lin, Shih-Yang; Lin, Ming-Fa

    Electronic properties of oxygen-absorbed few-layer graphenes are investigated using first-principles calculations. They are very sensitive to changes in the oxygen concentration, number of graphene layers, and stacking configuration. The feature-rich band structures exhibit the destruction or distortion of the Dirac cone, opening of a band gap, anisotropic energy dispersions, O- and (C,O)-dominated energy dispersions, and extra critical points. The band-decomposed charge distributions reveal the π-bonding-dominated energy gap. The orbital-projected density of states (DOS) has many special structures mainly coming from composite energy bands, the parabolic and partially flat ones. The DOS and spatial charge distributions clearly indicate the critical orbital hybridizations in O-O, C-O and C-C bonds, which are responsible for the diversified properties. All of the few-layer graphene oxides are semi-metals except for the semiconducting monolayer ones.

  2. The Effects of Graphene Stacking on the Performance of Methane Sensor: A First-Principles Study on the Adsorption, Band Gap and Doping of Graphene

    PubMed Central

    Yang, Daoguo; Zhang, Guoqi; Chen, Liangbiao; Liu, Dongjing; Cai, Miao; Fan, Xuejun

    2018-01-01

    The effects of graphene stacking are investigated by comparing the results of methane adsorption energy, electronic performance, and the doping feasibility of five dopants (i.e., B, N, Al, Si, and P) via first-principles theory. Both zigzag and armchair graphenes are considered. It is found that the zigzag graphene with Bernal stacking has the largest adsorption energy on methane, while the armchair graphene with Order stacking shows the opposite trend. In addition, both the Order- and Bernal-stacked graphenes possess a positive linear relationship between adsorption energy and layer number. Furthermore, they always have larger adsorption energy in zigzag graphene. For electronic properties, the results show that the stacking effects on the band gap are significant, but they do not cause large changes to the band structure and density of states. Comparing interlayer distances, the Order-stacked graphene has the largest average interlamellar spacing. Moreover, the adsorption effect is the result of the interactions between graphene and methane combined with the change of graphene’s structure. Lastly, the armchair graphene with Order stacking possesses the lowest formation energy among these five dopants. It could be the best choice for doping to improve the methane adsorption. PMID:29389860

  3. The Effects of Graphene Stacking on the Performance of Methane Sensor: A First-Principles Study on the Adsorption, Band Gap and Doping of Graphene.

    PubMed

    Yang, Ning; Yang, Daoguo; Zhang, Guoqi; Chen, Liangbiao; Liu, Dongjing; Cai, Miao; Fan, Xuejun

    2018-02-01

    The effects of graphene stacking are investigated by comparing the results of methane adsorption energy, electronic performance, and the doping feasibility of five dopants (i.e., B, N, Al, Si, and P) via first-principles theory. Both zigzag and armchair graphenes are considered. It is found that the zigzag graphene with Bernal stacking has the largest adsorption energy on methane, while the armchair graphene with Order stacking shows the opposite trend. In addition, both the Order- and Bernal-stacked graphenes possess a positive linear relationship between adsorption energy and layer number. Furthermore, they always have larger adsorption energy in zigzag graphene. For electronic properties, the results show that the stacking effects on the band gap are significant, but they do not cause large changes to the band structure and density of states. Comparing interlayer distances, the Order-stacked graphene has the largest average interlamellar spacing. Moreover, the adsorption effect is the result of the interactions between graphene and methane combined with the change of graphene's structure. Lastly, the armchair graphene with Order stacking possesses the lowest formation energy among these five dopants. It could be the best choice for doping to improve the methane adsorption.

  4. The principle of finiteness - a guideline for physical laws

    NASA Astrophysics Data System (ADS)

    Sternlieb, Abraham

    2013-04-01

    I propose a new principle in physics: the principle of finiteness (FP). It stems from the definition of physics as a science that deals with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, FP postulates that the mathematical formulation of legitimate laws in physics should prevent exactly zero or infinite solutions. I propose finiteness as a postulate, as opposed to a statement whose validity has to be corroborated by, or derived theoretically or experimentally from, other facts, theories or principles. Some consequences of FP are discussed, first in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The corrected Lorentz transformations include an additional translation term depending on the minimum length epsilon. The relativistic gamma is replaced by a corrected gamma that is finite for v=c. To comply with FP, physical laws should include the relevant extremum finite values in their mathematical formulation. An important prediction of FP is that there is a maximum attainable relativistic mass/energy which is the same for all subatomic particles, meaning that there is a maximum theoretical value for cosmic-ray energy. The Generalized Uncertainty Principle required by Quantum Gravity is actually a necessary consequence of FP at the Planck scale. Therefore, FP may possibly contribute to the axiomatic foundation of Quantum Gravity.

  5. Identifying acne treatment uncertainties via a James Lind Alliance Priority Setting Partnership

    PubMed Central

    Layton, Alison; Eady, E Anne; Peat, Maggie; Whitehouse, Heather; Levell, Nick; Ridd, Matthew; Cowdell, Fiona; Patel, Mahenda; Andrews, Stephen; Oxnard, Christine; Fenton, Mark; Firkins, Lester

    2015-01-01

    Objectives The Acne Priority Setting Partnership (PSP) was set up to identify and rank treatment uncertainties by bringing together people with acne, and professionals providing care within and beyond the National Health Service (NHS). Setting The UK with international participation. Participants Teenagers and adults with acne, parents, partners, nurses, clinicians, pharmacists, private practitioners. Methods Treatment uncertainties were collected via separate online harvesting surveys, embedded within the PSP website, for patients and professionals. A wide variety of approaches were used to promote the surveys to stakeholder groups with a particular emphasis on teenagers and young adults. Survey submissions were collated using keywords and verified as uncertainties by appraising existing evidence. The 30 most popular themes were ranked via weighted scores from an online vote. At a priority setting workshop, patients and professionals discussed the 18 highest-scoring questions from the vote, and reached consensus on the top 10. Results In the harvesting survey, 2310 people, including 652 professionals and 1456 patients (58% aged 24 y or younger), made submissions containing at least one research question. After checking for relevance and rephrasing, a total of 6255 questions were collated into themes. Valid votes ranking the 30 most common themes were obtained from 2807 participants. The top 10 uncertainties prioritised at the workshop were largely focused on management strategies, optimum use of common prescription medications and the role of non-drug based interventions. More female than male patients took part in the harvesting surveys and vote. A wider range of uncertainties were provided by patients compared to professionals. Conclusions Engaging teenagers and young adults in priority setting is achievable using a variety of promotional methods. 
The top 10 uncertainties reveal an extensive knowledge gap about widely used interventions.

  6. Accounting for uncertainty in health economic decision models by using model averaging.

    PubMed

    Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D

    2009-04-01

    Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment.
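
    The weighting schemes mentioned above can be sketched as follows. The AIC values and per-model estimates below are invented for illustration; the function implements the standard information-criterion weighting (weights proportional to exp(-0.5 * delta IC)), not the paper's specific analysis of the aneurysm-repair models.

```python
import numpy as np

def ic_weights(ic_values):
    """Normalized model-averaging weights from AIC or BIC values:
    w_m proportional to exp(-0.5 * (IC_m - IC_min)), so smaller IC
    (better model) receives larger weight."""
    ic = np.asarray(ic_values, dtype=float)
    w = np.exp(-0.5 * (ic - ic.min()))
    return w / w.sum()

# Hypothetical AIC values for three plausible model structures
aic = [210.3, 212.1, 215.8]
weights = ic_weights(aic)

# Model-averaged output, e.g. an incremental cost estimate per model
estimates = np.array([12.4, 10.9, 14.2])  # illustrative numbers
averaged = float(weights @ estimates)
print(weights.round(3), round(averaged, 2))
```

    Whether AIC- or BIC-based weights are appropriate is exactly the predictive-versus-consistency choice the abstract discusses; the mechanics of the averaging are identical.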

  7. Accounting for uncertainty in health economic decision models by using model averaging

    PubMed Central

    Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D

    2009-01-01

    Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment. PMID:19381329

  8. Design Principles for the Atomic and Electronic Structure of Halide Perovskite Photovoltaic Materials: Insights from Computation.

    PubMed

    Berger, Robert F

    2018-02-09

    In the current decade, perovskite solar cell research has emerged as a remarkably active, promising, and rapidly developing field. Alongside breakthroughs in synthesis and device engineering, halide perovskite photovoltaic materials have been the subject of predictive and explanatory computational work. In this Minireview, we focus on a subset of this computation: density functional theory (DFT)-based work highlighting the ways in which the electronic structure and band gap of this class of materials can be tuned via changes in atomic structure. We distill this body of computational literature into a set of underlying design principles for the band gap engineering of these materials, and rationalize these principles from the viewpoint of band-edge orbital character. We hope that this perspective provides guidance and insight toward the rational design and continued improvement of perovskite photovoltaics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Alcohol molecules adsorption on graphane nanosheets - A first-principles investigation

    NASA Astrophysics Data System (ADS)

    Nagarajan, V.; Chandiramouli, R.

    2018-05-01

    The geometric structure, electronic and adsorption properties of methanol, ethanol and 1-propanol molecules on hydrogenated graphene (graphane) were investigated using first-principles calculations. The stability of graphane base material is confirmed using formation energy and phonon band structures. The adsorption of alcohol molecules on bare graphane and hydrogen vacant graphane nanosheet is found to be prominent and the selectivity of alcohol molecules can be achieved using bare or hydrogen vacant graphane nanosheet. Moreover, the interaction of alcohol molecules on bare and hydrogen vacant graphane nanosheets is studied using the adsorption energy, energy band gap variation, Bader charge transfer and average energy band gap variation. The adsorption energy ranges from -0.149 to -0.383 eV upon alcohol adsorption. The energy gap varies from 4.71 to 2.62 eV for bare graphane and from 4.02 to 3.60 eV for hydrogen vacant graphane nanosheets upon adsorption of alcohol molecules. The adsorption properties of alcohol molecules provide useful information for the possible application of graphane nanosheet as a base material for the detection of alcohol molecules.

  10. Doping and band gap control at poly(vinylidene fluoride)/graphene interface

    NASA Astrophysics Data System (ADS)

    Cai, Jia; Wang, Jian-Lu; Gao, Heng; Tian, Bobo; Gong, Shi-Jing; Duan, Chun-Gang; Chu, Jun-Hao

    2018-05-01

    Using density-functional first-principles calculations, we investigate the electronic structures of poly(vinylidene fluoride) PVDF/graphene composite systems. The n- and p-doping of graphene can be flexibly switched by reversing the ferroelectric polarization of PVDF, without sacrificing the intrinsic π-electron band dispersions of graphene that are usually undermined by chemical doping. The doping degree also depends on the thickness of the PVDF layers, saturating when the PVDF is sufficiently thick. In the PVDF/bilayer graphene (BLG) heterostructure, the doping degree directly determines the local energy gap of the charged BLG. The sandwich structure of PVDF/BLG/PVDF can further enhance the local energy gap while keeping the electric neutrality of the BLG, which holds great application potential in graphene-based nanoelectronics.

  11. First principles investigation of GaNbO{sub 4} as a photocatalytic material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Neelam, E-mail: sneelam@issc.unipune.ac.in; Verma, Mukta; Shah, Vaishali

    We have performed first-principles density functional total energy calculations on pure and doped GaNbO{sub 4} to investigate its applicability as a photocatalyst. Pure GaNbO{sub 4} is an indirect, wide band gap semiconductor similar to the widely investigated TiO{sub 2}, which is known to be a photocatalyst in UV light [K. Yang et al. Chem. Mater. 20, 6528 (2008)]. S atom doping of TiO{sub 2} reduces the band gap [F. Tian et al. J. Phys. Chem. B 110, 17866 (2006)], and increases its efficiency in the visible light range. It has been experimentally reported that S doping of GaNbO{sub 4} at the O site decreases its photocatalytic efficiency. Our band structure calculations show that both pure and doped GaNbO{sub 4} have indirect band gaps and S atom doping reduces the band gap in agreement with experiments. The decrease in the band gap is due to the lowering of the conduction band minimum towards the Fermi level. An unequal reduction in the band gap was observed at the four inequivalent O sites chosen for S doping. This suggests that the photocatalytic activity varies with the dopant site.

  12. Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons

    PubMed Central

    Fetsch, Christopher R.

    2013-01-01

    The richness of perceptual experience, as well as its usefulness for guiding behavior, depends upon the synthesis of information across multiple senses. Recent decades have witnessed a surge in our understanding of how the brain combines sensory signals, or cues. Much of this research has been guided by one of two distinct approaches, one driven primarily by neurophysiological observations, the other guided by principles of mathematical psychology and psychophysics. Conflicting results and interpretations have contributed to a conceptual gap between psychophysical and physiological accounts of cue integration, but recent studies of visual-vestibular cue integration have narrowed this gap considerably. PMID:23686172

  13. Uncertainty and the difficulty of thinking through disjunctions.

    PubMed

    Shafir, E

    1994-01-01

    This paper considers the relationship between decision under uncertainty and thinking through disjunctions. Decision situations that lead to violations of Savage's sure-thing principle are examined, and a variety of simple reasoning problems that often generate confusion and error are reviewed. The common difficulty is attributed to people's reluctance to think through disjunctions. Instead of hypothetically traveling through the branches of a decision tree, it is suggested, people suspend judgement and remain at the node. This interpretation is applied to instances of decision making, information search, deductive and inductive reasoning, probabilistic judgement, games, puzzles and paradoxes. Some implications of the reluctance to think through disjunctions, as well as potential corrective procedures, are discussed.

  14. Carcinoma-astrocyte gap junctions promote brain metastasis by cGAMP transfer.

    PubMed

    Chen, Qing; Boire, Adrienne; Jin, Xin; Valiente, Manuel; Er, Ekrem Emrah; Lopez-Soto, Alejandro; Jacob, Leni; Patwa, Ruzeen; Shah, Hardik; Xu, Ke; Cross, Justin R; Massagué, Joan

    2016-05-26

    Brain metastasis represents a substantial source of morbidity and mortality in various cancers, and is characterized by high resistance to chemotherapy. Here we define the role of the most abundant cell type in the brain, the astrocyte, in promoting brain metastasis. We show that human and mouse breast and lung cancer cells express protocadherin 7 (PCDH7), which promotes the assembly of carcinoma-astrocyte gap junctions composed of connexin 43 (Cx43). Once engaged with the astrocyte gap-junctional network, brain metastatic cancer cells use these channels to transfer the second messenger cGAMP to astrocytes, activating the STING pathway and production of inflammatory cytokines such as interferon-α (IFNα) and tumour necrosis factor (TNF). As paracrine signals, these factors activate the STAT1 and NF-κB pathways in brain metastatic cells, thereby supporting tumour growth and chemoresistance. The orally bioavailable modulators of gap junctions meclofenamate and tonabersat break this paracrine loop, and we provide proof-of-principle that these drugs could be used to treat established brain metastasis.

  15. Relating the defect band gap and the density functional band gap

    NASA Astrophysics Data System (ADS)

    Schultz, Peter; Edwards, Arthur

    2014-03-01

    Density functional theory (DFT) is an important tool to probe the physics of materials. The Kohn-Sham (KS) gap in DFT is typically (much) smaller than the observed band gap for materials in nature, the infamous "band gap problem." Accurate prediction of defect energy levels is often claimed to be a casualty, since the band gap defines the energy scale for defect levels. By applying rigorous control of boundary conditions in size-converged supercell calculations, however, we compute defect levels in Si and GaAs with accuracies of ~0.1 eV, across the full gap, unhampered by a band gap problem. Using GaAs as a theoretical laboratory, we show that the defect band gap, the span of computed defect levels, is insensitive to variations in the KS gap (with functional and pseudopotential), these KS gaps ranging from 0.1 to 1.1 eV. The defect gap matches the experimental 1.52 eV gap. The computed defect gaps for several other III-V, II-VI, I-VII, and other compounds also agree with the experimental gaps, and show no correlation with the KS gap. Where, then, is the band gap problem? This talk presents these results and discusses why the defect gap and the KS gap are distinct, implying that the current understanding of what the "band gap problem" means, and how to "fix" it, needs to be rethought. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Company, for the U.S. Department of Energy's NNSA under contract DE-AC04-94AL85000.

  16. Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.

    PubMed

    Spadavecchia, L; Williams, M; Law, B E

    2011-07-01

    We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty on the CO2 flux resultant from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small (~10% of the total net flux), while parameterization uncertainty was larger, ~50% of the total net flux. The largest source of driver uncertainty was due to temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was 57% of the total net flux. However, when the nearest meteorological station was > 100 km from the study site, uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%. Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly
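    The variance-partitioning idea described in this record (propagate parameter draws and driver draws through the model separately, then compare the resulting flux spreads) can be sketched with a toy model. The model form, parameter names and all numerical values below are illustrative assumptions, not the DALEC model or figures from the study:

```python
import random

def nee_model(q10, base_resp, temp):
    # Toy daily flux model: respiration responds exponentially to temperature.
    # Illustrative stand-in, NOT the DALEC model used in the study.
    return base_resp * (q10 ** (temp / 10.0))

random.seed(0)

# Ensembles standing in for EnKF parameter draws and geostatistical driver draws.
param_draws = [(random.gauss(2.0, 0.2), random.gauss(5.0, 0.5)) for _ in range(2000)]
driver_draws = [random.gauss(15.0, 1.0) for _ in range(2000)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Partition: vary one source of uncertainty while fixing the other at its mean.
flux_param_only = [nee_model(q10, rb, 15.0) for q10, rb in param_draws]
flux_driver_only = [nee_model(2.0, 5.0, t) for t in driver_draws]

v_param = variance(flux_param_only)
v_driver = variance(flux_driver_only)
share_param = v_param / (v_param + v_driver)
print(f"parameter share of flux variance: {share_param:.0%}")
```

    With these assumed spreads the parameter ensemble dominates the flux variance, mirroring the study's finding that parameterization uncertainty exceeded driver uncertainty when nearby weather stations were available.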

  17. Optical spectroscopy and band gap analysis of hybrid improper ferroelectric Ca3Ti2O7

    NASA Astrophysics Data System (ADS)

    Musfeldt, Janice; Cherian, Judy; Birol, Turan; Harms, Nathan; Gao, Bin; Cheong, Sang; Vanderbilt, David

    We bring together optical absorption spectroscopy, photoconductivity, and first principles calculations to reveal the electronic structure of the room temperature ferroelectric Ca3Ti2O7. The 3.94 eV direct gap in Ca3Ti2O7 is charge transfer in nature and noticeably higher than that in CaTiO3 (3.4 eV), a finding that we attribute to dimensional confinement in the n = 2 member of the Ruddlesden-Popper series. While Sr substitution introduces disorder and broadens the gap edge slightly, oxygen deficiency reduces the gap to 3.7 eV and gives rise to a broad tail that persists to much lower energies. MSD, BES, U. S. DoE and DMREF, NSF.

  18. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

    This study was conducted to investigate the correlation among uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers. Self-report questionnaires were used to measure the variables: uncertainty, mastery and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84~.94. Mastery correlated negatively with uncertainty (r = -.444, p = .000) and with danger appraisal of uncertainty (r = -.514, p = .000). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in hospitalized children's mothers. Therefore, nursing interventions which improve mastery must be developed for hospitalized children's mothers.

  19. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
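    Constraint (a) above rests on the convexity of damages in the uncertain quantity: by Jensen's inequality, widening the spread of the distribution at a fixed best estimate raises expected damages. A minimal Monte Carlo sketch (the quadratic damage function and all values are illustrative assumptions; only convexity matters for the argument):

```python
import random

random.seed(1)

def damages(sensitivity):
    # Convex damage function of climate sensitivity (illustrative form only;
    # the argument requires convexity, not this exact shape).
    return sensitivity ** 2

def expected_damage(mean, spread, n=100_000):
    draws = [random.gauss(mean, spread) for _ in range(n)]
    return sum(damages(x) for x in draws) / n

low_u = expected_damage(3.0, 0.5)   # narrow uncertainty about sensitivity
high_u = expected_damage(3.0, 1.5)  # broad uncertainty, same best estimate
print(high_u > low_u)  # greater uncertainty -> greater expected damages
```

    Note that both runs share the same mean of 3.0; only the spread differs, so the increase in expected damage comes entirely from uncertainty.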

  20. Image restoration, uncertainty, and information.

    PubMed

    Yu, F T

    1969-01-01

    Some physical interpretations of image restoration are discussed. From the theory of information, the unrealizability of an inverse filter can be explained by the degradation of information caused by distortion of the recorded image. Image restoration is a time and space problem, which can be recognized from the theory of relativity (the problem of image restoration is related to Heisenberg's uncertainty principle in quantum mechanics). A detailed discussion of the relationship between information and energy is given. Two general results may be stated: (1) the restoration of the image from the distorted signal is possible only if it satisfies the detectability condition; however, the restored image can, at best, only approach the maximum allowable time criterion. (2) The restoration of an image by superimposing the distorted signal (due to smearing) is a physically unrealizable method; however, this restoration procedure may be achieved by the expenditure of an infinite amount of energy.
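    The unrealizability of a naive inverse filter can be seen in a few lines: a smearing filter attenuates some frequency components almost to zero, and inverting it divides the recorded (noisy) signal by those near-zero gains. The gains and noise values below are assumed for illustration, not taken from the paper:

```python
# Per-frequency gains of an assumed blur: the last component is nearly destroyed.
H = [1.0, 0.5, 0.1, 1e-6]
signal = [1.0, 1.0, 1.0, 1.0]      # true spectrum
noise = [0.0, 0.0, 0.0, 1e-3]      # tiny recording noise on the weakest bin

recorded = [h * s + n for h, s, n in zip(H, signal, noise)]
restored = [r / h for r, h in zip(recorded, H)]  # naive inverse filter

# Well-preserved components restore exactly; the nearly-destroyed one blows up,
# because the information carried there was lost to the noise floor.
print(restored)
```

    The noise on the attenuated bin is amplified by 1/H, so the restored value there is off by orders of magnitude, which is the information-degradation point the abstract makes.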

  1. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    PubMed

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT, both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses (cognitive, emotional, and behavioral) provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth

  2. Production of photocurrent due to intermediate-to-conduction-band transitions: a demonstration of a key operating principle of the intermediate-band solar cell.

    PubMed

    Martí, A; Antolín, E; Stanley, C R; Farmer, C D; López, N; Díaz, P; Cánovas, E; Linares, P G; Luque, A

    2006-12-15

    We present intermediate-band solar cells manufactured using quantum dot technology that show for the first time the production of photocurrent when two sub-band-gap energy photons are absorbed simultaneously. One photon produces an optical transition from the intermediate-band to the conduction band while the second pumps an electron from the valence band to the intermediate-band. The detection of this two-photon absorption process is essential to verify the principles of operation of the intermediate-band solar cell. The phenomenon is the cornerstone physical principle that ultimately allows the production of photocurrent in a solar cell by below band gap photon absorption, without degradation of its output voltage.

  3. Direct Band Gap Gallium Antimony Phosphide (GaSbxP1−x) Alloys

    PubMed Central

    Russell, H. B.; Andriotis, A. N.; Menon, M.; Jasinski, J. B.; Martinez-Garcia, A.; Sunkara, M. K.

    2016-01-01

    Here, we report direct band gap transition for Gallium Phosphide (GaP) when alloyed with just 1–2 at% antimony (Sb) utilizing both density functional theory based computations and experiments. First principles density functional theory calculations of GaSbxP1−x alloys in a 216 atom supercell configuration indicate that an indirect to direct band gap transition occurs at x = 0.0092 or higher Sb incorporation into GaSbxP1−x. Furthermore, these calculations indicate band edge straddling of the hydrogen evolution and oxygen evolution reactions for compositions ranging from x = 0.0092 Sb up to at least x = 0.065 Sb making it a candidate for use in a Schottky type photoelectrochemical water splitting device. GaSbxP1−x nanowires were synthesized by reactive transport utilizing a microwave plasma discharge with average compositions ranging from x = 0.06 to x = 0.12 Sb and direct band gaps between 2.21 eV and 1.33 eV. Photoelectrochemical experiments show that the material is photoactive with p-type conductivity. This study brings attention to a relatively uninvestigated, tunable band gap semiconductor system with tremendous potential in many fields. PMID:26860470

  4. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
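    The uncertainty computation described in this record (per-property uncertainty = sensitivity times typical measurement uncertainty, combined across properties) can be sketched as follows. All numerical values are illustrative placeholders, not figures from the study, and independent errors combined in quadrature are an assumption of this sketch:

```python
import math

# name: (sensitivity dDRF/dx in W m^-2 per unit, measurement uncertainty dx)
properties = {
    "aerosol_optical_depth": (-30.0, 0.01),
    "single_scattering_albedo": (25.0, 0.03),
    "asymmetry_parameter": (10.0, 0.02),
    "surface_albedo": (8.0, 0.02),
}

# Per-property uncertainty = sensitivity x measurement uncertainty,
# combined in quadrature assuming independent errors.
contributions = {k: s * dx for k, (s, dx) in properties.items()}
total = math.sqrt(sum(c ** 2 for c in contributions.values()))

largest = max(contributions, key=lambda k: abs(contributions[k]))
print(f"total DRF uncertainty: {total:.2f} W m^-2, dominated by {largest}")
```

    With these assumed numbers, single scattering albedo dominates the total, consistent with the abstract's finding that it is usually the largest contributor.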

  5. Quantum-memory-assisted entropic uncertainty relation in a Heisenberg XYZ chain with an inhomogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu

    2017-06-01

    The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements of a pair of incompatible observables in a quantum system. It is therefore of essential importance for quantum precision measurement in quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by thermal entanglement in a spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement through the reduction of the thermal entanglement: higher temperature, a stronger magnetic field or larger field inhomogeneity results in inflation of the uncertainty. Besides, distinct dynamical behaviors of the uncertainty are found for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we also verify that the measurement uncertainty is strongly anti-correlated with the purity of the bipartite spin system: greater purity results in lower measurement uncertainty, and vice versa. Our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in versatile systems, particularly solid-state systems.
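    The QMA-EUR referred to above is not spelled out in the abstract; in its standard (Berta et al.) form it reads:

```latex
% Quantum-memory-assisted entropic uncertainty relation for measurements
% Q and R on system A, with quantum memory B:
S(Q \mid B) + S(R \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B),
\qquad
c = \max_{i,j} \bigl| \langle \psi_i \mid \phi_j \rangle \bigr|^2 ,
```

    where S(X|B) is the conditional von Neumann entropy after measuring X on A, and c quantifies the complementarity of the two observables' eigenbases. Since S(A|B) can be negative for entangled states, entanglement with the memory tightens the bound; this is why, in the record above, reduction of the thermal entanglement (by temperature or field) lifts the measurement uncertainty.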

  6. Rapid research and implementation priority setting for wound care uncertainties.

    PubMed

    Gray, Trish A; Dumville, Jo C; Christie, Janice; Cullum, Nicky A

    2017-01-01

    People with complex wounds are more likely to be elderly, living with multimorbidity and wound-related symptoms. A variety of products are available for managing complex wounds and a range of healthcare professionals are involved in wound care, yet there is a lack of good evidence to guide practice and services. These factors create uncertainty for those who deliver and those who manage wound care. Formal priority setting for research and implementation topics is needed to more accurately target the gaps in treatment and services. We solicited practitioner and manager uncertainties in wound care and held a priority setting workshop to facilitate a collaborative approach to prioritising wound care-related uncertainties. We recruited healthcare professionals who regularly cared for patients with complex wounds, were wound care specialists or managed wound care services. Participants submitted up to five wound care uncertainties in consultation with their colleagues via an on-line survey, and attended a priority setting workshop. Submitted uncertainties were collated, sorted and categorised according to professional group. On the day of the workshop, participants were divided into four groups depending on their profession. Uncertainties submitted by their professional group were viewed, discussed and amended prior to the first of three individual voting rounds. Participants cast up to ten votes for the uncertainties they judged as being high priority. Continuing in the professional groups, the top 10 uncertainties from each group were displayed, and the process was repeated. Groups were then brought together for a plenary session in which the final priorities were individually scored on a scale of 0-10 by participants. Priorities were ranked and results presented. Nominal group technique was used for generating the final uncertainties, voting and discussions. Thirty-three participants attended the workshop comprising: 10 specialist nurses, 10 district nurses, seven

  7. Rapid research and implementation priority setting for wound care uncertainties

    PubMed Central

    Dumville, Jo C.; Christie, Janice; Cullum, Nicky A.

    2017-01-01

    Introduction People with complex wounds are more likely to be elderly, living with multimorbidity and wound-related symptoms. A variety of products are available for managing complex wounds and a range of healthcare professionals are involved in wound care, yet there is a lack of good evidence to guide practice and services. These factors create uncertainty for those who deliver and those who manage wound care. Formal priority setting for research and implementation topics is needed to more accurately target the gaps in treatment and services. We solicited practitioner and manager uncertainties in wound care and held a priority setting workshop to facilitate a collaborative approach to prioritising wound care-related uncertainties. Methods We recruited healthcare professionals who regularly cared for patients with complex wounds, were wound care specialists or managed wound care services. Participants submitted up to five wound care uncertainties in consultation with their colleagues via an on-line survey, and attended a priority setting workshop. Submitted uncertainties were collated, sorted and categorised according to professional group. On the day of the workshop, participants were divided into four groups depending on their profession. Uncertainties submitted by their professional group were viewed, discussed and amended prior to the first of three individual voting rounds. Participants cast up to ten votes for the uncertainties they judged as being high priority. Continuing in the professional groups, the top 10 uncertainties from each group were displayed, and the process was repeated. Groups were then brought together for a plenary session in which the final priorities were individually scored on a scale of 0–10 by participants. Priorities were ranked and results presented. Nominal group technique was used for generating the final uncertainties, voting and discussions. Results Thirty-three participants attended the workshop comprising: 10 specialist nurses

  8. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in the structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of the structural parameters. The probability distributions of the structural parameters were further updated through the Bayesian approach with Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and a pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. ?? 2008 ASCE.
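    The core idea of structural-parameter uncertainty can be sketched by drawing one semivariogram parameter (the range) from a beta prior and propagating it to the semivariogram value at a fixed lag. The exponential model form, the prior bounds and all numbers below are assumptions of this sketch, not values from the study:

```python
import math
import random

def exp_semivariogram(h, nugget, sill, rng):
    # Exponential semivariogram model; reaches ~95% of the sill at h = rng.
    if h == 0:
        return 0.0
    return nugget + (sill - nugget) * (1.0 - math.exp(-3.0 * h / rng))

random.seed(2)

# Structural-parameter uncertainty: draw the range from a scaled beta prior
# (the study uses beta priors; these bounds, 500-5000 m, are assumed here).
range_draws = [500.0 + 4500.0 * random.betavariate(2, 2) for _ in range(1000)]

# Propagate to the semivariogram value at a fixed lag of 1000 m.
gammas = [exp_semivariogram(1000.0, 0.1, 1.0, r) for r in range_draws]
spread = max(gammas) - min(gammas)
print(f"gamma(1000 m) varies by {spread:.2f} across the range prior")
```

    The spread in gamma(h) induced purely by the range prior is substantial, which is why semivariogram uncertainty can dominate the kriging uncertainty computed for any single fixed semivariogram.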

  9. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    NASA Astrophysics Data System (ADS)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions

  10. Incorporating uncertainty in watershed management decision-making: A mercury TMDL case study

    USGS Publications Warehouse

    Labiosa, W.; Leckie, J.; Shachter, R.; Freyberg, D.; Rytuba, J.; ,

    2005-01-01

    Water quality impairment due to high mercury fish tissue concentrations and high mercury aqueous concentrations is a widespread problem in several sub-watersheds that are major sources of mercury to the San Francisco Bay. Several mercury Total Maximum Daily Load regulations are currently being developed to address this problem. Decisions about control strategies are being made despite very large uncertainties about current mercury loading behavior, relationships between total mercury loading and methyl mercury formation, and relationships between potential controls and mercury fish tissue levels. To deal with the issues of very large uncertainties, data limitations, knowledge gaps, and very limited State agency resources, this work proposes a decision analytical alternative for mercury TMDL decision support. The proposed probabilistic decision model is Bayesian in nature and is fully compatible with a "learning while doing" adaptive management approach. Strategy evaluation, sensitivity analysis, and information collection prioritization are examples of analyses that can be performed using this approach.

  11. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.
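    The "associated variable whose uncertainty interval depends only on the amount of data" can be illustrated with the arcsine variance-stabilizing transform (used here as an assumed stand-in for the paper's construction): for binomial estimates of a probability p from N trials, the standard error of arcsin(sqrt(p_hat)) is approximately 1/(2*sqrt(N)), independent of p, whereas the raw estimate's standard error sqrt(p(1-p)/N) depends on p:

```python
import math
import random

random.seed(3)

def stderr_of_transform(p, n_runs, n_trials, f):
    # Monte Carlo standard error of f(p_hat) over repeated binomial experiments.
    vals = []
    for _ in range(n_runs):
        hits = sum(random.random() < p for _ in range(n_trials))
        vals.append(f(hits / n_trials))
    m = sum(vals) / len(vals)
    return math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))

N = 400
# Raw probability estimates: the spread depends on p itself...
raw_02 = stderr_of_transform(0.2, 2000, N, lambda p: p)
raw_05 = stderr_of_transform(0.5, 2000, N, lambda p: p)
# ...but after the arcsine transform the spread is ~1/(2*sqrt(N)) for any p.
arc_02 = stderr_of_transform(0.2, 2000, N, lambda p: math.asin(math.sqrt(p)))
arc_05 = stderr_of_transform(0.5, 2000, N, lambda p: math.asin(math.sqrt(p)))
print(raw_02 / raw_05, arc_02 / arc_05)  # the second ratio is close to 1
```

    The transformed variable's uncertainty interval shrinks strictly with N and is the same for every p, which is the property the abstract's notion of maximum predictive power requires.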

  12. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  13. Embracing uncertainty, managing complexity: applying complexity thinking principles to transformation efforts in healthcare systems.

    PubMed

    Khan, Sobia; Vandermorris, Ashley; Shepherd, John; Begun, James W; Lanham, Holly Jordan; Uhl-Bien, Mary; Berta, Whitney

    2018-03-21

    Complexity thinking is increasingly being embraced in healthcare, which is often described as a complex adaptive system (CAS). Applying CAS to healthcare as an explanatory model for understanding the nature of the system, and to stimulate changes and transformations within the system, is valuable. A seminar series on systems and complexity thinking hosted at the University of Toronto in 2016 offered a number of insights on applications of CAS perspectives to healthcare that we explore here. We synthesized topics from this series into a set of six insights on how complexity thinking fosters a deeper understanding of accepted ideas in healthcare, applications of CAS to actors within the system, and paradoxes in applications of complexity thinking that may require further debate: 1) a complexity lens helps us better understand the nebulous term "context"; 2) concepts of CAS may be applied differently when actors are cognizant of the system in which they operate; 3) actor responses to uncertainty within a CAS is a mechanism for emergent and intentional adaptation; 4) acknowledging complexity supports patient-centred intersectional approaches to patient care; 5) complexity perspectives can support ways that leaders manage change (and transformation) in healthcare; and 6) complexity demands different ways of implementing ideas and assessing the system. To enhance our exploration of key insights, we augmented the knowledge gleaned from the series with key articles on complexity in the literature. Ultimately, complexity thinking acknowledges the "messiness" that we seek to control in healthcare and encourages us to embrace it. This means seeing challenges as opportunities for adaptation, stimulating innovative solutions to ensure positive adaptation, leveraging the social system to enable ideas to emerge and spread across the system, and even more important, acknowledging that these adaptive actions are part of system behaviour just as much as periods of stability are. 

  14. Key principles to improve programmes and interventions in complementary feeding.

    PubMed

    Lutter, Chessa K; Iannotti, Lora; Creed-Kanashiro, Hilary; Guyon, Agnes; Daelmans, Bernadette; Robert, Rebecca; Haider, Rukhsana

    2013-09-01

    Although there are some examples of successful complementary feeding programmes to promote healthy growth and prevent stunting at the community level, to date there are few, if any, examples of successful programmes at scale. A lack of systematic process and impact evaluations on pilot projects to generate lessons learned has precluded scaling up of effective programmes. Programmes to effect positive change in nutrition rarely follow systematic planning, implementation, and evaluation (PIE) processes to enhance effectiveness over the long term. As a result, a set of programme-oriented key principles to promote healthy growth remains elusive. The purpose of this paper is to fill this gap by proposing a set of principles to improve programmes and interventions to promote healthy growth and development. Identifying such principles for programme success has three requirements: rethinking traditional paradigms used to promote improved infant and young child feeding; ensuring better linkages to delivery platforms; and improving programming. Following the PIE model for programmes and learning from experiences from four relatively large-scale programmes described in this paper, 10 key principles are identified in the areas of programme planning, programme implementation, programme evaluation, and dissemination, replication, and scaling up. Nonetheless, numerous operational research questions remain, some of which are highlighted in this paper. © 2013 John Wiley & Sons Ltd.

  15. The precautionary principle in environmental science.

    PubMed Central

    Kriebel, D; Tickner, J; Epstein, P; Lemons, J; Levins, R; Loechler, E L; Quinn, M; Rudel, R; Schettler, T; Stoto, M

    2001-01-01

    Environmental scientists play a key role in society's responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy. PMID:11673114

  16. A national streamflow network gap analysis

    USGS Publications Warehouse

    Kiang, Julie E.; Stewart, David W.; Archfield, Stacey A.; Osborne, Emily B.; Eng, Ken

    2013-01-01

    The U.S. Geological Survey (USGS) conducted a gap analysis to evaluate how well the USGS streamgage network meets a variety of needs, focusing on the ability to calculate various statistics at locations that have streamgages (gaged) and that do not have streamgages (ungaged). This report presents the results of analysis to determine where there are gaps in the network of gaged locations, how accurately desired statistics can be calculated with a given length of record, and whether the current network allows for estimation of these statistics at ungaged locations. The analysis indicated that there is variability across the Nation’s streamflow data-collection network in terms of the spatial and temporal coverage of streamgages. In general, the Eastern United States has better coverage than the Western United States. The arid Southwestern United States, Alaska, and Hawaii were observed to have the poorest spatial coverage, using the dataset assembled for this study. Except in Hawaii, these areas also tended to have short streamflow records. Differences in hydrology lead to differences in the uncertainty of statistics calculated in different regions of the country. Arid and semiarid areas of the Central and Southwestern United States generally exhibited the highest levels of interannual variability in flow, leading to larger uncertainty in flow statistics. At ungaged locations, information can be transferred from nearby streamgages if there is sufficient similarity between the gaged watersheds and the ungaged watersheds of interest. Areas where streamgages exhibit high correlation are most likely to be suitable for this type of information transfer. The areas with the most highly correlated streamgages appear to coincide with mountainous areas of the United States. Lower correlations are found in the Central United States and coastal areas of the Southeastern United States. Information transfer from gaged basins to ungaged basins is also most likely to be successful
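
    The report's link between record length, interannual variability, and the uncertainty of flow statistics can be illustrated with a minimal sketch; the two stations below are invented, not USGS gages:

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_flow_std_error(annual_flows):
    """Standard error of the estimated mean annual flow.

    Shows why short records and high interannual variability
    both inflate the uncertainty of a flow statistic.
    """
    flows = np.asarray(annual_flows, dtype=float)
    return flows.std(ddof=1) / np.sqrt(flows.size)

# A humid-region gage: long record, low interannual variability.
humid = rng.normal(100.0, 10.0, size=60)
# A semiarid gage: short record, high interannual variability.
arid = rng.normal(100.0, 60.0, size=15)

print(mean_flow_std_error(humid))  # small
print(mean_flow_std_error(arid))   # several times larger
```

    The same reasoning underlies the report's finding that arid regions with short records have the most uncertain statistics.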

  17. Water adsorption on the P-rich GaP(100) surface: optical spectroscopy from first principles

    NASA Astrophysics Data System (ADS)

    May, Matthias M.; Sprik, Michiel

    2018-03-01

    The contact of water with semiconductors typically changes the semiconductor's surface electronic structure by oxidation or corrosion processes. Detailed knowledge of, or even control over, the surface structure is highly desirable, as it impacts the performance of opto-electronic devices from gas-sensing to energy conversion applications. It is also a prerequisite for density functional theory-based modelling of the electronic structure in contact with an electrolyte. The P-rich GaP(100) surface is extraordinary with respect to its contact with gas-phase water, as it undergoes a surface reordering, but does not oxidise. We investigate the underlying changes of the surface in contact with water by means of theoretically derived reflection anisotropy spectroscopy (RAS). A comparison of our results with experiment reveals that a water-induced hydrogen-rich phase on the surface is compatible with the boundary conditions from experiment, reproducing the optical spectra. We discuss potential reaction paths that comprise a water-enhanced hydrogen mobility on the surface. Our results also show that computational RAS, required for the interpretation of experimental signatures, is feasible for GaP in contact with water double layers. Here, RAS is sensitive to surface electric fields, which are an important ingredient of the Helmholtz-layer. This paves the way for future investigations of RAS at the semiconductor–electrolyte interface.

  18. Constrained sampling experiments reveal principles of detection in natural scenes.

    PubMed

    Sebastian, Stephen; Abrams, Jared; Geisler, Wilson S

    2017-07-11

    A fundamental everyday visual task is to detect target objects within a background scene. Using relatively simple stimuli, vision science has identified several major factors that affect detection thresholds, including the luminance of the background, the contrast of the background, the spatial similarity of the background to the target, and uncertainty due to random variations in the properties of the background and in the amplitude of the target. Here we use an experimental approach based on constrained sampling from multidimensional histograms of natural stimuli, together with a theoretical analysis based on signal detection theory, to discover how these factors affect detection in natural scenes. We sorted a large collection of natural image backgrounds into multidimensional histograms, where each bin corresponds to a particular luminance, contrast, and similarity. Detection thresholds were measured for a subset of bins spanning the space, where a natural background was randomly sampled from a bin on each trial. In low-uncertainty conditions, both the background bin and the amplitude of the target were fixed, and, in high-uncertainty conditions, they varied randomly on each trial. We found that thresholds increase approximately linearly along all three dimensions and that detection accuracy is unaffected by background bin and target amplitude uncertainty. The results are predicted from first principles by a normalized matched-template detector, where the dynamic normalizing gain factor follows directly from the statistical properties of the natural backgrounds. The results provide an explanation for classic laws of psychophysics and their underlying neural mechanisms.
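
    The normalized matched-template idea can be sketched in one dimension: the raw template response is divided by the patch's contrast energy, a stand-in for the dynamic gain factor the authors derive from natural-scene statistics. The signal shapes and amplitudes below are invented for illustration:

```python
import numpy as np

def normalized_template_response(patch, template, eps=1e-9):
    """Matched-template response with divisive contrast normalization."""
    patch = np.asarray(patch, dtype=float).ravel()
    template = np.asarray(template, dtype=float).ravel()
    patch_c = patch - patch.mean()               # remove mean luminance
    raw = patch_c @ template                     # matched-template response
    gain = np.sqrt((patch_c ** 2).sum()) + eps   # contrast-energy normalizer
    return raw / gain

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0.0, np.pi, 32))   # toy target profile
background = rng.normal(0.0, 1.0, 32)            # toy natural background

r_absent = normalized_template_response(background, template)
r_present = normalized_template_response(background + 0.8 * template, template)
print(r_absent, r_present)
```

    Because of the divisive gain, the response is invariant to overall background contrast, which is one way such a detector can keep accuracy stable across background bins.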

  19. The ends of uncertainty: Air quality science and planning in Central California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fine, James

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were not used purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications.

  20. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    NASA Astrophysics Data System (ADS)

    Davis, A. D.

    2015-12-01

    The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI): here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity
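
    The general workflow described above, posterior sampling over uncertain parameters followed by propagation to a quantity of interest, can be sketched with a toy one-parameter problem. The "friction" parameter, noise level, and predictive model below are purely illustrative, not the authors' models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in: infer one "basal friction" parameter from noisy data,
# then push posterior samples through a predictive model for the QoI.
true_friction = 1.5
obs = true_friction + rng.normal(0.0, 0.2, size=20)  # noisy observations

def log_posterior(theta):
    # Flat prior; Gaussian likelihood with known noise sigma = 0.2.
    return -0.5 * np.sum((obs - theta) ** 2) / 0.2 ** 2

def metropolis(n_samples, step=0.1, theta0=0.0):
    samples, theta = [], theta0
    lp = log_posterior(theta)
    for _ in range(n_samples):
        prop = theta + rng.normal(0.0, step)      # random-walk proposal
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

samples = metropolis(5000)[1000:]   # drop burn-in
qoi = 10.0 / samples                # toy predictive model: "volume" vs friction
print(samples.mean(), qoi.std())    # posterior mean and QoI uncertainty
```

    The std of `qoi` is the propagated prediction uncertainty; the paper's contribution is making this propagation affordable when the forward and predictive models are expensive.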

  1. Business Simulation Exercises in Small Business Management Education: Using Principles and Ideas from Action Learning

    ERIC Educational Resources Information Center

    Gabrielsson, Jonas; Tell, Joakim; Politis, Diamanto

    2010-01-01

    Recent calls to close the rigour-relevance gap in business school education have suggested incorporating principles and ideas from action learning in small business management education. In this paper we discuss how business simulation exercises can be used as a platform to trigger students' learning by providing them with a platform where they…

  2. Gap Resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labutti, Kurt; Foster, Brian; Lapidus, Alla

    Gap Resolution is a software package that was developed to improve Newbler genome assemblies by automating the closure of sequence gaps caused by repetitive regions in the DNA. This is done by performing the following steps: 1) Identify and distribute the data for each gap in sub-projects. 2) Assemble the data associated with each sub-project using a secondary assembler, such as Newbler or PGA. 3) Determine if any gaps are closed after reassembly, and either design fakes (consensus of closed gap) for those that closed or lab experiments for those that require additional data. The software requires as input a genome assembly produced by the Newbler assembler provided by Roche and 454 data containing paired-end reads.
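
    Step 1 of such a workflow, locating the gap regions before farming them out to sub-projects, might look like the following toy sketch. The package's actual interfaces are not shown; `find_gaps` is a hypothetical helper that scans a consensus string for runs of the ambiguity base 'N':

```python
import re

def find_gaps(consensus, min_len=1):
    """Return (start, end) coordinates of N-runs in an assembly
    consensus, so each gap can be handed to its own sub-project."""
    return [(m.start(), m.end())
            for m in re.finditer(r"N{%d,}" % min_len, consensus)]

contig = "ACGTACGTNNNNNGGGCCTTNNNAAAT"
gaps = find_gaps(contig, min_len=3)
print(gaps)  # [(8, 13), (20, 23)]
```

    Reads whose mates anchor near these coordinates would then be gathered for targeted reassembly in steps 2 and 3.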

  3. Some Aspects of the Implementation of the Principle of Transparency in Russian Universities: Research, Experience, Perspectives

    ERIC Educational Resources Information Center

    Egorov, Evgeny Evgenievich; Lebedeva, Tatiana Evgenievna; Bulganina, Svetlana Viktorovna; Vasilyeva, Lyudmila Ivanovna

    2015-01-01

    The aim of this study is to identify achieved successes, existing gaps and possible prospects of implementing the principle of transparency by Russian universities. It was focused upon the information transparency of educational activities from the perspective of legal requirements and interests of applicants and university students. The analysis…

  4. Gap effects on leaf traits of tropical rainforest trees differing in juvenile light requirement.

    PubMed

    Houter, Nico C; Pons, Thijs L

    2014-05-01

    The relationships of 16 leaf traits and their plasticity with the dependence of tree species on gaps for regeneration (gap association index; GAI) were examined in a Neotropical rainforest. Young saplings of 24 species with varying GAI were grown under a closed canopy, in a medium-sized and in a large gap, thus capturing the full range of plasticity with respect to canopy openness. Structural, biomechanical, chemical and photosynthetic traits were measured. At the chloroplast level, the chlorophyll a/b ratio and plasticity in this variable were not related to the GAI. However, plasticity in total carotenoids per unit chlorophyll was larger in shade-tolerant species. At the leaf level, leaf mass per unit area (LMA) decreased with the GAI under the closed canopy and in the medium gap, but did not significantly decrease with the GAI in the large gap. This was a reflection of the larger plasticity in LMA and leaf thickness of gap-dependent species. The well-known opposite trends in LMA for adaptation and acclimation to high irradiance in evergreen tropical trees were thus not invariably found. Although leaf strength was dependent on LMA and thickness, plasticity in this trait was not related to the GAI. Photosynthetic capacity expressed on each basis increased with the GAI, but the large plasticity in these traits was not clearly related to the GAI. Although gap-dependent species tended to have a greater plasticity overall, as evident from a principal component analysis, leaf traits of gap-dependent species are thus not invariably more phenotypically plastic.

  5. Scientific basis for the Precautionary Principle.

    PubMed

    Vineis, Paolo

    2005-09-01

    The Precautionary Principle is based on two general criteria: (a) appropriate public action should be taken in response to limited, but plausible and credible, evidence of likely and substantial harm; (b) the burden of proof is shifted from demonstrating the presence of risk to demonstrating the absence of risk. Not much has been written about the scientific basis of the precautionary principle, apart from the uncertainty that characterizes epidemiologic research on chronic disease, and the use of surrogate evidence when human evidence cannot be provided. It is proposed in this paper that a new scientific paradigm, based on the theory of evolution, is emerging; this might offer stronger support to the need for precaution in the regulation of environmental risks. Environmental hazards do not consist only in direct attacks to the integrity of DNA or other macromolecules. They can consist in changes that take place already in utero, and that condition disease risks many years later. Also, environmental exposures can act as "stressors", inducing hypermutability (the mutator phenotype) as an adaptive response. Finally, environmental changes should be evaluated against a background of a not-so-easily modifiable genetic make-up, inherited from a period in which humans were mainly hunter-gatherers and had dietary habits very different from the current ones.

  6. Using creation science to demonstrate evolution: application of a creationist method for visualizing gaps in the fossil record to a phylogenetic study of coelurosaurian dinosaurs.

    PubMed

    Senter, P

    2010-08-01

    It is important to demonstrate evolutionary principles in such a way that they cannot be countered by creation science. One such way is to use creation science itself to demonstrate evolutionary principles. Some creation scientists use classic multidimensional scaling (CMDS) to quantify and visualize morphological gaps or continuity between taxa, accepting gaps as evidence of independent creation and accepting continuity as evidence of genetic relatedness. Here, I apply CMDS to a phylogenetic analysis of coelurosaurian dinosaurs and show that it reveals morphological continuity between Archaeopteryx, other early birds, and a wide range of nonavian coelurosaurs. Creation scientists who use CMDS must therefore accept that these animals are genetically related. Other uses of CMDS for evolutionary biologists include the identification of taxa with much missing evolutionary history and the tracing of the progressive filling of morphological gaps in the fossil record through successive years of discovery.
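
    Classic multidimensional scaling itself is a short algorithm: square the dissimilarities, double-centre, and take the top eigenvectors of the resulting Gram matrix. A minimal sketch (the three-point distance matrix is an invented example, not the dinosaur data):

```python
import numpy as np

def classic_mds(D, k=2):
    """Classic (Torgerson) MDS: embed points so that their Euclidean
    distances approximate the dissimilarity matrix D."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]       # keep the largest eigenvalues
    L = np.sqrt(np.clip(vals[order], 0.0, None))
    return vecs[:, order] * L                # n x k coordinates

# Distances between three points lying on a line at 0, 1, and 3.
D = np.array([[0.0, 1.0, 3.0],
              [1.0, 0.0, 2.0],
              [3.0, 2.0, 0.0]])
X = classic_mds(D, k=1)
print(X.ravel())  # collinear points recovered up to sign and shift
```

    In the CMDS plots the creation-science literature uses, morphological "gaps" appear as empty regions between clusters of such recovered coordinates.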

  7. Pediatric disaster response in developed countries: ten guiding principles.

    PubMed

    Brandenburg, Mark A; Arneson, Wendy L

    2007-01-01

    Mass casualty incidents and large-scale disasters involving children are likely to overwhelm a regional disaster response system. Children have unique vulnerabilities that require special considerations when developing pediatric response systems. Although medical and trauma strategies exist for the evaluation and treatment of children on a daily basis, the application of these strategies under conditions of resource-constrained triage and treatment have rarely been evaluated. A recent report, however, by the Institute of Medicine did conclude that on a day-to-day basis the U.S. healthcare system does not adequately provide emergency medical services for children. The variability, scale, and uncertainty of disasters call for a set of guiding principles rather than rigid protocols when developing pediatric response plans. The authors propose the following guiding principles in addressing the well-recognized, unique vulnerabilities of children: (1) terrorism prevention and preparedness, (2) all-hazards preparedness, (3) postdisaster disease and injury prevention, (4) nutrition and hydration, (5) equipment and supplies, (6) pharmacology, (7) mental health, (8) identification and reunification of displaced children, (9) day care and school, and (10) perinatology. It is hoped that the 10 guiding principles discussed in this article will serve as a basic framework for developing pediatric response plans and teams in developed countries.

  8. Dynamic traversal of large gaps by insects and legged robots reveals a template.

    PubMed

    Gart, Sean W; Yan, Changxin; Othayoth, Ratan; Ren, Zhiyi; Li, Chen

    2018-02-02

    It is well known that animals can use neural and sensory feedback via vision, tactile sensing, and echolocation to negotiate obstacles. Similarly, most robots use deliberate or reactive planning to avoid obstacles, which relies on prior knowledge or high-fidelity sensing of the environment. However, during dynamic locomotion in complex, novel, 3D terrains, such as a forest floor and building rubble, sensing and planning suffer from bandwidth limitations and large noise and are sometimes even impossible. Here, we study rapid locomotion over a large gap, a simple, ubiquitous obstacle, to begin to discover the general principles of the dynamic traversal of large 3D obstacles. We challenged the discoid cockroach and an open-loop six-legged robot to traverse a large gap of varying length. Both the animal and the robot could dynamically traverse a gap as large as one body length by bridging the gap with its head, but traversal probability decreased with gap length. Based on these observations, we developed a template that accurately captured body dynamics and quantitatively predicted traversal performance. Our template revealed that a high approach speed, initial body pitch, and initial body pitch angular velocity facilitated dynamic traversal, and successfully predicted a new strategy for using body pitch control that increased the robot's maximal traversal gap length by 50%. Our study established the first template of dynamic locomotion beyond planar surfaces, and is an important step in expanding terradynamics into complex 3D terrains.

  9. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty-both in the functional network edges and the corresponding aggregate measures of network topology-are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here-appropriate for static and dynamic network inference and different statistical measures of coupling-permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.
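
    The trial-resampling approach can be sketched as a bootstrap over trials, here with plain correlation as a placeholder coupling measure (the paper's actual measures and canonical-correlation step are not shown; the data sizes below are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

def edge_confidence(trials, n_boot=500, alpha=0.05):
    """Bootstrap confidence intervals for functional-network edges.

    trials: array of shape (n_trials, n_sensors, n_times).
    Returns (lo, hi) arrays of shape (n_sensors, n_sensors).
    """
    n_trials = trials.shape[0]
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n_trials, n_trials)  # resample trials
        resampled = trials[idx]
        # Concatenate resampled trials in time, then correlate sensors.
        x = resampled.transpose(1, 0, 2).reshape(trials.shape[1], -1)
        stats.append(np.corrcoef(x))
    stats = np.array(stats)
    lo = np.quantile(stats, alpha / 2, axis=0)
    hi = np.quantile(stats, 1 - alpha / 2, axis=0)
    return lo, hi

# Toy data: two coupled sensors plus one independent sensor.
n_trials, n_times = 40, 100
shared = rng.normal(size=(n_trials, n_times))
trials = np.stack([shared + 0.3 * rng.normal(size=(n_trials, n_times)),
                   shared + 0.3 * rng.normal(size=(n_trials, n_times)),
                   rng.normal(size=(n_trials, n_times))], axis=1)
lo, hi = edge_confidence(trials, n_boot=200)
print(lo[0, 1], hi[0, 1])  # interval for the coupled pair
```

    An edge whose interval excludes zero (like the coupled pair here) would be kept in the inferred network; the same resampling extends to aggregate topology measures by computing them per bootstrap network.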

  10. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    PubMed

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  11. Investment, regulation, and uncertainty: managing new plant breeding techniques.

    PubMed

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline.

  12. Measurement uncertainty relations: characterising optimal error bounds for qubits

    NASA Astrophysics Data System (ADS)

    Bullock, T.; Busch, P.

    2018-07-01

    In standard formulations of the uncertainty principle, two fundamental features are typically cast as impossibility statements: two noncommuting observables cannot in general both be sharply defined (for the same state), nor can they be measured jointly. The pioneers of quantum mechanics were acutely aware of, and puzzled by, this fact, and it motivated Heisenberg to seek a mitigation, which he formulated in his seminal paper of 1927. He provided intuitive arguments to show that the values of, say, the position and momentum of a particle can at least be unsharply defined, and they can be measured together provided some approximation errors are allowed. Only now, nine decades later, a working theory of approximate joint measurements is taking shape, leading to rigorous and experimentally testable formulations of associated error tradeoff relations. Here we briefly review this new development, explaining the concepts and steps taken in the construction of optimal joint approximations of pairs of incompatible observables. As a case study, we deduce measurement uncertainty relations for qubit observables using two distinct error measures. We provide an operational interpretation of the error bounds and discuss some of the first experimental tests of such relations.

  13. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  14. Uncertainties in internal gas counting

    NASA Astrophysics Data System (ADS)

    Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.

    2015-06-01

    The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.
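
    Combining independent counting and gas-handling components into a single budget typically follows the root-sum-of-squares rule for uncorrelated standard uncertainties; a sketch with purely illustrative component values (not from this work):

```python
import math

def combined_uncertainty(components):
    """Combine independent standard-uncertainty components in
    quadrature (root-sum-of-squares)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical internal-gas-counting budget, relative components:
counts = 40000
u_counting = math.sqrt(counts) / counts  # Poisson counting statistics
u_spectrum = 0.004                       # spectrum-analysis component
u_gas = 0.006                            # gas-handling component
u_total = combined_uncertainty([u_counting, u_spectrum, u_gas])
print(f"{100 * u_total:.2f} % relative combined standard uncertainty")
# → 0.88 % relative combined standard uncertainty
```

    Note how the gas-handling term dominates here; identifying such dominant components is the point of breaking the budget down.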

  15. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.
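
    Uncertainty propagation through a CALPHAD-type model can be sketched with the simplest case: a binary regular-solution Gibbs energy of mixing with one uncertain interaction parameter, propagated by Monte Carlo. The distribution assigned to `L` below is illustrative, not an assessed value:

```python
import numpy as np

rng = np.random.default_rng(4)
R = 8.314  # gas constant, J/(mol K)

def mixing_gibbs(x, T, L):
    """Regular-solution Gibbs energy of mixing for a binary A-B phase:
    ideal configurational entropy plus one interaction parameter L."""
    return (R * T * (x * np.log(x) + (1 - x) * np.log(1 - x))
            + L * x * (1 - x))

# Hypothetical uncertainty in the assessed interaction parameter,
# propagated to the Gibbs energy at x = 0.5, T = 1000 K.
L_samples = rng.normal(-12000.0, 1500.0, size=10000)  # J/mol, illustrative
G_samples = mixing_gibbs(0.5, 1000.0, L_samples)
print(G_samples.mean(), G_samples.std())  # propagated mean and uncertainty
```

    The same sampling idea, applied to full multicomponent databases, is what carries CALPHAD parameter uncertainty downstream into ICME simulations.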

  16. Assessment of Uncertainties for the NIST 1016 mm Guarded-Hot-Plate Apparatus: Extended Analysis for Low-Density Fibrous-Glass Thermal Insulation.

    PubMed

    Zarr, Robert R

    2010-01-01

    An assessment of uncertainties for the National Institute of Standards and Technology (NIST) 1016 mm Guarded-Hot-Plate apparatus is presented. The uncertainties are reported in a format consistent with current NIST policy on the expression of measurement uncertainty. The report describes a procedure for determination of component uncertainties for thermal conductivity and thermal resistance for the apparatus under operation in either the double-sided or single-sided mode of operation. An extensive example for computation of uncertainties for the single-sided mode of operation is provided for a low-density fibrous-glass blanket thermal insulation. For this material, the relative expanded uncertainty for thermal resistance increases from 1 % for a thickness of 25.4 mm to 3 % for a thickness of 228.6 mm. Although these uncertainties have been developed for a particular insulation material, the procedure and, to a lesser extent, the results are applicable to other insulation materials measured at a mean temperature close to 297 K (23.9 °C, 75 °F). The analysis identifies dominant components of uncertainty and, thus, potential areas for future improvement in the measurement process. For the NIST 1016 mm Guarded-Hot-Plate apparatus, considerable improvement, especially at higher values of thermal resistance, may be realized by developing better control strategies for guarding that include better measurement techniques for the guard gap thermopile voltage and the temperature sensors.

  18. BC8 Silicon (Si-III) is a Narrow-Gap Semiconductor

    NASA Astrophysics Data System (ADS)

    Zhang, Haidong; Liu, Hanyu; Wei, Kaya; Kurakevych, Oleksandr O.; Le Godec, Yann; Liu, Zhenxian; Martin, Joshua; Guerrette, Michael; Nolas, George S.; Strobel, Timothy A.

    2017-04-01

    Large-volume, phase-pure synthesis of BC8 silicon (Ia-3, cI16) has enabled bulk measurements of optical, electronic, and thermal properties. Unlike previous reports that conclude BC8-Si is semimetallic, we demonstrate that this phase is a direct band gap semiconductor with a very small energy gap and moderate carrier concentration and mobility at room temperature, based on far- and mid-infrared optical spectroscopy, temperature-dependent electrical conductivity, Seebeck, and heat capacity measurements. Samples exhibit a plasma wavelength near 11 μm, indicating potential for infrared plasmonic applications. Thermal conductivity is reduced by 1-2 orders of magnitude, depending on temperature, as compared with the diamond cubic (DC-Si) phase. The electronic structure and dielectric properties can be reproduced by first-principles calculations with hybrid functionals after adjusting the level of exact Hartree-Fock (HF) exchange mixing. These results clarify existing limited and controversial experimental data sets and ab initio calculations.

  19. Achieving biodiversity benefits with offsets: Research gaps, challenges, and needs.

    PubMed

    Gelcich, Stefan; Vargas, Camila; Carreras, Maria Jose; Castilla, Juan Carlos; Donlan, C Josh

    2017-03-01

    Biodiversity offsets are becoming increasingly common across a portfolio of settings: national policy, voluntary programs, international lending, and corporate business structures. Given the diversity of ecological, political, and socio-economic systems where offsets may be applied, place-based information is likely to be most useful in designing and implementing offset programs, along with guiding principles that assure best practice. We reviewed the research on biodiversity offsets to explore gaps and needs. While the peer-reviewed literature on offsets is growing rapidly, it is heavily dominated by ecological theory, wetland ecosystems, and U.S.-based research. Given that the majority of offset policies and programs are occurring in middle- and low-income countries, the research gaps we identified present a number of risks. They also present an opportunity to create regionally based learning platforms focused on pilot projects and institutional capacity building. Scientific research should diversify, both topically and geographically, in order to support the successful design, implementation, and monitoring of biodiversity offset programs.

  20. Judging the 'passability' of dynamic gaps in a virtual rugby environment.

    PubMed

    Watson, Gareth; Brault, Sebastien; Kulpa, Richard; Bideau, Benoit; Butterfield, Joe; Craig, Cathy

    2011-10-01

    Affordances have recently been proposed as a guiding principle in perception-action research in sport (Fajen, Riley, & Turvey, 2009). In the present study, perception of the 'passability' affordance of a gap between two approaching defenders in rugby is explored. A simplified rugby gap closure scenario was created using immersive, interactive virtual reality technology where 14 novice participants (attacker) judged the passability of the gap between two virtual defenders via a perceptual judgment (button press) task. The scenario was modeled according to tau theory (Lee, 1976) and a psychophysical function was fitted to the response data. Results revealed that a tau-based informational quantity could account for 82% of the variance in the data. Findings suggest that the passability affordance, in this case, is defined by this variable and that participants were able to use it to inform prospective judgments of passability. These findings contribute to our understanding of affordances and how they may be defined in this particular sporting scenario; however, some limitations regarding methodology, such as the decoupling of perception and action, are also acknowledged. Copyright © 2010 Elsevier B.V. All rights reserved.
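
    The tau-based modeling described above can be sketched as follows: the gap's tau (its current extent divided by its closure rate, i.e. time until the defenders meet) feeds a logistic psychophysical function fitted to the binary passability judgments. The scenario numbers and fitted parameters below are hypothetical, not the study's values.

```python
import math

def tau(gap_size, closure_rate):
    """Time-to-closure of the gap under tau theory (Lee, 1976):
    current gap extent divided by its rate of change (closure_rate > 0)."""
    return gap_size / closure_rate

def psychometric(x, x50, slope):
    """Logistic psychophysical function: probability of a 'passable' judgment
    as a function of the tau-based informational quantity x."""
    return 1.0 / (1.0 + math.exp(-slope * (x - x50)))

# Hypothetical trial: a 2 m gap closing at 4 m/s -> 0.5 s until the defenders meet
t = tau(2.0, 4.0)

# Assumed fitted parameters (x50 = tau at which judgments split 50/50)
p_pass = psychometric(t, x50=0.4, slope=10.0)
print(f"tau = {t:.2f} s, P(judged passable) = {p_pass:.2f}")
```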

  1. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616
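
    The risk-based framing above rests on a simple quantitative core: risk as the probability-weighted sum of harm across possible system states, compared against a tolerability threshold. A minimal sketch, with states and numbers invented for illustration (not from the paper's simulation study):

```python
# Expected annual harm over uncertain states of a coupled human-water system.
scenarios = [
    # (state, annual probability, harm in arbitrary damage units)
    ("severe drought", 0.02, 500.0),
    ("moderate scarcity", 0.10, 80.0),
    ("flood", 0.05, 300.0),
]

expected_annual_harm = sum(p * harm for _, p, harm in scenarios)

# A planner can then compare against a tolerability-of-risk threshold
tolerable = 40.0
verdict = "tolerable" if expected_annual_harm <= tolerable else "intolerable"
print(f"expected annual harm = {expected_annual_harm:.1f} ({verdict})")
```

    Different actors would assign different harms to the same states, which is where the trade-offs in risk among actors enter.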

  2. Comprehensive comparison of gap filling techniques for eddy covariance net carbon fluxes

    NASA Astrophysics Data System (ADS)

    Moffat, A. M.; Papale, D.; Reichstein, M.; Hollinger, D. Y.; Richardson, A. D.; Barr, A. G.; Beckstein, C.; Braswell, B. H.; Churkina, G.; Desai, A. R.; Falge, E.; Gove, J. H.; Heimann, M.; Hui, D.; Jarvis, A. J.; Kattge, J.; Noormets, A.; Stauch, V. J.

    2007-12-01

    We review fifteen techniques for estimating missing values of net ecosystem CO2 exchange (NEE) in eddy covariance time series and evaluate their performance for different artificial gap scenarios based on a set of ten benchmark datasets from six forested sites in Europe. The goal of gap filling is the reproduction of the NEE time series, and hence the present work focuses on estimating missing NEE values, not on editing or removal of suspect values in these time series due to systematic errors in the measurements (e.g., nighttime flux, advection). The gap filling was examined by generating fifty secondary datasets with artificial gaps (ranging in length from single half-hours to twelve consecutive days) for each benchmark dataset and evaluating the performance with a variety of statistical metrics. The performance of the gap filling varied among sites and depended on the level of aggregation (native half-hourly time step versus daily); long gaps were more difficult to fill than short gaps, and differences among the techniques were more pronounced during the day than at night. The non-linear regression techniques (NLRs), the look-up table (LUT), marginal distribution sampling (MDS), and the semi-parametric model (SPM) generally showed good overall performance. The artificial neural network based techniques (ANNs) were generally, if only slightly, superior to the other techniques. The simple interpolation technique of mean diurnal variation (MDV) showed a moderate but consistent performance. Several sophisticated techniques, the dual unscented Kalman filter (UKF), the multiple imputation method (MIM), the terrestrial biosphere model (BETHY), but also one of the ANNs and one of the NLRs, showed high biases which resulted in a low reliability of the annual sums, indicating that additional development might be needed. An uncertainty analysis comparing the estimated random error in the ten benchmark datasets with the artificial gap residuals suggested that the
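
    The simplest of the compared techniques, mean diurnal variation (MDV), can be sketched in a few lines: a missing half-hourly NEE value is replaced by the mean of valid values at the same time of day within a window of surrounding days. The data here are synthetic, and the window size is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_halfhours = 10, 48
nee = rng.normal(0.0, 1.0, size=(n_days, n_halfhours))  # rows: days, cols: half-hours
nee[4, 10] = np.nan  # artificial gap

def fill_mdv(series, day, slot, window=4):
    """Fill series[day, slot] with the mean over +/- window days, same slot."""
    lo, hi = max(0, day - window), min(series.shape[0], day + window + 1)
    neighbours = np.delete(series[lo:hi, slot], day - lo)  # exclude the gap itself
    return np.nanmean(neighbours)

filled = fill_mdv(nee, 4, 10)
print(f"filled value: {filled:.3f}")
```

    More sophisticated techniques (LUT, MDS, ANNs) condition on meteorological drivers rather than only on time of day, which is why they outperform MDV in the comparison.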

  3. Response to Burgman and Regan: the elephant in the rhetoric on info-gap decision theory.

    PubMed

    Sniedovich, Moshe

    2014-01-01

    The formal, rigorous assessment of IGDT in Sniedovich (2012) reveals that this theory's central pillar, namely its robustness model, is a reinvention of a well-established model of local robustness, known universally as the radius of stability (circa 1960). As a matter of fact, this robustness model is a simple model derived from Wald's famous maximin paradigm (circa 1940). This means that had there been any gap in the state of the art that IGDT could have possibly presumed to fill, this gap had already been filled decades ago, well before IGDT was even contemplated. The conclusion therefore is that there is no gap in the state of the art that IGDT does fill, or can possibly fill, or is called upon to fill. Also, since IGDT is based on a definition of local robustness, the theory is unsuitable for the treatment of severe uncertainty of the type that it claims to address. Therefore, since the theory claims to be particularly suitable for the treatment of a severe, unbounded uncertainty, the inevitable conclusion is that this theory constitutes a voodoo decision theory par excellence. Fig. 1 speaks for itself so that no amount of rhetoric can explain this fact away. The Letter's attempt to brush off valid, rigorous, well-documented criticism of IGDT as "... haggling over terminology ..." is yet another attempt to avoid dealing with the elephant in the IGDT room. Nothing will be gained from the use of misleading rhetoric to argue that ideas, models, techniques, approaches, etc., that go back to the 1940s and 1960s, are IGDT innovations. But more than this, what good can come of misapplications of these ideas in applied ecology and conservation biology? In the Appendix, I address a more intriguing question, namely: QUESTION 2: What could possibly be the rationale that motivated a search for a (nonexistent) gap in the state of the art for IGDT to fill?
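
    The "radius of stability" model of local robustness at the center of this critique can be sketched numerically: for a decision x and a nominal parameter estimate, the radius is the largest deviation alpha such that the performance requirement holds for ALL parameter values within alpha of the nominal. The toy problem and numbers below are invented for illustration.

```python
def performance_ok(x, u, threshold=10.0):
    # Requirement: payoff x * u must stay at or above the threshold.
    return x * u >= threshold

def radius_of_stability(x, u_nom, alphas):
    """Largest alpha (from a sorted grid) whose whole interval satisfies the
    requirement. Here the worst case on [u_nom - alpha, u_nom + alpha] is the
    lower endpoint, because the payoff increases with u."""
    best = 0.0
    for a in alphas:
        if performance_ok(x, u_nom - a):
            best = a
        else:
            break
    return best

alphas = [0.1 * i for i in range(1, 31)]
r = radius_of_stability(x=4.0, u_nom=3.0, alphas=alphas)
print(f"radius of stability = {r:.1f}")
```

    The sketch makes the critique's point concrete: the radius measures robustness only to deviations around the nominal estimate, so it says nothing about severe, unbounded uncertainty far from that estimate.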

  4. Band gaps in grid structure with periodic local resonator subsystems

    NASA Astrophysics Data System (ADS)

    Zhou, Xiaoqin; Wang, Jun; Wang, Rongqi; Lin, Jieqiong

    2017-09-01

    The grid structure is widely used in architectural and mechanical fields for its high strength and material economy. This paper presents a study of an acoustic metamaterial beam (AMB), based on the normal square grid structure with local resonators, that offers both flexible band gaps and high static stiffness and thus has high application potential in vibration control. Firstly, the AMB with a variable cross-section frame is analytically modeled by a beam-spring-mass model obtained using the extended Hamilton's principle and Bloch's theorem. This model is used to compute the dispersion relation of the designed AMB in terms of the design parameters, and the influences of the relevant parameters on the band gaps are discussed. Then a two-dimensional finite element model of the AMB is built and analyzed in COMSOL Multiphysics; both the dispersion properties of the unit cell and the wave attenuation in a finite AMB agree well with the derived model. The effects of the design parameters of the two-dimensional model on the band gaps are further examined, and the obtained results verify the analytical model. Finally, the wave attenuation performances of three-dimensional AMBs with equal and unequal thickness are presented and discussed.

  5. Seismic gaps and source zones of recent large earthquakes in coastal Peru

    USGS Publications Warehouse

    Dewey, J.W.; Spence, W.

    1979-01-01

    The earthquakes of central coastal Peru occur principally in two distinct zones of shallow earthquake activity that are inland of and parallel to the axis of the Peru Trench. The interface-thrust (IT) zone includes the great thrust-fault earthquakes of 17 October 1966 and 3 October 1974. The coastal-plate interior (CPI) zone includes the great earthquake of 31 May 1970, and is located about 50 km inland of and 30 km deeper than the interface thrust zone. The occurrence of a large earthquake in one zone may not relieve elastic strain in the adjoining zone, thus complicating the application of the seismic gap concept to central coastal Peru. However, recognition of two seismic zones may facilitate detection of seismicity precursory to a large earthquake in a given zone; removal of probable CPI-zone earthquakes from plots of seismicity prior to the 1974 main shock dramatically emphasizes the high seismic activity near the rupture zone of that earthquake in the five years preceding the main shock. Other conclusions on the seismicity of coastal Peru that affect the application of the seismic gap concept to this region are: (1) Aftershocks of the great earthquakes of 1966, 1970, and 1974 occurred in spatially separated clusters. Some clusters may represent distinct small source regions triggered by the main shock rather than delimiting the total extent of main-shock rupture. The uncertainty in the interpretation of aftershock clusters results in corresponding uncertainties in estimates of stress drop and estimates of the dimensions of the seismic gap that has been filled by a major earthquake. (2) Aftershocks of the great thrust-fault earthquakes of 1966 and 1974 generally did not extend seaward as far as the Peru Trench. (3) None of the three great earthquakes produced significant teleseismic activity in the following month in the source regions of the other two earthquakes. The earthquake hypocenters that form the basis of this study were relocated using station

  6. Technical note: Dynamic INtegrated Gap-filling and partitioning for OzFlux (DINGO)

    NASA Astrophysics Data System (ADS)

    Beringer, Jason; McHugh, Ian; Hutley, Lindsay B.; Isaac, Peter; Kljun, Natascha

    2017-03-01

    Standardised, quality-controlled and robust data from flux networks underpin the understanding of ecosystem processes and tools necessary to support the management of natural resources, including water, carbon and nutrients for environmental and production benefits. The Australian regional flux network (OzFlux) currently has 23 active sites and aims to provide a continental-scale national research facility to monitor and assess Australia's terrestrial biosphere and climate for improved predictions. Given the need for standardised and effective data processing of flux data, we have developed a software suite, called the Dynamic INtegrated Gap-filling and partitioning for OzFlux (DINGO), that enables gap-filling and partitioning of the primary fluxes into ecosystem respiration (Fre) and gross primary productivity (GPP) and subsequently provides diagnostics and results. We outline the processing pathways and methodologies that are applied in DINGO (v13) to OzFlux data, including (1) gap-filling of meteorological and other drivers; (2) gap-filling of fluxes using artificial neural networks; (3) the u* threshold determination; (4) partitioning into ecosystem respiration and gross primary productivity; (5) random, model and u* uncertainties; and (6) diagnostic, footprint calculation, summary and results outputs. DINGO was developed for Australian data, but the framework is applicable to any flux data or regional network. Quality data from robust systems like DINGO ensure the utility and uptake of the flux data and facilitate synergies between flux, remote sensing and modelling.

  7. First-principles study of electronic structure modulations in graphene on Ru(0001) by Au intercalation

    NASA Astrophysics Data System (ADS)

    Nishidate, Kazume; Tanibayashi, Satoru; Yoshimoto, Noriyuki; Hasegawa, Masayuki

    2018-03-01

    First-principles calculations based on density functional theory are used to explore the electronic-structure modulations in graphene on Ru(0001) by Au intercalation. We first use a lattice-matched model to demonstrate that a substantial band gap is induced in graphene by sufficiently strong A-B sublattice symmetry breaking. This band gap opening occurs even in the absence of hybridization between graphene π states and Au states, and a strong sublattice asymmetry is established for a small separation (d) between the graphene and Au layer, typically d < 3.0 Å, which can actually be achieved for a low Au coverage. In realistic situations, which are mimicked using lattice-mismatched models, graphene π states near the Dirac point easily hybridize with nearby (in energy) Au states even for a van der Waals distance, d ≈ 3.4 Å, and this hybridization usually dictates a band gap opening in graphene. In that case, the top parts of the intact Dirac cones survive the hybridization and are isolated to form midgap states within the hybridization gap, denying that the band gap is induced by sublattice symmetry breaking. This feature of a band gap opening is similar to that found for the so-called "first" graphene layer on silicon carbide (SiC) and the predicted band gap and doping level are in good agreement with the experiments for graphene/Au/Ru(0001).

  8. A statistical method for lung tumor segmentation uncertainty in PET images based on user inference.

    PubMed

    Zheng, Chaojie; Wang, Xiuying; Feng, Dagan

    2015-01-01

    PET has been widely accepted as an effective imaging modality for lung tumor diagnosis and treatment. However, standard criteria for delineating the tumor boundary from PET have yet to be developed, largely due to the relatively low quality of PET images, uncertain tumor boundary definition, and the variety of tumor characteristics. In this paper, we propose a statistical solution to segmentation uncertainty on the basis of user inference. We first define the uncertainty segmentation band on the basis of a segmentation probability map constructed from the Random Walks (RW) algorithm; then, based on the extracted features of the user inference, we use Principal Component Analysis (PCA) to formulate the statistical model for labeling the uncertainty band. We validated our method on 10 lung PET-CT phantom studies from the public RIDER collections [1] and 16 clinical PET studies in which tumors were manually delineated by two experienced radiologists. The methods were validated using the Dice similarity coefficient (DSC) to measure spatial volume overlap. Our method achieved an average DSC of 0.878 ± 0.078 on phantom studies and 0.835 ± 0.039 on clinical studies.
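
    The PCA step in the pipeline above can be sketched as follows: feature vectors (e.g. extracted from user inference inside the uncertainty band) are centered and projected onto the directions of largest variance, giving a low-dimensional basis for the statistical labeling model. The feature matrix here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))           # 50 samples, 4 features
Xc = X - X.mean(axis=0)                # center each feature

# Principal axes from the SVD of the centered data matrix
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_variance = S**2 / (X.shape[0] - 1)

# Project onto the first two components for a low-dimensional statistical model
scores = Xc @ Vt[:2].T
print("explained variance:", np.round(explained_variance, 3))
print("scores shape:", scores.shape)
```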

  9. [Research progress of larger flexion gap than extension gap in total knee arthroplasty].

    PubMed

    Zhang, Weisong; Hao, Dingjun

    2017-05-01

    To summarize the progress of research on a larger flexion gap than extension gap in total knee arthroplasty (TKA). The domestic and foreign literature on a larger flexion gap than extension gap in TKA was reviewed, and its influencing factors, biomechanical and kinematic features, and clinical results were summarized. During TKA, adjusting the relation between the flexion gap and the extension gap is one of the key factors of a successful operation. Biomechanical, kinematic, and clinical research shows that a properly larger flexion gap than extension gap can improve both postoperative knee range of motion and patient satisfaction, without affecting the stability of the knee joint. However, there are also contrary findings, so the adjustment of the flexion and extension gaps during TKA is still in dispute. A larger flexion gap than extension gap in TKA is a new joint space theory, and its long-term clinical efficacy, operative skills, and related complications still need further study.

  10. Quantifying catchment water balances and their uncertainties by expert elicitation

    NASA Astrophysics Data System (ADS)

    Sebok, Eva; Refsgaard, Jens Christian; Warmink, Jord J.; Stisen, Simon; Høgh Jensen, Karsten

    2017-04-01

    The increasing demand on water resources necessitates a more responsible and sustainable water management requiring a thorough understanding of hydrological processes both on small scale and on catchment scale. On catchment scale, the characterization of hydrological processes is often carried out by calculating a water balance based on the principle of mass conservation in hydrological fluxes. Assuming a perfect water balance closure and estimating one of these fluxes as a residual of the water balance is a common practice, although this estimate will contain uncertainties related to uncertainties in the other components. Water balance closure on the catchment scale is also an issue in Denmark; thus, it was one of the research objectives of the HOBE hydrological observatory, which has been collecting data in the Skjern river catchment since 2008. Water balance components in the 1050 km2 Ahlergaarde catchment and the nested 120 km2 Holtum catchment, located in the glacial outwash plain of the Skjern catchment, were estimated using a multitude of methods. As the collected data enable a complex assessment of the uncertainty of both the individual water balance components and catchment-scale water balances, the expert elicitation approach was chosen to integrate the results of the hydrological observatory. This approach relies on the subjective opinion of experts whose available knowledge and experience about the subject allows them to integrate complex information from multiple sources. In this study 35 experts were involved in a multi-step elicitation process with the aim of (1) eliciting average annual values of water balance components for two nested catchments and quantifying the contribution of different sources of uncertainties to the total uncertainty in these average annual estimates; (2) calculating water balances for two catchments by reaching consensus among experts interacting in the form of group discussions. To address the complex problem of water balance closure
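
    The residual-estimation practice described above can be sketched with the mass-conservation balance P = ET + Q + dS: the storage change dS is taken as the residual, and its standard uncertainty is propagated in quadrature from the measured terms (assuming independent errors). The annual values and uncertainties below are invented for illustration, not the HOBE estimates.

```python
import math

# Annual water balance terms (mm/yr) with standard uncertainties (illustrative)
P,  u_P  = 980.0, 50.0   # precipitation
ET, u_ET = 560.0, 60.0   # evapotranspiration
Q,  u_Q  = 350.0, 25.0   # stream discharge

dS = P - ET - Q                              # residual storage change
u_dS = math.sqrt(u_P**2 + u_ET**2 + u_Q**2)  # quadrature, independent errors
print(f"dS = {dS:.0f} ± {u_dS:.0f} mm/yr")
```

    Note that the residual's uncertainty exceeds that of any single component, which is exactly why treating one flux as a perfect-closure residual is problematic.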

  11. The Iquique earthquake sequence of April 2014: Bayesian modeling accounting for prediction uncertainty

    USGS Publications Warehouse

    Duputel, Zacharie; Jiang, Junle; Jolivet, Romain; Simons, Mark; Rivera, Luis; Ampuero, Jean-Paul; Riel, Bryan; Owen, Susan E; Moore, Angelyn W; Samsonov, Sergey V; Ortega Culaciati, Francisco; Minson, Sarah E.

    2016-01-01

    The subduction zone in northern Chile is a well-identified seismic gap that last ruptured in 1877. On 1 April 2014, this region was struck by a large earthquake following a two week long series of foreshocks. This study combines a wide range of observations, including geodetic, tsunami, and seismic data, to produce a reliable kinematic slip model of the Mw=8.1 main shock and a static slip model of the Mw=7.7 aftershock. We use a novel Bayesian modeling approach that accounts for uncertainty in the Green's functions, both static and dynamic, while avoiding nonphysical regularization. The results reveal a sharp slip zone, more compact than previously thought, located downdip of the foreshock sequence and updip of high-frequency sources inferred by back-projection analysis. Both the main shock and the Mw=7.7 aftershock did not rupture to the trench and left most of the seismic gap unbroken, leaving the possibility of a future large earthquake in the region.

  12. Gap Junctions

    PubMed Central

    Nielsen, Morten Schak; Axelsen, Lene Nygaard; Sorgen, Paul L.; Verma, Vandana; Delmar, Mario; Holstein-Rathlou, Niels-Henrik

    2013-01-01

    Gap junctions are essential to the function of multicellular animals, which require a high degree of coordination between cells. In vertebrates, gap junctions comprise connexins and currently 21 connexins are known in humans. The functions of gap junctions are highly diverse and include exchange of metabolites and electrical signals between cells, as well as functions, which are apparently unrelated to intercellular communication. Given the diversity of gap junction physiology, regulation of gap junction activity is complex. The structure of the various connexins is known to some extent; and structural rearrangements and intramolecular interactions are important for regulation of channel function. Intercellular coupling is further regulated by the number and activity of channels present in gap junctional plaques. The number of connexins in cell-cell channels is regulated by controlling transcription, translation, trafficking, and degradation; and all of these processes are under strict control. Once in the membrane, channel activity is determined by the conductive properties of the connexin involved, which can be regulated by voltage and chemical gating, as well as a large number of posttranslational modifications. The aim of the present article is to review our current knowledge on the structure, regulation, function, and pharmacology of gap junctions. This will be supported by examples of how different connexins and their regulation act in concert to achieve appropriate physiological control, and how disturbances of connexin function can lead to disease. © 2012 American Physiological Society. Compr Physiol 2:1981-2035, 2012. PMID:23723031

  13. Decision-Making under Criteria Uncertainty

    NASA Astrophysics Data System (ADS)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty are examined, and the decision-making problem under uncertainty is formalized. A modification of a mathematical decision support method under uncertainty via ontologies is proposed; a critical distinction of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the area of multilayer board design. The method is oriented toward improving the technical-economic characteristics of the examined domain.

  14. Embracing uncertainty in applied ecology.

    PubMed

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.
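
    The "use decision theory" recommendation has a simple computational core: rank candidate management actions by their expected outcome over uncertain states of the system instead of planning for a single assumed state. The states, probabilities, and payoffs below are invented for illustration.

```python
# Probability of each uncertain state of the system (illustrative)
states = {"population declining": 0.6, "population stable": 0.4}

# Payoff of each management action under each state (arbitrary benefit units)
outcomes = {
    "cull invasive predator": {"population declining": 8.0, "population stable": 5.0},
    "do nothing":             {"population declining": 2.0, "population stable": 9.0},
}

# Expected payoff of each action, weighted by state probabilities
expected = {
    action: sum(p * payoff[s] for s, p in states.items())
    for action, payoff in outcomes.items()
}
best = max(expected, key=expected.get)
print(expected, "->", best)
```

    Sensitivity of the ranking to the state probabilities is itself informative: if small probability changes flip the best action, reducing that uncertainty has real decision value.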

  15. Water Table Uncertainties due to Uncertainties in Structure and Properties of an Unconfined Aquifer.

    PubMed

    Hauser, Juerg; Wellmann, Florian; Trefry, Mike

    2018-03-01

    We consider two sources of geology-related uncertainty in making predictions of the steady-state water table elevation for an unconfined aquifer: uncertainty in the depth to the base of the aquifer, and uncertainty in the hydraulic conductivity distribution within the aquifer. Stochastic approaches to hydrological modeling commonly use geostatistical techniques to account for hydraulic conductivity uncertainty within the aquifer. In the absence of well data allowing derivation of a relationship between geophysical and hydrological parameters, the use of geophysical data is often limited to constraining the structural boundaries. If we recover the base of an unconfined aquifer from an analysis of geophysical data, then the associated uncertainties are a consequence of the geophysical inversion process. In this study, we illustrate this by quantifying water table uncertainties for the unconfined aquifer formed by the paleochannel network around the Kintyre Uranium deposit in Western Australia. The focus of the Bayesian parametric bootstrap approach employed for the inversion of the available airborne electromagnetic data is the recovery of the base of the paleochannel network and the associated uncertainties. This allows us to quantify the associated influences on the water table in a conceptualized groundwater usage scenario and to compare the resulting uncertainties with those due to an uncertain hydraulic conductivity distribution within the aquifer. Our modeling shows that neither uncertainties in the depth to the base of the aquifer nor hydraulic conductivity uncertainties alone can capture the patterns of uncertainty in the water table that emerge when the two are combined. © 2017, National Ground Water Association.
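
    The parametric bootstrap idea above can be sketched in miniature: resample an inverted parameter from its estimated distribution, push each resample through the forward model, and read prediction uncertainty off the resulting ensemble. The toy forward model and all numbers are invented for illustration, not the study's AEM inversion.

```python
import numpy as np

rng = np.random.default_rng(42)

# Inverted "depth to aquifer base" (m) and its standard error (illustrative)
depth_hat, depth_sigma = 30.0, 2.0
n_boot = 10_000
depths = rng.normal(depth_hat, depth_sigma, n_boot)  # parametric resamples

# Toy forward model: water table elevation rises with aquifer thickness
water_table = 100.0 + 0.2 * depths

# 95% interval of the predicted water table across the bootstrap ensemble
lo, hi = np.percentile(water_table, [2.5, 97.5])
print(f"water table 95% interval: [{lo:.2f}, {hi:.2f}] m")
```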

  16. First principles study of CuAlO2 doping with S

    NASA Astrophysics Data System (ADS)

    Gao, Haigen; Zhou, Jian; Lu, Minghui

    2010-07-01

    We study the electronic properties of CuAlO2 doped with S by first-principles calculations and find that the band gap of CuAlO2 is reduced after doping. At the same time, the effective masses are also reduced and the density of states could cross the Fermi level. These results show that the conductivity of CuAlO2 could be enhanced by doping with S impurities, which needs to be studied further.

  17. Gate-independent energy gap in noncovalently intercalated bilayer graphene on SiC(0001)

    NASA Astrophysics Data System (ADS)

    Li, Yuanchang

    2016-12-01

    Our first-principles calculations show that an energy gap of around 0.12-0.25 eV can be engineered in epitaxial graphene on SiC(0001) through the noncovalent intercalation of transition or alkali metals, though the gap originates from distinct mechanisms: the former is attributed to the combined effects of a metal-induced perpendicular electric field and interaction, while the latter is attributed solely to the built-in electric field. A great advantage of this scheme is that the gap size is almost independent of the gate voltage up to 1 V/nm, thus preserving electrical means to tune the Fermi level of graphene when configured as a field-effect transistor. Given the recent progress in experimental techniques for intercalated graphene, our findings provide a practical way to incorporate graphene in the current semiconductor industry.

  18. Test of Equivalence Principle at 10^-8 Level by a Dual-Species Double-Diffraction Raman Atom Interferometer.

    PubMed

    Zhou, Lin; Long, Shitong; Tang, Biao; Chen, Xi; Gao, Fen; Peng, Wencui; Duan, Weitao; Zhong, Jiaqi; Xiong, Zongyuan; Wang, Jin; Zhang, Yuanzhong; Zhan, Mingsheng

    2015-07-03

    We report an improved test of the weak equivalence principle by using a simultaneous 85Rb-87Rb dual-species atom interferometer. We propose and implement a four-wave double-diffraction Raman transition scheme for the interferometer, and demonstrate its ability to suppress common-mode phase noise of the Raman lasers after their frequencies and intensity ratios are optimized. The statistical uncertainty of the experimental data for the Eötvös parameter η is 0.8×10^-8 at 3200 s. With various systematic errors corrected, the final value is η=(2.8±3.0)×10^-8. The major uncertainty is attributed to the Coriolis effect.

  19. Trajectory formation principles are the same after mild or moderate stroke

    PubMed Central

    van Dokkum, Liesjet Elisabeth Henriette; Froger, Jérôme; Gouaïch, Abdelkader; Laffont, Isabelle

    2017-01-01

    When we make rapid reaching movements, we have to trade speed for accuracy. To do so, the trajectory of our hand is the result of an optimal balance between feed-forward and feed-back control in the face of signal-dependent noise in the sensorimotor system. How far do these principles of trajectory formation still apply after a stroke, for persons with mild to moderate sensorimotor deficits who recovered some reaching ability? Here, we examine the accuracy of fast hand reaching movements with a focus on the information capacity of the sensorimotor system and its relation to trajectory formation in young adults, in persons who had a stroke and in age-matched control participants. We find that persons with stroke follow the same trajectory formation principles, albeit parameterized differently in the face of higher sensorimotor uncertainty. Higher directional errors after a stroke result in less feed-forward control, hence more feed-back loops responsible for segmented movements. As a consequence, movements are globally slower to reach the imposed accuracy, and the information throughput of the sensorimotor system is lower after a stroke. The fact that the most abstract principles of motor control remain after a stroke suggests that clinicians can capitalize on existing theories of motor control and learning to derive principled rehabilitation strategies. PMID:28329000

  20. Genetic concepts and uncertainties in restoring fish populations and species

    USGS Publications Warehouse

    Reisenbichler, R.R.; Utter, F.M.; Krueger, C.C.

    2003-01-01

    Genetic considerations can be crucially important to the success of reintroductions of lotic species. Current paradigms for conservation and population genetics provide guidance for reducing uncertainties in genetic issues and for increasing the likelihood of achieving restoration. Effective restoration is facilitated through specific goals and objectives developed from the definition that a restored or healthy population is (i) genetically adapted to the local environment, (ii) self-sustaining at abundances consistent with the carrying capacity of the river system, (iii) genetically compatible with neighboring populations so that substantial outbreeding depression does not result from straying and interbreeding between populations, and (iv) sufficiently diverse genetically to accommodate environmental variability over many decades. Genetic principles reveal the importance of describing and adhering to the ancestral lineages for the species to be restored and enabling genetic processes to maintain diversity and fitness in the populations under restoration. Newly established populations should be protected from unnecessary human sources of mortality, gene flow from maladapted (e.g., hatchery) or exotic populations, and inadvertent selection by fisheries or other human activities. Such protection facilitates initial, rapid adaptation of the population to its environment and should enhance the chances for persistence. Various uncertainties about specific restoration actions must be addressed on a case-by-case basis. Such uncertainties include whether to allow natural colonization or to introduce fish, which populations are suitable as sources for reintroduction, appropriate levels of gene flow from other populations, appropriate levels of artificial production, appropriate minimum numbers of individuals released or maintained in the population, and the best developmental stages for releasing fish into the restored stream. Rigorous evaluation or

  1. Uncertainty Propagation in OMFIT

    NASA Astrophysics Data System (ADS)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
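    A minimal sketch of the covariant propagation described above (this is not OMFIT code; the profile form, parameter names, and numbers are illustrative), using the linear delta method that the python `uncertainties` package automates:

    ```python
    import numpy as np

    # Sketch only: propagate a covariant fit-parameter uncertainty through a
    # modified-tanh profile f(r) via the linear (delta) method,
    # sigma_f^2 = J C J^T, where J is the Jacobian df/dp.

    def mtanh(r, p):
        h, w, r0 = p  # height, width, symmetry point (illustrative parameters)
        return 0.5 * h * (1.0 - np.tanh((r - r0) / w))

    def propagate(f, params, cov, r, eps=1e-6):
        """Return f and its 1-sigma band from the parameter covariance."""
        p = np.asarray(params, dtype=float)
        J = np.empty((np.size(r), p.size))  # Jacobian df/dp at each radius
        for i in range(p.size):
            dp = np.zeros_like(p)
            dp[i] = eps
            J[:, i] = (f(r, p + dp) - f(r, p - dp)) / (2.0 * eps)
        var = np.einsum('ri,ij,rj->r', J, np.asarray(cov, dtype=float), J)
        return f(r, p), np.sqrt(var)

    r = np.linspace(0.85, 1.0, 5)
    params = [1.0, 0.03, 0.98]                    # illustrative pedestal fit
    cov = np.diag([0.05**2, 0.005**2, 0.002**2])
    cov[0, 1] = cov[1, 0] = 0.5 * 0.05 * 0.005    # correlated height and width
    val, sig = propagate(mtanh, params, cov, r)
    ```

    Profile derivatives, needed for power-balance fluxes, can be propagated the same way by differentiating the fitting function before forming the Jacobian.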

  2. Sources of uncertainty as a basis to fill the information gap in a response to flood

    NASA Astrophysics Data System (ADS)

    Kekez, Toni; Knezic, Snjezana

    2016-04-01

    Taking uncertainties into account in flood risk management remains a challenge due to difficulties in choosing adequate structural and/or non-structural risk management options. Despite such measures, wrong decisions are often made when a flood occurs. The main considerations are parameter and structural uncertainties, which include model and observation errors as well as a lack of knowledge about system characteristics. Real-time flood risk assessment methods are predominantly based on measured water level values and on the vulnerability and other relevant characteristics of the flood-affected area. The goal of this research is to identify sources of uncertainty and to minimize the information gap between the point where the water level is measured and the affected area, taking into consideration the main uncertainties that can affect the risk value at the observed point or section of the river. Sources of uncertainty are identified and determined using a system analysis approach, and the relevant uncertainties are included in the risk assessment model. With such a methodological approach it is possible to extend the available response time through more effective risk assessment that includes an uncertainty propagation model. The response phase could be better planned with adequate early warning systems, resulting in more time and lower costs to help affected areas and save human lives. Reliable and precise information is necessary to raise the emergency operability level in order to enhance the safety of citizens and reduce possible damage. The results of the EPISECC (EU-funded FP7) project are used to validate the potential benefits of this research in order to improve flood risk management and response methods. EPISECC aims at developing a concept of a common European Information Space for disaster response which, among other disasters, considers floods.

  3. Engineering an Insulating Ferroelectric Superlattice with a Tunable Band Gap from Metallic Components

    NASA Astrophysics Data System (ADS)

    Ghosh, Saurabh; Borisevich, Albina Y.; Pantelides, Sokrates T.

    2017-10-01

    The recent discovery of "polar metals" with ferroelectriclike displacements offers the promise of designing ferroelectrics with tunable energy gaps by inducing controlled metal-insulator transitions. Here we employ first-principles calculations to design a metallic polar superlattice from nonpolar metal components and show that controlled intermixing can lead to a true insulating ferroelectric with a tunable band gap. We consider a 2/2 superlattice made of two centrosymmetric metallic oxides, La0.75Sr0.25MnO3 and LaNiO3, and show that ferroelectriclike displacements are induced. The ferroelectriclike distortion is found to be strongly dependent on the carrier concentration (Sr content). Further, we show that a metal-to-insulator (MI) transition is feasible in this system via disproportionation of the Ni sites. Such a disproportionation and, hence, a MI transition can be driven by intermixing of transition metal ions between Mn and Ni layers. As a result, the energy gap of the resulting ferroelectric can be tuned by varying the degree of intermixing in the experimental fabrication method.

  4. Tuning of electronic band gaps and optoelectronic properties of binary strontium chalcogenides by means of doping of magnesium atom(s) - a first-principles-based theoretical initiative with mBJ, B3LYP and WC-GGA functionals

    NASA Astrophysics Data System (ADS)

    Debnath, Bimal; Sarkar, Utpal; Debbarma, Manish; Bhattacharjee, Rahul; Chattopadhyaya, Surya

    2018-02-01

    A first-principles-based theoretical initiative is taken to tune the optoelectronic properties of binary strontium chalcogenide semiconductors by doping magnesium atom(s) into their rock-salt unit cells at specific concentrations x = 0.0, 0.25, 0.50, 0.75 and 1.0, and such tuning is established by studying the structural, electronic and optical properties of the designed binary compounds and ternary alloys employing the WC-GGA, B3LYP and mBJ exchange-correlation functionals. The band structure of each compound is constructed and the respective band gaps under all the potential schemes are measured. The band gap bowing and its microscopic origin are calculated using a quadratic fit and Zunger's approach, respectively. The atomic and orbital origins of the electronic states in the band structure of any compound are explored from its density of states. The nature of the chemical bonds between the constituent atoms in each compound is explored from the valence electron density contour plots. Optical properties of any specimen are explored from the computed spectra of its dielectric function, refractive index, extinction coefficient, normal-incidence reflectivity, optical conductivity, optical absorption and energy loss function. Several calculated results are compared with available experimental and earlier theoretical data.

  5. Innovative surgery and the precautionary principle.

    PubMed

    Meyerson, Denise

    2013-12-01

    Surgical innovation involves practices, such as new devices, technologies, procedures, or applications, which are novel and untested. Although innovative practices are believed to offer an improvement on the standard surgical approach, they may prove to be inefficacious or even dangerous. This article considers how surgeons considering innovation should reason in the conditions of uncertainty that characterize innovative surgery. What attitude to the unknown risks of innovative surgery should they take? The answer to this question involves value judgments about the acceptability of risk taking when satisfactory scientific information is not available. This question has been confronted in legal contexts, where risk aversion in the form of the precautionary principle has become increasingly influential as a regulatory response to innovative technologies that pose uncertain future hazards. This article considers whether it is appropriate to apply a precautionary approach when making decisions about innovative surgery.

  6. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision for a pair of non-commuting observables, and is hence highly nontrivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the presence of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, share thermal entanglement. We show that the temperature and the magnetic field can inflate the measurement uncertainty, owing to the reduction of the system's quantum correlation. Notably, we reveal that, first, the uncertainty is not fully determined by the observed quantum correlation of the system; second, the dynamical behaviors of the measurement uncertainty are distinctly different for ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important for quantum precision measurement in various solid-state systems.
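    For reference (not restated in the abstract), the memory-assisted entropic uncertainty relation of Berta et al. that underlies such analyses reads:

    ```latex
    S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
    \qquad
    c \equiv \max_{x,z} \bigl|\langle \psi_x | \phi_z \rangle\bigr|^{2},
    ```

    where S(X|B) and S(Z|B) are the conditional von Neumann entropies after measuring X or Z on qubit A, c quantifies the complementarity of the two observables, and a negative S(A|B) (entanglement with the memory B) lowers the bound.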

  7. Designing nanomaterials to maximize performance and minimize undesirable implications guided by the Principles of Green Chemistry.

    PubMed

    Gilbertson, Leanne M; Zimmerman, Julie B; Plata, Desiree L; Hutchison, James E; Anastas, Paul T

    2015-08-21

    The Twelve Principles of Green Chemistry were first published in 1998 and provide a framework that has been adopted not only by chemists, but also by design practitioners and decision-makers (e.g., materials scientists and regulators). The development of the Principles was initially motivated by the need to address decades of unintended environmental pollution and human health impacts from the production and use of hazardous chemicals. Yet, for over a decade now, the Principles have been applied to the synthesis and production of engineered nanomaterials (ENMs) and the products they enable. While the combined efforts of the global scientific community have led to promising advances in the field of nanotechnology, there remain significant research gaps and the opportunity to leverage the potential global economic, societal and environmental benefits of ENMs safely and sustainably. As such, this tutorial review benchmarks the successes to date and identifies critical research gaps to be considered as future opportunities for the community to address. A sustainable material design framework is proposed that emphasizes the importance of establishing structure-property-function (SPF) and structure-property-hazard (SPH) relationships to guide the rational design of ENMs. The goal is to achieve or exceed the functional performance of current materials and the technologies they enable, while minimizing inherent hazard to avoid risk to human health and the environment at all stages of the life cycle.

  8. Models in animal collective decision-making: information uncertainty and conflicting preferences

    PubMed Central

    Conradt, Larissa

    2012-01-01

    Collective decision-making plays a central part in the lives of many social animals. Two important factors that influence collective decision-making are information uncertainty and conflicting preferences. Here, I bring together, and briefly review, basic models relating to animal collective decision-making in situations with information uncertainty and in situations with conflicting preferences between group members. The intention is to give an overview about the different types of modelling approaches that have been employed and the questions that they address and raise. Despite the use of a wide range of different modelling techniques, results show a coherent picture, as follows. Relatively simple cognitive mechanisms can lead to effective information pooling. Groups often face a trade-off between decision accuracy and speed, but appropriate fine-tuning of behavioural parameters could achieve high accuracy while maintaining reasonable speed. The right balance of interdependence and independence between animals is crucial for maintaining group cohesion and achieving high decision accuracy. In conflict situations, a high degree of decision-sharing between individuals is predicted, as well as transient leadership and leadership according to needs and physiological status. Animals often face crucial trade-offs between maintaining group cohesion and influencing the decision outcome in their own favour. Despite the great progress that has been made, there remains one big gap in our knowledge: how do animals make collective decisions in situations when information uncertainty and conflict of interest operate simultaneously? PMID:23565335

  9. "He loves me, he loves me not . . . ": uncertainty can increase romantic attraction.

    PubMed

    Whitchurch, Erin R; Wilson, Timothy D; Gilbert, Daniel T

    2011-02-01

    This research qualifies a social psychological truism: that people like others who like them (the reciprocity principle). College women viewed the Facebook profiles of four male students who had previously seen their profiles. They were told that the men (a) liked them a lot, (b) liked them only an average amount, or (c) liked them either a lot or an average amount (uncertain condition). Comparison of the first two conditions yielded results consistent with the reciprocity principle. Participants were more attracted to men who liked them a lot than to men who liked them an average amount. Results for the uncertain condition, however, were consistent with research on the pleasures of uncertainty. Participants in the uncertain condition were most attracted to the men-even more attracted than were participants who were told that the men liked them a lot. Uncertain participants reported thinking about the men the most, and this increased their attraction toward the men.

  10. Preparing Teachers for Uncertainty.

    ERIC Educational Resources Information Center

    Floden, Robert E.; Clark, Christopher M.

    An examination of the various ways in which teaching is uncertain and how uncertainty pervades teachers' lives points out that teachers face uncertainties in their instructional content, ranging from difficult concepts, to unclarity about how teaching might be improved. These forms of uncertainty undermine teachers' authority, creating situations…

  11. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    The measurement uncertainty is a parameter that characterizes the reliability of a result and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not occur with the sampling uncertainty, which has been neglected because it faces several obstacles and there is no clarity on how to perform the procedures, although it is admittedly indispensable to the measurement process. This paper aims at describing the state of the art of sampling uncertainty and at assessing its relevance to measurement uncertainty.
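    In the Eurachem/CITAC framing usually adopted for this decomposition (an assumption here; the abstract does not give a formula), the two contributions combine in quadrature:

    ```latex
    u_{\mathrm{meas}} \;=\; \sqrt{\,u_{\mathrm{sampling}}^{2} + u_{\mathrm{analytical}}^{2}\,},
    ```

    so whichever component dominates controls the overall uncertainty, which is why neglecting sampling can make the reported uncertainty misleadingly small.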

  12. NetCDF-U - Uncertainty conventions for netCDF datasets

    NASA Astrophysics Data System (ADS)

    Bigagli, Lorenzo; Nativi, Stefano; Domenico, Ben

    2013-04-01

    To facilitate the automated processing of uncertain data (e.g. uncertainty propagation in modeling applications), we have proposed a set of conventions for expressing uncertainty information within the netCDF data model and format: the NetCDF Uncertainty Conventions (NetCDF-U). From a theoretical perspective, it can be said that no dataset is a perfect representation of the reality it purports to represent. Inevitably, errors arise from the observation process, including the sensor system and subsequent processing, differences in scales of phenomena and the spatial support of the observation mechanism, lack of knowledge about the detailed conversion between the measured quantity and the target variable. This means that, in principle, all data should be treated as uncertain. The most natural representation of an uncertain quantity is in terms of random variables, with a probabilistic approach. However, it must be acknowledged that almost all existing data resources are not treated in this way. Most datasets come simply as a series of values, often without any uncertainty information. If uncertainty information is present, then it is typically within the metadata, as a data quality element. This is typically a global (dataset wide) representation of uncertainty, often derived through some form of validation process. Typically, it is a statistical measure of spread, for example the standard deviation of the residuals. The introduction of a mechanism by which such descriptions of uncertainty can be integrated into existing geospatial applications is considered a practical step towards a more accurate modeling of our uncertain understanding of any natural process. Given the generality and flexibility of the netCDF data model, conventions on naming, syntax, and semantics have been adopted by several communities of practice, as a means of improving data interoperability. Some of the existing conventions include provisions on uncertain elements and concepts, but, to our

  13. Physical Uncertainty Bounds (PUB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  14. Quantum scattering in one-dimensional systems satisfying the minimal length uncertainty relation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernardo, Reginald Christian S., E-mail: rcbernardo@nip.upd.edu.ph; Esguerra, Jose Perico H., E-mail: jesguerra@nip.upd.edu.ph

    In quantum gravity theories, when the scattering energy is comparable to the Planck energy the Heisenberg uncertainty principle breaks down and is replaced by the minimal length uncertainty relation. In this paper, the consequences of the minimal length uncertainty relation on one-dimensional quantum scattering are studied using an approach involving a recently proposed second-order differential equation. An exact analytical expression for the tunneling probability through a locally-periodic rectangular potential barrier system is obtained. Results show that the existence of a non-zero minimal length uncertainty tends to shift the resonant tunneling energies to the positive direction. Scattering through a locally-periodic potential composed of double-rectangular potential barriers shows that the first band of resonant tunneling energies widens for minimal length cases when the double-rectangular potential barrier is symmetric but narrows down when the double-rectangular potential barrier is asymmetric. A numerical solution which exploits the use of Wronskians is used to calculate the transmission probabilities through the Pöschl–Teller well, Gaussian barrier, and double-Gaussian barrier. Results show that the probability of passage through the Pöschl–Teller well and Gaussian barrier is smaller in the minimal length cases compared to the non-minimal length case. For the double-Gaussian barrier, the probability of passage for energies that are more positive than the resonant tunneling energy is larger in the minimal length cases compared to the non-minimal length case. The approach is exact and applicable to many types of scattering potential.
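    For context (standard in the minimal-length literature, e.g. the Kempf-Mangano-Mann framework, and not restated in the abstract), the modified commutator and the resulting relation are:

    ```latex
    [\hat{x}, \hat{p}] = i\hbar\left(1 + \beta \hat{p}^{2}\right)
    \;\;\Longrightarrow\;\;
    \Delta x \, \Delta p \ge \frac{\hbar}{2}\left(1 + \beta (\Delta p)^{2}\right),
    \qquad
    \Delta x_{\min} = \hbar\sqrt{\beta},
    ```

    so position uncertainty can no longer be made arbitrarily small; β → 0 recovers the ordinary Heisenberg relation.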

  15. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. 
Frequently, new data will
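    A minimal sketch (illustrative model and numbers, not the paper's kinetic system) of the core operation behind polynomial chaos expansions: turning a Gaussian rate-parameter uncertainty into a prediction mean and variance via a 1-D Hermite expansion.

    ```python
    import math
    import numpy as np

    # Sketch: expand a model response y = f(k), with k ~ N(0, 1), in
    # probabilists' Hermite polynomials and read off the prediction
    # mean and (truncated) variance from the spectral coefficients.

    def model(k):
        return np.exp(-0.5 * k) + 0.1 * k**2   # hypothetical smooth response

    # probabilists' Hermite polynomials He_0..He_3
    He = [lambda x: np.ones_like(x),
          lambda x: x,
          lambda x: x**2 - 1.0,
          lambda x: x**3 - 3.0 * x]

    # Gauss-Hermite quadrature for the weight exp(-x^2/2); dividing by
    # sqrt(2*pi) turns the quadrature sums into expectations under N(0, 1)
    nodes, weights = np.polynomial.hermite_e.hermegauss(8)
    weights = weights / np.sqrt(2.0 * np.pi)

    # spectral coefficients c_n = E[f(k) He_n(k)] / n!  (since E[He_n^2] = n!)
    coeffs = [np.sum(weights * model(nodes) * h(nodes)) / math.factorial(n)
              for n, h in enumerate(He)]

    mean = coeffs[0]                                    # E[y]
    variance = sum(math.factorial(n) * coeffs[n]**2     # Var[y], truncated
                   for n in range(1, len(He)))
    ```

    In the multi-parameter kinetic setting the same projection is done over many rate parameters at once, and the coefficients quantify how much each experiment constrains the model.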

  16. New model of the average neutron and proton pairing gaps

    NASA Astrophysics Data System (ADS)

    Madland, David G.; Nix, J. Rayford

    1988-01-01

    By use of the BCS approximation applied to a distribution of dense, equally spaced levels, we derive new expressions for the average neutron pairing gap Δ̄n and average proton pairing gap Δ̄p. These expressions, which contain exponential terms, take into account the dependencies of Δ̄n and Δ̄p upon both the relative neutron excess and the shape of the nucleus. The three constants that appear are determined by a least-squares adjustment to experimental pairing gaps obtained by use of fourth-order differences of measured masses. For this purpose we use the 1986 Audi-Wapstra mid-stream mass evaluation and take into account experimental uncertainties. Our new model explains not only the dependencies of Δ̄n and Δ̄p upon relative neutron excess and nuclear shape, but also the experimental result that for medium and heavy nuclei Δ̄n is generally smaller than Δ̄p. We also introduce a new expression for the average residual neutron-proton interaction energy δ̄ that appears in the masses of odd-odd nuclei, and determine the constant that appears by an analogous least-squares adjustment to experimental mass differences. Our new expressions for Δ̄n, Δ̄p and δ̄ should permit extrapolation of these quantities to heavier nuclei and to nuclei farther removed from the valley of β stability than do previous parameterizations.

  17. Optical spectroscopy and band gap analysis of hybrid improper ferroelectric Ca3Ti2O7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cherian, Judy G.; Harms, Nathan C.; Birol, Turan

    2016-06-27

    We bring together optical absorption spectroscopy, photoconductivity, and first principles calculations to reveal the electronic structure of the room temperature ferroelectric Ca3Ti2O7. The 3.94 eV direct gap in Ca3Ti2O7 is charge transfer in nature and noticeably higher than that in CaTiO3 (3.4 eV), a finding that we attribute to dimensional confinement in the n = 2 member of the Ruddlesden-Popper series. While Sr substitution introduces disorder and broadens the gap edge slightly, oxygen deficiency reduces the gap to 3.7 eV and gives rise to a broad tail that persists to much lower energies.

  18. First-principles study of nitrogen-doped CuAlO2

    NASA Astrophysics Data System (ADS)

    Xu, Ying; Ao, Zhi Min; Yuan, Ding Wang

    2012-08-01

    The electronic structure and formation energies of N-doped CuAlO2 are studied using first-principles calculations. It is found that, when a N atom is doped into CuAlO2, it prefers to substitute for an O atom rather than to occupy an interstitial site in the Cu layer. The substitutional N_O acts as a shallow acceptor, while the interstitial N_i acts as a deep acceptor. The electronic structure shows that N-doping does not alter the band gap of CuAlO2 in either case. In the substitutional case, the N impurity states occur at the top of the valence band maximum (VBM), which provides holes and increases the p-type conductivity. However, in the interstitial case, the N impurity states occur in the middle of the band gap and are more localized, indicating that they are not beneficial for p-type conductivity.

  19. Thermoelectric properties of AgSbTe₂ from first-principles calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rezaei, Nafiseh; Akbarzadeh, Hadi; Hashemifar, S. Javad, E-mail: hashemifar@cc.iut.ac.ir

    2014-09-14

    The structural, electronic, and transport properties of AgSbTe₂ are studied by using full-relativistic first-principles electronic structure calculations and a semiclassical description of transport parameters. The results indicate that, within various exchange-correlation functionals, the cubic Fd-3m and trigonal R-3m structures of AgSbTe₂ are more stable than two other considered structures. The computed Seebeck coefficients at different values of the band gap and carrier concentration are accurately compared with the available experimental data to infer a band gap of about 0.1-0.35 eV for the AgSbTe₂ compound, in agreement with our calculated electronic structure within the hybrid HSE (Heyd-Scuseria-Ernzerhof) functional. By calculating the semiclassical Seebeck coefficient, electrical conductivity, and electronic part of the thermal conductivity, we present the theoretical upper limit of the thermoelectric figure of merit of AgSbTe₂ as a function of temperature and carrier concentration.

  20. First-principles studies of electron transport in Ga2O3

    NASA Astrophysics Data System (ADS)

    Kang, Youngho; Krishnaswamy, Karthik; Peelaers, Hartwin; van de Walle, Chris G.

    Ga2O3 is a wide-gap semiconductor with a monoclinic crystal structure and a band gap of 4.8 eV. Its high carrier mobility and large band gap have attracted a lot of attention for use in high-power electronics and transparent conductors. Despite its potential for adoption in these applications, an understanding of its carrier transport properties is still lacking. In this study we use first-principles calculations to analyze and compute the electron scattering rates in Ga2O3. Scattering due to ionized impurities and polar longitudinal-optical (LO) phonons is taken into account. We find that the electron mobility is nearly isotropic, despite the low-symmetry monoclinic structure of Ga2O3. At low carrier densities (~10^17 cm^-3), the mobility is limited by LO-phonon scattering. Scattering by ionized impurities becomes increasingly important at higher carrier densities. This type of scattering is enhanced when compensating native point defects are present; in particular, gallium vacancies, which are triply negatively charged, can have a strong effect on mobility. These effects explain the downturn in mobility observed in experiments at high carrier densities. This work was supported by ARO and NSF.
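    When several scattering channels act together, their mobility limits are commonly combined via Matthiessen's rule, 1/μ = 1/μ_LO + 1/μ_ii. A small sketch with hypothetical mobilities, not values from the study:

```python
def combined_mobility(mu_lo, mu_ii):
    """Matthiessen's rule: scattering rates add, so 1/mu = 1/mu_LO + 1/mu_ii."""
    return 1.0 / (1.0 / mu_lo + 1.0 / mu_ii)

# Hypothetical limits in cm^2 V^-1 s^-1: at low density, ionized-impurity
# scattering is weak (large mu_ii), so the LO-phonon limit dominates.
mu = combined_mobility(300.0, 3000.0)
```

    As the impurity-limited mobility drops at high carrier density, the combined mobility falls with it, which is the downturn the abstract describes.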

  1. Conservation in the face of climate change: The roles of alternative models, monitoring, and adaptation in confronting and reducing uncertainty

    USGS Publications Warehouse

    Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.

    2011-01-01

    The broad physical and biological principles behind climate change and its potential large-scale ecological impacts on biota are fairly well understood, although likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should proceed in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. © 2010 Elsevier Ltd.

  2. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  3. The Knowledge Gap Versus the Belief Gap and Abstinence-Only Sex Education.

    PubMed

    Hindman, Douglas Blanks; Yan, Changmin

    2015-08-01

    The knowledge gap hypothesis predicts widening disparities in knowledge of heavily publicized public affairs issues among socioeconomic status groups. The belief gap hypothesis extends the knowledge gap hypothesis to account for knowledge and beliefs about politically contested issues based on empirically verifiable information. This analysis of 3 national surveys shows belief gaps developed between liberals and conservatives regarding abstinence-only sex education; socioeconomic status-based knowledge gaps did not widen. The findings partially support both belief gap and knowledge gap hypotheses. In addition, the unique contributions of exposure to Fox News, CNN, and MSNBC in this process were investigated. Only exposure to Fox News was linked to beliefs about abstinence-only sex education directly and indirectly through the cultivation of conservative ideology.

  4. Modelling guidelines--terminology and guiding principles

    NASA Astrophysics Data System (ADS)

    Refsgaard, Jens Christian; Henriksen, Hans Jørgen

    2004-01-01

    Some scientists argue, with reference to Popper's scientific philosophical school, that models cannot be verified or validated. Other scientists and many practitioners nevertheless use these terms, but with very different meanings. In response to an increasing number of examples of model malpractice and mistrust of the credibility of models, several modelling guidelines have been elaborated in recent years with the aim of improving the quality of modelling studies. This gap between the views of, and lack of consensus within, the scientific community on the one hand, and the strongly perceived need for commonly agreed modelling guidelines on the other, is constraining the optimal use and benefits of models. This paper proposes a framework for quality assurance guidelines, including a consistent terminology and a foundation for a methodology bridging the gap between scientific philosophy and pragmatic modelling. A distinction is made between the conceptual model, the model code and the site-specific model. A conceptual model is subject to confirmation or falsification like scientific theories. A model code may be verified within given ranges of applicability and ranges of accuracy, but it can never be universally verified. Similarly, a model may be validated, but only with reference to site-specific applications and to pre-specified performance (accuracy) criteria. Thus, a model's validity will always be limited in terms of space, time, boundary conditions and types of application. This implies a continuous interaction between manager and modeller in order to establish suitable accuracy criteria and predictions associated with uncertainty analysis.

  5. Regression analysis for bivariate gap time with missing first gap time data.

    PubMed

    Huang, Chia-Hui; Chen, Yi-Hau

    2017-01-01

    We consider ordered bivariate gap times when data on the first gap time are unobservable. This study is motivated by the HIV infection and AIDS study, where the initial HIV contracting time is unavailable, but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and the gap time between HIV and AIDS diagnoses. The association between the two gap times is also of interest. Accordingly, in the data analysis we are faced with a two-fold complexity: data on the first gap time are completely missing, and the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap times under this complexity of the data. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large-sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application to data from the HIV and AIDS study mentioned above is reported for illustration.

  6. Effects of correlated parameters and uncertainty in electronic-structure-based chemical kinetic modelling

    NASA Astrophysics Data System (ADS)

    Sutton, Jonathan E.; Guo, Wei; Katsoulakis, Markos A.; Vlachos, Dionisios G.

    2016-04-01

    Kinetic models based on first principles are becoming commonplace in heterogeneous catalysis because of their ability to interpret experimental data, identify the rate-controlling step, guide experiments and predict novel materials. To overcome the tremendous computational cost of estimating parameters of complex networks on metal catalysts, approximate quantum mechanical calculations are employed that render models potentially inaccurate. Here, by introducing correlative global sensitivity analysis and uncertainty quantification, we show that neglecting correlations in the energies of species and reactions can lead to an incorrect identification of influential parameters and key reaction intermediates and reactions. We rationalize why models often underpredict reaction rates and show that, despite the uncertainty being large, the method can, in conjunction with experimental data, identify influential missing reaction pathways and provide insights into the catalyst active site and the kinetic reliability of a model. The method is demonstrated in ethanol steam reforming for hydrogen production for fuel cells.
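    The effect described above, correlated parameter errors partially cancelling in derived quantities, can be illustrated with a toy Monte Carlo on two Arrhenius rates whose activation-energy errors share a correlation coefficient rho. This is a simplified stand-in, not the authors' global sensitivity method, and all numbers are hypothetical:

```python
import math
import random
import statistics

def log_rate_ratio_spread(rho, n_samples=5000, sigma=0.1, seed=0):
    """Std. dev. of log(k1/k2) when the two activation-energy errors (eV)
    are Gaussian with correlation rho; kT in eV at ~298 K."""
    rng = random.Random(seed)
    kT = 0.0257
    draws = []
    for _ in range(n_samples):
        e1 = rng.gauss(0.0, sigma)
        # Build a correlated second error from e1 plus an independent part.
        e2 = rho * e1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, sigma)
        draws.append(-(e1 - e2) / kT)   # Arrhenius: ln k = ln A - Ea/kT
    return statistics.stdev(draws)

spread_correlated = log_rate_ratio_spread(rho=0.9)    # errors move together
spread_independent = log_rate_ratio_spread(rho=0.0)   # errors independent
```

    Correlated errors largely cancel in the ratio, so ignoring the correlation (rho = 0) overstates the uncertainty and can misrank which parameters matter.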

  7. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  8. The Precautionary Principle: implications for risk management strategies.

    PubMed

    Saltelli, Andrea; Funtowicz, Silvio

    2004-01-01

    The European Commission has published a Communication on the Precautionary Principle and a White Book on Governance. These provide us (as research civil servants of the Commission) with an institutional framework for handling scientific information that is often incomplete, uncertain, and contested. But, although the Precautionary Principle is intuitively straightforward to understand, there is no agreed way of applying it to real decision-making. To meet this perceived need, researchers have proposed a vast number of taxonomies. These include ignorance auditing, type one-two-three errors, a combination of uncertainty and decision stakes through post-normal science, and the plotting of ignorance of probabilities against ignorance of consequences. Any of these could be used to define a precautionary principle region inside a multidimensional space and to position an issue within that region. The role of anticipatory research is clearly critical, but scientific input is only part of the picture. It is difficult to imagine an issue where the application of the Precautionary Principle would be non-contentious. From genetically-modified food to electro-smog, from climate change to hormone growth in meat, it is clear that: 1) risk and cost-benefit are only part of the picture; 2) there are ethical issues involved; 3) there is a plurality of interests and perspectives that are often in conflict; 4) there will be losers and winners whatever decision is made. Operationalization of the Precautionary Principle must preserve transparency. Only in this way will the incommensurable costs and benefits associated with different stakeholders be registered. A typical decision will include the following sorts of considerations: 1) the commercial interests of companies and the communities that depend on them; 2) the worldviews of those who might want a greener, less consumerist society and/or who believe in the sanctity of human or animal life; 3) potential benefits such as enabling the

  9. Laboratory Studies on Surface Sampling of Bacillus anthracis Contamination: Summary, Gaps, and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Amidan, Brett G.; Hu, Rebecca

    2011-11-28

    This report summarizes previous laboratory studies to characterize the performance of methods for collecting, storing/transporting, processing, and analyzing samples from surfaces contaminated by Bacillus anthracis or related surrogates. The focus is on plate culture and count estimates of surface contamination for swab, wipe, and vacuum samples of porous and nonporous surfaces. Summaries of the previous studies and their results were assessed to identify gaps in the information needed as inputs to calculate key parameters critical to risk management in biothreat incidents. One key parameter is the number of samples needed to make characterization or clearance decisions with specified statistical confidence. Other key parameters include the ability to calculate, following contamination incidents, (1) estimates of Bacillus anthracis contamination, as well as the bias and uncertainties in the estimates, and (2) confidence in characterization and clearance decisions for contaminated or decontaminated buildings. Gaps in knowledge and understanding identified during the summary of the studies are discussed, and recommendations are given for future studies.

  10. The uncertainty room: strategies for managing uncertainty in a surgical waiting room.

    PubMed

    Stone, Anne M; Lammers, John C

    2012-01-01

    To describe experiences of uncertainty and management strategies for staff working with families in a hospital waiting room. A 288-bed, nonprofit community hospital in a Midwestern city. Data were collected during individual, semistructured interviews with 3 volunteers, 3 technical staff members, and 1 circulating nurse (n = 7), and during 40 hours of observation in a surgical waiting room. Interview transcripts were analyzed using constant comparative techniques. The surgical waiting room represents the intersection of several sources of uncertainty that families experience. Findings also illustrate the ways in which staff manage the uncertainty of families in the waiting room by communicating support. Staff in surgical waiting rooms are responsible for managing family members' uncertainty related to insufficient information. Practically, this study provided some evidence that staff are expected to help manage the uncertainty that is typical in a surgical waiting room, further highlighting the important role of communication in improving family members' experiences.

  11. The Thermal Conductivity of Earth's Core: A Key Geophysical Parameter's Constraints and Uncertainties

    NASA Astrophysics Data System (ADS)

    Williams, Q.

    2018-05-01

    The thermal conductivity of iron alloys at high pressures and temperatures is a critical parameter in governing (a) the present-day heat flow out of Earth's core, (b) the inferred age of Earth's inner core, and (c) the thermal evolution of Earth's core and lowermost mantle. It is, however, one of the least well-constrained important geophysical parameters, with current estimates for end-member iron under core-mantle boundary conditions varying by about a factor of 6. Here, the current state of calculations, measurements, and inferences that constrain thermal conductivity at core conditions are reviewed. The applicability of the Wiedemann-Franz law, commonly used to convert electrical resistivity data to thermal conductivity data, is probed: here, whether the constant of proportionality, the Lorenz number, is constant at extreme conditions is of vital importance. Electron-electron inelastic scattering and increases in Fermi-liquid-like behavior may cause uncertainties in thermal conductivities derived from both first-principles-associated calculations and electrical conductivity measurements. Additional uncertainties include the role of alloying constituents and local magnetic moments of iron in modulating the thermal conductivity. Thus, uncertainties in thermal conductivity remain pervasive, and hence a broad range of core heat flows and inner core ages appear to remain plausible.
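    The Wiedemann-Franz conversion discussed above is κ = L T / ρ. A minimal sketch assuming the free-electron (Sommerfeld) Lorenz number, which is precisely the assumption the review questions at extreme conditions; the resistivity and temperature are illustrative, not values endorsed by the review:

```python
LORENZ_SOMMERFELD = 2.44e-8  # W Ohm K^-2, free-electron value

def wiedemann_franz_kappa(resistivity, temperature, lorenz=LORENZ_SOMMERFELD):
    """kappa = L * T / rho; only as good as the assumed Lorenz number."""
    return lorenz * temperature / resistivity

# Illustrative core-like conditions: rho = 1e-6 Ohm m, T = 4000 K.
kappa = wiedemann_franz_kappa(1e-6, 4000.0)
```

    Any deviation of the true Lorenz number from the Sommerfeld value (from inelastic scattering, alloying, or magnetism) propagates linearly into the inferred conductivity, which is why the factor-of-6 spread persists.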

  12. Long Term Uncertainty Investigations of 1 MN Force Calibration Machine at NPL, India (NPLI)

    NASA Astrophysics Data System (ADS)

    Kumar, Rajesh; Kumar, Harish; Kumar, Anil; Vikram

    2012-01-01

    The present paper studies the long-term uncertainty of the 1 MN hydraulic multiplication system (HMS) force calibration machine (FCM) at the National Physical Laboratory, India (NPLI), which is used for calibration of force-measuring instruments in the range 100 kN - 1 MN. The 1 MN HMS FCM was installed at NPLI in 1993 and is built on the principle of hydraulic amplification of dead weights. The best measurement capability (BMC) of the machine is ± 0.025% (k = 2), and it is traceable to national standards by means of precision force transfer standards (FTS). The present study discusses the uncertainty variations of the 1 MN HMS FCM over the years and also describes other relevant parameters in detail. The 1 MN HMS FCM was calibrated in the years 2004, 2006, 2007, 2008, 2009 and 2010, and the results are reported.
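    A coverage factor k = 2, as in the quoted BMC of ± 0.025%, corresponds to GUM-style expanded uncertainty: independent standard-uncertainty components combined in quadrature, then multiplied by k. A sketch with hypothetical components, not NPLI's actual uncertainty budget:

```python
import math

def expanded_uncertainty(components, k=2.0):
    """GUM-style: combine independent standard uncertainties in quadrature,
    then multiply by the coverage factor k (k = 2 gives ~95% coverage)."""
    combined = math.sqrt(sum(u * u for u in components))
    return k * combined

# Hypothetical relative components in % (dead weights, hydraulic ratio,
# indicator): the expanded result stays within the 0.025% BMC.
U = expanded_uncertainty([0.008, 0.006, 0.005])
```

    Tracking such a budget year by year is what a long-term uncertainty study of a calibration machine amounts to.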

  13. Smoothing and gap-filling of high resolution multi-spectral time series: Example of Landsat data

    NASA Astrophysics Data System (ADS)

    Vuolo, Francesco; Ng, Wai-Tim; Atzberger, Clement

    2017-05-01

    This paper introduces a novel methodology for generating 15-day, smoothed and gap-filled time series of high spatial resolution data. The approach uses templates derived from high-quality observations to fill data gaps; the filled series are subsequently filtered. We tested our method for one large contiguous area (Bavaria, Germany) and for nine smaller test sites in different ecoregions of Europe using Landsat data. Overall, our results match the validation dataset to a high degree of accuracy, with a mean absolute error (MAE) of 0.01 for visible bands, 0.03 for near-infrared and 0.02 for short-wave-infrared. Occasionally, the reconstructed time series are affected by artefacts due to undetected clouds. Less frequently, larger uncertainties occur as a result of extended periods of missing data. Reliable cloud masks are highly warranted for making full use of time series.
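    The fill-then-filter idea can be caricatured in a few lines: interpolate across missing observations, then smooth. The stand-in below uses plain linear interpolation and a centred moving average rather than the authors' template-based method; the series is a toy example:

```python
def fill_gaps(series):
    """Linearly interpolate None gaps between the nearest valid neighbours;
    gaps at the edges take the nearest valid value."""
    filled = list(series)
    valid = [i for i, v in enumerate(filled) if v is not None]
    for i, v in enumerate(filled):
        if v is not None:
            continue
        left = max((j for j in valid if j < i), default=None)
        right = min((j for j in valid if j > i), default=None)
        if left is None:
            filled[i] = filled[right]
        elif right is None:
            filled[i] = filled[left]
        else:
            w = (i - left) / (right - left)
            filled[i] = (1 - w) * filled[left] + w * filled[right]
    return filled

def moving_average(values, window=3):
    """Centred moving average as a simple stand-in for the paper's filtering."""
    half = window // 2
    out = []
    for i in range(len(values)):
        seg = values[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

ndvi = [0.2, None, 0.4, 0.5, None, None, 0.8]   # toy reflectance series
smoothed = moving_average(fill_gaps(ndvi))
```

    Long runs of None in the toy series play the role of the "extended periods of missing data" that the paper identifies as the main source of larger uncertainties.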

  14. Uncertainty prediction for PUB

    NASA Astrophysics Data System (ADS)

    Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.

    2003-04-01

    IAHS’ initiative of Prediction in Ungaged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of uncertainty prediction which could be linked with new blueprints for PUB, thereby showing how equifinality-based models should be grasped using practical strategies of gauging like the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of Potiribu Project, which is a NCE layout at representative basins of a suptropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope (0,125 km2) and at the mesoscale (0,125 - 560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables with changing likelihood surfaces of experiments using Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates a multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) process. In this way, STS addresses uncertainty-bounds of model parameters, into an upscaling process at the hillslope. In the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidences from Potiribu NCE layout show novel pathways of uncertainty prediction under a PUB perspective in representative basins of world biomes.

  15. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition about fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone tectonics similar to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed

  16. Understanding of sub-band gap absorption of femtosecond-laser sulfur hyperdoped silicon using synchrotron-based techniques

    PubMed Central

    Limaye, Mukta V.; Chen, S. C.; Lee, C. Y.; Chen, L. Y.; Singh, Shashi B.; Shao, Y. C.; Wang, Y. F.; Hsieh, S. H.; Hsueh, H. C.; Chiou, J. W.; Chen, C. H.; Jang, L. Y.; Cheng, C. L.; Pong, W. F.; Hu, Y. F.

    2015-01-01

    The correlation between sub-band gap absorption and the chemical states and electronic and atomic structures of S-hyperdoped Si has been extensively studied, using synchrotron-based x-ray photoelectron spectroscopy (XPS), x-ray absorption near-edge spectroscopy (XANES), extended x-ray absorption fine structure (EXAFS), valence-band photoemission spectroscopy (VB-PES) and first-principles calculations. S 2p XPS spectra reveal that the S-hyperdoped Si with the greatest (~87%) sub-band gap absorption contains the highest concentration of S2− (monosulfide) species. Annealing S-hyperdoped Si reduces the sub-band gap absorptance and the concentration of S2− species, but significantly increases the concentration of larger S clusters [polysulfides (Sn2−, n > 2)]. The Si K-edge XANES spectra show that S hyperdoping in Si increases (decreases) the occupied (unoccupied) electronic density of states at/above the conduction-band minimum. VB-PES spectra clearly reveal that the S-dopants not only form an impurity band deep within the band gap, giving rise to the sub-band gap absorption, but also cause the insulator-to-metal transition in S-hyperdoped Si samples. Based on the experimental results and the calculations by density functional theory, the chemical state of the S species and the formation of the S-dopant states in the band gap of Si are critical in determining the sub-band gap absorptance of hyperdoped Si samples. PMID:26098075

  17. Thermodynamic properties of ideal Fermi gases in a harmonic potential in an n-dimensional space under the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Li, Heling; Ren, Jinxiu; Wang, Wenwei; Yang, Bin; Shen, Hongjun

    2018-02-01

    Using the semi-classical (Thomas-Fermi) approximation, the thermodynamic properties of ideal Fermi gases in a harmonic potential in an n-dimensional space are studied under the generalized uncertainty principle (GUP). The mean particle number, internal energy, heat capacity and other thermodynamic variables of the Fermi system are calculated analytically. Then, analytical expressions for the mean particle number, internal energy, heat capacity, chemical potential, Fermi energy, ground state energy and the amendments due to the GUP are obtained at low temperatures. The influence of both the GUP and the harmonic potential on the thermodynamic properties of a copper-electron gas and other systems with higher electron densities is studied numerically at low temperatures. We find: (1) When the GUP is considered, the influence of the harmonic potential is very much larger, and the amendments produced by the GUP increase by eight to nine orders of magnitude compared to when no external potential is applied to the electron gas. (2) The larger the particle density, or the smaller the particle masses, the bigger the influence of the GUP. (3) The effect of the GUP increases with the increase in the spatial dimensions. (4) The amendments of the chemical potential, Fermi energy and ground state energy increase with an increase in temperature, while the heat capacity decreases. T_F0 is the Fermi temperature of the ideal Fermi system in a harmonic potential. When the temperature is lower than a certain value (0.22 T_F0 for the copper-electron gas, a value that decreases with increasing electron density), the amendment to the internal energy is positive; however, the amendment decreases with increasing temperature. When the temperature reaches that value, the amendment is zero, and when the temperature is higher, the amendment to the internal energy is negative and its absolute value increases with increasing temperature. (5) When electron

  18. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  19. Quantum test of the equivalence principle for atoms in coherent superposition of internal energy states

    PubMed Central

    Rosi, G.; D'Amico, G.; Cacciapuoti, L.; Sorrentino, F.; Prevedelli, M.; Zych, M.; Brukner, Č.; Tino, G. M.

    2017-01-01

    The Einstein equivalence principle (EEP) has a central role in the understanding of gravity and space–time. In its weak form, or weak equivalence principle (WEP), it directly implies equivalence between inertial and gravitational mass. Verifying this principle in a regime where the relevant properties of the test body must be described by quantum theory has profound implications. Here we report on a novel WEP test for atoms: a Bragg atom interferometer in a gravity gradiometer configuration compares the free fall of rubidium atoms prepared in two hyperfine states and in their coherent superposition. The use of the superposition state allows testing genuine quantum aspects of EEP with no classical analogue, which have remained completely unexplored so far. In addition, we measure the Eötvös ratio of atoms in two hyperfine levels with relative uncertainty in the low 10^-9 range, improving previous results by almost two orders of magnitude. PMID:28569742
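    The Eötvös ratio quoted above compares the free-fall accelerations a1, a2 of the two test bodies (here, the two internal states) as η = 2(a1 − a2)/(a1 + a2). A minimal sketch with hypothetical accelerations differing at the 10^-9 level:

```python
def eotvos_ratio(a1, a2):
    """eta = 2 * (a1 - a2) / (a1 + a2) for the free-fall accelerations of
    two test bodies (here, atoms in two internal states)."""
    return 2.0 * (a1 - a2) / (a1 + a2)

# Hypothetical accelerations (m/s^2) differing at the 1e-9 relative level:
eta = eotvos_ratio(9.806650000, 9.806650010)
```

    η = 0 corresponds to exact universality of free fall; the experiment bounds |η| at the low-10^-9 level for the two hyperfine states.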

  20. Spin asymmetric band gap opening in graphene by Fe adsorption

    NASA Astrophysics Data System (ADS)

    del Castillo, E.; Cargnoni, F.; Achilli, S.; Tantardini, G. F.; Trioni, M. I.

    2015-04-01

    The adsorption of an Fe atom on graphene is studied by first-principles density functional theory. The structural, electronic, and magnetic properties are analyzed at different coverages, all preserving C6v symmetry for the Fe adatom. We observe that binding energies, magnetic moments, and adsorption distances rapidly converge as the size of the supercell increases. Among the considered supercells, those constituted by 3n graphene unit cells show a very peculiar behavior: the adsorption of an Fe atom induces the opening of a spin-dependent gap in the band structure. In particular, the gap amounts to tenths of an eV in the majority spin component, while in the minority one it has a width of about 1 eV for the 3 × 3 supercell and remains significant even at very low coverages (0.25 eV for θ ≃ 2%). The charge redistribution upon Fe adsorption has also been analyzed according to state-of-the-art formalisms, indicating an appreciable charge transfer from Fe to the graphene layer.

  1. Equilibration and analysis of first-principles molecular dynamics simulations of water

    NASA Astrophysics Data System (ADS)

    Dawson, William; Gygi, François

    2018-03-01

    First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
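    The Gelman-Rubin diagnostic cited above compares within-chain and between-chain variance; values near 1 indicate that the independent runs sample the same distribution. A sketch of the textbook formula (without the chain-splitting refinements of modern variants; the toy data below are not from the simulations):

```python
import statistics

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat (Gelman & Rubin, 1992) for m
    equal-length chains; values near 1 suggest the runs have equilibrated."""
    m = len(chains)
    n = len(chains[0])
    means = [statistics.fmean(c) for c in chains]
    grand = statistics.fmean(means)
    B = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)   # between-chain
    W = statistics.fmean([statistics.variance(c) for c in chains])  # within-chain
    var_hat = (n - 1) / n * W + B / n   # pooled posterior-variance estimate
    return (var_hat / W) ** 0.5

# Two toy "chains" fluctuating around the same mean: R-hat should be near 1.
chains = [[1.0, 1.1, 0.9, 1.05, 0.95], [1.02, 0.98, 1.1, 0.9, 1.0]]
rhat = gelman_rubin(chains)
```

    Applied to a monitored quantity such as the Kohn-Sham energy across the 32 independent runs, a diagnostic of this kind flags how much of each trajectory to discard as equilibration.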

  2. Equilibration and analysis of first-principles molecular dynamics simulations of water.

    PubMed

    Dawson, William; Gygi, François

    2018-03-28

    First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
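
    The Gelman-Rubin diagnostic referenced above can be sketched for a single scalar observable (e.g., the Kohn-Sham energy) sampled from several independent runs. The following is a minimal illustration on synthetic data, not the authors' analysis code; the chain shapes and values are assumptions.

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for m chains of length n.

    chains: array of shape (m, n) holding one scalar observable sampled
    from m independent simulations. Values near 1 indicate equilibration.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    grand_mean = chain_means.mean()
    # Between-chain variance B and mean within-chain variance W.
    B = n / (m - 1) * np.sum((chain_means - grand_mean) ** 2)
    W = chains.var(axis=1, ddof=1).mean()
    # Pooled estimate of the stationary variance.
    var_plus = (n - 1) / n * W + B / n
    return float(np.sqrt(var_plus / W))

rng = np.random.default_rng(0)
equilibrated = rng.normal(0.0, 1.0, size=(4, 1000))    # stationary chains
drifting = equilibrated + 5.0 * np.arange(4)[:, None]  # disagreeing chains
print(gelman_rubin(equilibrated))  # close to 1
print(gelman_rubin(drifting))      # well above 1
```

    A run is commonly judged equilibrated once R-hat drops below a threshold such as 1.1; that cutoff is a convention, not a prescription from the paper.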

  3. Quantum spin Hall insulator BiXH (XH = OH, SH) monolayers with a large bulk band gap.

    PubMed

    Hu, Xing-Kai; Lyu, Ji-Kai; Zhang, Chang-Wen; Wang, Pei-Ji; Ji, Wei-Xiao; Li, Ping

    2018-05-16

    A large bulk band gap is critical for the application of two-dimensional topological insulators (TIs) in spintronic devices operating at room temperature. On the basis of first-principles calculations, we predict BiXH (XH = OH, SH) monolayers to be TIs with extraordinarily large bulk gaps of 820 meV for BiOH and 850 meV for BiSH, and propose a tight-binding model including spin-orbit coupling to describe the electronic properties of BiXH. These large gaps are entirely due to the strong spin-orbit interaction associated with the pxy orbitals of the Bi atoms of the honeycomb lattice. An orbital filtering mechanism explains the topological properties of BiXH: the XH groups simply remove one branch of orbitals (the pz of Bi), reducing the trivial six-band lattice to a four-band one, which is topologically non-trivial. The topological character of the BiXH monolayers is confirmed by a nonzero topological invariant Z2 and a single pair of gapless helical edge states in the bulk gap. Owing to these features, the large-gap BiXH monolayer TIs are an ideal platform for realizing many exotic phenomena and fabricating new quantum devices working at room temperature.

  4. [Ethics, empiricism and uncertainty].

    PubMed

    Porz, R; Zimmermann, H; Exadaktylos, A K

    2011-01-01

    Accidents can lead to difficult boundary situations, which often arise in emergency units. The medical team thus often and inevitably faces professional uncertainty in its decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties can lead to alert and prudent decisions. Uncertainty can thus have ethical value in the treatment or withdrawal of treatment. It need not be covered by evidence-based arguments, especially as some singular situations of individual tragedy cannot be grasped in terms of evidence-based medicine. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. Up to now, this kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework under more general uncertainty conditions that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  6. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.

  7. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    In this study, a framework for estimating experimental measurement uncertainties for a Homogeneous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rates, and the intake/exhaust dry molar fractions, are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of the uncertainties of the mass-average temperature and composition at IVC and throughout the cycle, and also of engine performance metrics such as gross Integrated Mean Effective Pressure, Heat Release, and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility, and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  8. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE PAGES

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    2017-03-28

    In this study, a framework for estimating experimental measurement uncertainties for a Homogeneous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rates, and the intake/exhaust dry molar fractions, are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of the uncertainties of the mass-average temperature and composition at IVC and throughout the cycle, and also of engine performance metrics such as gross Integrated Mean Effective Pressure, Heat Release, and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility, and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.
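
    The Monte Carlo propagation step described above can be sketched for a single input: an in-cylinder pressure trace propagated into gross IMEP. The crank-angle grid, pressure model, slider-crank volume curve, and noise levels below are illustrative assumptions, not values or code from the facility described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative crank-angle grid, pressure trace, and geometry (all assumed).
theta = np.linspace(-np.pi, np.pi, 721)                     # crank angle, rad
p_true = 20.0 + 30.0 * np.exp(-((theta - 0.2) / 0.4) ** 2)  # pressure, bar

r_comp, Vd = 14.0, 0.5e-3                   # compression ratio, displacement (m^3)
Vc = Vd / (r_comp - 1.0)
V = Vc + 0.5 * Vd * (1.0 - np.cos(theta))   # crude cylinder volume curve, m^3

def imep(p_bar):
    """Gross IMEP (bar) from the closed-cycle p-dV (trapezoidal) integral."""
    p_pa = p_bar * 1.0e5
    work = np.sum(0.5 * (p_pa[1:] + p_pa[:-1]) * np.diff(V))  # J
    return work / Vd / 1.0e5

# Monte Carlo: point-wise sensor noise plus a per-cycle pegging (offset) error.
samples = np.array([
    imep(p_true + rng.normal(0.0, 0.1, theta.size) + rng.normal(0.0, 0.5))
    for _ in range(2000)
])
print(f"IMEP = {samples.mean():.3f} +/- {samples.std():.3f} bar")
```

    Note that a constant pegging offset cancels in the closed-cycle integral (∮dV = 0), so only the point-wise noise widens the IMEP distribution here; the paper's full framework propagates many more inputs (geometry, speed, flow rates, gas composition).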

  9. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies, and the noise variables are the tactics and scenarios associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctures does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative and nondescript estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision

  10. Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?

    USGS Publications Warehouse

    Geier, J.; Voss, C.I.; Dverstorp, B.

    2002-01-01

    A simple evaluation can be used to characterize the capacity of crystalline bedrock to act as a barrier to releases of radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealized models of pore geometry. Application to an intensively characterized site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geological barrier performance. Comparison with seven other less intensively characterized crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterized by state-of-the-art site investigation methods prior to repository construction. A simple evaluation provides a robust, practical approach for inclusion in performance assessment.

  11. Conceptual uncertainty in crystalline bedrock: Is simple evaluation the only practical approach?

    USGS Publications Warehouse

    Geier, J.; Voss, C.I.; Dverstorp, B.

    2002-01-01

    A simple evaluation can be used to characterise the capacity of crystalline bedrock to act as a barrier to releases of radionuclides from a nuclear waste repository. Physically plausible bounds on groundwater flow and an effective transport-resistance parameter are estimated based on fundamental principles and idealised models of pore geometry. Application to an intensively characterised site in Sweden shows that, due to high spatial variability and uncertainty regarding properties of transport paths, the uncertainty associated with the geological barrier is too high to allow meaningful discrimination between good and poor performance. Application of more complex (stochastic-continuum and discrete-fracture-network) models does not yield a significant improvement in the resolution of geologic-barrier performance. Comparison with seven other less intensively characterised crystalline study sites in Sweden leads to similar results, raising a question as to what extent the geological barrier function can be characterised by state-of-the-art site investigation methods prior to repository construction. A simple evaluation provides a robust, practical approach for inclusion in performance assessment.

  12. Uncertainty vs. Information (Invited)

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact yields essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty; instead it should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.
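
    The second argument, building models as maximum entropy distributions under conservation constraints, can be illustrated with a minimal discrete example: constraining only the mean of an observable yields a Boltzmann-form distribution whose Lagrange multiplier is found numerically. This is an illustrative sketch, not the author's formulation; the state values and target mean are arbitrary.

```python
import numpy as np

def maxent_dist(values, target_mean, iters=200):
    """Maximum entropy distribution over discrete states subject to a fixed
    mean (a conservation-like constraint): p_i ∝ exp(-beta * v_i), with the
    Lagrange multiplier beta found by bisection so that <v> = target_mean."""
    values = np.asarray(values, dtype=float)

    def mean_of(beta):
        logw = -beta * values
        w = np.exp(logw - logw.max())  # shift exponents for stability
        p = w / w.sum()
        return float(p @ values)

    lo, hi = -50.0, 50.0               # mean_of is decreasing in beta
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    logw = -beta * values
    w = np.exp(logw - logw.max())
    return w / w.sum()

p = maxent_dist([0.0, 1.0, 2.0, 3.0], target_mean=1.2)
print(np.round(p, 4))  # Boltzmann-form weights meeting the mean constraint
```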

  13. Water, Resilience and the Law: From General Concepts and Governance Design Principles to Actionable Mechanisms

    NASA Astrophysics Data System (ADS)

    Hill Clarvis, M.; Allan, A.; Hannah, D. M.

    2013-12-01

    Climate change has significant ramifications for water law and governance, yet there is strong evidence that legal regulations have often failed to protect environments or promote sustainable development. Scholars have increasingly suggested that the preservation and restoration paradigms of legislation and regulation are no longer adequate for climate change related challenges in complex and cross-scale social-ecological systems, chiefly because of past assumptions of stationarity and uniformitarianism and the perception of ecosystem change as predictable and reversible. This paper reviews the literature on law and resilience and then presents and discusses a set of practical examples of legal mechanisms from the water resources management sector, identified according to a set of guiding principles from the literature on adaptive capacity and adaptive governance, as well as adaptive and integrated water resources management. It then assesses the aptness of these different measures against scientific evidence of increased uncertainty and changing ecological baselines. The review demonstrates a number of best-practice examples that attempt to integrate the adaptive elements of flexibility, iterativity, connectivity and subsidiarity into a variety of legislative mechanisms, suggesting that the tension between resilience and the law is not as significant as many scholars have suggested. However, while many of the mechanisms may indeed be suitable for addressing challenges relating to current levels of change and uncertainty, analysis across a broader range of uncertainty highlights challenges relating to the more irreversible changes associated with greater levels of warming. Furthermore, the paper identifies a set of prerequisites that are fundamental to the successful implementation of such mechanisms, namely monitoring and data sharing, and financial and technical capacity, particularly in nations that are most at risk with the

  14. Contemporary treatment principles for early rheumatoid arthritis: a consensus statement.

    PubMed

    Kiely, Patrick D W; Brown, Andrew K; Edwards, Christopher J; O'Reilly, David T; Ostör, Andrew J K; Quinn, Mark; Taggart, Allister; Taylor, Peter C; Wakefield, Richard J; Conaghan, Philip G

    2009-07-01

    RA has a substantial impact on both patients and healthcare systems. Our objective is to advance the understanding of modern management principles in light of recent evidence concerning the condition's diagnosis and treatment. A group of practicing UK rheumatologists formulated contemporary management principles and clinical practice recommendations concerning both diagnosis and treatment. Areas of clinical uncertainty were documented, leading to research recommendations. A fundamental concept governing treatment of RA is minimization of cumulative inflammation, referred to as the inflammation-time area under the curve (AUC). To achieve this, four core principles of management were identified: (i) detect and refer patients early, even if the diagnosis is uncertain: patients should be referred at the first suspicion of persistent inflammatory polyarthritis and rheumatology departments should provide rapid access to a diagnostic and prognostic service; (ii) treat RA immediately: optimizing outcomes with conventional DMARDs and biologics requires that effective treatment be started early-ideally within 3 months of symptom onset; (iii) tight control of inflammation in RA improves outcome: frequent assessments and an objective protocol should be used to make treatment changes that maintain low-disease activity/remission at an agreed target; (iv) consider the risk-benefit ratio and tailor treatment to each patient: differing patient, disease and drug characteristics require long-term monitoring of risks and benefits with adaptations of treatments to suit individual circumstances. These principles focus on effective control of the inflammatory process in RA, but optimal uptake may require changes in service provision to accommodate appropriate care pathways.

  15. Uncertainty Analysis Principles and Methods

    DTIC Science & Technology

    2007-09-01

    ...error source. The Data Processor converts binary-coded numbers to values, performs D/A curve fitting, and applies any correction factors that may be... describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical...

  16. Magnetic Nature of Light Transmission through a 5-nm Gap.

    PubMed

    Yang, Hyosim; Kim, Dai-Sik; Kim, Richard H Joon-Yeon; Ahn, Jae Sung; Kang, Taehee; Jeong, Jeeyoon; Lee, Dukhyung

    2018-02-09

    Slot antennas have been exploited as important building blocks of optical magnetism because their radiation is invoked by the magnetic field along their axes, as the vectorial Babinet principle predicts. However, the optical magnetism of a few-nanometer-wide slit, for which fascinating applications are found owing to the colossal field enhancement but for which the Babinet principle fails owing to the non-negligible film thickness, has not been investigated. In this paper, we demonstrate that the magnetic field plays a dominant role in light transmission through a 5-nm slit in a 150-nm-thick gold film. The 5-nm slit was fabricated by atomic layer lithography, and the transmission was investigated for various incident angles by experiment and simulation at a wavelength of 785 nm. We found that, owing to the deep-subwavelength gap width, the transmission has the same incident-angle dependence as the tangential magnetic field on the metal surface, and that this magnetic nature of a nanogap holds up to a width of ~100 nm. Our analysis establishes conditions for nanogap optical magnetism and suggests new possibilities for realizing magnetic-field-driven optical nonlinearities.

  17. First-principle study of effect of variation of `x' on the band alignment in CZTS1-xSex

    NASA Astrophysics Data System (ADS)

    Ghemud, Vipul; Kshirsagar, Anjali

    2018-04-01

    The present work concentrates on the electronic structure of the CZTS1-xSex alloy with x ranging from 0 to 1. For the alloy study, we have carried out first-principles calculations employing the generalized gradient approximation for structural optimization, followed by a hybrid functional approach to compare the optical band gap with that obtained from experiments. A systematic increase in the lattice parameters, with a lowering of the band gap from 1.52 eV to 1.04 eV, is seen as the Se concentration increases from 0 to 100%; however, the lowering of the valence and conduction band edges is not linear in the concentration. Our results indicate that the lowering of the band gap is a result of increased Cu:d and Se:p hybridization with increasing `x'.
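
    Alloy band gaps of this kind are often summarized by a quadratic (bowing) interpolation between the end members. The sketch below uses the end-point gaps quoted above with an assumed bowing parameter b, purely for illustration; the paper's computed x-dependence need not follow this form.

```python
def alloy_gap(x, eg_s=1.52, eg_se=1.04, b=0.1):
    """Quadratic (bowing) interpolation of the CZTS(1-x)Se(x) gap in eV.

    eg_s and eg_se are the end-point gaps quoted in the abstract; the
    bowing parameter b is an assumed illustrative value, not from the paper.
    """
    return (1.0 - x) * eg_s + x * eg_se - b * x * (1.0 - x)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f}: E_g = {alloy_gap(x):.3f} eV")
```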

  18. Preliminary analysis of Dione Regio, Venus: The final Magellan regional imaging gap

    NASA Technical Reports Server (NTRS)

    Keddie, S. T.

    1993-01-01

    In Sep. 1992, the Magellan spacecraft filled the final large gap in its coverage of Venus when it imaged an area west of Alpha Regio. F-BIDRs and some test MIDRs of parts of this area were available as of late December. Dione Regio was previously imaged by the Arecibo observatory, and a preliminary investigation of the Magellan images supports the interpretations based on those earlier images: Dione Regio is a regional highland on which are superposed three large, very distinct volcanic edifices. The superior resolution and different viewing geometry of the Magellan images also clarified some uncertainties and revealed fascinating details about this region.

  19. Gap state analysis in electric-field-induced band gap for bilayer graphene.

    PubMed

    Kanayama, Kaoru; Nagashio, Kosuke

    2015-10-29

    The origin of the low current on/off ratio at room temperature in dual-gated bilayer graphene field-effect transistors is considered to be variable-range hopping through gap states. However, a quantitative estimation of the gap states has not been conducted. Here, we report a systematic estimation of the energy gap by both quantum capacitance and transport measurements, and of the density of gap states by the conductance method. An energy gap of ~250 meV is obtained at the maximum displacement field of ~3.1 V/nm, where a current on/off ratio of ~3 × 10^3 is demonstrated at 20 K. The density of gap states is in the range from the upper half of 10^12 to 10^13 eV^-1 cm^-2. Although the large density of gap states at the high-k oxide/bilayer graphene interface limits the current on/off ratio at present, our results suggest that reducing the gap states below ~10^11 eV^-1 cm^-2 through continual improvement of the gate stack would make bilayer graphene a promising candidate for future nanoelectronic device applications.

  20. Historic emissions from deforestation and forest degradation in Mato Grosso, Brazil: 1) source data uncertainties

    PubMed Central

    2011-01-01

    Background Historic carbon emissions are an important foundation for proposed efforts to Reduce Emissions from Deforestation and forest Degradation and enhance forest carbon stocks through conservation and sustainable forest management (REDD+). The level of uncertainty in historic carbon emissions estimates is also critical for REDD+, since high uncertainties could limit climate benefits from credited mitigation actions. Here, we analyzed source data uncertainties based on the range of available deforestation, forest degradation, and forest carbon stock estimates for the Brazilian state of Mato Grosso during 1990-2008. Results Deforestation estimates showed good agreement for multi-year periods of increasing and decreasing deforestation during the study period. However, annual deforestation rates differed by > 20% in more than half of the years between 1997-2008, even for products based on similar input data. Tier 2 estimates of average forest carbon stocks varied between 99-192 Mg C ha^-1, with greatest differences in northwest Mato Grosso. Carbon stocks in deforested areas increased over the study period, yet this increasing trend in deforested biomass was smaller than the difference among carbon stock datasets for these areas. Conclusions Estimates of source data uncertainties are essential for REDD+. Patterns of spatial and temporal disagreement among available data products provide a roadmap for future efforts to reduce source data uncertainties for estimates of historic forest carbon emissions. Specifically, regions with large discrepancies in available estimates of both deforestation and forest carbon stocks are priority areas for evaluating and improving existing estimates. Full carbon accounting for REDD+ will also require filling data gaps, including forest degradation and secondary forest, with annual data on all forest transitions. PMID:22208947

  1. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty

    PubMed Central

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, the objective of which is to maximize robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy consisting of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, and an associated evaluation algorithm is presented to assess the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty, where the resulting policy maximizes robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, we conclude that our policy can tolerate higher uncertainty while guaranteeing the desired performance level, which indicates that the proposed method is much more effective in real applications. PMID:27835670

  2. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty.

    PubMed

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, the objective of which is to maximize robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy consisting of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, and an associated evaluation algorithm is presented to assess the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty, where the resulting policy maximizes robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, we conclude that our policy can tolerate higher uncertainty while guaranteeing the desired performance level, which indicates that the proposed method is much more effective in real applications.
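
    The info-gap notion of robustness behind this method, the largest uncertainty horizon alpha for which the worst-case reward still meets a critical requirement, can be sketched for a scalar uncertain parameter. The interval uncertainty model and reward function below are illustrative assumptions, far simpler than the paper's IMDP/LTL setting.

```python
import numpy as np

def robustness(q, r_crit, u_nominal, reward, alpha_grid):
    """Info-gap robustness: the largest horizon alpha on the grid such that
    the worst-case reward over [u_nominal - alpha, u_nominal + alpha]
    still satisfies reward >= r_crit."""
    best = 0.0
    for alpha in alpha_grid:
        u = np.linspace(u_nominal - alpha, u_nominal + alpha, 201)
        if reward(q, u).min() >= r_crit:
            best = alpha
        else:
            break  # worst case first fails at this horizon
    return best

# Illustrative reward for decision q under uncertain parameter u (nominal 1.0).
reward = lambda q, u: q * u - q ** 2

grid = np.linspace(0.0, 1.0, 101)
r_modest = robustness(0.4, r_crit=0.1, u_nominal=1.0, reward=reward, alpha_grid=grid)
r_demanding = robustness(0.4, r_crit=0.2, u_nominal=1.0, reward=reward, alpha_grid=grid)
print(r_modest, r_demanding)  # robustness falls as the requirement rises
```

    The trade-off visible here, more demanding performance requirements buying less robustness, is the central feature that the paper's robust satisficing policies exploit.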

  3. Band gap tuning of armchair silicene nanoribbons using periodic hexagonal holes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehdi Aghaei, Sadegh; Calizo, Irene, E-mail: icalizo@fiu.edu

    2015-09-14

    The popularity of graphene owing to its unique and exotic properties has triggered a great deal of interest in other two-dimensional nanomaterials. Among them, silicene shows considerable promise for electronic devices, with a carrier mobility comparable to graphene, a flexible buckled structure, and expected compatibility with silicon electronics. Using first-principles calculations based on density functional theory, the electronic properties of armchair silicene nanoribbons perforated with periodic nanoholes (ASiNRPNHs) are investigated. Two different configurations of mono-hydrogenated (:H) and di-hydrogenated (:2H) silicene edges are considered. Pristine armchair silicene nanoribbons (ASiNRs) can be categorized into three branches with width W = 3P − 1, 3P, and 3P + 1, where P is an integer. The order of their energy gaps changes from E_G(3P − 1) < E_G(3P) < E_G(3P + 1) for W-ASiNRs:H to E_G(3P + 1) < E_G(3P − 1) < E_G(3P) for W-ASiNRs:2H. We found that the band gaps of W-ASiNRs:H and (W + 2)-ASiNRs:2H are slightly different, with wider ASiNRs:2H giving larger band gaps. The band gaps of ASiNRPNHs differ from those of pristine ASiNRs, varying with the nanoribbon width and with the nanoholes' repeat periodicity and position relative to the ribbon edge, owing to changes in quantum confinement strength. ASiNRPNHs:2H are more stable than ASiNRPNHs:H, and their band gaps are noticeably greater. We found that the energy band gap of 12-ASiNRPNHs:2H with a repeat periodicity of 2 is 0.923 eV, about 2.2 times greater than that of pristine ASiNR:2H and double that of 12-ASiNRPNHs:H with the same periodicity.
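
    The three width families reported above can be captured in a small sketch: classifying the width index W by its remainder modulo 3 and ranking the families by the qualitative gap orderings the abstract gives for each edge termination. The orderings come from the abstract; the function and dictionary names are our own.

```python
# Classify an armchair silicene nanoribbon by its width index W into the
# three families from the abstract (W = 3P-1, 3P, 3P+1), and rank each
# family by the qualitative energy-gap ordering for its edge termination.

def family(W):
    """Map width index W to its 3P-family via W mod 3."""
    return {0: "3P", 1: "3P+1", 2: "3P-1"}[W % 3]

# Ascending order of energy gaps E_G by family, per the abstract:
GAP_ORDER = {
    ":H":  ["3P-1", "3P", "3P+1"],   # E_G(3P-1) < E_G(3P) < E_G(3P+1)
    ":2H": ["3P+1", "3P-1", "3P"],   # E_G(3P+1) < E_G(3P-1) < E_G(3P)
}

def gap_rank(W, termination):
    """0 = smallest-gap family, 2 = largest-gap family."""
    return GAP_ORDER[termination].index(family(W))
```

    For example, a width-13 ribbon (13 = 3·4 + 1) falls in the largest-gap family for :H edges but the smallest-gap family for :2H edges.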

  4. Engineering an Insulating Ferroelectric Superlattice with a Tunable Band Gap from Metallic Components

    DOE PAGES

    Ghosh, Saurabh; Borisevich, Albina Y.; Pantelides, Sokrates T.

    2017-10-25

    The recent discovery of “polar metals” with ferroelectriclike displacements offers the promise of designing ferroelectrics with tunable energy gaps by inducing controlled metal-insulator transitions. In this work, we employ first-principles calculations to design a metallic polar superlattice from nonpolar metal components and show that controlled intermixing can lead to a true insulating ferroelectric with a tunable band gap. We consider a 2/2 superlattice made of two centrosymmetric metallic oxides, La0.75Sr0.25MnO3 and LaNiO3, and show that ferroelectriclike displacements are induced. The ferroelectriclike distortion is found to depend strongly on the carrier concentration (Sr content). Further, we show that a metal-to-insulator (MI) transition is feasible in this system via disproportionation of the Ni sites. Such a disproportionation, and hence a MI transition, can be driven by intermixing of transition metal ions between Mn and Ni layers. As a result, the energy gap of the resulting ferroelectric can be tuned by varying the degree of intermixing during experimental fabrication.

  5. Engineering an Insulating Ferroelectric Superlattice with a Tunable Band Gap from Metallic Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Saurabh; Borisevich, Albina Y.; Pantelides, Sokrates T.

    The recent discovery of “polar metals” with ferroelectriclike displacements offers the promise of designing ferroelectrics with tunable energy gaps by inducing controlled metal-insulator transitions. In this work, we employ first-principles calculations to design a metallic polar superlattice from nonpolar metal components and show that controlled intermixing can lead to a true insulating ferroelectric with a tunable band gap. We consider a 2/2 superlattice made of two centrosymmetric metallic oxides, La0.75Sr0.25MnO3 and LaNiO3, and show that ferroelectriclike displacements are induced. The ferroelectriclike distortion is found to depend strongly on the carrier concentration (Sr content). Further, we show that a metal-to-insulator (MI) transition is feasible in this system via disproportionation of the Ni sites. Such a disproportionation, and hence a MI transition, can be driven by intermixing of transition metal ions between Mn and Ni layers. As a result, the energy gap of the resulting ferroelectric can be tuned by varying the degree of intermixing during experimental fabrication.

  6. Uncertainty of Polarized Parton Distributions

    NASA Astrophysics Data System (ADS)

    Hirai, M.; Goto, Y.; Horaguchi, T.; Kobayashi, H.; Kumano, S.; Miyama, M.; Saito, N.; Shibata, T.-A.

    Polarized parton distribution functions are determined by a χ2 analysis of polarized deep inelastic experimental data. In this paper, the uncertainty of the obtained distribution functions is investigated with the Hessian method. We find that the uncertainty of the polarized gluon distribution is fairly large. We then estimate the gluon uncertainty by including fake data generated from the prompt photon process at RHIC, and observe that the uncertainty could be reduced with these data.
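
    The Hessian method mentioned above propagates fit-parameter uncertainty to an observable f(a) via (Δf)² = Δχ² (∇f)ᵀ H⁻¹ (∇f), where H is the Hessian of χ² at the minimum. A minimal numerical sketch, with an invented two-parameter Hessian and gradient standing in for real fit results:

```python
import numpy as np

# Hessian-method error on an observable f(a):
#   (Δf)^2 = Δχ² · (∇f)ᵀ H⁻¹ (∇f)
# The 2x2 toy Hessian and gradient below are illustrative, not fit output.

def hessian_uncertainty(grad_f, hessian, delta_chi2=1.0):
    """Standard Hessian-method error on f for a given Δχ² tolerance."""
    cov = np.linalg.inv(hessian)      # parameter covariance is prop. to H⁻¹
    return np.sqrt(delta_chi2 * grad_f @ cov @ grad_f)

H = np.array([[4.0, 0.0],
              [0.0, 1.0]])            # toy χ² Hessian at the minimum
grad = np.array([1.0, 2.0])           # toy sensitivities ∂f/∂a_i

df = hessian_uncertainty(grad, H)     # sqrt(1/4 + 4) = sqrt(4.25)
```

    In a real global fit Δχ² is chosen to reflect the desired confidence level rather than fixed at 1.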

  7. On the role of budget sufficiency, cost efficiency, and uncertainty in species management

    USGS Publications Warehouse

    van der Burg, Max Post; Bly, Bartholomew B.; Vercauteren, Tammy; Grand, James B.; Tyre, Andrew J.

    2014-01-01

    Many conservation planning frameworks rely on the assumption that one should prioritize locations for management actions based on the highest predicted conservation value (i.e., abundance, occupancy). This strategy may underperform relative to the expected outcome if one is working with a limited budget or the predicted responses are uncertain. Yet, cost and tolerance to uncertainty rarely become part of species management plans. We used field data and predictive models to simulate a decision problem involving western burrowing owls (Athene cunicularia hypugaea) using prairie dog colonies (Cynomys ludovicianus) in western Nebraska. We considered 2 species management strategies: one maximized abundance and the other maximized abundance in a cost-efficient way. We then used heuristic decision algorithms to compare the 2 strategies in terms of how well they met a hypothetical conservation objective. Finally, we performed an info-gap decision analysis to determine how these strategies performed under different budget constraints and uncertainty about owl response. Our results suggested that when budgets were sufficient to manage all sites, the maximizing strategy was optimal and suggested investing more in expensive actions. This pattern persisted for restricted budgets up to approximately 50% of the sufficient budget. Below this budget, the cost-efficient strategy was optimal and suggested investing in cheaper actions. When uncertainty in the expected responses was introduced, the strategy that maximized abundance remained robust under a sufficient budget. Reducing the budget induced a slight trade-off between expected performance and robustness, which suggested that the most robust strategy depended both on one's budget and tolerance to uncertainty. Our results suggest that wildlife managers should explicitly account for budget limitations and be realistic about their expected levels of performance.
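
    The two strategies compared above can be sketched as greedy site selection under a budget: one ranks sites by predicted benefit alone, the other by benefit per unit cost. The site list, budget, and numbers below are invented for illustration.

```python
# Greedy site selection under a budget, comparing the two prioritisation
# strategies from the study: "maximise abundance" ranks by predicted
# benefit alone; the cost-efficient strategy ranks by benefit per unit
# cost. Site data (name, benefit, cost) are invented.

def select(sites, budget, key):
    chosen, spent, benefit = [], 0.0, 0.0
    for name, b, c in sorted(sites, key=key, reverse=True):
        if spent + c <= budget:
            chosen.append(name)
            spent += c
            benefit += b
    return chosen, benefit

sites = [("A", 10.0, 8.0), ("B", 6.0, 3.0), ("C", 5.0, 2.0)]

max_abund = lambda s: s[1]          # benefit only
cost_eff  = lambda s: s[1] / s[2]   # benefit per unit cost

# With a tight budget the cost-efficient ranking wins:
_, b1 = select(sites, budget=8.0, key=max_abund)   # picks A only -> 10.0
_, b2 = select(sites, budget=8.0, key=cost_eff)    # picks C then B -> 11.0
```

    With a budget large enough for every site, both rankings select everything and the maximizing strategy is no longer penalized, mirroring the study's budget-dependent result.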

  8. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    PubMed

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  9. Funding gap for immunization across 94 low- and middle-income countries.

    PubMed

    Ozawa, Sachiko; Grewal, Simrun; Portnoy, Allison; Sinha, Anushua; Arilotta, Richard; Stack, Meghan L; Brenzel, Logan

    2016-12-07

    Novel vaccine development and production has given rise to a growing number of vaccines that can prevent disease and save lives. In order to realize these health benefits, it is essential to ensure adequate immunization financing to enable equitable access to vaccines for people in all communities. This analysis estimates the full immunization program costs, projected available financing, and resulting funding gap for 94 low- and middle-income countries over five years (2016-2020). Vaccine program financing by country governments, Gavi, and other development partners was forecasted for vaccine, supply chain, and service delivery, based on an analysis of comprehensive multi-year plans together with a series of scenario and sensitivity analyses. Findings indicate that delivery of full vaccination programs across 94 countries would result in a total funding gap of $7.6 billion (95% uncertainty range: $4.6-$11.8 billion) over 2016-2020, with the bulk (98%) of the resources required for routine immunization programs. More than half (65%) of the resources to meet this funding gap are required for service delivery at $5.0 billion ($2.7-$8.4 billion) with an additional $1.1 billion ($0.9-$2.7 billion) needed for vaccines and $1.5 billion ($1.1-$2.0 billion) for supply chain. When viewed as a percentage of total projected costs, the funding gap represents 66% of projected supply chain costs, 30% of service delivery costs, and 9% of vaccine costs. On average, this funding gap corresponds to 0.2% of general government expenditures and 2.3% of government health expenditures. These results suggest greater need for country and donor resource mobilization and funding allocation for immunizations. Both service delivery and supply chain are important areas for further resource mobilization. Further research on the impact of advances in service delivery technology and reductions in vaccine prices beyond this decade would be important for efficient investment decisions for
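
    The funding-gap figures quoted above are internally consistent and can be checked with simple arithmetic: the component gaps sum to the total, and dividing each component gap by its stated share of projected costs recovers the projected cost. A sketch (US$ billions, 2016-2020):

```python
# Check the funding-gap arithmetic quoted in the abstract (US$ billions).
# Component gaps should sum to the $7.6B total, and gap / share-of-costs
# recovers each component's projected cost.

gap = {"service_delivery": 5.0, "vaccines": 1.1, "supply_chain": 1.5}
total_gap = sum(gap.values())                 # 7.6

share_of_costs = {"service_delivery": 0.30,   # gap as a fraction of
                  "vaccines": 0.09,           # projected costs,
                  "supply_chain": 0.66}       # per the abstract

projected_cost = {k: gap[k] / share_of_costs[k] for k in gap}
```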

  10. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  11. Gage Measures Recessed Gaps

    NASA Technical Reports Server (NTRS)

    Zepeda, J. L.

    1983-01-01

    A new tool measures the separation between recessed parallel surfaces. Because tiles have overhanging edges, the tool is designed to slip into the gap from the end so that it extends through a 0.040-inch crack. It measures gaps between 0.200 and 0.400 inch so that gap fillers of the proper thickness can be selected. Useful in numerous industrial situations involving gap measurements in inaccessible places.

  12. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    PubMed

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties. © 2018 Elsevier Inc. All rights reserved.
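
    The propagation rule at the heart of the GUM, u_c(y)² = Σᵢ (∂f/∂xᵢ)² u(xᵢ)² for uncorrelated inputs, can be sketched numerically. In this sketch the sensitivity coefficients are approximated by central finite differences rather than derived analytically, and the example values are illustrative.

```python
import math

# GUM-style combined standard uncertainty for uncorrelated inputs:
#   u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2
# Partial derivatives are approximated by central finite differences,
# so the caller needs no calculus. Example values are illustrative.

def combined_uncertainty(f, x, u, h=1e-6):
    """Combined standard uncertainty of y = f(*x) given input uncertainties u."""
    total = 0.0
    for i, ui in enumerate(u):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(*xp) - f(*xm)) / (2 * h)   # sensitivity coefficient c_i
        total += (dfdx * ui) ** 2
    return math.sqrt(total)

# y = a * b with a = 2 +/- 0.1 and b = 3 +/- 0.2:
u_y = combined_uncertainty(lambda a, b: a * b, [2.0, 3.0], [0.1, 0.2])
# analytic check: sqrt((3*0.1)^2 + (2*0.2)^2) = sqrt(0.09 + 0.16) = 0.5
```

    For correlated inputs the GUM adds covariance cross-terms, which this sketch omits.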

  13. The basic principles of migration health: Population mobility and gaps in disease prevalence

    PubMed Central

    Gushulak, Brian D; MacPherson, Douglas W

    2006-01-01

    Currently, migrants and other mobile individuals, such as migrant workers and asylum seekers, are an expanding global population of growing social, demographic and political importance. Disparities often exist between a migrant population's place of origin and its destination, particularly with relation to health determinants. The effects of those disparities can be observed at both individual and population levels. Migration across health and disease disparities influences the epidemiology of certain diseases globally and in nations receiving migrants. While specific disease-based outcomes may vary between migrant group and location, general epidemiological principles may be applied to any situation where numbers of individuals move between differences in disease prevalence. Traditionally, migration health activities have been designed for national application and lack an integrated international perspective. Present and future health challenges related to migration may be more effectively addressed through collaborative global undertakings. This paper reviews the epidemiological relationships resulting from health disparities bridged by migration and describes the growing role of migration and population mobility in global disease epidemiology. The implications for national and international health policy and program planning are presented. PMID:16674820

  14. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which is obvious at any resolution finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling adds uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, the climatic uncertainty. We carried out a

  15. Where do uncertainties reside within environmental risk assessments? Testing UnISERA, a guide for uncertainty assessment.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2017-06-01

    A means for identifying and prioritising the treatment of uncertainty (UnISERA) in environmental risk assessments (ERAs) is tested, using three risk domains where ERA is an established requirement and one in which ERA practice is emerging. UnISERA's development draws on 19 expert elicitations across genetically modified higher plants, particulate matter, and agricultural pesticide release and is stress tested here for engineered nanomaterials (ENM). We are concerned with the severity of uncertainty; its nature; and its location across four accepted stages of ERAs. Using an established uncertainty scale, the risk characterisation stage of ERA harbours the highest severity level of uncertainty, associated with estimating, aggregating and evaluating expressions of risk. Combined epistemic and aleatory uncertainty is the dominant nature of uncertainty. The dominant location of uncertainty is associated with data in problem formulation, exposure assessment and effects assessment. Testing UnISERA produced agreements of 55%, 90%, and 80% for the severity level, nature and location dimensions of uncertainty between the combined case studies and the ENM stress test. UnISERA enables environmental risk analysts to prioritise risk assessment phases, groups of tasks, or individual ERA tasks and it can direct them towards established methods for uncertainty treatment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Principles of Chemical Bonding and Band Gap Engineering in Hybrid Organic-Inorganic Halide Perovskites.

    PubMed

    Walsh, Aron

    2015-03-19

    The performance of solar cells based on hybrid halide perovskites has seen an unparalleled rate of progress, while our understanding of the underlying physical chemistry of these materials trails behind. Superficially, CH3NH3PbI3 is similar to other thin-film photovoltaic materials: a semiconductor with an optical band gap in the optimal region of the electromagnetic spectrum. Microscopically, the material is more unconventional. Progress in our understanding of the local and long-range chemical bonding of hybrid perovskites is discussed here, drawing from a series of computational studies involving electronic structure, molecular dynamics, and Monte Carlo simulation techniques. The orientational freedom of the dipolar methylammonium ion gives rise to temperature-dependent dielectric screening and the possibility for the formation of polar (ferroelectric) domains. The ability to independently substitute on the A, B, and X lattice sites provides the means to tune the optoelectronic properties. Finally, ten critical challenges and opportunities for physical chemists are highlighted.

  17. Principles of Chemical Bonding and Band Gap Engineering in Hybrid Organic–Inorganic Halide Perovskites

    PubMed Central

    2015-01-01

    The performance of solar cells based on hybrid halide perovskites has seen an unparalleled rate of progress, while our understanding of the underlying physical chemistry of these materials trails behind. Superficially, CH3NH3PbI3 is similar to other thin-film photovoltaic materials: a semiconductor with an optical band gap in the optimal region of the electromagnetic spectrum. Microscopically, the material is more unconventional. Progress in our understanding of the local and long-range chemical bonding of hybrid perovskites is discussed here, drawing from a series of computational studies involving electronic structure, molecular dynamics, and Monte Carlo simulation techniques. The orientational freedom of the dipolar methylammonium ion gives rise to temperature-dependent dielectric screening and the possibility for the formation of polar (ferroelectric) domains. The ability to independently substitute on the A, B, and X lattice sites provides the means to tune the optoelectronic properties. Finally, ten critical challenges and opportunities for physical chemists are highlighted. PMID:25838846

  18. Band-Gap Engineering in ZnO Thin Films: A Combined Experimental and Theoretical Study

    NASA Astrophysics Data System (ADS)

    Pawar, Vani; Jha, Pardeep K.; Panda, S. K.; Jha, Priyanka A.; Singh, Prabhakar

    2018-05-01

    Zinc oxide thin films are synthesized and characterized using x-ray diffraction, field-emission scanning electron microscopy, atomic force microscopy, and optical spectroscopy. Our results reveal that the structural, morphological, and optical properties are closely related to the stress of the sample provided that the texture of the film remains the same. Anomalous results are obtained once the texture is altered to a different orientation. We support this experimental observation by carrying out first-principles hybrid functional calculations for two different orientations of the sample and show that the effect of quantum confinement is much stronger for the (100) surface than the (001) surface of ZnO. Furthermore, our calculations provide a route to enhance the band gap of ZnO by more than 50% compared to the bulk band gap, opening up possibilities for wide-range industrial applications.
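
    As a rough feel for the quantum-confinement effect discussed above, a textbook infinite-well estimate (not the hybrid-functional DFT used in the paper) gives a confinement energy of ħ²π²/(2 m* L²) for a carrier of effective mass m* confined over a length L. The effective mass and well width below are illustrative assumptions.

```python
import math

# Back-of-the-envelope quantum-confinement estimate (NOT the paper's
# hybrid-functional DFT): infinite-well ground-state energy
#   dE = (hbar * pi)^2 / (2 * m_eff * L^2)
# for confinement length L and effective mass m_eff (in electron masses).
# Values (m_eff = 0.3, L = 2 nm) are illustrative.

HBAR = 1.054571817e-34      # J*s
M_E  = 9.1093837015e-31     # kg
EV   = 1.602176634e-19      # J per eV

def confinement_shift_eV(L_m, m_eff=0.3):
    """Infinite-well ground-state energy for one carrier, in eV."""
    return (HBAR * math.pi) ** 2 / (2 * m_eff * M_E * L_m ** 2) / EV

shift = confinement_shift_eV(2e-9)   # roughly 0.3 eV for a 2 nm well
```

    The 1/L² scaling explains why confinement widens the gap most strongly in the thinnest, most strongly textured films.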

  19. A method for acquiring random range uncertainty probability distributions in proton therapy

    NASA Astrophysics Data System (ADS)

    Holloway, S. M.; Holloway, M. D.; Thomas, S. J.

    2018-01-01

    In treatment planning we depend upon accurate knowledge of geometric and range uncertainties. If the uncertainty model is inaccurate, the plan will produce under-dosing of the target and/or over-dosing of organs at risk (OARs). We aim to provide a method by which centre- and site-specific population range uncertainty due to inter-fraction motion can be quantified, to improve the uncertainty model in proton treatment planning. Daily volumetric MVCT data from previously treated radiotherapy patients have been used to investigate inter-fraction changes to water equivalent path-length (WEPL). Daily image-guidance scans were carried out for each patient and corrected for changes in CTV position (using rigid transformations). An effective-depth algorithm was used to determine residual range changes, after corrections had been applied, throughout the treatment by comparing WEPL within the CTV at each fraction for several beam angles. As a proof of principle, this method was used to quantify inter-fraction range-change uncertainties for a sample of head and neck patients: Σ = 3.39 mm, σ = 4.72 mm, overall mean = −1.82 mm. For prostate: Σ = 5.64 mm, σ = 5.91 mm, overall mean = 0.98 mm. The choice of beam angle did not affect the inter-fraction range error significantly for head and neck; however, this was not the case for prostate. Greater range changes were seen using a lateral beam than an anterior beam for prostate, due to relative motion of the prostate and femoral heads. A method has thus been developed to quantify population range changes due to inter-fraction motion that can be adapted for the clinic. The results highlight the importance of robust planning and analysis in proton therapy. Such information could be used in robust optimisation algorithms or treatment plan robustness analysis, and will aid in establishing beam start conditions at planning and in establishing adaptive planning protocols.
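
    The population statistics Σ (systematic) and σ (random) quoted above can be computed from per-patient error samples. The sketch below follows one common convention from margin-recipe literature (Σ as the standard deviation of per-patient mean errors, σ as the RMS of per-patient standard deviations); the paper's exact definitions may differ, and the data are invented.

```python
import statistics as st

# One common convention for population error statistics:
#   Sigma = SD of per-patient mean errors   (systematic component)
#   sigma = RMS of per-patient SDs          (random component)
# Toy per-fraction WEPL changes (mm) for three patients.

def population_stats(per_patient_errors):
    means = [st.mean(e) for e in per_patient_errors]
    sds = [st.stdev(e) for e in per_patient_errors]
    Sigma = st.stdev(means)                              # systematic
    sigma = (sum(s * s for s in sds) / len(sds)) ** 0.5  # random (RMS)
    return st.mean(means), Sigma, sigma

errors = [[-1.0, -2.0, -1.5],   # patient 1
          [0.5, 1.0, 0.0],      # patient 2
          [-0.5, 0.0, 0.5]]     # patient 3

overall_mean, Sigma, sigma = population_stats(errors)
```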

  20. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII-based sampling is demonstrated against Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty, and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of ɛ-NSGAII-based sampling over LHS: (1) it performs more effectively and efficiently; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter. (2) The Pareto tradeoffs between metrics are shown clearly by the solutions from ɛ-NSGAII-based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII-based sampling are concentrated in the appropriate ranges rather than being uniform, which accords with their physical significance; parameter uncertainties are also reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB), and average deviation amplitude (D). Flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII-based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis under the GLUE framework, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
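
    The GLUE procedure referenced above can be sketched independently of the ɛ-NSGAII sampler: draw parameter sets, score each simulation with a likelihood measure, keep the "behavioral" sets above a threshold, and form prediction bounds from the behavioral runs. The one-parameter model, threshold, and likelihood measure below are illustrative assumptions.

```python
import random

# Minimal GLUE sketch: sample parameters, keep "behavioral" sets whose
# likelihood exceeds a threshold, and take percentiles of the behavioral
# simulations as prediction bounds. The toy one-parameter "model",
# threshold, and likelihood measure are all illustrative.

random.seed(0)
obs = 2.0                        # a single "observation"

def model(theta):
    return theta ** 2            # toy simulator

def likelihood(sim):
    """Crude [0, 1] score: 1 at a perfect match, 0 beyond 100% error."""
    return max(0.0, 1.0 - abs(sim - obs) / obs)

samples = [random.uniform(0.0, 2.0) for _ in range(1000)]
behavioral = [(model(t), likelihood(model(t))) for t in samples
              if likelihood(model(t)) > 0.8]

sims = sorted(s for s, _ in behavioral)
lower = sims[int(0.05 * len(sims))]   # 5th percentile of behavioral runs
upper = sims[int(0.95 * len(sims))]   # 95th percentile
```

    The paper's contribution is in how the samples are generated: ɛ-NSGAII concentrates sampling in the behavioral region across multiple conflicting metrics, so far fewer model runs are wasted on non-behavioral parameter sets than with LHS.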