Sample records for uncertainty importance measure

  1. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It has become increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also supports the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies that have been developed for this purpose are briefly described in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
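
    To make the compliance-assessment point concrete, the following is a minimal sketch (illustrative numbers and a simple guard-banding rule; the decision rules discussed in the chapter may differ):

    ```python
    # Hedged sketch: compliance assessment against an upper specification limit
    # using the expanded uncertainty U = k * u_c (k = 2 gives ~95 % coverage).
    def assess_compliance(result: float, u_combined: float, limit: float, k: float = 2.0) -> str:
        """Classify a result as compliant, non-compliant, or inconclusive."""
        expanded = k * u_combined
        if result + expanded <= limit:
            return "compliant"        # entire uncertainty interval below the limit
        if result - expanded > limit:
            return "non-compliant"    # entire interval above the limit
        return "inconclusive"         # interval straddles the limit

    # Illustrative example: contaminant with a specification limit of 50 mg/L
    print(assess_compliance(result=47.0, u_combined=1.2, limit=50.0))  # compliant
    print(assess_compliance(result=49.5, u_combined=1.2, limit=50.0))  # inconclusive
    ```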

  2. Applications of explicitly-incorporated/post-processing measurement uncertainty in watershed modeling

    USDA-ARS's Scientific Manuscript database

    The importance of measurement uncertainty in terms of the calculation of model evaluation error statistics has recently been stated in the literature. The impact of measurement uncertainty on calibration results indicates the potential vague zone in the field of watershed modeling where the assumption ...

  3. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  4. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of ±9-14% at high temperature and ±9% near room temperature.

  5. Impact of measurement uncertainty from experimental load distribution factors on bridge load rating

    NASA Astrophysics Data System (ADS)

    Gangone, Michael V.; Whelan, Matthew J.

    2018-03-01

    Load rating and testing of highway bridges is important in determining the capacity of the structure. Experimental load rating utilizes strain transducers placed at critical locations of the superstructure to measure normal strains. These strains are then used in computing diagnostic performance measures (neutral axis of bending, load distribution factor) and ultimately a load rating. However, it has been shown that experimentally obtained strain measurements contain uncertainties associated with the accuracy and precision of the sensor and sensing system. These uncertainties propagate through to the diagnostic indicators and, in turn, into the load rating calculation. This paper analyzes the effect that measurement uncertainties have on the experimental load rating results of a 3-span multi-girder/stringer steel and concrete bridge. The focus of this paper is limited to the uncertainty associated with the experimental distribution factor estimate. For the testing discussed, strain readings were gathered at the midspan of each span of both exterior girders and the center girder. Test vehicles of known weight were positioned at specified locations on each span to generate the maximum strain response for each of the five girders. The strain uncertainties were used in conjunction with a propagation formula developed by the authors to determine the standard uncertainty in the distribution factor estimates. This distribution factor uncertainty is then introduced into the load rating computation to determine the possible range of the load rating. The results show the importance of understanding measurement uncertainty in experimental load testing.
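
    As an illustration of this kind of propagation, here is a minimal sketch using the generic first-order (GUM) formula for a distribution factor DF_i = eps_i / sum(eps), assuming independent strain errors; this is not the authors' own propagation formula, and the numbers are invented:

    ```python
    # Hedged sketch: first-order propagation of strain uncertainty into a
    # girder distribution factor DF_i = eps_i / sum(eps_j).
    import numpy as np

    def df_uncertainty(strains, u_strains, i):
        """Return (DF_i, standard uncertainty of DF_i), independent errors assumed."""
        s = strains.sum()
        df = strains[i] / s
        # Partial derivatives of DF_i with respect to each strain reading
        grad = -strains[i] / s**2 * np.ones_like(strains)
        grad[i] = (s - strains[i]) / s**2
        u_df = np.sqrt(np.sum((grad * u_strains) ** 2))
        return df, u_df

    # Invented mid-span strains (microstrain) for five girders, u = 2 each
    strains = np.array([120.0, 180.0, 210.0, 175.0, 115.0])
    df, u_df = df_uncertainty(strains, np.full(5, 2.0), i=2)
    print(f"DF = {df:.3f} +/- {u_df:.4f}")
    ```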

  6. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report: Updated in 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sisterson, Douglas

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. ARM currently provides data and supporting metadata (information about the data or data quality) to its users through several sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, ARM relies on Instrument Mentors and the ARM Data Quality Office to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. This report is a continuation of the work presented by Campos and Sisterson (2015) and provides additional uncertainty information from instruments not available in their report. As before, a total measurement uncertainty has been calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). This study will not expand on methods for computing these uncertainties. As before, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available to the ARM community through the ARM Instrument Mentors and their ARM instrument handbooks. This study continues the first steps towards reporting ARM measurement uncertainty as: (1) identifying how the uncertainty of individual ARM measurements is currently expressed, (2) identifying a consistent approach to measurement uncertainty, and then (3) reclassifying ARM instrument measurement uncertainties in a common framework.

  7. Method to Calculate Uncertainty Estimate of Measuring Shortwave Solar Irradiance using Thermopile and Semiconductor Solar Radiometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, I.

    2011-07-01

    The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use the International Guidelines of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections, might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainties and the means to reduce overall uncertainty in measuring solar irradiance.
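
    As a minimal illustration of the GUM mechanics such a report applies (component names and magnitudes below are invented, not taken from the report), independent relative standard uncertainties combine in quadrature and are then expanded with a coverage factor k = 2:

    ```python
    # Hedged sketch of a GUM-style budget for a pyranometer measurement.
    import math

    components = {                 # relative standard uncertainties, %
        "calibration": 1.1,
        "directional (cosine) response": 0.9,
        "temperature response": 0.4,
        "nonlinearity": 0.2,
        "data logger": 0.1,
    }
    u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined, in quadrature
    U = 2.0 * u_c                                            # expanded, k = 2
    print(f"combined u_c = {u_c:.2f} %, expanded U (k=2) = {U:.2f} %")
    ```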

  8. Impact of Pitot tube calibration on the uncertainty of water flow rate measurement

    NASA Astrophysics Data System (ADS)

    de Oliveira Buscarini, Icaro; Costa Barsaglini, Andre; Saiz Jabardo, Paulo Jose; Massami Taira, Nilson; Nader, Gilder

    2015-10-01

    Water utility companies often use Cole type Pitot tubes to map velocity profiles and thus measure flow rate. Frequent monitoring and measurement of flow rate is an important step in identifying leaks and other types of losses. In Brazil losses as high as 42% are common and in some places even higher values are found. When using Cole type Pitot tubes to measure the flow rate, the uncertainty of the calibration coefficient (Cd) is a major component of the overall flow rate measurement uncertainty. A common practice is to employ the usual value Cd = 0.869, in use since Cole proposed his Pitot tube in 1896. Analysis of 414 calibrations of Cole type Pitot tubes shows that Cd varies considerably and values as high as 0.020 for the expanded uncertainty are common. Combined with other uncertainty sources, the overall velocity measurement uncertainty is 0.02, increasing flow rate measurement uncertainty by 1.5%, which, for the Sao Paulo metropolitan area (Brazil), corresponds to 3.5 × 10⁷ m³/year.

  9. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, E; Sisterson, Douglas

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. As well, the ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty: 1) identifying how the uncertainty of individual ARM measurements is currently expressed, 2) identifying a consistent approach to measurement uncertainty, and then 3) reclassifying ARM instrument measurement uncertainties in a common framework.

  10. Quantifying radar-rainfall uncertainties in urban drainage flow modelling

    NASA Astrophysics Data System (ADS)

    Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.

    2015-09-01

    This work presents the results of the implementation of a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area located in the North of England. The spatial and temporal correlations of the RR errors as well as the error covariance matrix were computed to build an RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system. They also showed that the measured flow peaks and flow volumes are often bounded within the uncertainty band produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also some events where the RR uncertainty cannot explain the whole uncertainty observed in the simulated flow volumes, indicating that there are additional sources of uncertainty that must be considered, such as the uncertainty in the urban drainage model structure, the uncertainty in the calibrated parameters of the urban drainage model, and the uncertainty in the measured sewer flows.
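
    A minimal sketch of the ensemble-generation idea follows; the exponential spatial correlation model, its length scale, and the error magnitude are assumptions for illustration, not the paper's fitted error model:

    ```python
    # Hedged sketch: generate radar-rainfall (RR) error ensembles from an
    # assumed error covariance matrix and add them to the radar field.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sites, n_members = 6, 100
    sigma = 0.8                                   # error std dev (mm/h), invented
    dist = np.abs(np.subtract.outer(np.arange(n_sites), np.arange(n_sites)))
    cov = sigma**2 * np.exp(-dist / 3.0)          # assumed exponential correlation

    radar_field = np.array([2.1, 2.4, 3.0, 2.8, 2.2, 1.9])   # mm/h, invented
    errors = rng.multivariate_normal(np.zeros(n_sites), cov, size=n_members)
    ensemble = np.clip(radar_field + errors, 0.0, None)      # no negative rain
    print(ensemble.mean(axis=0))                  # ensemble mean per site
    ```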

  11. Measurement uncertainty of ester number, acid number and patchouli alcohol of patchouli oil produced in Yogyakarta

    NASA Astrophysics Data System (ADS)

    Istiningrum, Reni Banowati; Saepuloh, Azis; Jannah, Wirdatul; Aji, Didit Waskito

    2017-03-01

    Yogyakarta is one of the patchouli oil distillation centers in Indonesia. The quality of patchouli oil greatly affects its market price. Testing patchouli oil quality parameters is therefore an important concern, in part through determination of the measurement uncertainty. This study determines the measurement uncertainty of the ester number, the acid number and the patchouli alcohol content through a bottom-up approach. The contributors to the measurement uncertainty of the ester number are the sample mass, the blank and sample titration volumes, the molar mass of KOH, the normality of HCl, and replication. The contributors to the measurement uncertainty of the acid number are the sample mass, the sample titration volume, the relative molecular mass and normality of KOH, and repetition. The determination of patchouli alcohol by gas chromatography considers repeatability as the only source of measurement uncertainty because reference materials are not available.

  12. Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock

    NASA Astrophysics Data System (ADS)

    Stigsson, Martin

    2016-11-01

    Many engineering applications in fractured crystalline rocks use measured orientations of structures, such as rock contacts and fractures, and lineated objects, such as foliation and rock stress, mapped in boreholes as their foundation. Although these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and their effects on the inferred orientations have been reported. Relying only on the specification of tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainty, develops models for inferring their magnitudes, and points out possible implications for the inference of orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock, with a cumulative length of more than 34 km including almost 200,000 single fracture intercepts. The work presented here hence relies on fracture orientations; however, the techniques for inferring the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental, but can be valuable, provided that the reason for their presence is properly understood and the magnitudes correctly inferred. The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to infer a correct orientation model and the parameters coupled to the fracture sets; (2) it is important to perform multiple measurements to be able to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most important measure to decrease the uncertainty space is to avoid drilling steeper than about -80°.

  13. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine the precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for the inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for the estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.

  14. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  15. Uncertainties of Mayak urine data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24 h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
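
    A minimal sketch of how the two components combine under the Poisson lognormal picture is given below; the count value is invented, and 0.33 is simply a value chosen from the 0.31-0.35 range quoted above:

    ```python
    # Hedged sketch: combine Poisson counting uncertainty with the lognormal
    # normalization component in log space (small-u approximation ln(1+u) ~ u).
    import math

    counts = 400                               # hypothetical net counts
    u_rel_counting = 1.0 / math.sqrt(counts)   # Poisson relative std uncertainty
    sigma_ln_norm = 0.33                       # ln(GSD) of normalization component

    sigma_ln_total = math.sqrt(sigma_ln_norm**2 + u_rel_counting**2)
    gsd = math.exp(sigma_ln_total)             # total geometric standard deviation
    print(f"total ln(GSD) = {sigma_ln_total:.3f}, GSD = {gsd:.2f}")
    ```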

  16. Quantifying uncertainty in forest nutrient budgets

    Treesearch

    Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell

    2012-01-01

    Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...

  17. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    USGS Publications Warehouse

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
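
    QUant itself is not reproduced here, but the underlying Monte Carlo idea can be sketched as follows; the component discharges and error magnitudes are invented for illustration:

    ```python
    # Hedged sketch: Monte Carlo spread of a total discharge assembled from a
    # measured portion plus top, bottom, and edge estimates.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20_000
    q_measured, q_top, q_bottom, q_edges = 85.0, 12.0, 8.0, 3.0   # m^3/s, invented

    q_total = (
        q_measured * rng.normal(1.0, 0.01, n)   # 1 % on the measured portion
        + q_top    * rng.normal(1.0, 0.05, n)   # 5 % on the top extrapolation
        + q_bottom * rng.normal(1.0, 0.07, n)   # 7 % on the bottom extrapolation
        + q_edges  * rng.normal(1.0, 0.20, n)   # 20 % on the edge estimates
    )
    lo, med, hi = np.percentile(q_total, [2.5, 50, 97.5])
    print(f"Q = {med:.1f} m^3/s, 95 % interval [{lo:.1f}, {hi:.1f}]")
    ```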

  18. Low Reynolds number wind tunnel measurements - The importance of being earnest

    NASA Technical Reports Server (NTRS)

    Mueller, Thomas J.; Batill, Stephen M.; Brendel, Michael; Perry, Mark L.; Bloch, Diane R.

    1986-01-01

    A method for obtaining two-dimensional aerodynamic force coefficients at low Reynolds numbers using a three-component external platform balance is presented. Regardless of method, however, the importance of understanding the possible influence of the test facility and instrumentation on the final results cannot be overstated. There is an uncertainty in the ability of the facility to simulate a two-dimensional flow environment due to the confinement effect of the wind tunnel and the method used to mount the airfoil. Additionally, the ability of the instrumentation to accurately measure forces and pressures has an associated uncertainty. This paper focuses on efforts taken to understand the errors introduced by the techniques and apparatus used at the University of Notre Dame, and the importance of making an earnest estimate of the uncertainty. Although quantitative estimates of facility-induced errors are difficult to obtain, the uncertainty in measured results can be handled in a straightforward manner and provide the experimentalist, and others, with a basis to evaluate experimental results.

  19. Using a Meniscus to Teach Uncertainty in Measurement

    NASA Astrophysics Data System (ADS)

    Backman, Philip

    2008-02-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know something about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is determined or calculated, it remains as only a number without a concrete physical connection back to the experiment. For the activity described here—presented as a challenge—groups of students are given a container and asked to make certain measurements and to estimate the uncertainty in each of those measurements. They are then challenged to complete a particular task involving the container and a volume of water. Whether the assigned task is actually achievable, however, slowly comes into question once the magnitude of the uncertainties in the original measurements is compared to the specific requirements of the challenge.

  20. Measurement of photon indistinguishability to a quantifiable uncertainty using a Hong-Ou-Mandel interferometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Peter J.; Cheung, Jessica Y.; Chunnilall, Christopher J.

    2010-04-10

    We present a method for using the Hong-Ou-Mandel (HOM) interference technique to quantify photon indistinguishability to within an associated uncertainty. The method allows the relative importance of various experimental factors affecting the HOM visibility to be identified, and enables the actual indistinguishability, with an associated uncertainty, to be estimated from experimentally measured quantities. A measurement equation has been derived that accounts for the non-ideal performance of the interferometer. The origin of each term of the equation is explained, along with procedures for their experimental evaluation and uncertainty estimation. These uncertainties are combined to give an overall uncertainty for the derived photon indistinguishability. The analysis was applied to measurements from an interferometer sourced with photon pairs from a parametric downconversion process. The measured photon indistinguishability was found to be 0.954 ± 0.036 using the prescribed method.

  1. Estimation of uncertainty in pKa values determined by potentiometric titration.

    PubMed

    Koort, Eve; Herodes, Koit; Pihl, Viljar; Leito, Ivo

    2004-06-01

    A procedure is presented for estimation of uncertainty in the measurement of the pK(a) of a weak acid by potentiometric titration. The procedure is based on the ISO GUM. The core of the procedure is a mathematical model that involves 40 input parameters. A novel approach is used for taking into account the purity of the acid: the impurities are not treated as inert compounds only; their possible acidic dissociation is also taken into account. Application to an example of practical pK(a) determination is presented. Altogether 67 different sources of uncertainty are identified and quantified within the example. The relative importance of different uncertainty sources is discussed. The most important source of uncertainty (with the experimental set-up of the example) is the uncertainty of the pH measurement, followed by the accuracy of the burette and the uncertainty of weighing. The procedure gives the uncertainty separately for each point of the titration curve. The uncertainty depends on the amount of titrant added, being lowest in the central part of the titration curve. The possibilities of reducing the uncertainty and of interpreting the drift of the pK(a) values obtained from the same curve are discussed.
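
    A drastically simplified sketch of such an estimation is shown below: instead of the paper's 40-parameter model, only three dominant inputs are propagated by Monte Carlo through the Henderson-Hasselbalch relation, and all numbers are invented:

    ```python
    # Hedged sketch: Monte Carlo propagation of pH and burette-volume
    # uncertainties into pK_a at a single titration point.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50_000
    v_add = rng.normal(5.00, 0.01, n)    # titrant added at this point, mL
    v_eq  = rng.normal(10.00, 0.02, n)   # equivalence volume, mL
    ph    = rng.normal(4.76, 0.02, n)    # measured pH (dominant source)

    # Henderson-Hasselbalch: pH = pK_a + log10(titrated / untitrated fraction)
    pka = ph - np.log10(v_add / (v_eq - v_add))
    print(f"pK_a = {pka.mean():.3f} +/- {pka.std(ddof=1):.3f}")
    ```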

  2. Intolerance of uncertainty, causal uncertainty, causal importance, self-concept clarity and their relations to generalized anxiety disorder.

    PubMed

    Kusec, Andrea; Tallon, Kathleen; Koerner, Naomi

    2016-06-01

    Although numerous studies have provided support for the notion that intolerance of uncertainty plays a key role in pathological worry (the hallmark feature of generalized anxiety disorder (GAD)), other uncertainty-related constructs may also have relevance for the understanding of individuals who engage in pathological worry. Three constructs from the social cognition literature, causal uncertainty, causal importance, and self-concept clarity, were examined in the present study to assess the degree to which these explain unique variance in GAD, over and above intolerance of uncertainty. N = 235 participants completed self-report measures of trait worry, GAD symptoms, and uncertainty-relevant constructs. A subgroup was subsequently classified as low in GAD symptoms (n = 69) or high in GAD symptoms (n = 54) based on validated cut scores on measures of trait worry and GAD symptoms. In logistic regressions, only elevated intolerance of uncertainty and lower self-concept clarity emerged as unique correlates of high (vs. low) GAD symptoms. The possible role of self-concept uncertainty in GAD and the utility of integrating social cognition theories and constructs into clinical research on intolerance of uncertainty are discussed.

  3. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, E; Sisterson, DL

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  4. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    NASA Astrophysics Data System (ADS)

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation of state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of the GUM and its supplements form a versatile toolbox that enables us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamic properties of fluids.
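
    A minimal sketch of GUM Supplement 1 style (Monte Carlo) propagation through a non-linear vapour-pressure model is given below; the Antoine coefficients are textbook values for water, and the input uncertainties are assumptions for illustration:

    ```python
    # Hedged sketch: sample the inputs, push them through the Antoine equation,
    # and summarize the output distribution (GUM Supplement 1 approach).
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000
    A = rng.normal(8.07131, 0.002, n)    # Antoine coefficients close to water;
    B = rng.normal(1730.63, 0.5, n)      # the uncertainties are invented
    C = rng.normal(233.426, 0.2, n)
    T = rng.normal(60.0, 0.05, n)        # temperature, deg C, u(T) = 0.05 K

    p = 10 ** (A - B / (C + T))          # vapour pressure, mmHg
    lo, hi = np.percentile(p, [2.5, 97.5])
    print(f"p = {p.mean():.1f} mmHg, u = {p.std(ddof=1):.2f}, 95 % CI [{lo:.1f}, {hi:.1f}]")
    ```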

  5. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker

  6. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    PubMed Central

    2011-01-01

    Arsenic is a toxic element that creates several problems in human beings, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it gives information about the level of toxicity in the environment so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of the data for such volatile elements depend upon the measurement of the uncertainty of each step involved, from sampling to analysis. Analytical results with quantified uncertainty give a measure of the confidence level of the laboratory concerned. The main objective of this study was therefore to determine the arsenic content of SPM samples with an uncertainty budget and to identify the various potential sources of uncertainty that affect the results. With these facts in mind, we selected seven diverse sites in Delhi (the national capital of India) for quantification of the arsenic content of SPM samples with an uncertainty budget, from sampling by HVS to analysis by Atomic Absorption Spectrometer-Hydride Generator (AAS-HG). Many steps are involved in the measurement of arsenic in SPM samples, from sampling to the final result, and we have considered the various potential sources of uncertainty. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. It has been found that the final results depend mostly on the uncertainty in measurement, mainly due to repeatability, the final volume prepared for analysis, the weighing balance and sampling by HVS. After analysis of the data from the seven diverse sites of Delhi, it was concluded that during the period from 31st Jan. 2008 to 7th Feb. 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m³ at the 95% confidence level (k = 2). PMID:21466671

  7. Evaluation of thermal cameras in quality systems according to ISO 9000 or EN 45000 standards

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof

    2001-03-01

    According to the international standards ISO 9001-9004 and EN 45001-45003, industrial plants and accreditation laboratories that have implemented quality systems based on these standards are required to evaluate the uncertainty of their measurements. Manufacturers of thermal cameras do not offer any data that could enable estimation of the measurement uncertainty of these imagers. Difficulty in determining the measurement uncertainty is an important limitation of thermal cameras for applications in industrial plants and the cooperating accreditation laboratories that have implemented these quality systems. A set of parameters for characterization of commercial thermal cameras, a measuring set, some results of testing these cameras, a mathematical model of uncertainty, and software that enables quick calculation of the uncertainty of temperature measurements with thermal cameras are presented in this paper.

  8. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in using different grouping strategies for uncertainty components. Variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers to formulate policies and strategies.
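
    For orientation, a minimal sketch of the plain variance-based (Sobol') analysis that such a framework builds on is given below, using the standard Ishigami test function; the Bayesian-network grouping of uncertainty components described above is not reproduced here:

    ```python
    # Hedged sketch: first-order Sobol' indices via the Saltelli pick-freeze
    # estimator on the Ishigami test function.
    import numpy as np

    def ishigami(x):
        return np.sin(x[:, 0]) + 7*np.sin(x[:, 1])**2 + 0.1*x[:, 2]**4*np.sin(x[:, 0])

    rng = np.random.default_rng(3)
    n, d = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, (n, d))
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = ishigami(A), ishigami(B)
    var_y = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                        # freeze input i from B
        s1 = np.mean(fB * (ishigami(ABi) - fA)) / var_y
        print(f"first-order S_{i+1} ~= {s1:.2f}")  # known values ~0.31, 0.44, 0.00
    ```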

  9. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision for a pair of non-commuting observables, and hence is considerably nontrivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, are connected by thermal entanglement. We show that the temperature and magnetic field effects can lead to inflation of the measurement uncertainty, stemming from the reduction of the systematic quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system; secondly, the dynamical behaviors of the measurement uncertainty are relatively distinct for ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
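
    For reference, the quantum-memory-assisted entropic uncertainty relation that such studies build on (the Berta et al. form, cited from general knowledge rather than from this record) can be written as:

    ```latex
    % Entropic uncertainty relation with quantum memory B, for observables Q and R
    % measured on subsystem A; c is the maximal overlap of their eigenbases.
    S(Q|B) + S(R|B) \ge \log_2 \frac{1}{c} + S(A|B),
    \qquad c = \max_{i,j} \left| \langle \psi_i | \phi_j \rangle \right|^2
    ```

    Here |ψ_i⟩ and |φ_j⟩ are the eigenstates of Q and R; a more entangled system-memory state (more negative S(A|B)) lowers the bound, which is why the quantum correlations discussed above govern the measurement uncertainty.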

  10. Measurement uncertainty: Friend or foe?

    PubMed

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher-order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements for producing accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples that fulfils the requested performance specifications. It is important that end-users (i.e., clinical laboratories) can know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about the traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher-order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil accreditation standards; rather, it must become a key quality indicator describing both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  11. Shared and unshared exposure measurement error in occupational cohort studies and their effects on statistical inference in proportional hazards models.

    PubMed

    Hoffmann, Sabine; Laurier, Dominique; Rage, Estelle; Guihenneuc, Chantal; Ancelet, Sophie

    2018-01-01

    Exposure measurement error represents one of the most important sources of uncertainty in epidemiology. When exposure uncertainty is not or only poorly accounted for, it can lead to biased risk estimates and a distortion of the shape of the exposure-response relationship. In occupational cohort studies, the time-dependent nature of exposure and changes in the method of exposure assessment may create complex error structures. When a method of group-level exposure assessment is used, individual worker practices and the imprecision of the instrument used to measure the average exposure for a group of workers may give rise to errors that are shared between workers, within workers or both. In contrast to unshared measurement error, the effects of shared errors remain largely unknown. Moreover, exposure uncertainty and magnitude of exposure are typically highest for the earliest years of exposure. We conduct a simulation study based on exposure data of the French cohort of uranium miners to compare the effects of shared and unshared exposure uncertainty on risk estimation and on the shape of the exposure-response curve in proportional hazards models. Our results indicate that uncertainty components shared within workers cause more bias in risk estimation and a more severe attenuation of the exposure-response relationship than unshared exposure uncertainty or exposure uncertainty shared between individuals. These findings underline the importance of careful characterisation and modeling of exposure uncertainty in observational studies.

  12. Shared and unshared exposure measurement error in occupational cohort studies and their effects on statistical inference in proportional hazards models

    PubMed Central

    Laurier, Dominique; Rage, Estelle

    2018-01-01

    Exposure measurement error represents one of the most important sources of uncertainty in epidemiology. When exposure uncertainty is not or only poorly accounted for, it can lead to biased risk estimates and a distortion of the shape of the exposure-response relationship. In occupational cohort studies, the time-dependent nature of exposure and changes in the method of exposure assessment may create complex error structures. When a method of group-level exposure assessment is used, individual worker practices and the imprecision of the instrument used to measure the average exposure for a group of workers may give rise to errors that are shared between workers, within workers or both. In contrast to unshared measurement error, the effects of shared errors remain largely unknown. Moreover, exposure uncertainty and magnitude of exposure are typically highest for the earliest years of exposure. We conduct a simulation study based on exposure data of the French cohort of uranium miners to compare the effects of shared and unshared exposure uncertainty on risk estimation and on the shape of the exposure-response curve in proportional hazards models. Our results indicate that uncertainty components shared within workers cause more bias in risk estimation and a more severe attenuation of the exposure-response relationship than unshared exposure uncertainty or exposure uncertainty shared between individuals. These findings underline the importance of careful characterisation and modeling of exposure uncertainty in observational studies. PMID:29408862

  13. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to Uncertainty Measurements (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates and the calculation of the result measurements. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.

  14. Unveiling the decoherence effect of noise on the entropic uncertainty relation and its control by partially collapsed operations

    NASA Astrophysics Data System (ADS)

    Chen, Min-Nan; Sun, Wen-Yang; Huang, Ai-Jun; Ming, Fei; Wang, Dong; Ye, Liu

    2018-01-01

    In this work, we investigate the dynamics of quantum-memory-assisted entropic uncertainty relations under open systems, and how to steer the uncertainty under different types of decoherence. Specifically, we develop the dynamical behaviors of the uncertainty of interest under two typical categories of noise: bit flipping and depolarizing channels. It has been shown that the measurement uncertainty first increases and then decreases with the growth of the decoherence strength in bit flipping channels. In contrast, the uncertainty monotonically increases with the increase of the decoherence strength in depolarizing channels. Notably, and to a large degree, it is shown that the uncertainty depends on both the systematic quantum correlation and the minimal conditional entropy of the observed subsystem. Moreover, we present a possible physical interpretation for these distinctive behaviors of the uncertainty within such scenarios. Furthermore, we propose a simple and effective strategy to reduce the entropic uncertainty by means of a partially collapsed operation—quantum weak measurement. Therefore, our investigations might offer an insight into the dynamics of the measurement uncertainty under decoherence, and be of importance to quantum precision measurement in open systems.

  15. Uncertainty propagation in q and current profiles derived from motional Stark effect polarimetry on TFTR (abstract)

    NASA Astrophysics Data System (ADS)

    Batha, S. H.; Levinton, F. M.; Bell, M. G.; Wieland, R. M.; Hirschman, S. P.

    1995-01-01

    The magnetic-field pitch-angle profile, γp(R) ≡ arctan(Bpol/Btor), is measured on the TFTR tokamak using a motional Stark effect (MSE) polarimeter. Measured profiles are converted to q profiles with the equilibrium code vmec. Uncertainties in the q profile due to uncertainties in γp(R), magnetics, and kinetic measurements are quantified. Subsequent uncertainties in the vmec-calculated profiles of current density and shear, both of which are important for stability and transport analyses, are also quantified. Examples of circular plasmas under various confinement modes, including the supershot and L mode, will be given.

  16. Traceable measurements of small forces and local mechanical properties

    NASA Astrophysics Data System (ADS)

    Campbellová, Anna; Valtr, Miroslav; Zůda, Jaroslav; Klapetek, Petr

    2011-09-01

    Measurement of local mechanical properties is an important topic in the fields of nanoscale device fabrication, thin film deposition and composite material development. Nanoindentation instruments are commonly used to study hardness and related mechanical properties at the nanoscale. However, traceability and uncertainty aspects of the measurement process are often left aside. In this contribution, the use of a commercial nanoindentation instrument for metrology purposes is discussed. Full instrument traceability, provided using atomic force microscope cantilevers and a mass comparator (normal force), an interferometer (depth) and an atomic force microscope (area function), is described. The uncertainty of the loading/unloading curve measurements is analyzed, and the resulting uncertainties of quantities computed from loading curves, such as hardness or elastic modulus, are studied. For this calculation, a combination of the law of propagation of uncertainty and Monte Carlo uncertainty evaluation is used.
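
    As a minimal example of the propagation-law part (hardness H = P_max / A_c with independent load and contact-area uncertainties; the values are invented, and the analysis described above involves more inputs):

    ```python
    # Hedged sketch: relative-form law of propagation of uncertainty for
    # nanoindentation hardness H = P_max / A_c.
    import math

    P, u_P = 10.0e-3, 0.05e-3     # max load, N, and its standard uncertainty
    A, u_A = 1.2e-12, 0.06e-12    # contact area, m^2 (from the area function)

    H = P / A
    u_H = H * math.sqrt((u_P / P)**2 + (u_A / A)**2)
    print(f"H = {H/1e9:.2f} GPa +/- {u_H/1e9:.2f} GPa")
    ```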

  17. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    PubMed

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for redefining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. Both the measurement and the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for concentrations of creatinine below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
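
    In practice, the Nordtest combination described above reduces to adding two variance components in quadrature and applying a coverage factor. The following is a minimal Python sketch of that arithmetic; the component values are illustrative placeholders, not figures from the study:

        import math

        # Nordtest-style combination of two main uncertainty components (umol/L).
        # Both values are illustrative, not taken from the paper.
        u_Rw = 4.5     # within-laboratory reproducibility, from internal QC
        u_bias = 3.8   # uncertainty of the bias component, from external QC

        u_c = math.sqrt(u_Rw**2 + u_bias**2)  # combined standard uncertainty
        U = 2 * u_c                           # expanded uncertainty, coverage factor k = 2
        print(f"u_c = {u_c:.1f} umol/L, expanded U = {U:.1f} umol/L")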

  18. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    PubMed

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT, both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses (cognitive, emotional, and behavioral) provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth, specificity, and precision to efforts to conceptualize and measure UT.

  19. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle achieves lower uncertainties or whether both can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
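
    In compact form, the scaling reported above can be stated as a single relation; here P_s denotes the scattered light power and C an unspecified setup-dependent constant (the abstract gives only the proportionalities):

        \sigma_v \propto \frac{|v|^{3/2}}{\sqrt{P_s}}, \qquad \text{i.e.} \quad \sigma_v = C\,|v|^{3/2}\,P_s^{-1/2}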

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bamberger, Judith A.; Piepel, Gregory F.; Enderlin, Carl W.

    Understanding how uncertainty manifests itself in complex experiments is important for developing the testing protocol and interpreting the experimental results. This paper describes experimental and measurement uncertainties, and how they can depend on the order of performing experimental tests. Experiments with pulse-jet mixers in tanks at three scales were conducted to characterize the performance of transient-developing periodic flows in Newtonian slurries. Other test parameters included the simulant, solids concentration, and nozzle exit velocity. Critical suspension velocity and cloud height were the metrics used to characterize Newtonian slurry flow associated with mobilization and mixing. During testing, near-replicate and near-repeat tests were conducted. The experimental results were used to quantify the combined experimental and measurement uncertainties using standard deviations and percent relative standard deviations (%RSD). The uncertainties in critical suspension velocity and cloud height tend to increase with the values of these responses. Hence, the %RSD values are the more appropriate summary measure of near-replicate testing and measurement uncertainty.
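
    The %RSD summary statistic named above is straightforward to reproduce. A minimal Python sketch over hypothetical near-replicate results (values invented for illustration):

        import statistics

        # critical suspension velocities from near-replicate tests (illustrative, m/s)
        replicates = [4.89, 5.02, 4.95, 5.10]

        mean = statistics.mean(replicates)
        s = statistics.stdev(replicates)   # sample standard deviation
        rsd = 100 * s / mean               # percent relative standard deviation (%RSD)
        print(f"mean = {mean:.2f} m/s, s = {s:.2f} m/s, %RSD = {rsd:.1f}%")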

  1. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    PubMed

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

    This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m³ min⁻¹ (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min⁻¹ (0.6 CFM) and the EPA TSP Sampler at nominal volumetric flow rates of 1.1 and 1.7 m³ min⁻¹ (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1×10⁻⁶ g m⁻³ to 18.0×10⁻⁶ g m⁻³, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing, and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. This work addresses measurement uncertainty of TSP samplers used in ambient conditions. Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate matter (PM) concentrations approach regulatory limits, the uncertainty of the measurement is essential in determining the sample size and the probability of type II errors in hypothesis testing. This is an important factor in determining if ambient PM concentrations exceed regulatory limits. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically.
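
    The propagation-of-uncertainty method referred to above follows the usual GUM first-order form; for a gravimetric concentration modeled as C = Δm/(Q·t), the relative variances of the inputs add in quadrature. A hedged Python sketch with invented example values (the actual measurement equation in the paper has more terms):

        import math

        # C = dm / (Q * t): filter mass gain dm, volumetric flow Q, sampling time t.
        # All values and uncertainties below are illustrative, not from the study.
        dm, u_dm = 2.5e-3, 0.05e-3   # g
        Q,  u_Q  = 1.42,  0.03       # m^3/min
        t,  u_t  = 1440.0, 1.0       # min

        C = dm / (Q * t)
        # first-order GUM propagation for a pure product/quotient model
        u_C = C * math.sqrt((u_dm/dm)**2 + (u_Q/Q)**2 + (u_t/t)**2)
        print(f"C = {C:.3e} g/m^3, u(C) = {u_C:.3e} g/m^3 ({100*u_C/C:.1f}%)")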

  2. Uncertainty in cloud optical depth estimates made from satellite radiance measurements

    NASA Technical Reports Server (NTRS)

    Pincus, Robert; Szczodrak, Malgorzata; Gu, Jiujing; Austin, Philip

    1995-01-01

    The uncertainty in optical depths retrieved from satellite measurements of visible wavelength radiance at the top of the atmosphere is quantified. Techniques are briefly reviewed for the estimation of optical depth from measurements of radiance, and it is noted that these estimates are always more uncertain at greater optical depths and larger solar zenith angles. The lack of radiometric calibration for visible wavelength imagers on operational satellites dominates the uncertainty in retrievals of optical depth. This is true both for single-pixel retrievals and for statistics calculated from a population of individual retrievals. For individual estimates or small samples, sensor discretization can also be significant, but the sensitivity of the retrieval to the specification of the model atmosphere is less important. The relative uncertainty in calibration affects the accuracy with which optical depth distributions measured by different sensors may be quantitatively compared, while the absolute calibration uncertainty, acting through the nonlinear mapping of radiance to optical depth, limits the degree to which distributions measured by the same sensor may be distinguished.

  3. The role of the uncertainty of measurement of serum creatinine concentrations in the diagnosis of acute kidney injury.

    PubMed

    Kin Tekce, Buket; Tekce, Hikmet; Aktas, Gulali; Uyeturk, Ugur

    2016-01-01

    Uncertainty of measurement is the numeric expression of the errors associated with all measurements taken in clinical laboratories. Serum creatinine concentration is the most common diagnostic marker for acute kidney injury. The goal of this study was to determine the effect of the uncertainty of measurement of serum creatinine concentrations on the diagnosis of acute kidney injury. We calculated the uncertainty of measurement of serum creatinine according to the Nordtest Guide. Retrospectively, we identified 289 patients who were evaluated for acute kidney injury. Of the total patient pool, 233 were diagnosed with acute kidney injury using the AKIN classification scheme and then were compared using statistical analysis. We determined nine probabilities of the uncertainty of measurement of serum creatinine concentrations. There was a statistically significant difference in the number of patients diagnosed with acute kidney injury when uncertainty of measurement was taken into consideration (first probability compared with the fifth, p = 0.023; first compared with the ninth, p = 0.012). We found that the uncertainty of measurement for serum creatinine concentrations was an important factor for correctly diagnosing acute kidney injury. In addition, based on the AKIN classification scheme, minimizing the total allowable error levels for serum creatinine concentrations is necessary for the accurate diagnosis of acute kidney injury by clinicians.
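
    One way to see why this matters is to check whether a measured creatinine rise still clears the diagnostic threshold once the expanded uncertainty is applied. A sketch under stated assumptions: the 26.5 μmol/L rise corresponds to the usual AKIN stage-1 criterion (0.3 mg/dL), and the uncertainty value is purely illustrative:

        # AKIN stage 1 flags an absolute creatinine rise of >= 26.5 umol/L (0.3 mg/dL).
        AKIN_RISE = 26.5   # umol/L
        U = 9.0            # expanded uncertainty of a creatinine difference (illustrative)

        def aki_decision(baseline, current):
            rise = current - baseline
            if rise - U >= AKIN_RISE:
                return "AKI (robust to measurement uncertainty)"
            if rise + U < AKIN_RISE:
                return "no AKI (robust to measurement uncertainty)"
            return "indeterminate: the decision flips within the uncertainty interval"

        print(aki_decision(70.0, 101.0))   # rise = 31 umol/L -> indeterminate at U = 9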

  4. Evaluation of thyroid radioactivity measurement data from Hanford workers, 1944--1946

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ikenberry, T.A.

    1991-05-01

    This report describes the preliminary results of an evaluation conducted in support of the Hanford Environmental Dose Reconstruction (HEDR) Project. The primary objective of the HEDR Project is to estimate the radiation doses that populations could have received from nuclear operations at the Hanford Site since 1944. A secondary objective is to make the information that HEDR staff members used in estimating radiation doses available to the public. The objectives of this report are to make available thyroid measurement data from Hanford workers for the years 1944 through 1946, and to investigate the suitability of those data for use in the HEDR dose estimation process. An important part of this investigation was to provide a description of the uncertainty associated with the data. Lack of documentation on thyroid measurements from this period required that assumptions be made to perform data evaluations. These assumptions introduce uncertainty into the evaluations that could be significant. It is important to recognize the nature of these assumptions, the inherent uncertainty, and the propagation of this uncertainty through data evaluations to any conclusions that can be made by using the data. 15 refs., 1 fig., 5 tabs.

  5. Measurement of absolute gamma emission probabilities

    NASA Astrophysics Data System (ADS)

    Sumithrarachchi, Chandana S.; Rengan, Krish; Griffin, Henry C.

    2003-06-01

    The energies and emission probabilities (intensities) of gamma-rays emitted in radioactive decays of particular nuclides are the most important characteristics by which to quantify mixtures of radionuclides. Often, quantification is limited by uncertainties in measured intensities. A technique was developed to reduce these uncertainties. The method involves obtaining a pure sample of a nuclide using radiochemical techniques, and using appropriate fractions for beta and gamma measurements. The beta emission rates were measured using a liquid scintillation counter, and the gamma emission rates were measured with a high-purity germanium detector. Results were combined to obtain absolute gamma emission probabilities. All sources of uncertainty greater than 0.1% were examined. The method was tested with ³⁸Cl and ⁸⁸Rb.

  6. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel

    2017-01-01

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. The sensitivity of the single-/multi-Doppler measurement uncertainties to the averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  7. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    NASA Astrophysics Data System (ADS)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable to international standards for metre-sized assemblies, in the range of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m long respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level), respectively.

  8. Measurement time and statistics for a noise thermometer with a synthetic-noise reference

    NASA Astrophysics Data System (ADS)

    White, D. R.; Benz, S. P.; Labenski, J. R.; Nam, S. W.; Qu, J. F.; Rogalla, H.; Tew, W. L.

    2008-08-01

    This paper describes methods for reducing the statistical uncertainty in measurements made by noise thermometers using digital cross-correlators and, in particular, for thermometers using pseudo-random noise for the reference signal. First, a discrete-frequency expression for the correlation bandwidth for conventional noise thermometers is derived. It is shown how an alternative frequency-domain computation can be used to eliminate the spectral response of the correlator and increase the correlation bandwidth. The corresponding expressions for the uncertainty in the measurement of pseudo-random noise in the presence of uncorrelated thermal noise are then derived. The measurement uncertainty in this case is less than that for true thermal-noise measurements. For pseudo-random sources generating a frequency comb, an additional small reduction in uncertainty is possible, but at the cost of increasing the thermometer's sensitivity to non-linearity errors. A procedure is described for allocating integration times to further reduce the total uncertainty in temperature measurements. Finally, an important systematic error arising from the calculation of ratios of statistical variables is described.

  9. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    PubMed

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
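
    At its core, the adapted Shannon-Wiener measure is the entropy of the distribution of counted causes of death. A minimal Python sketch with invented counts (not the Chilean data):

        import math

        # deaths per registered cause (illustrative counts)
        counts = [120, 75, 40, 30, 15]
        total = sum(counts)

        # Shannon entropy H = -sum(p_i * log2(p_i)), in bits
        H = -sum((c / total) * math.log2(c / total) for c in counts if c > 0)
        print(f"H = {H:.3f} bits (maximum {math.log2(len(counts)):.3f} bits for {len(counts)} causes)")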

  10. Uncertainty for calculating transport on Titan: A probabilistic description of bimolecular diffusion parameters

    NASA Astrophysics Data System (ADS)

    Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.

    2015-11-01

    Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of uncertainty in this parameter on models. Because temperature and pressure conditions are much lower than the laboratory conditions in which the bimolecular diffusion parameters were measured, we apply a Bayesian framework, which is problem-agnostic, to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to the temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate, with uncertainty due to bimolecular diffusion, of the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.

  11. Decoherence effect on quantum-memory-assisted entropic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-01-01

    The uncertainty principle provides a bound on the precision with which measurements of any two incompatible observables can be predicted, and thereby plays a nontrivial role in quantum precision measurement. In this work, we observe the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noises. Herein, we derive the dynamical evolution of the entropic uncertainty with respect to the measurements affected by the canonical GAD noises when particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in the frame of three realistic scenarios: one in which particle A is affected by the environmental (GAD) noise while particle B, serving as quantum memory, is free from any noise; another in which particle B is affected by the external noise while particle A is not; and a last one in which both particles suffer from the noise. By analytical methods, it turns out that the uncertainty is not fully determined by the quantum-correlation evolution of the composite system consisting of A and B, but by the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation for the behavior of the uncertainty evolution by means of the mixedness of the observed system; we argue that the uncertainty might be dramatically correlated with the system's mixedness. Furthermore, we put forward a simple and effective strategy to reduce the measurement uncertainty of interest via a partially collapsed quantum measurement. Therefore, our explorations might offer an insight into the dynamics of the entropic uncertainty relation in a realistic system, and be of importance to quantum precision measurement during quantum information processing.
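
    For reference, the quantum-memory-assisted entropic uncertainty relation of Berta et al., on which these dynamical studies build, can be written as:

        S(X|B) + S(Z|B) \ge \log_2 \frac{1}{c} + S(A|B),
        \qquad c = \max_{i,j} \left| \langle x_i | z_j \rangle \right|^2

    where S(X|B) and S(Z|B) are the conditional entropies of the measurement outcomes given the quantum memory B, c quantifies the complementarity of the observables X and Z, and S(A|B) is the conditional von Neumann entropy of the system given the memory.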

  12. Quantum-memory-assisted entropic uncertainty relation in a Heisenberg XYZ chain with an inhomogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu

    2017-06-01

    The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements of a pair of incompatible observables in a quantum system. It is therefore of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by a thermal entanglement in the spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement due to the reduction of the thermal entanglement; explicitly, higher temperature, a stronger magnetic field or a larger inhomogeneity of the field results in inflation of the uncertainty. Besides, it is found that there exist distinct dynamical behaviors of the uncertainty for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we also verify that the measurement uncertainty is dramatically anti-correlated with the purity of the bipartite spin system: greater purity results in a reduction of the measurement uncertainty, and vice versa. Therefore, our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in versatile systems, particularly solid states.

  13. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    DOE PAGES

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; ...

    2017-01-23

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. The sensitivity of the single-/multi-Doppler measurement uncertainties to the averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. Lastly, it was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  14. Photomask applications of traceable atomic force microscope dimensional metrology at NIST

    NASA Astrophysics Data System (ADS)

    Dixson, Ronald; Orji, Ndubuisi G.; Potzick, James; Fu, Joseph; Allen, Richard A.; Cresswell, Michael; Smith, Stewart; Walton, Anthony J.; Tsiamis, Andreas

    2007-10-01

    The National Institute of Standards and Technology (NIST) has a multifaceted program in atomic force microscope (AFM) dimensional metrology. Three major instruments are being used for traceable measurements. The first is a custom in-house metrology AFM, called the calibrated AFM (C-AFM), the second is the first generation of commercially available critical dimension AFM (CD-AFM), and the third is a current generation CD-AFM at SEMATECH - for which NIST has established the calibration and uncertainties. All of these instruments have useful applications in photomask metrology. Linewidth reference metrology is an important application of CD-AFM. We have performed a preliminary comparison of linewidths measured by CD-AFM and by electrical resistance metrology on a binary mask. For the ten selected test structures with on-mask linewidths between 350 nm and 600 nm, most of the observed differences were less than 5 nm, and all of them were less than 10 nm. The offsets were often within the estimated uncertainties of the AFM measurements, without accounting for the effect of linewidth roughness or the uncertainties of electrical measurements. The most recent release of the NIST photomask standard - which is Standard Reference Material (SRM) 2059 - was also supported by CD-AFM reference measurements. We review the recent advances in AFM linewidth metrology that will reduce the uncertainty of AFM measurements on this and future generations of the NIST photomask standard. The NIST C-AFM has displacement metrology for all three axes traceable to the 633 nm wavelength of the iodine-stabilized He-Ne laser. One of the important applications of the C-AFM is step height metrology, which has some relevance to phase shift calibration. In the current generation of the system, the approximate level of relative standard uncertainty for step height measurements at the 100 nm scale is 0.1 %. We discuss the monitor history of a 290 nm step height, originally measured on the C-AFM with a 1.9 nm (k = 2) expanded uncertainty, and describe advances that bring the step height uncertainty of recent measurements to an estimated 0.6 nm (k = 2). Based on this work, we expect to be able to reduce the topographic component of phase uncertainty in alternating aperture phase shift masks (AAPSM) by a factor of three compared to current calibrations based on earlier generation step height references.

  15. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    PubMed

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. Essential to a proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its superseded document, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach to establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in the "Bottom-Up" approach adopted by us are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at the 99% one-tailed confidence level by multiplying the standard uncertainty with an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, are presented.
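
    The compliance decision rule at the end of this approach can be written out directly. A minimal Python sketch: the 30% screening rule and the one-tailed expanded uncertainty follow the description above, while the numbers themselves and the normal-distribution coverage factor are illustrative assumptions:

        import math

        threshold = 2.0   # threshold concentration (illustrative units)
        k = 2.33          # coverage factor for ~99% one-tailed confidence (normality assumed)

        # standard uncertainty components from the cause-and-effect analysis (illustrative)
        components = [0.040, 0.015, 0.008, 0.030]

        # keep only components larger than 30% of the largest one, per the protocol
        largest = max(components)
        kept = [u for u in components if u > 0.3 * largest]

        u_c = math.sqrt(sum(u * u for u in kept))  # combined standard uncertainty
        U = k * u_c                                # expanded uncertainty

        measured = 2.13
        print("positive" if measured > threshold + U else "not positive")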

  16. Uncertainty Evaluation of the New Setup for Measurement of Water-Vapor Permeation Rate by a Dew-Point Sensor

    NASA Astrophysics Data System (ADS)

    Hudoklin, D.; Šetina, J.; Drnovšek, J.

    2012-09-01

    The measurement of the water-vapor permeation rate (WVPR) through materials is very important in many industrial applications such as the development of new fabrics and construction materials, in the semiconductor industry, packaging, vacuum techniques, etc. The demand for this kind of measurement is growing considerably, and thus many different methods for measuring the WVPR have been developed and standardized within numerous national and international standards. However, comparison of existing methods shows a low level of mutual agreement. The objective of this paper is to demonstrate the necessary uncertainty evaluation for WVPR measurements, so as to provide a basis for development of a corresponding reference measurement standard. This paper presents a specially developed measurement setup, which employs a precision dew-point sensor for WVPR measurements on specimens of different shapes. The paper also presents a physical model which tries to account for both dynamic and quasi-static methods, the common types of WVPR measurements referred to in standards and scientific publications. An uncertainty evaluation carried out according to the ISO/IEC guide to the expression of uncertainty in measurement (GUM) shows the relative expanded (k = 2) uncertainty to be 3.0% for a WVPR of 6.71 mg·h⁻¹ (corresponding to a permeance of 30.4 mg·m⁻²·day⁻¹·hPa⁻¹).

  17. Model uncertainties of local-thermodynamic-equilibrium K-shell spectroscopy

    NASA Astrophysics Data System (ADS)

    Nagayama, T.; Bailey, J. E.; Mancini, R. C.; Iglesias, C. A.; Hansen, S. B.; Blancard, C.; Chung, H. K.; Colgan, J.; Cosse, Ph.; Faussurier, G.; Florido, R.; Fontes, C. J.; Gilleron, F.; Golovkin, I. E.; Kilcrease, D. P.; Loisel, G.; MacFarlane, J. J.; Pain, J.-C.; Rochau, G. A.; Sherrill, M. E.; Lee, R. W.

    2016-09-01

    Local-thermodynamic-equilibrium (LTE) K-shell spectroscopy is a common tool to diagnose electron density, ne, and electron temperature, Te, of high-energy-density (HED) plasmas. Knowing the accuracy of such diagnostics is important to provide quantitative conclusions of many HED-plasma research efforts. For example, Fe opacities were recently measured at multiple conditions at the Sandia National Laboratories Z machine (Bailey et al., 2015), showing significant disagreement with modeled opacities. Since the plasma conditions were measured using K-shell spectroscopy of tracer Mg (Nagayama et al., 2014), one concern is the accuracy of the inferred Fe conditions. In this article, we investigate the K-shell spectroscopy model uncertainties by analyzing the Mg spectra computed with 11 different models at the same conditions. We find that the inferred conditions differ by ±20-30% in ne and ±2-4% in Te depending on the choice of spectral model. Also, we find that half of the Te uncertainty comes from ne uncertainty. To refine the accuracy of the K-shell spectroscopy, it is important to scrutinize and experimentally validate line-shape theory. We investigate the impact of the inferred ne and Te model uncertainty on the Fe opacity measurements. Its impact is small and does not explain the reported discrepancies.

  18. Propagation of stage measurement uncertainties to streamflow time series

    NASA Astrophysics Data System (ADS)

    Horner, Ivan; Le Coz, Jérôme; Renard, Benjamin; Branger, Flora; McMillan, Hilary

    2016-04-01

    Streamflow uncertainties due to stage measurement errors are generally overlooked in the promising probabilistic approaches that have emerged in the last decade. We introduce an original error model for propagating stage uncertainties through a stage-discharge rating curve within a Bayesian probabilistic framework. The method takes into account both rating curve errors (parametric and structural) and stage errors (systematic and non-systematic). Practical ways to estimate the different types of stage errors are also presented: (1) non-systematic errors due to instrument resolution and precision and to non-stationary waves, and (2) systematic errors due to gauge calibration against the staff gauge. The method is illustrated at a site where the rating-curve-derived streamflow can be compared with an accurate streamflow reference. The agreement between the two time series is overall satisfactory. Moreover, the quantification of uncertainty is also satisfactory, since the streamflow reference is compatible with the streamflow uncertainty intervals derived from the rating curve and the stage uncertainties. Illustrations from other sites are also presented. Results contrast markedly depending on the site features. In some cases, streamflow uncertainty is mainly due to stage measurement errors. The results also show the importance of discriminating systematic and non-systematic stage errors, especially for long-term flow averages. Finally, perspectives for improving and validating the streamflow uncertainty estimates are discussed.
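
    Although the paper works in a full Bayesian framework, the core idea of propagating stage errors through a rating curve can be sketched with a simple Monte Carlo over a power-law curve Q = a(h - b)^c. All parameter and error values below are illustrative assumptions, not the paper's:

        import random
        import statistics

        a, b, c = 15.0, 0.20, 1.7   # power-law rating curve Q = a * (h - b)**c (illustrative)
        h = 1.35                    # observed stage, m
        sigma_ns = 0.005            # non-systematic stage error (resolution, waves), m
        sigma_sys = 0.010           # systematic stage error (gauge calibration), m

        random.seed(1)
        flows = []
        for _ in range(10_000):
            e_sys = random.gauss(0.0, sigma_sys)  # shared across a whole series in practice
            e_ns = random.gauss(0.0, sigma_ns)    # redrawn at every time step
            flows.append(a * (h + e_sys + e_ns - b) ** c)

        print(f"Q = {statistics.mean(flows):.2f} m^3/s, "
              f"u(Q) from stage errors = {statistics.stdev(flows):.2f} m^3/s")

    The comments mark the distinction the abstract stresses for long-term averages: a systematic gauge error is drawn once per series rather than per time step, so it does not average out in long-term flow means, while non-systematic errors largely do.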

  19. Accounting for Berkson and Classical Measurement Error in Radon Exposure Using a Bayesian Structural Approach in the Analysis of Lung Cancer Mortality in the French Cohort of Uranium Miners.

    PubMed

    Hoffmann, Sabine; Rage, Estelle; Laurier, Dominique; Laroche, Pierre; Guihenneuc, Chantal; Ancelet, Sophie

    2017-02-01

    Many occupational cohort studies on underground miners have demonstrated that radon exposure is associated with an increased risk of lung cancer mortality. However, despite the deleterious consequences of exposure measurement error on statistical inference, these analyses traditionally do not account for exposure uncertainty. This might be due to the challenging nature of measurement error resulting from imperfect surrogate measures of radon exposure. Indeed, we are typically faced with exposure uncertainty in a time-varying exposure variable where both the type and the magnitude of error may depend on period of exposure. To address the challenge of accounting for multiplicative and heteroscedastic measurement error that may be of Berkson or classical nature, depending on the year of exposure, we opted for a Bayesian structural approach, which is arguably the most flexible method to account for uncertainty in exposure assessment. We assessed the association between occupational radon exposure and lung cancer mortality in the French cohort of uranium miners and found the impact of uncorrelated multiplicative measurement error to be of marginal importance. However, our findings indicate that the retrospective nature of exposure assessment that occurred in the earliest years of mining of this cohort as well as many other cohorts of underground miners might lead to an attenuation of the exposure-risk relationship. More research is needed to address further uncertainties in the calculation of lung dose, since this step will likely introduce important sources of shared uncertainty.

  20. The uncertainty in the radon hazard classification of areas as a function of the number of measurements.

    PubMed

    Friedmann, H; Baumgartner, A; Gruber, V; Kaineder, H; Maringer, F J; Ringer, W; Seidel, C

    2017-07-01

    The administrations of many countries demand a classification of areas concerning their radon risk, taking into account the requirements of the EU Basic Safety Standards. The wide variation of indoor radon concentrations in an area, which is caused by differences in house construction, living style and geological situation, introduces large uncertainties into any classification scheme. Therefore, it is important to estimate the size of the experimental coefficient of variation (relative standard deviation) of the parameter used to classify an area. Besides the time period of measurement, it is the number of measurements that strongly influences this uncertainty, and it is important to find a compromise between the economic possibilities and the needed confidence level. Some countries do not use pure measurement results for the classification of areas but use derived quantities, usually called radon potential, which should reduce the influence of house construction, living style etc. and should rather represent the geological situation of an area. Here, indoor radon measurements in nearly all homes in three municipalities and their conversion into a radon potential were used to determine the uncertainty of the mean radon potential of an area as a function of the number of investigated homes. It could be shown that the coefficient of variation scales like 1/√n, with n the number of measured dwellings. The question of how to deal with uncertainties when using a classification scheme for the radon risk is discussed and a general procedure is proposed.
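
    The reported scaling is just the standard error of a mean: if individual dwelling values have coefficient of variation CV, the mean over n dwellings has CV/√n. A minimal Python sketch with an illustrative CV:

        import math

        cv_single = 0.60   # CV of individual dwelling radon potentials (illustrative)

        for n in (10, 50, 100, 400):
            cv_mean = cv_single / math.sqrt(n)   # CV of the area mean scales like 1/sqrt(n)
            print(f"n = {n:4d}: CV of area mean ~ {100 * cv_mean:.1f}%")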

  1. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE PAGES

    Ilas, Germina; Liljenfeldt, Henrik

    2017-05-19

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  2. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilas, Germina; Liljenfeldt, Henrik

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  3. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so, due regard was given to the provisions of ISO 17025, and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  4. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, is meant to supplement the presentation given at this conference.

  5. Dynamic rating curve assessment for hydrometric stations and computation of the associated uncertainties: Quality and station management indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Jalbert, Jonathan

    2014-09-01

    A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curves, the quality of fit of the curve to these measurements and the constant changes in the river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the “age” of the rating curve. The level of uncertainty at a given point in time is therefore particularly difficult to assess. A “dynamic” method has been developed to compute rating curves while calculating associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them. The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of gaugings and the increase of the uncertainty of discharge estimates with the age of the rating curve, computed with a variographic analysis (Jalbert et al., 2011). The presented dynamic method can answer important questions in the field of hydrometry such as “How many gaugings a year are required to produce streamflow data with an average uncertainty of X%?” and “When and in what range of water flow rates should these gaugings be carried out?”. The Rocherousse hydrometric station (France, Haute-Durance watershed, 946 km²) is used as an example throughout the paper. Other stations are used to illustrate certain points.

  6. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound on the predictability of measurement outcomes for a couple of incompatible observables. In this work, we develop the dynamical features of quantum-memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolution of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It has been found that a larger coupling strength J of the ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains can effectively degrade the measurement uncertainty. Besides, it turns out that higher temperature can induce inflation of the uncertainty, because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field |B|, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is dramatically associated with the uncertainty. Remarkably, we put forward a possible physical interpretation to explain the evolutionary phenomenon of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on the entropic uncertainty under the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state-based quantum information processing.

  7. Measurement uncertainty budget of an interferometric flow velocity sensor

    NASA Astrophysics Data System (ADS)

    Bermuske, Mike; Büttner, Lars; Czarske, Jürgen

    2017-06-01

    Flow rate measurements are a common topic for process monitoring in chemical engineering and the food industry. To achieve the requested low uncertainties of 0.1% for flow rate measurements, a precise measurement of the shear layers of such flows is necessary. The Laser Doppler Velocimeter (LDV) is an established method for measuring local flow velocities. For an exact estimation of the flow rate, the flow profile in the shear layer is of importance. For standard LDV, the axial resolution, and therefore the number of measurement points in the shear layer, is defined by the length of the measurement volume. A decrease of this length is accompanied by a larger fringe distance variation along the measurement axis, which results in a rise of the measurement uncertainty for the flow velocity (uncertainty relation between spatial resolution and velocity uncertainty). As a unique advantage, the laser Doppler profile sensor (LDV-PS) overcomes this problem by using two fan-like fringe systems to obtain the position of the measured particles along the measurement axis, and therefore achieves a high spatial resolution while still offering a low velocity uncertainty. With this technique, the flow rate can be estimated with one order of magnitude lower uncertainty, down to 0.05% statistical uncertainty, and flow profiles, especially in film flows, can be measured more accurately. The problem with this technique is that, in contrast to laboratory setups where the system is quite stable, for industrial applications the sensor needs a reliable and robust traceability to the SI units, meter and second. Small deviations in the calibration can, because of the highly position-dependent calibration function, cause large systematic errors in the measurement result. Therefore, a simple, stable and accurate tool is needed that can easily be used in industrial surroundings to check or recalibrate the sensor. In this work, different calibration methods are presented and their influence on the measurement uncertainty budget of the sensor is discussed. Finally, measurement results for the film flow of an impinging jet cleaning experiment are presented.
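
    The uncertainty budget rests on the basic LDV relation v = d·f_D (fringe spacing times Doppler frequency), so the relative errors of the calibrated fringe spacing and of the frequency estimate add in quadrature. A hedged Python sketch with invented numbers, which also illustrates why the calibration of d tends to dominate:

        import math

        f_D, u_fD = 1.25e6, 0.5e3    # Doppler frequency and its uncertainty, Hz (illustrative)
        d,   u_d  = 4.0e-6, 8.0e-9   # fringe spacing and calibration uncertainty, m (illustrative)

        v = d * f_D                  # LDV equation: velocity = fringe spacing * Doppler frequency
        u_v = v * math.sqrt((u_d / d)**2 + (u_fD / f_D)**2)
        print(f"v = {v:.3f} m/s, u(v) = {u_v:.4f} m/s ({100 * u_v / v:.2f}%)")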

  8. Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models

    NASA Astrophysics Data System (ADS)

    Wellmann, J. Florian; Regenauer-Lieb, Klaus

    2012-03-01

    Analyzing, visualizing and communicating uncertainties are important issues, as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 1950s and assigns to every location in the model a scalar measure of predictability. We show that this method not only provides quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how is the model quality interconnected with the input data used) and model evolution (i.e. does new data - or a changed geological hypothesis - improve the model). In other words, information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; and (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to quantitatively classify the indefiniteness of single units and the mean entropy of a model. Due to the relationship of this measure to the missing information, we expect the method to have great potential in many types of geoscientific data assimilation problems — beyond pure visualization.
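
    A minimal sketch (not the authors' code) of the per-location entropy computation described above, for an ensemble of simulated categorical models; the grid shape, unit count and random ensemble are hypothetical:

    ```python
    # Compute an information-entropy map from an ensemble of geological models.
    # `models` holds categorical unit IDs, shape (n_realizations, nx, ny, nz).
    import numpy as np

    def entropy_map(models):
        """Per-cell Shannon entropy (bits) over an ensemble of unit assignments."""
        n = models.shape[0]
        units = np.unique(models)
        # Probability of each geological unit at every cell, estimated by frequency.
        probs = np.stack([(models == u).sum(axis=0) / n for u in units])
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(probs > 0, -probs * np.log2(probs), 0.0)
        return terms.sum(axis=0)  # 0 where one unit is certain, max = log2(#units)

    # Example: 200 realizations of a 3-unit model on a small grid.
    rng = np.random.default_rng(0)
    ensemble = rng.integers(0, 3, size=(200, 10, 10, 5))
    H = entropy_map(ensemble)
    print(H.mean())  # mean entropy of the whole model, one overall measure
    ```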

  9. Assessment of Airborne Instrument Uncertainty via Measurement Comparisons Conducted During the DC3 and SEAC4RS Field Campaigns

    NASA Astrophysics Data System (ADS)

    Silverman, M. L.; Chen, G.; Shook, M.

    2016-12-01

    Airborne field campaigns have long recognized the importance of well-defined measurement uncertainties and their impact on scientific research. Measurement comparisons are an effective way to assess the uncertainty of different techniques as well as to gain insight into instrument performance. As part of the NASA DC3 (Deep Convective Clouds and Chemistry) airborne field campaign, there were several wing-tip-to-wing-tip formation flight segments designed for measurement comparison of the instruments onboard the NASA DC-8 and NSF/NCAR Gulfstream-V aircraft. These flights provide the opportunity to evaluate the consistency between multiple measurements of the same species/parameters on different platforms and based on different measurement techniques. The DC-8 aircraft was also instrumented with duplicate measurements of the same species, allowing for intraplatform comparisons. The NASA DC-8 was also used during the NASA SEAC4RS (Studies of Emission and Atmospheric Composition, Clouds, and Climate Coupling by Regional Surveys) airborne field campaign. While no wing-tip-to-wing-tip flights were flown, several instruments measuring the same species were aboard the DC-8, providing intraplatform comparisons. Time series and correlations are produced to show the relative agreement between the measurements, both on a daily basis and over the course of the five inter-comparison days. We have also used a data-driven approach to analyze the instrument precisions as an important part of the measurement uncertainty assessment. By conducting these analyses, we provide users with insight into the quality of the measurements.

  10. Uncertainty propagation in the calibration equations for NTC thermistors

    NASA Astrophysics Data System (ADS)

    Liu, Guang; Guo, Liang; Liu, Chunlong; Wu, Qingwen

    2018-06-01

    The uncertainty propagation problem is quite important for temperature measurements, since we rely so much on the sensors and calibration equations. Although uncertainty propagation for platinum resistance or radiation thermometers is well known, there have been few publications concerning negative temperature coefficient (NTC) thermistors. Insight into the propagation characteristics of uncertainty that develop when equations are determined using the Lagrange interpolation or least-squares fitting method is presented here with respect to several of the most common equations used in NTC thermistor calibration. Within this work, analytical expressions of the propagated uncertainties for both fitting methods are derived for the uncertainties in the measured temperature and resistance at each calibration point. High-precision calibration of an NTC thermistor in a precision water bath was performed by means of the comparison method. Results show that, for both fitting methods, the propagated uncertainty is flat in the interpolation region but rises rapidly beyond the calibration range. Also, for temperatures interpolated between calibration points, the propagated uncertainty is generally no greater than that associated with the calibration points. For least-squares fitting, the propagated uncertainty is significantly reduced by increasing the number of calibration points and can be well kept below the uncertainty of the calibration points.
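
    As a hedged illustration of the least-squares branch of such an analysis (not the paper's code), the sketch below fits the Steinhart-Hart equation, one of the most common NTC calibration equations, and propagates the calibration-point uncertainty into a predicted temperature; the resistance values are hypothetical for a nominal 10 kΩ thermistor:

    ```python
    # Least-squares fit of the Steinhart-Hart equation
    # 1/T = a + b*ln(R) + c*ln(R)**3 to calibration points, with first-order
    # propagation of the calibration uncertainty into predictions.
    import numpy as np

    T = np.array([273.15, 283.15, 293.15, 303.15, 313.15, 323.15])  # K
    R = np.array([32650., 19900., 12490., 8057., 5327., 3603.])     # ohm
    u_T = 0.005  # assumed standard uncertainty of each temperature point, K

    x = np.log(R)
    X = np.column_stack([np.ones_like(x), x, x**3])   # design matrix
    y = 1.0 / T
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Sandwich covariance of the fitted coefficients, with
    # var(y_i) = (u_T / T_i**2)**2 since y = 1/T.
    W = np.diag((u_T / T**2) ** 2)
    XtX_inv = np.linalg.inv(X.T @ X)
    cov = XtX_inv @ X.T @ W @ X @ XtX_inv

    def u_T_predicted(R_q):
        v = np.array([1.0, np.log(R_q), np.log(R_q) ** 3])
        y_q = v @ coef
        u_y = np.sqrt(v @ cov @ v)        # uncertainty of 1/T
        return u_y / y_q**2               # back to kelvin via |dT/dy| = 1/y**2

    print(u_T_predicted(12490.))   # flat inside the calibration range
    print(u_T_predicted(1000.))    # grows rapidly outside it
    ```

    Consistent with the abstract, the propagated uncertainty stays near the calibration-point level inside the interpolation region and rises rapidly beyond it.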

  11. Estimation of Vickers hardness uncertainty for a heterogeneous welded joint (S235JR+AR and X2CrNiMo17-12-2)

    NASA Astrophysics Data System (ADS)

    Dijmărescu, M. C.; Dijmărescu, M. R.

    2017-08-01

    When talking about tests that include measurements, the uncertainty of measurement is an essential element, because it is important to know the limits within which the obtained results may be assumed to lie and the influence that the elements of the measurement system have on these results. The research presented in this paper focuses on the estimation of the Vickers hardness measurement uncertainty for the heterogeneous welded joint between S235JR+AR and X2CrNiMo17-12-2 materials, in order to establish the relevance of the results and the quality assessment of this joint. The paper is structured in three main parts. In the first part, the initial data necessary for the experiment are presented in terms of the characterisation of the welded joint and the technological means. The second part presents the development of the physical experiment and its results, and in the third part the uncertainty of the measurements is calculated and the results are discussed.

  12. Calculation of the compounded uncertainty of 14C AMS measurements

    NASA Astrophysics Data System (ADS)

    Nadeau, Marie-Josée; Grootes, Pieter M.

    2013-01-01

    The correct method to calculate conventional 14C ages from carbon isotopic ratios was summarised 35 years ago by Stuiver and Polach (1977) and is now accepted as the only method to calculate 14C ages. There is, however, no consensus regarding the treatment of AMS data, mainly regarding the uncertainty of the final result. The estimation and treatment of machine background, process blank, and/or in situ contamination is not uniform between laboratories, leading to differences in 14C results, mainly for older ages. As Donahue (1987) and Currie (1994), among others, mentioned, some laboratories find it important to use the scatter of several measurements as the uncertainty, while others prefer to use Poisson statistics. The contribution of the scatter of the standards, machine background, process blank, and in situ contamination to the uncertainty of the final 14C result is also treated in different ways. In the early years of AMS, several laboratories found it important to describe their calculation process in detail. In recent years, this practice has declined. We present an overview of the calculation process for 14C AMS measurements, looking at calculation practices published from the beginning of AMS until the present.

  13. Communicating uncertainties in earth sciences in view of user needs

    NASA Astrophysics Data System (ADS)

    de Vries, Wim; Kros, Hans; Heuvelink, Gerard

    2014-05-01

    Uncertainties are inevitable in all results obtained in the earth sciences, regardless of whether these are based on field observations, experimental research or predictive modelling. When informing decision and policy makers or stakeholders, it is important that these uncertainties are also communicated. In communicating results, it is important to apply a "Progressive Disclosure of Information (PDI)" from non-technical information through more specialised information, according to user needs. Generalized information is typically directed towards non-scientific audiences and intended for policy advice. Decision makers have to be aware of the implications of the uncertainty associated with results, so that they can account for it in their decisions. Detailed information on the uncertainties is generally intended for scientific audiences, to give insight into the underlying approaches and results. When communicating uncertainties, it is important to distinguish between scientific results that allow presentation in terms of probabilistic measures of uncertainty and more intrinsic uncertainties and errors that cannot be expressed in mathematical terms. Examples of earth science research that allows probabilistic measures of uncertainty, involving sophisticated statistical methods, are uncertainties in spatial and/or temporal variations in the results of: • Observations, such as soil properties measured at sampling locations. In this case, the interpolation uncertainty, caused by a lack of data collected in space, can be quantified by e.g. kriging standard deviation maps or animations of conditional simulations. • Experimental measurements, comparing impacts of treatments at different sites and/or under different conditions. In this case, an indication of the average and range in measured responses to treatments can be obtained from a meta-analysis, summarizing experimental findings between replicates and across studies, sites, ecosystems, etc. • Model predictions affected by uncertain model parameters (parametric variability). These uncertainties can be quantified by uncertainty propagation methods such as Monte Carlo simulation. Examples of intrinsic uncertainties that generally cannot be expressed in mathematical terms are errors or biases in: • Results of experiments and observations due to inadequate sampling and errors in analyzing data in the laboratory, and even in data reporting. • Results of (laboratory) experiments that are limited to a specific domain or performed under circumstances that differ from field circumstances. • Model structure, due to lack of knowledge of the underlying processes. Structural uncertainty, which may cause model inadequacy/bias, is inherent in model approaches, since models are approximations of reality. Intrinsic uncertainties often occur in an emerging field where ongoing new findings, whether from experiments, field observations or new model results, challenge earlier work. In this context, climate scientists working within the IPCC have adopted a lexicon to communicate confidence in their findings, ranging from "very high" through "high", "medium" and "low" to "very low" confidence. In fact, there are also statistical methods to gain insight into uncertainties in model predictions due to model assumptions (i.e. model structural error). Examples are comparing model results with independent observations or a systematic intercomparison of predictions from multiple models.
In the latter case, Bayesian model averaging techniques can be used, in which each model considered gets an assigned prior probability of being the 'true' model. This approach works well with statistical (regression) models, but its extension to physically based models is cumbersome. An alternative is the use of state-space models in which structural errors are represented as (additive) noise terms. In this presentation, we focus on approaches that are relevant at the science-policy interface, involving multiple scientific disciplines and policy makers with different subject areas. Approaches to communicate uncertainties in the results of observations or model predictions are discussed, distinguishing results that include probabilistic measures of uncertainty from more intrinsic uncertainties. The examples concentrate on uncertainties in nitrogen (N) related environmental issues, including: • Spatio-temporal trends in atmospheric N deposition, in view of the policy question whether there is a declining or increasing trend. • The carbon response to N inputs to terrestrial ecosystems, based on meta-analysis of N-addition experiments and other approaches, in view of the policy relevance of N emission control. • Calculated spatial variations in the emissions of nitrous oxide and ammonia, in view of the need for emission policies at different spatial scales. • Calculated N emissions and losses by model intercomparisons, in view of the policy need to take no-regret decisions with respect to the control of those emissions.
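
    As a minimal illustration of the Monte Carlo propagation mentioned above for parametric uncertainty, the sketch below samples uncertain parameters of a hypothetical nitrous-oxide emission model (the model form, distributions and numbers are illustrative, not from the presentation):

    ```python
    # Monte Carlo propagation of parameter uncertainty through a simple model.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Hypothetical emission model: E = EF * N_input, with uncertain
    # emission factor EF and nitrogen input N_input.
    EF = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n)   # kg N2O-N per kg N
    N_input = rng.normal(loc=150.0, scale=10.0, size=n)        # kg N per ha

    E = EF * N_input
    lo, med, hi = np.percentile(E, [2.5, 50, 97.5])
    print(f"median {med:.2f}, 95% interval [{lo:.2f}, {hi:.2f}] kg N2O-N/ha")
    ```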

  14. Calibration Uncertainties in the Droplet Measurement Technologies Cloud Condensation Nuclei Counter

    NASA Astrophysics Data System (ADS)

    Hibert, Kurt James

    Cloud condensation nuclei (CCN) serve as the nucleation sites for the condensation of water vapor in Earth's atmosphere and are important for their effect on climate and weather. The influence of CCN on cloud radiative properties (the aerosol indirect effect) is the most uncertain of the quantified radiative forcing changes that have occurred since pre-industrial times. CCN influence the weather because intrinsic and extrinsic aerosol properties affect cloud formation and precipitation development. To quantify these effects, it is necessary to accurately measure CCN, which requires accurate calibrations using a consistent methodology. Furthermore, calibration uncertainties are required to compare measurements from different field projects. CCN uncertainties also aid the integration of CCN measurements with atmospheric models. The commercially available Droplet Measurement Technologies (DMT) CCN counter is used by many research groups, so it is important to quantify its calibration uncertainty. Uncertainties in the calibration of the DMT CCN counter exist in the flow rate and supersaturation values. The concentration depends on the accuracy of the flow rate calibration, which has a relatively small uncertainty (4.3%). The supersaturation depends on chamber pressure, temperature, and flow rate. The supersaturation calibration is a complex process, since the chamber's supersaturation must be inferred from a temperature difference measurement. Additionally, calibration errors can result from the Köhler theory assumptions, the fitting methods utilized, the influence of multiply-charged particles, and the calibration points used. In order to determine the calibration uncertainties and the pressure dependence of the supersaturation calibration, three calibrations were done at each pressure level: 700, 840, and 980 hPa. Typically, 700 hPa is the pressure used for aircraft measurements in the boundary layer, 840 hPa is the calibration pressure at DMT in Boulder, CO, and 980 hPa is the average surface pressure at Grand Forks, ND. The supersaturation calibration uncertainty is 2.3, 3.1, and 4.4% for calibrations done at 700, 840, and 980 hPa, respectively. The supersaturation calibration change with pressure is on average 0.047% supersaturation per 100 hPa. The supersaturation calibrations done at UND are 42-45% lower than supersaturation calibrations done at DMT approximately 1 year previously. Performance checks confirmed that all major leaks that developed during shipping were fixed before conducting the supersaturation calibrations. Multiply-charged particles passing through the Electrostatic Classifier may have influenced DMT's activation curves, which is likely part of the supersaturation calibration difference. Furthermore, the fitting method used to calculate the activation size and the limited calibration points are likely significant sources of error in DMT's supersaturation calibration. While the DMT CCN counter's calibration uncertainties are relatively small, and the pressure dependence is easily accounted for, the calibration methodology used by different groups can be very important. The insights gained from the careful calibration of the DMT CCN counter indicate that calibration of scientific instruments using complex methodology is not trivial.

  15. Performance testing of supercapacitors: Important issues and uncertainties

    NASA Astrophysics Data System (ADS)

    Zhao, Jingyuan; Gao, Yinghan; Burke, Andrew F.

    2017-09-01

    Supercapacitors are a promising technology for high-power energy storage and have been used in some industrial and vehicle applications. Hence, it is important that information concerning the performance of supercapacitors be detailed and reliable, so that system designers can make rational decisions regarding the selection of energy storage components. This paper is concerned with important issues and uncertainties regarding the performance testing of supercapacitors. The effect of different test procedures on the measured characteristics of both commercial and prototype supercapacitors, including hybrid supercapacitors, has been studied. It was found that the test procedure has a relatively minor effect on the capacitance of carbon/carbon devices and a more significant effect on the capacitance of hybrid supercapacitors. The device characteristic with the greatest uncertainty is the resistance, and consequently the claimed power capability of the device. The energy density should be measured by performing constant-power discharges between appropriate voltage limits. This is particularly important in the case of hybrid supercapacitors, for which the energy density is rate dependent and the simple relationship E = ½CV² does not yield accurate estimates of the energy stored. In general, most of the important issues for testing carbon/carbon devices become more serious for hybrid supercapacitors.

  16. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
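
    A hedged sketch of the bootstrap approach described above; the facility count, sample size and cost distribution are hypothetical, not the Honduras data:

    ```python
    # Bootstrap standard error and 95% CI for a total program cost scaled up
    # from a facility subsample via a simple expansion estimator.
    import numpy as np

    rng = np.random.default_rng(1)
    N_FACILITIES = 1200                        # total facilities in the program
    sample_costs = rng.gamma(2.0, 5000.0, 60)  # observed costs at 60 sampled sites

    def total_cost(costs):
        return N_FACILITIES * costs.mean()     # expansion estimator

    boot = np.array([
        total_cost(rng.choice(sample_costs, size=sample_costs.size, replace=True))
        for _ in range(5000)
    ])
    se = boot.std(ddof=1)
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"total {total_cost(sample_costs):,.0f}, SE {se:,.0f}, "
          f"95% CI [{lo:,.0f}, {hi:,.0f}]")
    ```

    Calibration estimators, as mentioned above, would additionally constrain the weights so that known program totals (e.g. doses administered) are reproduced exactly, typically shrinking the interval.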

  17. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964

  18. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    PubMed

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
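
    For orientation, a generic GUM-style combination for a solution standard prepared by dissolving a mass m of neat material of purity P in a volume V (a sketch of how the named factors combine; vendors' actual budgets may include further terms):

    ```latex
    c = \frac{m \, P}{V},
    \qquad
    \frac{u_c}{c}
      = \sqrt{\left(\frac{u_m}{m}\right)^{2}
            + \left(\frac{u_P}{P}\right)^{2}
            + \left(\frac{u_V}{V}\right)^{2}},
    \qquad
    U = k \, u_c \quad (k = 2).
    ```

    Residual water, residual solvent and inorganic content enter through the purity term u_P, while weighing technique and solution density enter through u_m and u_V.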

  19. Uncertainty analysis of gas flow measurements using clearance-sealed piston provers in the range from 0.0012 g min⁻¹ to 60 g min⁻¹

    NASA Astrophysics Data System (ADS)

    Bobovnik, G.; Kutin, J.; Bajsić, I.

    2016-08-01

    This paper deals with an uncertainty analysis of gas flow measurements using a compact, high-speed, clearance-sealed realization of a piston prover. A detailed methodology for the uncertainty analysis, covering the components due to the gas density, the dimensional and time measurements, the leakage flow, the density correction factor and the repeatability, is presented. The paper also deals with the selection of the isothermal or adiabatic measurement model and the treatment of the leakage flow, and discusses the need for averaging multiple consecutive readings of the piston prover. The analysis is prepared for the flow range (50 000:1) covered by the three interchangeable flow cells. The results show that, using the adiabatic measurement model and averaging multiple readings, the estimated expanded measurement uncertainty of the gas mass flow rate is less than 0.15% in the flow range above 0.012 g min⁻¹, whereas it increases for lower mass flow rates due to leakage-flow-related effects. At the upper end of the measuring range, using the adiabatic instead of the isothermal measurement model, as well as averaging multiple readings, proves important.

  20. Quantitative analysis of trace levels of surface contamination by X-ray photoelectron spectroscopy Part I: statistical uncertainty near the detection limit.

    PubMed

    Hill, Shannon B; Faradzhev, Nadir S; Powell, Cedric J

    2017-12-01

    We discuss the problem of quantifying common sources of statistical uncertainties for analyses of trace levels of surface contamination using X-ray photoelectron spectroscopy. We examine the propagation of error for peak-area measurements using common forms of linear and polynomial background subtraction including the correlation of points used to determine both background and peak areas. This correlation has been neglected in previous analyses, but we show that it contributes significantly to the peak-area uncertainty near the detection limit. We introduce the concept of relative background subtraction variance (RBSV) which quantifies the uncertainty introduced by the method of background determination relative to the uncertainty of the background area itself. The uncertainties of the peak area and atomic concentration and of the detection limit are expressed using the RBSV, which separates the contributions from the acquisition parameters, the background-determination method, and the properties of the measured spectrum. These results are then combined to find acquisition strategies that minimize the total measurement time needed to achieve a desired detection limit or atomic-percentage uncertainty for a particular trace element. Minimization of data-acquisition time is important for samples that are sensitive to x-ray dose and also for laboratories that need to optimize throughput.
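
    A synthetic Monte Carlo illustration of the statistical problem treated here: propagation of Poisson counting noise through a linear background subtraction for a weak peak. The spectrum, windows and peak are fabricated for the sketch and do not reproduce the paper's RBSV formalism:

    ```python
    # Monte Carlo estimate of the net-peak-area uncertainty under a linear
    # background determined from two side windows (Poisson counting noise).
    import numpy as np

    rng = np.random.default_rng(7)
    E = np.arange(100)                       # channel index
    true = 200 + 0.5 * E                     # linear background, counts
    true[40:60] += 30 * np.exp(-0.5 * ((E[40:60] - 50) / 4) ** 2)  # weak peak

    def peak_area(y):
        # Straight line through the means of two side windows.
        bl, br = y[20:35].mean(), y[65:80].mean()
        xl, xr = 27.0, 72.0
        bg = bl + (br - bl) * (E - xl) / (xr - xl)
        return (y - bg)[38:62].sum()         # net counts in the peak window

    areas = np.array([peak_area(rng.poisson(true)) for _ in range(20000)])
    print(f"net area {areas.mean():.0f} +/- {areas.std(ddof=1):.0f} counts")
    ```

    Because the same noisy points determine both the background and, through the subtraction, the net area, the background-determination method contributes a correlated share of the spread, which is the effect the paper's RBSV concept isolates.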

  1. Exploration of quantum-memory-assisted entropic uncertainty relations in a noninertial frame

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Shi, Jia-Dong; Ye, Liu

    2017-05-01

    The uncertainty principle offers a bound to show accuracy of the simultaneous measurement outcome for two incompatible observables. In this letter, we investigate quantum-memory-assisted entropic uncertainty relation (QMA-EUR) when the particle to be measured stays at an open system, and another particle is treated as quantum memory under a noninertial frame. In such a scenario, the collective influence of the unital and nonunital noise environment, and of the relativistic motion of the system, on the QMA-EUR is examined. By numerical analysis, we conclude that, firstly, the noises and the Unruh effect can both increase the uncertainty, due to the decoherence of the bipartite system induced by the noise or Unruh effect; secondly, the uncertainty is more affected by the noises than by the Unruh effect from the acceleration; thirdly, unital noises can reduce the uncertainty in long-time regime. We give a possible physical interpretation for those results: that the information of interest is redistributed among the bipartite, the noisy environment and the physically inaccessible region in the noninertial frame. Therefore, we claim that our observations provide an insight into dynamics of the entropic uncertainty in a noninertial frame, and might be important to quantum precision measurement under relativistic motion.

  2. Evaluation of Effective Sources in Uncertainty Measurements of Personal Dosimetry by a Harshaw TLD System

    PubMed Central

    Hosseini Pooya, SM; Orouji, T

    2014-01-01

    Background: Accurate individual dose results, as reported by personal dosimetry service providers, are very important. There are national/international criteria for acceptable dosimetry system performance. Objective: In this research, the sources of uncertainty in a TLD-based personal dosimetry system are identified, measured and calculated. Method: These sources include: inhomogeneity of TLD sensitivity, variability of TLD readings due to limited sensitivity and background, energy dependence, directional dependence, non-linearity of the response, fading, dependence on ambient temperature/humidity, and calibration errors, all of which may affect the dose responses. Some parameters which influence the above sources of uncertainty are studied for Harshaw TLD-100 card dosimeters as well as for the hot-gas Harshaw 6600 TLD reader system. Results: The individual uncertainty of each source was measured to be less than 6.7% at the 68% confidence level. The total uncertainty was calculated to be 17.5% at the 95% confidence level. Conclusion: The TLD-100 personal dosimeters, together with the Harshaw 6600 reader system, show a total uncertainty less than the admissible value of 42% for personal dosimetry services. PMID:25505769
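
    For orientation, the reported figures are consistent with the usual quadrature combination of independent components (stated here as an assumption about the procedure, not quoted from the paper):

    ```latex
    u_c = \sqrt{\sum_{i} u_i^{2}},
    \qquad
    U = k \, u_c \quad (k = 2, \ \approx 95\%\ \text{confidence}),
    ```

    i.e. several components each below 6.7% at the 68% level combine to a standard uncertainty near 9%, and expansion with k = 2 gives a total close to the reported 17.5%.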

  3. Expected trace gas and aerosol retrieval accuracy of the Geostationary Environment Monitoring Spectrometer

    NASA Astrophysics Data System (ADS)

    Jeong, U.; Kim, J.; Liu, X.; Lee, K. H.; Chance, K.; Song, C. H.

    2015-12-01

    The expected accuracy of trace gas and aerosol retrievals from the Geostationary Environment Monitoring Spectrometer (GEMS) was investigated. GEMS is one of the first sensors to monitor NO2, SO2, HCHO, O3, and aerosols from geostationary earth orbit (GEO) over Asia. Since GEMS has not yet been launched, simulated measurements and their precision were used in this study. The random and systematic components of the measurement error were estimated based on the instrument design. The atmospheric profiles were obtained from Model for Ozone And Related chemical Tracers (MOZART) simulations, and surface reflectances were obtained from a climatology of OMI Lambertian-equivalent reflectance. The uncertainties of the GEMS trace gas and aerosol products were estimated with the optimal estimation (OE) method using these atmospheric profiles and surface reflectances. Most of the estimated uncertainties of the NO2, HCHO, and stratospheric and total O3 products satisfied the user requirements with sufficient margin. However, about 26% of the estimated uncertainties of SO2 and about 30% of the estimated uncertainties of tropospheric O3 did not meet the required precision. In particular, the estimated uncertainty of SO2 is high in winter, when emissions in East Asia are strong. Further efforts are necessary to improve the retrieval accuracy of SO2 and tropospheric O3 in order to reach the scientific goals of GEMS. The random measurement error of GEMS was important for the NO2, SO2, and HCHO retrievals, while both the random and systematic measurement errors were important for the O3 retrievals. The degrees of freedom for signal were 0.8 ± 0.2 for tropospheric O3 and 2.9 ± 0.5 for stratospheric O3. The estimated uncertainties of the aerosol retrievals from GEMS measurements were predicted to be lower than the required precision over the SZA range of the trace gas retrievals.

  4. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of the current risk, the drivers of change of risk over time, and the measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies, local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominating when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options, given that a decision to act has been taken. In this case, the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
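
    A schematic Python sketch of such an uncertainty cascade: each of the six bulk processes is sampled and propagated to the net present value (NPV) of an adaptation measure. The model form, distributions, horizon and all numbers are hypothetical, not the Odense inputs:

    ```python
    # Monte Carlo uncertainty cascade for the NPV of an adaptation measure.
    import numpy as np

    rng = np.random.default_rng(3)
    n, years = 20_000, 50

    cc_factor = rng.triangular(1.1, 1.3, 1.6, n)   # climate change impact on rain
    runoff = rng.normal(1.0, 0.15, n)              # rainfall-runoff uncertainty
    damage0 = rng.lognormal(np.log(2e6), 0.4, n)   # present expected annual damage
    abatement = rng.uniform(0.5, 0.8, n)           # risk reduction by the measure
    capex = rng.normal(15e6, 3e6, n)               # cost of the adaptation measure
    r = rng.uniform(0.01, 0.04, n)                 # discount rate

    t = np.arange(1, years + 1)
    # Risk grows linearly toward the climate-scaled level over the horizon.
    growth = 1 + (cc_factor[:, None] - 1) * t / years
    avoided = damage0[:, None] * runoff[:, None] * growth * abatement[:, None]
    npv = (avoided / (1 + r[:, None]) ** t).sum(axis=1) - capex
    print(f"P(NPV > 0) = {(npv > 0).mean():.2f}")
    ```

    Attributing the NPV variance to each sampled input (e.g. by fixing one input at a time) is what allows the dominant processes to be identified, as described above.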

  5. Microparticle tracking velocimetry as a tool for microfluidic flow measurements

    NASA Astrophysics Data System (ADS)

    Salipante, Paul; Hudson, Steven D.; Schmidt, James W.; Wright, John D.

    2017-07-01

    The accurate measurement of flows in microfluidic channels is important for commercial and research applications. We compare the accuracy of flow measurement techniques over a wide range of flows. Flow measurements made using holographic microparticle tracking velocimetry (µPTV) and a gravimetric flow standard over the range of 0.5-100 nL/s agree within 0.25%, well within the uncertainty of the two flow systems. Two commercial thermal flow sensors were used as the intermediaries (transfer standards) between the two flow measurement systems. The gravimetric flow standard was used to calibrate the thermal flow sensors by measuring the rate of change of the mass of liquid in a beaker on a micro-balance as it fills. The holographic µPTV flow measurements were made in a rectangular channel, and the flow was seeded with 1 µm diameter polystyrene spheres. The volumetric flow was calculated using the Hagen-Poiseuille solution for a rectangular channel. The uncertainty of both flow measurement systems is given. For the gravimetric standard, the relative uncertainty increased with decreasing flow due to surface tension forces between the pipette carrying the flow and the free surface of the liquid in the beaker. The uncertainty of the holographic µPTV measurements did not vary significantly over the measured flow range, and thus the technique is especially useful at low flow velocities.

  6. Quantification of LiDAR measurement uncertainty through propagation of errors due to sensor sub-systems and terrain morphology

    NASA Astrophysics Data System (ADS)

    Goulden, T.; Hopkinson, C.

    2013-12-01

    The quantification of LiDAR sensor measurement uncertainty is important for evaluating the quality of derived DEM products, compiling risk assessments of management decisions based on LiDAR information, and enhancing LiDAR mission planning capabilities. Current quality assurance estimates of LiDAR measurement uncertainty are limited to post-survey empirical assessments or vendor estimates from commercial literature. Empirical evidence can provide valuable information on the performance of the sensor in validated areas; however, it cannot characterize the spatial distribution of measurement uncertainty throughout the extensive coverage of typical LiDAR surveys. Vendor-advertised error estimates are often restricted to strict and optimal survey conditions, resulting in idealized values. Numerical modeling of individual pulse uncertainty provides an alternative method for estimating LiDAR measurement uncertainty. LiDAR measurement uncertainty is theoretically assumed to fall into three distinct categories: 1) sensor sub-system errors, 2) terrain influences, and 3) vegetative influences. This research details the procedures for numerical modeling of measurement uncertainty from the sensor sub-system (GPS, IMU, laser scanner, laser ranger) and terrain influences. Results show that errors tend to increase as the laser scan angle, altitude or laser beam incidence angle increases. An experimental survey over a flat and paved runway site, performed with an Optech ALTM 3100 sensor, showed an increase in modeled vertical errors from 5 cm at a nadir scan orientation to 8 cm at the scan edges, for an aircraft altitude of 1200 m and a half scan angle of 15°. In a survey with the same sensor at a highly sloped glacial basin site devoid of vegetation, modeled vertical errors reached over 2 m. Validation of the error models within the glacial environment, over three separate flight lines, showed that 100%, 85%, and 75% of elevation residuals, respectively, fell below the error predictions. Future work in LiDAR sensor measurement uncertainty must focus on the development of vegetative error models to create more robust error prediction algorithms. To achieve this objective, comprehensive empirical exploratory analysis is recommended to relate vegetative parameters to observed errors.

  7. Variability of blood alcohol content (BAC) determinations: the role of measurement uncertainty, significant figures, and decision rules for compliance assessment in the frame of a multiple BAC threshold law.

    PubMed

    Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela; Zancanaro, Flavio; Sciarrone, Rocco

    2014-10-01

    The measurement of blood-alcohol content (BAC) is a crucial analytical determination required to assess if an offence (e.g. driving under the influence of alcohol) has been committed. For various reasons, results of forensic alcohol analysis are often challenged by the defence. As a consequence, measurement uncertainty becomes a critical topic when assessing compliance with specification limits for forensic purposes. The aims of this study were: (1) to investigate major sources of variability for BAC determinations; (2) to estimate measurement uncertainty for routine BAC determinations; (3) to discuss the role of measurement uncertainty in compliance assessment; (4) to set decision rules for a multiple BAC threshold law, as provided in the Italian Highway Code; (5) to address the topic of the zero-alcohol limit from the forensic toxicology point of view; and (6) to discuss the role of significant figures and rounding errors on measurement uncertainty and compliance assessment. Measurement variability was investigated by the analysis of data collected from real cases and internal quality control. The contribution of both pre-analytical and analytical processes to measurement variability was considered. The resulting expanded measurement uncertainty was 8.0%. Decision rules for the multiple BAC threshold Italian law were set by adopting a guard-banding approach. 0.1 g/L was chosen as cut-off level to assess compliance with the zero-alcohol limit. The role of significant figures and rounding errors in compliance assessment was discussed by providing examples which stressed the importance of these topics for forensic purposes. Copyright © 2014 John Wiley & Sons, Ltd.
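
    A minimal sketch of a guard-banding decision rule of the kind described above, using the Italian Highway Code thresholds (0.5, 0.8 and 1.5 g/L) and the 8.0% expanded uncertainty reported in the abstract; the implementation details are illustrative:

    ```python
    # Guard-banded compliance assessment against multiple BAC thresholds:
    # a limit counts as exceeded only if (measurement - U) is above it.
    U_REL = 0.080                 # expanded relative uncertainty (k = 2)

    def bac_band(measured_g_per_L, limits=(0.5, 0.8, 1.5)):
        """Return 0 if compliant beyond reasonable doubt, else the highest
        band (1..3) whose limit is exceeded even after subtracting U."""
        value = measured_g_per_L * (1 - U_REL)   # guard-banded value
        return sum(value > lim for lim in limits)

    for bac in (0.52, 0.55, 0.87, 1.70):
        print(bac, "->", bac_band(bac))   # 0.52 stays compliant; 0.55 does not
    ```

    The same logic explains the 0.1 g/L cut-off chosen for the zero-alcohol limit: a nonzero guard band is needed because a measured value cannot be distinguished from zero within the measurement uncertainty.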

  8. Using global sensitivity analysis of demographic models for ecological impact assessment.

    PubMed

    Aiello-Lammens, Matthew E; Akçakaya, H Resit

    2017-02-01

    Population viability analysis (PVA) is widely used to assess population-level impacts of environmental changes on species. When combined with sensitivity analysis, PVA yields insights into the effects of parameter and model structure uncertainty. This helps researchers prioritize efforts for further data collection so that model improvements are efficient and helps managers prioritize conservation and management actions. Usually, sensitivity is analyzed by varying one input parameter at a time and observing the influence that variation has over model outcomes. This approach does not account for interactions among parameters. Global sensitivity analysis (GSA) overcomes this limitation by varying several model inputs simultaneously. Then, regression techniques allow measuring the importance of input-parameter uncertainties. In many conservation applications, the goal of demographic modeling is to assess how different scenarios of impact or management cause changes in a population. This is challenging because the uncertainty of input-parameter values can be confounded with the effect of impacts and management actions. We developed a GSA method that separates model outcome uncertainty resulting from parameter uncertainty from that resulting from projected ecological impacts or simulated management actions, effectively separating the 2 main questions that sensitivity analysis asks. We applied this method to assess the effects of predicted sea-level rise on Snowy Plover (Charadrius nivosus). A relatively small number of replicate models (approximately 100) resulted in consistent measures of variable importance when not trying to separate the effects of ecological impacts from parameter uncertainty. However, many more replicate models (approximately 500) were required to separate these effects. These differences are important to consider when using demographic models to estimate ecological impacts of management actions. © 2016 Society for Conservation Biology.

  9. Long-Range Educational Policy Planning and the Demand for Educated Manpower in Times of Uncertainty.

    ERIC Educational Resources Information Center

    Bakke, E. K.

    1984-01-01

    There is no good method of regulating the educational system based on specific, numerical measurements of labor requirements, and it will be important to integrate uncertainty into future forecasts. Adjustments in demand and supply of educated labor in Norway require a decentralized authority structure providing incentives for institutions and the…

  10. Uncertainties in the deprojection of the observed bar properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Yanfei; Shen, Juntai; Li, Zhao-Yu, E-mail: jshen@shao.ac.cn

    2014-08-10

    In observations, it is important to deproject the two fundamental quantities characterizing a bar, i.e., its length (a) and ellipticity (e), to face-on values before any careful analyses. However, a systematic estimation of the uncertainties of the commonly used deprojection methods is still lacking. Simulated galaxies are well suited to this study. We project two simulated barred galaxies onto a two-dimensional (2D) plane with different bar orientations and disk inclination angles (i). Bar properties are measured and deprojected with the popular deprojection methods in the literature. Generally speaking, deprojection uncertainties increase with increasing i. All of the deprojection methods behave badly when i is larger than 60°, due to the vertical thickness of the bar. Thus, future statistical studies of barred galaxies should exclude galaxies more inclined than 60°. At moderate inclination angles (i ≤ 60°), 2D deprojection methods (analytical and image stretching) and Fourier-based methods (Fourier decomposition and bar-interbar contrast) perform reasonably well, with uncertainties of ∼10% in both the bar length and ellipticity, whereas the uncertainties of the one-dimensional (1D) analytical deprojection can be as high as 100% in certain extreme cases. We find that different bar measurement methods show systematic differences in the deprojection uncertainties. We further discuss the deprojection uncertainty factors with emphasis on the most important one, i.e., the three-dimensional structure of the bar itself. We construct two triaxial toy bar models that can qualitatively reproduce the results of the 1D and 2D analytical deprojections; they confirm that the vertical thickness of the bar is the main source of uncertainties.
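
    The 1D analytical deprojection discussed above treats the bar as a flat line segment in the disk plane; a minimal sketch of that standard geometry (not the authors' code) shows why the correction, and hence the sensitivity to the bar's vertical thickness, grows with inclination:

    ```python
    # 1-D analytical deprojection of an observed bar half-length, assuming a
    # flat (zero-thickness) bar in the disk plane.
    import numpy as np

    def deproject_bar_length(a_obs, phi_obs_deg, incl_deg):
        """a_obs: observed bar length; phi_obs_deg: angle between the observed
        bar and the disk line of nodes; incl_deg: inclination (0 = face-on)."""
        phi = np.radians(phi_obs_deg)
        i = np.radians(incl_deg)
        # Stretch the component perpendicular to the line of nodes by 1/cos(i).
        return a_obs * np.sqrt(np.cos(phi) ** 2 + (np.sin(phi) / np.cos(i)) ** 2)

    print(deproject_bar_length(1.0, 45.0, 60.0))  # ~1.58: a large correction at i = 60 deg
    ```

    The 1/cos(i) stretch diverges as i grows, so any violation of the flat-bar assumption is amplified, consistent with the large 1D errors reported at high inclination.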

  11. Experimental uncertainty and drag measurements in the national transonic facility

    NASA Technical Reports Server (NTRS)

    Batill, Stephen M.

    1994-01-01

    This report documents the results of a study which was conducted in order to establish a framework for the quantitative description of the uncertainty in measurements conducted in the National Transonic Facility (NTF). The importance of uncertainty analysis in both experiment planning and reporting results has grown significantly in the past few years. Various methodologies have been proposed and the engineering community appears to be 'converging' on certain accepted practices. The practical application of these methods to the complex wind tunnel testing environment at the NASA Langley Research Center was based upon terminology and methods established in the American National Standards Institute (ANSI) and the American Society of Mechanical Engineers (ASME) standards. The report overviews this methodology.

  12. Quantifying uncertainty in measurement of mercury in suspended particulate matter by cold vapor technique using atomic absorption spectrometry with hydride generator.

    PubMed

    Singh, Nahar; Ahuja, Tarushee; Ojha, Vijay Narain; Soni, Daya; Tripathy, S Swarupa; Leito, Ivo

    2013-01-01

    As a result of rapid industrialization, several chemical forms of organic and inorganic mercury are constantly introduced into the environment and affect humans and animals directly. All forms of mercury have toxic effects; therefore, accurate measurement of mercury is of prime importance, especially in suspended particulate matter (SPM) collected with a high-volume sampler (HVS). The quantification of mercury in SPM samples involves several steps, from sampling to the final result. The quality, reliability and confidence level of the analyzed data depend upon the measurement uncertainty of the whole process. Evaluation of the measurement uncertainty of results is one of the requirements of the standard ISO/IEC 17025:2005 (European Standard EN IS/ISO/IEC 17025:2005, issue 1:1-28, 2006). In the presented study, the uncertainty estimation for mercury determination in suspended particulate matter (SPM) has been carried out using the cold vapor atomic absorption spectrometry-hydride generator (AAS-HG) technique, following a wet chemical digestion process. For the calculation of uncertainty, we have considered many general potential sources of uncertainty. After the analysis of data from seven diverse sites in Delhi, it has been concluded that the mercury concentration varies from 1.59 ± 0.37 to 14.5 ± 2.9 ng/m³ at the 95% confidence level (k = 2).

  13. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  14. Uncertainties in Predicting Rice Yield by Current Crop Models Under a Wide Range of Climatic Conditions

    NASA Technical Reports Server (NTRS)

    Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Adam, Myriam; Bregaglio, Simone; Buis, Samuel; Confalonieri, Roberto; Fumoto, Tamon; et al.

    2014-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We evaluated 13 rice models against multi-year experimental yield data at four sites with diverse climatic conditions in Asia and examined whether different modeling approaches to major physiological processes contribute to the uncertainty of predictions of field-measured yields and to the uncertainty of the sensitivity to changes in temperature and CO2 concentration [CO2]. We also examined whether the use of an ensemble of crop models can reduce the uncertainties. Individual models did not consistently reproduce both experimental and regional yields well, and uncertainty was larger at the warmest and coolest sites. The variation in yield projections was larger among crop models than the variation resulting from 16 global climate model-based scenarios. However, the mean of the predictions of all crop models reproduced the experimental data, with an uncertainty of less than 10 percent of measured yields. Using an ensemble of eight models calibrated only for phenology, or of five models calibrated in detail, resulted in an uncertainty equivalent to that of the measured yield in well-controlled agronomic field experiments. Sensitivity analysis indicates the necessity of improving the accuracy of predictions of both biomass and harvest index in response to increasing [CO2] and temperature.

  15. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable. Therefore, it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. This paper presents a sensitivity analysis for several ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered for the sensitivity analysis. The first relates to the input data, while the second concerns the decision maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance method and the trade-off ranking method. TOPSIS and the relative distance method measure the distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking calculates the distance of an alternative to the extreme solutions and to the other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
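
    A compact sketch of TOPSIS, one of the distance-based rankings studied, with a toy decision matrix and a weight perturbation of the kind used in such sensitivity tests (all numbers hypothetical):

    ```python
    # TOPSIS: rank alternatives by relative closeness to the ideal solution.
    import numpy as np

    X = np.array([[250., 16., 12.],   # alternatives x criteria (benefit-type)
                  [200., 16., 8.],
                  [300., 32., 16.],
                  [275., 32., 8.]])
    w = np.array([0.40, 0.35, 0.25])  # decision maker's weights

    def topsis(X, w):
        V = w * X / np.linalg.norm(X, axis=0)       # weighted, normalized matrix
        ideal, anti = V.max(axis=0), V.min(axis=0)  # ideal / anti-ideal solutions
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        return d_minus / (d_plus + d_minus)         # closeness, 1 = best

    print(np.argsort(-topsis(X, w)))                # baseline ranking

    # Sensitivity test: perturb the weights and check whether the ranking flips.
    print(np.argsort(-topsis(X, np.array([0.30, 0.45, 0.25]))))
    ```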

  16. Public Perception of Uncertainties Within Climate Change Science.

    PubMed

    Visschers, Vivianne H M

    2018-01-01

    Climate change is a complex, multifaceted problem involving various interacting systems and actors. Therefore, the intensities, locations, and timeframes of the consequences of climate change are hard to predict and cause uncertainties. Relatively little is known about how the public perceives this scientific uncertainty and how this relates to their concern about climate change. In this article, an online survey among 306 Swiss people is reported that investigated whether people differentiate between different types of uncertainty in climate change research. Also examined was the way in which the perception of uncertainty is related to people's concern about climate change, their trust in science, their knowledge about climate change, and their political attitude. The results of a principal component analysis showed that respondents differentiated between perceived ambiguity in climate research, measurement uncertainty, and uncertainty about the future impact of climate change. Using structural equation modeling, it was found that only perceived ambiguity was directly related to concern about climate change, whereas measurement uncertainty and future uncertainty were not. Trust in climate science was strongly associated with each type of uncertainty perception and was indirectly associated with concern about climate change. Also, more knowledge about climate change was related to less strong perceptions of each type of climate science uncertainty. Hence, it is suggested that to increase public concern about climate change, it may be especially important to consider the perceived ambiguity about climate research. Efforts that foster trust in climate science also appear highly worthwhile. © 2017 Society for Risk Analysis.

  17. Estimating the uncertainty from sampling in pollution crime investigation: The importance of metrology in the forensic interpretation of environmental data.

    PubMed

    Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo

    2018-07-01

    The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. Measurement uncertainty includes contributions from sampling and from sample handling and preparation, contributions that are often disregarded in the quality assurance of analytical results. A pollution crime investigation was used to develop a methodology for addressing these uncertainties in two environmental compartments: freshwater sediments and landfill leachate. The uncertainty was estimated with the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate (the suspected source) and in the sediment (the possible sink). The metal analysis results were compared with statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements, with a probability of contamination >0.975 at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the measurement uncertainties. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the probability of contamination for Cu, Ni, and Zn was below 0.025. The results demonstrated that estimating the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analyses, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
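    The duplicate method has a simple range-based variant that separates sampling variance from analytical variance when each target is sampled twice and each sample is analyzed twice. The sketch below, with made-up concentrations, shows the arithmetic under that balanced design; the study's actual computation (e.g. robust ANOVA) may differ.

        import numpy as np

        # rows: sampling targets; columns: [S1A1, S1A2, S2A1, S2A2]
        # two samples (S1, S2) per target, two analyses (A1, A2) per sample
        d = np.array([[12.1, 12.3, 14.0, 13.8],
                      [20.5, 20.9, 18.7, 19.1],
                      [15.2, 15.0, 16.1, 16.5]])

        s1, s2 = d[:, :2], d[:, 2:]
        # analytical variance: E[(a1 - a2)^2] = 2 * var_anal for duplicate analyses
        var_anal = np.mean(np.concatenate([(s1[:, 0] - s1[:, 1]) ** 2,
                                           (s2[:, 0] - s2[:, 1]) ** 2])) / 2.0
        # between-sample variance of the 2-analysis means: var_samp + var_anal / 2
        m1, m2 = s1.mean(axis=1), s2.mean(axis=1)
        var_samp = max(np.mean((m1 - m2) ** 2) / 2.0 - var_anal / 2.0, 0.0)

        u = np.sqrt(var_samp + var_anal)   # combined standard uncertainty (k=1)
        print(var_samp, var_anal, 2 * u)   # expanded uncertainty U with k=2

    When var_samp dominates var_anal, as reported above, the sampling contribution becomes decisive for the forensic classification.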

  18. Universal quantum uncertainty relations between nonergodicity and loss of information

    NASA Astrophysics Data System (ADS)

    Awasthi, Natasha; Bhattacharya, Samyadeb; SenDe, Aditi; Sen, Ujjwal

    2018-03-01

    We establish uncertainty relations between information loss in general open quantum systems and the amount of nonergodicity of the corresponding dynamics. The relations hold for arbitrary quantum systems interacting with an arbitrary quantum environment. The elements of the uncertainty relations are quantified via distance measures on the space of quantum density matrices. The relations hold for arbitrary distance measures satisfying a set of intuitively satisfactory axioms. The relations show that as the nonergodicity of the dynamics increases, the lower bound on information loss decreases, which validates the belief that nonergodicity plays an important role in preserving information of quantum states undergoing lossy evolution. We also consider a model of a central qubit interacting with a fermionic thermal bath and derive its reduced dynamics to subsequently investigate the information loss and nonergodicity in such dynamics. We comment on the "minimal" situations that saturate the uncertainty relations.

  19. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    PubMed

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    In the first part of this study, Monte Carlo simulations were applied to investigate how uncertainty in both input variables and response measurements propagates to the predictions of nasal spray product performance design of experiment (DOE) models, under the initial assumption that the models perfectly represent the relationship between input variables and measured responses. In this article, we discard that assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented here illustrate the importance of careful error propagation during product performance modeling. Our results show that error estimates based on Monte Carlo simulation yield smaller model coefficient standard deviations than those from regression methods, suggesting that the standard deviations estimated from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution for understanding the propagation of uncertainty in complex DOE models, so that the design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
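    The mechanics of the approach can be illustrated with a deliberately simple one-factor model: perturb both the input settings and the measured responses with their assumed standard uncertainties, refit the regression many times, and read the coefficient uncertainty off the Monte Carlo spread. All values below are invented for illustration and do not come from the nasal spray study.

        import numpy as np

        rng = np.random.default_rng(1)
        n, trials = 30, 2000
        x_true = rng.uniform(-1.0, 1.0, n)               # nominal DOE factor settings
        beta_true = np.array([2.0, 0.5])                 # "true" intercept and slope
        y_true = beta_true[0] + beta_true[1] * x_true

        sx, sy = 0.05, 0.10                              # assumed input / response uncertainties
        coefs = np.empty((trials, 2))
        for t in range(trials):
            x = x_true + rng.normal(0.0, sx, n)          # perturbed input variables
            y = y_true + rng.normal(0.0, sy, n)          # perturbed response measurements
            A = np.column_stack([np.ones(n), x])
            coefs[t] = np.linalg.lstsq(A, y, rcond=None)[0]

        print(coefs.mean(axis=0))                        # Monte Carlo coefficient estimates
        print(coefs.std(axis=0, ddof=1))                 # Monte Carlo coefficient uncertainties

    Comparing coefs.std(...) against the standard errors reported by a single regression fit reproduces, in miniature, the comparison the article draws between the two approaches.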

  20. Mapping Uncertainty Due to Missing Data in the Global Ocean Health Index.

    PubMed

    Frazier, Melanie; Longo, Catherine; Halpern, Benjamin S

    2016-01-01

    Indicators are increasingly used to measure environmental systems; however, they are often criticized for failing to measure and describe uncertainty. Uncertainty is particularly difficult to evaluate and communicate for composite indicators, which aggregate many indicators of ecosystem condition. One of the ongoing goals of the Ocean Health Index (OHI) has been to improve our approach to dealing with missing data, which is a major source of uncertainty. Here we: (1) quantify the potential influence of gapfilled data on index scores from the 2015 global OHI assessment; (2) develop effective methods of tracking, quantifying, and communicating this information; and (3) provide general guidance for implementing gapfilling procedures for existing and emerging indicators, including regional OHI assessments. For the overall OHI global index score, the percent contribution of gapfilled data was relatively small (18.5%); however, it varied substantially among regions and goals. In general, smaller territorial jurisdictions and the food provision and tourism and recreation goals required the most gapfilling. We found the best approach for managing gapfilled data was to mirror the general framework used to organize, calculate, and communicate the Index data and scores. Quantifying gapfilling provides a measure of the reliability of the scores for different regions and components of an indicator. This information also underscores the importance of the underlying datasets used to calculate composite indicators and can inform and incentivize future data collection.

  1. Wavelength-encoded optical psychrometer for relative humidity measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montanini, Roberto

    2007-02-15

    In this article an optical psychrometer, in which temperature measurements are performed by means of two fiber Bragg grating sensors used as dry-bulb and wet-bulb thermometers, is introduced. The adopted design combines the high accuracy of psychrometric-based relative humidity measurements with the acknowledged advantages of wavelength-encoded fiber optic sensing. Important metrological issues addressed in the experimental work include calibration of the fiber Bragg grating temperature sensors and evaluation of response time, sensitivity, hysteresis, linearity, and accuracy. The calibration results give confidence that, with the current experimental setup, temperature can be measured with an uncertainty of ±0.2 °C and a resolution of 0.1 °C. A detailed uncertainty analysis is also presented to investigate the effects produced by different sources of error on the combined standard uncertainty u_c(U) of the relative humidity measurement, which has been estimated to be roughly within ±2% in the range close to saturation.

  2. Wavelength-encoded optical psychrometer for relative humidity measurement.

    PubMed

    Montanini, Roberto

    2007-02-01

    In this article an optical psychrometer, in which temperature measurements are performed by means of two fiber Bragg grating sensors used as dry-bulb and wet-bulb thermometers, is introduced. The adopted design combines the high accuracy of psychrometric-based relative humidity measurements with the acknowledged advantages of wavelength-encoded fiber optic sensing. Important metrological issues addressed in the experimental work include calibration of the fiber Bragg grating temperature sensors and evaluation of response time, sensitivity, hysteresis, linearity, and accuracy. The calibration results give confidence that, with the current experimental setup, temperature can be measured with an uncertainty of +/- 0.2 degrees C and a resolution of 0.1 degrees C. A detailed uncertainty analysis is also presented to investigate the effects produced by different sources of error on the combined standard uncertainty uc(U) of the relative humidity measurement, which has been estimated to be roughly within +/- 2% in the range close to saturation.

  3. Assessing the importance of rainfall uncertainty on hydrological models with different spatial and temporal scale

    NASA Astrophysics Data System (ADS)

    Nossent, Jiri; Pereira, Fernando; Bauwens, Willy

    2015-04-01

    Precipitation is one of the key inputs for hydrological models. As long as the values of the hydrological model parameters are fixed, a variation of the rainfall input is expected to induce a change in the model output. Given the increased awareness of uncertainty in rainfall records, it becomes more important to understand the impact of this input-output dynamic. Yet modellers often still intend to mimic the observed flow, whatever the deviation of the employed records from the actual rainfall might be, by recklessly adapting the model parameter values. But is it actually possible to vary the model parameter values in such a way that a certain (observed) model output can be generated from inaccurate rainfall inputs? In other words, how important is the rainfall uncertainty for the model output relative to the importance of the model parameters? To address this question, we apply the Sobol' sensitivity analysis method to assess and compare the importance of the rainfall uncertainty and the model parameters for the output of the hydrological model. In order to treat the regular model parameters and the input uncertainty in the same way, and to allow a comparison of their influence, the rainfall uncertainty can be represented by a parameter. To this end, we apply so-called rainfall multipliers to hydrologically independent storm events, as a probabilistic parameter representation of the possible rainfall variation (see the sketch below). As available rainfall records are very often point measurements at a discrete time step (hourly, daily, monthly, ...), they contain uncertainty due to a latent lack of spatial and temporal representativeness. The influence of this variability can also differ between hydrological models of different spatial and temporal scale. Therefore, we perform the sensitivity analyses on a semi-distributed model (SWAT) and a lumped model (NAM). The importance of the rainfall uncertainty and the model parameters is assessed and compared by considering different scenarios for the included parameters and the state of the models.
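    A minimal sketch of this setup, assuming a toy runoff model in which a rainfall multiplier sits alongside two ordinary parameters, can be written with the SALib package (the model, parameter names, and bounds are all hypothetical stand-ins for SWAT/NAM):

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["k_storage", "loss_frac", "rain_mult"],   # hypothetical parameters
            "bounds": [[1.0, 10.0], [0.0, 0.6], [0.8, 1.2]],    # rain_mult spans +/- 20 %
        }

        def runoff(params, rain_event=50.0):
            k, loss, mult = params
            effective = rain_event * mult * (1.0 - loss)        # perturbed effective rainfall
            return effective / k                                # crude linear-reservoir proxy

        X = saltelli.sample(problem, 1024)                      # Sobol' quasi-random design
        Y = np.apply_along_axis(runoff, 1, X)
        Si = sobol.analyze(problem, Y)
        print(dict(zip(problem["names"], np.round(Si["S1"], 3))))   # first-order indices
        print(dict(zip(problem["names"], np.round(Si["ST"], 3))))   # total-order indices

    If the index for rain_mult rivals those of the regular parameters, rainfall uncertainty cannot be compensated by parameter adjustment without consequences for the output.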

  4. Energy and Uncertainty in General Relativity

    NASA Astrophysics Data System (ADS)

    Cooperstock, F. I.; Dupre, M. J.

    2018-03-01

    The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years, but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured, as well as their detectors, are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector over a spread of time. General relativity adds further indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure familiar from special relativity, explaining why general relativity demands a measure in spacetime as opposed to 3-space. We also address misconceptions of our approach by certain authors.

  5. How to Avoid Errors in Error Propagation: Prediction Intervals and Confidence Intervals in Forest Biomass

    NASA Astrophysics Data System (ADS)

    Lilly, P.; Yanai, R. D.; Buckley, H. L.; Case, B. S.; Woollons, R. C.; Holdaway, R. J.; Johnson, J.

    2016-12-01

    Calculations of forest biomass and elemental content require many measurements and models, each contributing uncertainty to the final estimates. While sampling error is commonly reported, based on replicate plots, error due to uncertainty in the regression used to estimate biomass from tree diameter is usually not quantified. Some published estimates of uncertainty due to the regression models have used the uncertainty in the prediction of individuals, ignoring uncertainty in the mean, while others have propagated uncertainty in the mean while ignoring individual variation. Using the simple case of the calcium concentration of sugar maple leaves, we compare the variation among individuals (the standard deviation) to the uncertainty in the mean (the standard error) and illustrate the declining importance of individual variation as the number of individuals increases. For allometric models, the analogous statistics are the prediction interval (the residual variation in the model fit) and the confidence interval (describing the uncertainty in the best-fit model). The effect of propagating these two sources of error is illustrated using the mass of sugar maple foliage. The uncertainty in individual tree predictions was large for plots with few trees; for plots with 30 trees or more, the uncertainty in individuals was less important than the uncertainty in the mean. Authors of previously published analyses have reanalyzed their data to show the magnitude of these two sources of uncertainty at scales ranging from experimental plots to entire countries. The most correct analysis takes both sources of uncertainty into account, but for practical purposes, country-level reports of uncertainty in carbon stocks, as required by the IPCC, can ignore the uncertainty in individuals. Ignoring the uncertainty in the mean, however, will lead to exaggerated confidence in estimates of forest biomass and carbon and nutrient contents.
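    The distinction between the two intervals is easy to reproduce with ordinary least squares; the sketch below uses synthetic diameter-mass data (all numbers invented) and the standard formulas, in which the prediction interval adds the residual variance to the variance of the mean.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 25
        x = rng.uniform(10.0, 50.0, n)                 # e.g., tree diameters (synthetic)
        y = 0.3 * x + rng.normal(0.0, 1.5, n)          # e.g., foliage mass (synthetic)

        A = np.column_stack([np.ones(n), x])
        beta = np.linalg.lstsq(A, y, rcond=None)[0]
        s2 = np.sum((y - A @ beta) ** 2) / (n - 2)     # residual variance
        cov = s2 * np.linalg.inv(A.T @ A)              # covariance of fitted coefficients

        x0 = np.array([1.0, 30.0])                     # prediction at diameter 30
        var_mean = x0 @ cov @ x0                       # uncertainty in the mean response
        var_pred = var_mean + s2                       # plus individual variation
        t = 2.07                                       # approx. t(0.975, df = 23)
        print("confidence half-width:", t * np.sqrt(var_mean))
        print("prediction half-width:", t * np.sqrt(var_pred))

    Summed over many trees, the s2 term averages out while the var_mean term does not, which is exactly why the uncertainty in the mean dominates plot- and country-level totals.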

  6. Representing radar rainfall uncertainty with ensembles based on a time-variant geostatistical error modelling approach

    NASA Astrophysics Data System (ADS)

    Cecinati, Francesca; Rico-Ramirez, Miguel Angel; Heuvelink, Gerard B. M.; Han, Dawei

    2017-05-01

    The application of radar quantitative precipitation estimation (QPE) to hydrology and water quality models can be preferable to interpolated rainfall point measurements because of the wide coverage that radars provide, together with good spatio-temporal resolution. Nonetheless, it is often limited by the proneness of radar QPE to a multitude of errors. Although radar errors have been widely studied and techniques have been developed to correct most of them, residual errors remain intrinsic in radar QPE. Estimating the uncertainty of radar QPE and assessing how it propagates in modelling applications are important for quantifying the relative importance of the uncertainty associated with the radar rainfall input in the overall modelling uncertainty. A suitable tool for this purpose is the generation of radar rainfall ensembles. An ensemble is the representation of the rainfall field and its uncertainty through a collection of possible alternative rainfall fields, produced according to the observed errors, their spatial characteristics, and their probability distribution. The errors are derived from a comparison between radar QPE and ground point measurements. The novelty of the proposed ensemble generator is that it is based on a geostatistical approach that assures fast and robust generation of synthetic error fields, based on the time-variant characteristics of the errors. The method is developed to meet the requirements of operational application to large datasets. It is applied to a case study in Northern England, using the UK Met Office NIMROD radar composites at 1 km resolution and 1 h accumulation over an area of 180 km by 180 km. The errors are estimated using a network of 199 tipping-bucket rain gauges from the Environment Agency: 183 of the rain gauges are used for the error modelling, while 16 are kept apart for validation. The validation is done by comparing the radar rainfall ensemble with the values recorded by the validation rain gauges. The validated ensemble is then tested on a hydrological case study to show the advantage of probabilistic rainfall for uncertainty propagation. The ensemble spread only partially captures the mismatch between the modelled and the observed flow; the residual uncertainty can be attributed to other sources, in particular model structural uncertainty, parameter identification uncertainty, uncertainty in other inputs, and uncertainty in the observed flow.
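    The core of any such generator is the simulation of spatially correlated error fields. As a minimal stand-in for the geostatistical machinery, the sketch below generates unconditional Gaussian random fields with a prescribed correlation length by spectral filtering of white noise; the paper's generator is conditional on gauge data and time-variant, and the grid, correlation length, and error magnitude here are invented.

        import numpy as np

        def gaussian_error_field(nx, ny, corr_len, sigma, rng):
            # spectral method: shape white noise with a Gaussian-like power spectrum
            kx = np.fft.fftfreq(nx)[:, None]
            ky = np.fft.fftfreq(ny)[None, :]
            spectrum = np.exp(-2.0 * (np.pi * corr_len) ** 2 * (kx ** 2 + ky ** 2))
            noise = np.fft.fft2(rng.standard_normal((nx, ny)))
            field = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
            return sigma * field / field.std()          # rescale to the target error std

        rng = np.random.default_rng(0)
        members = [gaussian_error_field(180, 180, corr_len=15.0, sigma=0.4, rng=rng)
                   for _ in range(20)]                  # 20 plausible error fields
        # each member would perturb the radar QPE, e.g. R_i = R_radar * exp(field_i)
        print(members[0].shape, round(float(members[0].std()), 3))

    Running a hydrological model once per ensemble member then turns the rainfall uncertainty into a spread of simulated flows.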

  7. Establishment of metrological traceability in porosity measurements by x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Hermanek, Petr; Carmignato, Simone

    2017-09-01

    Internal porosity is inherent to many manufacturing processes, such as casting, additive manufacturing, and others. Since these defects cannot be completely avoided by improving production processes, it is important to have a reliable method to detect and evaluate them accurately. Accurate evaluation becomes even more important given current industrial trends to minimize the size and weight of products on the one hand, and to enhance their complexity and performance on the other. X-ray computed tomography (CT) has emerged as a promising instrument for holistic porosity measurements, offering several advantages over equivalent methods already established in the detection of internal defects. The main shortcomings of the conventional techniques are that they provide only general information about total porosity content (e.g. the Archimedes method) or are destructive (e.g. microscopy of cross-sections). In contrast, CT is a non-destructive technique providing complete information about the size, shape, and distribution of internal porosity. However, due to the lack of international standards and the fact that it is a relatively new measurement technique, CT as a measurement technology has not yet reached maturity. This study proposes a procedure for establishing measurement traceability in porosity measurements by CT, including the necessary evaluation of measurement uncertainty. The traceability transfer is carried out through a novel reference standard calibrated by optical and tactile coordinate measuring systems. The measurement uncertainty is calculated following international standards and guidelines. In addition, the accuracy of porosity measurements by CT, with the associated measurement uncertainty, is evaluated using the reference standard.

  8. Parameter-induced uncertainty quantification of crop yields, soil N2O and CO2 emission for 8 arable sites across Europe using the LandscapeDNDC model

    NASA Astrophysics Data System (ADS)

    Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf

    2014-05-01

    When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of simulation results are of significant importance. The uncertainties in simulation results of process-based ecosystem models may result from uncertainties in the process parameters that describe the model's processes, from model structure inadequacy, and from uncertainties in the observations. Data for developing and testing the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC to simulate crop yields and N2O and CO2 emissions, our aim is to assess the simulation uncertainty within a Bayesian framework based on the Metropolis-Hastings algorithm. Using the Gelman statistic as a convergence criterion and parallel computing techniques, we run multiple Markov chains independently in parallel, creating a random walk to estimate the joint model parameter distribution. From this distribution we constrain the parameter space, obtain probabilities of parameter values, and identify complex dependencies among them. With this distribution of the parameters that determine soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of simulation results and compare it with the measurement data.
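    For readers unfamiliar with the sampler, a bare-bones Metropolis-Hastings random walk looks like the sketch below; the two-parameter "model", priors, and observation error are toy stand-ins, not LandscapeDNDC.

        import numpy as np

        def log_post(theta, y_obs, model, sigma):
            if not (0.0 < theta[0] < 10.0 and 0.0 < theta[1] < 5.0):   # flat box prior
                return -np.inf
            resid = y_obs - model(theta)
            return -0.5 * np.sum((resid / sigma) ** 2)                 # Gaussian log-likelihood

        def metropolis(theta0, steps, prop_sd, rng, **kw):
            chain = np.empty((steps, len(theta0)))
            theta = np.asarray(theta0, dtype=float)
            lp = log_post(theta, **kw)
            for i in range(steps):
                cand = theta + rng.normal(0.0, prop_sd, theta.size)    # random-walk proposal
                lp_c = log_post(cand, **kw)
                if np.log(rng.uniform()) < lp_c - lp:                  # accept/reject step
                    theta, lp = cand, lp_c
                chain[i] = theta
            return chain

        model = lambda th: th[0] + th[1] * np.arange(5.0)              # toy response
        rng = np.random.default_rng(3)
        y_obs = model([4.0, 1.2]) + rng.normal(0.0, 0.3, 5)
        chain = metropolis([1.0, 1.0], 20000, 0.1, rng, y_obs=y_obs, model=model, sigma=0.3)
        print(chain[5000:].mean(axis=0), chain[5000:].std(axis=0))     # posterior after burn-in

    Running several such chains from dispersed starting points and checking the Gelman-Rubin statistic is the convergence test the abstract refers to.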

  9. Evaluation of uncertainty for regularized deconvolution: A case study in hydrophone measurements.

    PubMed

    Eichstädt, S; Wilkens, V

    2017-06-01

    An estimation of the measurand in dynamic metrology usually requires a deconvolution based on a dynamic calibration of the measuring system. Since deconvolution is, mathematically speaking, an ill-posed inverse problem, some kind of regularization is required to render the problem stable and obtain usable results. Many approaches to regularized deconvolution exist in the literature, but the corresponding evaluation of measurement uncertainties is, in general, an unsolved issue. In particular, the uncertainty contribution of the regularization itself is a topic of great importance, because it has a significant impact on the estimation result. Here, a versatile approach is proposed to express prior knowledge about the measurand based on a flexible, low-dimensional modeling of an upper bound on the magnitude spectrum of the measurand. This upper bound allows the derivation of an uncertainty associated with the regularization method in line with the guidelines in metrology. As a case study for the proposed method, hydrophone measurements in medical ultrasound with an acoustic working frequency of up to 7.5 MHz are considered, but the approach is applicable for all kinds of estimation methods in dynamic metrology, where regularization is required and which can be expressed as a multiplication in the frequency domain.
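    The idea of charging the regularization with its own uncertainty contribution can be sketched in a few lines: apply a Tikhonov-type inverse filter in the frequency domain, then bound the bias it introduces using an assumed upper bound on the magnitude spectrum of the measurand. The signal, impulse response, and regularization strength below are invented, and the bias bound is a crude proxy for the paper's low-dimensional spectral model.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 512
        t = np.arange(n)
        x_true = np.exp(-0.5 * ((t - 200) / 15.0) ** 2)       # "measurand" pulse
        h = np.exp(-t / 10.0); h /= h.sum()                   # system impulse response
        y = np.convolve(x_true, h)[:n] + rng.normal(0.0, 1e-3, n)

        H = np.fft.rfft(h, n)
        Y = np.fft.rfft(y, n)
        lam = 1e-4                                            # regularization strength
        G = np.conj(H) / (np.abs(H) ** 2 + lam)               # Tikhonov-type inverse filter
        x_hat = np.fft.irfft(Y * G, n)

        # bias of the regularized filter: |X_reg - X| <= B(f) * lam / (|H(f)|^2 + lam),
        # where B(f) is an upper bound on |X(f)|; here the estimate itself stands in for B
        B = np.abs(Y * G)
        bias_bound = B * lam / (np.abs(H) ** 2 + lam)
        print(np.max(np.abs(x_hat - x_true)), bias_bound.max())

    A calibrated, physically motivated B(f), as proposed in the paper, turns this bound into a proper uncertainty contribution in line with the GUM guidelines.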

  10. Sensitivity analysis in practice: providing an uncertainty budget when applying supplement 1 to the GUM

    NASA Astrophysics Data System (ADS)

    Allard, Alexandre; Fischer, Nicolas

    2018-06-01

    Sensitivity analysis associated with the evaluation of measurement uncertainty is a very important tool for the metrologist, enabling them to provide an uncertainty budget and to gain a better understanding of the measurand and the underlying measurement process. Using the GUM uncertainty framework, the contribution of an input quantity to the variance of the output quantity is obtained through so-called ‘sensitivity coefficients’. In contrast, such coefficients are no longer computed in cases where a Monte-Carlo method is used. In such a case, supplement 1 to the GUM suggests varying the input quantities one at a time, which is not an efficient method and may provide incorrect contributions to the variance in cases where significant interactions arise. This paper proposes different methods for the elaboration of the uncertainty budget associated with a Monte Carlo method. An application to the mass calibration example described in supplement 1 to the GUM is performed with the corresponding R code for implementation. Finally, guidance is given for choosing a method, including suggestions for a future revision of supplement 1 to the GUM.
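    As an alternative to varying inputs one at a time, first-order variance contributions can be estimated directly from Monte Carlo samples with a pick-and-freeze estimator. The sketch below uses a linear toy model as a stand-in for the supplement's mass calibration example (the model and distributions are illustrative, not those of the GUM).

        import numpy as np

        def model(m_ref, rho_a, dV):
            return m_ref + rho_a * dV                   # hypothetical measurement model

        rng = np.random.default_rng(5)
        N = 100_000
        draw = lambda: (rng.normal(100.0, 0.005, N),    # reference mass
                        rng.normal(1.2, 0.05, N),       # air density
                        rng.normal(0.01, 0.002, N))     # volume difference

        A, B = draw(), draw()
        yA, yB = model(*A), model(*B)
        var_y = yA.var()
        for i, name in enumerate(["m_ref", "rho_a", "dV"]):
            mixed = list(B)
            mixed[i] = A[i]                             # pick input i from A, freeze rest at B
            S1 = np.mean(yA * (model(*mixed) - yB)) / var_y   # first-order Sobol' estimator
            print(name, round(float(S1), 3))

    The resulting indices form an uncertainty budget that remains valid in the presence of interactions, unlike the one-at-a-time procedure criticized above.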

  11. Capital and time: uncertainty and qualitative measures of inequality.

    PubMed

    Bear, Laura

    2014-12-01

    This review compares Piketty's and Marx's approaches to capital and time in order to argue for the importance of qualitative measures of inequality. These measures emphasize experiences of uncertainty and insecurity that vary across classes and through history. They explore how the social rhythms of capital profoundly affect the ability to plan a life-course. Quantitative measures such as those used by Piketty, which focus on the amount of capital that accrues through time, cannot capture such important phenomena. This is especially because their calculations rest on absolute amounts of capital recorded in formal state statistics. Their limits are particularly revealed if we consider issues of informal labour, social reproduction, and changing institutional forms of public debt. If we are to build the inter-disciplinary rapprochement between social science and economics that Piketty calls for, it must be through asserting the value of qualitative measures of insecurity and its effects on decision making. These are important to track both at the macro-level of institutions and at the micro-level of human lives. It is, therefore, through emphasizing the existing strengths of both anthropology and history that we can meet Piketty's important challenge to make our scholarship relevant to current political and social debates. © London School of Economics and Political Science 2014.

  12. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    NASA Astrophysics Data System (ADS)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.

  13. Displacement, distance, and shape measurements of fast-rotating rough objects by two mutually tilted interference fringe systems.

    PubMed

    Günther, Philipp; Kuschmierz, Robert; Pfister, Thorsten; Czarske, Jürgen W

    2013-05-01

    The precise distance measurement of fast-moving rough surfaces is important in several applications such as lathe monitoring. A nonincremental interferometer based on two mutually tilted interference fringe systems has been realized for this task. The distance is coded in the phase difference between the generated interference signals corresponding to the fringe systems. Large tilting angles between the interference fringe systems are necessary for high sensitivity. However, due to the speckle effect at rough surfaces, different envelopes and phase jumps of the interference signals occur. At large tilting angles, these signals become dissimilar, resulting in a small correlation coefficient and a high measurement uncertainty. Based on a matching of illumination and receiving optics, the correlation coefficient and the phase difference estimation have been improved significantly. For axial displacement measurements of recurring rough surfaces, laterally moving with velocities of 5 m/s, an uncertainty of 110 nm has been attained. For nonrecurring surfaces, a distance measurement uncertainty of 830 nm has been achieved. Incorporating the additionally measured lateral velocity and the rotational speed, the two-dimensional shape of rotating objects can be reconstructed. Since the measurement uncertainty of the displacement, distance, and shape is nearly independent of the lateral surface velocity, this technique is predestined for fast-rotating objects, such as crankshafts, camshafts, vacuum pump shafts, or turning parts of lathes.

  14. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's Kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping, antenna- and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output. Copyright © 2014 Elsevier Inc. All rights reserved.
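    The two validation statistics used here are available off the shelf; a minimal sketch with synthetic exposure data (the lognormal error model and the choice of linear kappa weights are assumptions) looks as follows.

        import numpy as np
        from scipy.stats import spearmanr
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(4)
        measured = rng.lognormal(mean=-1.0, sigma=0.8, size=252)   # RF-EMF at receptor sites
        modelled = measured * rng.lognormal(0.0, 0.5, size=252)    # multiplicative model error

        rho, _ = spearmanr(measured, modelled)                     # ranking agreement

        def tertiles(v):
            return np.digitize(v, np.quantile(v, [1/3, 2/3]))      # classes 0, 1, 2

        kappa = cohen_kappa_score(tertiles(measured), tertiles(modelled), weights="linear")
        print(round(float(rho), 2), round(float(kappa), 2))

    Repeating this comparison for each Monte Carlo draw of the uncertain inputs yields distributions of rho and kappa, from which the dominant input uncertainties can be ranked.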

  15. Large contribution of natural aerosols to uncertainty in indirect forcing

    NASA Astrophysics Data System (ADS)

    Carslaw, K. S.; Lee, L. A.; Reddington, C. L.; Pringle, K. J.; Rap, A.; Forster, P. M.; Mann, G. W.; Spracklen, D. V.; Woodhouse, M. T.; Regayre, L. A.; Pierce, J. R.

    2013-11-01

    The effect of anthropogenic aerosols on cloud droplet concentrations and radiative properties is the source of one of the largest uncertainties in the radiative forcing of climate over the industrial period. This uncertainty affects our ability to estimate how sensitive the climate is to greenhouse gas emissions. Here we perform a sensitivity analysis on a global model to quantify the uncertainty in cloud radiative forcing over the industrial period caused by uncertainties in aerosol emissions and processes. Our results show that 45 per cent of the variance of aerosol forcing since about 1750 arises from uncertainties in natural emissions of volcanic sulphur dioxide, marine dimethylsulphide, biogenic volatile organic carbon, biomass burning and sea spray. Only 34 per cent of the variance is associated with anthropogenic emissions. The results point to the importance of understanding pristine pre-industrial-like environments, with natural aerosols only, and suggest that improved measurements and evaluation of simulated aerosols in polluted present-day conditions will not necessarily result in commensurate reductions in the uncertainty of forcing estimates.

  16. The Thermal Conductivity of Earth's Core: A Key Geophysical Parameter's Constraints and Uncertainties

    NASA Astrophysics Data System (ADS)

    Williams, Q.

    2018-05-01

    The thermal conductivity of iron alloys at high pressures and temperatures is a critical parameter in governing (a) the present-day heat flow out of Earth's core, (b) the inferred age of Earth's inner core, and (c) the thermal evolution of Earth's core and lowermost mantle. It is, however, one of the least well-constrained important geophysical parameters, with current estimates for end-member iron under core-mantle boundary conditions varying by about a factor of 6. Here, the current state of calculations, measurements, and inferences that constrain thermal conductivity at core conditions are reviewed. The applicability of the Wiedemann-Franz law, commonly used to convert electrical resistivity data to thermal conductivity data, is probed: Here, whether the constant of proportionality, the Lorenz number, is constant at extreme conditions is of vital importance. Electron-electron inelastic scattering and increases in Fermi-liquid-like behavior may cause uncertainties in thermal conductivities derived from both first-principles-associated calculations and electrical conductivity measurements. Additional uncertainties include the role of alloying constituents and local magnetic moments of iron in modulating the thermal conductivity. Thus, uncertainties in thermal conductivity remain pervasive, and hence a broad range of core heat flows and inner core ages appear to remain plausible.
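    For reference, the Wiedemann-Franz conversion discussed above reads, in its standard form,

        \kappa = L \, \sigma \, T, \qquad
        L_0 = \frac{\pi^2}{3}\left(\frac{k_\mathrm{B}}{e}\right)^2
            \approx 2.44 \times 10^{-8}\ \mathrm{W\,\Omega\,K^{-2}},

    where \kappa is the thermal conductivity, \sigma the electrical conductivity, T the temperature, and L_0 the Sommerfeld value of the Lorenz number; the review's central question is whether L stays near L_0 under core conditions.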

  17. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation, to identify which conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing the parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework is the identification, enumeration, and documentation of all assumptions: those made during conceptual model development, those required by the mathematical model, those required by the numerical model, those made during the spatial and temporal discretization process, those needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.

  18. Comparison of Approaches for Measuring the Mass Accommodation Coefficient for the Condensation of Water and Sensitivities to Uncertainties in Thermophysical Properties

    PubMed Central

    2012-01-01

    We compare and contrast measurements of the mass accommodation coefficient of water on a water surface made using ensemble and single particle techniques under conditions of supersaturation and subsaturation, respectively. In particular, we consider measurements made using an expansion chamber, a continuous flow streamwise thermal gradient cloud condensation nuclei chamber, the Leipzig Aerosol Cloud Interaction Simulator, aerosol optical tweezers, and electrodynamic balances. Although this assessment is not intended to be comprehensive, these five techniques are complementary in their approach and give values that span the range from near 0.1 to 1.0 for the mass accommodation coefficient. We use the same semianalytical treatment to assess the sensitivities of the measurements made by the various techniques to thermophysical quantities (diffusion constants, thermal conductivities, saturation pressure of water, latent heat, and solution density) and experimental parameters (saturation value and temperature). This represents the first effort to assess and compare measurements made by different techniques to attempt to reduce the uncertainty in the value of the mass accommodation coefficient. Broadly, we show that the measurements are consistent within the uncertainties inherent to the thermophysical and experimental parameters and that the value of the mass accommodation coefficient should be considered to be larger than 0.5. Accurate control and measurement of the saturation ratio is shown to be critical for a successful investigation of the surface transport kinetics during condensation/evaporation. This invariably requires accurate knowledge of the partial pressure of water, the system temperature, the droplet curvature and the saturation pressure of water. Further, the importance of including and quantifying the transport of heat in interpreting droplet measurements is highlighted; the particular issues associated with interpreting measurements of condensation/evaporation rates with varying pressure are discussed, measurements that are important for resolving the relative importance of gas diffusional transport and surface kinetics. PMID:23057492

  19. Uncertainty analysis of gross primary production partitioned from net ecosystem exchange measurements

    NASA Astrophysics Data System (ADS)

    Raj, Rahul; Hamm, Nicholas Alexander Samuel; van der Tol, Christiaan; Stein, Alfred

    2016-03-01

    Gross primary production (GPP) can be separated from flux tower measurements of net ecosystem exchange (NEE) of CO2. This is used increasingly to validate process-based simulators and remote-sensing-derived estimates of simulated GPP at various time steps. Proper validation includes the uncertainty associated with this separation. In this study, uncertainty assessment was done in a Bayesian framework. It was applied to data from the Speulderbos forest site, The Netherlands. We estimated the uncertainty in GPP at half-hourly time steps, using a non-rectangular hyperbola (NRH) model for its separation from the flux tower measurements. The NRH model provides a robust empirical relationship between radiation and GPP. It includes the degree of curvature of the light response curve, radiation and temperature. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. We defined the prior distribution of each NRH parameter and used Markov chain Monte Carlo (MCMC) simulation to estimate the uncertainty in the separated GPP from the posterior distribution at half-hourly time steps. This time series also allowed us to estimate the uncertainty at daily time steps. We compared the informative with the non-informative prior distributions of the NRH parameters and found that both choices produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
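    In its canonical light-response form, the NRH model can be written as the lower root of a quadratic; the sketch below implements it (the study's version additionally includes temperature dependence, and the parameter values here are arbitrary).

        import numpy as np

        def nrh_gpp(par, alpha, gpp_max, theta):
            # non-rectangular hyperbola: lower root of
            # theta * P^2 - (alpha * I + gpp_max) * P + alpha * I * gpp_max = 0
            s = alpha * par + gpp_max
            return (s - np.sqrt(s ** 2 - 4.0 * theta * alpha * par * gpp_max)) / (2.0 * theta)

        par = np.linspace(0.0, 2000.0, 5)                    # incident radiation
        print(nrh_gpp(par, alpha=0.02, gpp_max=25.0, theta=0.9))

    In the Bayesian setup described above, alpha, gpp_max, and theta receive prior distributions, and MCMC then yields posterior (hence uncertainty) bands for the separated GPP.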

  20. Optimized clustering estimators for BAO measurements accounting for significant redshift uncertainty

    NASA Astrophysics Data System (ADS)

    Ross, Ashley J.; Banik, Nilanjan; Avila, Santiago; Percival, Will J.; Dodelson, Scott; Garcia-Bellido, Juan; Crocce, Martin; Elvin-Poole, Jack; Giannantonio, Tommaso; Manera, Marc; Sevilla-Noarbe, Ignacio

    2017-12-01

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the baryon acoustic oscillation (BAO) information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line of sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty σz ≥ 0.02(1 + z), we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for σz ≥ 0.02(1 + z). For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations (combining two separate sets) of galaxy simulations mimicking the Dark Energy Survey Year 1 (DES Y1) sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.

  1. Estimation of Uncertainties in Stage-Discharge Curve for an Experimental Himalayan Watershed

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Sen, S.

    2016-12-01

    Various water resource projects developed on rivers originating in the Himalayan region, the "Water Tower of Asia", play an important role in downstream development. Flow measurements at the desired river site are critical for river engineers and hydrologists for water resources planning and management, flood forecasting, reservoir operation, and flood inundation studies. However, accurate discharge assessment of these mountainous rivers is costly, tedious, and frequently dangerous to operators during flood events. Currently, in India, discharge estimation is linked to the stage-discharge relationship known as the rating curve. This relationship is affected by a high degree of uncertainty. Estimating the uncertainty of a rating curve remains a relevant challenge because it is not easy to parameterize. The main sources of rating curve uncertainty are errors in discharge measurement, variation in hydraulic conditions, and errors in depth measurement. In this study our objective is to obtain the best parameters of the rating curve that fit the limited record of observations and to estimate the uncertainties at different depths obtained from the rating curve. The rating curve parameters of the standard power law are estimated by maximum likelihood for three different streams of the Aglar watershed located in the lesser Himalayas. Quantification of the uncertainties in the developed rating curves is obtained from the estimated variances and covariances of the rating curve parameters. Results showed that the uncertainties varied with catchment behavior, with errors between 0.006 and 1.831 m3/s. Discharge uncertainty in the Aglar watershed streams depends significantly on the extent of extrapolation outside the range of observed water levels. The extrapolation analysis confirmed that extrapolating more than 15% above the maximum or 5% below the minimum observed discharge is not recommended for these mountainous gauging sites.
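    A compact version of this fitting-plus-propagation chain, with invented gaugings, can be written with scipy; least squares coincides with maximum likelihood under Gaussian errors, and the parameter covariance is propagated to a predicted discharge via a finite-difference gradient.

        import numpy as np
        from scipy.optimize import curve_fit

        h = np.array([0.25, 0.40, 0.55, 0.80, 1.10, 1.45])   # stage (m), hypothetical
        q = np.array([0.08, 0.25, 0.52, 1.30, 2.90, 5.60])   # discharge (m3/s), hypothetical

        def rating(h, a, h0, b):
            return a * (h - h0) ** b                         # standard power-law rating curve

        popt, pcov = curve_fit(rating, h, q, p0=[3.0, 0.1, 1.8],
                               bounds=([0.1, -0.5, 1.0], [20.0, 0.2, 3.0]))

        h_new, eps = 1.6, 1e-6                               # extrapolated stage
        grad = np.array([(rating(h_new, *(popt + eps * np.eye(3)[i]))
                          - rating(h_new, *popt)) / eps for i in range(3)])
        sd_q = np.sqrt(grad @ pcov @ grad)                   # 1-sigma discharge uncertainty
        print(popt, rating(h_new, *popt), sd_q)

    Repeating the last step across stages shows the uncertainty growing rapidly outside the gauged range, which is the basis for the extrapolation limits quoted above.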

  2. Uncertainty Quantification and Statistical Convergence Guidelines for PIV Data

    NASA Astrophysics Data System (ADS)

    Stegmeir, Matthew; Kassen, Dan

    2016-11-01

    As Particle Image Velocimetry (PIV) has matured, it has developed into a robust and flexible velocimetry technique used by expert and non-expert users alike. While historical estimates of PIV accuracy have typically relied heavily on rules of thumb and analysis of idealized synthetic images, increased emphasis has recently been placed on better quantifying real-world PIV measurement uncertainty. Multiple techniques have been developed to provide per-vector instantaneous uncertainty estimates for PIV measurements. Real-world experimental conditions often complicate the collection of optimal data, and the effect of these conditions is important to consider when planning an experimental campaign. The current work uses the results of PIV uncertainty quantification techniques to develop a framework in which PIV users can apply estimated confidence intervals to compute reliable data convergence criteria for optimal sampling of flow statistics. Results are compared using experimental and synthetic data, and recommended guidelines and procedures leveraging estimated PIV confidence intervals for efficient sampling toward converged statistics are provided.
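    For the simplest statistic, the mean velocity, the convergence criterion follows directly from the confidence interval on a mean; a sketch (the fluctuation level and tolerance are invented, and samples are assumed statistically independent):

        import numpy as np

        def n_samples_for_mean(sigma_u, half_width, z=1.96):
            # independent samples needed so the 95 % CI on the mean is +/- half_width
            return int(np.ceil((z * sigma_u / half_width) ** 2))

        # fluctuations of 0.8 m/s (turbulence plus per-vector PIV uncertainty),
        # mean velocity required to within +/- 0.02 m/s
        print(n_samples_for_mean(0.8, 0.02))   # ~6147 vector fields

    Per-vector uncertainty estimates enter through sigma_u, so regions with poor seeding or strong gradients automatically demand more samples.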

  3. Benefits of the uncertainty management intervention for African American and White older breast cancer survivors: 20-month outcomes.

    PubMed

    Gil, Karen M; Mishel, Merle H; Belyea, Michael; Germino, Barbara; Porter, Laura S; Clayton, Margaret

    2006-01-01

    In a 2 x 2 randomized block repeated measures design, this study evaluated the follow-up efficacy of the uncertainty management intervention at 20 months. The sample included 483 recurrence-free women (342 White, 141 African American; mean age = 64 years) who were 5-9 years posttreatment for breast cancer. Women were randomly assigned to either the intervention or the usual-care control condition. The intervention was delivered during 4 weekly telephone sessions in which survivors were guided in the use of audiotaped cognitive-behavioral strategies and a self-help manual. Repeated measures MANOVAs evaluating treatment group, ethnic group, and treatment-by-ethnicity interaction effects at 20 months indicated that training in uncertainty management resulted in improvements in cognitive reframing, cancer knowledge, and a variety of coping skills. Importantly, the 20-month outcomes also demonstrated benefits for women in the intervention condition in terms of declines in illness uncertainty and stable effects in personal growth over time.

  4. Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes

    NASA Astrophysics Data System (ADS)

    Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris

    2017-12-01

    Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer to the question was unequivocal: Kd, a measure of how much light penetrates the lake, dominates sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.
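    GLUE itself reduces to a few lines: sample the prior, score each run with an informal likelihood, discard non-behavioural runs, and form weighted quantiles. The sketch below is a one-parameter toy version (the flux "model", observation, and behavioural threshold are invented, not the CSLM).

        import numpy as np

        rng = np.random.default_rng(8)
        obs = 118.0                                     # observed daily-mean flux (toy value)

        def model(kd):
            return 100.0 + 25.0 * np.exp(-kd)           # stand-in flux response to Kd

        kd = rng.uniform(0.1, 3.0, 5000)                # prior sample of Kd
        sim = model(kd)
        err2 = (sim - obs) ** 2
        L = np.where(err2 < np.quantile(err2, 0.3),     # keep best 30 % as behavioural
                     1.0 / (err2 + 1e-9), 0.0)          # informal inverse-error likelihood
        L /= L.sum()

        order = np.argsort(sim)                         # weighted 5th / 95th percentiles
        cw = np.cumsum(L[order])
        print(sim[order][np.searchsorted(cw, 0.05)],
              sim[order][np.searchsorted(cw, 0.95)])

    The posterior weights on kd are simultaneously the parameter likelihoods that feed the dominance analysis described above.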

  5. Progress toward a new beam measurement of the neutron lifetime

    NASA Astrophysics Data System (ADS)

    Hoogerheide, Shannon Fogwell

    2016-09-01

    Neutron beta decay is the simplest example of nuclear beta decay. A precise value of the neutron lifetime is important for consistency tests of the Standard Model and for Big Bang Nucleosynthesis models. The beam neutron lifetime method requires the absolute counting of decay protons in a neutron beam of precisely known flux. Recent work has resulted in improvements in both the neutron and proton detection systems that should permit a significant reduction in systematic uncertainties. A new measurement of the neutron lifetime using the beam method will be performed at the National Institute of Standards and Technology Center for Neutron Research. The projected uncertainty of this new measurement is 1 s. An overview of the measurement and the technical improvements will be discussed.

  6. Physicians' reactions to uncertainty in the context of shared decision making.

    PubMed

    Politi, Mary C; Légaré, France

    2010-08-01

    Physicians' reactions to uncertainty may influence their willingness to engage in shared decision making (SDM). This study aimed to identify variables associated with physicians' anxiety due to uncertainty and their reluctance to disclose uncertainty to patients. We conducted a cross-sectional secondary analysis of longitudinal data from an implementation study of SDM among primary care professionals (n=122). Outcomes were anxiety due to uncertainty and reluctance to disclose uncertainty to patients. Hypothesized factors associated with the outcomes included attitude, social norm, perceived behavioral control, intention to implement SDM in practice, and socio-demographics. Stepwise linear regression was used to identify predictors of anxiety due to uncertainty and reluctance to disclose uncertainty to patients. In multivariate analyses, anxiety due to uncertainty was influenced by female gender (beta=0.483; p=0.0039), residency status (1st year: beta=0.600; p=0.001; 2nd year: beta=0.972; p<0.001), and number of hours worked per week (beta=-0.012; p=0.048). Reluctance to disclose uncertainty to patients was influenced by having more years of formal education (beta=-1.996; p=0.012). The variables associated with anxiety due to uncertainty differ from those associated with reluctance to disclose uncertainty to patients. Given the importance of communicating uncertainty during SDM, measuring physicians' reactions to uncertainty is essential in SDM implementation studies. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  7. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used to calibrate the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating at daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure in which the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day, a random sample of precipitation and temperature inputs was drawn and applied as input to the hydrological model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; the rating curve errors thus lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrological model using a DREAM-based MCMC routine. The effects of having less information (e.g. missing one streamflow measurement for defining the rating curve, or missing one precipitation station) were also investigated.

  8. SU-F-T-316: A Model to Deal with Dosimetric and Delivery Uncertainties in Radiotherapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haering, P; Lang, C; Splinter, M

    2016-06-15

    Purpose: The conventional way of dealing with uncertainties resulting from dose calculation or beam delivery in IMRT is to perform verification measurements for the plan in question. Here we present an alternative based on recommendations given in the AAPM TG-142 report and on treatment-specific parameters that model the uncertainties of the plan delivery. Methods: The basis of the model is the assignment of uncertainty parameters to all segment fields or control point sequences of a plan. Each field shape is analyzed for complexity, dose rate, number of MU, field-size-related output, as well as factors for in-field/out-of-field position and penumbra regions. Together with depth-related uncertainties, a 3D matrix is generated by a projection algorithm. Patient anatomy is included as an uncertainty CT dataset as well: object density is classified into 4 categories (close to water, lung, bone, and gradient regions) with additional uncertainties. The result is then exported as a DICOM dose file by the software tool (written in IDL, Exelis), with the given resolution and target point. Results: Uncertainty matrices for several patient cases have been calculated and compared side by side in the planning system. The result is not always intuitive, but it clearly indicates high and low uncertainties related to OARs and target volumes, as well as to measured gamma distributions. Conclusion: The imported uncertainty datasets may help the treatment planner to understand the complexity of the treatment plan. The planner might then decide to change the plan to produce a better-suited uncertainty distribution, e.g. by changing the beam angles to influence high-uncertainty spots, or by using another treatment setup, resulting in a plan with lower uncertainties. A next step could be to include such a model in the optimization algorithm, adding a new dose uncertainty constraint.

  9. Preliminary evaluation of the importance of existing hydraulic-head observation locations to advective-transport predictions, Death Valley regional flow system, California and Nevada

    USGS Publications Warehouse

    Hill, Mary C.; Ely, D. Matthew; Tiedeman, Claire; O'Brien, Grady M.; D'Agnese, Frank A.; Faunt, Claudia C.

    2001-01-01

    When a model is calibrated by nonlinear regression, calculated diagnostic statistics and measures of uncertainty provide a wealth of information about many aspects of the system. This report presents a method of ranking the likely importance of existing observation locations using measures of prediction uncertainty. It is suggested that continued monitoring is warranted at more important locations, and unwarranted or less warranted at less important locations. The report develops the methodology and then demonstrates it using the hydraulic-head observation locations of a three-layer model of the Death Valley regional flow system (DVRFS). The predictions of interest are subsurface transport from beneath Yucca Mountain and 14 Underground Test Areas. The advective component of transport is considered because it is the component most affected by the system dynamics represented by the regional-scale model being used. The problem is addressed using the capabilities of the U.S. Geological Survey computer program MODFLOW-2000, with its ADVective-Travel Observation (ADV) Package, and an additional computer program developed for this work. The methods presented in this report are used in three ways. (1) The ratings for individual observations are obtained by manipulating the measures of prediction uncertainty and do not involve recalibrating the model. In this analysis, observation locations are omitted individually and the resulting increase in prediction uncertainty is calculated. The uncertainty is quantified as the standard deviation of the simulated advective transport, and the increase in uncertainty as the percent increase in that standard deviation caused by omitting the one observation location from the calculation. In general, observation locations associated with larger increases are rated as more important. (2) Ratings for largely geographically based groups are obtained using a straightforward extension of the method used for individual observation locations. This analysis is needed where observations are clustered, to determine whether the area is important to the predictions of interest. (3) Finally, the method is used to evaluate omitting a set of 100 observation locations. These locations were selected because they had low individual ratings and were not among the few locations at which hydraulic heads from deep in the system were measured. The major results of the three analyses, when applied to the three-layer DVRFS ground-water flow system, are described in the following paragraphs; the discussion is labeled 1 to 3 to relate it to the three uses of the method listed above. (1) The individual observation-location analysis indicates that three observation locations are most important, located in Emigrant Valley, Oasis Valley, and Beatty. Notably, these and other observations shown to be important by this analysis are far from the travel paths considered. This demonstrates the importance of the regional setting within which the transport occurs, the importance of including some sites throughout the area in the monitoring network, and the importance of including sites in these areas in particular. The method indicates that the 19 observation locations that reflect hydraulic heads deeper in the system (in model layers 1, 2, and 3) are not very important. This appears to be because these observations lie in the vicinity of shallow observation locations that also generally are rated as of low importance, and because the model layers are hydraulically well connected vertically. The value of deep observations for testing conceptual models, however, is stressed. As a result, the deep observations are rated higher than the results of the analysis alone would suggest, and none of these observations are omitted in the scenario discussed under (3) below. (2) The geographic grouping of th
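    The core of the ranking (omit one observation, recompute the prediction standard deviation, report the percent increase) can be sketched under a linearized regression approximation; the sensitivity matrix X, weights w, and prediction sensitivity vector g below are synthetic stand-ins for the MODFLOW-2000/ADV quantities:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_obs, n_par = 30, 4

    X = rng.normal(size=(n_obs, n_par))   # head sensitivities to parameters
    w = np.full(n_obs, 1.0)               # observation weights (1/variance)
    g = rng.normal(size=n_par)            # sensitivity of the advective-transport
                                          # prediction to the parameters

    def pred_sd(keep):
        """Linearized prediction standard deviation from the kept observations."""
        Xk, wk = X[keep], w[keep]
        C = np.linalg.inv(Xk.T @ (wk[:, None] * Xk))   # parameter covariance
        return float(np.sqrt(g @ C @ g))

    base = pred_sd(np.arange(n_obs))
    pct_increase = np.array([
        100.0 * (pred_sd(np.delete(np.arange(n_obs), i)) - base) / base
        for i in range(n_obs)
    ])
    print("most important observation locations:", np.argsort(pct_increase)[::-1][:3])
    ```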

  10. Comparison of model estimated and measured direct-normal solar irradiance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halthore, R.N.; Schwartz, S.E.; Michalsky, J.J.

    1997-12-01

    Direct-normal solar irradiance (DNSI), the energy in the solar spectrum incident in unit time at the Earth's surface on a unit area perpendicular to the direction to the Sun, depends only on atmospheric extinction of solar energy without regard to the details of the extinction, whether absorption or scattering. Here we report a set of closure experiments performed in north central Oklahoma in April 1996 under cloud-free conditions, wherein measured atmospheric composition and aerosol optical thickness are input to a radiative transfer model, MODTRAN 3, to estimate DNSI, which is then compared with measured values obtained with normal-incidence pyrheliometers and absolute cavity radiometers. Uncertainty in aerosol optical thickness (AOT) dominates the uncertainty in the DNSI calculation. AOT measured by an independently calibrated Sun photometer and a rotating shadow-band radiometer agree to within the uncertainties of each measurement. For 36 independent comparisons the agreement between measured and model-estimated values of DNSI falls within the combined uncertainties of the measurement (0.3–0.7%) and model calculation (1.8%), albeit with a slight average model underestimate of (−0.18 ± 0.94)%; for a DNSI of 839 W m⁻² this corresponds to −1.5 ± 7.9 W m⁻². The agreement is nearly independent of air mass and water-vapor path abundance. These results thus establish the accuracy of the current knowledge of the solar spectrum, its integrated power, and the atmospheric extinction as a function of wavelength as represented in MODTRAN 3. An important consequence is that atmospheric absorption of short-wave energy is accurately parametrized in the model to within the above uncertainties. © 1997 American Geophysical Union

  11. The Uncertainty of Local Background Magnetic Field Orientation in Anisotropic Plasma Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerick, F.; Saur, J.; Papen, M. von, E-mail: felix.gerick@uni-koeln.de

    In order to resolve and characterize anisotropy in turbulent plasma flows, a proper estimation of the background magnetic field is crucially important. Various approaches to calculating the background magnetic field, ranging from local to globally averaged fields, are commonly used in the analysis of turbulent data. We investigate how the uncertainty in the orientation of a scale-dependent background magnetic field influences the ability to resolve anisotropy. Therefore, we introduce a quantitative measure, the angle uncertainty, that characterizes the uncertainty of the orientation of the background magnetic field that turbulent structures are exposed to. The angle uncertainty can be used as a condition to estimate the ability to resolve anisotropy with certain accuracy. We apply our description to resolve the spectral anisotropy in fast solar wind data. We show that, if the angle uncertainty grows too large, the power of the turbulent fluctuations is attributed to false local magnetic field angles, which may lead to an incorrect estimation of the spectral indices. In our results, an apparent robustness of the spectral anisotropy to false local magnetic field angles is observed, which can be explained by a stronger increase of power for lower frequencies when the scale of the local magnetic field is increased. The frequency-dependent angle uncertainty is a measure that can be applied to any turbulent system.

  12. A Method to Estimate Uncertainty in Radiometric Measurement Using the Guide to the Expression of Uncertainty in Measurement (GUM) Method; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Sengupta, M.; Reda, I.

    Radiometric data with known and traceable uncertainty are essential for climate change studies to better understand cloud-radiation interactions and the Earth's radiation budget. Further, adopting a known and traceable method of estimating uncertainty with respect to SI ensures that the uncertainties quoted for radiometric measurements can be compared based on documented methods of derivation. Statements about the overall measurement uncertainty can therefore only be made on an individual basis, taking all relevant factors into account. This poster provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements from radiometers. The approach follows the Guide to the Expression of Uncertainty in Measurement (GUM).
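    A minimal sketch of the GUM workflow the poster describes: tabulate standard uncertainties u_i with sensitivity coefficients c_i, combine in quadrature, and expand with a coverage factor. The budget entries below are illustrative, not NREL's:

    ```python
    import math

    # Illustrative uncertainty budget for a radiometer calibration.
    # Each entry: (standard uncertainty u_i, sensitivity coefficient c_i).
    budget = {
        "reference irradiance": (1.2, 1.0),    # W/m^2
        "detector noise":       (0.4, 1.0),
        "temperature drift":    (0.3, 0.8),
        "cosine response":      (0.9, 1.0),
    }

    # GUM: combine in quadrature, then expand with a coverage factor k.
    u_c = math.sqrt(sum((c * u) ** 2 for u, c in budget.values()))
    U = 2.0 * u_c    # k = 2, approximately 95 % coverage
    print(f"combined u_c = {u_c:.2f} W/m^2, expanded U = {U:.2f} W/m^2")
    ```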

  13. An exacting transition probability measurement - a direct test of atomic many-body theories.

    PubMed

    Dutta, Tarun; De Munshi, Debashis; Yum, Dahyun; Rebhi, Riadh; Mukherjee, Manas

    2016-07-19

    A new protocol for measuring the branching fraction of hydrogenic atoms with only statistically limited uncertainty is proposed and demonstrated for the decay of the P3/2 level of the barium ion, with precision below 0.5%. Heavy hydrogenic atoms like the barium ion are test beds for fundamental physics such as atomic parity violation, and they also hold the key to understanding nucleosynthesis in stars. To draw definitive conclusions about possible physics beyond the Standard Model by measuring atomic parity violation in the barium ion, it is necessary to measure the dipole transition probabilities of low-lying excited states with a precision better than 1%. Furthermore, enhancing our understanding of the barium puzzle in barium stars requires branching-fraction data for proper modelling of nucleosynthesis. Our measurements are the first to provide a direct test of quantum many-body calculations on the barium ion with a precision below one percent and, more importantly, with no known systematic uncertainties. The unique measurement protocol proposed here can easily be extended to any decay with more than two channels and hence paves the way for measuring the branching fractions of other hydrogenic atoms with no significant systematic uncertainties.

  14. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    NASA Astrophysics Data System (ADS)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ (3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.
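    The contrast between closed-form moment propagation and scenario sampling can be illustrated with a linear-Gaussian toy model; the matrix A standing in for the dose influence of range/setup errors is synthetic, and the Gaussian pencil-beam machinery of APM is not reproduced:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_vox, n_err = 50, 6

    # Linearized dose model: dose = d0 + A @ e, with e ~ N(0, Sigma) the
    # range/setup error vector; A is a synthetic influence matrix.
    d0 = rng.uniform(1.8, 2.2, n_vox)
    A = rng.normal(0.0, 0.05, (n_vox, n_err))
    L = rng.normal(0.0, 1.0, (n_err, n_err))
    Sigma = L @ L.T / n_err                     # arbitrary SPD covariance

    # Closed-form moments (the APM idea: no scenario sampling needed).
    var_closed = np.einsum("ij,jk,ik->i", A, Sigma, A)

    # Sampling-based benchmark, mirroring the paper's 5000-sample reference.
    e = rng.multivariate_normal(np.zeros(n_err), Sigma, size=5000)
    dose = d0 + e @ A.T
    print("max |sampled var - closed-form var|:",
          float(np.abs(dose.var(axis=0) - var_closed).max()))
    ```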

  15. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy.

    PubMed

    Wahl, N; Hennig, P; Wieser, H P; Bangert, M

    2017-06-26

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ (3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.

  16. Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties

    NASA Astrophysics Data System (ADS)

    Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine

    2015-04-01

    The economic evaluation of different adaptation options is important to support policy-makers who need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must be able to adequately consider the uncertainties originating from these different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions; decision-finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new; typically, such models are applied to global-scale studies but barely to local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to thaw of permafrost in the rock slopes of Ritzlihorn; repeated rockfall events accumulated at the debris fan, formed a sediment source for debris flows, and were transported downvalley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly destroyed. Several adaptation measures were discussed by the responsible authorities, but decision making is particularly challenging under multiple uncertainties. For this area, we developed a stochastic optimization model for concrete, real-case adaptation options and measures, and used dynamic programming to explore the optimal adaptation decisions under uncertain impacts of climate change on debris flows and flooding. Even though simplifications had to be made, the results were concrete and tangible, indicating that, under our assumptions and modeling, excavation is a preferable adaptation option compared to building a dam or relocating. This is not necessarily intuitive and adds an additional perspective to what has so far been sketched and evaluated by cantonal and communal authorities for Guttannen. Moreover, building an alternative cantonal road appears to be more expensive than the costs incurred due to road closure.
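    A compact sketch of the dynamic-programming idea, assuming a two-state climate, action-independent transitions, and purely hypothetical costs and damage-mitigation factors (not the study's data):

    ```python
    # Hypothetical two-state climate regime (calm / active debris-flow) with
    # action-independent transitions; costs in arbitrary monetary units.
    P = {"calm":   {"calm": 0.8, "active": 0.2},
         "active": {"calm": 0.3, "active": 0.7}}
    damage = {"calm": 0.0, "active": 5.0}
    action_cost = {"do nothing": 0.0, "excavate": 0.5, "dam": 1.5}
    mitigation = {"do nothing": 1.0, "excavate": 0.4, "dam": 0.2}  # damage factor

    gamma, V = 0.95, {s: 0.0 for s in P}

    def q_value(s, a, V):
        """Expected discounted cost of taking action a in climate state s."""
        cont = sum(P[s][s2] * V[s2] for s2 in P)
        return action_cost[a] + mitigation[a] * damage[s] + gamma * cont

    for _ in range(500):                       # value iteration to convergence
        V = {s: min(q_value(s, a, V) for a in action_cost) for s in P}

    policy = {s: min(action_cost, key=lambda a: q_value(s, a, V)) for s in P}
    print(policy)
    ```

    With these invented numbers the policy in the active regime selects excavation, echoing the study's qualitative finding.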

  17. [Evaluation of measurement uncertainty of welding fume in welding workplace of a shipyard].

    PubMed

    Ren, Jie; Wang, Yanrang

    2015-12-01

    To evaluate the measurement uncertainty of welding fume in the air of the welding workplace of a shipyard, and to provide quality assurance for the measurement. According to GBZ/T 192.1-2007 ("Determination of dust in the air of workplace - Part 1: Total dust concentration") and JJF 1059-1999 ("Evaluation and expression of measurement uncertainty"), the uncertainty for the determination of welding fume was evaluated and the measurement results were completely described. The concentration of welding fume was 3.3 mg/m³, and the expanded uncertainty was 0.24 mg/m³. The repeatability of the dust concentration determination introduced an uncertainty of 1.9%, the measurement using an electronic balance introduced a standard uncertainty of 0.3%, and the measurement of sample quality introduced a standard uncertainty of 3.2%. During the determination of welding fume, the standard uncertainty introduced by the measurement of sample quality is the dominant uncertainty. In the process of sampling and measurement, quality control should focus on the collection efficiency of dust, air humidity, sample volume, and measuring instruments.
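    The quoted budget can be checked by combining the three relative standard uncertainties in quadrature and expanding with k = 2, which lands close to the reported 0.24 mg/m³ (the small difference is consistent with rounding in the quoted components):

    ```python
    import math

    # Relative standard uncertainties quoted in the abstract (%).
    u_repeatability, u_balance, u_sample = 1.9, 0.3, 3.2

    u_c_rel = math.sqrt(u_repeatability**2 + u_balance**2 + u_sample**2)
    c = 3.3                              # measured concentration, mg/m^3
    U = 2.0 * (u_c_rel / 100.0) * c      # expanded uncertainty, k = 2
    print(f"u_c = {u_c_rel:.1f} %, U = {U:.3f} mg/m^3")   # ~3.7 %, ~0.246
    ```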

  18. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal variation as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
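    A minimal sketch of the updating idea: treat each line of evidence as a likelihood ratio and update the odds of an adverse effect sequentially. The prior and likelihoods are illustrative, and conditional independence of the lines is assumed (the paper's network structure is richer):

    ```python
    prior_impact = 0.5      # P(adverse effect) before weighing any evidence

    # P(observation | impact) vs P(observation | no impact) for each line of
    # evidence; the numbers are illustrative only.
    lines = {
        "sediment chemistry": (0.8, 0.3),
        "bioassay":           (0.7, 0.4),
        "benthic infauna":    (0.6, 0.2),
    }

    odds = prior_impact / (1.0 - prior_impact)
    for name, (p_impact, p_none) in lines.items():
        odds *= p_impact / p_none          # sequential Bayesian update
        p = odds / (1.0 + odds)
        print(f"after {name:18s}: P(impact) = {p:.2f}")
    ```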

  19. Mapping the absolute magnetic field and evaluating the quadratic Zeeman-effect-induced systematic error in an atom interferometer gravimeter

    NASA Astrophysics Data System (ADS)

    Hu, Qing-Qing; Freier, Christian; Leykauf, Bastian; Schkolnik, Vladimir; Yang, Jun; Krutzik, Markus; Peters, Achim

    2017-09-01

    Precisely evaluating the systematic error induced by the quadratic Zeeman effect is important for developing atom interferometer gravimeters aiming at an accuracy in the μGal regime (1 μGal = 10⁻⁸ m/s² ≈ 10⁻⁹ g). This paper reports on the experimental investigation of Raman-spectroscopy-based magnetic field measurements and the evaluation of the systematic error in the gravimetric atom interferometer (GAIN) due to the quadratic Zeeman effect. We discuss the dependence of the magnetic field measurement uncertainty on Raman duration and frequency step size, present the vector and tensor light-shift-induced magnetic field measurement offset, and map the absolute magnetic field inside the interferometer chamber of GAIN with an uncertainty of 0.72 nT and a spatial resolution of 12.8 mm. We evaluate the quadratic Zeeman-effect-induced gravity measurement error in GAIN as 2.04 μGal. The methods shown in this paper are important for precisely mapping the absolute magnetic field in vacuum and for reducing the quadratic Zeeman-effect-induced systematic error in Raman-transition-based precision measurements, such as atom interferometer gravimeters.

  20. Spatial Uncertainty Modeling of Fuzzy Information in Images for Pattern Classification

    PubMed Central

    Pham, Tuan D.

    2014-01-01

    The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction. PMID:25157744

  1. Gaussian Process Model for Antarctic Surface Mass Balance and Ice Core Site Selection

    NASA Astrophysics Data System (ADS)

    White, P. A.; Reese, S.; Christensen, W. F.; Rupper, S.

    2017-12-01

    Surface mass balance (SMB) is an important factor in the estimation of sea level change, and data are collected to estimate models for prediction of SMB on the Antarctic ice sheet. Using Favier et al.'s (2013) quality-controlled aggregate data set of SMB field measurements, a fully Bayesian spatial model is posed to estimate Antarctic SMB and propose new field measurement locations. Utilizing Nearest-Neighbor Gaussian process (NNGP) models, SMB is estimated over the Antarctic ice sheet. An Antarctic SMB map is rendered using this model and is compared with previous estimates. A prediction uncertainty map is created to identify regions of high SMB uncertainty. The model estimates net SMB to be 2173 Gton yr⁻¹ with 95% credible interval (2021, 2331) Gton yr⁻¹. On average, these results suggest lower Antarctic SMB and higher uncertainty than previously reported [Vaughan et al. (1999); Van de Berg et al. (2006); Arthern, Winebrenner and Vaughan (2006); Bromwich et al. (2004); Lenaerts et al. (2012)], even though this model utilizes significantly more observations than previous models. Using the Gaussian process' uncertainty and model parameters, we propose 15 new measurement locations for field study utilizing a maximin space-filling, error-minimizing design; these potential measurements are identified to minimize future estimation uncertainty. Using currently accepted Antarctic mass balance estimates and our SMB estimate, we estimate net mass loss [Shepherd et al. (2012); Jacob et al. (2012)]. Furthermore, we discuss modeling details for both space-time data and combining field measurement data with output from mathematical models using the NNGP framework.
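    A greedy heuristic in the spirit of a maximin space-filling, error-minimizing design can be sketched as follows; the candidate sites, predictive standard deviations, and the distance-times-uncertainty score are stand-ins rather than the paper's actual criterion:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    cand = rng.uniform(0.0, 1.0, (500, 2))     # candidate field sites (x, y)
    sigma = 0.2 + rng.random(500)              # synthetic GP predictive SDs
    existing = rng.uniform(0.0, 1.0, (40, 2))  # already-observed locations

    chosen, pts = [], existing.copy()
    for _ in range(15):
        # Distance from each candidate to its nearest selected/existing point.
        d = np.min(np.linalg.norm(cand[:, None, :] - pts[None, :, :], axis=2),
                   axis=1)
        score = d * sigma                      # space-filling x error-minimizing
        best = int(np.argmax(score))
        chosen.append(best)
        pts = np.vstack([pts, cand[best]])

    print("proposed new site indices:", chosen)
    ```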

  2. Reassessing biases and other uncertainties in sea surface temperature observations measured in situ since 1850: 2. Biases and homogenization

    NASA Astrophysics Data System (ADS)

    Kennedy, J. J.; Rayner, N. A.; Smith, R. O.; Parker, D. E.; Saunby, M.

    2011-07-01

    Changes in instrumentation and data availability have caused time-varying biases in estimates of global and regional average sea surface temperature. The sizes of the biases arising from these changes are estimated and their uncertainties evaluated. The estimated biases and their associated uncertainties are largest during the period immediately following the Second World War, reflecting the rapid and incompletely documented changes in shipping and data availability at the time. Adjustments have been applied to reduce these effects in gridded data sets of sea surface temperature, and the results are presented as a set of interchangeable realizations. Uncertainties of estimated trends in global and regional average sea surface temperature due to bias adjustments since the Second World War are found to be larger than uncertainties arising from the choice of analysis technique, indicating that this is an important source of uncertainty in analyses of historical sea surface temperatures. Despite this, trends over the twentieth century remain qualitatively consistent.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest; Hadgu, Teklu; Greenberg, Harris

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  4. The δ2H and δ18O of tap water from 349 sites in the United States and selected territories

    USGS Publications Warehouse

    Coplen, Tyler B.; Landwehr, Jurate M.; Qi, Haiping; Lorenz, Jennifer M.

    2013-01-01

    Because the stable isotopic compositions of hydrogen (δ2H) and oxygen (δ18O) of animal (including human) tissues, such as hair, nail, and urine, reflect the δ2H and δ18O of water and food ingested by an animal or a human, and because the δ2H and δ18O of environmental waters vary geographically, δ2H and δ18O values of tap water samples collected in 2007-2008 from 349 sites in the United States and three selected U.S. territories have been measured in support of forensic science applications, creating one of the largest databases of tap water δ2H and δ18O values to date. The results of replicate isotopic measurements for these tap water samples confirm that the expanded uncertainties (U = 2u_c) obtained over a period of years by the Reston Stable Isotope Laboratory from δ2H and δ18O dual-inlet mass spectrometric measurements are conservative, at ±2‰ and ±0.2‰, respectively. These uncertainties are important because U.S. Geological Survey data may be needed for forensic science applications, including providing evidence in court cases. Halfway through the investigation, an isotope-laser spectrometer was acquired, enabling comparison of dual-inlet isotope-ratio mass spectrometric results with isotope-laser spectrometric results. The uncertainty of the laser-based δ2H measurement results for these tap water samples is comparable to the uncertainty of the mass spectrometric method, with the laser-based method having a slightly lower uncertainty. However, the δ18O uncertainty of the laser-based method is more than a factor of ten higher than that of the dual-inlet isotope-ratio mass spectrometric method.

  5. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate

    PubMed Central

    Pradines, Joël R.; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-01-01

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements. PMID:27112127

  6. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate.

    PubMed

    Pradines, Joël R; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-04-26

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements.

  7. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate

    NASA Astrophysics Data System (ADS)

    Pradines, Joël R.; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-04-01

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements.

  8. Using spatial uncertainty to manipulate the size of the attention focus.

    PubMed

    Huang, Dan; Xue, Linyan; Wang, Xin; Chen, Yao

    2016-09-01

    Preferentially processing behaviorally relevant information is vital for primate survival. In visuospatial attention studies, manipulating the spatial extent of attention focus is an important question. Although many studies have claimed to successfully adjust attention field size by either varying the uncertainty about the target location (spatial uncertainty) or adjusting the size of the cue orienting the attention focus, no systematic studies have assessed and compared the effectiveness of these methods. We used a multiple cue paradigm with 2.5° and 7.5° rings centered around a target position to measure the cue size effect, while the spatial uncertainty levels were manipulated by changing the number of cueing positions. We found that spatial uncertainty had a significant impact on reaction time during target detection, while the cue size effect was less robust. We also carefully varied the spatial scope of potential target locations within a small or large region and found that this amount of variation in spatial uncertainty can also significantly influence target detection speed. Our results indicate that adjusting spatial uncertainty is more effective than varying cue size when manipulating attention field size.

  9. Calibration of hydrometers

    NASA Astrophysics Data System (ADS)

    Lorefice, Salvatore; Malengo, Andrea

    2006-10-01

    After a brief description of the different methods employed in the periodic calibration of hydrometers, used in most cases to measure the density of liquids in the range between 500 kg m⁻³ and 2000 kg m⁻³, particular emphasis is given to the multipoint procedure based on hydrostatic weighing, also known as Cuckow's method. The features of the calibration apparatus and the procedure used at the INRiM (formerly IMGC-CNR) density laboratory have been considered to assess all relevant contributions involved in the calibration of different kinds of hydrometers. The uncertainty is strongly dependent on the kind of hydrometer; in particular, the results highlight the importance of the density of the reference buoyant liquid, the temperature of calibration and the skill of the operator in reading the scale in the overall assessment of the uncertainty. It is also interesting to realize that for high-resolution hydrometers (division of 0.1 kg m⁻³), the uncertainty contribution of the density of the reference liquid is the main source of the total uncertainty, but its importance falls to about 50% for hydrometers with a division of 0.5 kg m⁻³ and becomes somewhat negligible for hydrometers with a division of 1 kg m⁻³, for which the reading uncertainty is the predominant part of the total uncertainty. At present the best INRiM result is obtained with commercially available hydrometers having a scale division of 0.1 kg m⁻³, for which the relative uncertainty is about 12 × 10⁻⁶.

  10. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
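    A toy Monte Carlo version of the process sensitivity index: enumerate the competing process models with equal prior weights, and report the variance of the conditional means over each process's model choice as a fraction of total output variance. The two recharge and two conductivity models below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 20000

    def simulate(recharge_model, geology_model):
        """Toy output of a groundwater model for one choice of process models."""
        if recharge_model == 0:                      # linear precip -> recharge
            rech = 0.2 * rng.uniform(300, 600, n)
        else:                                        # threshold-type alternative
            p = rng.uniform(300, 600, n)
            rech = 0.35 * np.maximum(p - 350.0, 0.0)
        sd = 0.5 if geology_model == 0 else 1.0      # two conductivity models
        logK = rng.normal(-4.0, sd, n)
        return rech * 10.0 ** (logK + 4.0)           # toy transport metric

    # Equal prior model weights; enumerate the 2 x 2 model combinations.
    y = {(r, g): simulate(r, g) for r in (0, 1) for g in (0, 1)}
    total_var = np.var(np.concatenate(list(y.values())))

    def process_index(axis):
        """Fraction of output variance explained by one process's model choice."""
        means = [np.mean(np.concatenate([v for k, v in y.items() if k[axis] == m]))
                 for m in (0, 1)]
        return np.var(means) / total_var

    print(f"PS(recharge) = {process_index(0):.2f}")
    print(f"PS(geology)  = {process_index(1):.2f}")
    ```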

  12. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  13. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurement uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.

  14. Progress toward a new beam measurement of the neutron lifetime

    NASA Astrophysics Data System (ADS)

    Hoogerheide, Shannon Fogwell; BL2 Collaboration

    2017-01-01

    Neutron beta decay is the simplest example of nuclear beta decay. A precise value of the neutron lifetime is important for consistency tests of the Standard Model and Big Bang Nucleosynthesis models. The beam neutron lifetime method requires the absolute counting of the decay protons in a neutron beam of precisely known flux. Recent work has resulted in improvements in both the neutron and proton detection systems that should permit a significant reduction in systematic uncertainties. A new measurement of the neutron lifetime using the beam method is underway at the National Institute of Standards and Technology Center for Neutron Research. The projected uncertainty of this new measurement is 1 s. An overview of the measurement, its current status, and the technical improvements will be discussed.

  15. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    NASA Astrophysics Data System (ADS)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three-dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in newly forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience: the majority of the larger public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time-(in)dependent, etc.), each with advantages and disadvantages for a specific audience. A useful method of communicating the uncertainty of flood forecasts is probabilistic flood mapping. These maps give a representation of the probability of flooding of a certain area, based on the uncertainty assessment of the flood forecasts. By using this type of map, water managers can focus their attention on the areas with the highest flood probability. The larger public can also consult these maps for information on the probability of flooding at their specific location, so that they can take pro-active measures to reduce personal damage. The method of quantifying the uncertainty was implemented in the operational flood forecasting system for the navigable rivers in the Flanders region of Belgium, and has shown clear benefits during the floods of the last two years.
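    A sketch of the error-matrix construction, assuming synthetic forecast residuals and a nearest-class lookup in place of the 3D interpolation described above:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 5000
    level = rng.uniform(0.5, 6.0, n)                   # forecasted level (m)
    lead = rng.integers(1, 49, n)                      # lead time (h)
    # Synthetic historical residuals that grow with level and lead time.
    resid = rng.normal(0.0, 0.05 * level + 0.002 * lead)

    level_bins = np.linspace(0.5, 6.0, 7)              # classes of water level
    lead_bins = np.array([1, 6, 12, 24, 48])           # classes of lead time
    pcts = [5, 25, 50, 75, 95]

    # "Three-dimensional error" matrix: level class x lead class x percentile.
    E = np.full((len(level_bins) - 1, len(lead_bins) - 1, len(pcts)), np.nan)
    for i in range(len(level_bins) - 1):
        for j in range(len(lead_bins) - 1):
            m = ((level >= level_bins[i]) & (level < level_bins[i + 1]) &
                 (lead >= lead_bins[j]) & (lead < lead_bins[j + 1]))
            if m.any():
                E[i, j] = np.percentile(resid[m], pcts)

    # Uncertainty band for a new forecast (nearest-class lookup).
    i = np.searchsorted(level_bins, 3.2) - 1
    j = np.searchsorted(lead_bins, 18) - 1
    print(f"90% residual band at 3.2 m, +18 h: "
          f"[{E[i, j, 0]:.2f}, {E[i, j, -1]:.2f}] m")
    ```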

  16. Uncertainty Analysis of Two Types of Humidity Sensors by a Humidity Generator with a Divided-Flow System.

    PubMed

    Chen, Ling-Hsi; Chen, Chiachung

    2018-02-21

    Humidity measurement is an important technique for the agricultural, food, pharmaceutical, and chemical industries. For the sake of convenience, electrical relative humidity (RH) sensors have been widely used. These sensors need to be calibrated to ensure their accuracy, and the uncertainty of their measurements has become a major concern. In this study, a self-made divided-flow generator was established to calibrate two types of electrical humidity sensors. The standard reference humidity was calculated from the dew-point temperature and the air dry-bulb temperature measured by a chilled-mirror monitor. This divided-flow generator produced consistent RH measurement results. The uncertainty of the reference standard increased with increasing RH values. The combined uncertainties with adequate calibration equations ranged from 0.82% to 1.45% RH for resistive humidity sensors and from 0.63% to 1.4% RH for capacitive humidity sensors, respectively. This self-made divided-flow generator and calibration method are cheap, time-saving, and easy to use. Thus, the proposed approach can easily be applied in research laboratories.

  17. Uncertainty Analysis of Two Types of Humidity Sensors by a Humidity Generator with a Divided-Flow System

    PubMed Central

    Chen, Ling-Hsi

    2018-01-01

    Humidity measurement is an important technique for the agricultural, food, pharmaceutical, and chemical industries. For the sake of convenience, electrical relative humidity (RH) sensors have been widely used. These sensors need to be calibrated to ensure their accuracy, and the uncertainty of their measurements has become a major concern. In this study, a self-made divided-flow generator was established to calibrate two types of electrical humidity sensors. The standard reference humidity was calculated from the dew-point temperature and the air dry-bulb temperature measured by a chilled-mirror monitor. This divided-flow generator produced consistent RH measurement results. The uncertainty of the reference standard increased with increasing RH values. The combined uncertainties with adequate calibration equations ranged from 0.82% to 1.45% RH for resistive humidity sensors and from 0.63% to 1.4% RH for capacitive humidity sensors, respectively. This self-made divided-flow generator and calibration method are cheap, time-saving, and easy to use. Thus, the proposed approach can easily be applied in research laboratories. PMID:29466313

  18. Defining the measurand in radius of curvature measurements

    NASA Astrophysics Data System (ADS)

    Davies, Angela; Schmitz, Tony L.

    2003-11-01

    Traceable radius of curvature measurements are critical for precision optics manufacture. An optical bench measurement of radius is very repeatable and is the preferred method for low-uncertainty applications. On an optical bench, the displacement of the optic is measured as it is moved between the cat's eye and confocal positions, each identified using a figure measuring interferometer. Traceability requires connection to a basic unit (the meter, here) in addition to a defensible uncertainty analysis, and the identification and proper propagation of all uncertainty sources in this measurement is challenging. Recent work has focused on identifying all uncertainty contributions; measurement biases have been approximately taken into account and uncertainties combined in an RSS sense for a final measurement estimate and uncertainty. In this paper we report on a new mathematical definition of the radius measurand, which is a single function that depends on all uncertainty sources, such as error motions, alignment uncertainty, displacement gauge uncertainty, etc. The method is based on a homogeneous transformation matrix (HTM) formalism, and intrinsically defines an unbiased estimate for radius, providing a single mathematical expression for uncertainty propagation through a Taylor-series expansion.
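    The flavor of the HTM formalism can be shown with a generic sketch: small-angle error motions and misalignments are written as 4 x 4 homogeneous transforms, composed, and applied to the nominal geometry to see how they bias the apparent radius. The perturbation values below are arbitrary, and the paper's full measurand function is far more detailed:

    ```python
    import numpy as np

    def htm(rx=0.0, ry=0.0, tx=0.0, ty=0.0, tz=0.0):
        """Homogeneous transform for small rotations (rad) and translations."""
        T = np.eye(4)
        T[:3, :3] = [[1.0, 0.0, ry],
                     [0.0, 1.0, -rx],
                     [-ry, rx, 1.0]]        # small-angle rotation approximation
        T[:3, 3] = [tx, ty, tz]
        return T

    R = 100.0   # mm, nominal radius = cat's-eye -> confocal displacement

    # Assumed perturbations: stage error motions plus a lateral misalignment.
    stage = htm(rx=20e-6, ry=-10e-6, tx=2e-4, ty=-1e-4, tz=R + 5e-4)
    misalign = htm(tx=0.05, ty=-0.02)

    vertex = np.array([0.0, 0.0, 0.0, 1.0])      # nominal optic vertex
    measured = (stage @ misalign @ vertex)[:3]
    print(f"apparent radius: {measured[2]:.6f} mm "
          f"(bias {measured[2] - R:+.2e} mm)")
    ```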

  19. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property (parametric) uncertainty) and the inter-annual climate variability due to year to year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant predictive uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil property uncertainty to another source of permafrost uncertainty, structural climate model uncertainty. We show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  20. Absolute Single Photoionization Cross Sections of Se^3+ For the Determination of Elemental Abundances in Planetary Nebulae

    NASA Astrophysics Data System (ADS)

    Esteves, David; Sterling, Nicholas; Aguilar, Alex; Kilcoyne, A. L. David; Phaneuf, Ronald; Bilodeau, Rene; Red, Eddie; McLaughlin, Brendan; Norrington, Patrick; Balance, Connor

    2009-05-01

    Numerical simulations show that derived elemental abundances in astrophysical nebulae can be uncertain by factors of two or more due to atomic data uncertainties alone, and of these uncertainties, absolute photoionization cross sections are the most important. Absolute single photoionization cross sections for Se^3+ ions have been measured from 42 eV to 56 eV at the ALS using the merged beams photo-ion technique. Theoretical photoionization cross section calculations were also performed for these ions using the state-of-the-art fully relativistic Dirac R-matrix code (DARC). The calculations show encouraging agreement with the experimental measurements.

  1. Is my bottom-up uncertainty estimation on metal measurement adequate?

    NASA Astrophysics Data System (ADS)

    Marques, J. R.; Faustino, M. G.; Monteiro, L. R.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Is the estimated uncertainty under GUM recommendation associated with metal measurement adequately estimated? How can one evaluate whether the measurement uncertainty really covers all uncertainty associated with the analytical procedure? Considering that many laboratories frequently underestimate or, less frequently, overestimate uncertainties on their results, this paper presents the evaluation of estimated uncertainties on two ICP-OES procedures of seven metal measurements according to the GUM approach. The Horwitz function and proficiency-test scaled standard uncertainties were used in this evaluation. Our data show that the expanded uncertainties of most elements were underestimated by a factor of two to four. Possible causes and corrections are discussed herein.
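
    A compact way to run the kind of sanity check described above is to compare a laboratory's relative standard uncertainty against the reproducibility predicted by the Horwitz function, RSD(%) = 2^(1 - 0.5·log10 c), with c the mass fraction. The concentration and the laboratory uncertainty below are assumed values for illustration:

      import numpy as np

      def horwitz_rsd_percent(c):
          # Horwitz function: predicted reproducibility RSD (%) for a
          # dimensionless mass-fraction concentration c (1 mg/kg = 1e-6).
          return 2.0 ** (1.0 - 0.5 * np.log10(c))

      c = 5e-7       # assumed: metal at ~0.5 mg/kg
      u_lab = 4.0    # assumed: lab relative standard uncertainty, %
      ratio = u_lab / horwitz_rsd_percent(c)
      print(f"Horwitz RSD = {horwitz_rsd_percent(c):.1f}%, ratio = {ratio:.2f}")
      # A ratio far below ~0.5 hints that the bottom-up budget may be missing
      # contributions, i.e. the stated uncertainty is likely underestimated.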

  2. Evaluation of linear regression techniques for atmospheric applications: the importance of appropriate weighting

    NASA Astrophysics Data System (ADS)

    Wu, Cheng; Zhen Yu, Jian

    2018-03-01

    Linear regression techniques are widely used in atmospheric science, but they are often improperly applied due to lack of consideration or inappropriate handling of measurement uncertainty. In this work, numerical experiments are performed to evaluate the performance of five linear regression techniques, significantly extending previous works by Chu and Saylor. The five techniques are ordinary least squares (OLS), Deming regression (DR), orthogonal distance regression (ODR), weighted ODR (WODR), and York regression (YR). We first introduce a new data generation scheme that employs the Mersenne twister (MT) pseudorandom number generator. The numerical simulations are also improved by (a) refining the parameterization of nonlinear measurement uncertainties, (b) inclusion of a linear measurement uncertainty, and (c) inclusion of WODR for comparison. Results show that DR, WODR and YR produce an accurate slope, but the intercept by WODR and YR is overestimated and the degree of bias is more pronounced with a low-R2 XY dataset. The importance of a proper weighting parameter λ in DR is investigated by sensitivity tests, and it is found that an improper λ in DR can lead to a bias in both the slope and intercept estimation. Because the λ calculation depends on the actual form of the measurement error, it is essential to determine the exact form of measurement error in the XY data during the measurement stage. If the a priori error in one of the variables is unknown, or the measurement error described cannot be trusted, DR, WODR and YR can provide the least biases in slope and intercept among all tested regression techniques. For these reasons, DR, WODR and YR are recommended for atmospheric studies when both X and Y data have measurement errors. An Igor Pro-based program (Scatter Plot) was developed to facilitate the implementation of error-in-variables regressions.
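
    For readers wanting to reproduce an error-in-variables fit of the kind compared above, SciPy ships an orthogonal distance regression package. The sketch below fits a straight line with assumed per-axis uncertainties on synthetic data; the noise levels and seed are arbitrary:

      import numpy as np
      from scipy import odr

      rng = np.random.default_rng(0)
      x_true = np.linspace(1.0, 10.0, 50)
      sx, sy = 0.3, 0.5                        # assumed X and Y standard errors
      x = x_true + rng.normal(0.0, sx, x_true.size)
      y = 2.0 * x_true + 1.0 + rng.normal(0.0, sy, x_true.size)

      # Weighted ODR: both axes carry measurement error, weighted by sx and sy.
      model = odr.Model(lambda beta, x: beta[0] * x + beta[1])
      data = odr.RealData(x, y, sx=np.full(x.size, sx), sy=np.full(y.size, sy))
      out = odr.ODR(data, model, beta0=[1.0, 0.0]).run()
      print("slope, intercept:", out.beta, "+/-", out.sd_beta)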

  3. Estimating discharge measurement uncertainty using the interpolated variance estimator

    USGS Publications Warehouse

    Cohn, T.; Kiang, J.; Mason, R.

    2012-01-01

    Methods for quantifying the uncertainty in discharge measurements typically identify various sources of uncertainty and then estimate the uncertainty from each of these sources by applying the results of empirical or laboratory studies. If actual measurement conditions are not consistent with those encountered in the empirical or laboratory studies, these methods may give poor estimates of discharge uncertainty. This paper presents an alternative method for estimating discharge measurement uncertainty that uses statistical techniques and at-site observations. This Interpolated Variance Estimator (IVE) estimates uncertainty based on the data collected during the streamflow measurement and therefore reflects the conditions encountered at the site. The IVE has the additional advantage of capturing all sources of random uncertainty in the velocity and depth measurements. It can be applied to velocity-area discharge measurements that use a velocity meter to measure point velocities at multiple vertical sections in a channel cross section.
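
    The abstract does not reproduce the IVE derivation, but the underlying idea of estimating random uncertainty from the at-site data themselves can be illustrated with a deliberately simplified stand-in: compare each interior vertical's unit discharge with the value interpolated from its neighbours and treat the residual scatter as measurement noise. This is a hypothetical sketch, not the published estimator:

      import numpy as np

      def interpolation_residual_uncertainty(q):
          # q: unit discharge at each vertical of one gauging (illustrative).
          q = np.asarray(q, float)
          interp = 0.5 * (q[:-2] + q[2:])      # neighbour interpolation
          resid = q[1:-1] - interp
          # For iid noise, Var(q_i - (q_{i-1}+q_{i+1})/2) = 1.5 * sigma^2.
          # (Assumes the true profile is locally linear; curvature inflates sigma.)
          sigma = np.sqrt(np.mean(resid ** 2) / 1.5)
          total = q.sum()
          return total, sigma * np.sqrt(q.size) / total

      q = [0.8, 1.1, 1.5, 1.9, 2.2, 2.0, 1.6, 1.0, 0.6]
      Q, rel_u = interpolation_residual_uncertainty(q)
      print(f"Q = {Q:.2f}, relative standard uncertainty ~ {rel_u:.1%}")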

  4. Host Model Uncertainty in Aerosol Radiative Effects: the AeroCom Prescribed Experiment and Beyond

    NASA Astrophysics Data System (ADS)

    Stier, Philip; Schutgens, Nick; Bian, Huisheng; Boucher, Olivier; Chin, Mian; Ghan, Steven; Huneeus, Nicolas; Kinne, Stefan; Lin, Guangxing; Myhre, Gunnar; Penner, Joyce; Randles, Cynthia; Samset, Bjorn; Schulz, Michael; Yu, Hongbin; Zhou, Cheng; Bellouin, Nicolas; Ma, Xiaoyan; Yu, Fangqun; Takemura, Toshihiko

    2013-04-01

    Anthropogenic and natural aerosol radiative effects are recognized to affect global and regional climate. Multi-model "diversity" in estimates of the aerosol radiative effect is often perceived as a measure of the uncertainty in modelling aerosol itself. However, current aerosol models vary considerably in model components relevant for the calculation of aerosol radiative forcings and feedbacks and the associated "host-model uncertainties" are generally convoluted with the actual uncertainty in aerosol modelling. In the AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties on aerosol forcing experiments through prescription of identical aerosol radiative properties in eleven participating models. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that require further attention. However, uncertainties in aerosol radiative effects also include short-term and long-term feedback processes that will be systematically explored in future intercomparison studies. Here we will present an overview of the proposals for discussion and results from early scoping studies.

  5. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  6. Motion compensation using origin ensembles in awake small animal positron emission tomography

    NASA Astrophysics Data System (ADS)

    Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.

    2017-02-01

    In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density, allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake animal neurological and behavioral studies, for which head motion, unique for each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in pose measurements used for compensation can degrade the data and hence quantitative outcomes. In this investigation motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum-likelihood expectation-maximisation and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest, providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating statistical uncertainty of voxel and regional estimates. While in this investigation rigid motion was considered in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturbs the measured data.

  7. Recent developments in measurement and evaluation of FAC damage in power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garud, Y.S.; Besuner, P.; Cohn, M.J.

    1999-11-01

    This paper describes some recent developments in the measurement and evaluation of flow-accelerated corrosion (FAC) damage in power plants. The evaluation focuses on data checking and smoothing to account for gross errors, noise, and uncertainty in the wall thickness measurements from ultrasonic or pulsed eddy-current data. Also, the evaluation method utilizes advanced regression analysis for spatial and temporal evolution of the wall loss, providing statistically robust predictions of wear rates and associated uncertainty. Results of the application of these new tools are presented for several components in actual service. More importantly, the practical implications of using these advances are discussed in relation to the likely impact on the scope and effectiveness of FAC-related inspection programs.

  8. Nuclear event zero-time calculation and uncertainty evaluation.

    PubMed

    Pan, Pujing; Ungar, R Kurt

    2012-04-01

    It is important to know the initial time, or zero-time, of a nuclear event such as a nuclear weapon's test, a nuclear power plant accident or a nuclear terrorist attack (e.g. with an improvised nuclear device, IND). Together with relevant meteorological information, the calculated zero-time is used to help locate the origin of a nuclear event. The zero-time of a nuclear event can be derived from measured activity ratios of two nuclides. The calculated zero-time of a nuclear event would not be complete without an appropriately evaluated uncertainty term. In this paper, analytical equations for zero-time and the associated uncertainty calculations are derived using a measured activity ratio of two nuclides. Application of the derived equations is illustrated in a realistic example using data from the last Chinese thermonuclear test in 1980. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
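
    The zero-time relation in the abstract follows from two exponential decays: R(t) = R(0)·exp(-(λ1-λ2)t), so t = ln(R(0)/R)/(λ1-λ2) and, to first order with R(0) and the decay constants taken as exact, u(t) = u(R)/(R·|λ1-λ2|). A short numerical sketch, with an illustrative nuclide pair and made-up ratio values:

      import numpy as np

      def zero_time(R_meas, u_R, R0, lam1, lam2):
          # Elapsed time since the event and its standard uncertainty from a
          # measured two-nuclide activity ratio R(t) = R0*exp(-(lam1-lam2)*t).
          dl = lam1 - lam2
          t = np.log(R0 / R_meas) / dl
          u_t = u_R / (abs(dl) * R_meas)
          return t, u_t

      lam1 = np.log(2) / 8.02               # 1/day, half-life ~8 d (131I-like)
      lam2 = np.log(2) / (30.17 * 365.25)   # 1/day, half-life ~30 y (137Cs-like)
      t, u_t = zero_time(R_meas=0.5, u_R=0.02, R0=15.0, lam1=lam1, lam2=lam2)
      print(f"t = {t:.1f} d since zero-time, u(t) = {u_t:.2f} d")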

  9. Mathematical modeling of a survey-meter used to measure radioactivity in human thyroids: Monte Carlo calculations of the device response and uncertainties

    PubMed Central

    Khrutchinsky, Arkady; Drozdovitch, Vladimir; Kutsen, Semion; Minenko, Victor; Khrouch, Valeri; Luckyanov, Nickolas; Voillequé, Paul; Bouville, André

    2012-01-01

    This paper presents results of Monte Carlo modeling of the SRP-68-01 survey meter used to measure exposure rates near the thyroid glands of persons exposed to radioactivity following the Chernobyl accident. This device was not designed to measure radioactivity in humans. To estimate the uncertainty associated with the measurement results, a mathematical model of the SRP-68-01 survey meter was developed and verified. A Monte Carlo method of numerical simulation of radiation transport has been used to calculate the calibration factor for the device and evaluate its uncertainty. The SRP-68-01 survey meter scale coefficient, an important characteristic of the device, was also estimated in this study. The calibration factors of the survey meter were calculated for 131I, 132I, 133I, and 135I content in the thyroid gland for six age groups of the population: newborns; children aged 1 yr, 5 yr, 10 yr, 15 yr; and adults. A realistic scenario of direct thyroid measurements with an “extended” neck was used to calculate the calibration factors for newborns and one-year-olds. Uncertainties in the device calibration factors due to variability of the device scale coefficient, variability in thyroid mass, and the statistical uncertainty of the Monte Carlo method were evaluated. Relative uncertainties in the calibration factor estimates were found to be from 0.06 for children aged 1 yr to 0.1 for 10-yr and 15-yr children. Positioning errors of the detector during measurements were found to shift the calibration factors mainly in one direction from the estimated values. Deviations of the device position from the proper geometry of measurements were found to lead to overestimation of the calibration factor by up to 24 percent for adults and up to 60 percent for 1-yr children. The results of this study improve the estimates of 131I thyroidal content and, consequently, thyroid dose estimates that are derived from direct thyroid measurements performed in Belarus shortly after the Chernobyl accident. PMID:22245289

  10. A Novel Uncertainty Framework for Improving Discharge Data Quality Using Hydraulic Modelling.

    NASA Astrophysics Data System (ADS)

    Mansanarez, V.; Westerberg, I.; Lyon, S. W.; Lam, N.

    2017-12-01

    Flood risk assessments rely on accurate discharge data records. Establishing a reliable stage-discharge (SD) rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived based on as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. We introduce an uncertainty framework using hydraulic modelling for developing SD rating curves and estimating their uncertainties. The proposed framework incorporates information from both the hydraulic configuration (bed slope, roughness, vegetation) and the information available in the stage-discharge observation data (gaugings). This method provides a direct estimation of the hydraulic configuration (slope, bed roughness and vegetation roughness). Discharge time series are estimated by propagating stage records through posterior rating curve results. We applied this novel method to two Swedish hydrometric stations, accounting for uncertainties in the gaugings for the hydraulic model. Results from these applications were compared to discharge measurements and official discharge estimations. A sensitivity analysis was performed. We focused analyses on high-flow uncertainty and the factors that could reduce this uncertainty. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken.

  11. HIT or miss: the application of health care information technology to managing uncertainty in clinical decision making.

    PubMed

    Kazandjian, Vahé A; Lipitz-Snyderman, Allison

    2011-12-01

    To discuss the usefulness of health care information technology (HIT) in assisting care providers to minimize uncertainty while simultaneously increasing the efficiency of the care provided. An ongoing study of HIT, performance measurement (clinical and production efficiency), and their implications for the payment for care represents the design of this study. Since 2006, all Maryland hospitals have embarked on a multi-faceted study of performance measures and HIT adoption surveys, which will shape the health care payment model in Maryland, the last of the all-payor states, in 2011. This paper focuses on the HIT component of the Maryland care payment initiative. While the payment model is still under review and discussion, 'appropriateness' of care has been discussed as an important dimension of measurement. Within this dimension, the 'uncertainty' concept has been identified as associated with variation in care practices. Hence, the methods of this paper define how HIT can assist care providers in addressing the concept of uncertainty, and then provide findings from the first HIT survey in Maryland to infer the readiness of Maryland hospitals to address uncertainty of care in part through the use of HIT. Maryland hospitals show noteworthy variation in their adoption and use of HIT. While computerized, electronic patient records are not commonly used among and across Maryland hospitals, many of the uses of HIT internally in each hospital could significantly assist in better communication about better practices to minimize uncertainty of care and enhance the efficiency of its production. © 2010 Blackwell Publishing Ltd.

  12. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km2) in Belgium. We use a surface-subsurface integrated model implemented using the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observed data. Sensitivity and uncertainty analyses were performed on predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  13. Uncertainty in Climate Change Research: An Integrated Approach

    NASA Astrophysics Data System (ADS)

    Mearns, L.

    2017-12-01

    Uncertainty has been a major theme in climate change research from virtually the very beginning, and appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was the future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But, of course, we know there are important uncertainties in all aspects of climate change research, not just the climate system and emissions. And as climate change research has become more important and of pragmatic concern as possible solutions to the climate change problem are addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention, through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great or greater than that in the climate system. Still, other areas of uncertainty remain underexplored and/or undervalued. These include uncertainty in vulnerability and governance. Without more thoroughly exploring these last uncertainties, we will likely underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. As part of this, I will discuss how the uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy, but rather false certainty is the true danger.

  14. Fission cross section uncertainties with the NIFFTE TPC

    NASA Astrophysics Data System (ADS)

    Sangiorgio, Samuele; Niffte Collaboration

    2014-09-01

    Nuclear data such as neutron-induced fission cross sections play a fundamental role in nuclear energy and defense applications. In recent years, understanding of these systems has become increasingly dependent upon advanced simulation and modeling, where uncertainties in nuclear data propagate into the expected performance of existing and future systems. It is important, therefore, that uncertainties in nuclear data are minimized and fully understood. For this reason, the Neutron Induced Fission Fragment Tracking Experiment (NIFFTE) uses a Time Projection Chamber (TPC) to measure energy-differential (n,f) cross sections with unprecedented precision. The presentation will discuss how the capabilities of the NIFFTE TPC allow direct measurement of systematic uncertainties in fission cross sections, particularly regarding fission-fragment identification and target and beam uniformity. Preliminary results from recent analysis of 238U/235U and 239Pu/235U data collected with the TPC will be presented. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  15. Data Assimilation and Propagation of Uncertainty in Multiscale Cardiovascular Simulation

    NASA Astrophysics Data System (ADS)

    Schiavazzi, Daniele; Marsden, Alison

    2015-11-01

    Cardiovascular modeling is the application of computational tools to predict hemodynamics. State-of-the-art techniques couple a 3D incompressible Navier-Stokes solver with a boundary circulation model and can predict local and peripheral hemodynamics, analyze the post-operative performance of surgical designs, and complement clinical data collection while minimizing invasive and risky measurement practices. The ability of these tools to make useful predictions is directly related to their accuracy in representing measured physiologies. Tuning of model parameters is therefore a topic of paramount importance and should include clinical data uncertainty, revealing how this uncertainty will affect the predictions. We propose a fully Bayesian, multi-level approach to data assimilation of uncertain clinical data in multiscale circulation models. To reduce the computational cost, we use a stable, condensed approximation of the 3D model built by linear sparse regression of the pressure/flow rate relationship at the outlets. Finally, we consider the problem of non-invasively propagating the uncertainty in model parameters to the resulting hemodynamics and compare Monte Carlo simulation with Stochastic Collocation approaches based on Polynomial or Multi-resolution Chaos expansions.
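
    As an illustrative sketch of the Monte Carlo propagation mentioned above, one can push parameter uncertainty through a simple 0D boundary model; the two-element Windkessel below is a common stand-in for the condensed outlet models used in multiscale hemodynamics, and all parameter values and uncertainties are assumptions:

      import numpy as np

      def windkessel_systolic(R, C, heart_rate=75.0, t_end=10.0, dt=1e-3):
          # Two-element Windkessel: C*dP/dt = Q(t) - P/R, integrated with
          # explicit Euler; returns the peak pressure of the last cardiac cycle.
          t = np.arange(0.0, t_end, dt)
          period = 60.0 / heart_rate
          phase = (t % period) / period
          # Pulsatile inflow: half-sine systole, zero diastole [mL/s]
          Q = np.where(phase < 0.35, 420.0 * np.sin(np.pi * phase / 0.35), 0.0)
          P = np.empty_like(t)
          P[0] = 80.0
          for i in range(1, t.size):
              P[i] = P[i - 1] + dt * (Q[i - 1] - P[i - 1] / R) / C
          return P[t > t_end - period].max()

      rng = np.random.default_rng(1)
      n = 200
      Rs = rng.normal(1.0, 0.1, n)    # resistance [mmHg*s/mL], assumed +/-10%
      Cs = rng.normal(1.5, 0.15, n)   # compliance [mL/mmHg], assumed +/-10%
      sys_p = np.array([windkessel_systolic(R, C) for R, C in zip(Rs, Cs)])
      print(f"systolic pressure: {sys_p.mean():.0f} +/- {sys_p.std():.0f} mmHg")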

  16. Transportable Optical Lattice Clock with 7×10^{-17} Uncertainty.

    PubMed

    Koller, S B; Grotti, J; Vogt, St; Al-Masoudi, A; Dörscher, S; Häfner, S; Sterr, U; Lisdat, Ch

    2017-02-17

    We present a transportable optical clock (TOC) with ^{87}Sr. Its complete characterization against a stationary lattice clock resulted in a systematic uncertainty of 7.4×10^{-17}, which is currently limited by the statistics of the determination of the residual lattice light shift, and an instability of 1.3×10^{-15}/sqrt[τ] with an averaging time τ in seconds. Measurements confirm that the systematic uncertainty can be reduced to below the design goal of 1×10^{-17}. To our knowledge, these are the best uncertainties and instabilities reported for any transportable clock to date. For autonomous operation, the TOC has been installed in an air-conditioned car trailer. It is suitable for chronometric leveling with submeter resolution as well as for intercontinental cross-linking of optical clocks, which is essential for a redefinition of the International System of Units (SI) second. In addition, the TOC will be used for high precision experiments for fundamental science that are commonly tied to precise frequency measurements and its development is an important step to space-borne optical clocks.
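
    A quick worked consequence of the stated instability: for white frequency noise, σ_y(τ) = 1.3×10^{-15}/√τ, so the averaging time needed to resolve a fractional frequency level u is τ = (1.3×10^{-15}/u)^2:

      # Averaging time implied by the instability sigma_y(tau) = a/sqrt(tau)
      a = 1.3e-15
      for target in (1e-16, 7.4e-17, 1e-17):
          tau = (a / target) ** 2
          print(f"target {target:.1e}: tau = {tau:.0f} s ({tau / 3600:.1f} h)")
      # -> ~169 s, ~309 s, and ~16900 s (4.7 h), respectively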

  17. Transportable Optical Lattice Clock with 7 ×10-17 Uncertainty

    NASA Astrophysics Data System (ADS)

    Koller, S. B.; Grotti, J.; Vogt, St.; Al-Masoudi, A.; Dörscher, S.; Häfner, S.; Sterr, U.; Lisdat, Ch.

    2017-02-01

    We present a transportable optical clock (TOC) with ^{87}Sr. Its complete characterization against a stationary lattice clock resulted in a systematic uncertainty of 7.4×10^{-17}, which is currently limited by the statistics of the determination of the residual lattice light shift, and an instability of 1.3×10^{-15}/√τ with an averaging time τ in seconds. Measurements confirm that the systematic uncertainty can be reduced to below the design goal of 1×10^{-17}. To our knowledge, these are the best uncertainties and instabilities reported for any transportable clock to date. For autonomous operation, the TOC has been installed in an air-conditioned car trailer. It is suitable for chronometric leveling with submeter resolution as well as for intercontinental cross-linking of optical clocks, which is essential for a redefinition of the International System of Units (SI) second. In addition, the TOC will be used for high precision experiments for fundamental science that are commonly tied to precise frequency measurements and its development is an important step to space-borne optical clocks.

  18. Elastic and inelastic neutron scattering cross sections for fission reactor applications

    NASA Astrophysics Data System (ADS)

    Hicks, S. F.; Chakraborty, A.; Combs, B.; Crider, B. P.; Downes, L.; Girgis, J.; Kersting, L. J.; Kumar, A.; Lueck, C. J.; McDonough, P. J.; McEllistrem, M. T.; Peters, E. E.; Prados-Estevz, F. M.; Schniederjan, J.; Sidwell, L.; Sigillito, A. J.; Vanhoy, J. R.; Watts, D.; Yates, S. W.

    2013-04-01

    Nuclear data important for the design and development of the next generation of light-water reactors and future fast reactors include neutron elastic and inelastic scattering cross sections on important structural materials, such as Fe, and on coolant materials, such as Na. These reaction probabilities are needed since neutron reactions impact fuel performance during irradiations and the overall efficiency of reactors. While neutron scattering cross sections for these materials are available at certain incident neutron energies, the fast-neutron region, particularly above 2 MeV, has large gaps for which no measurements exist or the existing uncertainties are large. Measurements have been performed at the University of Kentucky Accelerator Laboratory to determine neutron scattering cross sections on both Fe and Na in the region where these gaps occur and to reduce the uncertainties on scattering from the ground state and first excited state of these nuclei. Results from measurements on Fe at incident neutron energies between 2 and 4 MeV will be presented and comparisons will be made to model calculations available from data evaluators.

  19. Monkeys and humans take local uncertainty into account when localizing a change.

    PubMed

    Devkar, Deepna; Wright, Anthony A; Ma, Wei Ji

    2017-09-01

    Since sensory measurements are noisy, an observer is rarely certain about the identity of a stimulus. In visual perception tasks, observers generally take their uncertainty about a stimulus into account when doing so helps task performance. Whether the same holds in visual working memory tasks is largely unknown. Ten human and two monkey subjects localized a single change in orientation between a sample display containing three ellipses and a test display containing two ellipses. To manipulate uncertainty, we varied the reliability of orientation information by making each ellipse more or less elongated (two levels); reliability was independent across the stimuli. In both species, a variable-precision encoding model equipped with an "uncertainty-indifferent" decision rule, which uses only the noisy memories, fitted the data poorly. In both species, a much better fit was provided by a model in which the observer also takes the levels of reliability-driven uncertainty associated with the memories into account. In particular, a measured change in a low-reliability stimulus was given lower weight than the same change in a high-reliability stimulus. We did not find strong evidence that observers took reliability-independent variations in uncertainty into account. Our results illustrate the importance of studying the decision stage in comparison tasks and provide further evidence for evolutionary continuity of working memory systems between monkeys and humans.

  20. National Institute of Standards and Technology measurement service of the optical properties of biomedical phantoms: Current status.

    PubMed

    Lemaillet, Paul; Cooksey, Catherine C; Levine, Zachary H; Pintar, Adam L; Hwang, Jeeseong; Allen, David W

    2016-03-24

    The National Institute of Standards and Technology (NIST) has maintained scales for reflectance and transmittance over several decades. The scales are primarily intended for regular transmittance, mirrors, and solid surface scattering diffusers. The rapidly growing area of optical medical imaging needs a scale for volume scattering of diffuse materials that are used to mimic the optical properties of tissue. Such materials are used as phantoms to evaluate and validate instruments under development intended for clinical use. To address this need, a double-integrating sphere based instrument has been installed to measure the optical properties of tissue-mimicking phantoms. The basic system and methods have been described in previous papers. An important attribute in establishing a viable calibration service is the estimation of measurement uncertainties. The use of custom models and comparisons with other established scales enabled estimation of the measurement uncertainties. Here, we describe the continuation of those efforts to advance the understanding of the uncertainties through two independent measurements: the bidirectional reflectance distribution function and the bidirectional transmittance distribution function of a commercially available solid biomedical phantom. A Monte Carlo-based model is used and the resulting optical properties are compared to the values provided by the phantom manufacturer.

  1. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of stormwater on natural waters and the environment. This study mainly addresses the long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from the measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of total suspended solids (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
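
    For reference, the trend test used above has a compact form. A minimal implementation of the Mann-Kendall test (without tie correction), applied here to a synthetic event series with an assumed weak upward trend:

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x, alpha=0.05):
          # Mann-Kendall trend test: S sums the signs of all pairwise
          # differences; Z is the normal approximation (no tie correction).
          x = np.asarray(x, float)
          n = x.size
          s = sum(np.sign(x[j] - x[i])
                  for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          if s > 0:
              z = (s - 1) / np.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / np.sqrt(var_s)
          else:
              z = 0.0
          p = 2 * (1 - norm.cdf(abs(z)))
          return z, p, p < alpha

      rng = np.random.default_rng(2)
      series = np.linspace(0, 1, 120) + rng.normal(0, 0.8, 120)
      z, p, significant = mann_kendall(series)
      print(f"Z = {z:.2f}, p = {p:.3f}, trend detected: {significant}")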

  2. Optimized Clustering Estimators for BAO Measurements Accounting for Significant Redshift Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Ashley J.; Banik, Nilanjan; Avila, Santiago

    2017-05-15

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the BAO information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line-of-sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty σ_z ≥ 0.02(1+z) we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for σ_z ≥ 0.02(1+z). For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations of galaxy simulations mimicking the Dark Energy Survey Year 1 sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.

  3. Can hydraulic-modelled rating curves reduce uncertainty in high flow data?

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Lam, Norris; Lyon, Steve W.

    2017-04-01

    Flood risk assessments rely on accurate discharge data records. Establishing a reliable rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived based on as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. In this study we compared the uncertainty in discharge data that resulted from these two rating curve modelling approaches. We applied both methods to a Swedish catchment, accounting for uncertainties in the stage-discharge gauging and water-surface slope data for the hydraulic model and in the stage-discharge gauging data and rating-curve parameters for the traditional method. We focused our analyses on high-flow uncertainty and the factors that could reduce this uncertainty. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken. First results show that the hydraulically-modelled rating curves were more sensitive to uncertainties in the calibration measurements of discharge than to those in water-surface slope. The uncertainty of the hydraulically-modelled rating curves was lowest within the range of the three calibration stage-discharge gaugings (i.e. between median and two-times median flow) whereas uncertainties were higher outside of this range. For instance, at the highest observed stage of the 24-year stage record, the 90% uncertainty band was -15% to +40% of the official rating curve. Additional gaugings at high flows (i.e. four to five times median flow) would likely substantially reduce those uncertainties. These first results show the potential of the hydraulically-modelled curves, particularly where the calibration gaugings are of high quality and cover a wide range of flow conditions.

  4. Conditional Entropy and Location Error in Indoor Localization Using Probabilistic Wi-Fi Fingerprinting.

    PubMed

    Berkvens, Rafael; Peremans, Herbert; Weyn, Maarten

    2016-10-02

    Localization systems are increasingly valuable, but their location estimates are only useful when the uncertainty of the estimate is known. This uncertainty is currently calculated as the location error given a ground truth, which is then used as a static measure in sometimes very different environments. In contrast, we propose the use of the conditional entropy of a posterior probability distribution as a complementary measure of uncertainty. This measure has the advantage of being dynamic, i.e., it can be calculated during localization based on individual sensor measurements, does not require a ground truth, and can be applied to discrete localization algorithms. Furthermore, for every consistent location estimation algorithm, both the location error and the conditional entropy measures must be related, i.e., a low entropy should always correspond with a small location error, while a high entropy can correspond with either a small or large location error. We validate this relationship experimentally by calculating both measures of uncertainty in three publicly available datasets using probabilistic Wi-Fi fingerprinting with eight different implementations of the sensor model. We show that the discrepancy between these measures, i.e., many location estimates having a high location error while simultaneously having a low conditional entropy, is largest for the least realistic implementations of the probabilistic sensor model. Based on the results presented in this paper, we conclude that conditional entropy, being dynamic, complementary to location error, and applicable to both continuous and discrete localization, provides an important extra means of characterizing a localization method.
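
    The proposed measure is straightforward to compute from a discrete posterior. A minimal sketch with two hypothetical posteriors over four candidate locations (entropy in bits; the conditional entropy is this quantity averaged over the measurement distribution):

      import numpy as np

      def entropy_bits(posterior):
          # Shannon entropy of a discrete posterior over candidate locations.
          p = np.asarray(posterior, float)
          p = p[p > 0] / p.sum()
          return -np.sum(p * np.log2(p))

      sharp = [0.90, 0.05, 0.03, 0.02]   # confident estimate -> low entropy
      flat = [0.25, 0.25, 0.25, 0.25]    # ambiguous estimate -> maximal entropy
      print(entropy_bits(sharp), entropy_bits(flat))   # ~0.62 vs 2.0 bits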

  5. Conditional Entropy and Location Error in Indoor Localization Using Probabilistic Wi-Fi Fingerprinting

    PubMed Central

    Berkvens, Rafael; Peremans, Herbert; Weyn, Maarten

    2016-01-01

    Localization systems are increasingly valuable, but their location estimates are only useful when the uncertainty of the estimate is known. This uncertainty is currently calculated as the location error given a ground truth, which is then used as a static measure in sometimes very different environments. In contrast, we propose the use of the conditional entropy of a posterior probability distribution as a complementary measure of uncertainty. This measure has the advantage of being dynamic, i.e., it can be calculated during localization based on individual sensor measurements, does not require a ground truth, and can be applied to discrete localization algorithms. Furthermore, for every consistent location estimation algorithm, both the location error and the conditional entropy measures must be related, i.e., a low entropy should always correspond with a small location error, while a high entropy can correspond with either a small or large location error. We validate this relationship experimentally by calculating both measures of uncertainty in three publicly available datasets using probabilistic Wi-Fi fingerprinting with eight different implementations of the sensor model. We show that the discrepancy between these measures, i.e., many location estimates having a high location error while simultaneously having a low conditional entropy, is largest for the least realistic implementations of the probabilistic sensor model. Based on the results presented in this paper, we conclude that conditional entropy, being dynamic, complementary to location error, and applicable to both continuous and discrete localization, provides an important extra means of characterizing a localization method. PMID:27706099

  6. Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and the Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter on the simulation result is also investigated. These results show that further development in the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.
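
    As a sketch of the Latin hypercube propagation step named above (using SciPy's qmc module; the three input names, their bounds, and the surrogate pressure model are illustrative assumptions, not the report's foam model):

      import numpy as np
      from scipy.stats import qmc

      # Assumed inputs: conductivity [W/m/K], activation energy [J/mol],
      # initial density [kg/m^3]; bounds are illustrative only.
      l_bounds = [0.02, 1.5e5, 280.0]
      u_bounds = [0.05, 2.0e5, 340.0]

      sampler = qmc.LatinHypercube(d=3, seed=42)
      samples = qmc.scale(sampler.random(n=100), l_bounds, u_bounds)

      def peak_pressure(k, Ea, rho):
          # Stand-in for the coupled decomposition/pressurization simulation.
          return 1e5 + 4e4 * (rho / 300.0) * np.exp((1.6e5 - Ea) / 5e4) / (k / 0.03)

      p = np.array([peak_pressure(*s) for s in samples])
      print(f"peak pressure: mean {p.mean():.3g} Pa, std {p.std():.3g} Pa")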

  7. Estimating the influence of parameter uncertainties in the planning and evaluation of tracer tests using ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Klotzsch, Stephan; Binder, Martin; Händel, Falk

    2017-06-01

    While planning tracer tests, uncertainties in geohydraulic parameters should be considered as an important factor. Neglecting these uncertainties can lead to missing the tracer breakthrough, for example. One way to consider uncertainties during tracer test design is the so-called ensemble forecast. The applicability of this method to geohydrological problems is demonstrated by coupling the method with two analytical solute transport models. The algorithm presented in this article is suitable for prediction as well as parameter estimation. The parameter estimation function can be used during a tracer test to reduce the uncertainties based on the measured data, which can improve the initial prediction. The algorithm was implemented in a software tool which is freely downloadable from the website of the Institute for Groundwater Management at TU Dresden, Germany.

  8. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  9. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that indicates the reliability of a result and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not hold for sampling uncertainty which, because it faces several obstacles and there is no clarity on how to perform the procedures, has been neglected, although it is admittedly indispensable to the measurement process. This paper aims at describing the state of the art of sampling uncertainty and at assessing its relevance to measurement uncertainty.

  10. Determination of the carbon budget of a pasture: effect of system boundaries and flux uncertainties

    NASA Astrophysics Data System (ADS)

    Felber, R.; Bretscher, D.; Münger, A.; Neftel, A.; Ammann, C.

    2015-12-01

    Carbon (C) sequestration in the soil is considered a potentially important mechanism to mitigate greenhouse gas (GHG) emissions of the agricultural sector. It can be quantified by the net ecosystem carbon budget (NECB) describing the change of soil C as the sum of all relevant import and export fluxes. NECB was investigated here in detail for an intensively grazed dairy pasture in Switzerland. Two budget approaches with different system boundaries were applied: NECBtot for system boundaries including the grazing cows and NECBpast for system boundaries excluding the cows. CO2 and CH4 exchange induced by soil/vegetation processes as well as direct emissions by the animals were derived from eddy covariance measurements. Other C fluxes were either measured (milk yield, concentrate feeding) or derived based on animal performance data (intake, excreta). For the investigated year, both approaches resulted in a small non-significant C loss: NECBtot -13 ± 61 g C m-2 yr-1 and NECBpast -17 ± 81 g C m-2 yr-1. The considerable uncertainties, depending on the approach, were mainly due to errors in the CO2 exchange or in the animal-related fluxes. The associated GHG budget revealed CH4 emissions from the cows to be the major contributor, but with much lower uncertainty compared to NECB. Although only one year of data limits the representativeness of the carbon budget results, they demonstrated the important contribution of the non-CO2 fluxes depending on the chosen system boundaries and the effect of their propagated uncertainty in an exemplary way. The simultaneous application and comparison of both NECB approaches provides a useful consistency check for the carbon budget determination and can help to identify and eliminate systematic errors.

  11. Determination of the carbon budget of a pasture: effect of system boundaries and flux uncertainties

    NASA Astrophysics Data System (ADS)

    Felber, Raphael; Bretscher, Daniel; Münger, Andreas; Neftel, Albrecht; Ammann, Christof

    2016-05-01

    Carbon (C) sequestration in the soil is considered a potentially important mechanism to mitigate greenhouse gas (GHG) emissions of the agricultural sector. It can be quantified by the net ecosystem carbon budget (NECB) describing the change of soil C as the sum of all relevant import and export fluxes. NECB was investigated here in detail for an intensively grazed dairy pasture in Switzerland. Two budget approaches with different system boundaries were applied: NECBtot for system boundaries including the grazing cows and NECBpast for system boundaries excluding the cows. CO2 and CH4 exchange induced by soil/vegetation processes as well as direct emissions by the animals were derived from eddy covariance measurements. Other C fluxes were either measured (milk yield, concentrate feeding) or derived based on animal performance data (intake, excreta). For the investigated year, both approaches resulted in a small near-neutral C budget: NECBtot -27 ± 62 and NECBpast 23 ± 76 g C m-2 yr-1. The considerable uncertainties, depending on the approach, were mainly due to errors in the CO2 exchange or in the animal-related fluxes. The comparison of the NECB results with the annual exchange of other GHG revealed CH4 emissions from the cows to be the major contributor in terms of CO2 equivalents, but with much lower uncertainty compared to NECB. Although only 1 year of data limits the representativeness of the carbon budget results, they demonstrate the important contribution of the non-CO2 fluxes depending on the chosen system boundaries and the effect of their propagated uncertainty in an exemplary way. The simultaneous application and comparison of both NECB approaches provides a useful consistency check for the carbon budget determination and can help to identify and eliminate systematic errors.

  12. The Relative Importance of Random Error and Observation Frequency in Detecting Trends in Upper Tropospheric Water Vapor

    NASA Technical Reports Server (NTRS)

    Whiteman, David N.; Vermeesch, Kevin C.; Oman, Luke D.; Weatherhead, Elizabeth C.

    2011-01-01

    Recent published work assessed the amount of time to detect trends in atmospheric water vapor over the coming century. We address the same question and conclude that under the most optimistic scenarios and assuming perfect data (i.e., observations with no measurement uncertainty) the time to detect trends will be at least 12 years at approximately 200 hPa in the upper troposphere. Our times to detect trends are therefore shorter than those recently reported and this difference is affected by data sources used, method of processing the data, geographic location and pressure level in the atmosphere where the analyses were performed. We then consider the question of how instrumental uncertainty plays into the assessment of time to detect trends. We conclude that due to the high natural variability in atmospheric water vapor, the amount of time to detect trends in the upper troposphere is relatively insensitive to instrumental random uncertainty and that it is much more important to increase the frequency of measurement than to decrease the random error in the measurement. This is put in the context of international networks such as the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) and the Network for the Detection of Atmospheric Composition Change (NDACC) that are tasked with developing time series of climate quality water vapor data.
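
    The trend-detection times discussed above are commonly estimated with the approximation of Weatherhead et al. (1998), in which the years of monthly data needed to detect a trend ω (per year) at the 95% level, given natural variability σ_N and lag-1 autocorrelation φ of the monthly anomalies, is n* = [(3.3σ_N/|ω|)·√((1+φ)/(1-φ))]^(2/3). A small sketch with illustrative input values:

      import numpy as np

      def years_to_detect(trend_per_decade, sigma_n, phi):
          # Weatherhead et al. (1998) approximation for trend detection time.
          omega = trend_per_decade / 10.0
          factor = np.sqrt((1.0 + phi) / (1.0 - phi))
          return ((3.3 * sigma_n / abs(omega)) * factor) ** (2.0 / 3.0)

      # Assumed: 1%/decade trend, 2% monthly variability, autocorrelation 0.4
      print(f"{years_to_detect(0.01, 0.02, 0.4):.0f} years")   # -> ~22 years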

  13. The relative importance of random error and observation frequency in detecting trends in upper tropospheric water vapor

    NASA Astrophysics Data System (ADS)

    Whiteman, David N.; Vermeesch, Kevin C.; Oman, Luke D.; Weatherhead, Elizabeth C.

    2011-11-01

    Recent published work assessed the amount of time to detect trends in atmospheric water vapor over the coming century. We address the same question and conclude that under the most optimistic scenarios and assuming perfect data (i.e., observations with no measurement uncertainty) the time to detect trends will be at least 12 years at approximately 200 hPa in the upper troposphere. Our times to detect trends are therefore shorter than those recently reported and this difference is affected by data sources used, method of processing the data, geographic location and pressure level in the atmosphere where the analyses were performed. We then consider the question of how instrumental uncertainty plays into the assessment of time to detect trends. We conclude that due to the high natural variability in atmospheric water vapor, the amount of time to detect trends in the upper troposphere is relatively insensitive to instrumental random uncertainty and that it is much more important to increase the frequency of measurement than to decrease the random error in the measurement. This is put in the context of international networks such as the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) and the Network for the Detection of Atmospheric Composition Change (NDACC) that are tasked with developing time series of climate quality water vapor data.

  14. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
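
    The variance-decomposition step can be illustrated compactly. The sketch below estimates first-order sensitivity indices S_i = Var(E[Y|X_i])/Var(Y) by quantile-binning each input, for a toy model with three independent inputs; it is a crude but serviceable estimator when samples are plentiful, and does not reproduce the paper's hierarchical geostatistical machinery:

      import numpy as np

      def first_order_indices(X, y, bins=20):
          # S_i = Var(E[Y|X_i]) / Var(Y), estimated by quantile-binning X_i
          # and averaging y within each bin.
          X, y = np.asarray(X), np.asarray(y)
          out = []
          for i in range(X.shape[1]):
              edges = np.quantile(X[:, i], np.linspace(0.0, 1.0, bins + 1))
              idx = np.clip(np.searchsorted(edges, X[:, i]) - 1, 0, bins - 1)
              means = np.array([y[idx == b].mean() for b in range(bins)])
              weights = np.array([(idx == b).mean() for b in range(bins)])
              out.append(np.sum(weights * (means - y.mean()) ** 2) / y.var())
          return out

      rng = np.random.default_rng(3)
      X = rng.uniform(-1.0, 1.0, (20000, 3))
      y = 4.0 * X[:, 0] + X[:, 1] ** 2 + 0.1 * rng.normal(size=20000)
      print(first_order_indices(X, y))   # input 0 dominates; input 2 ~ 0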

  15. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    PubMed

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study, published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurements performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence intervals and distributions of the model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. From the computational biokinetic modeling, the mean, standard uncertainty, and confidence interval of the model predictions calculated on the basis of the model parameter uncertainty were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; the same was observed for other organs and tissues. The uncertainty of the integral of Zr radioactivity up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of a model parameter strongly influences the model prediction, and that the correlation of the model input parameters affects the model prediction to an extent that depends on the strength of the correlation. Qualitatively, a comparison of the model predictions with the measured plasma and urinary data showed the HMGU model to be more reliable than the ICRP model; quantitatively, the uncertainty of the prediction by the HMGU systemic biokinetic model is smaller than that of the ICRP model. The uncertainty information on the model parameters analyzed in this study is used in the second part of the paper, regarding a sensitivity analysis of the Zr biokinetic models.
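
    To illustrate the general mechanics of propagating parameter uncertainty through a biokinetic model, the sketch below pushes a hypothetical lognormal transfer-rate parameter through a one-compartment plasma clearance model; the actual ICRP and HMGU Zr models contain many coupled compartments, and every number here is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical one-compartment model: fraction of injected Zr
    # remaining in plasma t days after administration, clearance rate k.
    def plasma_retention(t, k):
        return np.exp(-k * t)

    # Assumed lognormal parameter distribution (illustrative values).
    k = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=50_000)

    r = plasma_retention(2.0, k)     # model prediction at day 2
    lo, hi = np.percentile(r, [2.5, 97.5])
    print(f"retention: {r.mean():.3f} +/- {r.std():.3f} "
          f"(95% interval {lo:.3f}-{hi:.3f})")
    ```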

  16. Estimation of Confidence Intervals for Multiplication and Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbeke, J

    2009-07-17

    Helium-3 tubes are used to detect thermal neutrons by charge collection using the ³He(n,p) reaction. By analyzing the time sequence of neutrons detected by these tubes, one can determine important features about the constitution of a measured object: some materials such as Cf-252 emit several neutrons simultaneously, while others such as uranium and plutonium isotopes multiply the number of neutrons to form bursts. This translates into unmistakable signatures. To determine the type of materials measured, one compares the measured count distribution with the one generated by a theoretical fission chain model. When the neutron background is negligible, the theoretical count distributions can be completely characterized by a pair of parameters, the multiplication M and the detection efficiency ε. While the optimal pair of M and ε can be determined by existing codes such as BigFit, the uncertainty on these parameters has not yet been fully studied. The purpose of this work is to precisely compute the uncertainties on the parameters M and ε, given the uncertainties in the count distribution. By considering different lengths of time-tagged data, we will determine how the uncertainties on M and ε vary with the different count distributions.
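
    One generic way to obtain such confidence regions is a likelihood-ratio scan over a grid of (M, ε) values; the sketch below does this with a Poisson likelihood and a deliberately toy count-distribution model standing in for a real fission-chain model such as the one behind BigFit (both the data and the model are illustrative assumptions).

    ```python
    import numpy as np
    from scipy.stats import poisson, chi2

    # Toy stand-in for a fission-chain count-distribution model:
    # predicted counts per multiplicity bin for given M and eps.
    def predicted_counts(M, eps, total, nbins=6):
        p = eps * M / (1.0 + eps * M)
        probs = (1 - p) * p ** np.arange(nbins)
        return total * probs / probs.sum()

    observed = np.array([5200, 2600, 1300, 650, 330, 170])  # invented

    Ms = np.linspace(1.0, 3.0, 81)
    epss = np.linspace(0.05, 0.5, 91)
    ll = np.array([[poisson.logpmf(observed,
                                   predicted_counts(M, e, observed.sum())).sum()
                    for e in epss] for M in Ms])

    # 68% joint confidence region from the likelihood-ratio statistic.
    inside = 2 * (ll.max() - ll) <= chi2.ppf(0.68, df=2)
    i, j = np.unravel_index(ll.argmax(), ll.shape)
    print(f"best fit: M = {Ms[i]:.2f}, eps = {epss[j]:.3f}; "
          f"{inside.sum()} grid points in the 68% region")
    ```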

  17. Metrology in Medicine: From Measurements to Decision, with Specific Reference to Anesthesia and Intensive Care

    PubMed Central

    Imhoff, Michael; Cecconi, Maurizio

    2015-01-01

    Metrology is the science of measurement. Although it is of critical importance in medicine, and especially in critical care, frequent confusion in terms and definitions impairs both interphysician communication and the understanding of manufacturers’ and engineers’ instructions and limitations when using devices. In this review, we first list the terms defined by the International Bureau of Weights and Measures regarding quantities and units, measurements, devices for measurement, properties of measuring devices, and measurement standards. The traditional tools for assessing the most important measurement quality criteria are also reviewed, with clinical examples for diagnosis, alarm, and titration purposes, as well as for assessing the uncertainty of reference methods. PMID:25625255

  18. Life-cycle assessment of municipal solid waste management alternatives with consideration of uncertainty: SIWMS development and application.

    PubMed

    Hanandeh, Ali El; El-Zein, Abbas

    2010-05-01

    This paper describes the development and application of the Stochastic Integrated Waste Management Simulator (SIWMS) model. SIWMS provides a detailed view of the environmental impacts and associated costs of municipal solid waste (MSW) management alternatives under conditions of uncertainty. The model follows a life-cycle inventory approach extended with compensatory systems to provide more equitable bases for comparing different alternatives. Economic performance is measured by the net present value. The model is verified against four publicly available models under deterministic conditions and then used to study the impact of uncertainty on Sydney's MSW management 'best practices'. Uncertainty has a significant effect on all impact categories. The greatest effect is observed in the global warming category where a reversal of impact direction is predicted. The reliability of the system is most sensitive to uncertainties in the waste processing and disposal. The results highlight the importance of incorporating uncertainty at all stages to better understand the behaviour of the MSW system. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
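
    As a minimal illustration of how net present value can be reported under uncertainty, the sketch below Monte Carlo samples an uncertain annual cost stream and discounts it; the distribution, horizon, and discount rate are illustrative assumptions, not SIWMS internals.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    n, years, rate = 20_000, 20, 0.05           # assumed horizon and rate
    annual_cost = rng.normal(1.0e6, 0.15e6, size=(n, years))
    discount = (1 + rate) ** -np.arange(1, years + 1)

    npv = (annual_cost * discount).sum(axis=1)  # one NPV per realization
    lo, hi = np.percentile(npv, [2.5, 97.5])
    print(f"NPV: {npv.mean():.3e} (95% interval {lo:.3e} to {hi:.3e})")
    ```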

  19. Uncertainty in BRCA1 cancer susceptibility testing.

    PubMed

    Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y

    2006-11-15

    This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.

  20. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    PubMed

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept to quantify the variability of measurement results. There are two approaches to estimate measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and to compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated the measurement uncertainty of the concentration of glucose according to the CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach. We identified the sources of uncertainty, made an uncertainty budget, and assessed the measurement functions. We determined the uncertainties of each element and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months. Then, we estimated and corrected systematic bias using certified reference material of glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (both k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (both k=2). We presented practical and detailed examples of estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches were approximately equivalent and interchangeable, and concluded that clinical laboratories could determine measurement uncertainty by the simpler top-down approach.
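
    A minimal sketch of the two combinations in the spirit of the GUM and CLSI EP29; all component values below are illustrative, not the paper's data.

    ```python
    import math

    # Bottom-up: combine independent standard-uncertainty components
    # (e.g. calibrator, repeatability, volume) in quadrature.
    components = [0.05, 0.06, 0.03]          # mmol/L, illustrative
    u_bottom_up = math.sqrt(sum(u ** 2 for u in components))

    # Top-down: long-term IQC imprecision combined with the uncertainty
    # of the bias correction from a certified reference material.
    u_iqc, u_bias = 0.07, 0.04               # mmol/L, illustrative
    u_top_down = math.sqrt(u_iqc ** 2 + u_bias ** 2)

    k = 2  # coverage factor for ~95% confidence
    print(f"bottom-up U = +/-{k * u_bottom_up:.2f} mmol/L")
    print(f"top-down  U = +/-{k * u_top_down:.2f} mmol/L")
    ```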

  1. Assessment of the risk of introduction of H5N1 HPAI virus from affected countries to the U.K.

    PubMed

    Sabirovic, M; Hall, S; Wilesmith, J; Grimley, P; Coulson, N; Landeg, F

    2007-03-01

    The Department for Environment, Food and Rural Affairs (Defra) has monitored epidemiologic developments following outbreaks of H5N1 in Asia since the beginning of 2004 and publishes risk assessments as the situation evolves. The U.K. applies safeguard measures that reflect EU rules to enable imports to continue when they present negligible risk. Defra risk assessments (RA) identify possible pathways by which the H5N1 virus may be introduced to the U.K. These assessments provide a basis for identifying appropriate surveillance activities to ensure early detection, should the virus be introduced, and disease control measures to be taken, should the virus be detected in the U.K. Nevertheless, these assessments have highlighted that many fundamental uncertainties still remain. These uncertainties center on the geographic and species distribution of infection outside Asia and the means of dissemination of the virus. However, the evolving developments demonstrated that regulatory decisions had to be made despite these uncertainties. Improvements in our current RA abilities would greatly benefit from systematic studies to provide more information on the species susceptibility, dynamics of infection, pathogenesis, and ecology of the virus along with possible pathways by which the H5N1 virus may be disseminated. Such an approach would assist in reducing uncertainties and ensuring that regulatory risk management measures are regularly reviewed by taking into account the most recent scientific evidence. The likelihood of the persistence of H5N1 outside Asia in the coming years and the effects of control programs in Asia and other affected regions to reduce the prevalence of infection are also important factors.

  2. Assessing climate change and socio-economic uncertainties in long term management of water resources

    NASA Astrophysics Data System (ADS)

    Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis

    2015-04-01

    Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. The impact of climate change on the frequency of extreme events such as drought makes it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as far as possible robust. We have developed and coupled a system of models that includes a weather generator, simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied in the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further. Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: Climate change, Uncertainty, Decision making, Drought, Risk, Water resources management.

  3. A global positioning measurement system for regional geodesy in the caribbean

    NASA Astrophysics Data System (ADS)

    Renzetti, N. A.

    1986-11-01

    Low cost, portable receivers using signals from satellites of the Global Positioning System (GPS) will enable precision geodetic observations to be made on a large scale. A number of important geophysical questions relating to plate-motion kinematics and dynamics can be addressed with this measurement capability. We describe a plan to design and validate a GPS-based geodetic system, and to demonstrate its capability in California, Mexico and the Caribbean region. The Caribbean program is a prototype for a number of regional geodetic networks to be globally distributed. In 1985, efforts will be concentrated on understanding and minimizing error sources. Two dominant sources of error are uncertainties in the orbit ephemeris of the GPS satellites, and uncertainties in the correction for signal delay due to variable tropospheric water vapor. Orbit ephemeris uncertainties can be minimized by performing simultaneous satellite observations with GPS receivers at known (fiducial) points. Water vapor corrections can be made by performing simultaneous line-of-sight measurements of integrated water vapor content with ground-based water vapor radiometers. Specific experiments to validate both concepts are outlined. Caribbean measurements will begin in late 1985 or early 1986. Key areas of measurement are the northern strike-slip boundary, and the western convergent boundary. Specific measurement plans in both regions are described.

  4. Experimental findings on the underwater measurements uncertainty of speed of sound and the alignment system

    NASA Astrophysics Data System (ADS)

    Santos, T. Q.; Alvarenga, A. V.; Oliveira, D. P.; Mayworm, R. C.; Souza, R. M.; Costa-Félix, R. P. B.

    2016-07-01

    Speed of sound is an important quantity for characterizing reference materials for ultrasonic applications, for instance. The alignment between the transducer and the test body is a key activity for performing reliable and consistent measurements. The aim of this work is to evaluate the influence of the alignment system on the expanded uncertainty of such measurements. A stainless steel cylinder was previously calibrated on an out-of-water system typically used for the calibration of non-destructive test blocks. Afterwards, the cylinder was calibrated underwater with two distinct alignment systems: fixed and mobile. The values were statistically compared to the out-of-water measurement, considered the gold standard for this application. For both alignment systems, the normalized error was less than 0.8, leading to the conclusion that the underwater and out-of-water measurement systems do not diverge significantly. The gold-standard uncertainty was 2.7 m·s⁻¹, whilst the fixed underwater system resulted in 13 m·s⁻¹ and the mobile alignment system achieved 6.6 m·s⁻¹. After this validation of the underwater system for speed-of-sound measurement, it will be applied to certify Encapsulated Tissue Mimicking Material as a reference material for biotechnology applications.
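
    The normalized error used for this comparison is presumably the usual interlaboratory-comparison statistic, which declares two results consistent when |E_n| ≤ 1:

    ```latex
    E_n = \frac{x_{\mathrm{under}} - x_{\mathrm{ref}}}
               {\sqrt{U_{\mathrm{under}}^{2} + U_{\mathrm{ref}}^{2}}},
    \qquad |E_n| \le 1 \;\Rightarrow\; \text{consistent}
    ```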

  5. INSPECTION SHOP: PLAN TO PROVIDE UNCERTAINTY ANALYSIS WITH MEASUREMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nederbragt, W W

    The LLNL inspection shop is chartered to make dimensional measurements of components for critical programmatic experiments. These measurements ensure that components are within tolerance and provide geometric details that can be used to further refine simulations. For these measurements to be useful, they must be significantly more accurate than the tolerances that are being checked. For example, if a part has a specified dimension of 100 millimeters and a tolerance of 1 millimeter, then the precision and/or accuracy of the measurement should be less than 1 millimeter. Using the "10-to-1 gaugemaker's rule of thumb", the desired precision of the measurement should be less than 100 micrometers. Currently, the process for associating measurement uncertainty with data is not standardized, nor is the uncertainty based on a thorough uncertainty analysis. The goal of this project is to begin providing measurement uncertainty statements with critical measurements performed in the inspection shop. To accomplish this task, comprehensive knowledge of the underlying sources of uncertainty for each measurement instrument needs to be developed and quantified. Moreover, the elemental uncertainties from each physical source need to be combined in a meaningful way to obtain an overall measurement uncertainty.
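
    Combining elemental uncertainties "in a meaningful way" normally means the GUM law of propagation of uncertainty; for uncorrelated inputs it reduces to a sensitivity-weighted root sum of squares, scaled by a coverage factor:

    ```latex
    u_c(y) = \sqrt{\sum_i \left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i)},
    \qquad U = k\,u_c(y), \quad k \approx 2
    ```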

  6. Quantifying uncertainty in carbon and nutrient pools of coarse woody debris

    NASA Astrophysics Data System (ADS)

    See, C. R.; Campbell, J. L.; Fraver, S.; Domke, G. M.; Harmon, M. E.; Knoepp, J. D.; Woodall, C. W.

    2016-12-01

    Woody detritus constitutes a major pool of both carbon and nutrients in forested ecosystems. Estimating coarse wood stocks relies on many assumptions, even when full surveys are conducted. Researchers rarely report error in coarse wood pool estimates, despite its importance to ecosystem budgets and modelling efforts. To date, no study has attempted a comprehensive assessment of the error rates and uncertainty inherent in the estimation of this pool. Here, we use Monte Carlo analysis to propagate the error associated with the major sources of uncertainty present in the calculation of coarse wood carbon and nutrient (i.e., N, P, K, Ca, Mg, Na) pools. We also evaluate individual sources of error to identify the importance of each source of uncertainty in our estimates. We quantify sampling error by comparing the three most common field methods used to survey coarse wood (two transect methods and a whole-plot survey). We quantify the measurement error associated with length and diameter measurement, and technician error in species identification and decay classification, using plots surveyed by multiple technicians. We use previously published values of model error for the four most common methods of volume estimation: Smalian's, conical frustum, conic paraboloid, and average-of-ends. We also use previously published values for error in the collapse ratio (cross-sectional height/width) of decayed logs, which serves as a surrogate for the volume remaining. We consider sampling error in chemical concentration and density for all decay classes, using distributions from both published and unpublished studies. Analytical uncertainty is calculated using standard reference plant material from the National Institute of Standards and Technology. Our results suggest that technician error in decay classification can have a large effect on uncertainty, since many of the error distributions included in the calculation (e.g. density, chemical concentration, volume-model selection, collapse ratio) are decay-class specific.
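
    A minimal Monte Carlo propagation sketch in the spirit described, for a single log; a plain cylinder stands in for the volume models compared in the study, and all error distributions are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    length = rng.normal(5.0, 0.05, n)      # m, measurement error
    diameter = rng.normal(0.30, 0.01, n)   # m, measurement error
    density = rng.normal(350.0, 60.0, n)   # kg/m3, decay-class spread
    c_frac = rng.normal(0.48, 0.02, n)     # carbon mass fraction

    volume = np.pi * (diameter / 2) ** 2 * length   # cylindrical stand-in
    carbon = volume * density * c_frac              # kg C per log

    lo, hi = np.percentile(carbon, [2.5, 97.5])
    print(f"C pool: {carbon.mean():.1f} kg (95% interval {lo:.1f}-{hi:.1f})")
    ```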

  7. Uncertainty in predicting soil hydraulic properties at the hillslope scale with indirect methods

    NASA Astrophysics Data System (ADS)

    Chirico, G. B.; Medina, H.; Romano, N.

    2007-02-01

    Several hydrological applications require the characterisation of soil hydraulic properties at large spatial scales. Pedotransfer functions (PTFs) are being developed as simplified methods to estimate soil hydraulic properties, as an alternative to direct measurements, which are unfeasible in most practical circumstances. The objective of this study is to quantify the uncertainty in PTF spatial predictions at the hillslope scale as related to the sampling density, due to: (i) the error in estimated soil physico-chemical properties and (ii) PTF model error. The analysis is carried out on a 2-km-long experimental hillslope in South Italy. The method adopted is based on a stochastic generation of patterns of soil variables using sequential Gaussian simulation, conditioned on the observed sample data. The following PTFs are applied: Vereecken's PTF [Vereecken, H., Diels, J., van Orshoven, J., Feyen, J., Bouma, J., 1992. Functional evaluation of pedotransfer functions for the estimation of soil hydraulic properties. Soil Sci. Soc. Am. J. 56, 1371-1378] and the HYPRES PTF [Wösten, J.H.M., Lilly, A., Nemes, A., Le Bas, C., 1999. Development and use of a database of hydraulic properties of European soils. Geoderma 90, 169-185]. The two PTFs reliably estimate the soil water retention characteristic even for a relatively coarse sampling resolution, with prediction uncertainties comparable to the uncertainties in direct laboratory or field measurements. The uncertainty in the soil water retention prediction due to model error is as significant as, or more significant than, the uncertainty associated with the estimated input, even for a relatively coarse sampling resolution. Prediction uncertainties are much more important when PTFs are applied to estimate the saturated hydraulic conductivity. In this case model error dominates the overall prediction uncertainty, making the effect of the input error negligible.

  8. On the determination of the carbon balance of continents (Vladimir Ivanovich Vernadsky Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Dolman, Albertus J. Han

    2013-04-01

    The carbon balance of regions the size of continents can be determined, albeit with significant uncertainty, by combining several bottom-up and top-down methods. The bottom-up methods use eddy covariance techniques, biometric inventory measurements and modeling, while the top-down methods use atmospheric observations and inverse models. There has been considerable progress in the last few years in determining these balances through more or less standard protocols, as highlighted for instance by studies of the REgional Carbon Cycle Assessment and Processes (RECCAP) project of the Global Carbon Project. Important areas where uncertainty creeps in are the scaling of point measurements in the bottom-up methods, the sparseness of the observation network, and the role of model and other errors in the inversion methods. Typically these balances hold for periods of several years. They therefore do not resolve the impact of anomalies in weather and climate directly. The role of management in these balances also differs between continents. For instance, in Europe management plays a strong role in the carbon balance, whereas for the Russian continent it is less important. Management in the European carbon balance may potentially override climatically driven variability. In contrast, for Russia, the role of forest is paramount, but there the vulnerability of the Arctic regions and permafrost is a key uncertainty for future behaviour. I hope to show the importance of these different aspects of the terrestrial carbon balance by comparing the two continents, and also to discuss the significant uncertainty we still face in determining the carbon budgets of large areas. I will argue that we need to get a clearer picture of the role of management in these budgets, but also of the time variability of the budgets, to be able to determine the impact of anomalous weather and the vulnerability in a future climate.

  9. How Knowledge Worker Teams Deal Effectively with Task Uncertainty: The Impact of Transformational Leadership and Group Development.

    PubMed

    Leuteritz, Jan-Paul; Navarro, José; Berger, Rita

    2017-01-01

    The purpose of this paper is to clarify how leadership is able to improve team effectiveness, by means of its influence on group processes (i.e., increasing group development) and on the group task (i.e., decreasing task uncertainty). Four hundred and eight members of 107 teams in a German research and development (R&D) organization completed a web-based survey; they provided measures of transformational leadership, group development, 2 aspects of task uncertainty, task interdependence, and team effectiveness. In 54 of these teams, the leaders answered a web-based survey on team effectiveness. We tested the model with the data from team members, using structural equations modeling. Group development and a task uncertainty measurement that refers to unstable demands from outside the team partially mediate the effect of transformational leadership on team effectiveness in R&D organizations (p < 0.05). Although transformational leaders reduce unclarity of goals (p < 0.05), this seems not to contribute to team effectiveness. The data provided by the leaders was used to assess common source bias, which did not affect the interpretability of the results. Limitations include cross-sectional data and a lower than expected variance of task uncertainty across different job types. This paper contributes to understanding how knowledge worker teams deal effectively with task uncertainty and confirms the importance of group development in this context. This is the first study to examine the effects of transformational leadership and team processes on team effectiveness considering the task characteristics uncertainty and interdependence.

  11. An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation

    NASA Astrophysics Data System (ADS)

    McLean, N. M.; Bowring, J. F.; Bowring, S. A.

    2011-06-01

    High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of calculated U-Pb dates. As analytical techniques have advanced, formerly small sources of uncertainty are increasingly important, and thus previous simplifications for data reduction and uncertainty propagation are no longer valid. Although notable previous efforts have treated propagation of correlated uncertainties for the U-Pb system, the equations, uncertainties, and correlations have been limited in number and subject to simplification during propagation through intermediary calculations. We derive and present a transparent U-Pb data reduction algorithm that transforms raw isotopic data and measured or assumed laboratory parameters into the isotopic ratios and dates geochronologists interpret without making assumptions about the relative size of sample components. To propagate uncertainties and their correlations, we describe, in detail, a linear algebraic algorithm that incorporates all input uncertainties and correlations without limiting or simplifying covariance terms to propagate them through intermediate calculations. Finally, a weighted mean algorithm is presented that utilizes matrix elements from the uncertainty propagation algorithm to propagate random and systematic uncertainties for data comparison with other U-Pb labs and other geochronometers. The linear uncertainty propagation algorithms are verified with Monte Carlo simulations of several typical analyses. We propose that our algorithms be considered by the community for implementation to improve the collaborative science envisioned by the EARTHTIME initiative.
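
    The core of such a linear-algebraic treatment is first-order covariance propagation, Σ_y = J Σ_x Jᵀ, which carries all input uncertainties and correlations through without dropping covariance terms; the sketch below shows the mechanics on a toy two-ratio function (the function and covariance values are illustrative, not the paper's U-Pb reduction equations).

    ```python
    import numpy as np

    def f(x):
        # Toy nonlinear reduction of two measured ratios.
        return np.log(1.0 + x)

    x0 = np.array([0.0500, 0.3500])          # measured ratios (invented)
    Sx = np.array([[2.5e-9, 1.0e-9],         # input covariance matrix,
                   [1.0e-9, 4.0e-8]])        # including correlation

    # Numerical Jacobian of f at x0.
    eps = 1e-8
    J = np.column_stack([(f(x0 + eps * np.eye(2)[i]) - f(x0)) / eps
                         for i in range(2)])

    Sy = J @ Sx @ J.T                        # propagated covariance
    print("1-sigma output uncertainties:", np.sqrt(np.diag(Sy)))
    ```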

  12. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  13. Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reinath, Michael S.

    1997-01-01

    Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device (CCD) detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.

  14. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. (c) 2009 ASCE.
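
    For orientation, the velocity-area computation and a simple quadrature combination of per-segment random uncertainties look like the sketch below; IVE itself estimates those variances from the measured data rather than assuming them, and every number here is invented.

    ```python
    import math

    # Velocity-area method: Q = sum of v_i * d_i * w_i over verticals.
    # (velocity m/s, depth m, width m, assumed relative uncertainty)
    segments = [
        (0.40, 1.2, 2.0, 0.06),
        (0.65, 1.8, 2.0, 0.05),
        (0.55, 1.5, 2.0, 0.05),
    ]

    Q = sum(v * d * w for v, d, w, _ in segments)
    # Independent random errors per segment add in quadrature.
    u_Q = math.sqrt(sum((v * d * w * ur) ** 2 for v, d, w, ur in segments))
    print(f"Q = {Q:.2f} m3/s +/- {u_Q:.2f} m3/s (1 sigma)")
    ```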

  15. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Gregory M; Key, Brian P; Zerkle, David K

    2009-01-01

    The security risk associated with malevolent acts such as terrorism is often void of the historical data required for a traditional probabilistic risk assessment (PRA). Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and, ultimately, resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
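
    As one concrete example of the kind of measure involved, the sketch below computes the Hartley-based non-specificity of a small Dempster-Shafer belief structure; the basic probability assignments are invented expert judgements, and the paper's specific conflict measures are not reproduced here.

    ```python
    import math

    # Non-specificity N(m) = sum over focal sets A of m(A) * log2|A|.
    m = {
        frozenset({"low"}): 0.2,
        frozenset({"low", "medium"}): 0.5,           # imprecise judgement
        frozenset({"low", "medium", "high"}): 0.3,   # very unspecific
    }

    N = sum(mass * math.log2(len(A)) for A, mass in m.items())
    print(f"non-specificity = {N:.3f} bits")  # 0 would mean fully specific
    ```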

  16. Accurate calibration and uncertainty estimation of the normal spring constant of various AFM cantilevers.

    PubMed

    Song, Yunpeng; Wu, Sen; Xu, Linyan; Fu, Xing

    2015-03-10

    Measurement of force on a micro- or nano-Newton scale is important when exploring the mechanical properties of materials in the biophysics and nanomechanical fields. The atomic force microscope (AFM) is widely used in microforce measurement. The cantilever probe works as an AFM force sensor, and the spring constant of the cantilever is of great significance to the accuracy of the measurement results. This paper presents a normal spring constant calibration method with the combined use of an electromagnetic balance and a homemade AFM head. When the cantilever presses the balance, its deflection is detected through an optical lever integrated in the AFM head. Meanwhile, the corresponding bending force is recorded by the balance. Then the spring constant can be simply calculated using Hooke's law. During the calibration, a feedback loop is applied to control the deflection of the cantilever. Errors that may affect the stability of the cantilever could be compensated rapidly. Five types of commercial cantilevers with different shapes, stiffness, and operating modes were chosen to evaluate the performance of our system. Based on the uncertainty analysis, the expanded relative standard uncertainties of the normal spring constant of most measured cantilevers are believed to be better than 2%.
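
    The calibration itself reduces to Hooke's law plus a quadrature combination of the force and deflection uncertainties; a minimal sketch with invented values follows.

    ```python
    import math

    # Hooke's law: k = F / delta, with relative uncertainties of the
    # balance force and optical-lever deflection added in quadrature.
    F, u_F = 120e-9, 0.6e-9              # N, bending force (invented)
    delta, u_delta = 2.40e-6, 0.015e-6   # m, deflection (invented)

    k = F / delta
    u_rel = math.sqrt((u_F / F) ** 2 + (u_delta / delta) ** 2)
    print(f"k = {k:.4f} N/m, expanded relative uncertainty "
          f"(coverage factor 2) = {2 * 100 * u_rel:.1f}%")
    ```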

  17. Evaluation of measurement uncertainty of glucose in clinical chemistry.

    PubMed

    Berçik Inal, B; Koldas, M; Inal, H; Coskun, C; Gümüs, A; Döventas, Y

    2007-04-01

    The definition of the uncertainty of measurement used in the International Vocabulary of Basic and General Terms in Metrology (VIM) is a parameter, associated with the result of a measurement, which characterizes the dispersion of the values that could reasonably be attributed to the measurand. Uncertainty of measurement comprises many components. In addition, for every measured parameter, an uncertainty value should be given by all institutions that have been accredited; this value shows the reliability of the measurement. The GUM contains directions for evaluating uncertainty, and the Eurachem/CITAC Guide CG4 was published by the Eurachem/CITAC Working Group in the year 2000. Both offer a mathematical model by which uncertainty can be calculated. There are two types of uncertainty evaluation in measurement: type A is the evaluation of uncertainty through statistical analysis, and type B is the evaluation of uncertainty through other means, for example, a certified reference material. The Eurachem Guide uses four types of distribution functions: (1) a rectangular distribution, used when a certificate gives limits without specifying a level of confidence (u(x) = a/√3); (2) a triangular distribution, used when values near the center are more likely than near the bounds (u(x) = a/√6); (3) a normal distribution, in which an uncertainty is given in the form of a standard deviation s, a relative standard deviation s/√n, or a coefficient of variation CV% without specifying the distribution (a = half-width of the stated limits, u = standard uncertainty); and (4) a stated confidence interval.
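
    A small sketch of the distribution-to-standard-uncertainty conversions just listed, following the Eurachem/CITAC conventions; the half-width value is illustrative.

    ```python
    import math

    # Convert a certificate half-width a into a standard uncertainty u(x)
    # according to the assumed distribution.
    def standard_uncertainty(a, distribution):
        if distribution == "rectangular":  # limits only, no confidence level
            return a / math.sqrt(3)
        if distribution == "triangular":   # central values more likely
            return a / math.sqrt(6)
        raise ValueError(distribution)

    print(standard_uncertainty(0.5, "rectangular"))  # ~0.289
    print(standard_uncertainty(0.5, "triangular"))   # ~0.204
    ```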

  18. Using measurement uncertainty in decision-making and conformity assessment

    NASA Astrophysics Data System (ADS)

    Pendrill, L. R.

    2014-08-01

    Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc.), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular, the handling of qualitative observations and of impact, are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis, referred to in statistical acceptance sampling as inspection by ‘variable’ or by ‘attribute’ (i.e. go/no-go decisions), depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such costs are associated with a variety of consequences, such as unnecessary re-manufacturing by the supplier as well as various consequences for the customer, arising from incorrect measures of quantity, poor product performance and so on.
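
    One widely used decision rule that shares risk in this way is guarded acceptance, where the acceptance zone is shrunk by the expanded uncertainty U to control the consumer's risk; the sketch below shows this one common rule with illustrative limits, and is not necessarily the paper's recommendation.

    ```python
    # Guarded acceptance: accept only if the measured value sits inside
    # the specification zone by more than the expanded uncertainty U.
    def conforms(measured, lower, upper, U):
        return (lower + U) <= measured <= (upper - U)

    # Illustrative specification 3.9-5.6 and U = 0.18 (arbitrary units).
    print(conforms(5.50, 3.9, 5.6, 0.18))  # False: within U of the limit
    print(conforms(5.00, 3.9, 5.6, 0.18))  # True: comfortably inside
    ```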

  19. Using Bayes factors for multi-factor, biometric authentication

    NASA Astrophysics Data System (ADS)

    Giffin, A.; Skufca, J. D.; Lao, P. A.

    2015-01-01

    Multi-factor/multi-modal authentication systems are becoming the de facto industry standard. Traditional methods typically use rates that are point estimates and lack a good measure of uncertainty. Additionally, multiple factors are typically fused together in an ad hoc manner. To be consistent, as well as to establish and make proper use of uncertainties, we use a Bayesian method that will update our estimates and uncertainties as new information presents itself. Our algorithm compares competing classes (such as genuine vs. impostor) using Bayes Factors (BF). The importance of this approach is that we not only accept or reject one model (class), but compare it to others to make a decision. We show using a Receiver Operating Characteristic (ROC) curve that using BF for determining class will always perform at least as well as the traditional combining of factors, such as a voting algorithm. As the uncertainty decreases, the BF result continues to exceed the traditional methods' results.
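
    A minimal sketch of Bayes-factor fusion for two independent modalities, with invented Gaussian score models; the paper's actual score distributions and decision threshold are not specified here.

    ```python
    import math
    from scipy.stats import norm

    # Assumed per-modality match-score models (illustrative only).
    genuine = norm(loc=0.8, scale=0.10)
    impostor = norm(loc=0.4, scale=0.15)

    def log_bayes_factor(scores):
        # BF = P(scores | genuine) / P(scores | impostor); independence
        # lets the per-factor log-likelihood ratios simply add.
        return sum(genuine.logpdf(s) - impostor.logpdf(s) for s in scores)

    log_bf = log_bayes_factor([0.75, 0.68])   # e.g. face + fingerprint
    decision = "accept" if log_bf > math.log(10) else "reject"
    print(f"log BF = {log_bf:.2f} -> {decision}")
    ```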

  20. Error-tradeoff and error-disturbance relations for incompatible quantum measurements.

    PubMed

    Branciard, Cyril

    2013-04-23

    Heisenberg's uncertainty principle is one of the main tenets of quantum theory. Nevertheless, and despite its fundamental importance for our understanding of quantum foundations, there has been some confusion in its interpretation: Although Heisenberg's first argument was that the measurement of one observable on a quantum state necessarily disturbs another incompatible observable, standard uncertainty relations typically bound the indeterminacy of the outcomes when either one or the other observable is measured. In this paper, we quantify precisely Heisenberg's intuition. Even if two incompatible observables cannot be measured together, one can still approximate their joint measurement, at the price of introducing some errors with respect to the ideal measurement of each of them. We present a tight relation characterizing the optimal tradeoff between the error on one observable vs. the error on the other. As a particular case, our approach allows us to characterize the disturbance of an observable induced by the approximate measurement of another one; we also derive a stronger error-disturbance relation for this scenario.

  1. Collective efficacy: How is it conceptualized, how is it measured, and does it really matter for understanding perceived neighborhood crime and disorder?

    PubMed Central

    Hipp, John R.

    2016-01-01

    Building on the insights of the self-efficacy literature, this study highlights that collective efficacy is a collective perception that comes from a process. This study emphasizes that 1) there is updating, as there are feedback effects from success or failure by the group to the perception of collective efficacy, and 2) this updating raises the importance of accounting for members' degree of uncertainty regarding neighborhood collective efficacy. Using a sample of 113 block groups in three rural North Carolina counties, this study finds evidence of updating as neighborhoods perceiving more crime or disorder reported less collective efficacy at the next time point. Furthermore, collective efficacy was only associated with lower perceived disorder at the next time point when it occurred in highly cohesive neighborhoods. Finally, neighborhoods with more perceived disorder and uncertainty regarding collective efficacy at one time point had lower levels of collective efficacy at the next time point, illustrating the importance of uncertainty along with updating. PMID:27069285

  2. Adaptive Photothermal Emission Analysis Techniques for Robust Thermal Property Measurements of Thermal Barrier Coatings

    NASA Astrophysics Data System (ADS)

    Valdes, Raymond

    The characterization of thermal barrier coating (TBC) systems is increasingly important because they enable gas turbine engines to operate at high temperatures and efficiency. Phase of photothermal emission analysis (PopTea) has been developed to analyze the thermal behavior of the ceramic top-coat of TBCs, as a nondestructive and noncontact method for measuring thermal diffusivity and thermal conductivity. Most TBC applications are on actively cooled high-temperature turbine blades, which makes it difficult to precisely model heat transfer in the metallic subsystem. This reduces the ability of rote thermal modeling to reflect the actual physical conditions of the system and can lead to higher uncertainty in measured thermal properties. This dissertation investigates fundamental issues underpinning robust thermal property measurements that are adaptive to non-specific, complex, and evolving system characteristics using the PopTea method. A generic and adaptive subsystem PopTea thermal model was developed to account for complex geometry beyond a well-defined coating and substrate system. Without a priori knowledge of the subsystem characteristics, two different measurement techniques were implemented using the subsystem model. In the first technique, the properties of the subsystem were resolved as part of the PopTea parameter estimation algorithm; the second technique independently resolved the subsystem properties using a differential "bare" subsystem. The confidence in thermal properties measured using the generic subsystem model is similar to that from a standard PopTea measurement on a "well-defined" TBC system. Non-systematic bias error on experimental observations in PopTea measurements due to generic thermal model discrepancies was also mitigated using a regression-based sensitivity analysis. The sensitivity analysis reported measurement uncertainty and was developed into a data reduction method to filter out these "erroneous" observations. It was found that the adverse impact of bias error can be greatly reduced, leaving measurement observations with only random Gaussian noise in PopTea thermal property measurements. Quantifying the influence of the coating-substrate interface in PopTea measurements is important to resolving the thermal conductivity of the coating. However, the reduced significance of this interface in thicker coating systems can give rise to large uncertainties in thermal conductivity measurements. A first step towards improving PopTea measurements for such circumstances has been taken by implementing absolute temperature measurements using harmonically-sustained two-color pyrometry. Although promising, even small uncertainties in thermal emission observations were found to lead to significant noise in temperature measurements. However, PopTea analysis on bulk graphite samples was able to resolve their thermal conductivity to the expected literature values.

  3. Instrumentation-related uncertainty of reflectance and transmittance measurements with a two-channel spectrophotometer.

    PubMed

    Peest, Christian; Schinke, Carsten; Brendel, Rolf; Schmidt, Jan; Bothe, Karsten

    2017-01-01

    Spectrophotometers are operated in numerous fields of science and industry for a variety of applications. In order to provide confidence in the measured data, analyzing the associated uncertainty is valuable. However, the uncertainty of the measurement results is often unknown or reduced to sample-related contributions. In this paper, we describe our approach for the systematic determination of the measurement uncertainty of the commercially available two-channel spectrophotometer Agilent Cary 5000, in accordance with the Guide to the Expression of Uncertainty in Measurement. We focus on the instrumentation-related uncertainty contributions rather than the specific application and thus outline a general procedure which can be adapted for other instruments. Moreover, we discover a systematic signal deviation due to the inertia of the measurement amplifier and develop and apply a correction procedure. Thereby we increase the usable dynamic range of the instrument by more than one order of magnitude. We present methods for the quantification of the uncertainty contributions and combine them into an uncertainty budget for the device.

  4. Standards for the validation of remotely sensed albedo products

    NASA Astrophysics Data System (ADS)

    Adams, Jennifer

    2015-04-01

    Land surface albedo, defined as the fraction of incident shortwave radiation reflected by a surface, is an important component of the Earth's energy balance and is one of many Essential Climate Variables (ECVs) that can be retrieved from space through remote sensing. To quantify the accuracy of these products, they must be validated with respect to in-situ measurements of albedo made with an albedometer. Whilst accepted standards exist for the calibration of albedometers, standards for in-situ measurement schemes and their use in validation procedures have yet to be developed. It is essential that we can assess the quality of remotely sensed albedo data and identify traceable sources of uncertainty in the process of providing these data. As a result of the current lack of accepted standards for in-situ albedo retrieval and validation procedures, we are not yet able to identify and quantify traceable sources of uncertainty. Establishing standard protocols for in-situ retrievals for the validation of global albedo products would allow inter-product use and comparison, in addition to product standardization. Accordingly, this study aims to assess the quality of in-situ albedo retrieval schemes and identify sources of uncertainty, specifically in vegetation environments. A 3D Monte Carlo ray tracing model will be used to simulate albedometer instruments in complex 3D vegetation canopies. To determine sources of uncertainty, factors that influence albedo measurement uncertainty were identified and will subsequently be examined: 1. Time of day (solar zenith angle) 2. Ecosystem type 3. Placement of the albedometer within the ecosystem 4. Height of the albedometer above the canopy 5. Clustering within the ecosystem. A variety of 3D vegetation canopies have been generated to cover the main ecosystems found globally, different seasons, and different plant distributions. Canopies generated include birchstand and pinestand forests for summer and winter, savanna, shrubland, cropland and citrus orchard. All canopies were simulated for a 100x100 m area to best represent in-situ measurement conditions. Preliminary tests have been conducted, firstly identifying the spectral range required to estimate broadband albedo (BBA) and secondly determining the hyper-spectral intervals required to calculate BBA from spectral albedo. Final results are expected to identify, for the aforementioned factors and at a specified confidence level and within 3% accuracy, when the uncertainty of an in-situ measurement falls within or outside these criteria. As the uncertainty of in-situ measurements should be assessed on an individual basis, accounting for relevant factors, this study aims to document traceable uncertainty sources in in-situ albedo retrieval for a specific scenario.

  5. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing very good performance. A solid understanding and quantification of the uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.

  6. Experimental joint quantum measurements with minimum uncertainty.

    PubMed

    Ringbauer, Martin; Biggerstaff, Devon N; Broome, Matthew A; Fedrizzi, Alessandro; Branciard, Cyril; White, Andrew G

    2014-01-17

    Quantum physics constrains the accuracy of joint measurements of incompatible observables. Here we test tight measurement-uncertainty relations using single photons. We implement two independent, idealized uncertainty-estimation methods, the three-state method and the weak-measurement method, and adapt them to realistic experimental conditions. Exceptional quantum state fidelities of up to 0.999 98(6) allow us to verge upon the fundamental limits of measurement uncertainty.

  7. Investigating Uncertainty in Predicting Carbon Dynamics in North American Biomes: Putting Support-Effect Bias in Perspective

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Brass, Jim (Technical Monitor)

    2001-01-01

    A fundamental strategy in NASA's Earth Observing System's (EOS) monitoring of vegetation and its contribution to the global carbon cycle is to rely on deterministic, process-based ecosystem models to make predictions of carbon flux over large regions. These models are parameterized (that is, the input variables are derived) using remotely sensed images such as those from the Moderate Resolution Imaging Spectroradiometer (MODIS), ground measurements and interpolated maps. Since early applications of these models, investigators have noted that results depend partly on the spatial support of the input variables. In general, the larger the support of the input data, the greater the chance that the effects of important components of the ecosystem will be averaged out. A review of previous work shows that using large supports can cause either positive or negative bias in carbon flux predictions. To put the magnitude and direction of these biases in perspective, we must quantify the range of uncertainty on our best measurements of carbon-related variables made on equivalent areas. In other words, support-effect bias should be placed in the context of prediction uncertainty from other sources. If the range of uncertainty at the smallest support is less than the support-effect bias, more research emphasis should probably be placed on support sizes that are intermediate between those of field measurements and MODIS. If the uncertainty range at the smallest support is larger than the support-effect bias, the accuracy of MODIS-based predictions will be difficult to quantify and more emphasis should be placed on field-scale characterization and sampling. This talk will describe methods to address these issues using a field measurement campaign in North America and "upscaling" using geostatistical estimation and simulation.

  8. Uncertainty analysis of gross primary production partitioned from net ecosystem exchange measurements

    NASA Astrophysics Data System (ADS)

    Raj, R.; Hamm, N. A. S.; van der Tol, C.; Stein, A.

    2015-08-01

    Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is used increasingly to validate process-based simulators and remote sensing-derived estimates of simulated GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps. This can be achieved by using a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature. In addition, the NRH model provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter. Markov chain Monte Carlo (MCMC) simulation was used to update the prior distribution of each NRH parameter. This allowed us to estimate the uncertainty in the separated GPP at half-hourly time steps. This yielded the posterior distribution of GPP at each half hour and allowed the quantification of uncertainty. The time series of posterior distributions thus obtained allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters. The results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
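    A minimal sketch of this kind of separation, assuming synthetic half-hourly data, flat priors, and a plain Metropolis sampler; the parameter values, the fixed noise level, and the light-independent respiration term are illustrative assumptions rather than the study's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def nrh_gpp(par_i, alpha, gpp_max, theta):
    """Non-rectangular hyperbola light response (theta = degree of curvature)."""
    s = alpha * par_i + gpp_max
    return (s - np.sqrt(s**2 - 4 * theta * alpha * par_i * gpp_max)) / (2 * theta)

# Synthetic half-hourly data: NEE = Reco - GPP + noise (uptake positive in GPP).
par = rng.uniform(0, 1500, 200)                     # PAR, umol m-2 s-1
true = dict(alpha=0.05, gpp_max=20.0, theta=0.7, reco=3.0)
nee_obs = true["reco"] - nrh_gpp(par, true["alpha"], true["gpp_max"], true["theta"])
nee_obs += rng.normal(0, 1.0, par.size)

def log_post(p):
    alpha, gpp_max, theta, reco = p
    if not (0 < alpha < 1 and 0 < gpp_max < 60 and 0 < theta < 1 and 0 < reco < 20):
        return -np.inf                              # flat priors on plausible ranges
    resid = nee_obs - (reco - nrh_gpp(par, alpha, gpp_max, theta))
    return -0.5 * np.sum(resid**2)                  # Gaussian likelihood, sigma = 1

# Metropolis MCMC: the chain of NRH parameters yields a posterior for GPP
# at every half hour, hence an uncertainty interval on the separated flux.
p = np.array([0.03, 15.0, 0.5, 2.0])
lp = log_post(p)
chain = []
for _ in range(20000):
    q = p + rng.normal(0, [0.002, 0.5, 0.02, 0.1])
    lq = log_post(q)
    if np.log(rng.uniform()) < lq - lp:
        p, lp = q, lq
    chain.append(p.copy())
chain = np.array(chain[5000:])                      # discard burn-in

gpp_samples = nrh_gpp(800.0, chain[:, 0], chain[:, 1], chain[:, 2])
print("GPP at PAR=800: median %.2f, 95%% CI [%.2f, %.2f]" %
      (np.median(gpp_samples), *np.percentile(gpp_samples, [2.5, 97.5])))
```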

  9. Measuring Uranium Decay Rates for Advancement of Nuclear Forensics and Geochronology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsons-Davis, Tashi

    Radioisotopic dating techniques are highly valuable tools for understanding the history of physical and chemical processes in materials related to planetary sciences and nuclear forensics, and rely on accurate knowledge of decay constants and their uncertainties. The decay constants of U-238 and U-235 are particularly important to Earth science, and often the measured values with lowest reported uncertainties are applied, although they have not been independently verified with similar precision. New direct measurements of the decay constants of U-238, Th-234, U-235, and U-234 were completed, using a range of analytical approaches. An overarching goal of the project was to ensure the quality of results, including metrological traceability to facilitate implementation across diverse disciplines. This report presents preliminary results of these experiments, as a few final measurements and calculations are still in progress.

  10. What drives uncertainty in model diagnoses of carbon dynamics in southern US forests: climate, vegetation, disturbance, or model parameters?

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Gu, H.; Williams, C. A.

    2017-12-01

    Results from terrestrial carbon cycle models have multiple sources of uncertainty, each with its own behavior and range. Their relative importance and how they combine have received little attention. This study investigates how various sources of uncertainty propagate, temporally and spatially, in CASA-Disturbance (CASA-D). CASA-D simulates the impact of climatic forcing and disturbance legacies on forest carbon dynamics with the following steps. First, we infer annual growth and mortality rates from measured biomass stocks (FIA) over time and disturbance (e.g., fire, harvest, bark beetle) to represent annual post-disturbance carbon flux trajectories across forest types and site productivity settings. Then, annual carbon fluxes are estimated from these trajectories using time since disturbance, which is inferred from biomass (NBCD 2000) and disturbance maps (NAFD, MTBS and ADS). Finally, we apply monthly climatic scalars derived from default CASA to temporally distribute annual carbon fluxes to each month. This study assesses carbon flux uncertainty from two sources: driving data, including climatic and forest biomass inputs, and the three most sensitive parameters in CASA-D, namely maximum light use efficiency, temperature sensitivity of soil respiration (Q10) and optimum temperature, identified using EFAST (Extended Fourier Amplitude Sensitivity Testing). We quantify model uncertainties from each source and report their relative importance in estimating the forest carbon sink/source in the southeastern United States from 2003 to 2010.

  11. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.

  12. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effect of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic developments in flood prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk based flood management approach. In a risk based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analysis of management measures. Such flood risk assessments contain, however, considerable uncertainties. This is the result of uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focused on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty will also be attributed to the different input parameters using a variance based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will help decision makers make better informed decisions, and attributing this uncertainty to the input parameters helps identify which parameters contribute most to the uncertainty in the final estimate and therefore deserve additional attention in further research.
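    A toy version of such a Monte Carlo risk assessment with a variance-based sensitivity measure is sketched below. The input distributions, the depth-damage curve, and the binning estimator of the first-order sensitivity index are all illustrative choices, not the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Uncertain inputs of a monetary flood risk estimate (illustrative distributions):
depth = rng.lognormal(mean=0.5, sigma=0.3, size=n)       # flood depth [m]
value = rng.normal(2.0e6, 3.0e5, size=n)                 # exposed asset value [EUR]
dd_scale = rng.uniform(0.8, 1.2, size=n)                 # depth-damage curve scaling

# Depth-damage function: damage fraction saturating with depth.
damage_fraction = dd_scale * (1 - np.exp(-depth / 2.0))
risk = damage_fraction * value                           # damage per event [EUR]

print(f"mean damage: {risk.mean():.3e} EUR, 90% interval: "
      f"[{np.percentile(risk, 5):.3e}, {np.percentile(risk, 95):.3e}]")

def first_order_index(x, y, bins=50):
    """Crude first-order (variance-based) sensitivity index: variance of the
    conditional mean E[Y|X_i], estimated by binning X_i, over Var(Y)."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return np.average((cond_means - y.mean())**2, weights=counts) / y.var()

for name, x in [("depth", depth), ("asset value", value), ("curve scale", dd_scale)]:
    print(f"S_{name}: {first_order_index(x, risk):.2f}")
```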

  13. Defining and Measuring Diagnostic Uncertainty in Medicine: A Systematic Review.

    PubMed

    Bhise, Viraj; Rajan, Suja S; Sittig, Dean F; Morgan, Robert O; Chaudhary, Pooja; Singh, Hardeep

    2018-01-01

    Physicians routinely encounter diagnostic uncertainty in practice. Despite its impact on health care utilization, costs and error, measurement of diagnostic uncertainty is poorly understood. We conducted a systematic review to describe how diagnostic uncertainty is defined and measured in medical practice. We searched OVID Medline and PsycINFO databases from inception until May 2017 using a combination of keywords and Medical Subject Headings (MeSH). Additional search strategies included manual review of references identified in the primary search, use of a topic-specific database (AHRQ-PSNet) and expert input. We specifically focused on articles that (1) defined diagnostic uncertainty; (2) conceptualized diagnostic uncertainty in terms of its sources, complexity of its attributes or strategies for managing it; or (3) attempted to measure diagnostic uncertainty. We identified 123 articles for full review, none of which defined diagnostic uncertainty. Three attributes of diagnostic uncertainty were relevant for measurement: (1) it is a subjective perception experienced by the clinician; (2) it has the potential to impact diagnostic evaluation-for example, when inappropriately managed, it can lead to diagnostic delays; and (3) it is dynamic in nature, changing with time. Current methods for measuring diagnostic uncertainty in medical practice include: (1) asking clinicians about their perception of uncertainty (surveys and qualitative interviews), (2) evaluating the patient-clinician encounter (such as by reviews of medical records, transcripts of patient-clinician communication and observation), and (3) experimental techniques (patient vignette studies). The term "diagnostic uncertainty" lacks a clear definition, and there is no comprehensive framework for its measurement in medical practice. Based on review findings, we propose that diagnostic uncertainty be defined as a "subjective perception of an inability to provide an accurate explanation of the patient's health problem." Methodological advancements in measuring diagnostic uncertainty can improve our understanding of diagnostic decision-making and inform interventions to reduce diagnostic errors and overuse of health care resources.

  14. National Institute of Standards and Technology measurement service of the optical properties of biomedical phantoms: Current status

    PubMed Central

    Lemaillet, Paul; Cooksey, Catherine C.; Levine, Zachary H.; Pintar, Adam L.; Hwang, Jeeseong; Allen, David W.

    2016-01-01

    The National Institute of Standards and Technology (NIST) has maintained scales for reflectance and transmittance over several decades. The scales are primarily intended for regular transmittance, mirrors, and solid surface scattering diffusers. The rapidly growing area of optical medical imaging needs a scale for volume scattering of diffuse materials that are used to mimic the optical properties of tissue. Such materials are used as phantoms to evaluate and validate instruments under development intended for clinical use. To address this need, a double-integrating sphere based instrument has been installed to measure the optical properties of tissue-mimicking phantoms. The basic system and methods have been described in previous papers. An important attribute in establishing a viable calibration service is the estimation of measurement uncertainties. The use of custom models and comparisons with other established scales enabled estimation of the measurement uncertainties. Here, we describe the continuation of those efforts to advance the understanding of the uncertainties through two independent measurements: the bidirectional reflectance distribution function and the bidirectional transmittance distribution function of a commercially available solid biomedical phantom. A Monte Carlo-based model is used and the resulting optical properties are compared to the values provided by the phantom manufacturer. PMID:27453623

  15. Nitrous oxide emissions from open-lot cattle feedyards: A review

    USDA-ARS?s Scientific Manuscript database

    Nitrous oxide volatilization from concentrated animal feeding operations (CAFO), including cattle feedyards, has become an important research topic. However, there are limitations to current measurement techniques, uncertainty in the magnitude of feedyard nitrous oxide fluxes and a lack of effective...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luis, Alfredo

    We show within a very simple framework that different measures of fluctuations lead to uncertainty relations resulting in contradictory conclusions. More specifically we focus on Tsallis and Rényi entropic uncertainty relations and we get that the minimum joint uncertainty states for some fluctuation measures are the maximum joint uncertainty states of other fluctuation measures, and vice versa.

  17. Impacts of Future Climate Change on California Perennial Crop Yields: Model Projections with Climate and Crop Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobell, D; Field, C; Cahill, K

    2006-01-10

    Most research on the agricultural impacts of climate change has focused on the major annual crops, yet perennial cropping systems are less adaptable and thus potentially more susceptible to damage. Improved assessments of yield responses to future climate are needed to prioritize adaptation strategies in the many regions where perennial crops are economically and culturally important. These impact assessments, in turn, must rely on climate and crop models that contain often poorly defined uncertainties. We evaluated the impact of climate change on six major perennial crops in California: wine grapes, almonds, table grapes, oranges, walnuts, and avocados. Outputs from multiple climate models were used to evaluate climate uncertainty, while multiple statistical crop models, derived by resampling historical databases, were used to address crop response uncertainties. We find that, despite these uncertainties, climate change in California is very likely to put downward pressure on yields of almonds, walnuts, avocados, and table grapes by 2050. Without CO2 fertilization or adaptation measures, projected losses range from 0 to >40% depending on the crop and the trajectory of climate change. Climate change uncertainty generally had a larger impact on projections than crop model uncertainty, although the latter was substantial for several crops. Opportunities for expansion into cooler regions are identified, but this adaptation would require substantial investments and may be limited by non-climatic constraints. Given the long time scales for growth and production of orchards and vineyards (~30 years), climate change should be an important factor in selecting perennial varieties and deciding whether and where perennials should be planted.
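    The crop-response side of this uncertainty treatment can be sketched as a bootstrap over historical years: refit the statistical yield model on each resample and read off the spread of projected changes. The synthetic temperature-yield record and the +2 °C scenario below are placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "historical" record: yield declines with growing-season temperature
# (an illustrative stand-in for the historical databases used in the study).
temp = rng.normal(18.0, 1.0, 40)                 # 40 years of mean temperature [C]
yield_obs = 10.0 - 0.8 * (temp - 18.0) + rng.normal(0, 0.5, temp.size)

# Crop-response uncertainty: refit the statistical crop model on bootstrap
# resamples of the historical years.
n_boot = 2000
warming = 2.0                                    # one climate scenario: +2 C
changes = np.empty(n_boot)
for b in range(n_boot):
    i = rng.integers(0, temp.size, temp.size)    # resample years with replacement
    slope, intercept = np.polyfit(temp[i], yield_obs[i], 1)
    base = slope * 18.0 + intercept
    changes[b] = 100 * (slope * (18.0 + warming) + intercept - base) / base

print(f"projected yield change under +{warming} C: "
      f"median {np.median(changes):.1f}%, 90% interval "
      f"[{np.percentile(changes, 5):.1f}%, {np.percentile(changes, 95):.1f}%]")
```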

  18. Systematic evaluation of an atomic clock at 2 × 10−18 total uncertainty

    PubMed Central

    Nicholson, T.L.; Campbell, S.L.; Hutson, R.B.; Marti, G.E.; Bloom, B.J.; McNally, R.L.; Zhang, W.; Barrett, M.D.; Safronova, M.S.; Strouse, G.F.; Tew, W.L.; Ye, J.

    2015-01-01

    The pursuit of better atomic clocks has advanced many research areas, providing better quantum state control, new insights in quantum science, tighter limits on fundamental constant variation and improved tests of relativity. The record for the best stability and accuracy is currently held by optical lattice clocks. Here we take an important step towards realizing the full potential of a many-particle clock with a state-of-the-art stable laser. Our 87Sr optical lattice clock now achieves fractional stability of 2.2 × 10−16 at 1 s. With this improved stability, we perform a new accuracy evaluation of our clock, reducing many systematic uncertainties that limited our previous measurements, such as those in the lattice ac Stark shift, the atoms' thermal environment and the atomic response to room-temperature blackbody radiation. Our combined measurements have reduced the total uncertainty of the JILA Sr clock to 2.1 × 10−18 in fractional frequency units. PMID:25898253

  19. Impact of a Ground Network of Miniaturized Laser Heterodyne Radiometers (mini-LHRs) on Global Carbon Flux Estimates

    NASA Astrophysics Data System (ADS)

    DiGregorio, A.; Wilson, E. L.; Palmer, P. I.; Mao, J.; Feng, L.

    2017-12-01

    We present the simulated impact of a small (50 instrument) ground network of NASA Goddard Space Flight Center's miniaturized laser heterodyne radiometer (mini-LHR), a small, low-cost ($50k), portable, and high-precision CH4 and CO2 measuring instrument. Partnered with AERONET as a non-intrusive accessory, the mini-LHR is able to leverage the 500+ instrument AERONET network for rapid network deployment and testing, and simultaneously retrieve co-located aerosol data, an important input for satellite measurements. This observing system simulation experiment (OSSE) uses the 3-D GEOS-Chem chemistry transport model and 50 strategically selected sites to model the flux estimate uncertainty reduction of both TCCON and mini-LHR instruments. We found that 50 mini-LHR sites are capable of reducing global flux uncertainty by up to 70%, with local improvements in the Southern Hemisphere reaching 90%. Our studies show that addition of the mini-LHR to current ground networks will play a major role in the reduction of global carbon flux uncertainty.

  20. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    PubMed

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. In Situ Measurement of Aerosol Extinction

    NASA Technical Reports Server (NTRS)

    Strawa, Anthony W.; Castaneda, R.; Owano, T. G.; Bear, D.; Gore, Warren J. (Technical Monitor)

    2001-01-01

    Aerosols are important contributors to the radiative forcing in the atmosphere. Much of the uncertainty in our knowledge of climate forcing is due to uncertainties in the radiative forcing due to aerosols as illustrated in the IPCC reports of the last ten years. Improved measurement of aerosol optical properties, therefore, is critical to an improved understanding of atmospheric radiative forcing. Additionally, attempts to reconcile in situ and remote measurements of aerosol radiative properties have generally not been successful. This is due in part to the fact that it has been impossible to measure aerosol extinction in situ in the past. In this presentation we introduce a new instrument that employs the techniques used in cavity ringdown spectroscopy to measure the aerosol extinction and scattering coefficients in situ. A prototype instrument has been designed and tested in the lab and the field. It is capable of measuring the aerosol extinction coefficient to 2×10⁻⁶ m⁻¹. This prototype instrument is described and results are presented.
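    The data reduction behind a cavity ring-down extinction measurement is compact enough to state directly. The sketch below uses the textbook ring-down relation; the relation itself is standard, but it is an assumption here that this prototype reduces its data the same way, and the example ring-down times are invented.

```python
# Standard cavity ring-down relation for retrieving extinction: with the
# sample filling the cavity, alpha = (1/c) * (1/tau - 1/tau0), where tau and
# tau0 are the ring-down times with and without aerosol present.
C = 299_792_458.0          # speed of light [m/s]

def extinction(tau, tau0):
    """Aerosol extinction coefficient [1/m] from ring-down times [s]."""
    return (1.0 / C) * (1.0 / tau - 1.0 / tau0)

# Illustrative numbers: a 1% shortening of a 100-microsecond ring-down time.
tau0 = 100e-6
tau = 99e-6
print(f"extinction = {extinction(tau, tau0):.2e} m^-1")   # ~3.4e-7 m^-1
```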

  2. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  3. Determination of the Solar Energy Microclimate of the United States Using Satellite Data

    NASA Technical Reports Server (NTRS)

    Vonderharr, T. H.; Ellis, J. S.

    1978-01-01

    The determination of total solar energy reaching the ground over the United States using measurements from meteorological satellites as the basic data set is examined. The methods of satellite data processing are described. Uncertainty analysis and comparison of results with well calibrated surface pyranometers are used to estimate the probable error in the satellite-based determination of ground insolation. It is 10 to 15 percent for daily information, and about 5 percent for monthly values. However, the natural space and time variability of insolation is much greater than the uncertainty in the method. The most important aspect of the satellite-based technique is the ability to determine the solar energy reaching the ground over small areas where no other measurements are available. Thus, it complements the widely spaced solar radiation measurement network of ground stations.

  4. Uncertainty of Acute Stroke Patients: A Cross-sectional Descriptive and Correlational Study.

    PubMed

    Ni, Chunping; Peng, Jing; Wei, Yuanyuan; Hua, Yan; Ren, Xiaoran; Su, Xiangni; Shi, Ruijie

    2018-06-12

    Uncertainty is a chronic and pervasive source of psychological distress for patients and plays an important role in the rehabilitation of stroke survivors. Little is known about the level and correlates of uncertainty among patients in the acute phase of stroke. The purposes of this study were to describe the uncertainty of patients in the acute phase of stroke and to explore characteristics of patients associated with that uncertainty. A cross-sectional descriptive and correlational study was conducted with a convenience sample of 451 consecutive hospitalized acute stroke patients recruited from the neurology departments of 2 general hospitals in China. Uncertainty was measured using the Chinese version of the Mishel Uncertainty in Illness Scale for Adults on the fourth day of patients' admission. The patients had moderately high Mishel Uncertainty in Illness Scale for Adults scores (mean [SD], 74.37 [9.22]) in the acute phase of stroke. A total of 95.2% and 2.9% of patients had moderate and high levels of uncertainty, respectively. The mean (SD) score of ambiguity (3.05 [0.39]) was higher than that of complexity (2.88 [0.52]). Each of the following characteristics was independently associated with greater uncertainty: functional status (P = .000), suffering from other chronic diseases (P = .000), time since the first-ever stroke (P = .000), self-evaluated economic pressure (P = .000), family monthly income (P = .001), educational level (P = .006), and self-evaluated severity of disease (P = .000). Patients experienced persistent, moderately high uncertainty in the acute phase of stroke. Ameliorating uncertainty should be an integral part of the rehabilitation program. Better understanding of uncertainty and its associated characteristics may help nurses identify patients at the highest risk who may benefit from targeted interventions.

  5. Estimation of uncertainty for contour method residual stress measurements

    DOE PAGES

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; ...

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  6. Monkeys and humans take local uncertainty into account when localizing a change

    PubMed Central

    Devkar, Deepna; Wright, Anthony A.; Ma, Wei Ji

    2017-01-01

    Since sensory measurements are noisy, an observer is rarely certain about the identity of a stimulus. In visual perception tasks, observers generally take their uncertainty about a stimulus into account when doing so helps task performance. Whether the same holds in visual working memory tasks is largely unknown. Ten human and two monkey subjects localized a single change in orientation between a sample display containing three ellipses and a test display containing two ellipses. To manipulate uncertainty, we varied the reliability of orientation information by making each ellipse more or less elongated (two levels); reliability was independent across the stimuli. In both species, a variable-precision encoding model equipped with an “uncertainty–indifferent” decision rule, which uses only the noisy memories, fitted the data poorly. In both species, a much better fit was provided by a model in which the observer also takes the levels of reliability-driven uncertainty associated with the memories into account. In particular, a measured change in a low-reliability stimulus was given lower weight than the same change in a high-reliability stimulus. We did not find strong evidence that observers took reliability-independent variations in uncertainty into account. Our results illustrate the importance of studying the decision stage in comparison tasks and provide further evidence for evolutionary continuity of working memory systems between monkeys and humans. PMID:28877535

  7. On measurement of the acoustic nonlinearity parameter using the finite amplitude insertion substitution (FAIS) technique

    NASA Astrophysics Data System (ADS)

    Zeqiri, Bajram; Cook, Ashley; Rétat, Lise; Civale, John; ter Haar, Gail

    2015-04-01

    The acoustic nonlinearity parameter, B/A, is an important parameter which defines the way a propagating finite amplitude acoustic wave progressively distorts when travelling through any medium. One measurement technique used to determine its value is the finite amplitude insertion substitution (FAIS) method, which has been applied to a range of liquid, tissue and tissue-like media. Importantly, in terms of the achievable measurement uncertainties, it is a relative technique. This paper presents a detailed study of the method, employing a number of novel features. The first of these is the use of a large area membrane hydrophone (30 mm aperture) which is used to record the plane-wave component of the acoustic field. This reduces the influence of diffraction on measurements, enabling studies to be carried out within the transducer near-field, with the interrogating transducer, test cell and detector positioned close to one another, an attribute which assists in controlling errors arising from nonlinear distortion in any intervening water path. The second feature is the development of a model which estimates the influence of finite-amplitude distortion as the acoustic wave travels from the rear surface of the test cell to the detector. It is demonstrated that this can lead to a significant systematic error in B/A measurement whose magnitude and direction depend on the acoustic property contrast between the test material and the water-filled equivalent cell. Good qualitative agreement between the model and experiment is reported. B/A measurements undertaken at (20 ± 0.5) °C are reported for two fluids commonly employed as reference materials within the technical literature: corn oil and ethylene glycol. Samples of an IEC standardised agar-based tissue-mimicking material were also measured. A systematic assessment of measurement uncertainties is presented, giving expanded uncertainties in the range ±7% to ±14%, expressed at a confidence level close to 95%, dependent on specimen details.

  8. Value of travel-time reliability : commuters' route-choice behavior in the Twin Cities.

    DOT National Transportation Integrated Search

    2011-10-01

    Travel-time variability is a noteworthy factor in network performance. It measures the temporal uncertainty experienced by users in their : movement between any two nodes in a network. The importance of the time variance depends on the penalties incu...

  9. The transient divided bar method for laboratory measurements of thermal properties

    NASA Astrophysics Data System (ADS)

    Bording, Thue S.; Nielsen, Søren B.; Balling, Niels

    2016-12-01

    Accurate information on thermal conductivity and thermal diffusivity of materials is of central importance in relation to geoscience and engineering problems involving the transfer of heat. Several methods, including the classical divided bar technique, are available for laboratory measurements of thermal conductivity, but far fewer for thermal diffusivity. We have generalized the divided bar technique to the transient case, in which thermal conductivity, volumetric heat capacity and thereby also thermal diffusivity are measured simultaneously. As the density of samples is easily determined independently, specific heat capacity can also be determined. The finite element formulation provides a flexible forward solution for heat transfer across the bar, and thermal properties are estimated by inverse Monte Carlo modelling. This methodology enables a proper quantification of experimental uncertainties on measured thermal properties and information on their origin. The developed methodology was applied to various materials, including a standard ceramic material and different rock samples, and the measured results were compared with those from the traditional steady-state divided bar and an independent line-source method. All measurements show highly consistent results, with excellent reproducibility and high accuracy. For conductivity the obtained uncertainty is typically 1-3 per cent, and for diffusivity the uncertainty may be reduced to about 3-5 per cent. The main uncertainty originates from the presence of thermal contact resistance associated with the internal interfaces in the bar. These are not resolved during inversion and it is imperative that they are minimized. The proposed procedure is simple and may quite easily be implemented on the many steady-state divided bar systems in operation. A thermally controlled bath, as applied here, may not be needed; simpler systems, such as applying temperature-controlled water directly from a tap, may also be used.

  10. On the Lamb shift in neutral muonic helium

    NASA Astrophysics Data System (ADS)

    Amusia, Miron; Karshenboim, Savely; Ivanov, Vladimir

    2015-05-01

    The neutral muonic helium is an exotic atomic system consisting of an electron, a muon and a nucleus. We consider it as a hydrogen-like atom with a compound nucleus that is itself a hydrogen-like system. There are a number of corrections to the Bohr energy levels, which can all be treated as contributions of generic hydrogen-like theory. While the form of those contributions is the same for all hydrogen-like atoms, their relative numerical importance differs from atom to atom. Here, the leading contribution to the electronic Lamb shift in neutral muonic helium is found in a close analytic form together with the most important corrections. We believe that the Lamb shift in neutral muonic helium is measurable, at least through a measurement of the electronic 1s-2s transition. We present a theoretical prediction for the 1s-2s transition with an uncertainty of 2 ppm (4 GHz), as well as for the 2s-2p Lamb shift with an uncertainty of 0.6 GHz.

  11. Optimization Under Uncertainty for Wake Steering Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N.

    Here, wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
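    The core of the OUU formulation, maximizing expected power over a distribution of realized yaw angles rather than power at the commanded yaw, can be sketched with a toy two-turbine power model. The cosine-cubed yaw loss, the Gaussian wake-deficit term, and the 5-degree yaw error below are illustrative stand-ins, not the study's wake model or numbers; the printed comparison evaluates both commands under the same yaw-error distribution, mirroring the comparison described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

def farm_power(yaw_deg):
    """Toy two-turbine model: yawing the upstream turbine costs it power
    (cos^3 law) but deflects the wake and recovers downstream power.
    Purely illustrative; not an actual engineering wake model."""
    yaw = np.radians(yaw_deg)
    p_up = np.cos(yaw) ** 3
    deficit = 0.5 * np.exp(-(yaw_deg / 15.0) ** 2)   # wake loss shrinks with yaw
    return p_up + (1.0 - deficit)

yaws = np.linspace(0, 40, 401)                       # candidate yaw commands [deg]
eps = rng.normal(0.0, 5.0, 2000)                     # assumed 5-degree yaw error

# Deterministic optimum: assume the commanded yaw is realized exactly.
det_yaw = yaws[np.argmax([farm_power(y) for y in yaws])]

# OUU optimum: maximize *expected* power over Monte Carlo yaw-error samples.
ouu_yaw = yaws[np.argmax([np.mean(farm_power(y + eps)) for y in yaws])]

print(f"deterministic yaw: {det_yaw:.1f} deg, OUU yaw: {ouu_yaw:.1f} deg")
print(f"expected power with yaw error: "
      f"deterministic {np.mean(farm_power(det_yaw + eps)):.3f}, "
      f"OUU {np.mean(farm_power(ouu_yaw + eps)):.3f}")
```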

  12. Optimization Under Uncertainty for Wake Steering Strategies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  13. Optimization Under Uncertainty for Wake Steering Strategies

    NASA Astrophysics Data System (ADS)

    Quick, Julian; Annoni, Jennifer; King, Ryan; Dykes, Katherine; Fleming, Paul; Ning, Andrew

    2017-05-01

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as “wake steering,” in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  14. Optimization Under Uncertainty for Wake Steering Strategies

    DOE PAGES

    Quick, Julian; Annoni, Jennifer; King, Ryan N.; ...

    2017-06-13

    Here, wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  15. A contribution to the calculation of measurement uncertainty and optimization of measuring strategies in coordinate measurement

    NASA Astrophysics Data System (ADS)

    Waeldele, F.

    1983-01-01

    The influence of sample shape deviations on measurement uncertainties and the optimization of computer-aided coordinate measurement were investigated for a circle and a cylinder. Using the complete error propagation law in matrix form, the parameter uncertainties are calculated, taking the correlation between the measurement points into account. Theoretical investigations show that the measuring points have to be equidistantly distributed and that, for a cylindrical body, a measuring point distribution along a cross section is better than along a helical line. The theoretically obtained expressions for calculating the uncertainties prove to be a good basis for estimation. The simple error theory is not satisfactory for this estimation. The complete statistical data analysis theory helps to avoid serious measurement errors and to adjust the number of measuring points to the required measurement uncertainty.
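    The parameter-uncertainty calculation can be illustrated with a least-squares circle fit. The sketch below uses the simplified case of uncorrelated, equal-variance points, where the covariance of the fitted parameters is sigma^2 (J^T J)^-1; the complete propagation law referenced above would replace sigma^2 I by the full covariance matrix of the measured points. Geometry and noise values are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sample n equidistant points on a circle (the distribution the study finds
# optimal) with measurement noise; illustrative radius/noise values.
n, R, sigma = 12, 10.0, 0.005
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
x = R * np.cos(t) + rng.normal(0, sigma, n)
y = R * np.sin(t) + rng.normal(0, sigma, n)

# Gauss-Newton fit of (xc, yc, r): residual is radial distance minus radius.
p = np.array([0.0, 0.0, 9.0])
for _ in range(20):
    dx, dy = x - p[0], y - p[1]
    d = np.hypot(dx, dy)
    res = d - p[2]
    J = np.column_stack([-dx / d, -dy / d, -np.ones(n)])   # d(res)/d(xc, yc, r)
    p += np.linalg.solve(J.T @ J, -J.T @ res)

# Error propagation for uncorrelated, equal-variance points:
cov = sigma**2 * np.linalg.inv(J.T @ J)
print(f"fitted radius: {p[2]:.4f} +/- {np.sqrt(cov[2, 2]):.4f}")
```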

  16. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    NASA Technical Reports Server (NTRS)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
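    The combination step described above reduces to a root-sum-square of the individual standard uncertainties. The sketch below is a minimal illustration with placeholder magnitudes for each component; the method's actual bias/precision bookkeeping and K-factor handling are more detailed.

```python
import math

# Root-sum-square combination of the uncertainty components named above
# (resolution, repeatability, hysteresis, drift, calibration standard).
# All magnitudes are illustrative placeholders, expressed as relative
# standard uncertainties of the measured leak rate.
components = {
    "resolution":           0.010,
    "repeatability":        0.035,   # precision term (often dominant, per the text)
    "hysteresis":           0.015,
    "drift":                0.020,
    "calibration_standard": 0.025,   # bias term traceable to the leak standard
}

u_total = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined relative standard uncertainty: {u_total:.3f}")
print(f"expanded uncertainty (k=2, ~95%): {2 * u_total:.3f}")
```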

  17. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2015-04-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
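    Several of the numerical recipes discussed, a least-squares trend model with seasonal basis functions and a bootstrap that respects autocorrelation, can be condensed as below. The synthetic series, AR(1) noise level, and block length are illustrative; the paper's treatment additionally folds in the per-datum measurement uncertainties, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly anomaly series: trend + seasonal cycle + AR(1) noise,
# standing in for an upper-air temperature record (all values illustrative).
n_years = 20
t = np.arange(n_years * 12) / 12.0
e = np.zeros(t.size)
for i in range(1, t.size):                         # autocorrelated residuals
    e[i] = 0.6 * e[i - 1] + rng.normal(0, 0.3)
y = 0.03 * t + 0.5 * np.sin(2 * np.pi * t) + e     # true trend: 0.03 K/yr

# Regression basis: constant, linear trend, annual harmonics.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Moving-block bootstrap of residuals: preserves autocorrelation, so the
# trend uncertainty is not underestimated the way plain OLS errors would be.
block, n_boot = 24, 2000
trends = np.empty(n_boot)
for b in range(n_boot):
    starts = rng.integers(0, t.size - block, t.size // block + 1)
    e_star = np.concatenate([resid[s:s + block] for s in starts])[:t.size]
    trends[b] = np.linalg.lstsq(X, X @ beta + e_star, rcond=None)[0][1]

print(f"trend: {beta[1]:.4f} K/yr, bootstrap 95% CI "
      f"[{np.percentile(trends, 2.5):.4f}, {np.percentile(trends, 97.5):.4f}]")
```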

  18. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2014-12-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterised and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterised uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).

  19. Uncertainties in s-process nucleosynthesis in massive stars determined by Monte Carlo variations

    NASA Astrophysics Data System (ADS)

    Nishimura, N.; Hirschi, R.; Rauscher, T.; St. J. Murphy, A.; Cescutti, G.

    2017-08-01

    The s-process in massive stars produces the weak component of the s-process (nuclei up to A ≈ 90), in amounts that match solar abundances. For heavier isotopes, such as barium, production through neutron capture is significantly enhanced in very metal-poor stars with fast rotation. However, detailed theoretical predictions for the resulting final s-process abundances have important uncertainties caused both by the underlying uncertainties in the nuclear physics (principally neutron-capture reaction and β-decay rates) as well as by the stellar evolution modelling. In this work, we investigated the impact of nuclear-physics uncertainties relevant to the s-process in massive stars. Using a Monte Carlo based approach, we performed extensive nuclear reaction network calculations that include newly evaluated upper and lower limits for the individual temperature-dependent reaction rates. We found that most of the uncertainty in the final abundances is caused by uncertainties in the neutron-capture rates, while β-decay rate uncertainties affect only a few nuclei near s-process branchings. The s-process in rotating metal-poor stars shows quantitatively different uncertainties and key reactions, although the qualitative characteristics are similar. We confirmed that our results do not significantly change at different metallicities for fast rotating massive stars in the very low metallicity regime. We highlight which of the identified key reactions are realistic candidates for improved measurement by future experiments.

  20. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, D; Spaans, J; Kumaraswamy, L

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% on average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%/2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% on average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published reports or otherwise compared to the results of other users or vendors should clearly indicate whether the Measurement Uncertainty function is used.
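    The arithmetic of the effect is simple: widening the %Diff tolerance raises the fraction of points that pass. The sketch below assumes a plain percent-difference test (no distance-to-agreement component) with made-up dose planes and a ~1% tolerance widening to mirror the magnitudes quoted above.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative planar dose comparison: measured vs calculated dose points.
# (A real gamma-style analysis also applies distance-to-agreement; only the
# %Diff criterion is modeled here.)
calc = rng.uniform(50, 200, 1000)                       # cGy
meas = calc * (1 + rng.normal(0, 0.02, calc.size))      # 2% measurement scatter

def pass_rate(pct_diff_tol):
    """Percent of points whose dose difference is within the %Diff tolerance
    (expressed relative to the calculated dose)."""
    ok = np.abs(meas - calc) <= (pct_diff_tol / 100) * calc
    return 100 * ok.mean()

# The Measurement Uncertainty toggle effectively widens the user tolerance;
# the ~1% widening below mirrors the magnitude reported in this abstract.
print(f"3% criterion, uncertainty off: {pass_rate(3.0):.1f}%")
print(f"3% criterion, uncertainty on (~3%+1%): {pass_rate(4.0):.1f}%")
```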

  1. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  2. Towards physiologically meaningful water-use efficiency estimates from eddy covariance data.

    PubMed

    Knauer, Jürgen; Zaehle, Sönke; Medlyn, Belinda E; Reichstein, Markus; Williams, Christopher A; Migliavacca, Mirco; De Kauwe, Martin G; Werner, Christiane; Keitel, Claudia; Kolari, Pasi; Limousin, Jean-Marc; Linderson, Maj-Lena

    2018-02-01

    Intrinsic water-use efficiency (iWUE) characterizes the physiological control on the simultaneous exchange of water and carbon dioxide in terrestrial ecosystems. Knowledge of iWUE is commonly gained from leaf-level gas exchange measurements, which are inevitably restricted in their spatial and temporal coverage. Flux measurements based on the eddy covariance (EC) technique can overcome these limitations, as they provide continuous and long-term records of carbon and water fluxes at the ecosystem scale. However, vegetation gas exchange parameters derived from EC data are subject to scale-dependent and method-specific uncertainties that compromise their ecophysiological interpretation as well as their comparability among ecosystems and across spatial scales. Here, we use estimates of canopy conductance and gross primary productivity (GPP) derived from EC data to calculate a measure of iWUE (G1, "stomatal slope") at the ecosystem level at six sites comprising tropical, Mediterranean, temperate, and boreal forests. We assess the following six mechanisms potentially causing discrepancies between leaf- and ecosystem-level estimates of G1: (i) non-transpirational water fluxes; (ii) aerodynamic conductance; (iii) meteorological deviations between measurement height and canopy surface; (iv) energy balance non-closure; (v) uncertainties in net ecosystem exchange partitioning; and (vi) physiological within-canopy gradients. Our results demonstrate that an unclosed energy balance caused the largest uncertainties, in particular if it was associated with erroneous latent heat flux estimates. The effect of aerodynamic conductance on G1 was sufficiently captured with a simple representation. G1 was found to be less sensitive to meteorological deviations between canopy surface and measurement height and, given that data are appropriately filtered, to non-transpirational water fluxes. Uncertainties in the derived GPP and physiological within-canopy gradients and their implications for parameter estimates at leaf and ecosystem level are discussed. Our results highlight the importance of adequately considering the sources of uncertainty outlined here when EC-derived water-use efficiency is interpreted in an ecophysiological context. © 2017 John Wiley & Sons Ltd.
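    For the sketch below, the ecosystem-level stomatal slope is obtained by inverting the Medlyn et al. (2011) optimal stomatal model, which is one common definition of this "G1" parameter; whether this matches the paper's exact estimation procedure (fitting rather than pointwise inversion, unit conventions, data filtering) is an assumption, and the example numbers are invented.

```python
import numpy as np

def stomatal_slope(gs, gpp, vpd_kpa, ca=400.0):
    """Invert the Medlyn et al. (2011) stomatal model,
    gs = 1.6 * (1 + g1 / sqrt(D)) * A / Ca  (residual conductance g0 taken as 0),
    for the slope parameter g1 [kPa^0.5].
    Units assumed here: gs, canopy conductance to water vapour [mol m-2 s-1];
    gpp [umol CO2 m-2 s-1]; ca, ambient CO2 [umol mol-1]; vpd_kpa [kPa].
    These conventions are illustrative; real EC processing is more involved."""
    return np.sqrt(vpd_kpa) * (gs * ca / (1.6 * gpp) - 1.0)

# Example half hour: conductance 0.4 mol m-2 s-1, GPP 20 umol m-2 s-1,
# VPD 1.2 kPa (made-up but plausible midday forest values).
g1 = stomatal_slope(gs=0.4, gpp=20.0, vpd_kpa=1.2)
print(f"ecosystem-level stomatal slope G1 ~ {g1:.2f} kPa^0.5")   # ~4.4
```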

  3. Parity violation in electron scattering

    DOE PAGES

    Souder, P.; Paschke, K. D.

    2015-12-22

    Comparing the cross sections for left- and right-handed electrons scattered from various unpolarized nuclear targets allows the small parity-violating asymmetry to be measured. These asymmetry data probe a wide variety of important topics, including searches for new fundamental interactions and features of nuclear structure that cannot be studied with other probes. A special feature of these experiments is that the results are interpreted with remarkably few theoretical uncertainties, which justifies pushing the experiments to the highest possible precision. To measure the small asymmetries accurately, a number of novel experimental techniques have been developed.

  4. Sensitivity analysis of TRX-2 lattice parameters with emphasis on epithermal 238U capture. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomlinson, E.T.; deSaussure, G.; Weisbin, C.R.

    1977-03-01

    The main purpose of the study is the determination of the sensitivity of TRX-2 thermal lattice performance parameters to nuclear cross section data, particularly the epithermal resonance capture cross section of 238U. An energy-dependent sensitivity profile was generated for each performance parameter with respect to the most important cross sections of the various isotopes in the lattice. Uncertainties in the calculated values of the performance parameters due to estimated uncertainties in the basic nuclear data, deduced in this study, were shown to be small compared to the uncertainties in the measured values of the performance parameters and compared to differences among calculations based upon the same data but with different methodologies.

  5. How Well Can We Assess Atmospheric Ozone Changes? The OzoneSonde Data Quality Assessment (O3S-DQA)

    NASA Astrophysics Data System (ADS)

    Tarasick, D. W.; Smit, H. G. J.; Thompson, A. M.; Morris, G. A.; Witte, J. C.; Davies, J.

    2017-12-01

    Ozonesondes are the backbone of the global ozone observing network, and have made inexpensive, accurate measurements of ozone from the ground to 30 km, with high vertical resolution (~100 m), for more than 50 years. The data are used extensively for validation of satellite data products, and are also part of merged satellite data sets and climatologies that are used for trend analyses and as a priori data for satellite retrievals. The importance of ECC sondes for trend analyses and as a stable reference for satellite validation motivates research efforts to better quantify uncertainties in ECC data and to understand changes therein. Comparison with UV-absorption measurements in a number of studies (e.g. JOSIE, BESOS) has shown that small changes in sensor type, preparation, or sensing solution can introduce significant inhomogeneities in long-term sounding records. The major goal of the O3S-DQA is the homogenization of ozonesonde data sets. Essential aspects of this are the detailed estimation of uncertainties and documentation of the reprocessing. Corrections to historical data for known issues may reduce biases but introduce additional uncertainties. We take a systematic approach to quantifying these uncertainties by considering the physical and chemical processes involved, and attempt to place our estimates on a firm theoretical or empirical footing. We discuss stoichiometry, sensing solutions, background current, humidity and temperature corrections to pump flow rate, altitude-dependent pump flow corrections, variations in radiosonde pressure offsets, and normalization of sonde total ozone to spectrophotometric measurements. In the past 20 years ozonesonde precision has improved by a factor of 2, primarily through the adoption of strict standard operating procedures. We identify remaining quality assurance issues that can be better evaluated with further research. We present a "roadmap" for achieving a goal of better than 5% overall uncertainty throughout the global ozonesonde network. Finally, we note that the global network is very uneven; additional sites would be of global benefit. Objective methods of quantifying spatial representativeness can optimize future network design. International cooperation and data sharing will continue to be of immense importance.

  6. The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio

    2016-11-01

    NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than that of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example in molecular biology, greenhouse gas and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two other guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. Such models are indispensable for reducing data from interlaboratory studies, for combining measurement results for the same measurand obtained by different methods, and for characterizing the uncertainty of calibration and analysis functions used in the measurement of force, temperature, or composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and also demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.
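
    The NIST Uncertainty Machine evaluates a user-supplied measurement model both by the GUM's linear approximation and by Monte Carlo sampling. A minimal sketch of the Monte Carlo route for a hypothetical measurement equation (the model and input distributions below are invented for illustration):

      import numpy as np

      rng = np.random.default_rng(42)
      N = 1_000_000

      # Hypothetical measurement model: density rho = m / V.
      m = rng.normal(100.0, 0.05, N)   # mass in g, standard uncertainty 0.05 g
      V = rng.normal(25.00, 0.02, N)   # volume in mL, standard uncertainty 0.02 mL
      rho = m / V

      print(f"rho = {rho.mean():.4f} g/mL, u(rho) = {rho.std(ddof=1):.4f} g/mL")
      print("95% coverage interval:", np.percentile(rho, [2.5, 97.5]))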

  7. On the Monte Carlo simulation of electron transport in the sub-1 keV energy range.

    PubMed

    Thomson, Rowan M; Kawrakow, Iwan

    2011-08-01

    The validity of "classic" Monte Carlo (MC) simulations of electron and positron transport at sub-1 keV energies is investigated in the context of quantum theory. Quantum theory dictates that uncertainties on the position and energy-momentum four-vectors of radiation quanta obey Heisenberg's uncertainty relation; however, these uncertainties are neglected in "classical" MC simulations of radiation transport in which position and momentum are known precisely. Using the quantum uncertainty relation and electron mean free path, the magnitudes of uncertainties on electron position and momentum are calculated for different kinetic energies; a validity bound on the classical simulation of electron transport is derived. In order to satisfy the Heisenberg uncertainty principle, uncertainties of 5% must be assigned to position and momentum for 1 keV electrons in water; at 100 eV, these uncertainties are 17 to 20% and are even larger at lower energies. In gaseous media such as air, these uncertainties are much smaller (less than 1% for electrons with energy 20 eV or greater). The classical Monte Carlo transport treatment is questionable for sub-1 keV electrons in condensed water as uncertainties on position and momentum must be large (relative to electron momentum and mean free path) to satisfy the quantum uncertainty principle. Simulations which do not account for these uncertainties are not faithful representations of the physical processes, calling into question the results of MC track structure codes simulating sub-1 keV electron transport. Further, the large difference in the scale at which quantum effects are important in gaseous and condensed media suggests that track structure measurements in gases are not necessarily representative of track structure in condensed materials on a micrometer or a nanometer scale.
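
    A back-of-the-envelope version of the argument: assign the same relative uncertainty eps to position (as a fraction of the mean free path) and to momentum (as a fraction of p), so that the Heisenberg bound (eps*mfp)*(eps*p) >= hbar/2 gives eps = sqrt(hbar/(2*mfp*p)). The mean-free-path value below is an assumed illustrative number, not taken from the paper.

      import numpy as np

      hbar = 1.054571817e-34   # J s
      m_e  = 9.1093837015e-31  # kg
      eV   = 1.602176634e-19   # J

      def equal_relative_uncertainty(E_eV, mfp_m):
          """Smallest equal relative uncertainty eps on position (vs. mean free
          path) and momentum (vs. p) satisfying (eps*mfp)*(eps*p) >= hbar/2."""
          p = np.sqrt(2.0 * m_e * E_eV * eV)     # non-relativistic momentum
          return np.sqrt(hbar / (2.0 * mfp_m * p))

      # Assumed elastic mean free path of ~1 nm for 1 keV electrons in water.
      print(f"{equal_relative_uncertainty(1000.0, 1e-9):.1%}")  # ~6%, cf. ~5% in the paper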

  8. Value of travel-time reliability : commuters' route-choice behavior in the Twin Cities, phase 2.

    DOT National Transportation Integrated Search

    2012-04-01

    Travel-time variability is a noteworthy factor in network performance. It measures the temporal uncertainty experienced by users in their movement between any two nodes in a network. The importance of the time variance depends on the penalties in...

  9. Lessons learnt on biases and uncertainties in personal exposure measurement surveys of radiofrequency electromagnetic fields with exposimeters.

    PubMed

    Bolte, John F B

    2016-09-01

    Personal exposure measurements of radio frequency electromagnetic fields are important for epidemiological studies and for developing prediction models. Minimizing biases and uncertainties and handling spatial and temporal variability are important aspects of these measurements. This paper reviews the lessons learnt from testing different types of exposimeters and from personal exposure measurement surveys performed between 2005 and 2015. Applying them will improve the comparability and ranking of exposure levels for different microenvironments, activities, or (groups of) people, so that epidemiological studies are better able to detect potentially weak correlations with health effects. Over 20 papers have been published on how to prevent biases and minimize uncertainties due to mechanical errors; the design of hardware and software filters; anisotropy; and the influence of the body. A number of biases can be corrected for by determining multiplicative correction factors. In addition, a good protocol on how to wear the exposimeter, a sufficiently small sampling interval, and a sufficiently long measurement duration will minimize biases. Corrections are possible for non-detects (through the detection limit), erroneous manufacturer calibration, and temporal drift. Corrections deemed unnecessary, because no significant biases have been observed, are those for linearity in response and resolution. Corrections that are difficult to perform after measurement concern modulation/duty-cycle sensitivity; out-of-band response (cross talk); and temperature and humidity sensitivity. Corrections that are not possible after measurement concern detection of multiple signals in one band; flatness of response within a frequency band; and anisotropy to waves of different elevation angles. An analysis of 20 microenvironmental surveys showed that early studies using exposimeters with logarithmic detectors overestimated exposure to signals with bursts, such as uplink signals from mobile phones and WiFi appliances. Furthermore, the possible corrections for biases have not been fully applied. The main finding is that if the biases are not corrected for, the actual exposure will on average be underestimated. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Accurate Calibration and Uncertainty Estimation of the Normal Spring Constant of Various AFM Cantilevers

    PubMed Central

    Song, Yunpeng; Wu, Sen; Xu, Linyan; Fu, Xing

    2015-01-01

    Measurement of force on a micro- or nano-Newton scale is important when exploring the mechanical properties of materials in the biophysics and nanomechanical fields. The atomic force microscope (AFM) is widely used in microforce measurement. The cantilever probe works as an AFM force sensor, and the spring constant of the cantilever is of great significance to the accuracy of the measurement results. This paper presents a normal spring constant calibration method with the combined use of an electromagnetic balance and a homemade AFM head. When the cantilever presses the balance, its deflection is detected through an optical lever integrated in the AFM head. Meanwhile, the corresponding bending force is recorded by the balance. Then the spring constant can be simply calculated using Hooke’s law. During the calibration, a feedback loop is applied to control the deflection of the cantilever. Errors that may affect the stability of the cantilever could be compensated rapidly. Five types of commercial cantilevers with different shapes, stiffness, and operating modes were chosen to evaluate the performance of our system. Based on the uncertainty analysis, the expanded relative standard uncertainties of the normal spring constant of most measured cantilevers are believed to be better than 2%. PMID:25763650
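
    The calibration reduces to Hooke's law, k = F/delta, with the balance supplying the force and the optical lever the deflection. A minimal sketch of the computation and a first-order combination of the relative uncertainties; all numerical values are hypothetical, not the paper's budget.

      import numpy as np

      def spring_constant(F, delta):
          """Hooke's law: stiffness from bending force F (N) and deflection delta (m)."""
          return F / delta

      def relative_uncertainty_k(uF_rel, ud_rel):
          """First-order propagation for k = F/delta with independent inputs."""
          return np.sqrt(uF_rel**2 + ud_rel**2)

      k = spring_constant(F=42e-9, delta=1.0e-6)                    # hypothetical: 42 nN at 1 um
      u_rel = relative_uncertainty_k(uF_rel=0.005, ud_rel=0.008)
      print(f"k = {k:.3f} N/m, expanded (k=2) relative uncertainty = {2*u_rel:.1%}")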

  11. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    NASA Technical Reports Server (NTRS)

    Novack, Steven D.; Rogers, Jim; Al Hassan, Mohammad; Hark, Frank

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design using well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since changes in sensitivity to uncertainty are not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation in complex systems and especially, due to nuances of launch vehicle logic, in launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper describes how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  12. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    NASA Technical Reports Server (NTRS)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design using well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since changes in sensitivity to uncertainty are not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation in complex systems and especially, due to nuances of launch vehicle logic, in launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  13. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, the most important tasks are to analyze how the uncertainties arise and develop, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.

  14. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.

  15. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Enhancing soil moisture monitoring via cosmic-ray neutron sensing in farmlands by combining field site tests with an uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Oswald, S. E.; Scheiffele, L. M.; Baroni, G.; Ingwersen, J.; Schrön, M.

    2017-12-01

    One application of Cosmic-Ray Neutron Sensing (CRNS) is to investigate soil moisture on agricultural fields during the crop season. This fully exploits the non-invasive character of CRNS, which does not interfere with the agricultural practices of the farmland. The changing influence of vegetation on CRNS has to be dealt with, as do spatio-temporal influences, e.g. from irrigation or harvest. Previous work revealed that the CRNS signal on farmland shows a complex and non-unique response because of the hydrogen pools at different depths and distances. This creates a challenge for soil moisture estimation and its subsequent use for irrigation management or hydrological modelling. Thus, a special aim of our study was to assess the uncertainty of CRNS in cropped fields and to identify the underlying causes of uncertainty. We applied CRNS at two field sites during the growing season, accompanied by intensive measurements of soil moisture, vegetation parameters, and irrigation events. Sources of uncertainty were identified from the experimental data. A Monte Carlo approach was used to propagate these uncertainties to CRNS soil moisture estimates. In addition, a sensitivity analysis was performed to identify the most important factors explaining this uncertainty. Results showed that CRNS soil moisture compares well to the soil moisture network when the point values are converted to weighted water content with all hydrogen pools included. However, when CRNS is considered as a stand-alone method to retrieve volumetric soil moisture, the performance decreases. The support volume, including its penetration depth, also showed considerable uncertainty, especially in relatively dry soil moisture conditions. Of the seven factors analyzed, the actual soil moisture profile, bulk density, the incoming-neutron correction, and the calibrated parameter N0 were found to play an important role. One possible improvement could be a simple correction factor based on independent data of soil moisture profiles to better account for the sensitivity of the CRNS signal to the upper soil layers. This is an important step toward improving the method for validation of remote sensing products or agricultural water management and establishing CRNS as an applied monitoring tool on farmland.
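
    A sketch of the kind of Monte Carlo propagation described, assuming counts are converted to soil moisture with the commonly used Desilets et al. (2010) shape-defining function; the count rates, N0 value, and uncertainties below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      def desilets_theta(N, N0, bulk_density):
          """Volumetric soil moisture (m3/m3) from corrected neutron counts,
          using the Desilets et al. (2010) shape-defining function."""
          return (0.0808 / (N / N0 - 0.372) - 0.115) * bulk_density

      # Hypothetical inputs with assumed standard uncertainties.
      M = 100_000
      N  = rng.normal(1900.0, np.sqrt(1900.0), M)   # counts/h, Poisson-like noise
      N0 = rng.normal(2800.0, 30.0, M)              # calibrated parameter N0
      theta = desilets_theta(N, N0, bulk_density=1.4)
      print(f"theta = {theta.mean():.3f} +/- {theta.std(ddof=1):.3f} m3/m3")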

  17. Addressing forecast uncertainty impact on CSP annual performance

    NASA Astrophysics Data System (ADS)

    Ferretti, Fabio; Hogendijk, Christopher; Aga, Vipluv; Ehrsam, Andreas

    2017-06-01

    This work analyzes the impact of weather forecast uncertainty on the annual performance of a Concentrated Solar Power (CSP) plant. A forecast time series was produced by a commercial forecast provider using hindcasting for the full year 2011, in hourly resolution, for Ouarzazate, Morocco. The impact of forecast uncertainty was measured on three case studies representing typical tariff schemes observed in recent CSP projects, plus a spot-market price scenario. The analysis was carried out using an annual performance model and a standard dispatch optimization algorithm based on dynamic programming. The dispatch optimizer was demonstrated to be a key requisite for maximizing annual revenues depending on the price scenario, harvesting the maximum potential out of the CSP plant. Forecast uncertainty affects the revenue enhancement achieved by a dispatch optimizer, depending on the error level and the price function. Results show that forecasting accuracy for direct normal irradiance (DNI) is important to make best use of an optimized dispatch, but also that a higher number of calculation updates can partially compensate for this uncertainty. The improvement in revenues can be significant depending on the price profile and the optimal operation strategy. Pathways to better performance are presented: updating more often, both by repeatedly generating new optimized trajectories and by refreshing weather forecasts more frequently. This study shows the importance of work on DNI forecasting for revenue enhancement, as well as of selecting weather services that can provide multiple updates per day and probabilistic forecast information.

  18. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming standard practice, existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models that explicitly account for the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
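
    Orthogonal distance regression is one standard errors-in-variables fit for such multi-point normalization; a minimal sketch with scipy.odr and hypothetical reference-material values (the authors' own regression models and Bayesian treatment are more complete).

      import numpy as np
      from scipy import odr

      # Hypothetical normalization against three reference materials: assigned
      # values (x), measured deltas (y), and their standard uncertainties, all
      # in per mil and all numbers illustrative.
      assigned   = np.array([-55.5, -10.2, 25.7])
      u_assigned = np.array([0.10, 0.08, 0.09])
      measured   = np.array([-54.8, -10.0, 25.1])
      u_measured = np.array([0.15, 0.12, 0.14])

      model = odr.Model(lambda beta, x: beta[0] + beta[1] * x)
      data  = odr.RealData(assigned, measured, sx=u_assigned, sy=u_measured)
      fit   = odr.ODR(data, model, beta0=[0.0, 1.0]).run()
      print("intercept, slope:", fit.beta, "std errors:", fit.sd_beta)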

  19. The currency and tempo of extinction.

    PubMed

    Regan, H M; Lupia, R; Drinnan, A N; Burgman, M A

    2001-01-01

    This study examines estimates of extinction rates for the current purported biotic crisis and from the fossil record. Studies that compare current and geological extinctions sometimes use metrics that confound different sources of error and reflect different features of extinction processes. The per taxon extinction rate is a standard measure in paleontology that avoids some of the pitfalls of alternative approaches. Extinction rates reported in the conservation literature are rarely accompanied by measures of uncertainty, despite many elements of the calculations being subject to considerable error. We quantify some of the most important sources of uncertainty and carry them through the arithmetic of extinction rate calculations using fuzzy numbers. The results emphasize that estimates of current and future rates rely heavily on assumptions about the tempo of extinction and on extrapolations among taxa. Available data are unlikely to be useful in measuring magnitudes or trends in current extinction rates.

  20. Accurate measurements of solar spectral irradiance between 4000-10000 cm^{-1}

    NASA Astrophysics Data System (ADS)

    Elsey, J.; Coleman, M. D.; Gardiner, T.; Shine, K. P.

    2017-12-01

    The near-infrared solar spectral irradiance (SSI) is an important input into simulations of weather and climate; the distribution of energy throughout this region of the spectrum influences atmospheric heating rates and the global hydrological cycle through absorption and scattering by water vapour. Current measurements by a mixture of ground-based and space-based instruments show differences of around 10% in the 4000-7000 cm^{-1} region, with no resolution to this controversy in sight. This work presents observations of SSI taken using a ground-based Fourier transform spectrometer between 4000-10000 cm^{-1} at a field site in Camborne, UK, with particular focus on a rigorously defined uncertainty budget. While there is good agreement between this work and the commonly used ATLAS3 spectrum between 7000-10000 cm^{-1}, the SSI is systematically lower than ATLAS3 by 10% between 4000-7000 cm^{-1}, with no overlap within the k = 2 measurement uncertainties.

  1. 76 FR 66339 - Inaugural Roundtable of the Financial Reporting Series Entitled “Uncertainty in Financial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... uncertainty in an accounting measurement is less useful to investors and why a more certain measurement would be preferable. Likewise, provide feedback on those topics where a measurement with uncertainty gives... discussion to consider financial statement measurements (and associated disclosures) that incorporate...

  2. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Treesearch

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  3. Steering the measured uncertainty under decoherence through local PT-symmetric operations

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Nan; Wang, Dong; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Ye, Liu

    2018-07-01

    The uncertainty principle is viewed as one of the appealing properties in the context of quantum mechanics: it intrinsically sets a lower bound on the measurement outcomes of a pair of incompatible observables within a given system. In this letter, we attempt to observe entropic uncertainty in the presence of quantum memory under different local noisy channels. To be specific, we develop the dynamics of the measured uncertainty under local bit-phase-flipping (unital) and depolarization (nonunital) noise, respectively, and put forward an effective strategy to manipulate the magnitude of the uncertainty of interest by means of parity-time-symmetric (PT-symmetric) operations on the subsystem to be measured. It is interesting to find that the uncertainty evolves differently in the channels considered here, i.e. monotonically in the nonunital channels and non-monotonically in the unital channels. Moreover, the amount of the measured uncertainty can be reduced to some degree by properly modulating the PT-symmetric operations.

  4. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Zhang, Yang; Yu, Chang-Shui

    2015-06-01

    The Heisenberg uncertainty principle shows that the values of non-commuting canonically conjugate variables cannot be specified simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki's bound entangled state are investigated in detail.

  5. Estimating Aboveground Biomass in Tropical Forests: Field Methods and Error Analysis for the Calibration of Remote Sensing Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonçalves, Fabio; Treuhaft, Robert; Law, Beverly

    Mapping and monitoring of forest carbon stocks across large areas in the tropics will necessarily rely on remote sensing approaches, which in turn depend on field estimates of biomass for calibration and validation purposes. Here, we used field plot data collected in a tropical moist forest in the central Amazon to gain a better understanding of the uncertainty associated with plot-level biomass estimates obtained specifically for the calibration of remote sensing measurements. In addition to accounting for sources of error that would be normally expected in conventional biomass estimates (e.g., measurement and allometric errors), we examined two sources of uncertainty that are specific to the calibration process and should be taken into account in most remote sensing studies: the error resulting from spatial disagreement between field and remote sensing measurements (i.e., co-location error), and the error introduced when accounting for temporal differences in data acquisition. We found that the overall uncertainty in the field biomass was typically 25% for both secondary and primary forests, but ranged from 16 to 53%. Co-location and temporal errors accounted for a large fraction of the total variance (>65%) and were identified as important targets for reducing uncertainty in studies relating tropical forest biomass to remotely sensed data. Although measurement and allometric errors were relatively unimportant when considered alone, combined they accounted for roughly 30% of the total variance on average and should not be ignored. Lastly, our results suggest that a thorough understanding of the sources of error associated with field-measured plot-level biomass estimates in tropical forests is critical to determine confidence in remote sensing estimates of carbon stocks and fluxes, and to develop strategies for reducing the overall uncertainty of remote sensing approaches.

  6. Estimating Aboveground Biomass in Tropical Forests: Field Methods and Error Analysis for the Calibration of Remote Sensing Observations

    DOE PAGES

    Gonçalves, Fabio; Treuhaft, Robert; Law, Beverly; ...

    2017-01-07

    Mapping and monitoring of forest carbon stocks across large areas in the tropics will necessarily rely on remote sensing approaches, which in turn depend on field estimates of biomass for calibration and validation purposes. Here, we used field plot data collected in a tropical moist forest in the central Amazon to gain a better understanding of the uncertainty associated with plot-level biomass estimates obtained specifically for the calibration of remote sensing measurements. In addition to accounting for sources of error that would be normally expected in conventional biomass estimates (e.g., measurement and allometric errors), we examined two sources of uncertainty that are specific to the calibration process and should be taken into account in most remote sensing studies: the error resulting from spatial disagreement between field and remote sensing measurements (i.e., co-location error), and the error introduced when accounting for temporal differences in data acquisition. We found that the overall uncertainty in the field biomass was typically 25% for both secondary and primary forests, but ranged from 16 to 53%. Co-location and temporal errors accounted for a large fraction of the total variance (>65%) and were identified as important targets for reducing uncertainty in studies relating tropical forest biomass to remotely sensed data. Although measurement and allometric errors were relatively unimportant when considered alone, combined they accounted for roughly 30% of the total variance on average and should not be ignored. Lastly, our results suggest that a thorough understanding of the sources of error associated with field-measured plot-level biomass estimates in tropical forests is critical to determine confidence in remote sensing estimates of carbon stocks and fluxes, and to develop strategies for reducing the overall uncertainty of remote sensing approaches.

  7. The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.

    2010-12-01

    Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters, and variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented, and a framework used to predict potential increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown to be subject to both physical and chemical heterogeneity, which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity exhibits both uncertainty and spatial variability, in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation, will also be discussed.

  8. The Third SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-3)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B.; VanHeukelem, Laurei; Thomas, Crystal S.; Claustre, Herve; Ras, Josephine; Schluter, Louise; Clementson, Lesley; vanderLinde, Dirk; Eker-Develi, Elif; Berthon, Jean-Francois

    2009-01-01

    Seven international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a mixed pigment sample. The field samples were collected primarily from oligotrophic waters, although mesotrophic and eutrophic waters were also sampled to create a dynamic range in chlorophyll concentration spanning approximately two orders of magnitude (0.020-1.366 mg m^{-3}). The intercomparisons were used to establish the following: a) the uncertainties in quantitating individual pigments and higher-order variables (sums, ratios, and indices); b) the reduction in uncertainties as a result of applying quality assurance (QA) procedures; c) the importance of establishing a properly defined referencing system in the computation of uncertainties; d) the analytical benefits of performance metrics; and e) the utility of a laboratory mix in understanding method performance. In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied.

  9. Uncertainty Estimate for the Outdoor Calibration of Solar Pyranometers: A Metrologist Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, I.; Myers, D.; Stoffel, T.

    2008-12-01

    Pyranometers are used outdoors to measure solar irradiance. By design, this type of radiometer can measure the total hemispheric (global) or diffuse (sky) irradiance when the detector is unshaded or shaded from the sun disk, respectively. These measurements are used in a variety of applications including solar energy conversion, atmospheric studies, agriculture, and materials science. Proper calibration of pyranometers is essential to ensure measurement quality. This paper describes a step-by-step method for calculating and reporting the uncertainty of the calibration, using the guidelines of the ISO 'Guide to the Expression of Uncertainty in Measurement' (GUM), that is applied to the pyranometer calibration procedures used at the National Renewable Energy Laboratory (NREL). The NREL technique characterizes the responsivity function of a pyranometer as a function of the zenith angle, as well as reporting a single calibration responsivity value for a zenith angle of 45°. The uncertainty analysis shows that a lower uncertainty can be achieved by using the response function of a pyranometer determined as a function of zenith angle, in lieu of just using the average value at 45°. By presenting the contribution of each uncertainty source to the total uncertainty, users will be able to troubleshoot and improve their calibration process. The uncertainty analysis method can also be used to determine the uncertainty of different calibration techniques and applications, such as deriving the uncertainty of field measurements.

  10. Measurement of Far-Infrared Paint Emissivity for Reference Blackbody Modeling and Design

    NASA Astrophysics Data System (ADS)

    Kaplan, S. G.; Hanssen, L. M.; Mekhontsev, S. N.

    2008-12-01

    Blackbody sources are typically used for both ground-based and on-orbit calibration of infrared radiometers for atmospheric sounding. Accurate knowledge of the emissivity (or reflectivity) of the painted cavity surfaces is important for the design and modeling of the blackbody performance. Devices designed for the mid- to long-wave infrared (2 μm to 100 μm) generally use specular black paint and multiple (4 to 6) bounces to achieve high emissivity and uniform radiance over the required optical extent. To meet the CLARREO goal of long-term climate monitoring with an absolute uncertainty of < 0.1 K (3 σ), it will be necessary to calibrate radiance measurements to < 0.1 % uncertainty, which could require knowledge of paint emissivity with < 0.005 uncertainty at long wavelength. Critical material parameters include the dependence of emissivity on wavelength, angle of incidence, polarization, and temperature (200 K to 350 K), as well as contributions of scattered light. We present measurement methods and results in support of the CORSAIR SI-traceable far-infrared blackbody design, including temperature-dependent specular reflectance measurements out to 100 μm and angle-dependent data out to 25 μm. In addition, we describe the design for a proposed new facility at NIST (CBS3) that will enable hemispherical-directional emittance and reflectance measurements versus temperature and angle out to 100 μm.

  11. Quality Assurance Decisions with Air Models: A Case Study of Imputation of Missing Input Data Using EPA's Multi-Layer Model

    EPA Science Inventory

    Abstract Environmental models are frequently used within regulatory and policy frameworks to estimate environmental metrics that are difficult or impossible to physically measure. As important decision tools, the uncertainty associated with the model outputs should impact their ...

  12. Estimation of evapotranspiration over the terrestrial ecosystems in China

    Treesearch

    Xianglan Li; Shunlin Liang; Wenping Yuan; Guirui Yu; Xiao Cheng; Yang Chen; Tianbao Zhao; Jinming Feng; Zhuguo Ma; Mingguo Ma; Shaomin Liu; Jiquan Chen; Changliang Shao; Shenggong Li; Xudong Zhang; Zhiqiang Zhang; Ge Sun; Shiping Chen; Takeshi Ohta; Andrej Varlagin; Akira Miyata; Kentaro Takagi; Nobuko Saiqusa; Tomomichi Kato

    2014-01-01

    Quantifying regional evapotranspiration (ET) and environmental constraints are particularly important for understanding water and carbon cycles of terrestrial ecosystems. However, a large uncertainty in the regional estimation of ET still remains for the terrestrial ecosystems in China. This study used ET measurements of 34 eddy covariance sites within China and...

  13. An approach to forecasting health expenditures, with application to the U.S. Medicare system.

    PubMed

    Lee, Ronald; Miller, Timoth

    2002-10-01

    To quantify uncertainty in forecasts of health expenditures. Stochastic time series models are estimated for historical variations in fertility, mortality, and health spending per capita in the United States, and used to generate stochastic simulations of the growth of Medicare expenditures. Individual health spending is modeled to depend on the number of years until death. A simple accounting model is developed for forecasting health expenditures, using the U.S. Medicare system as an example. Medicare expenditures are projected to rise from 2.2 percent of GDP (gross domestic product) to about 8 percent of GDP by 2075. This increase is due in equal measure to increasing health spending per beneficiary and to population aging. The traditional projection method constructs high, medium, and low scenarios to assess uncertainty, an approach that has many problems. Using stochastic forecasting, we find a 95 percent probability that Medicare spending in 2075 will fall between 4 percent and 18 percent of GDP, indicating a wide band of uncertainty. Although there is substantial uncertainty about future mortality decline, it contributed little to uncertainty about future Medicare spending, since lower mortality both raises the number of elderly, tending to raise spending, and is associated with improved health of the elderly, tending to reduce spending. Uncertainty about fertility, by contrast, leads to great uncertainty about the future size of the labor force, and therefore adds importantly to uncertainty about the health-share of GDP. In the shorter term, the major source of uncertainty is health spending per capita. History is a valuable guide for quantifying our uncertainty about future health expenditures. The probabilistic model we present has several advantages over the high-low scenario approach to forecasting. It indicates great uncertainty about future Medicare expenditures relative to GDP.
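
    A minimal sketch of the stochastic-forecasting idea, assuming (purely for illustration) that the log of the spending share follows a random walk with drift; the drift and volatility parameters are invented, and the model is far simpler than the authors' demographic-economic system.

      import numpy as np

      rng = np.random.default_rng(123)

      # Hypothetical stochastic forecast of a spending share of GDP.
      years, n_paths = 73, 10_000                 # e.g., 2002 -> 2075
      drift, vol = 0.018, 0.05                    # assumed mean/s.d. of annual log growth
      log_growth = rng.normal(drift, vol, (n_paths, years)).sum(axis=1)
      share_2075 = 2.2 * np.exp(log_growth)       # starting from 2.2% of GDP

      lo, hi = np.percentile(share_2075, [2.5, 97.5])
      print(f"95% interval for 2075: {lo:.1f}% to {hi:.1f}% of GDP")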

  14. Uncertainty quantification of measured quantities for a HCCI engine: composition or temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; Whitesides, Russell

    UQHCCI_1 computes the measurement uncertainties of an HCCI engine test bench, taking the pressure trace and the estimated uncertainties of the measured quantities as inputs and propagating them through Bayesian inference and a mixing model.

  15. Uncertainty evaluation of dead zone of diagnostic ultrasound equipment

    NASA Astrophysics Data System (ADS)

    Souza, R. M.; Alvarenga, A. V.; Braz, D. S.; Petrella, L. I.; Costa-Felix, R. P. B.

    2016-07-01

    This paper presents a model for evaluating the measurement uncertainty of a feature used in the assessment of ultrasound images: the dead zone. The dead zone was measured by two technicians of INMETRO's Laboratory of Ultrasound using a phantom, following the standard IEC/TS 61390. The uncertainty model was proposed based on the Guide to the Expression of Uncertainty in Measurement. For the tested equipment, results indicate a dead zone of 1.01 mm and, based on the proposed model, an expanded uncertainty of 0.17 mm. The proposed uncertainty model offers a novel route for the metrological evaluation of diagnostic ultrasound imaging.
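
    A minimal sketch of a GUM-style budget of the sort described: independent standard uncertainties are combined by root-sum-square and expanded with a coverage factor k = 2; the component names and values below are hypothetical, not the paper's budget.

      import numpy as np

      def expanded_uncertainty(components, k=2.0):
          """GUM: combine independent standard uncertainties by root-sum-square,
          then expand with coverage factor k (~95% coverage for k = 2)."""
          u_c = np.sqrt(np.sum(np.square(components)))
          return k * u_c

      # Hypothetical standard-uncertainty budget for a dead-zone measurement (mm).
      u = {"repeatability": 0.060, "phantom target": 0.040, "image resolution": 0.029}
      print(f"U = {expanded_uncertainty(list(u.values())):.2f} mm")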

  16. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    PubMed

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.

  17. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications

    PubMed Central

    Meier, D.C.; Benkstein, K.D.; Hurst, W.S.; Chu, P.M.

    2016-01-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, −5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals. PMID:28090126

  18. Choice of baseline climate data impacts projected species' responses to climate change.

    PubMed

    Baker, David J; Hartley, Andrew J; Butchart, Stuart H M; Willis, Stephen G

    2016-07-01

    Climate data created from historic climate observations are integral to most assessments of potential climate change impacts, and frequently comprise the baseline period used to infer species-climate relationships. They are often also central to downscaling coarse resolution climate simulations from General Circulation Models (GCMs) to project future climate scenarios at ecologically relevant spatial scales. Uncertainty in these baseline data can be large, particularly where weather observations are sparse and climate dynamics are complex (e.g. over mountainous or coastal regions). Yet, importantly, this uncertainty is almost universally overlooked when assessing potential responses of species to climate change. Here, we assessed the importance of historic baseline climate uncertainty for projections of species' responses to future climate change. We built species distribution models (SDMs) for 895 African bird species of conservation concern, using six different climate baselines. We projected these models to two future periods (2040-2069, 2070-2099), using downscaled climate projections, and calculated species turnover and changes in species-specific climate suitability. We found that the choice of baseline climate data constituted an important source of uncertainty in projections of both species turnover and species-specific climate suitability, often comparable with, or more important than, uncertainty arising from the choice of GCM. Importantly, the relative contribution of these factors to projection uncertainty varied spatially. Moreover, when projecting SDMs to sites of biodiversity importance (Important Bird and Biodiversity Areas), these uncertainties altered site-level impacts, which could affect conservation prioritization. Our results highlight that projections of species' responses to climate change are sensitive to uncertainty in the baseline climatology. We recommend that this should be considered routinely in such analyses. © 2016 John Wiley & Sons Ltd.

  19. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Sengupta, M.; Reda, I.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Determining the accuracy of these solar radiation measurements therefore provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in radiometer calibrations and measurements, using methods that follow the Guide to the Expression of Uncertainty in Measurement (GUM) of the International Bureau of Weights and Measures. Standardized analysis based on these procedures ensures that the quoted uncertainty is well documented.
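
    A pyranometer calibration determines a responsivity R = V/G from the thermopile output V and a reference irradiance G; a minimal sketch of the GUM first-order combination for this quotient, with hypothetical inputs and uncertainties.

      import numpy as np

      def responsivity(V_uV, G_Wm2):
          """Pyranometer responsivity in uV per W/m2."""
          return V_uV / G_Wm2

      def u_responsivity_rel(uV_rel, uG_rel):
          """GUM first-order combination for R = V/G (independent inputs)."""
          return np.sqrt(uV_rel**2 + uG_rel**2)

      R = responsivity(V_uV=8500.0, G_Wm2=1000.0)                   # hypothetical readings
      U_rel = 2.0 * u_responsivity_rel(uV_rel=0.004, uG_rel=0.006)  # k = 2
      print(f"R = {R:.2f} uV/(W/m2), expanded relative uncertainty = {U_rel:.1%}")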

  20. The development and trial of an unmanned aerial system for the measurement of methane flux from landfill and greenhouse gas emission hotspots.

    PubMed

    Allen, Grant; Hollingsworth, Peter; Kabbabe, Khristopher; Pitt, Joseph R; Mead, Mohammed I; Illingworth, Samuel; Roberts, Gareth; Bourn, Mark; Shallcross, Dudley E; Percival, Carl J

    2018-01-09

    This paper describes the development of a new sampling and measurement method to infer methane flux using proxy measurements of CO2 concentration and wind data recorded by Unmanned Aerial Systems (UAS). The flux method described and trialed here is appropriate to the spatial scale of landfill sites and analogous greenhouse gas emission hotspots, making it an important new method for low-cost and rapid case-study quantification of fluxes from currently uncertain (but highly important) greenhouse gas sources. We present a case study using these UAS-based measurements to derive instantaneous methane fluxes from a test landfill site in the north of England, using a mass balance model tailored for UAS sampling and co-emitted CO2 concentration as a methane-emission proxy. Methane fluxes (and flux uncertainties) during two trials on 27 November 2014 and 5 March 2015 were found to be 0.140 kg s^{-1} (±61% at 1σ) and 0.050 kg s^{-1} (±54% at 1σ), respectively. Uncertainty contributing to the flux was dominated by ambient variability in the background (inflow) concentration (>40%) and wind speed (>10%), with instrumental error contributing only ∼1-2%. The approach described represents an important advance concerning the challenging problem of greenhouse gas hotspot flux calculation, and offers transferability to a wide range of analogous environments. This new measurement solution could add to a toolkit of approaches to better validate source-specific greenhouse emissions inventories - an important new requirement of the UNFCCC COP21 (Paris) climate change agreement. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
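
    The mass-balance idea is to integrate the excess concentration times the perpendicular wind over a downwind vertical plane, Q = (c - c_bg) * u_perp summed over the plane. The sketch below discretizes that integral on a small grid with invented numbers; it does not reproduce the paper's CO2-proxy step.

      import numpy as np

      def mass_balance_flux(conc, background, wind_perp, dy, dz):
          """Flux (kg/s) through a downwind plane: sum of excess concentration
          (kg/m3) times perpendicular wind speed (m/s) times cell area (m2)."""
          excess = np.clip(conc - background, 0.0, None)
          return np.sum(excess * wind_perp * dy * dz)

      # Hypothetical 3 x 4 sampling grid (heights x crosswind positions), kg/m3.
      conc = np.array([[2.1e-6, 2.6e-6, 2.9e-6, 2.2e-6],
                       [2.0e-6, 2.4e-6, 2.7e-6, 2.1e-6],
                       [1.9e-6, 2.1e-6, 2.2e-6, 2.0e-6]])
      print(f"{mass_balance_flux(conc, 1.9e-6, wind_perp=4.0, dy=50.0, dz=20.0):.3f} kg/s")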

  1. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze the measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
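
    Uncertainty budgets of this kind are often combined by root-sum-square across the stages of the measurement process (collection, preservation/storage, analysis); a minimal sketch with hypothetical stage uncertainties, not the values from this study.

      import numpy as np

      def cumulative_uncertainty(stage_uncertainties_pct):
          """Root-sum-square of the random uncertainty (in percent) contributed
          by each stage of the measurement process."""
          u = np.asarray(stage_uncertainties_pct, dtype=float)
          return float(np.sqrt(np.sum(u**2)))

      # Hypothetical stage uncertainties for one sampling scenario (%).
      stages = {"sample collection": 25.0, "preservation/storage": 12.0,
                "laboratory analysis": 18.0}
      print(f"cumulative: +/-{cumulative_uncertainty(list(stages.values())):.0f}%")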

  2. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  3. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE PAGES

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    2017-03-28

    In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  4. Uncertainty Evaluation of Measurements with Pyranometers and Pyrheliometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konings, Jorgen; Habte, Aron

    2016-01-03

    Evaluating the performance of photovoltaic (PV) cells, modules, arrays, and systems relies on accurate measurement of the available solar radiation resource. Solar radiation resources are measured using radiometers such as pyranometers (global horizontal irradiance) and pyrheliometers (direct normal irradiance). The accuracy of solar radiation data measured by radiometers depends not only on the specification of the instrument but also on a) the calibration procedure, b) the measurement conditions and maintenance, and c) the environmental conditions. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This paper provides guidelines and recommended procedures for estimating the uncertainty in measurements by radiometers using the Guide to the Expression of Uncertainty in Measurement (GUM) method. Special attention is paid to the concept of data availability and its link to uncertainty evaluation.

  5. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality" of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
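
    Since the four procedural categories are treated as independent error sources, the cumulative probable uncertainty follows from a root mean square combination. The sketch below illustrates that propagation; the function name and the example percentages are illustrative, not values from the paper.

        import math

        def cumulative_uncertainty(*category_errors_pct):
            """Root mean square propagation of independent procedural errors
            (streamflow, sample collection, preservation/storage, lab analysis)."""
            return math.sqrt(sum(e ** 2 for e in category_errors_pct))

        # e.g. 10% streamflow, 20% collection, 8% storage, 12% analysis -> ~27%
        print(cumulative_uncertainty(10.0, 20.0, 8.0, 12.0))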

  6. Synthesizing Global and Local Datasets to Estimate Jurisdictional Forest Carbon Fluxes in Berau, Indonesia.

    PubMed

    Griscom, Bronson W; Ellis, Peter W; Baccini, Alessandro; Marthinus, Delon; Evans, Jeffrey S; Ruslandi

    2016-01-01

    Forest conservation efforts are increasingly being implemented at the scale of sub-national jurisdictions in order to mitigate global climate change and provide other ecosystem services. We see an urgent need for robust estimates of historic forest carbon emissions at this scale, as the basis for credible measures of climate and other benefits achieved. Despite the arrival of a new generation of global datasets on forest area change and biomass, confusion remains about how to produce credible jurisdictional estimates of forest emissions. We demonstrate a method for estimating the relevant historic forest carbon fluxes within the Regency of Berau in eastern Borneo, Indonesia. Our method integrates best available global and local datasets, and includes a comprehensive analysis of uncertainty at the regency scale. We find that Berau generated 8.91 ± 1.99 million tonnes of net CO2 emissions per year during 2000-2010. Berau is an early frontier landscape where gross emissions are 12 times higher than gross sequestration. Yet most (85%) of Berau's original forests are still standing. The majority of net emissions were due to conversion of native forests to unspecified agriculture (43% of total), oil palm (28%), and fiber plantations (9%). Most of the remainder was due to legal commercial selective logging (17%). Our overall uncertainty estimate offers an independent basis for assessing three other estimates for Berau. Two other estimates were above the upper end of our uncertainty range. We emphasize the importance of including an uncertainty range for all parameters of the emissions equation to generate a comprehensive uncertainty estimate - which has not been done before. We believe comprehensive estimates of carbon flux uncertainty are increasingly important as national and international institutions are challenged with comparing alternative estimates and identifying a credible range of historic emissions values.

  7. Entropic Uncertainty Relation and Information Exclusion Relation for multiple measurements in the presence of quantum memory

    PubMed Central

    Zhang, Jun; Zhang, Yang; Yu, Chang-shui

    2015-01-01

    The Heisenberg uncertainty principle shows that no one can specify the values of non-commuting canonically conjugate variables simultaneously. However, the uncertainty relation is usually applied to two incompatible measurements. We present tighter bounds on both the entropic uncertainty relation and the information exclusion relation for multiple measurements in the presence of quantum memory. As applications, three incompatible measurements on the Werner state and Horodecki’s bound entangled state are investigated in detail.

  8. Comparative study of radiometric and calorimetric methods for total hemispherical emissivity measurements

    NASA Astrophysics Data System (ADS)

    Monchau, Jean-Pierre; Hameury, Jacques; Ausset, Patrick; Hay, Bruno; Ibos, Laurent; Candau, Yves

    2018-05-01

    Accurate knowledge of infrared emissivity is important in applications such as surface temperature measurement by infrared thermography or thermal balances for building walls. A comparison of total hemispherical emissivity measurements was performed by two laboratories: the Laboratoire National de Métrologie et d'Essais (LNE) and the Centre d'Études et de Recherche en Thermique, Environnement et Systèmes (CERTES). Both laboratories performed emissivity measurements on four samples, chosen to cover a large range of emissivity values and angular reflectance behaviors. The samples were polished aluminum (highly specular, low emissivity), bulk PVC (slightly specular, high emissivity), sandblasted aluminum (diffuse surface, medium emissivity), and aluminum paint (slightly specular surface, medium emissivity). Results obtained using five measurement techniques were compared. LNE used a calorimetric method for direct total hemispherical emissivity measurement [1], an absolute reflectometric measurement method [2], and a relative reflectometric measurement method. CERTES used two total hemispherical directional reflectometric measurement methods [3, 4]. For the indirect techniques by reflectance measurement, the total hemispherical emissivity values were calculated from directional hemispherical reflectance measurement results, using spectral integration when required and directional-to-hemispherical extrapolation. Results were compared taking into account measurement uncertainties; an added uncertainty was introduced to account for heterogeneity over the surfaces of the samples and between samples. All techniques gave large relative uncertainties for a low-emissivity and highly specular material (polished aluminum), and results were quite scattered. All the indirect techniques by reflectance measurement gave results within ±0.01 for a high-emissivity material. A commercial aluminum paint appears to be a good candidate for producing samples with a medium level of emissivity (about 0.4) and with good uniformity of emissivity values (within ±0.015).

  9. Multiple wavelength interferometry for distance measurements of moving objects with nanometer uncertainty

    NASA Astrophysics Data System (ADS)

    Kuschmierz, R.; Czarske, J.; Fischer, A.

    2014-08-01

    Optical measurement techniques offer great opportunities in diverse applications, such as lathe monitoring and microfluidics. Doppler-based interferometric techniques enable simultaneous measurement of the lateral velocity and axial distance of a moving object. However, there is a complementarity between the unambiguous axial measurement range and the uncertainty of the distance. Therefore, we present an extended sensor setup, which provides an unambiguous axial measurement range of 1 mm while achieving uncertainties below 100 nm. Measurements at a calibration system are performed. When using a pinhole for emulating a single scattering particle, the tumbling motion of the rotating object is resolved with a distance uncertainty of 50 nm. For measurements at the rough surface, the distance uncertainty amounts to 280 nm due to a lower signal-to-noise ratio. Both experimental results are close to the respective Cramér-Rao bound, which is derived analytically for both surface and single particle measurements.

  10. Uncertainty Modeling and Evaluation of CMM Task Oriented Measurement Based on SVCMM

    NASA Astrophysics Data System (ADS)

    Li, Hongli; Chen, Xiaohuai; Cheng, Yinbao; Liu, Houde; Wang, Hanbin; Cheng, Zhenying; Wang, Hongtao

    2017-10-01

    Due to the variety of measurement tasks and the complexity of the error sources of a coordinate measuring machine (CMM), it is very difficult to reasonably evaluate the uncertainty of CMM measurement results, which has limited the application of CMMs. Task-oriented uncertainty evaluation has therefore become a difficult problem to solve. Taking dimension measurement as an example, this paper puts forward a practical method (the SVCMM method) for uncertainty modeling and evaluation of CMM task-oriented measurement. The method makes full use of the CMM acceptance or reinspection report and the Monte Carlo computer simulation method (MCM). An evaluation example is presented, and its results are obtained with the traditional method given in the GUM and with the proposed method, respectively. The SVCMM method is verified to be feasible and practical. It can help CMM users conveniently complete a measurement uncertainty evaluation using a single measurement cycle.
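
    The abstract does not give the SVCMM equations, but the general Monte Carlo step it builds on (propagating distributions for the input quantities through a measurement model, as in GUM Supplement 1) can be sketched as follows. The measurement model, the samplers, and the numbers are hypothetical placeholders.

        import numpy as np

        rng = np.random.default_rng(1)

        def mcm_uncertainty(model, samplers, n=100_000):
            """Monte Carlo evaluation: draw inputs, propagate through the
            measurement model, and summarize the output distribution."""
            out = model(*(draw(n) for draw in samplers))
            return out.mean(), out.std(ddof=1), np.percentile(out, [2.5, 97.5])

        # hypothetical CMM length measurement: indicated length corrected for
        # probing error and thermal expansion (steel, ~11.5e-6 per K)
        model = lambda L, probe, dT: L + probe - 11.5e-6 * dT * L
        samplers = [lambda n: rng.normal(100.0, 0.0005, n),   # indicated length, mm
                    lambda n: rng.normal(0.0, 0.0008, n),     # probing error, mm
                    lambda n: rng.uniform(-1.0, 1.0, n)]      # temperature offset, K
        print(mcm_uncertainty(model, samplers))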

  11. Calculating Measurement Uncertainty of the “Conventional Value of the Result of Weighing in Air”

    DOE PAGES

    Flicker, Celia J.; Tran, Hy D.

    2016-04-02

    The conventional value of the result of weighing in air is frequently used in commercial calibrations of balances. The guidance in OIML D-028 for reporting uncertainty of the conventional value is too terse. When calibrating mass standards at low measurement uncertainties, it is necessary to perform a buoyancy correction before reporting the result. When calculating the conventional result after calibrating true mass, the uncertainty due to calculating the conventional result is correlated with the buoyancy correction. We show through Monte Carlo simulations that the measurement uncertainty of the conventional result is less than the measurement uncertainty when reporting true mass. The Monte Carlo simulation tool is available in the online version of this article.

  12. Uncertainty of streamwater solute fluxes in five contrasting headwater catchments including model uncertainty and natural variability (Invited)

    NASA Astrophysics Data System (ADS)

    Aulenbach, B. T.; Burns, D. A.; Shanley, J. B.; Yanai, R. D.; Bae, K.; Wild, A.; Yang, Y.; Dong, Y.

    2013-12-01

    There are many sources of uncertainty in estimates of streamwater solute flux. Flux is the product of discharge and concentration (summed over time), each of which has measurement uncertainty of its own. Discharge can be measured almost continuously, but concentrations are usually determined from discrete samples, which increases uncertainty dependent on sampling frequency and how concentrations are assigned for the periods between samples. Gaps between samples can be estimated by linear interpolation or by models that use the relations between concentration and continuously measured or known variables such as discharge, season, temperature, and time. For this project, developed in cooperation with QUEST (Quantifying Uncertainty in Ecosystem Studies), we evaluated uncertainty for three flux estimation methods and three different sampling frequencies (monthly, weekly, and weekly plus event). The constituents investigated were dissolved NO3, Si, SO4, and dissolved organic carbon (DOC), solutes whose concentration dynamics exhibit strongly contrasting behavior. The evaluation was completed for a 10-year period at five small, forested watersheds in Georgia, New Hampshire, New York, Puerto Rico, and Vermont. Concentration regression models were developed for each solute at each of the three sampling frequencies for all five watersheds. Fluxes were then calculated using (1) a linear interpolation approach, (2) a regression-model method, and (3) the composite method, which combines the regression-model method for estimating concentrations with the linear interpolation method for correcting model residuals to the observed sample concentrations. We considered the best estimates of flux to be those derived using the composite method at the highest sampling frequencies. We also evaluated the importance of sampling frequency and estimation method on flux estimate uncertainty; flux uncertainty was dependent on the variability characteristics of each solute and varied for different reporting periods (e.g., the 10-year study period vs. annual vs. monthly). The usefulness of the two regression-model-based flux estimation approaches was dependent upon the amount of variance in concentrations the regression models could explain. Our results can guide the development of optimal sampling strategies by weighing sampling frequency against improvements in the uncertainty of stream flux estimates for solutes with particular characteristics of variability. The appropriate flux estimation method depends on a combination of sampling frequency and the strength of the concentration regression models. Sites: Biscuit Brook (Frost Valley, NY), Hubbard Brook Experimental Forest and LTER (West Thornton, NH), Luquillo Experimental Forest and LTER (Luquillo, Puerto Rico), Panola Mountain (Stockbridge, GA), Sleepers River Research Watershed (Danville, VT)
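
    The composite method mentioned above can be sketched in a few lines: concentrations come from the regression model, and the model residuals observed at the sample times are linearly interpolated and added back as a correction. This is a schematic reading of the method with hypothetical array names and toy numbers; solute flux then follows as concentration times discharge summed over time.

        import numpy as np

        def composite_concentration(t_all, conc_model, t_samples, conc_obs):
            """Composite method: regression-model concentrations corrected by
            linearly interpolated model residuals at the sample times."""
            resid = conc_obs - np.interp(t_samples, t_all, conc_model)
            return conc_model + np.interp(t_all, t_samples, resid)

        t = np.arange(10.0)                        # days
        modelled = 2.0 + 0.1 * t                   # regression-model concentration, mg/L
        t_s, obs = np.array([2.0, 7.0]), np.array([2.4, 2.5])
        q = np.full(10, 0.5)                       # discharge, m^3/s
        conc = composite_concentration(t, modelled, t_s, obs)
        flux = np.sum(conc * q * 86400) / 1000.0   # load in kg over the period
        print(conc, flux)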

  13. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting the air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify the uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
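
    As a toy illustration of the sensitivity-analysis step, the sketch below estimates first-order (Sobol-style) sensitivity indices by binning Monte Carlo samples of each input and comparing the variance of the conditional means to the total output variance. The model and inputs are invented placeholders, not the SHERPA module.

        import numpy as np

        rng = np.random.default_rng(7)

        def first_order_indices(model, n_inputs, n=20_000, n_bins=20):
            """Crude first-order sensitivity: Var(E[Y|X_i]) / Var(Y),
            estimated by binning random samples on each input."""
            X = rng.uniform(0.0, 1.0, (n, n_inputs))
            Y = model(X)
            edges = np.linspace(0.0, 1.0, n_bins + 1)
            out = []
            for i in range(n_inputs):
                which = np.digitize(X[:, i], edges[1:-1])
                means = np.array([Y[which == b].mean() for b in range(n_bins)])
                out.append(float(means.var() / Y.var()))
            return out

        toy = lambda X: X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.05 * X[:, 2]
        print(first_order_indices(toy, 3))   # input 0 should dominate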

  14. Study of the uncertainty in estimation of the exposure of non-human biota to ionising radiation.

    PubMed

    Avila, R; Beresford, N A; Agüero, A; Broed, R; Brown, J; Iospje, M; Robles, B; Suañez, A

    2004-12-01

    Uncertainty in estimations of the exposure of non-human biota to ionising radiation may arise from a number of sources including values of the model parameters, empirical data, measurement errors and biases in the sampling. The significance of the overall uncertainty of an exposure assessment will depend on how the estimated dose compares with reference doses used for risk characterisation. In this paper, we present the results of a study of the uncertainty in estimation of the exposure of non-human biota using some of the models and parameters recommended in the FASSET methodology. The study was carried out for semi-natural terrestrial, agricultural and marine ecosystems, and for four radionuclides (137Cs, 239Pu, 129I and 237Np). The parameters of the radionuclide transfer models showed the highest sensitivity and contributed the most to the uncertainty in the predictions of doses to biota. The most important ones were related to the bioavailability and mobility of radionuclides in the environment, for example soil-to-plant transfer factors, the bioaccumulation factors for marine biota and the gut uptake fraction for terrestrial mammals. In contrast, the dose conversion coefficients showed low sensitivity and contributed little to the overall uncertainty. Radiobiological effectiveness contributed to the overall uncertainty of the dose estimations for alpha emitters although to a lesser degree than a number of transfer model parameters.

  15. Uncertainty of fast biological radiation dose assessment for emergency response scenarios.

    PubMed

    Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens

    2017-01-01

    Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates is further discussed. Finally, it is demonstrated that whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.

  16. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    PubMed

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties. © 2018 Elsevier Inc. All rights reserved.

  17. The Harm that Underestimation of Uncertainty Does to Our Community: A Case Study Using Sunspot Area Measurements

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, Andres

    2017-08-01

    Data products in heliospheric physics are very often provided without clear estimates of uncertainty. From helioseismology in the solar interior, all the way to in situ solar wind measurements beyond 1 AU, uncertainty estimates are typically hard for users to find (buried inside long documents that are separate from the data products), or simply non-existent. There are two main reasons why uncertainty measurements are hard to find:

    1. Understanding instrumental systematic errors is given a much higher priority inside instrumental teams.

    2. The desire to perfectly understand all sources of uncertainty postpones indefinitely the actual quantification of uncertainty in our measurements.

    Using the cross calibration of 200 years of sunspot area measurements as a case study, in this presentation we will discuss the negative impact that inadequate measurements of uncertainty have on users, through the appearance of toxic and unnecessary controversies, and on data providers, through the creation of unrealistic expectations regarding the information that can be extracted from their data. We will discuss how empirical estimates of uncertainty represent a very good alternative to not providing any estimates at all, and finalize by discussing the bare essentials that should become our standard practice for future instruments and surveys.

  18. An uncertainty-based distributed fault detection mechanism for wireless sensor networks.

    PubMed

    Yang, Yang; Gao, Zhipeng; Zhou, Hang; Qiu, Xuesong

    2014-04-25

    Exchanging too many messages for fault detection not only degrades the network quality of service but also places a huge burden on the limited energy of sensors. Therefore, we propose an uncertainty-based distributed fault detection mechanism for wireless sensor networks that relies on the aided judgment of neighbors. The algorithm accounts for the serious influence of sensing measurement loss and uses Markov decision processes to fill in missing data. Most importantly, fault misjudgments caused by uncertainty conditions are the main drawback of traditional distributed fault detection mechanisms; to address this, we draw on evidence fusion rules based on information entropy theory and the degree-of-disagreement function to increase the accuracy of fault detection. Simulation results demonstrate that our algorithm can effectively reduce the communication energy overhead due to message exchanges and provide a higher detection accuracy ratio.

  19. Making High Accuracy Null Depth Measurements for the LBTI Exozodi Survey

    NASA Technical Reports Server (NTRS)

    Mennesson, Bertrand; Defrere, Denis; Nowak, Matthias; Hinz, Philip; Millan-Gabet, Rafael; Absil, Oliver; Bailey, Vanessa; Bryden, Geoffrey; Danchi, William C.; Kennedy, Grant M.; et al.

    2016-01-01

    The characterization of exozodiacal light emission is important both for the understanding of planetary system evolution and for the preparation of future space missions aiming to characterize low-mass planets in the habitable zone of nearby main sequence stars. The Large Binocular Telescope Interferometer (LBTI) exozodi survey aims at providing a ten-fold improvement over the current state of the art, measuring dust emission levels down to a typical accuracy of 12 zodis per star for a representative ensemble of 30+ high-priority targets. Such measurements promise to yield a final accuracy of about 2 zodis on the median exozodi level of the target sample. Reaching a 1-sigma measurement uncertainty of 12 zodis per star corresponds to measuring interferometric cancellation (null) levels, i.e. visibilities, at the few 100 ppm uncertainty level. We discuss here the challenges posed by making such high-accuracy mid-infrared visibility measurements from the ground and present the methodology we developed for achieving current best levels of 500 ppm or so. We also discuss current limitations and plans for enhanced exozodi observations over the next few years at LBTI.

  20. Making High Accuracy Null Depth Measurements for the LBTI ExoZodi Survey

    NASA Technical Reports Server (NTRS)

    Mennesson, Bertrand; Defrere, Denis; Nowak, Matthew; Hinz, Philip; Millan-Gabet, Rafael; Absil, Olivier; Bailey, Vanessa; Bryden, Geoffrey; Danchi, William; Kennedy, Grant M.; et al.

    2016-01-01

    The characterization of exozodiacal light emission is important both for the understanding of planetary system evolution and for the preparation of future space missions aiming to characterize low-mass planets in the habitable zone of nearby main sequence stars. The Large Binocular Telescope Interferometer (LBTI) exozodi survey aims at providing a ten-fold improvement over the current state of the art, measuring dust emission levels down to a typical accuracy of approximately 12 zodis per star for a representative ensemble of approximately 30 high-priority targets. Such measurements promise to yield a final accuracy of about 2 zodis on the median exozodi level of the target sample. Reaching a 1-sigma measurement uncertainty of 12 zodis per star corresponds to measuring interferometric cancellation (null) levels, i.e. visibilities, at the few 100 ppm uncertainty level. We discuss here the challenges posed by making such high-accuracy mid-infrared visibility measurements from the ground and present the methodology we developed for achieving current best levels of 500 ppm or so. We also discuss current limitations and plans for enhanced exozodi observations over the next few years at LBTI.

  1. An experimental study of the thermodynamic properties of 1,1-difluoroethane

    NASA Astrophysics Data System (ADS)

    Tamatsu, T.; Sato, T.; Sato, H.; Watanabe, K.

    1992-11-01

    Experimental vapor pressures and P-ρ-T data of an important alternative refrigerant, 1,1-difluoroethane (HFC-152a), have been measured by means of a constant-volume method coupled with expansion procedures. Sixty P-ρ-T data were measured along eight isochores in a range of temperatures T from 330 to 440 K, at pressures P from 1.6 to 9.3 MPa, and at densities ρ from 51 to 811 kg·m-3. Forty-six vapor pressures were also measured at temperatures from 320 K to the critical temperature. The uncertainties of the temperature and pressure measurements are within ±7 mK and ±2 kPa, respectively, while the uncertainty of the density values is within ±0.1%. The purity of the sample used is 99.9 wt%. On the basis of the measurements along each isochore, five saturation points were determined, and the critical pressure was determined by correlating the vapor-pressure measurements. The second and third virial coefficients for temperatures from 360 to 440 K have also been determined.

  2. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
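
    One building block of such a framework, the nonparametric bootstrap, can be sketched compactly: resample the observed flood series with replacement and read an uncertainty interval off the resampled statistics. The Gumbel toy data and the crude design-peak estimator below are assumptions for illustration only, not the authors' full construction and regionalization chain.

        import numpy as np

        rng = np.random.default_rng(0)

        def bootstrap_interval(sample, estimator, n_boot=5000, alpha=0.05):
            """Percentile bootstrap interval for any hydrograph statistic."""
            stats = np.array([estimator(rng.choice(sample, sample.size, replace=True))
                              for _ in range(n_boot)])
            return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

        peaks = rng.gumbel(300.0, 80.0, 40)           # 40 years of annual peaks, m^3/s
        design_peak = lambda s: np.quantile(s, 0.99)  # crude stand-in estimator
        print(bootstrap_interval(peaks, design_peak))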

  3. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method

    PubMed Central

    Chen, Jiunyuan; Chen, Chiachung

    2017-01-01

    The most common and cheapest indirect technique to measure relative humidity is a psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity was evaluated for this indirect method with several empirical equations for calculating relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15–50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis; this equation can be computed using a calculator. The average predictive error of relative humidity with this new equation was <0.1%. The measurement uncertainty of the relative humidity as affected by the accuracy of the dry and wet bulb temperatures was evaluated, and numeric values of the measurement uncertainty were obtained for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty.
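
    A compact version of the underlying computation (not the paper's new psychrometer-constant equation, which the abstract does not reproduce) combines a Magnus-type saturation vapour pressure with the classical psychrometer equation e = es(t_wet) - gamma * P * (t_dry - t_wet). The constants below are standard textbook values, assumed here for illustration.

        import math

        def rh_from_psychrometer(t_dry_c, t_wet_c, p_hpa=1013.25, gamma=0.000662):
            """Relative humidity (%) from dry/wet bulb temperatures (deg C)."""
            es = lambda t: 6.112 * math.exp(17.62 * t / (243.12 + t))  # hPa, Magnus
            e = es(t_wet_c) - gamma * p_hpa * (t_dry_c - t_wet_c)      # vapour pressure
            return 100.0 * e / es(t_dry_c)

        print(rh_from_psychrometer(25.0, 18.0))   # roughly 50 %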

  4. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method.

    PubMed

    Chen, Jiunyuan; Chen, Chiachung

    2017-02-14

    The most common and cheapest indirect technique to measure relative humidity is a psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity was evaluated for this indirect method with several empirical equations for calculating relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15-50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis; this equation can be computed using a calculator. The average predictive error of relative humidity with this new equation was <0.1%. The measurement uncertainty of the relative humidity as affected by the accuracy of the dry and wet bulb temperatures was evaluated, and numeric values of the measurement uncertainty were obtained for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty.

  5. Estimate of uncertainties in polarized parton distributions

    NASA Astrophysics Data System (ADS)

    Miyama, M.; Goto, Y.; Hirai, M.; Kobayashi, H.; Kumano, S.; Morii, T.; Saito, N.; Shibata, T.-A.; Yamanishi, T.

    2001-10-01

    From \\chi^2 analysis of polarized deep inelastic scattering data, we determined polarized parton distribution functions (Y. Goto et al. (AAC), Phys. Rev. D 62, 34017 (2000).). In order to clarify the reliability of the obtained distributions, we should estimate uncertainties of the distributions. In this talk, we discuss the pol-PDF uncertainties by using a Hessian method. A Hessian matrix H_ij is given by second derivatives of the \\chi^2, and the error matrix \\varepsilon_ij is defined as the inverse matrix of H_ij. Using the error matrix, we calculate the error of a function F by (δ F)^2 = sum_i,j fracpartial Fpartial ai \\varepsilon_ij fracpartial Fpartial aj , where a_i,j are the parameters in the \\chi^2 analysis. Using this method, we show the uncertainties of the pol-PDF, structure functions g_1, and spin asymmetries A_1. Furthermore, we show a role of future experiments such as the RHIC-Spin. An important purpose of planned experiments in the near future is to determine the polarized gluon distribution function Δ g (x) in detail. We reanalyze the pol-PDF uncertainties including the gluon fake data which are expected to be given by the upcoming experiments. From this analysis, we discuss how much the uncertainties of Δ g (x) can be improved by such measurements.

  6. Uncertainty analysis of the nonideal competitive adsorption-donnan model: effects of dissolved organic matter variability on predicted metal speciation in soil solution.

    PubMed

    Groenenberg, Jan E; Koopmans, Gerwin F; Comans, Rob N J

    2010-02-15

    Ion binding models such as the nonideal competitive adsorption-Donnan model (NICA-Donnan) and Model VI successfully describe laboratory data of proton and metal binding to purified humic substances (HS). In this study, model performance was tested in more complex natural systems. The speciation predicted with the NICA-Donnan model, and the associated uncertainty, were compared with independent measurements in soil solution extracts, including the free metal ion activity and the fulvic (FA) and humic acid (HA) fractions of dissolved organic matter (DOM). Potentially important sources of uncertainty are the DOM composition and the variation in binding properties of HS. HS fractions of DOM in the soil solution extracts varied between 14 and 63% and consisted mainly of FA. Moreover, binding parameters optimized for individual FA samples show substantial variation. Monte Carlo simulations show that uncertainties in predicted metal speciation, for metals with a high affinity for FA (Cu, Pb), are largely due to the natural variation in binding properties (i.e., the affinity) of FA. Predictions for metals with a lower affinity (Cd) are more prone to uncertainties in the fraction of FA in DOM and the maximum site density (i.e., the capacity) of the FA. Based on these findings, suggestions are provided to reduce uncertainties in model predictions.

  7. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-10-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models, as well as greenhouse gas scenarios, are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure what is referred to here as AHEAD (Adequate Human livelihood conditions for wEll-being And Development). Based on a trans-disciplinary sample of concepts addressing human well-being and livelihoods, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows for the uncertainty of climate and impact model projections to be identified and differentiated. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that livelihood conditions are compromised by water scarcity in 34 countries. However, more often, AHEAD fulfilment is limited through other elements. The analysis shows that the water-specific uncertainty ranges of the model output are outside relevant thresholds for AHEAD for 65 out of 111 countries, and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. In 46 of the countries in the analysis, water-specific uncertainty is relevant to AHEAD. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable for policy decisions.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer; Clifton, Andrew; Bonin, Timothy

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict errors in lidar-measured wind speed. The results show how uncertainty varies over time and can be used to help select data with different levels of uncertainty for different applications, for example, low uncertainty data for power performance testing versus all data for plant performance monitoring.

  9. Spatial variability versus parameter uncertainty in freshwater fate and exposure factors of chemicals.

    PubMed

    Nijhof, Carl O P; Huijbregts, Mark A J; Golsteijn, Laura; van Zelm, Rosalie

    2016-04-01

    We compared the influence of spatial variability in environmental characteristics and of the uncertainty in measured substance properties of seven chemicals on freshwater fate factors (FFs), representing the residence time in the freshwater environment, and on exposure factors (XFs), representing the dissolved fraction of a chemical. The influence of spatial variability was quantified using the SimpleBox model, in which Europe was divided into 100 × 100 km regions, nested in a regional (300 × 300 km) and supra-regional (500 × 500 km) scale. Uncertainty in substance properties was quantified by means of probabilistic modelling. Spatial variability and parameter uncertainty were expressed by the ratio k of the 95%ile and 5%ile of the FF and XF. Our analysis shows that the spatial variability in FFs of persistent chemicals that partition predominantly into one environmental compartment was up to 2 orders of magnitude larger than the uncertainty. For the other (less persistent) chemicals, uncertainty in the FF was up to 1 order of magnitude larger than spatial variability. Variability and uncertainty in the freshwater XFs of the seven chemicals were negligible (k < 1.5). We found that, depending on the chemical and emission scenario, accounting for region-specific environmental characteristics in multimedia fate modelling, as well as accounting for parameter uncertainty, can have a significant influence on freshwater fate factor predictions. Therefore, we conclude that fate factors should account not only for parameter uncertainty but for spatial variability as well, as this further increases the reliability of ecotoxicological impacts in LCA.

  10. Transfer Standard Uncertainty Can Cause Inconclusive Inter-Laboratory Comparisons

    PubMed Central

    Wright, John; Toman, Blaza; Mickan, Bodo; Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens

    2016-01-01

    Inter-laboratory comparisons use the best available transfer standards to check the participants’ uncertainty analyses, identify underestimated uncertainty claims or unknown measurement biases, and improve the global measurement system. For some measurands, instability of the transfer standard can lead to an inconclusive comparison result. If the transfer standard uncertainty is large relative to a participating laboratory’s uncertainty, the commonly used standardized degree of equivalence ≤ 1 criterion does not always correctly assess whether a participant is working within their uncertainty claims. We show comparison results that demonstrate this issue and propose several criteria for assessing a comparison result as passing, failing, or inconclusive. We investigate the behavior of the standardized degree of equivalence and alternative comparison measures for a range of values of the transfer standard uncertainty relative to the individual laboratory uncertainty values. The proposed alternative criteria successfully discerned between passing, failing, and inconclusive comparison results for the cases we examined.
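
    The criterion under discussion is simple to compute: for a laboratory value x_lab with expanded uncertainty U_lab and a reference x_ref with U_ref, the standardized degree of equivalence is E_n = (x_lab - x_ref) / sqrt(U_lab^2 + U_ref^2), with |E_n| <= 1 conventionally read as "passing". The toy numbers below are assumptions, not values from the paper; they show how a large transfer-standard uncertainty can mask a real laboratory bias.

        import math

        def standardized_doe(x_lab, u_lab, x_ref, u_ref):
            """Standardized degree of equivalence (E_n) from expanded uncertainties."""
            return (x_lab - x_ref) / math.sqrt(u_lab ** 2 + u_ref ** 2)

        # a 2 % biased lab still "passes" when the reference uncertainty is large
        print(standardized_doe(10.20, 0.05, 10.00, 0.30))   # ~0.66, i.e. |E_n| < 1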

  11. Uncertainty in prostate cancer. Ethnic and family patterns.

    PubMed

    Germino, B B; Mishel, M H; Belyea, M; Harris, L; Ware, A; Mohler, J

    1998-01-01

    Prostate cancer occurs 37% more often in African-American men than in white men. Patients and their family care providers (FCPs) may have different experiences of cancer and its treatment. This report addresses two questions: 1) What is the relationship of uncertainty to family coping, psychological adjustment to illness, and spiritual factors? and 2) Are these patterns of relationship similar for patients and their family care givers and for whites and African-Americans? A sample of white and African-American men and their family care givers (N = 403) was drawn from an ongoing study, testing the efficacy of an uncertainty management intervention with men with stage B prostate cancer. Data were collected at study entry, either 1 week after post-surgical catheter removal or at the beginning of primary radiation treatment. Measures of uncertainty, adult role behavior, problem solving, social support, importance of God in one's life, family coping, psychological adjustment to illness, and perceptions of health and illness met standard criteria for internal consistency. Analyses of baseline data using Pearson's product moment correlations were conducted to examine the relationships of person, disease, and contextual factors to uncertainty. For family coping, uncertainty was significantly and positively related to two domains in white family care providers only. In African-American and white family care providers, the more uncertainty experienced, the less positive they felt about treatment. Uncertainty for all care givers was related inversely to positive feelings about the patient recovering from the illness. For all patients and for white family members, uncertainty was related inversely to the quality of the domestic environment. For everyone, uncertainty was related inversely to psychological distress. Higher levels of uncertainty were related to a poorer social environment for African-American patients and for white family members. For white patients and their family members, higher levels of uncertainty were related to lower scores on adult role behavior (shopping, running errands). For white family members, higher levels of uncertainty were related to less active problem solving and less perceived social support. Finally, higher levels of uncertainty were related to the importance of God for white patients and family care providers. The clearest finding of the present study is that there are ethnic differences in the relationship of uncertainty to a number of quality-of-life and coping variables. This has immediate implications for the assessment of psychosocial responses to cancer and cancer treatment. Much of what is in curricula is based on clinical and research experience primarily with white individuals. The experience of uncertainty related to cancer and its treatment is influenced by the cultural perspectives of patients and their families. To assist patients and families with the inevitable uncertainties of the cancer experience, healthcare providers need to reconsider their ethnocentric assumptions and develop more skill in assessing patient and family beliefs, values, cultural perspectives, and the influence of these on patient and family uncertainties.

  12. The importance of hydrological uncertainty assessment methods in climate change impact studies

    NASA Astrophysics Data System (ADS)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-08-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies, a quasi-standard methodology has emerged, to a large extent shaped by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this, we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined which sources of uncertainty could be identified at all. This demonstrates that one can arrive at rather different conclusions about the causes behind predictive uncertainty for the same hydrological model and calibration data when considering different objective functions for calibration.

  13. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    NASA Astrophysics Data System (ADS)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits non-monotonic dynamical behavior; that is, the amount of uncertainty first inflates and subsequently decreases with growing decoherence strength in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initial state shared beforehand. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer insight into the dynamics and steering of entropic uncertainty in open systems.

  14. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  15. Accuracy assessment for a multi-parameter optical calliper in on line automotive applications

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2017-08-01

    In this work, a methodological approach based on the evaluation of measurement uncertainty is applied to an experimental test case from the automotive sector. The uncertainty model for different measurement procedures of a high-accuracy optical gauge is discussed in order to identify the best measuring performance of the system for on-line applications, as measurement requirements become more stringent. In particular, with reference to the industrial production and control strategies for high-performing turbochargers, two uncertainty models to be used with the optical calliper are proposed, discussed and compared. The models are based on an integrated approach between measurement methods and production best practices to emphasize their mutual coherence. The paper shows the possible advantages deriving from measurement uncertainty modelling, which helps keep the propagation of uncertainty under control across all the indirect measurements useful for production statistical control, on which further improvements can be based.

  16. Different methodologies to quantify uncertainties of air emissions.

    PubMed

    Romano, Daniela; Bernetti, Antonella; De Lauretis, Riccardo

    2004-10-01

    Characterization of the uncertainty associated with air emission estimates is of critical importance, especially in the compilation of air emission inventories. In this paper, two different theories are discussed and applied to evaluate air emissions uncertainty. In addition to numerical analysis, which is also recommended in the framework of the United Nations Convention on Climate Change guidelines with reference to Monte Carlo and Bootstrap simulation models, fuzzy analysis is also proposed. The methodologies are discussed and applied to an Italian example case study. Air concentration values are measured from two electric power plants: a coal plant consisting of two boilers, and a fuel oil plant of four boilers; the pollutants considered are sulphur dioxide (SO(2)), nitrogen oxides (NO(X)), carbon monoxide (CO) and particulate matter (PM). Monte Carlo, Bootstrap and fuzzy methods have been applied to estimate the uncertainty of these data. Regarding Monte Carlo, the most accurate results apply to Gaussian distributions; a good approximation is also observed for other distributions with almost regular features, either positively or negatively asymmetrical. Bootstrap, on the other hand, gives a good uncertainty estimation for irregular and asymmetrical distributions. Fuzzy analysis follows a different logic: data are represented as vague and indefinite, in opposition to the traditional conception of neatness, certain classification and exactness of the data. In addition to randomness (stochastic variability) only, fuzzy theory deals with imprecision (vagueness) of data. The fuzzy variance of the data set was calculated; the results cannot be directly compared with empirical data, but the overall performance of the theory is analysed. Fuzzy theory may appear more suitable for qualitative reasoning than for a quantitative estimation of uncertainty, but it is well suited when little information and few measurements are available and when distributions of the data are not properly known.
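
    As a concrete illustration of the bootstrap method discussed in this record, the following minimal Python sketch resamples a set of stack-concentration measurements to obtain a confidence interval for the mean; the data values, sample size and confidence level are illustrative assumptions, not figures from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical SO2 stack-concentration measurements (mg/m^3); the values
    # are illustrative placeholders, not data from the two plants studied.
    so2 = np.array([412.0, 398.5, 405.2, 421.7, 389.9, 415.3, 401.8, 408.6])

    # Bootstrap: resample the data with replacement many times and keep the
    # statistic of interest (here the mean) from each resample.
    boot_means = np.array([
        rng.choice(so2, size=so2.size, replace=True).mean()
        for _ in range(10_000)
    ])

    # 95% bootstrap confidence interval for the mean emission concentration.
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean = {so2.mean():.1f} mg/m^3, 95% CI = [{lo:.1f}, {hi:.1f}]")
    ```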

  17. A high-precision velocity measuring system design for projectiles based on S-shaped laser screen

    NASA Astrophysics Data System (ADS)

    Liu, Huayi; Qian, Zheng; Yu, Hao; Li, Yutao

    2018-03-01

    The high-precision measurement of the velocity of a high-speed flying projectile is of great significance for the evaluation and development of modern weapons. The velocity of such a projectile is usually measured by a laser screen velocity measuring system, but this method cannot provide repeated measurements, so an in-depth evaluation of the measuring system's uncertainty is not possible. This paper presents a velocity measuring system based on an S-shaped laser screen. The design allows repeated measurements and can therefore effectively reduce the uncertainty of the velocity measuring system. In addition, we made a detailed analysis of the uncertainty of the measuring system. The measurement uncertainty is 0.2% when the velocity of the projectile is about 200 m/s.
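
    A headline figure like 0.2% follows from standard propagation for a velocity computed as screen spacing over transit time. A brief Python sketch of that propagation, with illustrative numbers (not the paper's) for the spacing, timing and their uncertainties:

    ```python
    import numpy as np

    # Illustrative numbers only (not the paper's): velocity from v = s/t, with
    # s the screen spacing and t the transit time of a ~200 m/s projectile.
    s, u_s = 2.000, 0.002      # spacing and its standard uncertainty (m)
    t = s / 200.0              # transit time (s)
    u_t = 2.0e-6               # timing standard uncertainty (s)

    v = s / t
    # First-order propagation for v = s/t: relative variances add in quadrature.
    u_v_rel = np.sqrt((u_s / s) ** 2 + (u_t / t) ** 2)
    print(f"v = {v:.1f} m/s, relative uncertainty = {100 * u_v_rel:.3f} %")

    # n independent transits (the point of the S-shaped screen layout) shrink
    # the random part of the uncertainty by a factor of sqrt(n).
    n = 4
    print(f"after {n} repeats: {100 * u_v_rel / np.sqrt(n):.3f} %")
    ```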

  18. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for the hydrological sciences. Conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments have recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique under given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant to within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference, or a sensitivity analysis to the fixed parameters of the streamgauging technique, remains very useful for estimating the uncertainty related to the (non-quantified) bias correction. In the absence of a reference, the uncertainty estimate is referenced to the average of all discharge measurements in the interlaboratory experiment, ignoring the technique bias. Simple equations can be used to assess the uncertainty of the uncertainty results as a function of the number of participants and of repeated measurements. The interlaboratory method was applied to several interlaboratory experiments on ADCPs and current meters mounted on wading rods, in streams of different sizes and aspects, typically with 10 to 30 instruments. The uncertainty results were consistent with usual expert judgment and depended strongly on the measurement environment. The expanded uncertainties (95% probability interval) were approximately ±5% to ±10% for ADCPs in good or poor conditions, and ±10% to ±15% for current meters in shallow creeks. Due to the specific limitations related to a slow measurement process and to small, natural streams, uncertainty results for current meters were more uncertain than for ADCPs, for which the site-specific errors were clearly evidenced. The proposed method can be applied, in a standardized way, to a wide range of interlaboratory experiments conducted in contrasted environments for different streamgauging techniques.
Ideally, an international open database would enhance the investigation of hydrological data uncertainties, according to the characteristics of the measurement conditions and procedures. Such a dataset could be used for implementing and validating uncertainty propagation methods in hydrometry.
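
    For readers who want the flavor of the "simple equations" referred to above, here is a rough numpy sketch of an ISO 5725-style analysis of one interlaboratory gauging session, with invented data; the actual method in the ISO documents includes further refinements (outlier screening, uncertainty of the uncertainty) that are omitted here.

    ```python
    import numpy as np

    # Hypothetical interlaboratory data: rows = participants (instrument +
    # operator), columns = repeated gaugings of the same steady discharge
    # (m^3/s). Values are illustrative, not data from the reported experiments.
    q = np.array([
        [10.2, 10.4, 10.1],
        [ 9.8,  9.9, 10.0],
        [10.6, 10.5, 10.7],
        [10.0, 10.1,  9.9],
    ])
    n_rep = q.shape[1]

    # With no traceable discharge reference, the reference value is the grand
    # average of all measurements, so the technique bias is ignored.
    grand_mean = q.mean()

    # Repeatability variance: pooled within-participant variance.
    s_r2 = q.var(axis=1, ddof=1).mean()
    # Between-participant variance, corrected for the repeatability
    # contribution to the scatter of the participant means (ISO 5725-2 style).
    s_L2 = max(q.mean(axis=1).var(ddof=1) - s_r2 / n_rep, 0.0)

    s_R = np.sqrt(s_r2 + s_L2)   # reproducibility standard deviation
    U = 2 * s_R                  # expanded uncertainty, k = 2 (~95%)
    print(f"Q = {grand_mean:.2f} m^3/s, U = ±{100 * U / grand_mean:.1f} %")
    ```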

  19. Traceable measurements of the electrical parameters of solid-state lighting products

    NASA Astrophysics Data System (ADS)

    Zhao, D.; Rietveld, G.; Braun, J.-P.; Overney, F.; Lippert, T.; Christensen, A.

    2016-12-01

    In order to perform traceable measurements of the electrical parameters of solid-state lighting (SSL) products, it is necessary to define the measurement procedures in a technically adequate way and to identify the relevant uncertainty sources. The presently published written standard for SSL products specifies test conditions but does not explain how adequate these test conditions are; more specifically, both an identification of uncertainty sources and a quantitative uncertainty analysis are absent. This paper fills that gap in the present written standard. New uncertainty sources with respect to conventional lighting sources are determined and their effects are quantified. It is shown that for power measurements, the main uncertainty sources are temperature deviation, power supply voltage distortion, and instability of the SSL product. For current RMS measurements, the influence of bandwidth, shunt resistor, power supply source impedance and AC frequency flatness are significant as well. The measurement uncertainty depends not only on the test equipment but is also a function of the characteristics of the device under test (DUT), for example its current harmonics spectrum and input impedance. Therefore, an online calculation tool is provided to help non-electrical experts. Following our procedures, unrealistic uncertainty estimations, unnecessary procedures and expensive equipment can be avoided.

  20. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    NASA Technical Reports Server (NTRS)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  1. Uncertainty evaluation of thickness and warp of a silicon wafer measured by a spectrally resolved interferometer

    NASA Astrophysics Data System (ADS)

    Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik

    2018-06-01

    Evaluation of the uncertainty of the thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of the measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. In the evaluation, correlations between input quantities, as well as the uncertainty attributed to thermal effects, which were not included in earlier publications, are taken into account. The temperature dependence of the group refractive index of silicon was found to be nonlinear and to vary widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.

  2. Conditional uncertainty principle

    NASA Astrophysics Data System (ADS)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  3. Particle image velocimetry correlation signal-to-noise ratio metrics and measurement uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Xue, Zhenyu; Charonko, John J.; Vlachos, Pavlos P.

    2014-11-01

    In particle image velocimetry (PIV) the measurement signal is contained in the recorded intensity of the particle image pattern superimposed on a variety of noise sources. The signal-to-noise ratio (SNR) strength governs the resulting PIV cross correlation and ultimately the accuracy and uncertainty of the resulting PIV measurement. Hence we posit that correlation SNR metrics calculated from the correlation plane can be used to quantify the quality of the correlation and the resulting uncertainty of an individual measurement. In this paper we extend the original work by Charonko and Vlachos and present a framework for evaluating the correlation SNR using a set of different metrics, which in turn are used to develop models for uncertainty estimation. Several corrections have been applied in this work. The SNR metrics and corresponding models presented herein are expanded to be applicable to both standard and filtered correlations by applying a subtraction of the minimum correlation value to remove the effect of the background image noise. In addition, the notion of a ‘valid’ measurement is redefined with respect to the correlation peak width in order to be consistent with uncertainty quantification principles and distinct from an ‘outlier’ measurement. Finally, the type and significance of the error distribution function is investigated. These advancements lead to more robust and reliable uncertainty estimation models compared with the original work by Charonko and Vlachos. The models are tested against both synthetic benchmark data as well as experimental measurements. In this work, U68.5 uncertainties are estimated at the 68.5% confidence level while U95 uncertainties are estimated at the 95% confidence level. For all cases the resulting calculated coverage factors approximate the expected theoretical confidence intervals, thus demonstrating the applicability of these new models for estimation of uncertainty for individual PIV measurements.
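
    The abstract refers to several correlation-plane SNR metrics; one of the simplest, the primary peak ratio (PPR), can be sketched as below. The synthetic correlation plane, exclusion-window size and noise level are invented for illustration, and turning the metric into an uncertainty estimate requires the calibrated models developed in the paper, which are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def primary_peak_ratio(corr, exclude=3):
        # PPR: tallest correlation peak over the tallest value outside a small
        # exclusion window around it, after subtracting the plane minimum
        # (mirroring the minimum-subtraction correction described above).
        c = corr - corr.min()
        i, j = np.unravel_index(np.argmax(c), c.shape)
        masked = c.copy()
        masked[max(0, i - exclude):i + exclude + 1,
               max(0, j - exclude):j + exclude + 1] = 0.0
        return c[i, j] / masked.max()

    # Synthetic correlation plane: a displacement peak, a weaker spurious
    # peak, and background noise (all values illustrative).
    y, x = np.mgrid[0:64, 0:64]
    plane = np.exp(-((x - 40.0)**2 + (y - 24.0)**2) / 8.0)
    plane += 0.25 * np.exp(-((x - 12.0)**2 + (y - 50.0)**2) / 8.0)
    plane += 0.05 * rng.random((64, 64))

    print(f"PPR = {primary_peak_ratio(plane):.1f}")
    ```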

  4. SU-G-BRB-14: Uncertainty of Radiochromic Film Based Relative Dose Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devic, S; Tomic, N; DeBlois, F

    2016-06-15

    Purpose: Due to its inherently non-linear dose response, measurement of relative dose distribution with radiochromic film requires measurement of absolute dose using a calibration curve, following a previously established reference dosimetry protocol. On the other hand, a functional form that converts the inherently non-linear dose response curve of the radiochromic film dosimetry system into a linear one has been proposed recently [Devic et al, Med. Phys. 39 4850–4857 (2012)]. However, the question remains what the uncertainty of such a measured relative dose would be. Methods: If the relative dose distribution is determined by going through the reference dosimetry system (conversion of the response into absolute dose using the calibration curve), the total uncertainty of the relative dose is calculated by summing in quadrature the total uncertainties of the doses measured at a given point and at the reference point. On the other hand, if the relative dose is determined using the linearization method, the new response variable is calculated as ζ = a(netOD)^n/ln(netOD). In this case, the total uncertainty in relative dose is calculated by summing in quadrature the uncertainties of the new response function (σζ) at a given point and at the reference point. Results: Except at very low doses, where the measurement uncertainty dominates, the total relative dose uncertainty is less than 1% for the linear response method, as compared to an almost 2% uncertainty level for the reference dosimetry method. The result is not surprising, having in mind that the total uncertainty of the reference dose method is dominated by the fitting uncertainty, which is mitigated in the case of the linearization method. Conclusion: Linearization of the radiochromic film dose response provides a convenient and more precise method for relative dose measurements, as it does not require reference dosimetry and the creation of a calibration curve. However, the linearity of the newly introduced function must be verified. Dave Lewis is inventor and runs a consulting company for radiochromic films.
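
    To make the quadrature bookkeeping concrete, the sketch below propagates a netOD uncertainty through the quoted linearized response ζ = a(netOD)^n/ln(netOD) and combines the measurement and reference points in quadrature; the parameter values a and n and the netOD readings are placeholders, not a real film calibration.

    ```python
    import numpy as np

    # Linearized response quoted in the abstract: zeta = a*(netOD)**n/ln(netOD).
    # a, n and the netOD values below are illustrative placeholders.
    a, n = 1.0, 1.5

    def zeta(net_od):
        return a * net_od**n / np.log(net_od)

    def u_zeta(net_od, u_net_od):
        # First-order propagation: |d(zeta)/d(netOD)| * u(netOD).
        dzeta = a * net_od**(n - 1) * (n * np.log(net_od) - 1) / np.log(net_od)**2
        return abs(dzeta) * u_net_od

    # Relative dose = zeta at the measurement point over zeta at the reference
    # point; the two relative uncertainties are summed in quadrature.
    od, od_ref, u_od = 0.35, 0.60, 0.005
    rel = zeta(od) / zeta(od_ref)
    u_rel = abs(rel) * np.sqrt((u_zeta(od, u_od) / zeta(od))**2
                               + (u_zeta(od_ref, u_od) / zeta(od_ref))**2)
    print(f"relative dose = {rel:.3f} +/- {u_rel:.3f}")
    ```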

  5. When autocratic leaders become an option--uncertainty and self-esteem predict implicit leadership preferences.

    PubMed

    Schoel, Christiane; Bluemke, Matthias; Mueller, Patrick; Stahlberg, Dagmar

    2011-09-01

    We investigated the impact of uncertainty on leadership preferences and propose that the conjunction of self-esteem level and stability is an important moderator in this regard. Self-threatening uncertainty is aversive and activates the motivation to regain control. People with high and stable self-esteem should be confident of achieving this goal by self-determined amelioration of the situation and should therefore show a stronger preference for democratic leadership under conditions of uncertainty. By contrast, people with low and unstable self-esteem should place their trust and hope in the abilities of powerful others, resulting in a preference for autocratic leadership. Studies 1a and 1b validate explicit and implicit leadership measures and demonstrate a general prodemocratic default attitude under conditions of certainty. Studies 2 and 3 reveal a democratic reaction for individuals with stable high self-esteem and a submissive reaction for individuals with unstable low self-esteem under conditions of uncertainty. In Study 4, this pattern is cancelled out when individuals evaluate leadership styles from a leader instead of a follower perspective.

  6. Intolerance of uncertainty correlates with insula activation during affective ambiguity

    PubMed Central

    Simmons, Alan; Matthews, Scott C.; Paulus, Martin P.; Stein, Murray B.

    2009-01-01

    Intolerance of uncertainty (IU), or the increased affective response to situations with uncertain outcomes, is an important component process of anxiety disorders. Increased IU is observed in panic disorder (PD), obsessive compulsive disorder (OCD) and generalized anxiety disorder (GAD), and is thought to relate to dysfunctional behaviors and thought patterns in these disorders. Identifying which brain systems are associated with IU would contribute to a comprehensive model of anxiety processing and increase our understanding of the neurobiology of anxiety disorders. Here, we used a behavioral task, Wall of Faces (WOF), during functional magnetic resonance imaging (fMRI), which probes both affect and ambiguity, to examine the neural circuitry of IU in fourteen college-age subjects (10 female; mean age 18.8 years). All subjects completed the Intolerance of Uncertainty Scale (IUS), the Anxiety Sensitivity Index (ASI), and a measure of neuroticism (the NEO-N). IUS scores, but neither ASI nor NEO-N scores, correlated positively with activation in bilateral insula during affective ambiguity. Thus, the experience of IU during certain types of emotion processing may relate to the degree to which the bilateral insula processes uncertainty. Previously observed insula hyperactivity in individuals with anxiety disorders may therefore be directly linked to altered processing of uncertainty. PMID:18079060

  7. Uncertainty quantification in fission cross section measurements at LANSCE

    DOE PAGES

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy, resulting from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  8. The Intolerance of Uncertainty Index: Replication and Extension with an English Sample

    ERIC Educational Resources Information Center

    Carleton, R. Nicholas; Gosselin, Patrick; Asmundson, Gordon J. G.

    2010-01-01

    Intolerance of uncertainty (IU) is related to anxiety, depression, worry, and anxiety sensitivity. Earlier IU measures were criticized for psychometric instability and redundancy; alternative measures include the novel 45-item measure (Intolerance of Uncertainty Index; IUI). The IUI was developed in French with 2 parts, assessing general…

  9. Top down arsenic uncertainty measurement in water and sediments from Guarapiranga dam (Brazil)

    NASA Astrophysics Data System (ADS)

    Faustino, M. G.; Lange, C. N.; Monteiro, L. R.; Furusawa, H. A.; Marques, J. R.; Stellato, T. B.; Soares, S. M. V.; da Silva, T. B. S. C.; da Silva, D. B.; Cotrim, M. E. B.; Pires, M. A. F.

    2018-03-01

    Assessing total arsenic measurements against legal thresholds demands more than an average-and-standard-deviation approach. Accordingly, an evaluation of analytical measurement uncertainty was conducted in order to comply with legal requirements and to allow the balance of arsenic between the water and sediment compartments. A top-down approach to measurement uncertainty was applied to evaluate arsenic concentrations in water and sediments from the Guarapiranga dam (São Paulo, Brazil). Laboratory quality control data and arsenic interlaboratory test data were used in this approach to estimate the uncertainties associated with the methodology.
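
    The record does not spell out the top-down formulas used; one widely used top-down recipe (the Nordtest TR 537 approach is assumed here) combines within-laboratory reproducibility from control charts with a bias term from interlaboratory tests, as in this sketch with made-up numbers:

    ```python
    import numpy as np

    # Illustrative top-down combination (Nordtest TR 537 style) for total As;
    # all numbers are placeholders, not the laboratory's actual QC data.
    u_Rw = 3.5   # within-lab reproducibility from control charts (%)

    # Bias information from interlaboratory (proficiency) tests:
    biases = np.array([2.1, -1.4, 3.0])  # observed biases per exercise (%)
    u_cref = 1.5                         # mean uncertainty of assigned values (%)
    rms_bias = np.sqrt(np.mean(biases**2))
    u_bias = np.sqrt(rms_bias**2 + u_cref**2)

    # Combined and expanded (k = 2, ~95%) measurement uncertainty.
    u_c = np.sqrt(u_Rw**2 + u_bias**2)
    print(f"u_c = {u_c:.1f} %, U (k=2) = {2 * u_c:.1f} %")
    ```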

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

    Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements.

  11. TEMPORAL TRENDS OF BLACK CARBON CONCENTRATIONS AND REGIONAL CLIMATE FORCING IN THE SOUTHEASTERN UNITED STATES. (R825248)

    EPA Science Inventory

    The effect of black carbon (BC) on climate forcing is potentially important, but its estimates have large uncertainties due to a lack of sufficient observational data. The BC mass concentration in the southeastern US was measured at a regionally representative site, Mount Gibb...

  12. Lamb shift of electronic states in neutral muonic helium, an electron-muon-nucleus system

    NASA Astrophysics Data System (ADS)

    Karshenboim, Savely G.; Ivanov, Vladimir G.; Amusia, Miron

    2015-03-01

    Neutral muonic helium is an exotic atomic system consisting of an electron, a muon, and a nucleus. Being a three-body system, it possesses a clear hierarchy. This allows us to consider it as a hydrogenlike atom with a compound nucleus, which is, in turn, another hydrogenlike system. There are a number of corrections to the Bohr energy levels, all of which can be treated as contributions of generic hydrogenlike theory. While the form of those contributions is the same for all hydrogenlike atoms, their relative numerical importance differs from atom to atom. Here, the leading contribution to the (electronic) Lamb shift in neutral muonic helium is found in a closed analytic form together with the most important corrections. We believe that the Lamb shift in neutral muonic helium is measurable, at least through a measurement of the (electronic) 1s-2s transition. We present a theoretical prediction for the 1s-2s transition with an uncertainty of 3 ppm (9 GHz), as well as for the 2s-2p Lamb shift with an uncertainty of 1.3 GHz.

  13. Study of DNA binding sites using the Rényi parametric entropy measure.

    PubMed

    Krishnamachari, A; moy Mandal, Vijnan; Karmeshu

    2004-04-07

    Shannon's definition of uncertainty, or surprisal, has been applied extensively to measure the information content of aligned DNA sequences and to characterize DNA binding sites. In contrast to Shannon's uncertainty, this study investigates the applicability and suitability of a parametric uncertainty measure due to Rényi. It is observed that this measure also provides results in agreement with Shannon's measure, pointing to its utility in analysing DNA binding site regions. To facilitate the comparison between these uncertainty measures, a dimensionless quantity called "redundancy" has been employed. It is found that Rényi's measure at low parameter values possesses a better delineating feature of binding sites (binding regions) than Shannon's measure. The critical value of the parameter is chosen with an outlier criterion.
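
    A small Python sketch of the two uncertainty measures and the "redundancy" comparison described above, applied to one hypothetical aligned position (the base frequencies are invented for illustration):

    ```python
    import numpy as np

    def shannon(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def renyi(p, alpha):
        # Renyi entropy of order alpha (alpha > 0, alpha != 1); it tends to
        # the Shannon entropy as alpha -> 1.
        return np.log2(np.sum(p**alpha)) / (1.0 - alpha)

    # Hypothetical base frequencies (A, C, G, T) at one aligned position of a
    # binding-site region; illustrative values only.
    p = np.array([0.70, 0.10, 0.10, 0.10])
    h_max = np.log2(4.0)  # 2 bits for a 4-letter alphabet

    for label, h in [("Shannon", shannon(p)),
                     ("Renyi a=0.5", renyi(p, 0.5)),
                     ("Renyi a=2.0", renyi(p, 2.0))]:
        # "Redundancy" in the sense used above: 1 - H / H_max, dimensionless.
        print(f"{label}: H = {h:.3f} bits, redundancy = {1 - h / h_max:.3f}")
    ```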

  14. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC, including examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).

  15. Research yields precise uncertainty equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, E.H.; Ferguson, K.R.

    1987-08-03

    Results of a study of orifice-meter accuracy by Chevron Oil Field Research Co. at its Venice, La., calibration facility have important implications for natural gas custody-transfer measurement. The calibration facility, data collection, and equipment calibration were described elsewhere. This article explains the derivation of uncertainty factors and details the study's findings. The results were based on calibration of two 16-in. orifice-meter runs. The experimental data cover a beta-ratio range from 0.27 to 0.71 and a Reynolds number range from 4,000,000 to 35,000,000. Discharge coefficients were determined by comparing the orifice flow to the flow from critical-flow nozzles.

  16. PRaVDA: High Energy Physics towards proton Computed Tomography

    NASA Astrophysics Data System (ADS)

    Price, T.; PRaVDA Consortium

    2016-07-01

    Proton radiotherapy is an increasingly popular modality for treating cancers of the head and neck, and in paediatrics. To maximise the potential of proton radiotherapy it is essential to know the distribution, and more importantly the proton stopping powers, of the body tissues between the proton beam and the tumour. A stopping power map could be measured directly, and uncertainties in the treatment vastly reduced, if the patient were imaged with protons instead of conventional x-rays. Here we outline the application of technologies developed for High Energy Physics to provide clinical-quality proton Computed Tomography, thereby reducing range uncertainties and enhancing the treatment of cancer.

  17. Medical Need, Equality, and Uncertainty.

    PubMed

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality.

  18. Challenges and regulatory considerations in the acoustic measurement of high-frequency (>20 MHz) ultrasound.

    PubMed

    Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M

    2013-11-01

    This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is done on a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the corresponding levels in the Food and Drug Administration guidance document of 30% and 15%, respectively, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations made for minimizing the measurement uncertainty include implementing a mechanical positioning system that has sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.

  19. Uncertainty in hydrological signatures for gauged and ungauged catchments

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
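
    The Monte Carlo rating-curve step can be sketched compactly; the following toy example samples feasible power-law rating curves, converts a synthetic stage record to discharge, and reports the resulting uncertainty band for one signature (mean flow). All distributions and parameter values are invented for illustration, not taken from the 43 UK catchments.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical stage record (m) for one catchment; illustrative only.
    stage = rng.gamma(shape=4.0, scale=0.25, size=1000)

    # Power-law rating curve Q = a*(h - h0)**b with uncertain parameters; the
    # sampled parameter sets stand in for "multiple feasible rating curves".
    n_curves = 2000
    a = rng.normal(5.0, 0.4, n_curves)
    b = rng.normal(1.6, 0.08, n_curves)
    h0 = rng.normal(0.05, 0.01, n_curves)

    signatures = np.empty(n_curves)
    for i in range(n_curves):
        q = a[i] * np.clip(stage - h0[i], 1e-6, None) ** b[i]
        signatures[i] = q.mean()  # signature: mean flow (others work alike)

    lo, med, hi = np.percentile(signatures, [2.5, 50, 97.5])
    print(f"mean-flow signature: {med:.2f} m^3/s, 95% band [{lo:.2f}, {hi:.2f}]")
    print(f"relative uncertainty ~ ±{100 * (hi - lo) / (2 * med):.1f} %")
    ```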

  20. A hybrid finite element - statistical energy analysis approach to robust sound transmission modeling

    NASA Astrophysics Data System (ADS)

    Reynders, Edwin; Langley, Robin S.; Dijckmans, Arne; Vermeir, Gerrit

    2014-09-01

    When considering the sound transmission through a wall in between two rooms, in an important part of the audio frequency range, the local response of the rooms is highly sensitive to uncertainty in spatial variations in geometry, material properties and boundary conditions, which have a wave scattering effect, while the local response of the wall is rather insensitive to such uncertainty. For this mid-frequency range, a computationally efficient modeling strategy is adopted that accounts for this uncertainty. The partitioning wall is modeled deterministically, e.g. with finite elements. The rooms are modeled in a very efficient, nonparametric stochastic way, as in statistical energy analysis. All components are coupled by means of a rigorous power balance. This hybrid strategy is extended so that the mean and variance of the sound transmission loss can be computed as well as the transition frequency that loosely marks the boundary between low- and high-frequency behavior of a vibro-acoustic component. The method is first validated in a simulation study, and then applied for predicting the airborne sound insulation of a series of partition walls of increasing complexity: a thin plastic plate, a wall consisting of gypsum blocks, a thicker masonry wall and a double glazing. It is found that the uncertainty caused by random scattering is important except at very high frequencies, where the modal overlap of the rooms is very high. The results are compared with laboratory measurements, and both are found to agree within the prediction uncertainty in the considered frequency range.

  1. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron M; Sengupta, Manajit

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  2. Photoionization of Se+ and Se2+ Ions: Experiment and Theory

    NASA Astrophysics Data System (ADS)

    Esteves, D. A.; Sterling, N. C.; Alna'Washi, Ghassan; Aguilar, A.; Kilcoyne, A. L. D.; Balance, C. P.; Norrington, P. H.; McLaughlin, B. M.

    2007-06-01

    The determination of elemental abundances in astrophysical nebulae is highly dependent on the accuracy of the available atomic data. Numerical simulations show that derived Se abundances in ionized nebulae can be uncertain by factors of two or more from atomic data uncertainties alone. Of these uncertainties, photoionization cross section data are the most important, particularly in the near threshold region of the valence shell. Absolute photoionization cross sections for Se^+ and Se^2+ ions near their thresholds have been measured at the Advanced Light Source in Berkeley, using the merged beams photo-ion technique. Theoretical photoionization cross section calculations were performed for both of these Se ions using the state-of-the-art fully relativistic Dirac R-matrix code (DARC). The calculations show encouraging agreement with the experimental measurements. A more comprehensive set of results will be presented at the meeting.

  3. Retrieval of surface temperature by remote sensing. [of earth surface using brightness temperature of air pollutants

    NASA Technical Reports Server (NTRS)

    Gupta, S. K.; Tiwari, S. N.

    1976-01-01

    A simple procedure and computer program were developed for retrieving the surface temperature from the measurement of upwelling infrared radiance in a single spectral region in the atmosphere. The program evaluates the total upwelling radiance at any altitude in the region of the CO fundamental band (2070-2220 cm^-1) for several values of surface temperature. Actual surface temperature is inferred by interpolation of the measured upwelling radiance between the computed values of radiance for the same altitude. Sensitivity calculations were made to determine the effect of uncertainty in various surface, atmospheric and experimental parameters on the inferred value of surface temperature. It is found that the uncertainties in water vapor concentration and surface emittance are the most important factors affecting the accuracy of the inferred value of surface temperature.

  4. Application of identified sensitive physical parameters in reducing the uncertainty of numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2016-04-01

    An important source of uncertainty, which propagates into further uncertainty in numerical simulations, resides in the parameters describing physical processes in numerical models. There are many physical parameters in numerical models in the atmospheric and oceanic sciences, and it would cost a great deal to reduce the uncertainties in all of them. Therefore, finding the subset of relatively more sensitive and important parameters, and reducing the errors in those physical parameters, would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also makes it viable to apply "target observations" to reduce the uncertainties in model parameters.

  5. Entropic uncertainty from effective anticommutators

    NASA Astrophysics Data System (ADS)

    Kaniewski, Jedrzej; Tomamichel, Marco; Wehner, Stephanie

    2014-07-01

    We investigate entropic uncertainty relations for two or more binary measurements, for example, spin-1/2 or polarization measurements. We argue that the effective anticommutators of these measurements, i.e., the anticommutators evaluated on the state prior to measuring, are an expedient measure of measurement incompatibility. Based on the knowledge of pairwise effective anticommutators we derive a class of entropic uncertainty relations in terms of conditional Rényi entropies. Our uncertainty relations are formulated in terms of effective measures of incompatibility, which can be certified in a device-independent fashion. Consequently, we discuss potential applications of our findings to device-independent quantum cryptography. Moreover, to investigate the tightness of our analysis we consider the simplest (and very well studied) scenario of two measurements on a qubit. We find that our results outperform the celebrated bound due to Maassen and Uffink [Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103] and provide an analytical expression for the minimum uncertainty which also outperforms some recent bounds based on majorization.
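
    As a numerical aside to the two-measurement qubit scenario, the sketch below evaluates H(X) + H(Z) for σx and σz measurements on an example pure state and checks it against the Maassen-Uffink bound of 1 bit mentioned above; the state is an arbitrary illustrative choice.

    ```python
    import numpy as np

    def shannon(p):
        p = p[p > 1e-12]
        return -np.sum(p * np.log2(p))

    # Example qubit state |psi> = cos(theta)|0> + sin(theta)|1> (illustrative).
    theta = np.pi / 8
    psi = np.array([np.cos(theta), np.sin(theta)])

    # Eigenbases of sigma_z and sigma_x (rows are basis vectors).
    z_basis = np.eye(2)
    x_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

    p_z = np.abs(z_basis @ psi) ** 2   # outcome probabilities for sigma_z
    p_x = np.abs(x_basis @ psi) ** 2   # outcome probabilities for sigma_x

    # Maassen-Uffink: H(X) + H(Z) >= -log2 max_ij |<x_i|z_j>|^2 = 1 bit here.
    c = np.max(np.abs(x_basis @ z_basis.conj().T) ** 2)
    print(f"H(X)+H(Z) = {shannon(p_x) + shannon(p_z):.3f} bits"
          f"  >=  bound {-np.log2(c):.3f} bits")
    ```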

  6. Efficient Optimization of Stimuli for Model-Based Design of Experiments to Resolve Dynamical Uncertainty.

    PubMed

    Mdluli, Thembi; Buzzard, Gregery T; Rundell, Ann E

    2015-09-01

    This model-based design of experiments (MBDOE) method determines the input magnitudes of experimental stimuli to apply and the associated measurements that should be taken to optimally constrain the uncertain dynamics of a biological system under study. The ideal global solution for this experiment design problem is generally computationally intractable because of parametric uncertainties in the mathematical model of the biological system. Others have addressed this issue by limiting the solution to a local estimate of the model parameters. Here we present an approach that is independent of the local parameter constraint. This approach is made computationally efficient and tractable by the use of: (1) sparse grid interpolation that approximates the biological system dynamics, (2) representative parameters that uniformly represent the data-consistent dynamical space, and (3) probability weights of the represented experimentally distinguishable dynamics. Our approach identifies data-consistent representative parameters using sparse grid interpolants, constructs the optimal input sequence from a greedy search, and defines the associated optimal measurements using a scenario tree. We explore the optimality of this MBDOE algorithm using a 3-dimensional Hes1 model and a 19-dimensional T-cell receptor model. The 19-dimensional T-cell model also demonstrates the MBDOE algorithm's scalability to higher dimensions. In both cases, the dynamical uncertainty region that bounds the trajectories of the target system states was reduced by as much as 86% and 99%, respectively, after completing the designed experiments in silico. Our results suggest that for resolving dynamical uncertainty, the ability to design an input sequence paired with its associated measurements is particularly important when limited by the number of measurements.

  7. Efficient Optimization of Stimuli for Model-Based Design of Experiments to Resolve Dynamical Uncertainty

    PubMed Central

    Mdluli, Thembi; Buzzard, Gregery T.; Rundell, Ann E.

    2015-01-01

    This model-based design of experiments (MBDOE) method determines the input magnitudes of experimental stimuli to apply and the associated measurements that should be taken to optimally constrain the uncertain dynamics of a biological system under study. The ideal global solution for this experiment design problem is generally computationally intractable because of parametric uncertainties in the mathematical model of the biological system. Others have addressed this issue by limiting the solution to a local estimate of the model parameters. Here we present an approach that is independent of the local parameter constraint. This approach is made computationally efficient and tractable by the use of: (1) sparse grid interpolation that approximates the biological system dynamics, (2) representative parameters that uniformly represent the data-consistent dynamical space, and (3) probability weights of the represented experimentally distinguishable dynamics. Our approach identifies data-consistent representative parameters using sparse grid interpolants, constructs the optimal input sequence from a greedy search, and defines the associated optimal measurements using a scenario tree. We explore the optimality of this MBDOE algorithm using a 3-dimensional Hes1 model and a 19-dimensional T-cell receptor model. The 19-dimensional T-cell model also demonstrates the MBDOE algorithm's scalability to higher dimensions. In both cases, the dynamical uncertainty region that bounds the trajectories of the target system states was reduced by as much as 86% and 99%, respectively, after completing the designed experiments in silico. Our results suggest that for resolving dynamical uncertainty, the ability to design an input sequence paired with its associated measurements is particularly important when limited by the number of measurements. PMID:26379275

  8. Atmospheric Longwave Irradiance Uncertainty: Pyrgeometers Compared to an Absolute Sky-Scanning Radiometer, Atmospheric Emitted Radiance Interferometer, and Radiative Transfer Model Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Philipona, J. R.; Dutton, Ellsworth G.; Stoffel, T.

    2001-06-04

    Because atmospheric longwave radiation is one of the most fundamental elements of an expected climate change, there has been a strong interest in improving measurements and model calculations in recent years. Important questions are how reliable and consistent atmospheric longwave radiation measurements and calculations are, and what the uncertainties are. The First International Pyrgeometer and Absolute Sky-scanning Radiometer Comparison, which was held at the Atmospheric Radiation Measurement program's Southern Great Plains site in Oklahoma, answers these questions at least for midlatitude summer conditions and reflects the state of the art for atmospheric longwave radiation measurements and calculations. The 15 participating pyrgeometers were all calibration-traced standard instruments chosen from a broad international community. Two new chopped pyrgeometers also took part in the comparison. An absolute sky-scanning radiometer (ASR), which includes a pyroelectric detector and a reference blackbody source, was used for the first time as a reference standard instrument to field calibrate pyrgeometers during clear-sky nighttime measurements. Owner-provided and uniformly determined blackbody calibration factors were compared. Remarkable improvements and higher pyrgeometer precision were achieved with field calibration factors. Results of nighttime and daytime pyrgeometer precision and absolute uncertainty are presented for eight consecutive days of measurements, during which period downward longwave irradiance varied between 260 and 420 W m-2. Comparisons between pyrgeometers and the absolute ASR, the atmospheric emitted radiance interferometer, and the radiative transfer models LBLRTM and MODTRAN show a surprisingly good agreement of <2 W m-2 for nighttime atmospheric longwave irradiance measurements and calculations.

  9. Prediction and measurement of direct-normal solar irradiance: A closure experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halthore, R.N.; Schwartz, S.E.; Michalsky, J.J.

    1997-03-01

    Direct-normal solar irradiance (DNSI), the total energy in the solar spectrum incident on a plane perpendicular to the Sun's direction on a unit area at the earth's surface in unit time, depends only on the atmospheric extinction of sunlight without regard to the details of extinction--whether absorption or scattering. Here the authors describe a set of closure experiments performed in north-central Oklahoma, wherein measured atmospheric composition is input to a radiative transfer model, MODTRAN-3, to predict DNSI, which is then compared to measured values. Thirty six independent comparisons are presented; the agreement between predicted and measured values falls within the combined uncertainties in the prediction (2%) and measurement (0.2%), albeit with a slight bias (approximately 1% overprediction) that is independent of the solar zenith angle. Thus these results establish the adequacy of current knowledge of the solar spectrum and atmospheric extinction as embodied in MODTRAN-3 for use in climate models. An important consequence is the overwhelming likelihood that the atmospheric clear-sky absorption is accurately described to within comparable uncertainties.

  10. Prediction and measurement of direct-normal solar irradiance: A closure experiment

    NASA Technical Reports Server (NTRS)

    Halthore, R. N.; Schwartz, S. E.; Michalsky, J. J.; Anderson, G. P.; Ferrare, R. A.; Ten Brink, H. M.

    1997-01-01

    Direct-Normal Solar Irradiance (DNSI), the total energy in the solar spectrum incident on a plane perpendicular to the Sun's direction on a unit area at the earth's surface in unit time, depends only on the atmospheric extinction of sunlight without regard to the details of extinction-whether absorption or scattering. Here the authors describe a set of closure experiments performed in north-central Oklahoma, wherein measured atmospheric composition is input to a radiative transfer model, MODTRAN-3, to predict DNSI, which is then compared to measured values. Thirty six independent comparisons are presented; the agreement between predicted and measured values falls within the combined uncertainties in the prediction (2%) and measurement (0.2%), albeit with a slight bias (approximately 1% overprediction) that is independent of the solar zenith angle. Thus these results establish the adequacy of current knowledge of the solar spectrum and atmospheric extinction as embodied in MODTRAN-3 for use in climate models. An important consequence is the overwhelming likelihood that the atmospheric clear-sky absorption is accurately described to within comparable uncertainties.

  11. New Parallaxes for the Upper Scorpius OB Association

    NASA Astrophysics Data System (ADS)

    Donaldson, J. K.; Weinberger, A. J.; Gagné, J.; Boss, A. P.; Keiser, S. A.

    2017-11-01

    Upper Scorpius is a subgroup of the nearest OB association, Scorpius-Centaurus. Its young age makes it an important association to study star and planet formation. We present parallaxes to 52 low-mass stars in Upper Scorpius, 28 of which have full kinematics. We measure ages of the individual stars by combining our measured parallaxes with pre-main-sequence evolutionary tracks. We find a significant difference in the ages of stars with and without circumstellar disks. The stars without disks have a mean age of 4.9 ± 0.8 Myr and those with disks have an older mean age of 8.2 ± 0.9 Myr. This somewhat counterintuitive result suggests that evolutionary effects in young stars can dominate their apparent ages. We also attempt to use the 28 stars with full kinematics (i.e., proper motion, radial velocity (RV), and parallax) to trace the stars back in time to their original birthplace to obtain a trackback age. As expected, given the large measurement uncertainties on available RV measurements, we find that measurement uncertainties alone cause the group to diverge after a few Myr.

  12. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  13. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  14. Time-Resolved Particle Image Velocimetry Measurements with Wall Shear Stress and Uncertainty Quantification for the FDA Nozzle Model.

    PubMed

    Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P

    2016-03-01

    We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over the previous work through increased PIV image resolution, use of robust image processing algorithms for near-wall velocity measurements and wall shear stress calculations, and uncertainty analyses for both velocity and wall shear stress measurements. The velocity and shear stress analysis, with spatially distributed uncertainty estimates, highlights the challenges of flow quantification in medical devices and provides potential methods to overcome such challenges.

  15. Sensitivity of Emissions to Uncertainties in Residual Gas Fraction Measurements in Automotive Engines: A Numerical Study

    DOE PAGES

    Aithal, S. M.

    2018-01-01

    Initial conditions of the working fluid (air-fuel mixture) within an engine cylinder, namely, mixture composition and temperature, greatly affect the combustion characteristics and emissions of an engine. In particular, the percentage of residual gas fraction (RGF) in the engine cylinder can significantly alter the temperature and composition of the working fluid as compared with the air-fuel mixture inducted into the engine, thus affecting engine-out emissions. Accurate measurement of the RGF is cumbersome and expensive, thus making it hard to accurately characterize the initial mixture composition and temperature in any given engine cycle. This uncertainty can lead to challenges in accurately interpreting experimental emissions data and in implementing real-time control strategies. Quantifying the effects of the RGF can have important implications for the diagnostics and control of internal combustion engines. This paper reports on the use of a well-validated, two-zone quasi-dimensional model to compute the engine-out NO and CO emissions in a gasoline engine. The effect of varying the RGF on the emissions under lean, near-stoichiometric, and rich engine conditions was investigated. Numerical results show that small uncertainties (~2–4%) in the measured/computed values of the RGF can significantly affect the engine-out NO/CO emissions.

  16. Sensitivity of Emissions to Uncertainties in Residual Gas Fraction Measurements in Automotive Engines: A Numerical Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aithal, S. M.

    Initial conditions of the working fluid (air-fuel mixture) within an engine cylinder, namely, mixture composition and temperature, greatly affect the combustion characteristics and emissions of an engine. In particular, the percentage of residual gas fraction (RGF) in the engine cylinder can significantly alter the temperature and composition of the working fluid as compared with the air-fuel mixture inducted into the engine, thus affecting engine-out emissions. Accurate measurement of the RGF is cumbersome and expensive, thus making it hard to accurately characterize the initial mixture composition and temperature in any given engine cycle. This uncertainty can lead to challenges in accurately interpreting experimental emissions data and in implementing real-time control strategies. Quantifying the effects of the RGF can have important implications for the diagnostics and control of internal combustion engines. This paper reports on the use of a well-validated, two-zone quasi-dimensional model to compute the engine-out NO and CO emissions in a gasoline engine. The effect of varying the RGF on the emissions under lean, near-stoichiometric, and rich engine conditions was investigated. Numerical results show that small uncertainties (~2–4%) in the measured/computed values of the RGF can significantly affect the engine-out NO/CO emissions.

  17. The Multi-Sensor Aerosol Products Sampling System (MAPSS) for Integrated Analysis of Satellite Retrieval Uncertainties

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Petrenko, Maksym; Leptoukh, Gregory

    2010-01-01

    Among the known atmospheric constituents, aerosols represent the greatest uncertainty in climate research. Although satellite-based aerosol retrieval has practically become routine, especially during the last decade, there is often disagreement between similar aerosol parameters retrieved from different sensors, leaving users confused as to which sensors to trust for answering important science questions about the distribution, properties, and impacts of aerosols. As long as there is no consensus and the inconsistencies are not well characterized and understood, there will be no way of developing reliable climate data records from satellite aerosol measurements. Fortunately, the most globally representative well-calibrated ground-based aerosol measurements corresponding to the satellite-retrieved products are available from the Aerosol Robotic Network (AERONET). To adequately utilize the advantages offered by this vital resource, an online Multi-sensor Aerosol Products Sampling System (MAPSS) was recently developed. The aim of MAPSS is to facilitate detailed comparative analysis of satellite aerosol measurements from different sensors (Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP) based on the collocation of these data products over AERONET stations. In this presentation, we will describe the strategy of the MAPSS system, its potential advantages for the aerosol community, and the preliminary results of an integrated comparative uncertainty analysis of aerosol products from multiple satellite sensors.

  18. Uncertainty for Part Density Determination: An Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Mario Orlando

    2016-12-14

    Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these liquid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically using the NIST Uncertainty Machine, as a viable alternative method.
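
    A Monte Carlo evaluation of the kind described for the appendix can be sketched as follows, using the textbook hydrostatic-weighing relation with an air-buoyancy correction; all input values and uncertainties are invented for the example:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 200_000
        # Measured inputs with illustrative standard uncertainties
        m_a   = rng.normal(50.000, 0.001, N)      # mass in air, g
        m_w   = rng.normal(44.400, 0.001, N)      # apparent mass in water, g
        rho_w = rng.normal(0.99780, 0.00002, N)   # water density, g/cm^3
        rho_a = rng.normal(0.00118, 0.00001, N)   # air density, g/cm^3

        # Hydrostatic-weighing relation (textbook form, with buoyancy correction)
        rho_part = m_a * (rho_w - rho_a) / (m_a - m_w) + rho_a

        print(f"density    = {rho_part.mean():.5f} g/cm^3")
        print(f"u(density) = {rho_part.std(ddof=1):.5f} g/cm^3 (k=1)")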

  19. Effects of in situ stress measurement uncertainties on assessment of predicted seismic activity and risk associated with a hypothetical industrial-scale geologic CO2 sequestration operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeanne, Pierre; Rutqvist, Jonny; Wainwright, Haruko M.

    Carbon capture and storage (CCS) in geologic formations has been recognized as a promising option for reducing carbon dioxide (CO2) emissions from large stationary sources. However, the pressure buildup inside the storage formation can potentially induce slip along preexisting faults, which could lead to felt seismic ground motion and also provide pathways for brine/CO2 leakage into shallow drinking water aquifers. To assess the geomechanical stability of faults, it is of crucial importance to know the in situ state of stress. In situ stress measurements can provide some information on the stresses acting on faults but with considerable uncertainties. In this paper, we investigate how such uncertainties, as defined by the variation of stress measurements obtained within the study area, could influence the assessment of the geomechanical stability of faults and the characteristics of potential injection-induced seismic events. Our modeling study is based on a hypothetical industrial-scale carbon sequestration project assumed to be located in the Southern San Joaquin Basin in California, USA. We assess the stability of the major (25 km long) fault that bounds the sequestration site and is subjected to significant reservoir pressure changes as a result of 50 years of CO2 injection. We also present a series of geomechanical simulations in which the resolved stresses on the fault were varied over ranges of values corresponding to various stress measurements performed around the study area. The simulation results are analyzed by a statistical approach. Our main results are that the variations in resolved stresses as defined by the range of stress measurements had a negligible effect on the prediction of the seismic risk (maximum magnitude), but an important effect on the timing, the seismicity rate (number of seismic events), and the location of seismic activity.

  20. Effects of in situ stress measurement uncertainties on assessment of predicted seismic activity and risk associated with a hypothetical industrial-scale geologic CO2 sequestration operation

    DOE PAGES

    Jeanne, Pierre; Rutqvist, Jonny; Wainwright, Haruko M.; ...

    2016-10-05

    Carbon capture and storage (CCS) in geologic formations has been recognized as a promising option for reducing carbon dioxide (CO2) emissions from large stationary sources. However, the pressure buildup inside the storage formation can potentially induce slip along preexisting faults, which could lead to felt seismic ground motion and also provide pathways for brine/CO2 leakage into shallow drinking water aquifers. To assess the geomechanical stability of faults, it is of crucial importance to know the in situ state of stress. In situ stress measurements can provide some information on the stresses acting on faults but with considerable uncertainties. In this paper, we investigate how such uncertainties, as defined by the variation of stress measurements obtained within the study area, could influence the assessment of the geomechanical stability of faults and the characteristics of potential injection-induced seismic events. Our modeling study is based on a hypothetical industrial-scale carbon sequestration project assumed to be located in the Southern San Joaquin Basin in California, USA. We assess the stability of the major (25 km long) fault that bounds the sequestration site and is subjected to significant reservoir pressure changes as a result of 50 years of CO2 injection. We also present a series of geomechanical simulations in which the resolved stresses on the fault were varied over ranges of values corresponding to various stress measurements performed around the study area. The simulation results are analyzed by a statistical approach. Our main results are that the variations in resolved stresses as defined by the range of stress measurements had a negligible effect on the prediction of the seismic risk (maximum magnitude), but an important effect on the timing, the seismicity rate (number of seismic events), and the location of seismic activity.

  1. Gap Size Uncertainty Quantification in Advanced Gas Reactor TRISO Fuel Irradiation Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Binh T.; Einerson, Jeffrey J.; Hawkes, Grant L.

    The Advanced Gas Reactor (AGR)-3/4 experiment is the combination of the third and fourth tests conducted within the tristructural isotropic fuel development and qualification research program. The AGR-3/4 test consists of twelve independent capsules containing a fuel stack in the center surrounded by three graphite cylinders and shrouded by a stainless steel shell. This capsule design enables temperature control of both the fuel and the graphite rings by varying the neon/helium gas mixture flowing through the four resulting gaps. Knowledge of fuel and graphite temperatures is crucial for establishing the functional relationship between fission product release and irradiation thermal conditions. These temperatures are predicted for each capsule using the commercial finite-element heat transfer code ABAQUS. Uncertainty quantification reveals that the gap size uncertainties are among the dominant factors contributing to predicted temperature uncertainty due to high input sensitivity and uncertainty. Gap size uncertainty originates from the fact that all gap sizes vary with time due to dimensional changes of the fuel compacts and three graphite rings caused by extended exposure to high temperatures and fast neutron irradiation. Gap sizes are estimated using as-fabricated dimensional measurements at the start of irradiation and post-irradiation examination dimensional measurements at the end of irradiation. Uncertainties in these measurements provide a basis for quantifying gap size uncertainty. However, lack of gap size measurements during irradiation and lack of knowledge about the dimension change rates lead to gap size modeling assumptions, which could increase gap size uncertainty. In addition, the dimensional measurements are performed at room temperature, and must be corrected to account for thermal expansion of the materials at high irradiation temperatures. Uncertainty in the thermal expansion coefficients for the graphite materials used in the AGR-3/4 capsules also increases gap size uncertainty. This study focuses on analysis of modeling assumptions and uncertainty sources to evaluate their impacts on the gap size uncertainty.

  2. Evaluation of Uncertainties in Measuring Particulate Matter Emission Factors from Atmospheric Fugitive Sources Using Optical Remote Sensing

    NASA Astrophysics Data System (ADS)

    Yuen, W.; Ma, Q.; Du, K.; Koloutsou-Vakakis, S.; Rood, M. J.

    2015-12-01

    Measurements of particulate matter (PM) emissions generated from fugitive sources are of interest in air pollution studies, since such emissions vary widely both spatially and temporally. This research focuses on determining the uncertainties in quantifying fugitive PM emission factors (EFs) generated from mobile vehicles using a vertical scanning micro-pulse lidar (MPL). The goal of this research is to identify the greatest sources of uncertainty of the applied lidar technique in determining fugitive PM EFs, and to recommend methods to reduce the uncertainties in this measurement. The MPL detects the PM plume generated by mobile fugitive sources that is carried downwind to the MPL's vertical scanning plane. Range-resolved MPL signals are measured, corrected, and converted to light extinction coefficients, through inversion of the lidar equation and calculation of the lidar ratio. In this research, both the near-end and far-end lidar equation inversion methods are considered. Range-resolved PM mass concentrations are then determined from the extinction coefficient measurements using the measured mass extinction efficiency (MEE) value, which is an intensive PM property. MEE is determined by collocated PM mass concentration and light extinction measurements, provided respectively by a DustTrak and an open-path laser transmissometer. These PM mass concentrations are then integrated with wind information, duration of plume event, and vehicle distance travelled to obtain fugitive PM EFs. To obtain the uncertainty of PM EFs, uncertainties in MPL signals, lidar ratio, MEE, and wind variation are considered. The error propagation method is applied to each of the above intermediate steps to aggregate uncertainty sources. Results include determination of uncertainties in each intermediate step, and comparison of uncertainties between the use of near-end and far-end lidar equation inversion methods.
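
    For a multiplicative measurement chain like this one, first-order error propagation combines relative standard uncertainties in quadrature; the sketch below uses invented component values, not the study's results:

        import numpy as np

        # Illustrative relative standard uncertainties for the intermediate steps
        components = {
            "MPL signal / inversion": 0.08,
            "lidar ratio":            0.15,
            "MEE":                    0.10,
            "wind variation":         0.12,
        }

        # Relative uncertainties combine in quadrature for products/quotients
        u_rel = np.sqrt(sum(u**2 for u in components.values()))
        print(f"combined relative uncertainty of EF: {u_rel:.1%}")
        for name, u in sorted(components.items(), key=lambda kv: -kv[1]):
            print(f"  {name:24s} share of variance: {u**2 / u_rel**2:.0%}")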

  3. Inferential consequences of modeling rather than measuring snow accumulation in studies of animal ecology

    USGS Publications Warehouse

    Cross, Paul C.; Klaver, Robert W.; Brennan, Angela; Creel, Scott; Beckmann, Jon P.; Higgs, Megan D.; Scurlock, Brandon M.

    2013-01-01

    It is increasingly common for studies of animal ecology to use model-based predictions of environmental variables as explanatory or predictor variables, even though model prediction uncertainty is typically unknown. To demonstrate the potential for misleading inferences when model predictions with error are used in place of direct measurements, we compared snow water equivalent (SWE) and snow depth as predicted by the Snow Data Assimilation System (SNODAS) to field measurements of SWE and snow depth. We examined locations on elk (Cervus canadensis) winter ranges in western Wyoming, because modeled data such as SNODAS output are often used for inferences on elk ecology. Overall, SNODAS predictions tended to overestimate field measurements, prediction uncertainty was high, and the difference between SNODAS predictions and field measurements was greater in snow shadows for both snow variables compared to non-snow-shadow areas. We used a simple simulation of snow effects on the probability of an elk being killed by a predator to show that, if SNODAS prediction uncertainty was ignored, we might have mistakenly concluded that SWE was not an important factor in where elk were killed in predatory attacks during the winter. In this simulation, we were interested in the effects of snow at finer scales than the resolution of SNODAS. If bias were to decrease when SNODAS predictions are averaged over coarser scales, SNODAS would be applicable to population-level ecology studies. In our study, however, averaging predictions over moderate to broad spatial scales (9–2200 km2) did not reduce the differences between SNODAS predictions and field measurements. This study highlights the need to carefully evaluate two issues when using model output as an explanatory variable in subsequent analysis: (1) the model's resolution relative to the scale of the ecological question of interest and (2) the implications of prediction uncertainty on inferences when using model predictions as explanatory or predictor variables.

  4. Uncertainty importance analysis using parametric moment ratio functions.

    PubMed

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed for implementing the proposed importance analysis with the derived estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure for achieving a targeted 50% reduction of the model output variance.
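
    The idea of a variance ratio function can be illustrated by brute force, re-simulating a toy model for each scaled input variance; the paper's single-sample-set estimators avoid exactly this repeated sampling:

        import numpy as np

        rng = np.random.default_rng(7)

        def model(x1, x2):                      # toy nonlinear model
            return x1**2 + 5.0 * np.sin(x2)

        N, base_var = 200_000, 1.0
        for scale in (1.0, 0.75, 0.5, 0.25):    # shrink Var(x1); keep Var(x2)
            x1 = rng.normal(0.0, np.sqrt(scale * base_var), N)
            x2 = rng.normal(0.0, np.sqrt(base_var), N)
            v = model(x1, x2).var(ddof=1)
            print(f"Var(x1) scaled by {scale:4.2f} -> output variance {v:6.3f}")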

  5. A conceptual precipitation-runoff modeling suite: Model selection, calibration and predictive uncertainty assessment

    Treesearch

    Tyler Jon Smith

    2008-01-01

    In Montana and much of the Rocky Mountain West, the single most important parameter in forecasting the controls on regional water resources is snowpack. Despite the heightened importance of snowpack, few studies have considered the representation of uncertainty in coupled snowmelt/hydrologic conceptual models. Uncertainty estimation provides a direct interpretation of...

  6. Importance analysis for Hudson River PCB transport and fate model parameters using robust sensitivity studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Toll, J.; Cothern, K.

    1995-12-31

    The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977–1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. The authors considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the authors to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
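
    Of the techniques listed, rank correlation analysis is the simplest to reproduce; the sketch below applies it to a toy four-parameter model standing in for PCHEPM (the parameter names and formula are invented), flagging the influential parameters:

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        N = 5_000
        names = ["k_sorption", "k_volatilization", "k_degradation", "k_settling"]
        params = rng.uniform(0.5, 1.5, size=(N, 4))
        k_sorb, k_volat, k_degr, k_settl = params.T
        # Toy prediction; k_settling is deliberately non-influential here
        conc = 10.0 * np.exp(-k_degr) / (1.0 + k_sorb) + 0.3 * k_volat

        for i, name in enumerate(names):
            rho, _ = spearmanr(params[:, i], conc)
            print(f"{name:18s} rank correlation with prediction: {rho:+.2f}")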

  7. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.

  8. Plasmonic trace sensing below the photon shot noise limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pooser, Raphael C.; Lawrie, Benjamin J.

    Plasmonic sensors are important detectors of biochemical trace compounds, but those that utilize optical readout are approaching their absolute limits of detection as defined by the Heisenberg uncertainty principle in both differential intensity and phase readout. However, the use of more general minimum uncertainty states in the form of squeezed light can push the noise floor in these sensors below the shot noise limit (SNL) in one analysis variable at the expense of another. Here, we demonstrate a quantum plasmonic sensor whose noise floor is reduced below the SNL in order to perform index of refraction measurements with sensitivities unobtainable with classical plasmonic sensors. The increased signal-to-noise ratio can result in faster detection of analyte concentrations that were previously lost in the noise. As a result, these benefits are the hallmarks of a sensor exploiting quantum readout fields in order to manipulate the limits of the Heisenberg uncertainty principle.

  9. Plasmonic trace sensing below the photon shot noise limit

    DOE PAGES

    Pooser, Raphael C.; Lawrie, Benjamin J.

    2015-12-09

    Plasmonic sensors are important detectors of biochemical trace compounds, but those that utilize optical readout are approaching their absolute limits of detection as defined by the Heisenberg uncertainty principle in both differential intensity and phase readout. However, the use of more general minimum uncertainty states in the form of squeezed light can push the noise floor in these sensors below the shot noise limit (SNL) in one analysis variable at the expense of another. Here, we demonstrate a quantum plasmonic sensor whose noise floor is reduced below the SNL in order to perform index of refraction measurements with sensitivities unobtainable with classical plasmonic sensors. The increased signal-to-noise ratio can result in faster detection of analyte concentrations that were previously lost in the noise. As a result, these benefits are the hallmarks of a sensor exploiting quantum readout fields in order to manipulate the limits of the Heisenberg uncertainty principle.

  10. Goal-oriented Site Characterization in Hydrogeological Applications: An Overview

    NASA Astrophysics Data System (ADS)

    Nowak, W.; de Barros, F.; Rubin, Y.

    2011-12-01

    In this study, we address the importance of goal-oriented site characterization. Given the multiple sources of uncertainty in hydrogeological applications, information needs of modeling, prediction and decision support should be satisfied with efficient and rational field campaigns. In this work, we provide an overview of an optimal sampling design framework based on Bayesian decision theory, statistical parameter inference and Bayesian model averaging. It optimizes the field sampling campaign around decisions on environmental performance metrics (e.g., risk, arrival times, etc.) while accounting for parametric and model uncertainty in the geostatistical characterization, in forcing terms, and measurement error. The appealing aspects of the framework lie in its goal-oriented character and in its direct link to the confidence in a specified decision. We illustrate how these concepts could be applied in a human health risk problem where uncertainty from both hydrogeological and health parameters is accounted for.

  11. An Uncertainty-Based Distributed Fault Detection Mechanism for Wireless Sensor Networks

    PubMed Central

    Yang, Yang; Gao, Zhipeng; Zhou, Hang; Qiu, Xuesong

    2014-01-01

    Exchanging too many messages for fault detection will cause not only a degradation of the network quality of service, but also a huge burden on the limited energy of sensors. Therefore, we propose an uncertainty-based distributed fault detection mechanism for wireless sensor networks that relies on the aided judgment of neighbors. The algorithm considers the serious influence of sensing measurement loss and therefore uses Markov decision processes for filling in missing data. Most important of all, fault misjudgments caused by uncertainty conditions are the main drawback of traditional distributed fault detection mechanisms. We draw on the experience of evidence fusion rules based on information entropy theory and the degree of disagreement function to increase the accuracy of fault detection. Simulation results demonstrate that our algorithm can effectively reduce communication energy overhead due to message exchanges and provide a higher detection accuracy ratio. PMID:24776937

  12. Systematic Analysis Of Ocean Colour Uncertainties

    NASA Astrophysics Data System (ADS)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code. This was initially based on both the Antoine & Morel standard atmospheric correction, with its bright pixel correction component, and the Doerffer neural network coastal waters approach. Analysis of the atmospheric by-products showed that they yield important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  13. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
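
    An ordinary-least-squares version of the straight-line Isc fit (a simplified stand-in, not the record's objective Bayesian regression) can be sketched with synthetic I-V points; note how the fitted intercept and its standard error depend on the chosen window:

        import numpy as np

        rng = np.random.default_rng(5)
        # Synthetic I-V points near short circuit, with slight true curvature
        v = np.linspace(0.0, 0.12, 13)                     # volts
        i_true = 8.20 - 0.8 * v - 0.5 * v**2               # amps
        i_meas = i_true + rng.normal(0.0, 0.005, v.size)   # add noise

        # Straight-line fit I(V) = a*V + Isc; covariance from the regression
        (a, isc), cov = np.polyfit(v, i_meas, 1, cov=True)
        print(f"Isc = {isc:.4f} A +/- {np.sqrt(cov[1, 1]):.4f} A (fit error)")
        # A wider window lowers the fit error but raises model discrepancy,
        # because the true curve is not exactly straight.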

  14. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  15. Covariance propagation in spectral indices

    DOE PAGES

    Griffin, P. J.

    2015-01-09

    The dosimetry community has a history of using spectral indices to support neutron spectrum characterization and cross section validation efforts. An important aspect of this type of analysis is the proper consideration of the contribution of the spectrum uncertainty to the total uncertainty in calculated spectral indices (SIs). This study identifies deficiencies in the traditional treatment of the SI uncertainty, provides simple bounds for the spectral component of the SI uncertainty estimates, verifies that these estimates are reflected in actual applications, details a methodology that rigorously captures the spectral contribution to the uncertainty in the SI, and provides quantified examples that demonstrate the importance of the proper treatment of the spectral contribution to the uncertainty in the SI.

  16. TIMEKEEPING IN THE AMERICAS.

    PubMed

    López, J M; Lombardi, M A

    Time and its measurement are at the fundamental core of physics, and many scientific and technological advances are directly or indirectly related to time measurements. Timekeeping is essential to everyday life, and thus time is the most measured physical quantity in modern societies. Time can also be measured with less uncertainty and more resolution than any other physical quantity. The measurement of time is of the utmost importance for many applications, including global navigation satellite systems, communications networks, electric power generation, astronomy, electronic commerce, and national defense and security. This paper discusses how time is kept, coordinated, and disseminated in the Americas.

  17. Timekeeping in the Americas

    NASA Astrophysics Data System (ADS)

    López, J. M.; Lombardi, M. A.

    2015-10-01

    Time and its measurement are at the fundamental core of physics, and many scientific and technological advances are directly or indirectly related to time measurements. Timekeeping is essential to everyday life, and thus time is the most measured physical quantity in modern societies. Time can also be measured with less uncertainty and more resolution than any other physical quantity. The measurement of time is of the utmost importance for many applications, including global navigation satellite systems, communications networks, electric power generation, astronomy, electronic commerce, and national defense and security. This paper discusses how time is kept, coordinated, and disseminated in the Americas.

  18. TIMEKEEPING IN THE AMERICAS

    PubMed Central

    López, J. M.; Lombardi, M. A.

    2016-01-01

    Time and its measurement are at the fundamental core of physics, and many scientific and technological advances are directly or indirectly related to time measurements. Timekeeping is essential to everyday life, and thus time is the most measured physical quantity in modern societies. Time can also be measured with less uncertainty and more resolution than any other physical quantity. The measurement of time is of the utmost importance for many applications, including global navigation satellite systems, communications networks, electric power generation, astronomy, electronic commerce, and national defense and security. This paper discusses how time is kept, coordinated, and disseminated in the Americas. PMID:26973371

  19. Examples of measurement uncertainty evaluations in accordance with the revised GUM

    NASA Astrophysics Data System (ADS)

    Runje, B.; Horvatic, A.; Alar, V.; Medic, S.; Bosnjakovic, A.

    2016-11-01

    The paper presents examples of the evaluation of uncertainty components in accordance with the current and revised Guide to the Expression of Uncertainty in Measurement (GUM). In accordance with the proposed revision of the GUM, a Bayesian approach was conducted for both type A and type B evaluations. The law of propagation of uncertainty (LPU) and the law of propagation of distributions, applied through the Monte Carlo method (MCM), were used to evaluate the associated standard uncertainties, expanded uncertainties and coverage intervals. Furthermore, the influence of a non-Gaussian dominant input quantity and an asymmetric distribution of the output quantity y on the evaluation of measurement uncertainty was analyzed. In the case when the coverage interval is not probabilistically symmetric, the coverage interval for the probability P is estimated from the experimental probability density function using the Monte Carlo method. Key highlights of the proposed revision of the GUM were analyzed through a set of examples.
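
    The contrast between the LPU and the Monte Carlo propagation of distributions can be shown on a toy measurement model with a dominant rectangular (non-Gaussian) input; all values below are invented:

        import numpy as np

        rng = np.random.default_rng(11)
        # Measurement model Y = X1 / X2; X1 rectangular (dominant), X2 Gaussian
        x1_mean, x1_hw = 10.0, 0.9          # rectangular, half-width 0.9
        x2_mean, x2_sd = 2.0, 0.02

        # Law of propagation of uncertainty (first order)
        u1 = x1_hw / np.sqrt(3.0)           # std. uncertainty of rectangular input
        y = x1_mean / x2_mean
        u_y = y * np.sqrt((u1 / x1_mean)**2 + (x2_sd / x2_mean)**2)
        print(f"LPU: y = {y:.3f}, u(y) = {u_y:.3f}")

        # Monte Carlo method (propagation of distributions)
        x1 = rng.uniform(x1_mean - x1_hw, x1_mean + x1_hw, 1_000_000)
        x2 = rng.normal(x2_mean, x2_sd, 1_000_000)
        ys = x1 / x2
        lo, hi = np.percentile(ys, [2.5, 97.5])
        print(f"MCM: y = {ys.mean():.3f}, u(y) = {ys.std(ddof=1):.3f}, "
              f"95% interval [{lo:.3f}, {hi:.3f}]")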

  20. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
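
    The mechanics of such an uncertainty budget reduce to a root-sum-of-squares combination of sensitivity-weighted components; the entries below are invented placeholders, not the facility's actual budget:

        import numpy as np

        # (component, standard uncertainty u_i in dB, sensitivity coefficient c_i)
        budget = [
            ("exterior loudspeaker reproduction",  0.30, 1.0),
            ("interior rattle loudspeakers",       0.20, 1.0),
            ("microphone / position variation",    0.25, 1.0),
            ("door-induced pressure fluctuations", 0.15, 1.0),
        ]

        u_c = np.sqrt(sum((c * u)**2 for _, u, c in budget))
        for name, u, c in budget:
            print(f"{name:36s} c*u = {c*u:.2f} dB")
        print(f"combined standard uncertainty u_c = {u_c:.2f} dB")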

  1. Impulsivity modulates performance under response uncertainty in a reaching task.

    PubMed

    Tzagarakis, C; Pellizzer, G; Rogers, R D

    2013-03-01

    We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) where a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). Behavioral variables recorded were reaction time (RT), errors of commission (referred to as 'early errors') and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, the analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high uncertainty (3 cue) condition. In addition, there was a significant early error speed-accuracy trade-off for women, primarily in low uncertainty and a 'reverse' speed-accuracy trade-off for men in high uncertainty. These results extend those of past studies of impulsivity which help define it as a behavioural trait that modulates speed versus accuracy response styles depending on environmental constraints and highlight once more the importance of gender in the interplay of personality and behaviour.

  2. Profile measurements in the plasma edge of mega amp spherical tokamak using a ball pen probe

    NASA Astrophysics Data System (ADS)

    Walkden, N. R.; Adamek, J.; Allan, S.; Dudson, B. D.; Elmore, S.; Fishpool, G.; Harrison, J.; Kirk, A.; Komm, M.

    2015-02-01

    The ball pen probe (BPP) technique is used successfully to make profile measurements of plasma potential, electron temperature, and radial electric field on the Mega Amp Spherical Tokamak. The potential profile measured by the BPP is shown to differ significantly from the floating potential, both in polarity and in profile shape. By combining the BPP potential and the floating potential, the electron temperature can be measured, which is compared with the Thomson scattering (TS) diagnostic. Excellent agreement between the two diagnostics is obtained when secondary electron emission is accounted for in the floating potential. From the BPP profile, an estimate of the radial electric field is extracted, which is shown to be of the order of ~1 kV/m and to increase with plasma current. Corrections to the BPP measurement, constrained by the TS comparison, introduce uncertainty into the E_r measurements. The uncertainty is most significant in the electric field well inside the separatrix. The electric field is used to estimate toroidal and poloidal rotation velocities from E × B motion. This paper further demonstrates the ability of the ball pen probe to make valuable and important measurements in the boundary plasma of a tokamak.

  3. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making.

  4. Sediment‐associated organic matter sources and sediment oxygen demand in a Special Area of Conservation (SAC): A case study of the River Axe, UK

    PubMed Central

    Zhang, Y.; McMillan, S.; Dixon, E. R.; Stringfellow, A.; Bateman, S.; Sear, D. A.

    2017-01-01

    Oxygen demand in river substrates providing important habitats for the early life stages of aquatic ecology, including lithophilous fish, can arise due to the oxidation of sediment‐associated organic matter. Oxygen depletion associated with this component of river biogeochemical cycling will, in part, depend on the sources of such material. A reconnaissance survey was therefore undertaken to assess the relative contributions from bed sediment‐associated organic matter sources potentially impacting on the River Axe Special Area of Conservation (SAC), in SW England. Source fingerprinting, including Monte Carlo uncertainty analysis, suggested that the relative frequency‐weighted average median source contributions ranged between 19% (uncertainty range 0–82%) and 64% (uncertainty range 0–99%) for farmyard manures or slurries, 4% (uncertainty range 0–49%) and 35% (uncertainty range 0–100%) for damaged road verges, 2% (uncertainty range 0–100%) and 68% (uncertainty range 0–100%) for decaying instream vegetation, and 2% (full uncertainty range 0–15%) and 6% (uncertainty range 0–48%) for human septic waste. A reconnaissance survey of sediment oxygen demand (SOD) along the channel designated as a SAC yielded a mean SOD5 of 4 mg O2 g−1 dry sediment and a corresponding SOD20 of 7 mg O2 g−1 dry sediment, compared with respective ranges of 1–15 and 2–30 mg O2 g−1 dry sediment, measured by the authors for a range of river types across the UK. The findings of the reconnaissance survey were used in an agency (SW region) catchment appraisal exercise for informing targeted management to help protect the SAC. PMID:29527135

  5. Extended Importance Sampling for Reliability Analysis under Evidence Theory

    NASA Astrophysics Data System (ADS)

    Yuan, X. K.; Chen, B.; Zhang, B. Q.

    2018-05-01

    In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed to handle uncertainty with limited information as an alternative to traditional probability theory. In this contribution, a simulation-based approach, called ‘extended importance sampling’, is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory, and is developed to handle problems with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an ‘equivalent’ reliability problem under probability theory is obtained. Then the samples of these variables are generated by importance sampling. Based on these samples, the plausibility and belief (upper and lower bounds of probability) can be estimated. This is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
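
    The probabilistic importance sampling that the extended method builds on can be sketched for a toy limit state; the instrumental density is centered near the failure region, and each sample is reweighted by the likelihood ratio:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        g = lambda x1, x2: 6.0 - x1 - x2       # failure when g < 0
        N = 50_000

        # Instrumental N(3,1) densities centered near the design point (3, 3)
        x1 = rng.normal(3.0, 1.0, N)
        x2 = rng.normal(3.0, 1.0, N)
        # Likelihood ratio: nominal N(0,1) over instrumental N(3,1)
        w = (norm.pdf(x1) * norm.pdf(x2)) / (norm.pdf(x1, 3, 1) * norm.pdf(x2, 3, 1))
        pf = np.mean(w * (g(x1, x2) < 0))
        print(f"failure probability ~ {pf:.2e}")   # exact: Phi(-6/sqrt(2)) ~ 1.1e-5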

  6. Model-independent determination of the strong phase difference between $D^0$ and $\overline{D}{}^0 \to \pi^+\pi^-\pi^+\pi^-$ amplitudes

    NASA Astrophysics Data System (ADS)

    Harnew, Samuel; Naik, Paras; Prouve, Claire; Rademacker, Jonas; Asner, David

    2018-01-01

    For the first time, the strong phase difference between $D^0$ and $\overline{D}{}^0 \to \pi^+\pi^-\pi^+\pi^-$ amplitudes is determined in bins of the decay phase space. The measurement uses 818 pb$^{-1}$ of $e^+e^-$ collision data taken at the $\psi(3770)$ resonance and collected by the CLEO-c experiment. The measurement is important for the determination of the $CP$-violating phase $\gamma$ in $B^\pm \to DK^\pm$ (and similar) decays, where the $D$ meson (which represents a superposition of $D^0$ and $\overline{D}{}^0$) subsequently decays to $\pi^+\pi^-\pi^+\pi^-$. To obtain optimal sensitivity to $\gamma$, the phase space of the $D \to \pi^+\pi^-\pi^+\pi^-$ decay is divided into bins based on a recent amplitude model of the decay. Although an amplitude model is used to define the bins, the measurements obtained are model-independent. The $CP$-even fraction of the $D \to \pi^+\pi^-\pi^+\pi^-$ decay is determined to be $F_+^{4\pi} = 0.769 \pm 0.021 \pm 0.010$, where the uncertainties are statistical and systematic, respectively. Using simulated $B^\pm \to DK^\pm$, $D \to \pi^+\pi^-\pi^+\pi^-$ decays, it is estimated that by the end of the current LHC run, the LHCb experiment could determine $\gamma$ from this decay mode with an uncertainty of $(\pm 10 \pm 7)^\circ$, where the first uncertainty is statistical based on estimated LHCb event yields, and the second is due to the uncertainties on the parameters determined in this paper.

  7. Organizational uncertainty and stress among teachers in Hong Kong: work characteristics and organizational justice.

    PubMed

    Hassard, Juliet; Teoh, Kevin; Cox, Tom

    2017-10-01

    A growing literature now exists examining the relationship between organizational justice and employees' experience of stress. Despite the growth in this field of enquiry, there remain continued gaps in knowledge. In particular, the contribution of perceptions of justice to employees' stress within an organizational context of uncertainty and change, and in relation to the new and emerging concept of procedural-voice justice. The aim of the current study was to examine the main, interaction and additive effects of work characteristics and organizational justice perceptions to employees' experience of stress (as measured by their feelings of helplessness and perceived coping) during an acknowledged period of organizational uncertainty. Questionnaires were distributed among teachers in seven public primary schools in Hong Kong that were under threat of closure (n = 212). Work characteristics were measured using the demand-control-support model. Hierarchical regression analyses observed perceptions of job demands and procedural-voice justice to predict both teachers' feelings of helplessness and perceived coping ability. Furthermore, teacher's perceived coping was predicted by job control and a significant interaction between procedural-voice justice and distributive justice. The addition of organizational justice variables did account for unique variance, but only in relation to the measure of perceived coping. The study concludes that in addition to 'traditional' work characteristics, health promotion strategies should also address perceptions of organizational justice during times of organizational uncertainty; and, in particular, the value and importance of enhancing employee's perceived 'voice' in influencing and shaping justice-related decisions.

  8. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul; Al Hassan, Mohammad; Ring, Robert

    2017-01-01

    Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion and, with examples, will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  9. Estimation of the measurement uncertainty in magnetic resonance velocimetry based on statistical models

    NASA Astrophysics Data System (ADS)

    Bruschewski, Martin; Freudenhammer, Daniel; Buchenberg, Waltraud B.; Schiffer, Heinz-Peter; Grundmann, Sven

    2016-05-01

    Velocity measurements with magnetic resonance velocimetry offer outstanding possibilities for experimental fluid mechanics. The purpose of this study was to provide practical guidelines for the estimation of the measurement uncertainty in such experiments. Based on various test cases, it is shown that the uncertainty estimate can vary substantially depending on how the uncertainty is obtained. The conventional approach to estimate the uncertainty from the noise in the artifact-free background can lead to wrong results. A deviation of up to -75 % is observed with the presented experiments. In addition, a similarly high deviation is demonstrated with the data from other studies. As a more accurate approach, the uncertainty is estimated directly from the image region with the flow sample. Two possible estimation methods are presented.

  10. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  11. A continuous-time adaptive particle filter for estimations under measurement time uncertainties with an application to a plasma-leucine mixed effects model

    PubMed Central

    2013-01-01

    Background When mathematical modelling is applied to many different application areas, a common task is the estimation of states and parameters based on measurements. With this kind of inference making, uncertainties in the time when the measurements have been taken are often neglected, but especially in applications taken from the life sciences, this kind of error can considerably influence the estimation results. As an example in the context of personalized medicine, the model-based assessment of the effectiveness of drugs is coming to play an important role. Systems biology may help here by providing good pharmacokinetic and pharmacodynamic (PK/PD) models. Inference on these systems based on data gained from clinical studies with several patient groups becomes a major challenge. Particle filters are a promising approach to tackle these difficulties but by themselves are not ready to handle uncertainties in measurement times. Results In this article, we describe a variant of the standard particle filter (PF) algorithm which allows state and parameter estimation with the inclusion of measurement time uncertainties (MTU). The modified particle filter, which we call MTU-PF, also allows the application of an adaptive stepsize choice in the time-continuous case to avoid degeneracy problems. The modification is based on the model assumption of uncertain measurement times. While the assumption of randomness in the measurements themselves is common, the corresponding measurement times are generally taken as deterministic and exactly known. Especially in cases where the data are gained from measurements on blood or tissue samples, a relatively high uncertainty in the true measurement time seems to be a natural assumption. Our method is appropriate in cases where relatively few data are used from a relatively large number of groups or individuals, which introduce mixed effects in the model. This is a typical setting of clinical studies. We demonstrate the method on a small artificial example and apply it to a mixed effects model of plasma-leucine kinetics with data from a clinical study which included 34 patients. Conclusions Comparisons of our MTU-PF with the standard PF and with an alternative maximum likelihood estimation method on the small artificial example clearly show that the MTU-PF obtains better estimates. Considering the application to the data from the clinical study, the MTU-PF shows a similar performance with respect to the quality of estimated parameters compared with the standard particle filter, but the MTU-PF also proves to be less prone to degeneracy than the standard PF. PMID:23331521
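
    The MTU idea can be caricatured with a minimal bootstrap particle filter in which each particle also draws a measurement time around the nominal one; this toy decay model is an invented stand-in, not the paper's MTU-PF or its plasma-leucine application:

        import numpy as np

        rng = np.random.default_rng(9)
        # Toy system: x(t) = x0 * exp(-k t), observed at jittered times
        k_true, x0, sd_y, sd_t = 0.5, 10.0, 0.2, 0.1
        t_nominal = np.arange(1.0, 6.0)
        t_actual = t_nominal + rng.normal(0.0, sd_t, t_nominal.size)
        y = x0 * np.exp(-k_true * t_actual) + rng.normal(0.0, sd_y, t_nominal.size)

        Np = 5_000
        k = rng.uniform(0.1, 1.0, Np)                 # particles for the rate k
        for tn, yn in zip(t_nominal, y):
            t_p = tn + rng.normal(0.0, sd_t, Np)      # per-particle time draw (MTU)
            pred = x0 * np.exp(-k * t_p)
            w = np.exp(-0.5 * ((yn - pred) / sd_y)**2)
            w /= w.sum()
            k = k[rng.choice(Np, Np, p=w)]            # multinomial resampling
            k += rng.normal(0.0, 0.01, Np)            # roughening vs. impoverishment
        print(f"estimated k = {k.mean():.3f} (true {k_true})")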

  12. From climate-change spaghetti to climate-change distributions for 21st Century California

    USGS Publications Warehouse

    Dettinger, M.D.

    2005-01-01

    The uncertainties associated with climate-change projections for California are unlikely to disappear any time soon, and yet important long-term decisions will be needed to accommodate those potential changes. Projection uncertainties have typically been addressed by analysis of a few scenarios, chosen based on availability or to capture the extreme cases among available projections. However, by focusing on more common projections rather than the most extreme projections (using a new resampling method), new insights into current projections emerge: (1) uncertainties associated with future greenhouse-gas emissions are comparable with the differences among climate models, so that neither source of uncertainties should be neglected or underrepresented; (2) twenty-first century temperature projections spread more, overall, than do precipitation scenarios; (3) projections of extremely wet futures for California are true outliers among current projections; and (4) current projections that are warmest tend, overall, to yield a moderately drier California, while the cooler projections yield a somewhat wetter future. The resampling approach applied in this paper also provides a natural opportunity to objectively incorporate measures of model skill and the likelihoods of various emission scenarios into future assessments.

  13. Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Borgonovo; C. L. Smith

    2012-10-01

    Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed over the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty (i) induces notable variability in the point estimate ranking and (ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
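
    As a worked illustration (a toy 2-out-of-3 system, not the authors' model): point-estimate RAW divides the risk metric with component i failed by the base-case risk, and an ERAW-style variant replaces both point estimates by expectations over the epistemic distribution of the failure probabilities.

```python
import numpy as np

rng = np.random.default_rng(2)

def risk(q):
    """Toy risk metric: failure probability of a 2-out-of-3 system (assumed example)."""
    q1, q2, q3 = q
    return q1 * q2 + q1 * q3 + q2 * q3 - 2 * q1 * q2 * q3

q_nom = np.array([0.01, 0.02, 0.05])
R0 = risk(q_nom)

for i in range(3):                       # point-estimate RAW: component i failed
    q = q_nom.copy(); q[i] = 1.0
    print(f"RAW_{i+1} = {risk(q) / R0:.1f}")

# ERAW-style: expectations under epistemic (lognormal) uncertainty on the q's
Q = rng.lognormal(np.log(q_nom), 0.5, size=(100_000, 3))
ER0 = risk(Q.T).mean()
for i in range(3):
    Qi = Q.copy(); Qi[:, i] = 1.0
    print(f"ERAW_{i+1} = {risk(Qi.T).mean() / ER0:.1f}")
```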

  14. On-line prognosis of fatigue crack propagation based on Gaussian weight-mixture proposal particle filter.

    PubMed

    Chen, Jian; Yuan, Shenfang; Qiu, Lei; Wang, Hui; Yang, Weibo

    2018-01-01

    Accurate on-line prognosis of fatigue crack propagation is of great importance for prognostics and health management (PHM) technologies to ensure structural integrity. It is a challenging task because of uncertainties arising from sources such as intrinsic material properties, loading, and environmental factors. The particle filter algorithm has been proved to be a powerful tool for dealing with prognostic problems that are affected by uncertainties. However, most studies have adopted the basic particle filter algorithm, which uses the transition probability density function as the importance density and may suffer from a serious particle degeneracy problem. This paper proposes an on-line fatigue crack propagation prognosis method based on a novel Gaussian weight-mixture proposal particle filter and active guided-wave-based on-line crack monitoring. Based on the on-line crack measurement, the mixture of the measurement probability density function and the transition probability density function is proposed as the importance density. In addition, an on-line dynamic update procedure is proposed to adjust the parameter of the state equation. The proposed method is verified on a fatigue test of attachment lugs, which are an important class of joint components in aircraft structures. Copyright © 2017 Elsevier B.V. All rights reserved.
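
    A minimal sketch of the mixture-proposal idea under assumed linear-Gaussian, crack-growth-like dynamics (the growth factor and noise levels are illustrative, not the paper's model): particles are drawn partly from the transition density and partly from a density centred on the new measurement, and the importance weights correct for the mixture proposal, which keeps particles near the observation and mitigates degeneracy.

```python
import numpy as np

rng = np.random.default_rng(3)

def norm_pdf(v, mu, s):
    return np.exp(-0.5 * ((v - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def mixture_proposal_step(x, w, y, alpha=0.5, q_std=0.05, r_std=0.02):
    """Proposal = alpha * measurement density + (1 - alpha) * transition density."""
    n = len(x)
    x_trans = 1.05 * x                                         # assumed crack-growth-like drift
    from_meas = rng.random(n) < alpha
    x_new = np.where(from_meas,
                     y + r_std * rng.standard_normal(n),       # drawn around the measurement
                     x_trans + q_std * rng.standard_normal(n)) # drawn from the transition
    lik = norm_pdf(y, x_new, r_std)
    trans = norm_pdf(x_new, x_trans, q_std)
    prop = alpha * norm_pdf(x_new, y, r_std) + (1 - alpha) * trans
    w = w * lik * trans / prop                                 # importance-weight correction
    return x_new, w / w.sum()

# usage: crack-size particles near 1 mm, new measurement y = 1.08 mm
x = 1.0 + 0.05 * rng.standard_normal(500)
w = np.full(500, 1.0 / 500)
x, w = mixture_proposal_step(x, w, y=1.08)
print("effective sample size:", 1.0 / np.sum(w ** 2))
```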

  15. Uncertainty propagation for SPECT/CT-based renal dosimetry in 177Lu peptide receptor radionuclide therapy

    NASA Astrophysics Data System (ADS)

    Gustafsson, Johan; Brolin, Gustav; Cox, Maurice; Ljungberg, Michael; Johansson, Lena; Sjögreen Gleisner, Katarina

    2015-11-01

    A computer model of a patient-specific clinical 177Lu-DOTATATE therapy dosimetry system is constructed and used for investigating the variability of renal absorbed dose and biologically effective dose (BED) estimates. As patient models, three anthropomorphic computer phantoms coupled to a pharmacokinetic model of 177Lu-DOTATATE are used. Aspects included in the dosimetry-process model are the gamma-camera calibration via measurement of the system sensitivity, selection of imaging time points, generation of mass-density maps from CT, SPECT imaging, volume-of-interest delineation, calculation of absorbed-dose rate via a combination of local energy deposition for electrons and Monte Carlo simulations of photons, curve fitting and integration to absorbed dose and BED. By introducing variabilities in these steps, the combined uncertainty in the output quantity is determined. The importance of different sources of uncertainty is assessed by observing the decrease in standard deviation when removing a particular source. The obtained absorbed-dose and BED standard deviations are approximately 6%, and slightly higher if the root mean square error is considered. The most important sources of variability are the compensation for partial-volume effects via a recovery coefficient and the gamma-camera calibration via the system sensitivity.
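
    A minimal sketch (toy numbers, not the paper's dosimetry chain) of the source-importance procedure described: all variabilities are propagated by Monte Carlo, then the simulation is repeated with one source switched off to see how much the output standard deviation drops.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
sources = {  # relative standard uncertainty per modelled step (assumed values)
    "system_sensitivity": 0.04,
    "recovery_coefficient": 0.05,
    "voi_delineation": 0.02,
    "curve_fitting": 0.01,
}

def dose_sd(switched_off=None):
    dose = np.ones(N)                                  # nominal dose, arbitrary units
    for name, ru in sources.items():
        if name != switched_off:
            dose *= 1.0 + ru * rng.standard_normal(N)  # multiplicative variability
    return dose.std()

print(f"combined relative SD: {dose_sd():.3f}")
for name in sources:
    print(f"without {name}: SD drops to {dose_sd(name):.3f}")
```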

  16. International Workshop on Comparing Ice Nucleation Measuring Systems 2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cziczo, Daniel

    The relationship of ambient aerosol particles to the formation of ice-containing clouds is one of the largest uncertainties in understanding the Earth’s climate. The uncertainty is due to several poorly understood processes and measurements including, but not limited to: (1) the microphysics of how particles nucleate ice, (2) the number of ice forming particles as a function of atmospheric properties such as temperature and relative humidity, (3) the atmospheric distribution of ice forming particles and (4) the role of anthropogenic activities in producing or changing the behavior of ice forming particles. The ways in which ice forming particles can impact climate are also multi-faceted. More ice forming particles can lead to clouds with more ice crystals and different optical properties than clouds with fewer ice forming particles. More effective ice forming particles can lead to ice at higher temperature and/or lower saturation, resulting in clouds at lower altitude or latitude, which also changes the Earth’s radiative balance. Ice nucleation also initiates most of the Earth’s precipitation, even in the mid- and low-latitudes, since cloud-top temperatures are often below freezing. The limited measurements and lack of understanding directly translate to restrictions in our ability to model atmospheric ice formation and project changes into the future. The importance of ice nucleation research is further exemplified by Figure 1, which shows the publications per decade and citations per year on the topic of ice nucleation [DeMott et al., 2011]. After a lull at the end of the last century, there has been a dramatic increase in both publications and citations related to ice nucleation; this directly corresponds to the importance of ice nucleation for the Earth’s climate and the uncertainty in this area noted by Solomon [2007].

  17. Determining the Uncertainty of X-Ray Absorption Measurements

    PubMed Central

    Wojcik, Gary S.

    2004-01-01

    X-ray absorption (or more properly, x-ray attenuation) techniques have been applied to study the moisture movement in and moisture content of materials like cement paste, mortar, and wood. An increase in the number of x-ray counts with time at a location in a specimen may indicate a decrease in moisture content. The uncertainty of measurements from an x-ray absorption system, which must be known to properly interpret the data, is often assumed to be the square root of the number of counts, as in a Poisson process. No detailed studies have heretofore been conducted to determine the uncertainty of x-ray absorption measurements or the effect of averaging data on the uncertainty. In this study, the Poisson estimate was found to adequately approximate normalized root mean square errors (a measure of uncertainty) of counts for point measurements and profile measurements of water specimens. The Poisson estimate, however, was not reliable in approximating the magnitude of the uncertainty when averaging data from paste and mortar specimens. Changes in uncertainty from differing averaging procedures were well-approximated by a Poisson process. The normalized root mean square errors decreased when the x-ray source intensity, integration time, collimator size, and number of scanning repetitions increased. Uncertainties in mean paste and mortar count profiles were kept below 2 % by averaging vertical profiles at horizontal spacings of 1 mm or larger with counts per point above 4000. Maximum normalized root mean square errors did not exceed 10 % in any of the tests conducted. PMID:27366627
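
    A short sketch of the Poisson check described above: for simulated repeated counts at one location, the normalized root-mean-square error is compared with the 1/sqrt(N) Poisson estimate.

```python
import numpy as np

rng = np.random.default_rng(5)
mean_counts = 4000                           # counts per point, the study's threshold
counts = rng.poisson(mean_counts, size=200)  # 200 repeated scans at one location

nrmse = np.sqrt(np.mean((counts - mean_counts) ** 2)) / mean_counts
poisson_estimate = 1.0 / np.sqrt(mean_counts)
print(f"NRMSE = {nrmse:.4f}, Poisson estimate = {poisson_estimate:.4f}")
```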

  18. Technical notes: A detailed study for the provision of measurement uncertainty and traceability for goniospectrometers

    NASA Astrophysics Data System (ADS)

    Peltoniemi, Jouni I.; Hakala, Teemu; Suomalainen, Juha; Honkavaara, Eija; Markelin, Lauri; Gritsevich, Maria; Eskelinen, Juho; Jaanson, Priit; Ikonen, Erkki

    2014-10-01

    The measurement uncertainty and traceability of the Finnish Geodetic Institute's field gonio-spectro-polarimeter FIGIFIGO have been assessed. First, the reference standard (a Spectralon sample) was measured at the National Standard Laboratory of MIKES-Aalto. This standard was transferred to FGI's field reference standard (a larger Spectralon sample), and from that to the unmanned aerial vehicle (UAV) reference standards (1 m2 plates). The reflectance measurement uncertainty of FIGIFIGO has been estimated to be 0.01 in ideal laboratory conditions, but about 0.02-0.05 in typical field conditions, larger at larger solar or observation zenith angles. Target-specific uncertainties can increase the total uncertainty even to 0.1-0.2. The angular reading uncertainty is between 1° and 3°, depending on user selection, and the polarisation uncertainty is around 0.01. For the UAV, the transferred reflectance uncertainty is about 0.05-0.1, depending on how ideal the measurement conditions are. The design concept of FIGIFIGO has been proved to have a number of advantages, such as a well-adopted user-friendly interface, a high level of automation and excellent suitability for field measurements. It is a well-suited instrument for the collection of reference data on a given target in natural (and well-recorded) conditions. In addition to the strong points of FIGIFIGO, the current study reveals several issues that need further attention, such as the field of view, illumination quality, polarisation calibration, and Spectralon reflectance and polarisation properties in the 1000-2400 nm range.

  19. Computer-assisted uncertainty assessment of k0-NAA measurement results

    NASA Astrophysics Data System (ADS)

    Bučar, T.; Smodiš, B.

    2008-10-01

    In quantifying the measurement uncertainty of results obtained by k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result—the mass fraction of an element in the measured sample—taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable by allowing its incorporation into other applications (e.g., DLL and WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented.
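
    A minimal sketch (not the ERON code; the uncertainty values and propagation factors are assumed) of the combination it automates: each input contributes through a propagation factor, and the contributions are combined in quadrature into the combined relative uncertainty of the mass fraction.

```python
import numpy as np

# (relative uncertainty, propagation factor |d ln w / d ln x|) per input - assumed
inputs = {
    "k0": (0.012, 1.00),
    "f": (0.020, 0.35),
    "alpha": (0.300, 0.10),
    "Q0": (0.015, 0.25),
    "peak_area": (0.010, 1.00),
}
u_c = np.sqrt(sum((u * c) ** 2 for u, c in inputs.values()))
print(f"combined relative uncertainty of the mass fraction: {100 * u_c:.2f} %")
```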

  20. The Global Aerosol Synthesis and Science Project (GASSP): Measurements and Modeling to Reduce Uncertainty

    DOE PAGES

    Reddington, C. L.; Carslaw, K. S.; Stier, P.; ...

    2017-09-01

    The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by combining new methodologies for quantifying model uncertainty, to create an extensive global dataset of aerosol in situ microphysical and chemical measurements, and to develop new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.

  2. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data include the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data are integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality, and well-characterized uncertainty can be incorporated into decision-theoretic approaches. © 2014 Society for Conservation Biology.

  3. Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites

    DOE PAGES

    Madonna, F.; Rosoldi, M.; Güldner, J.; ...

    2014-11-19

    The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010–2012. Results show that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~ 60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. In conclusion, specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
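
    A minimal sketch, on synthetic IWV-like series, of the information-theoretic quantities used in the study: histogram-based entropies of two instruments' time series and their mutual information, whose ratio to the single-series entropy expresses the redundancy exploitable for uncertainty reduction.

```python
import numpy as np

rng = np.random.default_rng(6)

truth = rng.gamma(4.0, 4.0, size=5000)            # synthetic IWV, kg m-2 (assumed)
radiosonde = truth + 0.8 * rng.standard_normal(5000)
radiometer = truth + 0.6 * rng.standard_normal(5000)

def entropy_bits(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

bins = 30
Hx = entropy_bits(np.histogram(radiosonde, bins)[0])
Hy = entropy_bits(np.histogram(radiometer, bins)[0])
Hxy = entropy_bits(np.histogram2d(radiosonde, radiometer, bins)[0].ravel())
mi = Hx + Hy - Hxy                                # mutual information in bits
print(f"H(radiosonde) = {Hx:.2f} bits, I = {mi:.2f} bits, redundancy I/H = {mi / Hx:.2f}")
```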

  4. A Study of Particle Beam Spin Dynamics for High Precision Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiedler, Andrew J.

    In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and whether there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.

  5. Effect of Oils on Kinematic Viscosity of R134a

    NASA Astrophysics Data System (ADS)

    Sato, Tomoaki; Takaishi, Yoshinori; Oguchi, Kosei

    The kinematic viscosity, defined as the ratio of viscosity to density, is one of the key properties entering technically important dimensionless numbers such as the Prandtl and Reynolds numbers. We measured both the viscosity and density of R134a/POE and R134a/PAG mixtures at saturation in the range of relatively low oil concentrations. The density measurements for oil concentrations up to 50 mass% were conducted with a densimeter making use of glass buoys, with an overall uncertainty of ±1.0%, and the viscosity measurements for oil concentrations up to 16 mass% were carried out with an oscillating-cup viscometer making use of a polarizer, with an overall uncertainty of less than ±3.5%. The kinematic viscosities obtained from the experimental viscosity and density data are presented for both R134a/POE and R134a/PAG mixtures in the range of temperatures from 278 K to 288 K for oil concentrations up to 15 mass%. The oil-concentration dependence of the kinematic viscosity for both mixtures is also reported.
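
    A short worked sketch of the arithmetic behind these figures (the property values are illustrative): the kinematic viscosity follows as ν = μ/ρ, and its relative uncertainty combines the viscosity and density uncertainties in quadrature.

```python
import math

mu, u_mu = 2.5e-4, 0.035    # viscosity (Pa s) and relative uncertainty (±3.5%), illustrative
rho, u_rho = 1.2e3, 0.010   # density (kg m-3) and relative uncertainty (±1.0%)

nu = mu / rho                               # kinematic viscosity, m2/s
u_nu = math.sqrt(u_mu ** 2 + u_rho ** 2)    # quadrature combination for a ratio
print(f"nu = {nu:.3e} m2/s, relative uncertainty = ±{100 * u_nu:.1f} %")
```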

  6. Improvements in aircraft extraction programs

    NASA Technical Reports Server (NTRS)

    Balakrishnan, A. V.; Maine, R. E.

    1976-01-01

    Flight data from an F-8 Corsair and a Cessna 172 were analyzed to demonstrate specific improvements in the LRC parameter extraction computer program. The Cramer-Rao bounds were shown to provide a satisfactory relative measure of the goodness of parameter estimates. They were not used as an absolute measure due to an inherent uncertainty within a multiplicative factor, traced in turn to the uncertainty in the noise bandwidth in the statistical theory of parameter estimation. The measure was also derived on an entirely nonstatistical basis, thereby also yielding an interpretation of the significance of off-diagonal terms in the dispersion matrix. The distinction between linear and non-linear coefficients was shown to be important in its implications for a recommended order of parameter iteration. Techniques for improving convergence in general were developed and tested on flight data. In particular, an easily implemented modification incorporating a gradient search was shown to improve initial estimates and thus remove a common cause of failure to converge.
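
    A minimal sketch of the Cramer-Rao bound as a relative goodness measure (the Jacobian and noise level are placeholders): the Fisher information is built from output sensitivities and its inverse bounds the parameter covariance; rescaling the noise variance rescales every bound, which is the multiplicative-factor ambiguity the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(7)
J = rng.standard_normal((200, 2))   # output sensitivities: 200 samples x 2 parameters
sigma2 = 0.1 ** 2                   # measurement-noise variance (uncertain in practice)

fisher = J.T @ J / sigma2           # Fisher information matrix
crb = np.linalg.inv(fisher)         # Cramer-Rao lower bound on parameter covariance
print("std-dev bounds:", np.sqrt(np.diag(crb)))
# scaling sigma2 by a factor k scales every bound by sqrt(k): a relative measure
# unless the noise bandwidth is known absolutely
```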

  7. Confronting the Uncertainty in Aerosol Forcing Using Comprehensive Observational Data

    NASA Astrophysics Data System (ADS)

    Johnson, J. S.; Regayre, L. A.; Yoshioka, M.; Pringle, K.; Sexton, D.; Lee, L.; Carslaw, K. S.

    2017-12-01

    The effect of aerosols on cloud droplet concentrations and radiative properties is the largest uncertainty in the overall radiative forcing of climate over the industrial period. In this study, we take advantage of a large perturbed parameter ensemble of simulations from the UK Met Office HadGEM-UKCA model (the aerosol component of the UK Earth System Model) to comprehensively sample uncertainty in aerosol forcing. Uncertain aerosol and atmospheric parameters cause substantial aerosol forcing uncertainty in climatically important regions. As the aerosol radiative forcing itself is unobservable, we investigate the potential for observations of aerosol and radiative properties to act as constraints on the large forcing uncertainty. We test how eight different theoretically perfect aerosol and radiation observations can constrain the forcing uncertainty over Europe. We find that the achievable constraint is weak unless many diverse observations are used simultaneously. This is due to the complex relationships between model output responses and the multiple interacting parameter uncertainties: compensating model errors mean there are many ways to produce the same model output (known as model equifinality), which impacts the achievable constraint. However, using all eight observable quantities together we show that the aerosol forcing uncertainty can potentially be reduced by around 50%. This reduction occurs as we reduce a large sample of model variants (over 1 million) that cover the full parametric uncertainty to the roughly 1% that are observationally plausible. Constraining the forcing uncertainty using real observations is a more complex undertaking, in which we must account for multiple further uncertainties, including measurement uncertainties, structural model uncertainties and the model discrepancy from reality. Here, we make a first attempt to determine the true potential constraint on the forcing uncertainty from our model that is achievable using a comprehensive set of real aerosol and radiation observations taken from ground stations, flight campaigns and satellites. This research has been supported by the UK-China Research & Innovation Partnership Fund through the Met Office Climate Science for Service Partnership (CSSP) China as part of the Newton Fund, and by the NERC funded GASSP project.

  8. Advancing Solar Irradiance Measurement for Climate-Related Studies: Accurate Constraint on Direct Aerosol Radiative Effect (DARE)

    NASA Technical Reports Server (NTRS)

    Tsay, Si-Chee; Ji, Q. Jack

    2011-01-01

    Earth's climate is driven primarily by solar radiation. As summarized in various IPCC reports, the global average of radiative forcing for different agents and mechanisms, such as aerosols or CO2 doubling, is in the range of a few W/sq m. However, when solar irradiance is measured by broadband radiometers, such as the fleet of Eppley Precision Solar Pyranometers (PSP) and equivalent instrumentation employed worldwide, the measurement uncertainty is larger than 2% (e.g., WMO specification of pyranometers, 2008). Thus, out of the approx. 184 W/sq m (approx. 263 W/sq m if cloud-free) surface solar insolation (Trenberth et al. 2009), the measurement uncertainty is greater than +/-3.6 W/sq m, overwhelming the climate change signals. To discern these signals, less than a 1% measurement uncertainty is required, and this is currently achievable only by means of a newly developed methodology employing a modified PSP-like pyranometer and an updated calibration equation that accounts for its thermal effects (Li and Tsay, 2010). In this talk, we will show that some auxiliary measurements, such as those from a collocated pyrgeometer or air temperature sensors, can help correct historical datasets. Additionally, we will also demonstrate that a pyrheliometer is not free of the thermal effect; therefore, compared to a high-cost yet still not thermal-effect-free "direct + diffuse" approach to measuring surface solar irradiance, our new method is more economical and more likely to be suitable for correcting a wide variety of historical datasets. Modeling simulations will be presented showing that a corrected solar irradiance measurement has a significant impact on aerosol forcing, and thus plays an important role in climate studies.
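
    A short check of the arithmetic in the abstract: a 2% radiometer uncertainty applied to the quoted insolation values already exceeds the few-W/sq-m forcing signals, while 1% roughly halves the problem.

```python
for label, insolation in [("all-sky", 184.0), ("cloud-free", 263.0)]:
    for frac in (0.02, 0.01):
        print(f"{label}: ±{frac:.0%} of {insolation:.0f} W/sq m = ±{frac * insolation:.1f} W/sq m")
```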

  9. An Approach to Forecasting Health Expenditures, with Application to the U.S. Medicare System

    PubMed Central

    Lee, Ronald; Miller, Timothy

    2002-01-01

    Objective To quantify uncertainty in forecasts of health expenditures. Study Design Stochastic time series models are estimated for historical variations in fertility, mortality, and health spending per capita in the United States, and used to generate stochastic simulations of the growth of Medicare expenditures. Individual health spending is modeled to depend on the number of years until death. Data Sources/Study Setting A simple accounting model is developed for forecasting health expenditures, using the U.S. Medicare system as an example. Principal Findings Medicare expenditures are projected to rise from 2.2 percent of GDP (gross domestic product) to about 8 percent of GDP by 2075. This increase is due in equal measure to increasing health spending per beneficiary and to population aging. The traditional projection method constructs high, medium, and low scenarios to assess uncertainty, an approach that has many problems. Using stochastic forecasting, we find a 95 percent probability that Medicare spending in 2075 will fall between 4 percent and 18 percent of GDP, indicating a wide band of uncertainty. Although there is substantial uncertainty about future mortality decline, it contributed little to uncertainty about future Medicare spending, since lower mortality both raises the number of elderly, tending to raise spending, and is associated with improved health of the elderly, tending to reduce spending. Uncertainty about fertility, by contrast, leads to great uncertainty about the future size of the labor force, and therefore adds importantly to uncertainty about the health-share of GDP. In the shorter term, the major source of uncertainty is health spending per capita. Conclusions History is a valuable guide for quantifying our uncertainty about future health expenditures. The probabilistic model we present has several advantages over the high–low scenario approach to forecasting. It indicates great uncertainty about future Medicare expenditures relative to GDP. PMID:12479501
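
    A minimal sketch of the stochastic-forecast idea, as opposed to high/medium/low scenarios (growth and volatility figures below are assumed for illustration, not the paper's fitted values): simulate many random growth paths of the spending share and read a probability interval off the simulated distribution.

```python
import numpy as np

rng = np.random.default_rng(8)
years, sims = 73, 10_000                 # horizon to 2075
drift, vol = 0.018, 0.05                 # assumed growth and volatility of the share

shocks = drift + vol * rng.standard_normal((sims, years))
share_2075 = 2.2 * np.exp(shocks.sum(axis=1))      # start at 2.2% of GDP
print(f"median share of GDP in 2075: {np.median(share_2075):.1f}%")
lo, hi = np.percentile(share_2075, [2.5, 97.5])
print(f"95% interval: {lo:.1f}% - {hi:.1f}%")
```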

  10. Prediction uncertainty and data worth assessment for groundwater transport times in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Zell, Wesley O.; Culver, Teresa B.; Sanford, Ward E.

    2018-06-01

    Uncertainties about the age of base-flow discharge can have serious implications for the management of degraded environmental systems where subsurface pathways, and the ongoing release of pollutants that accumulated in the subsurface during past decades, dominate the water quality signal. Numerical groundwater models may be used to estimate groundwater return times and base-flow ages and thus predict the time required for stakeholders to see the results of improved agricultural management practices. However, the uncertainty inherent in the relationship between (i) the observations of atmospherically-derived tracers that are required to calibrate such models and (ii) the predictions of system age that the observations inform has not been investigated. For example, few if any studies have assessed the uncertainty of numerically-simulated system ages or evaluated the uncertainty reductions that may result from the expense of collecting additional subsurface tracer data. In this study we combine numerical flow and transport modeling of atmospherically-derived tracers with prediction uncertainty methods to accomplish four objectives. First, we show the relative importance of head, discharge, and tracer information for characterizing response times in a uniquely data-rich catchment that includes 266 age-tracer measurements (SF6, CFCs, and 3H) in addition to long-term monitoring of water levels and stream discharge. Second, we calculate uncertainty intervals for model-simulated base-flow ages using both linear and non-linear methods, and find that the prediction sensitivity vector used by linear first-order second-moment methods results in much larger uncertainties than non-linear Monte Carlo methods operating on the same parameter uncertainty. Third, by combining prediction uncertainty analysis with multiple models of the system, we show that data-worth calculations and monitoring network design are sensitive to variations in the amount of water leaving the system via stream discharge and irrigation withdrawals. Finally, we demonstrate a novel model-averaged computation of potential data worth that can account for these uncertainties in model structure.
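
    A minimal sketch (toy two-parameter age model, not the authors' groundwater setup) contrasting the two propagation routes compared in the study: a linear first-order second-moment standard deviation built from the prediction sensitivity vector versus a non-linear Monte Carlo standard deviation from the same parameter covariance.

```python
import numpy as np

rng = np.random.default_rng(11)

def baseflow_age(p):
    """Toy prediction: age grows with porosity, shrinks with recharge (assumed)."""
    porosity, recharge = p
    return 40.0 * porosity / recharge

p0 = np.array([0.30, 0.25])              # calibrated parameters (assumed)
Cp = np.diag([0.05 ** 2, 0.04 ** 2])     # posterior parameter covariance (assumed)

# linear FOSM: sensitivity vector by finite differences
eps = 1e-6
s = np.array([(baseflow_age(p0 + eps * np.eye(2)[i]) - baseflow_age(p0)) / eps
              for i in range(2)])
sd_linear = np.sqrt(s @ Cp @ s)

# non-linear Monte Carlo with the same parameter covariance
samples = rng.multivariate_normal(p0, Cp, size=50_000)
sd_mc = baseflow_age(samples.T).std()
print(f"linear FOSM sd: {sd_linear:.1f} yr, Monte Carlo sd: {sd_mc:.1f} yr")
```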

  11. SUB-PIXEL RAINFALL VARIABILITY AND THE IMPLICATIONS FOR UNCERTAINTIES IN RADAR RAINFALL ESTIMATES

    EPA Science Inventory

    Radar estimates of rainfall are subject to significant measurement uncertainty. Typically, uncertainties are measured by the discrepancies between real rainfall estimates based on radar reflectivity and point rainfall records of rain gauges. This study investigates how the disc...

  12. R-Matrix Analysis of the 13C(α,n)16O Reaction

    NASA Astrophysics Data System (ADS)

    Kock, Arthur; Rogachev, Grigory

    2015-10-01

    The 13C(α,n)16O reaction plays a crucial role in the main s-process occurring in low-mass thermally-pulsing asymptotic giant branch (TP-AGB) stars, which produces about half of all nuclei heavier than iron. However, direct measurements of this reaction cross section near the Gamow-peak energy are currently not possible due to very small reaction cross sections. Additionally, available cross-section data at higher energy have some inconsistencies, leading to significant uncertainties in low energy extrapolations. A global R-matrix fit was conducted, using all available data for the 13C(α,n)16O, 13C(α,α)13C, and 16O(n,n)16O reactions. Of particular importance was the inclusion of the fixed ANC for the 1/2+ state at 6.356 MeV in 17O, which was measured recently using the sub-Coulomb α-transfer reaction, as well as the new 13C+α elastic-scattering data measured in the low-energy region 1.6-3.8 MeV. Important constraining information on various resonances was found, and the uncertainty for the astrophysical 13C(α,n)16O reaction rate was dramatically reduced. Much work on the analysis was done by A. K. Nurmukhanbetova from National Laboratory Astana in Astana, Kazakhstan.

  13. Different top-down approaches to estimate measurement uncertainty of whole blood tacrolimus mass concentration values.

    PubMed

    Rigo-Bonnin, Raül; Blanco-Font, Aurora; Canalias, Francesca

    2018-05-08

    Values of the mass concentration of tacrolimus in whole blood are commonly used by clinicians for monitoring the status of a transplant patient and for checking whether the administered dose of tacrolimus is effective. Clinical laboratories must therefore provide results that are as accurate as possible, and measurement uncertainty can help ensure the reliability of these results. The aim of this study was to estimate the measurement uncertainty of whole blood tacrolimus mass concentration values obtained by UHPLC-MS/MS using two top-down approaches: the single laboratory validation approach and the proficiency testing approach. For the single laboratory validation approach, we estimated the uncertainties associated with the intermediate imprecision (using long-term internal quality control data) and the bias (utilizing a certified reference material). Next, we combined them together with the uncertainties related to the calibrator-assigned values to obtain a combined uncertainty and, finally, calculated the expanded uncertainty. For the proficiency testing approach, the uncertainty was estimated in a similar way to the single laboratory validation approach, but considering data from internal and external quality control schemes to estimate the uncertainty related to the bias. The estimated expanded uncertainties for single laboratory validation and for proficiency testing using internal and external quality control schemes were 11.8%, 13.2%, and 13.0%, respectively. After performing the two top-down approaches, we observed that their uncertainty results were quite similar. This would confirm that either approach could be used to estimate the measurement uncertainty of whole blood tacrolimus mass concentration values in clinical laboratories. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
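
    A minimal sketch of the single-laboratory-validation arithmetic described (the component values are assumed, not the paper's): intermediate imprecision, bias, and calibrator uncertainties are combined in quadrature and expanded with a coverage factor k = 2.

```python
import math

u_rw = 0.045    # relative intermediate imprecision from long-term IQC (assumed)
u_bias = 0.030  # relative uncertainty of the bias vs a certified reference material
u_cal = 0.020   # relative uncertainty of the calibrator-assigned values

u_combined = math.sqrt(u_rw ** 2 + u_bias ** 2 + u_cal ** 2)
U_expanded = 2.0 * u_combined            # coverage factor k = 2 (~95% coverage)
print(f"expanded measurement uncertainty: {100 * U_expanded:.1f} %")
```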

  14. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and the typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of the total uncertainty in calculated DRF and identification of the properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to the uncertainty arising from some individual properties.
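
    A worked sketch of the uncertainty bookkeeping described (sensitivities and measurement uncertainties are assumed numbers, not the paper's values): each property's contribution is its sensitivity times its measurement uncertainty, and contributions are combined in quadrature.

```python
import numpy as np

# property: (sensitivity of DRF, W m-2 per unit; measurement uncertainty) - assumed
props = {
    "aerosol_optical_depth": (25.0, 0.01),
    "single_scattering_albedo": (30.0, 0.03),
    "asymmetry_parameter": (10.0, 0.02),
    "surface_albedo": (15.0, 0.01),
}
contrib = {k: s * u for k, (s, u) in props.items()}
total = np.sqrt(sum(c ** 2 for c in contrib.values()))
for name, c in sorted(contrib.items(), key=lambda kv: -kv[1]):
    print(f"{name:26s} ±{c:.2f} W m-2")
print(f"total DRF uncertainty: ±{total:.2f} W m-2")
```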

  15. Two-point method uncertainty during control and measurement of cylindrical element diameters

    NASA Astrophysics Data System (ADS)

    Glukhov, V. I.; Shalay, V. V.; Radev, H.

    2018-04-01

    The article addresses the pressing problem of the reliability of measurements of the geometric specifications of technical products. Its purpose is to improve the quality of the control of parts' linear sizes by the two-point measurement method, and its task is to investigate the methodical extended uncertainties in measuring the linear sizes of cylindrical elements. The investigation method is geometric modeling of the shape and location deviations of the element surfaces in a rectangular coordinate system. The studies were carried out for elements of various service use, taking into account their informativeness, corresponding to the kinematic pair classes of theoretical mechanics and to the number of degrees of freedom constrained by the datum element function. Cylindrical elements with informativeness of 4, 2, 1 and 0 (zero) were investigated. The uncertainties of two-point measurements were estimated by comparing the results of linear dimension measurements with the maximum and minimum functional diameters of the element material. Methodical uncertainty arises when cylindrical elements with maximum informativeness have shape deviations of the cut and curvature types. Methodical uncertainty also arises when measuring the element's average size, for all types of shape deviations. The two-point measurement method cannot take into account the location deviations of a dimensional element, so its use for elements with less than maximum informativeness creates unacceptable methodical uncertainties in measurements of the maximum, minimum and medium linear dimensions. Similar methodical uncertainties also exist in the arbitration control of the linear dimensions of cylindrical elements by limiting two-point gauges.

  16. Plans for a measurement of the neutron lifetime to better than 0.3s using a Penning trap and absolute measurement of neutron fluence

    NASA Astrophysics Data System (ADS)

    Mulholland, Jonathan; NBL3 Collaboration

    2014-09-01

    The decay of the free neutron is the prototypical charged current semi-leptonic weak process. A precise value for the neutron lifetime is required for consistency tests of the Standard Model and is needed to predict the primordial He4 abundance from the theory of Big Bang Nucleosynthesis. Plans are being made for an in-beam measurement of the neutron lifetime with an anticipated uncertainty of 0.3 s or better. This effort is part of a phased campaign of neutron lifetime measurements based at the NIST Center for Neutron Research, using the Sussex-ILL-NIST technique. Advances in neutron fluence measurement, used to provide the best existing in-beam determination of the neutron lifetime, as well as new silicon detector technology, in use now at LANSCE, address the two largest contributors to the uncertainty of in-beam measurements: the statistical uncertainty associated with proton counting and the systematic uncertainty in the neutron fluence measurement. The experimental design and projected uncertainties for the 0.3 s measurement will be discussed.

  17. Uncertainties in s -process nucleosynthesis in low mass stars determined from Monte Carlo variations

    NASA Astrophysics Data System (ADS)

    Cescutti, G.; Hirschi, R.; Nishimura, N.; den Hartogh, J. W.; Rauscher, T.; Murphy, A. St J.; Cristallo, S.

    2018-05-01

    The main s-process taking place in low mass stars produces about half of the elements heavier than iron. It is therefore very important to determine the importance and impact of nuclear physics uncertainties on this process. We have performed extensive nuclear reaction network calculations using individual and temperature-dependent uncertainties for reactions involving elements heavier than iron, within a Monte Carlo framework. Using this technique, we determined the uncertainty in the main s-process abundance predictions due to nuclear uncertainties linked to weak interactions and neutron captures on elements heavier than iron. We also identified the key nuclear reactions dominating these uncertainties. We found that β-decay rate uncertainties affect only a few nuclides near s-process branchings, whereas most of the uncertainty in the final abundances is caused by uncertainties in neutron capture rates, either directly producing or destroying the nuclide of interest. Combined total nuclear uncertainties due to reactions on heavy elements are in general small (less than 50%). Three key reactions, nevertheless, stand out because they significantly affect the uncertainties of a large number of nuclides. These are 56Fe(n,γ), 64Ni(n,γ), and 138Ba(n,γ). We discuss the prospect of reducing uncertainties in the key reactions identified in this study with future experiments.
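
    A minimal sketch of the Monte Carlo variation technique on a toy three-step capture chain (not the authors' full network): each rate receives a random log-normal variation factor, the network is integrated repeatedly, and correlating the output spread with the input factors points to the key reactions.

```python
import numpy as np

rng = np.random.default_rng(9)

def final_abundance(rates, tau=1.0, dt=1e-3):
    """Last-isotope abundance of a toy 3-step neutron-capture chain."""
    n = np.array([1.0, 0.0, 0.0, 0.0])
    for _ in range(int(tau / dt)):
        flow = rates * n[:3] * dt      # capture flow out of each of the first 3 isotopes
        n[:3] -= flow
        n[1:] += flow
    return n[3]

base_rates = np.array([3.0, 1.0, 2.0])
factors = rng.lognormal(0.0, np.log(1.5), size=(500, 3))   # ~50% rate uncertainties
out = np.array([final_abundance(base_rates * f) for f in factors])

print(f"abundance uncertainty: {100 * out.std() / out.mean():.0f}%")
for i in range(3):                      # key-rate identification via correlation
    r = np.corrcoef(np.log(factors[:, i]), out)[0, 1]
    print(f"correlation with rate {i + 1}: {r:+.2f}")
```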

  18. Experimental Test of Entropic Noise-Disturbance Uncertainty Relations for Spin-1/2 Measurements.

    PubMed

    Sulyok, Georg; Sponar, Stephan; Demirel, Bülent; Buscemi, Francesco; Hall, Michael J W; Ozawa, Masanao; Hasegawa, Yuji

    2015-07-17

    Information-theoretic definitions for noise and disturbance in quantum measurements were given in [Phys. Rev. Lett. 112, 050401 (2014)] and a state-independent noise-disturbance uncertainty relation was obtained. Here, we derive a tight noise-disturbance uncertainty relation for complementary qubit observables and carry out an experimental test. Successive projective measurements on the neutron's spin-1/2 system, together with a correction procedure which reduces the disturbance, are performed. Our experimental results saturate the tight noise-disturbance uncertainty relation for qubits when an optimal correction procedure is applied.

  19. Disturbance, the uncertainty principle and quantum optics

    NASA Technical Reports Server (NTRS)

    Martens, Hans; Demuynck, Willem M.

    1993-01-01

    It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.

  20. An experimental study of the thermodynamic properties of 1,1-difluoroethane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamatsu, T.; Sato, T.; Sato, H.

    1992-01-01

    Experimental vapor pressures and P-ρ-T data of an important alternative refrigerant, 1,1-difluoroethane (HFC-152a), have been measured by means of a constant-volume method coupled with expansion procedures. Sixty P-ρ-T data points were measured along eight isochores in a range of temperatures T from 330 to 440 K, at pressures P from 1.6 to 9.3 MPa, and at densities ρ from 51 to 811 kg m-3. Forty-six vapor pressures were also measured at temperatures from 320 K to the critical temperature. The uncertainties of the temperature and pressure measurements are within ±7 mK and ±2 kPa, respectively, while the uncertainty of the density values is within ±1%. The purity of the sample used is 99.9 wt%. On the basis of the measurements along each isochore, five saturation points were determined, and the critical pressure was determined by correlating the vapor-pressure measurements. The second and third virial coefficients for temperatures from 360 to 440 K have also been determined. 21 refs., 9 figs., 5 tabs.

  1. Multi-site assimilation of a terrestrial biosphere model (BETHY) using satellite derived soil moisture data

    NASA Astrophysics Data System (ADS)

    Wu, Mousong; Sholze, Marko

    2017-04-01

    We investigated the importance of soil moisture data in the assimilation of a terrestrial biosphere model (BETHY) over a long time period, from 2010 to 2015. In total, 101 parameters related to carbon turnover, soil respiration, and soil texture were selected for optimization within a carbon cycle data assimilation system (CCDAS). Soil moisture data from the Soil Moisture and Ocean Salinity (SMOS) product were derived for 10 sites representing different plant functional types (PFTs) as well as different climate zones. The uncertainty of the SMOS soil moisture data was also estimated using the triple collocation analysis (TCA) method, by comparison with the ASCAT dataset and BETHY forward simulation results. Assimilation of soil moisture into the system improved soil moisture as well as net primary productivity (NPP) and net ecosystem productivity (NEP) when compared with soil moisture derived from in-situ measurements and FLUXNET datasets. Parameter uncertainties were largely reduced relative to prior values. Using SMOS soil moisture data for the assimilation of a terrestrial biosphere model proved to be an efficient approach to reducing uncertainty in ecosystem flux simulation. It could further be used in regional and global assimilation work to constrain carbon dioxide concentration simulations by combining with other sources of measurements.
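
    A minimal sketch of the triple collocation estimate used for the SMOS error (synthetic data; the classical TC identity assumes independent, zero-mean errors): the error variance of each product follows from the pairwise covariances of the three collocated series.

```python
import numpy as np

rng = np.random.default_rng(10)
truth = 0.25 + 0.08 * rng.standard_normal(3000)      # soil moisture (m3/m3), synthetic
smos = truth + 0.04 * rng.standard_normal(3000)
ascat = truth + 0.05 * rng.standard_normal(3000)
model = truth + 0.03 * rng.standard_normal(3000)     # stand-in for a BETHY forward run

def tc_error_std(x, y, z):
    """TC identity: var(e_x) = cov(x,x) - cov(x,y) * cov(x,z) / cov(y,z)."""
    c = np.cov([x, y, z])
    return np.sqrt(c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2])

for name, triple in [("SMOS", (smos, ascat, model)),
                     ("ASCAT", (ascat, smos, model)),
                     ("model", (model, smos, ascat))]:
    print(f"{name}: estimated error std = {tc_error_std(*triple):.3f}")
```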

  2. Charge and energy dependence of the residence time of cosmic ray nuclei below 15 GeV/nucleon

    NASA Technical Reports Server (NTRS)

    Soutoul, A.; Engelmann, J. J.; Ferrando, P.; Koch-Miramond, L.; Masse, P.; Webber, W. R.

    1985-01-01

    The relative abundance of nuclear species measured in cosmic rays at Earth has often been interpreted with the simple leaky box model. For this model to be consistent, an essential requirement is that the escape length does not depend on the nuclear species. The discrepancy between escape length values derived from iron secondaries and from the B/C ratio was identified by Garcia-Munoz and his co-workers using a large amount of experimental data. Ormes and Protheroe found a similar trend in the HEAO data, although they questioned its significance in view of the uncertainties. They also showed that the change in the B/C ratio values implies a decrease of the residence time of cosmic rays at low energies, in conflict with the diffusive convective picture. These conclusions crucially depend on the partial cross section values and their uncertainties. Recently, new accurate cross sections of key importance for propagation calculations have been measured. Their statistical uncertainties are often better than 4% and their values significantly different from those previously accepted. Here, these new cross sections are used to compare the observed B/C+O and (Sc to Cr)/Fe ratios to those predicted with the simple leaky box model.

  3. A comprehensive survey of thermoelectric homogeneity of commonly used thermocouple types

    NASA Astrophysics Data System (ADS)

    Machin, Jonathan; Tucker, Declan; Pearce, Jonathan V.

    2018-06-01

    Thermocouples are widely used as temperature sensors in industry. The electromotive force generated by a thermocouple is produced in a temperature gradient and not at the thermocouple tip. This means that the thermoelectric inhomogeneity represents one of the most important contributions to the overall measurement uncertainty associated with thermocouples. To characterise this effect, and to provide some general recommendations concerning the magnitude of this contribution to use when formulating uncertainty analyses, a comprehensive literature survey has been performed. Significant information was found for Types K, N, R, S, B, Pt/Pd, Au/Pt and various other Pt/Rh thermocouples. In the case of Type K and N thermocouples, the survey has been augmented by a substantial amount of data based on calibrations of mineral-insulated, metal-sheathed thermocouple cable reels from thermocouple manufacturers. Some general conclusions are drawn and outline recommendations given concerning typical values for the uncertainty arising from thermoelectric inhomogeneity for the most widely used thermocouple types in the as-new state. It is stressed that these recommendations should only be heeded when individual homogeneity measurements are not possible. It is also stressed that the homogeneity can deteriorate rapidly during use, particularly for base metal thermocouples.

  4. The significance of parameter uncertainties for the prediction of offshore pile driving noise.

    PubMed

    Lippert, Tristan; von Estorff, Otto

    2014-11-01

    Due to the construction of offshore wind farms and its potential effect on marine wildlife, the numerical prediction of pile driving noise over long ranges has recently gained importance. In this contribution, a coupled finite element/wavenumber integration model for noise prediction is presented and validated by measurements. The ocean environment, especially the sea bottom, can only be characterized with limited accuracy in terms of input parameters for the numerical model at hand. Therefore the effect of these parameter uncertainties on the prediction of sound pressure levels (SPLs) in the water column is investigated by a probabilistic approach. In fact, a variation of the bottom material parameters by means of Monte-Carlo simulations shows significant effects on the predicted SPLs. A sensitivity analysis of the model with respect to the single quantities is performed, as well as a global variation. Based on the latter, the probability distribution of the SPLs at an exemplary receiver position is evaluated and compared to measurements. The aim of this procedure is to develop a model to reliably predict an interval for the SPLs, by quantifying the degree of uncertainty of the SPLs with the MC simulations.

  5. [Estimation of uncertainty of measurement in clinical biochemistry].

    PubMed

    Enea, Maria; Hristodorescu, Cristina; Schiriac, Corina; Morariu, Dana; Mutiu, Tr; Dumitriu, Irina; Gurzu, B

    2009-01-01

    The uncertainty of measurement (UM), or measurement uncertainty, is the parameter associated with the result of a measurement. Repeated measurements usually reveal slightly different results for the same analyte, sometimes a little higher, sometimes a little lower, because the result of a measurement depends not only on the analyte itself but also on a number of error factors that could cast doubt on the estimate. The uncertainty of the measurement is the quantitative, mathematical expression of this doubt. UM is a range of measured values which is likely to enclose the true value of the measurand. Calculation of UM for all types of laboratories is regulated by the ISO Guide to the Expression of Uncertainty in Measurement (abbreviated GUM) and SR ENV 13005:2003 (both recognized by European Accreditation). Even though the GUM rules for UM estimation are very strict, reporting the result together with UM will increase the confidence of customers (patients or physicians). In this study the authors present the possibilities for assessing UM in laboratories in our country by using the data obtained in method validation procedures and during internal and external quality control.

  6. Understanding sources of uncertainty and bias error in models of human response to low amplitude sonic booms

    NASA Astrophysics Data System (ADS)

    Collmar, M.; Cook, B. G.; Cowart, R.; Freund, D.; Gavin, J.

    2015-10-01

    A pool of 240 subjects was exposed to a library of waveforms consisting of example signatures of low boom aircraft. The signature library included intentional variations in both loudness and spectral content, and were auralized using the Gulfstream SASS-II sonic boom simulator. Post-processing was used to quantify the impacts of test design decisions on the "quality" of the resultant database. Specific lessons learned from this study include insight regarding potential for bias error due to variations in loudness or peak over-pressure, sources of uncertainty and their relative importance on objective measurements and robustness of individual metrics to wide variations in spectral content. Results provide clear guidance for design of future large scale community surveys, where one must optimize the complex tradeoffs between the size of the surveyed population, spatial footprint of those participants, and the fidelity/density of objective measurements.

  7. Metrological aspects of enzyme production

    NASA Astrophysics Data System (ADS)

    Kerber, T. M.; Dellamora-Ortiz, G. M.; Pereira-Meirelles, F. V.

    2010-05-01

    Enzymes are frequently used in biotechnology to carry out specific biological reactions, either in industrial processes or for the production of bioproducts and drugs. Microbial lipases are an important group of biotechnologically valuable enzymes with widely diversified applications. Lipase production by microorganisms is described in several published papers; however, none of them refer to metrological evaluation and the estimation of the uncertainty in measurement. Moreover, few of them refer to process optimization through experimental design. The objectives of this work were to enhance lipase production in shaken flasks with Yarrowia lipolytica cells employing experimental design, and to evaluate the uncertainty in measurement of lipase activity. The highest lipolytic activity obtained was about three- and fivefold higher than the reported activities of CRMs BCR-693 and BCR-694, respectively. Lipase production by Y. lipolytica cells aiming at classification as a certified reference material is recommended after further purification and stability studies.

  8. Approach to determine measurement uncertainty in complex nanosystems with multiparametric dependencies and multivariate output quantities

    NASA Astrophysics Data System (ADS)

    Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.

    2018-03-01

    In many cases, the determination of the measurement uncertainty of complex nanosystems presents unexpected challenges. This is in particular true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems. This includes the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is shown in detail for two example experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach for uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of the concentration of molecules using surface-enhanced Raman scattering, is briefly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.
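
    A minimal sketch of GUM-style propagation for a multivariate output with correlated inputs (all numbers assumed): the output covariance is J Σ Jᵀ, with J the sensitivity matrix and Σ the input covariance built from the standard uncertainties and their correlation.

```python
import numpy as np

# sensitivity (Jacobian) of 2 output quantities to 3 input quantities - assumed
J = np.array([[1.0, 0.5, 0.0],
              [0.2, 0.0, 1.3]])
u = np.array([0.01, 0.02, 0.005])       # standard uncertainties of the inputs
r12 = 0.6                               # assumed correlation between inputs 1 and 2
R = np.array([[1.0, r12, 0.0],
              [r12, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
Sigma = np.diag(u) @ R @ np.diag(u)     # input covariance matrix

Uy = J @ Sigma @ J.T                    # covariance of the multivariate output
print("output standard uncertainties:", np.sqrt(np.diag(Uy)))
print("output correlation:", Uy[0, 1] / np.sqrt(Uy[0, 0] * Uy[1, 1]))
```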

  9. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, of the strategies and outcomes of managing uncertainty, and of the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty and to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  10. Development of an automatic test equipment for nano gauging displacement transducers

    NASA Astrophysics Data System (ADS)

    Wang, Yung-Chen; Jywe, Wen-Yuh; Liu, Chien-Hung

    2005-01-01

    In order to satisfy the increasing demands on precision in manufacturing technology, nanometrology is gradually becoming more important in manufacturing processes. To ensure the precision of manufacture, precise measuring instruments and sensors play a decisive role in the accurate characterization and inspection of products. For linear length inspection, high-precision gauging displacement transducers, i.e. nano gauging displacement transducers (NGDT), are often utilized; they have a resolution in the nanometer range and can achieve an accuracy of less than 100 nm. Such measurement instruments include transducers based on electronic as well as optical measurement principles, e.g. inductive, incremental-optical or interference-optical. To guarantee the accuracy and the traceability to the definition of the meter, calibration and testing of NGDT are essential. Currently, there are some methods and machines for testing NGDT, but they suffer from various disadvantages. Some of them permit only manual test procedures, which are time-consuming, e.g. using highly accurate gauge blocks as material measures. Other tests can reach higher accuracy only in the micrometer range or result in uncertainties of more than 100 nm over large measuring ranges. To realize the testing of NGDT with high resolution as well as a large measuring range, automatic test equipment was constructed that has a resolution of 1.24 nm, a measuring range of up to 20 mm (60 mm) and a measuring uncertainty of approximately ±10 nm; it can thus fulfil the requirements of high resolution within the nanometer range while simultaneously covering a large measuring range in the order of millimeters. The test system includes a stable frame, a polarization interferometer, an angle sensor, an angular control, a drive system and piezo translators. During the test procedure, the angular control and piezo translators minimize the Abbe error. For the automation of the test procedure, a measuring program adhering to the measurement principle outlined in the VDI/VDE 2617 guidelines was designed. With this program, NGDT can be tested in less than thirty minutes with eleven measuring points and five repetitions. By means of theoretical and experimental investigations it can be shown that the automatic test system achieves a test uncertainty of approximately ±10 nm over a measuring range of 18 mm, which corresponds to a relative uncertainty of approximately ±5 × 10^-7. With its small uncertainty, minimization of the Abbe error and short test time, this system can be regarded as universal and efficient precision test equipment for the accurate testing of arbitrary high-precision gauging displacement transducers.

  11. What motivates researchers in times of economic uncertainty.

    NASA Technical Reports Server (NTRS)

    Bucher, G. C.; Reece, J. E.

    1972-01-01

    Results of a study initiated late in 1970 to obtain both a measure of on-and-around-the-job factors which were 'motivating' to engineers and scientists, and to obtain an indication of how the relative importance of these factors changes as a result of the uncertain economic environment. A questionnaire, 'The Jackman Job Satisfaction Schedule,' was used to satisfy the needs of the study. It is concluded that managers can enhance the feeling of motivation by making individual job assignments interesting and challenging, by formulating significant milestones and end points into job content, and by assigning ample rewards with corresponding responsibility. In times of economic uncertainty increased emphasis should be given to security-related aspects of employment.

  12. Application of radiosonde data to VERITAS simulations

    NASA Astrophysics Data System (ADS)

    Daniel, M. K.

    The atmosphere is a vital component of the detector in an atmospheric Cherenkov telescope. In order to understand observations from these instruments and reduce systematic uncertainties and biases in their data, it is important to correctly model the atmosphere in simulations of the extensive air showers they detect. The Very Energetic Radiation Imaging Telescope Array System (VERITAS) is a system of 4 such telescopes located at the Whipple Observatory in southern Arizona. Daily radiosonde measurements from the nearby Tucson airport allow an accurate model of the atmosphere for the VERITAS experiment to be constructed. A comparison of the radiosonde data to existing atmospheric models is performed and the expected effects on the systematic uncertainties are summarised here.

  13. Uncertainty evaluation in the chloroquine phosphate potentiometric titration: application of three different approaches.

    PubMed

    Rodomonte, Andrea Luca; Montinaro, Annalisa; Bartolomei, Monica

    2006-09-11

    A measurement result cannot be properly interpreted if not accompanied by its uncertainty. Several methods to estimate uncertainty have been developed. Three of those methods were chosen in this work to estimate the uncertainty of the Eu. Ph. chloroquine phosphate assay, a potentiometric titration commonly used in medicines control laboratories. The well-known error-budget approach (also called bottom-up or step-by-step) described by the ISO Guide to the expression of Uncertainty in Measurement (GUM) was the first method chosen. It is based on the combination of uncertainty contributions that have to be derived directly from the measurement process. The second method employed was the Analytical Methods Committee top-down approach, which estimates uncertainty through reproducibility obtained during inter-laboratory studies. Data for its application were collected in a proficiency testing study carried out by over 50 laboratories throughout Europe. The last method chosen was the one proposed by Barwick and Ellison. It uses a combination of precision, trueness and ruggedness data to estimate uncertainty. These data were collected from a validation process specifically designed for uncertainty estimation. All three approaches presented a distinctive set of advantages and drawbacks in their implementation. An expanded uncertainty of about 1% was assessed for the assay investigated.
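
    For orientation, the error-budget (bottom-up) approach amounts to combining the relative standard uncertainties of the individual contributions in quadrature and applying a coverage factor. A minimal sketch with made-up contribution names and values (not the assay's actual budget):

        import numpy as np

        # Illustrative bottom-up budget for a titration assay; the
        # contribution names and values are invented for the example.
        budget = {
            "sample mass":           0.0005,
            "titrant concentration": 0.0030,
            "endpoint detection":    0.0025,
            "repeatability":         0.0020,
        }
        u_c = np.sqrt(sum(u**2 for u in budget.values()))  # combined standard uncertainty
        U = 2 * u_c                                        # expanded, k = 2 (~95 % coverage)
        print(f"combined {u_c:.4f}, expanded {U:.4f}")     # ~0.0044 and ~0.0088, i.e. about 1 %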

  14. Measuring decisional certainty among women seeking abortion.

    PubMed

    Ralph, Lauren J; Foster, Diana Greene; Kimport, Katrina; Turok, David; Roberts, Sarah C M

    2017-03-01

    Evaluating decisional certainty is an important component of medical care, including preabortion care. However, minimal research has examined how to measure certainty with reliability and validity among women seeking abortion. We examine whether the Decisional Conflict Scale (DCS), a measure widely used in other health specialties and considered the gold standard for measuring this construct, and the Taft-Baker Scale (TBS), a measure developed by abortion counselors, are valid and reliable for use with women seeking abortion and predict the decision to continue the pregnancy. Eligible women at four family planning facilities in Utah completed baseline demographic surveys and scales before their abortion information visit and follow-up interviews 3 weeks later. For each scale, we calculated mean scores and explored factors associated with high uncertainty. We evaluated internal reliability using Cronbach's alpha and assessed predictive validity by examining whether higher scale scores, indicative of decisional uncertainty or conflict, were associated with still being pregnant at follow-up. Five hundred women completed baseline surveys; two-thirds (63%) completed follow-up, at which time 11% were still pregnant. Mean scores on the DCS (15.5/100) and TBS (12.4/100) indicated low uncertainty, with acceptable reliability (α=.93 and .72, respectively). Higher scores on each scale were significantly and positively associated with still being pregnant at follow-up in both unadjusted and adjusted analyses. The DCS and TBS demonstrate acceptable reliability and validity among women seeking abortion care. Comparing scores on the DCS in this population to other studies of decision making suggests that the level of uncertainty in abortion decision making is comparable to or lower than other health decisions. The high levels of decisional certainty found in this study challenge the narrative that abortion decision making is exceptional compared to other healthcare decisions and requires additional protection such as laws mandating waiting periods, counseling and ultrasound viewing. Copyright © 2016. Published by Elsevier Inc.
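
    The internal-reliability figures quoted above (α = .93 and .72) are Cronbach's alpha values. As a reference, the standard formula is α = k/(k-1) · (1 - Σ s_i² / s_t²), sketched below on toy data rather than the study's data:

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
            return k / (k - 1) * (1 - item_vars / total_var)

        # Toy data: 6 respondents answering a 4-item scale with a shared factor.
        rng = np.random.default_rng(0)
        base = rng.normal(size=(6, 1))
        scores = base + 0.3 * rng.normal(size=(6, 4))  # correlated items -> high alpha
        print(round(cronbach_alpha(scores), 2))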

  15. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement, so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in the absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
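
    A minimal sketch of the cross-validation idea (not the authors' implementation): hold out folds of the metered data, fit the baseline model on the remainder, and take the spread of held-out residuals as an empirical uncertainty for baseline predictions. The linear baseline and numbers are illustrative.

        import numpy as np

        def cv_uncertainty(X, y, fit, predict, k=10, seed=0):
            """RMSE of held-out residuals from k-fold cross-validation,
            a simple empirical stand-in for baseline prediction uncertainty."""
            idx = np.random.default_rng(seed).permutation(len(y))
            res = []
            for fold in np.array_split(idx, k):
                train = np.setdiff1d(idx, fold)
                model = fit(X[train], y[train])
                res.append(y[fold] - predict(model, X[fold]))
            return np.sqrt(np.mean(np.concatenate(res) ** 2))

        # Toy baseline: energy use linear in outdoor temperature plus noise.
        X = np.linspace(0.0, 30.0, 200)[:, None]
        y = 50 + 2.0 * X[:, 0] + np.random.default_rng(1).normal(0, 3, 200)
        fit = lambda X, y: np.linalg.lstsq(np.c_[np.ones(len(X)), X], y, rcond=None)[0]
        predict = lambda b, X: np.c_[np.ones(len(X)), X] @ b
        print(cv_uncertainty(X, y, fit, predict))  # close to the noise level, ~3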

  16. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-03-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target-measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models as well as greenhouse gas scenarios are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure Adequate Human livelihood conditions for wEll-being And Development (AHEAD). Based on a transdisciplinary sample of influential concepts addressing human well-being, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows identifying and differentiating uncertainty of climate and impact model projections. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that in many countries today, livelihood conditions are compromised by water scarcity. However, more often, AHEAD fulfilment is limited by other elements. Moreover, the analysis shows that for 44 out of 111 countries, the water-specific uncertainty ranges are outside relevant thresholds for AHEAD, and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable for policy decisions.

  17. An uncertainty budget for VHF and UHF reflectometers

    NASA Astrophysics Data System (ADS)

    Ridler, N. M.; Medley, C. J.

    1992-05-01

    Details of the derivation of an uncertainty budget for one-port immittance or complex voltage reflection coefficient measuring instruments, operating at VHF and UHF in the 14 mm 50 ohm coaxial line size, are reported. The principles of the uncertainty budget are given along with experimental results obtained using six-port reflectometers and a network analyzer as the measuring instruments. Details of the types of calibration for which the uncertainty budget is suitable are reported. Various aspects of the uncertainty budget are considered, and general principles and the treatment of the type A and type B contributions are discussed. Experimental results obtained using the uncertainty budget are given. A summary of uncertainties for the six-port reflectometers and the HP8753B automatic network analyzer is also given.

  18. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
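
    For orientation, the base EUR-QSI of Berta et al (2010) cited above can be written (in LaTeX notation) as below, where X and Z are the outcomes of two incompatible measurements {|φ_x⟩} and {|ψ_z⟩} on a system A, and B and E are quantum memories. The paper's contribution is a tightening that adds a state-dependent reversibility term to the right-hand side, which is not reproduced here.

        H(X|B) + H(Z|E) \;\ge\; \log_2 \frac{1}{c},
        \qquad
        c = \max_{x,z} \left| \langle \phi_x | \psi_z \rangle \right|^2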

  19. Artificial neural network modelling of uncertainty in gamma-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Dragović, S.; Onjia, A.; Stanković, S.; Aničin, I.; Bačić, G.

    2005-03-01

    An artificial neural network (ANN) model for the prediction of measuring uncertainties in gamma-ray spectrometry was developed and optimized. A three-layer feed-forward ANN with a back-propagation learning algorithm was used to model the uncertainties of measurement of the activity levels of eight radionuclides (^{226}Ra, ^{238}U, ^{235}U, ^{40}K, ^{232}Th, ^{134}Cs, ^{137}Cs and ^{7}Be) in soil samples as a function of measurement time. It was shown that the neural network provides useful data even from small experimental databases. The performance of the optimized neural network was found to be very good, with correlation coefficients (R^2) between measured and predicted uncertainties ranging from 0.9050 to 0.9915. The correlation coefficients did not significantly deteriorate when the network was tested on samples with greatly different uranium-to-thorium (^{238}U/^{232}Th) ratios. The differences between measured and predicted uncertainties were not influenced by the absolute values of uncertainties of measured radionuclide activities. Once the ANN is trained, it could be employed in analyzing soil samples regardless of the ^{238}U/^{232}Th ratio. It was concluded that a considerable saving in time could be obtained using the trained neural network model for predicting the measurement times needed to attain the desired statistical accuracy.
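
    A minimal sketch of this kind of model, assuming a toy counting-statistics data set; scikit-learn's MLPRegressor stands in for the authors' three-layer back-propagation network:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        t = np.linspace(600, 86400, 200)[:, None]   # counting time, s
        u = 50.0 / np.sqrt(t[:, 0]) + 0.05          # toy uncertainty-vs-time curve
        net = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                           solver="lbfgs", max_iter=5000, random_state=0)
        net.fit(np.log(t), u)                       # log scale eases the fit
        print(net.predict(np.log([[3600.0]])))      # predicted uncertainty at 1 h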

  20. A measure of uncertainty regarding the interval constraint of normal mean elicited by two stages of a prior hierarchy.

    PubMed

    Kim, Hea-Jung

    2014-01-01

    This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty regarding the interval constraint, accounted for by using the HSGM, is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed, and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.

  1. Best Practices of Uncertainty Estimation for the National Solar Radiation Database (NSRDB 1998-2015): Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron M; Sengupta, Manajit

    It is essential to apply a traceable and standard approach to determine the uncertainty of solar resource data. Solar resource data are used for all phases of solar energy conversion projects, from the conceptual phase to routine solar power plant operation, and to determine performance guarantees of solar energy conversion systems. These guarantees are based on the available solar resource derived from a measurement station or a modeled data set such as the National Solar Radiation Database (NSRDB). Therefore, quantifying the uncertainty of these data sets provides confidence to financiers, developers, and site operators of solar energy conversion systems and ultimately reduces deployment costs. In this study, we implemented the Guide to the Expression of Uncertainty in Measurement (GUM) to quantify the overall uncertainty of the NSRDB data. First, we start with quantifying measurement uncertainty, then we determine each uncertainty statistic of the NSRDB data, and we combine them using the root-sum-of-the-squares method. The statistics were derived by comparing the NSRDB data to the seven measurement stations from the National Oceanic and Atmospheric Administration's Surface Radiation Budget Network, the National Renewable Energy Laboratory's Solar Radiation Research Laboratory, and the Atmospheric Radiation Measurement program's Southern Great Plains Central Facility in Billings, Oklahoma. The evaluation was conducted for hourly values, daily totals, monthly mean daily totals, and annual mean monthly mean daily totals. Varying time averages assist in capturing the temporal uncertainty of the specific modeled solar resource data required for each phase of a solar energy project; some phases require higher temporal resolution than others. Overall, by including the uncertainty of measurements of solar radiation made at ground stations, bias, and root mean square error, the NSRDB data demonstrated an expanded uncertainty of 17%-29% on an hourly basis and approximately 5%-8% on an annual basis.

  2. Sensitivity of equilibrium profile reconstruction to motional Stark effect measurements

    NASA Astrophysics Data System (ADS)

    Batha, S. H.; Levinton, F. M.; Hirshman, S. P.; Bell, M. G.; Wieland, R. M.

    1996-09-01

    The magnetic-field pitch-angle profile, γ_p(R) ≡ tan^{-1}(B_pol/B_tor), is measured on TFTR using a motional Stark effect (MSE) polarimeter. Measured pitch-angle profiles, along with kinetic profiles and external magnetic measurements, are used to compute a self-consistent equilibrium using the free-boundary variational moments equilibrium code VMEC. Uncertainties in the q profile due to uncertainties in γ_p(R), magnetic measurements and kinetic measurements are found to be small. Subsequent uncertainties in the VMEC-calculated current density and shear profiles are also small.

  3. Uncertainty of in-flight thrust determination

    NASA Technical Reports Server (NTRS)

    Abernethy, Robert B.; Adams, Gary R.; Steurer, John W.; Ascough, John C.; Baer-Riedhart, Jennifer L.; Balkcom, George H.; Biesiadny, Thomas

    1986-01-01

    Methods for estimating the measurement error or uncertainty of in-flight thrust determination in aircraft employing conventional turbofan/turbojet engines are reviewed. While the term 'in-flight thrust determination' is used synonymously with 'in-flight thrust measurement', in-flight thrust is not directly measured but is determined or calculated using mathematical modeling relationships between in-flight thrust and various direct measurements of physical quantities. The in-flight thrust determination process incorporates both ground testing and flight testing. The present text is divided into the following categories: measurement uncertainty methodology and in-flight thrust measurement processes.

  4. Flowing-water optical power meter for primary-standard, multi-kilowatt laser power measurements

    NASA Astrophysics Data System (ADS)

    Williams, P. A.; Hadler, J. A.; Cromer, C.; West, J.; Li, X.; Lehman, J. H.

    2018-06-01

    A primary-standard flowing-water optical power meter for measuring multi-kilowatt laser emission has been built and operated. The design and operational details of this primary standard are described, and a full uncertainty analysis is provided covering the measurement range from 1–10 kW with an expanded uncertainty of 1.2%. Validating measurements at 5 kW and 10 kW show agreement with other measurement techniques to within the measurement uncertainty. This work of the U.S. Government is not subject to U.S. copyright.

  5. Uncertainty Assessment of Space-Borne Passive Soil Moisture Retrievals

    NASA Technical Reports Server (NTRS)

    Quets, Jan; De Lannoy, Gabrielle; Reichle, Rolf; Cosh, Michael; van der Schalie, Robin; Wigneron, Jean-Pierre

    2017-01-01

    The uncertainty associated with passive soil moisture retrieval is hard to quantify and is known to stem from various, diverse, and complex causes. Factors affecting space-borne soil moisture retrieval include: (i) the optimization or inversion method applied to the radiative transfer model (RTM), such as e.g. the Single Channel Algorithm (SCA) or the Land Parameter Retrieval Model (LPRM), (ii) the selection of the observed brightness temperatures (Tbs), e.g. polarization and incidence angle, (iii) the definition of the cost function and the impact of prior information in it, and (iv) the RTM parameterization (e.g. parameterizations officially used by the SMOS L2 and SMAP L2 retrieval products, the ECMWF-based SMOS assimilation product, the SMAP L4 assimilation product, and perturbations from those configurations). This study aims at disentangling the relative importance of the above-mentioned sources of uncertainty by carrying out soil moisture retrieval experiments using SMOS Tb observations in different settings, of which some are mentioned above. The ensemble uncertainties are evaluated at 11 reference CalVal sites over a time period of more than 5 years. These experimental retrievals were inter-compared and further confronted with in situ soil moisture measurements and operational SMOS L2 retrievals, using commonly used skill metrics to quantify the temporal uncertainty in the retrievals.

  6. How much can we trust a geological model underlying a subsurface hydrological investigation?

    NASA Astrophysics Data System (ADS)

    Wellmann, Florian; de la Varga, Miguel; Schaaf, Alexander; Burs, David

    2017-04-01

    Geological models often provide an important basis for subsequent hydrological investigations. As these models are generally built with a limited amount of information, they can contain significant uncertainties - and it is reasonable to assume that these uncertainties can potentially influence subsequent hydrological simulations. However, the investigation of uncertainties in geological models is not straightforward - and, even though recent advances have been made in the field, there is no out-of-the-box implementation to analyze uncertainties in a standard geological modeling package. We present here results of recent developments to address this problem with an efficient implementation of a geological modeling method for complex structural models, integrated in a Bayesian inference framework. The implemented geological modeling approach is based on a full 3-D implicit interpolation that directly respects interface positions and orientation measurements, as well as the influence of faults. In combination, the approach allows us to generate ensembles of geological model realizations, constrained by additional information in the form of likelihood functions to ensure consistency with additional geological aspects (e.g. sequence continuity, topology, fault network consistency), and we demonstrate the potential of the method in an example case study. With this approach, we aim to contribute to a better understanding of the influence of geological uncertainties on subsurface hydrological investigations.

  7. Estimating uncertainty in subsurface glider position using transmissions from fixed acoustic tomography sources.

    PubMed

    Van Uffelen, Lora J; Nosal, Eva-Marie; Howe, Bruce M; Carter, Glenn S; Worcester, Peter F; Dzieciuch, Matthew A; Heaney, Kevin D; Campbell, Richard L; Cross, Patrick S

    2013-10-01

    Four acoustic Seagliders were deployed in the Philippine Sea from November 2010 to April 2011 in the vicinity of an acoustic tomography array. The gliders recorded over 2000 broadband transmissions at ranges up to 700 km from moored acoustic sources as they transited between mooring sites. The precision of glider positioning at the time of acoustic reception is important to resolve the fundamental ambiguity between position and sound speed. The Seagliders utilized GPS at the surface and a kinematic model below for positioning. The gliders were typically underwater for about 6.4 h, diving to depths of 1000 m and traveling on average 3.6 km during a dive. Measured acoustic arrival peaks were unambiguously associated with predicted ray arrivals. Statistics of travel-time offsets between received arrivals and acoustic predictions were used to estimate range uncertainty. Range (travel time) uncertainty between the source and the glider position from the kinematic model is estimated to be 639 m (426 ms) rms. Least-squares solutions for glider position estimated from acoustically derived ranges from 5 sources differed by 914 m rms from modeled positions, with estimated uncertainty of 106 m rms in horizontal position. Error analysis included 70 ms rms of uncertainty due to oceanic sound-speed variability.
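
    A minimal sketch of the least-squares positioning step described above, assuming known 2-D source positions and acoustically derived ranges (all numbers are illustrative, not from the deployment):

        import numpy as np
        from scipy.optimize import least_squares

        sources = np.array([[0.0, 0.0], [120.0, 15.0], [60.0, 200.0],
                            [210.0, 160.0], [30.0, 90.0]])      # source positions, km
        ranges = np.array([101.2, 75.4, 130.8, 148.9, 55.1])    # measured ranges, km

        # Residual: difference between geometric and measured ranges.
        residual = lambda p: np.linalg.norm(sources - p, axis=1) - ranges
        fit = least_squares(residual, x0=sources.mean(axis=0))
        print(fit.x)  # least-squares estimate of glider position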

  8. Retrieval of ice cloud properties using an optimal estimation algorithm and MODIS infrared observations: 1. Forward model, error analysis, and information content

    NASA Astrophysics Data System (ADS)

    Wang, Chenxi; Platnick, Steven; Zhang, Zhibo; Meyer, Kerry; Yang, Ping

    2016-05-01

    An optimal estimation (OE) retrieval method is developed to infer three ice cloud properties simultaneously: optical thickness (τ), effective radius (reff), and cloud top height (h). This method is based on a fast radiative transfer (RT) model and infrared (IR) observations from the MODerate resolution Imaging Spectroradiometer (MODIS). This study conducts thorough error and information content analyses to understand the error propagation and performance of retrievals from various MODIS band combinations under different cloud/atmosphere states. Specifically, the algorithm takes into account four error sources: measurement uncertainty, fast RT model uncertainty, uncertainties in ancillary data sets (e.g., atmospheric state), and assumed ice crystal habit uncertainties. It is found that the ancillary and ice crystal habit error sources dominate the MODIS IR retrieval uncertainty and cannot be ignored. The information content analysis shows that for a given ice cloud, the use of four MODIS IR observations is sufficient to retrieve the three cloud properties. However, the selection of MODIS IR bands that provide the most information and their order of importance varies with both the ice cloud properties and the ambient atmospheric and the surface states. As a result, this study suggests the inclusion of all MODIS IR bands in practice since little a priori information is available.

  9. Retrieval of ice cloud properties using an optimal estimation algorithm and MODIS infrared observations. Part I: Forward model, error analysis, and information content.

    PubMed

    Wang, Chenxi; Platnick, Steven; Zhang, Zhibo; Meyer, Kerry; Yang, Ping

    2016-05-27

    An optimal estimation (OE) retrieval method is developed to infer three ice cloud properties simultaneously: optical thickness (τ), effective radius (r_eff), and cloud-top height (h). This method is based on a fast radiative transfer (RT) model and infrared (IR) observations from the MODerate resolution Imaging Spectroradiometer (MODIS). This study conducts thorough error and information content analyses to understand the error propagation and performance of retrievals from various MODIS band combinations under different cloud/atmosphere states. Specifically, the algorithm takes into account four error sources: measurement uncertainty, fast RT model uncertainty, uncertainties in ancillary datasets (e.g., atmospheric state), and assumed ice crystal habit uncertainties. It is found that the ancillary and ice crystal habit error sources dominate the MODIS IR retrieval uncertainty and cannot be ignored. The information content analysis shows that, for a given ice cloud, the use of four MODIS IR observations is sufficient to retrieve the three cloud properties. However, the selection of MODIS IR bands that provide the most information and their order of importance varies with both the ice cloud properties and the ambient atmospheric and surface states. As a result, this study suggests the inclusion of all MODIS IR bands in practice since little a priori information is available.

  10. Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Kaiser, Maria; Špačková, Olga; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2018-05-01

    Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation for the most conservative approach is robust. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over planning flood protection to a prescribed return period.

  11. Estimate of the uncertainty in measurement for the determination of mercury in seafood by TDA AAS.

    PubMed

    Torres, Daiane Placido; Olivares, Igor R B; Queiroz, Helena Müller

    2015-01-01

    An approach for estimating the uncertainty in measurement is proposed that considers the individual sources related to the different steps of the method under evaluation as well as the uncertainties estimated from the validation data for the determination of mercury in seafood by thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS). The considered method has been fully optimized and validated in an official laboratory of the Ministry of Agriculture, Livestock and Food Supply of Brazil, in order to comply with national and international food regulations and quality assurance. The method has been accredited under the ISO/IEC 17025 norm since 2010. The approach of the present work for estimating the uncertainty in measurement was based on six sources of uncertainty for mercury determination in seafood by TDA AAS, following the validation process: linear least-squares regression, repeatability, intermediate precision, correction factor of the analytical curve, sample mass, and standard reference solution. Those that most influenced the uncertainty in measurement were sample mass, repeatability, intermediate precision and the calibration curve. The estimate of uncertainty in measurement obtained in the present work reached a value of 13.39%, which complies with European Regulation EC 836/2011. This figure represents a very realistic estimate of routine conditions, since it fairly encompasses the dispersion obtained from the value attributed to the sample and the value measured by the laboratory analysts. From this outcome, it is possible to infer that the validation data (based on calibration curve, recovery and precision), together with the variation in sample mass, can offer a proper estimate of uncertainty in measurement.

  12. Position-momentum uncertainty relations in the presence of quantum memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp; Berta, Mario; Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies, providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes, not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  13. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    PubMed

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression Kriging (RK) and ordinary Kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with any of the geostatistical methods. With RK, however, the models tended to fit better more often and worse less often. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their environment are needed to improve human well-being. Copyright © 2016 Elsevier B.V. All rights reserved.
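
    A minimal self-contained sketch of ordinary Kriging (not the authors' geostatistical code): solve the OK system for the interpolation weights and report both the prediction and the Kriging variance, the latter being the kind of spatial uncertainty that is then carried into the GLM analysis. The variogram parameters and data below are illustrative.

        import numpy as np

        def ordinary_kriging(xy, z, xy0, sill=1.0, a=2000.0, nugget=0.05):
            """Ordinary Kriging prediction and Kriging variance at one point."""
            def gamma(h):  # spherical semivariogram, gamma(0) = 0
                hn = np.minimum(h / a, 1.0)
                g = nugget + (sill - nugget) * (1.5 * hn - 0.5 * hn**3)
                return np.where(h > 0, g, 0.0)

            n = len(z)
            d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
            A = np.ones((n + 1, n + 1))
            A[:n, :n] = gamma(d)
            A[-1, -1] = 0.0                  # Lagrange-multiplier block
            b = np.ones(n + 1)
            b[:n] = gamma(np.linalg.norm(xy - xy0, axis=1))
            w = np.linalg.solve(A, b)
            return w[:n] @ z, w @ b          # prediction, Kriging variance

        # Toy lichen-based air-quality values at four sampling points (metres).
        xy = np.array([[0.0, 0.0], [500.0, 100.0], [300.0, 900.0], [1200.0, 400.0]])
        z = np.array([0.8, 0.7, 0.4, 0.3])
        print(ordinary_kriging(xy, z, np.array([400.0, 400.0])))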

  14. A Rigorous Comparison of Theoretical and Measured Carbon Dioxide Line Intensities

    NASA Astrophysics Data System (ADS)

    Yi, Hongming; Fleisher, Adam J.; Gameson, Lyn; Zak, Emil J.; Polyansky, Oleg; Tennyson, Jonathan; Hodges, Joseph T.

    2017-06-01

    The ability to calculate molecular line intensities from first principles plays an increasingly important role in populating line-by-line spectroscopic databases because of its generality and extensibility to various isotopologues, spectral ranges and temperature conditions. Such calculations require a spectroscopically determined potential energy surface, and an accurate dipole moment surface that can be either fully ab initio or an effective quantity based on fits to measurements. Following our recent work, where we used high-precision measurements of intensities in the (30013 → 00001) band of ^{12}C^{16}O_2 to bound the uncertainty of calculated line lists, here we carry out high-precision, frequency-stabilized cavity ring-down spectroscopy measurements in the R-branch of the ^{12}C^{16}O_2 (20012 → 00001) band from J = 16 to 52. Gas samples consisted of 50 μmol mol^{-1} or 100 μmol mol^{-1} of nitrogen-broadened carbon dioxide with gravimetrically determined SI-traceable molar composition. We demonstrate relative measurement precision (Type A) at the 0.15 % level and estimate systematic (Type B) uncertainty contributions in % of: isotopic abundance, 0.01; sample density, 0.016; cavity free spectral range, 0.03; line shape, 0.05; line interferences, 0.05; and carbon dioxide molar fraction, 0.06. Combined in quadrature, these components yield a relative standard uncertainty in measured line intensity of less than 0.2 % for most observed transitions. These intensities differ by more than 2 % from those measured by Fourier transform spectroscopy and archived in HITRAN 2012, but differ by less than 0.5 % from the calculations of Zak et al. References: E. Zak et al., J. Quant. Spectrosc. Radiat. Transf. 177 (2016) 31; Huang et al., J. Quant. Spectrosc. Radiat. Transf. 130 (2013) 134; Tashkun et al., J. Quant. Spectrosc. Radiat. Transf. 152 (2015) 45.
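
    The quadrature combination quoted above can be checked directly: combining the listed Type B components gives about 0.10 %, and adding the 0.15 % Type A contribution in quadrature gives about 0.18 %, consistent with the stated bound of 0.2 %.

        import numpy as np

        type_b = [0.01, 0.016, 0.03, 0.05, 0.05, 0.06]  # Type B components in %, as quoted
        u_b = np.sqrt(np.sum(np.square(type_b)))        # root-sum-of-squares of Type B
        u_total = np.hypot(0.15, u_b)                   # add the Type A term in quadrature
        print(f"{u_b:.3f} %  {u_total:.3f} %")          # ~0.099 % and ~0.180 %, below 0.2 %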

  15. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance on how to prepare an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is made between how Type A and Type B uncertainty analyses are used in general and in specific processes. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.

  16. Uncertainty representation of grey numbers and grey sets.

    PubMed

    Yang, Yingjie; Liu, Sifeng; John, Robert

    2014-09-01

    In the literature, there is a presumption that a grey set and an interval-valued fuzzy set are equivalent. This presumption ignores the existence of discrete components in a grey number. In this paper, new measurements of uncertainties of grey numbers and grey sets, consisting of both absolute and relative uncertainties, are defined to give a comprehensive representation of uncertainties in a grey number and a grey set. Some simple examples are provided to illustrate that the proposed uncertainty measurement can give an effective representation of both absolute and relative uncertainties in a grey number and a grey set. The relationships between grey sets and interval-valued fuzzy sets are also analyzed from the point of view of the proposed uncertainty representation. The analysis demonstrates that grey sets and interval-valued fuzzy sets provide different but overlapping models for uncertainty representation in sets.

  17. Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.

    PubMed

    Meyer, Veronika R

    2003-09-01

    Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the setup of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a basis for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage comes at the price of losing information about the parameters that influence the measurement uncertainty.

  18. Calibrating the sqHIMMELI v1.0 wetland methane emission model with hierarchical modeling and adaptive MCMC

    NASA Astrophysics Data System (ADS)

    Susiluoto, Jouni; Raivonen, Maarit; Backman, Leif; Laine, Marko; Makela, Jarmo; Peltola, Olli; Vesala, Timo; Aalto, Tuula

    2018-03-01

    Estimating methane (CH4) emissions from natural wetlands is complex, and the estimates contain large uncertainties. The models used for the task are typically heavily parameterized and the parameter values are not well known. In this study, we perform a Bayesian model calibration for a new wetland CH4 emission model to improve the quality of the predictions and to understand the limitations of such models. The detailed process model that we analyze contains descriptions for CH4 production from anaerobic respiration, CH4 oxidation, and gas transportation by diffusion, ebullition, and the aerenchyma cells of vascular plants. The processes are controlled by several tunable parameters. We use a hierarchical statistical model to describe the parameters and obtain the posterior distributions of the parameters and uncertainties in the processes with adaptive Markov chain Monte Carlo (MCMC), importance resampling, and time series analysis techniques. For the estimation, the analysis utilizes measurement data from the Siikaneva flux measurement site in southern Finland. The uncertainties related to the parameters and the modeled processes are described quantitatively. At the process level, the flux measurement data are able to constrain the CH4 production processes, methane oxidation, and the different gas transport processes. The posterior covariance structures explain how the parameters and the processes are related. Additionally, the flux and flux component uncertainties are analyzed both at the annual and daily levels. The parameter posterior densities obtained provide information regarding the importance of the different processes, which is also useful for the development of wetland methane emission models other than the square root HelsinkI Model of MEthane buiLd-up and emIssion for peatlands (sqHIMMELI). The hierarchical modeling allows us to assess the effects of some of the parameters on an annual basis. The results of the calibration and the cross validation suggest that the early spring net primary production could be used to predict parameters affecting the annual methane production. Even though the calibration is specific to the Siikaneva site, the hierarchical modeling approach is well suited for larger-scale studies and the results of the estimation pave the way for a regional or global-scale Bayesian calibration of wetland emission models.
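
    A minimal sketch of an adaptive Metropolis sampler in the spirit of the adaptive MCMC used for the calibration; this is a generic Haario-style scheme on a toy posterior, not the sqHIMMELI code.

        import numpy as np

        def adaptive_metropolis(log_post, theta0, n=20000, adapt_start=500, seed=0):
            """Random-walk Metropolis whose proposal covariance is tuned
            from the chain history (Haario-style adaptive MCMC)."""
            rng = np.random.default_rng(seed)
            d = len(theta0)
            chain = np.empty((n, d))
            chain[0] = theta0
            lp = log_post(chain[0])
            cov = 0.1 * np.eye(d)
            for i in range(1, n):
                if i > adapt_start:  # scaled empirical covariance of the history
                    cov = 2.38**2 / d * np.cov(chain[:i].T) + 1e-8 * np.eye(d)
                prop = rng.multivariate_normal(chain[i - 1], cov)
                lp_prop = log_post(prop)
                if np.log(rng.uniform()) < lp_prop - lp:
                    chain[i], lp = prop, lp_prop
                else:
                    chain[i] = chain[i - 1]
            return chain

        # Toy posterior: a correlated 2-D Gaussian standing in for two parameters.
        P = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
        samples = adaptive_metropolis(lambda th: -0.5 * th @ P @ th, np.zeros(2))
        print(samples[5000:].mean(axis=0))  # near zero, the posterior mean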

  19. Sensitivity Analysis of Nuclide Importance to One-Group Neutron Cross Sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sekimoto, Hiroshi; Nemoto, Atsushi; Yoshimura, Yoshikane

    The importance of nuclides is useful when investigating nuclide characteristics in a given neutron spectrum. However, it is derived using one-group microscopic cross sections, which may contain large errors or uncertainties. The sensitivity coefficient shows the effect of these errors or uncertainties on the importance. The equations for calculating sensitivity coefficients of importance to one-group nuclear constants are derived using the perturbation method. Numerical values are also evaluated for some important cases for fast and thermal reactor systems. Many characteristics of the sensitivity coefficients are derived from the derived equations and numerical results. The matrix of sensitivity coefficients appears diagonally dominant; however, this is not always satisfied in its detailed structure. The detailed structure of the matrix and the characteristics of the coefficients are given. Using the obtained sensitivity coefficients, some demonstration calculations have been performed. The effects of error and uncertainty in nuclear data, and of changes in the one-group cross-section input caused by fuel design changes through the neutron spectrum, are investigated. These calculations show that the sensitivity coefficient is useful when evaluating the error or uncertainty of nuclide importance caused by cross-section data error or uncertainty and when checking the effectiveness of fuel cell or core design changes for improving neutron economy.
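
    A generic illustration of the idea (not the reactor-physics derivation itself): relative sensitivity coefficients S_ij = (σ_j/I_i)(∂I_i/∂σ_j) can be approximated for any black-box importance model by central finite differences.

        import numpy as np

        def relative_sensitivities(importance, sigma, rel_step=1e-4):
            """S[i, j] = (sigma_j / I_i) * dI_i/dsigma_j by central differences."""
            sigma = np.asarray(sigma, dtype=float)
            I0 = importance(sigma)
            S = np.empty((len(I0), len(sigma)))
            for j, s in enumerate(sigma):
                h = rel_step * s
                up, dn = sigma.copy(), sigma.copy()
                up[j] += h
                dn[j] -= h
                S[:, j] = (importance(up) - importance(dn)) / (2 * h) * s / I0
            return S

        # Toy two-nuclide model standing in for the importance calculation.
        model = lambda s: np.array([s[0] * s[1], s[0] / (1.0 + s[1])])
        print(relative_sensitivities(model, [2.0, 0.5]))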

  20. Building Quantitative Hydrologic Storylines from Process-based Models for Managing Water Resources in the U.S. Under Climate-changed Futures

    NASA Astrophysics Data System (ADS)

    Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.

    2016-12-01

    Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.

  1. Synthesizing Global and Local Datasets to Estimate Jurisdictional Forest Carbon Fluxes in Berau, Indonesia

    PubMed Central

    Griscom, Bronson W.; Ellis, Peter W.; Baccini, Alessandro; Marthinus, Delon; Evans, Jeffrey S.; Ruslandi

    2016-01-01

    Background: Forest conservation efforts are increasingly being implemented at the scale of sub-national jurisdictions in order to mitigate global climate change and provide other ecosystem services. We see an urgent need for robust estimates of historic forest carbon emissions at this scale, as the basis for credible measures of climate and other benefits achieved. Despite the arrival of a new generation of global datasets on forest area change and biomass, confusion remains about how to produce credible jurisdictional estimates of forest emissions. We demonstrate a method for estimating the relevant historic forest carbon fluxes within the Regency of Berau in eastern Borneo, Indonesia. Our method integrates best available global and local datasets, and includes a comprehensive analysis of uncertainty at the regency scale. Principal Findings and Significance: We find that Berau generated 8.91 ± 1.99 million tonnes of net CO2 emissions per year during 2000–2010. Berau is an early frontier landscape where gross emissions are 12 times higher than gross sequestration. Yet most (85%) of Berau's original forests are still standing. The majority of net emissions were due to conversion of native forests to unspecified agriculture (43% of total), oil palm (28%), and fiber plantations (9%). Most of the remainder was due to legal commercial selective logging (17%). Our overall uncertainty estimate offers an independent basis for assessing three other estimates for Berau. Two other estimates were above the upper end of our uncertainty range. We emphasize the importance of including an uncertainty range for all parameters of the emissions equation to generate a comprehensive uncertainty estimate, which has not been done before. We believe comprehensive estimates of carbon flux uncertainty are increasingly important as national and international institutions are challenged with comparing alternative estimates and identifying a credible range of historic emissions values. PMID:26752298

  2. Exploring uncertainty of Amazon dieback in a perturbed parameter Earth system ensemble.

    PubMed

    Boulton, Chris A; Booth, Ben B B; Good, Peter

    2017-12-01

    The future of the Amazon rainforest is unknown due to uncertainties in projected climate change and the response of the forest to this change (forest resiliency). Here, we explore the effect of some uncertainties in climate and land surface processes on the future of the forest, using a perturbed physics ensemble of HadCM3C. This is the first time Amazon forest changes are presented using an ensemble exploring both land vegetation processes and physical climate feedbacks in a fully coupled modelling framework. Under three different emissions scenarios, we measure the change in the forest coverage by the end of the 21st century (the transient response) and make a novel adaptation to a previously used method known as "dry-season resilience" to predict the long-term committed response of the forest, should the state of the climate remain constant past 2100. Our analysis of this ensemble suggests that there will be a high chance of greater forest loss on longer timescales than is realized by 2100, especially for mid-range and low emissions scenarios. In both the transient and predicted committed responses, there is increasing uncertainty in the outcome of the forest as the strength of the emissions scenarios increases. It is important to note, however, that very few of the simulations produce future forest loss of the magnitude previously shown under the standard model configuration. We find that low optimum temperatures for photosynthesis and a high minimum leaf area index needed for the forest to compete for space appear to be precursors for dieback. We then decompose the uncertainty into that associated with future climate change and that associated with forest resiliency, finding that it is important to reduce the uncertainty in both of these if we are to better determine the Amazon's outcome. © 2017 John Wiley & Sons Ltd.

  3. Maritime Aerosol Network as a Component of AERONET - a Useful Tool for Evaluation of the Global Sea-Salt Aerosol Distribution

    NASA Astrophysics Data System (ADS)

    Smirnov, A.; Holben, B. N.; Kinne, S.; Nelson, N. B.; Stenchikov, G. L.; Broccardo, S. P.; Sowers, D.; Lobecker, E.; Ondrusek, M.; Zielinski, T. P.; Gray, L. M.; Frouin, R.; Radionov, V. F.; Smyth, T. J.; Zibordi, G.; Heller, M. I.; Slabakova, V.; Krüger, K.; Reid, E. A.; Istomina, L.; Vandermeulen, R. A.; O'Neill, N. T.; Levy, G.; Giles, D. M.; Slutsker, I.; Sorokin, M. G.; Eck, T. F.

    2016-02-01

    Sea-salt aerosol plays an important role in the radiation balance and chemistry of the marine atmosphere. Sea-salt production depends on various factors, and there is significant uncertainty in the parametrization of sea-salt production and budget. Ship-based aerosol optical depth (AOD) measurements can be used as an important validation tool for various global models and in-situ measurements. The paper presents the current status of the Maritime Aerosol Network (MAN), a component of the Aerosol Robotic Network (AERONET). Since 2006, over 300 cruises have been completed, and a data archive of more than 5500 measurement days is accessible at http://aeronet.gsfc.nasa.gov/new_web/maritime_aerosol_network.html . AOD measurements from ships of opportunity complemented island-based AERONET measurements and provided important reference points for satellite-retrieved and modelled AOD climatology over the oceans. The program exemplifies a mutually beneficial international, multi-agency effort in atmospheric aerosol optical studies over the oceans.

  4. Not Normal: the uncertainties of scientific measurements

    NASA Astrophysics Data System (ADS)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
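
    The heavy-tail claim is easy to verify numerically: under a Student's t-distribution with few degrees of freedom, a 5σ disagreement is vastly more probable than under a Gaussian. A short sketch (the choice of 3 degrees of freedom is illustrative):

        from scipy import stats

        for s in (2, 3, 5):
            p_norm = 2 * stats.norm.sf(s)        # two-sided Gaussian tail
            p_t3 = 2 * stats.t.sf(s, df=3)       # heavy-tailed Student's t
            p_cauchy = 2 * stats.cauchy.sf(s)    # t with df=1 (Cauchy)
            print(f"{s} sigma: normal {p_norm:.1e}, t(3) {p_t3:.1e}, "
                  f"Cauchy {p_cauchy:.1e}, excess x{p_t3 / p_norm:.0e}")

    At 5σ the Cauchy tail probability exceeds the Gaussian one by roughly five orders of magnitude, matching the scale of the excess reported above.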

  5. Not Normal: the uncertainties of scientific measurements

    PubMed Central

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student’s t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply. PMID:28280557

  6. Development of a Low-Level Ar-37 Calibration Standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Richard M.; Aalseth, Craig E.; Bowyer, Ted W.

    Argon-37 is an important environmental signature of an underground nuclear explosion. Producing and quantifying low-level 37Ar standards is an important step in the development of sensitive field measurement instruments for use during an On-Site Inspection, a key provision of the Comprehensive Nuclear-Test-Ban Treaty. This paper describes progress at Pacific Northwest National Laboratory (PNNL) in the development of a process to generate and quantify low-level 37Ar standard material, which can then be used to calibrate sensitive field systems at activities consistent with soil background levels. The 37Ar used for our work was generated using a laboratory-scale, high-energy neutron source to irradiate powdered samples of calcium carbonate. Small aliquots of 37Ar were then extracted from the head space of the irradiated samples. The specific activity of the head space samples, mixed with P10 (90% stable argon:10% methane by mole fraction) count gas, is then derived using the accepted Length-Compensated Internal-Source Proportional Counting method. Due to the low activity of the samples, a set of three Ultra-Low Background Proportional-Counters designed and fabricated at PNNL from radio-pure electroformed copper was used to make the measurements in PNNL's shallow underground counting laboratory. Very low background levels (<10 counts/day) have been observed in the spectral region near the 37Ar emission feature at 2.8 keV. Two separate samples from the same irradiation were measured. The first sample was counted for 12 days beginning 28 days after irradiation; the second sample was counted for 24 days beginning 70 days after irradiation (the half-life of 37Ar is 35.0 days). Both sets of measurements were analyzed and yielded very similar results for the starting activity (~0.1 Bq) and activity concentration (0.15 mBq/ccSTP argon) after P10 count gas was added. A detailed uncertainty model was developed based on the ISO Guide to the Expression of Uncertainty in Measurement. This paper presents a discussion of the measurement analysis, along with assumptions and uncertainty estimates.
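
    The quoted starting activity implies the usual exponential decay correction between irradiation and counting; a minimal sketch using the 35.0-day half-life (the measured activity below is a hypothetical value, not the paper's raw result):

        import math

        T_HALF = 35.0                     # 37Ar half-life, days
        LAMBDA = math.log(2) / T_HALF

        def activity_at_reference(a_measured_bq, days_elapsed):
            """Decay-correct a measured activity back to a reference date."""
            return a_measured_bq * math.exp(LAMBDA * days_elapsed)

        # Hypothetical: 0.057 Bq measured 28 days after irradiation
        print(f"{activity_at_reference(0.057, 28):.3f} Bq at irradiation time")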

  7. Photochemical parameters of atmospheric source gases: accurate determination of OH reaction rate constants over atmospheric temperatures, UV and IR absorption spectra

    NASA Astrophysics Data System (ADS)

    Orkin, V. L.; Khamaganov, V. G.; Martynova, L. E.; Kurylo, M. J.

    2012-12-01

    The emissions of halogenated (Cl, Br containing) organics of both natural and anthropogenic origin contribute to the balance of and changes in the stratospheric ozone concentration. The associated chemical cycles are initiated by the photochemical decomposition of the portion of source gases that reaches the stratosphere. For the majority of halogen source gases, reactions with hydroxyl radicals and photolysis are the main processes dictating the compound's lifetime in the troposphere and the release of active halogen in the stratosphere. Therefore, the accuracy of photochemical data is of primary importance for comprehensive atmospheric modeling and for simplified kinetic estimations of global impacts on the atmosphere, such as ozone depletion (i.e., the Ozone Depletion Potential, ODP) and climate change (i.e., the Global Warming Potential, GWP). The sources of critically evaluated photochemical data for atmospheric modeling, the NASA/JPL and IUPAC publications, recommend uncertainties within 10%-60% for the majority of OH reaction rate constants, with only a few cases where uncertainties lie at the low end of this range. These uncertainties can be somewhat conservative because the evaluations are based on data from various laboratories obtained during the last few decades. Nevertheless, even the authors of the original experimental works rarely estimate the total combined uncertainties of published OH reaction rate constants to be less than ca. 10%. Thus, uncertainties in the photochemical properties of potential and current atmospheric trace gases obtained under controlled laboratory conditions may still constitute a major source of uncertainty in estimating a compound's environmental impact. One purpose of the presentation is to illustrate the potential for obtaining accurate laboratory measurements of the OH reaction rate constant over the temperature range of atmospheric interest. A detailed inventory of accountable sources of instrumental uncertainty in our FP-RF experiment demonstrates that the total uncertainty of the OH reaction rate constant can be as small as ca. 2-3%. The high precision of the kinetic measurements allows reliable determination of weak temperature dependences of the rate constants and clear resolution of the curvature of the Arrhenius plots for the OH reaction rate constants of various compounds. The results of OH reaction rate constant determinations between 220 K and 370 K will be presented. Similarly, the accuracy of UV and IR absorption measurements will be highlighted to provide an improved basis for atmospheric modeling.
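
    Resolving curvature in an Arrhenius plot usually means fitting the three-parameter modified form k(T) = A (T/298)^n exp(-E/T), which is linear in (ln A, n, E) after taking logarithms. A sketch with synthetic data at the stated ~2% precision (all coefficients invented for illustration):

        import numpy as np

        T = np.linspace(220, 370, 16)                # K, atmospheric range
        A, n, E = 2.0e-12, 1.5, 900.0                # assumed "true" values
        rng = np.random.default_rng(1)
        k = A * (T / 298.0)**n * np.exp(-E / T)
        k *= 1 + rng.normal(0, 0.02, T.size)         # ~2% measurement noise

        # ln k = ln A + n*ln(T/298) - E*(1/T): ordinary linear least squares
        X = np.column_stack([np.ones_like(T), np.log(T / 298.0), -1.0 / T])
        coef, *_ = np.linalg.lstsq(X, np.log(k), rcond=None)
        print(f"A = {np.exp(coef[0]):.2e}, n = {coef[1]:.2f}, "
              f"E/R = {coef[2]:.0f} K")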

  8. Measurement uncertainty associated with chromatic confocal profilometry for 3D surface texture characterization of natural human enamel.

    PubMed

    Mullan, F; Bartlett, D; Austin, R S

    2017-06-01

    To investigate the measurement performance of a chromatic confocal profilometer for quantification of surface texture of natural human enamel in vitro. Contributions to the measurement uncertainty from all potential sources of measurement error using a chromatic confocal profilometer and surface metrology software were quantified using a series of surface metrology calibration artifacts and pre-worn enamel samples. The 3D surface texture analysis protocol was optimized across 0.04 mm² of natural and unpolished enamel undergoing dietary acid erosion (pH 3.2, titratable acidity 41.3 mmol OH/L). Flatness deviations due to the x, y stage mechanical movement were the major contribution to the measurement uncertainty, with maximum Sz flatness errors of 0.49 μm, whereas measurement noise, non-linearities in x, y, z, and enamel sample dimensional instability contributed minimal errors. The measurement errors were propagated into an uncertainty budget following a Type B uncertainty evaluation in order to calculate the combined standard uncertainty (u_c), which was ±0.28 μm. Statistically significant increases in the median (IQR) roughness (Sa) of the polished samples occurred after 15 (+0.17 (0.13) μm), 30 (+0.12 (0.09) μm) and 45 (+0.18 (0.15) μm) min of erosion (P<0.001 vs. baseline). In contrast, natural unpolished enamel samples revealed a statistically significant decrease in Sa roughness of -0.14 (0.34) μm only after 45 min erosion (P<0.05 vs. baseline). The main contribution to measurement uncertainty using chromatic confocal profilometry was from flatness deviations; however, by optimizing measurement protocols, the profilometer successfully characterized surface texture changes in enamel from erosive wear in vitro. Copyright © 2017 The Academy of Dental Materials. All rights reserved.
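
    The Type B budget combines independent standard-uncertainty components in quadrature, u_c = sqrt(sum of u_i²). A minimal sketch; the component values are invented, chosen only so that the dominant flatness term reproduces the order of magnitude quoted above:

        import math

        # Illustrative standard-uncertainty components (micrometres)
        components = {
            "x,y stage flatness deviation": 0.27,
            "measurement noise": 0.05,
            "x,y,z non-linearity": 0.04,
            "sample dimensional instability": 0.03,
        }

        u_c = math.sqrt(sum(u**2 for u in components.values()))
        print(f"combined standard uncertainty u_c = +/-{u_c:.2f} um")
        print(f"expanded uncertainty U (k=2) = +/-{2 * u_c:.2f} um")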

  9. Outdoor solar UVA dose assessment with EBT2 radiochromic film using spectrophotometer and densitometer measurements.

    PubMed

    Abukassem, I; Bero, M A

    2015-04-01

    Direct measurements of solar ultraviolet radiation (UVR) have an important role in the protection of humans against UVR hazards. This work presents a simple technique based on the application of EBT2 GAFCHROMIC(®) film for direct solar UVA dose assessment. It demonstrates the effects of different parts of the solar spectrum (UVB, visible and infrared) on performed UVA field measurements and presents the measurement uncertainty budget. The gradient of sunlight exposure level permitted the authors to establish mathematical relationships between the measured solar UVA dose and two measured quantities: the first was the change in spectral absorbance at the wavelength 633 nm (A633), and the second was the optical density (OD). The established standard relations were also applied to calculate solar UVA dose variations during the whole day; 15 min of exposure each hour between 8:00 and 17:00 was recorded. Results show that both applied experimental methods, spectrophotometer absorbance and densitometer OD, deliver comparable figures for EBT2 solar UVA dose assessment, with a relative uncertainty of 11% for spectral absorbance measurements and 15% for OD measurements. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
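
    A dose-response relation of the kind established here can be fitted and inverted straightforwardly; a sketch assuming a linear calibration between UVA dose and the net change in A633 (all calibration points below are invented, not the paper's data):

        import numpy as np

        dose = np.array([0.0, 2.0, 4.0, 6.0, 8.0])            # J/cm^2, assumed
        dA633 = np.array([0.00, 0.035, 0.071, 0.104, 0.142])  # assumed readings

        slope, intercept = np.polyfit(dose, dA633, 1)

        def dose_from_absorbance(da):
            """Invert the linear calibration to estimate the UVA dose."""
            return (da - intercept) / slope

        print(f"dA633 = 0.09 -> dose ~ {dose_from_absorbance(0.09):.2f} J/cm^2")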

  10. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    NASA Astrophysics Data System (ADS)

    Stankunas, Gediminas; Batistoni, Paola; Sjöstrand, Henrik; Conroy, Sean; JET Contributors

    2015-07-01

    The neutron activation technique is routinely used in fusion experiments to measure neutron yields. This paper investigates the uncertainty in these measurements due to the uncertainties in dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, for both DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well, using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of fusion neutron yields.
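
    The group-wise calculation folds the activation cross-section with the group flux, R = sum over g of sigma_g * phi_g, and the cross-section covariance propagates by the sandwich rule u²(R) = phiᵀ V phi. A toy 3-group sketch standing in for the 640-group data (all numbers invented):

        import numpy as np

        sigma = np.array([1.2e-24, 3.5e-24, 8.0e-24])   # cross sections, cm^2
        phi = np.array([2.0e12, 5.0e11, 1.0e11])        # group fluxes, n/cm^2/s

        # Hypothetical covariance of the cross sections
        rel = np.array([0.05, 0.03, 0.04])              # relative 1-sigma
        corr = np.array([[1.0, 0.5, 0.2],
                         [0.5, 1.0, 0.5],
                         [0.2, 0.5, 1.0]])
        V = np.outer(rel * sigma, rel * sigma) * corr

        R = sigma @ phi                                 # reaction rate per atom
        u_R = np.sqrt(phi @ V @ phi)                    # sandwich rule
        print(f"R = {R:.3e} 1/s, relative uncertainty {u_R / R:.1%}")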

  11. Disaggregating measurement uncertainty from population variability and Bayesian treatment of uncensored results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strom, Daniel J.; Joyce, Kevin E.; Maclellan, Jay A.

    2012-04-17

    In making low-level radioactivity measurements of populations, it is commonly observed that a substantial portion of net results are negative. Furthermore, the observed variance of the measurement results arises from a combination of measurement uncertainty and population variability. This paper presents a method for disaggregating measurement uncertainty from population variability to produce a probability density function (PDF) of possibly true results. To do this, simple, justifiable, and reasonable assumptions are made about the relationship of the measurements to the measurands (the 'true values'). The measurements are assumed to be unbiased, that is, that their average value is the average of the measurands. Using traditional estimates of each measurement's uncertainty to disaggregate population variability from measurement uncertainty, a PDF of measurands for the population is produced. Then, using Bayes's theorem, the same assumptions, and all the data from the population of individuals, a prior PDF is computed for each individual's measurand. These PDFs are non-negative, and their average is equal to the average of the measurement results for the population. The uncertainty in these Bayesian posterior PDFs is all Berkson with no remaining classical component. The methods are applied to baseline bioassay data from the Hanford site. The data include 90Sr urinalysis measurements on 128 people, 137Cs in vivo measurements on 5,337 people, and 239Pu urinalysis measurements on 3,270 people. The method produces excellent results for the 90Sr and 137Cs measurements, since there are nonzero concentrations of these global fallout radionuclides in people who have not been occupationally exposed. The method does not work for the 239Pu measurements in non-occupationally exposed people because the population average is essentially zero.
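
    The core identity behind the disaggregation is that, for unbiased measurements, observed variance = population variance + mean measurement variance. A moment-based sketch with synthetic data (the paper's full treatment is Bayesian; all numbers below are invented):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5_000
        true_vals = rng.gamma(shape=2.0, scale=0.5, size=n)  # non-negative measurands
        u_meas = 1.0                                         # per-measurement std. dev.
        x = true_vals + rng.normal(0, u_meas, n)             # many net results < 0

        var_pop = x.var(ddof=1) - u_meas**2                  # disaggregate variances
        print(f"negative results: {(x < 0).mean():.1%}")
        print(f"estimated population variance: {var_pop:.2f} "
              f"(true: {true_vals.var(ddof=1):.2f})")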

  12. A new design approach to achieve a minimum impulse limit cycle in the presence of significant measurement uncertainties

    NASA Technical Reports Server (NTRS)

    Martin, M. W.; Kubiak, E. T.

    1982-01-01

    A new design was developed for the Space Shuttle Transition Phase Digital Autopilot to reduce the impact of large measurement uncertainties in the rate signal during attitude control. The signal source, which was dictated by early computer constraints, is characterized by large quantization, noise, bias, and transport lag, which produce a measurement uncertainty larger than the minimum-impulse rate change. To ensure convergence to a minimum-impulse limit cycle, the design employed bias and transport lag compensation and a switching logic with hysteresis, a rate deadzone, and a 'walking' switching line. The design background, the rate measurement uncertainties, and the design solution are documented.
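
    The flavour of such a switching logic can be sketched as a phase-plane controller with hysteresis and a rate deadzone, so that rate noise below the measurement uncertainty cannot toggle the jets. This is a schematic illustration only; all thresholds and the switching-line slope are hypothetical, not the Shuttle autopilot's values:

        def thruster_command(att_err, rate_err, firing, *,
                             att_on=0.5, att_off=0.3, rate_db=0.02):
            """Return -1, 0, or +1 jet command (schematic, invented gains).

            Hysteresis: firing starts above att_on but stops only below
            att_off; the rate deadzone ignores rate errors smaller than
            the measurement uncertainty.
            """
            rate = 0.0 if abs(rate_err) < rate_db else rate_err
            switch = att_err + 10.0 * rate          # simple switching line
            threshold = att_off if firing else att_on
            if switch > threshold:
                return -1                           # fire against positive error
            if switch < -threshold:
                return +1
            return 0

        print(thruster_command(0.6, 0.01, firing=False))  # -1: starts firing
        print(thruster_command(0.4, 0.00, firing=True))   # -1: hysteresis holds
        print(thruster_command(0.2, 0.00, firing=False))  # 0: inside deadband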

  13. Free tropospheric observations of Carbonyl Sulfide from Aura Tropospheric Emission Spectrometer over ocean

    NASA Astrophysics Data System (ADS)

    Kuai, Le; Worden, John; Campbell, Ellitt; Kulawik, Susan; Montzka, Stephen; Liu, Jiabin

    2014-05-01

    Carbonyl sulfide (OCS) is the most abundant sulfur gas in the troposphere, with a global average mixing ratio of about 500 parts per trillion (ppt). The ocean is the primary source of OCS, emitting OCS directly or its precursors, carbon disulfide and dimethyl sulfide. The most important atmospheric sink of OCS is uptake by terrestrial plants via photosynthesis. Although the global budget of atmospheric OCS has been studied, the globally integrated OCS fluxes have large uncertainties: the uncertainties of the ocean fluxes are as large as 100% or more, and how the ocean sources are distributed is not well known. We developed a retrieval algorithm for free tropospheric carbonyl sulfide (OCS) observations above the ocean using radiance measurements from the Tropospheric Emission Spectrometer (TES). These first observations of free tropospheric OCS provide global maps with information on OCS seasonal and spatial variability in the mid troposphere. These data will help to characterize ocean OCS fluxes. Evaluation of the biases and uncertainties in the TES OCS estimates against aircraft profiles from the HIPPO campaign and ground data from the NOAA Mauna Loa site suggests that the OCS retrievals (1) have less than 1.0 degree of freedom for signal (DOFs), (2) are sensitive in the mid-troposphere with a peak sensitivity typically between 300 and 500 hPa, and (3) have much smaller systematic errors from temperature, CO2 and H2O calibrations relative to random errors from measurement noise. Here we estimate monthly means from TES measurements averaged over multiple years so that random errors are reduced and useful information about OCS seasonal and latitudinal variability can be derived. With this averaging, TES OCS data are found to be consistent with NOAA ground observations and HIPPO aircraft measurements, capturing the seasonal and latitudinal variations observed by these in situ data within the estimated uncertainties. These TES OCS monthly data will be used to constrain the ocean flux and to understand tropical ocean variability (e.g., the west-east contrast over the Pacific).

  14. Managing the uncertainties of the streamflow data produced by the French national hydrological services

    NASA Astrophysics Data System (ADS)

    Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allowing them to draw relevant conclusions and make appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are absolutely necessary for data management and uncertainty analysis by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique under given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO 748 standard for velocity-area gaugings. Both empirical and computed (with Q+) uncertainty values can now be assigned in BAREME, the software used by the French NHS for managing streamgauging measurements. A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level records to produce continuous discharge time series. The management of rating curves is also done using BAREME. The BaRatin method (Le Coz et al., 2014) was developed as a Bayesian approach to rating curve development and uncertainty analysis. Since BaRatin accounts for the individual uncertainties of the gaugings used to build the rating curve, it was coupled with BAREME. The BaRatin method is still undergoing development and research, in particular to address non-univocal or time-varying stage-discharge relations due to hysteresis, variable backwater, rating shifts, etc. A new interface including new options is under development. The next steps are to propagate the uncertainties of water level records, through uncertain rating curves, up to discharge time series and derived variables (e.g. annual mean flow) and statistics (e.g. flood quantiles). Bayesian tools are already available for both tasks, but further validation and development are necessary for their integration in the operational data workflow of the French NHS. References: Le Coz, J., Camenen, B., Peyrard, X., Dramais, G., 2012. Uncertainty in open-channel discharges measured with the velocity-area method. Flow Measurement and Instrumentation 26, 18-29. Le Coz, J., Renard, B., Bonnifait, L., Branger, F., Le Boursicaud, R., 2014. Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach. Journal of Hydrology 509, 573-587.
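
    The uncertainty that BaRatin attaches to a rating curve can be illustrated with the standard power-law form Q = a (h - b)^c and a Monte Carlo draw over its parameters; all parameter values and uncertainties below are invented, not BaRatin output:

        import numpy as np

        rng = np.random.default_rng(7)
        N = 50_000

        a = rng.normal(30.0, 2.0, N)     # scale coefficient
        b = rng.normal(0.20, 0.03, N)    # cease-to-flow stage offset, m
        c = rng.normal(1.60, 0.05, N)    # exponent

        h = 1.5                          # observed stage, m
        Q = a * np.clip(h - b, 1e-6, None)**c

        q50, q025, q975 = np.percentile(Q, [50, 2.5, 97.5])
        print(f"Q(h = {h} m): median {q50:.1f} m3/s, "
              f"95% interval [{q025:.1f}, {q975:.1f}] m3/s")

    Propagating such parameter draws through a full stage record is one way to carry rating-curve uncertainty into discharge time series and derived statistics.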

  15. Comparison of methods for the determination of NO-O3-NO2 fluxes and chemical interactions over a bare soil

    NASA Astrophysics Data System (ADS)

    Stella, P.; Loubet, B.; Laville, P.; Lamaud, E.; Cazaunau, M.; Laufs, S.; Bernard, F.; Grosselin, B.; Mascher, N.; Kurtenbach, R.; Mellouki, A.; Kleffmann, J.; Cellier, P.

    2011-08-01

    Tropospheric ozone (O3) is a greenhouse gas with known impacts on human and animal health and on ecosystem functioning. In addition, O3 plays an important role in tropospheric chemistry, together with nitrogen oxides. Flux measurements of these trace gases are essential for establishing their atmospheric budgets and for evaluating the impact of ozone on the biosphere. In this study, ozone, nitric oxide (NO) and nitrogen dioxide (NO2) fluxes were measured using the aerodynamic gradient method over a bare soil in an agricultural field. Vertical mixing ratio profile measurements were performed with fast-response sensors. Corrections of the aerodynamic gradient for chemical reactions between O3, NO and NO2 proved negligible for O3 fluxes, whereas they accounted for about 10% on average of the NO and NO2 fluxes. The flux uncertainties were mainly due to uncertainties in the friction velocity. In addition, the use of fast-response sensors allowed the remaining part of the flux uncertainty to be reduced. The aerodynamic gradient and eddy-covariance methods gave similar O3 fluxes (within 4%). The chamber NO fluxes were up to 70% lower than the aerodynamic gradient fluxes, probably because of either the spatial heterogeneity of the soil NO emissions or the environmental perturbation caused by the chamber.
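
    In its simplest neutral-stability form, the aerodynamic gradient method converts a vertical mixing-ratio difference into a flux using the friction velocity, F = -kappa * u* * (c2 - c1) / ln(z2/z1). A sketch with invented numbers, omitting the stability and chemical-reaction corrections discussed above:

        import math

        KAPPA = 0.4                      # von Karman constant

        def gradient_flux(c1, c2, z1, z2, u_star):
            """Neutral-case flux from concentrations at heights z1 < z2.

            Negative flux means deposition to the surface.
            """
            return -KAPPA * u_star * (c2 - c1) / math.log(z2 / z1)

        # Hypothetical O3 profile: 38 ppb at 0.5 m, 42 ppb at 2.0 m, u* = 0.25 m/s
        f = gradient_flux(38.0, 42.0, 0.5, 2.0, 0.25)
        print(f"O3 flux ~ {f:.2f} ppb m/s (deposition)")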

  16. Assessing patient-centered communication in a family practice setting: how do we measure it, and whose opinion matters?

    PubMed

    Clayton, Margaret F; Latimer, Seth; Dunn, Todd W; Haas, Leonard

    2011-09-01

    This study evaluated variables thought to influence patients' perceptions of patient-centeredness. We also compared results from two coding schemes that purport to evaluate patient-centeredness, the Measure of Patient-Centered Communication (MPCC) and the 4 Habits Coding Scheme (4HCS). 174 videotaped family practice office visits and patient self-report measures were analyzed. Patient factors contributing to positive perceptions of patient-centeredness were successful negotiation of decision-making roles and lower post-visit uncertainty. MPCC coding found visits were on average 59% patient-centered (range 12-85%). 4HCS coding showed an average of 83 points (maximum possible 115). However, patients felt their visits were highly patient-centered (mean 3.7, range 1.9-4; maximum possible 4). There was a weak correlation between the coding schemes, but no association between coding results and patient variables (number of pre-visit concerns, attainment of desired decision-making role, post-visit uncertainty, patients' perception of patient-centeredness). Coder inter-rater reliability was lower than expected; convergent and divergent validity were not supported. The 4HCS and MPCC operationalize patient-centeredness differently, illustrating a lack of conceptual clarity. The patient's perspective is important: family practice providers can facilitate a more positive patient perception of patient-centeredness by addressing patient concerns to help reduce patient uncertainty, and by negotiating decision-making roles. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. Confidence Intervals for Proportion Estimates in Complex Samples. Research Report. ETS RR-06-21

    ERIC Educational Resources Information Center

    Oranje, Andreas

    2006-01-01

    Confidence intervals are an important tool to indicate uncertainty of estimates and to give an idea of probable values of an estimate if a different sample from the population was drawn or a different sample of measures was used. Standard symmetric confidence intervals for proportion estimates based on a normal approximation can yield bounds…
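
    The weakness of the symmetric normal-approximation (Wald) interval is easy to exhibit for proportions near 0 or 1, where its bounds can leave [0, 1]; the Wilson score interval does not. A sketch ignoring the complex-sample design effects the report addresses:

        import math

        def wald_ci(p_hat, n, z=1.96):
            half = z * math.sqrt(p_hat * (1 - p_hat) / n)
            return p_hat - half, p_hat + half

        def wilson_ci(p_hat, n, z=1.96):
            denom = 1 + z**2 / n
            centre = (p_hat + z**2 / (2 * n)) / denom
            half = z * math.sqrt(p_hat * (1 - p_hat) / n
                                 + z**2 / (4 * n**2)) / denom
            return centre - half, centre + half

        p_hat, n = 0.03, 50
        print("Wald:  ", wald_ci(p_hat, n))    # lower bound falls below 0
        print("Wilson:", wilson_ci(p_hat, n))  # stays inside [0, 1]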

  18. [Dealing with diagnostic uncertainty in general practice].

    PubMed

    Wübken, Magdalena; Oswald, Jana; Schneider, Antonius

    2013-01-01

    In general, the prevalence of diseases is low in primary care. Therefore, the positive predictive value of diagnostic tests is lower than in hospitals, where patients are highly selected. In addition, patients present with milder forms of disease, and many diseases might hide behind the initial symptom(s). These facts lead to a diagnostic uncertainty that is somewhat inherent to general practice. This narrative review discusses different sources of and reasons for uncertainty, and strategies to deal with it, in the context of the current literature. Fear of uncertainty correlates with higher diagnostic activity. The attitude towards uncertainty correlates with the choice of medical speciality by vocational trainees or medical students. An intolerance of uncertainty, which continues to grow as medicine makes steady progress, might partly explain the growing shortage of general practitioners. The bio-psycho-social context appears to be important to diagnostic decision-making. The effects of intuition and heuristics are investigated by cognitive psychologists. It is still unclear whether these aspects are prone to bias or useful, which might depend on the context of medical decisions. Good communication is of great importance for sharing uncertainty with patients in a transparent way and for facilitating shared decision-making. Dealing with uncertainty should be seen as an important core component of general practice and needs to be investigated in more detail to improve the respective medical decisions. Copyright © 2013. Published by Elsevier GmbH.
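
    The opening observation, that predictive value collapses at low prevalence, is Bayes' theorem in action; a sketch with a hypothetical test (the sensitivity, specificity, and prevalence values are invented):

        def ppv(sens, spec, prev):
            """Positive predictive value via Bayes' theorem."""
            tp = sens * prev
            fp = (1 - spec) * (1 - prev)
            return tp / (tp + fp)

        # The same hypothetical test (90% sensitive, 95% specific) in two settings
        for prev in (0.30, 0.01):   # selected hospital patients vs. primary care
            print(f"prevalence {prev:.0%}: PPV = {ppv(0.90, 0.95, prev):.1%}")

    A test that is convincing in a selected hospital population yields mostly false positives in primary care, which is the inherent diagnostic uncertainty described above.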

  19. The 1995 scientific assessment of the atmospheric effects of stratospheric aircraft

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Baughcum, Steven L.; Brune, William H.; Douglass, Anne R.; Fahey, David W.; Friedl, Randall R.; Liu, Shaw C.; Plumb, R. Alan; Poole, Lamont R.; Wesoky, Howard L.

    1995-01-01

    This report provides a scientific assessment of our knowledge concerning the impact of proposed high-speed civil transport (HSCT) aircraft on the atmosphere. It comes at the end of Phase 1 of the Atmospheric Effects of Stratospheric Aircraft element of the NASA High-Speed Research Program. The fundamental problem with stratospheric flight is that pollutant residence times are long because the stratosphere is a region of permanent temperature inversion with stable stratification. Using improved two-dimensional assessment models and detailed fleet emissions scenarios, the assessment examines the possible impact of the range of effluents from aircraft. Emphasis is placed on the effects of NO(x) and H2O on the atmospheric ozone content. Measurements in the plume of an in-flight Concorde supersonic transport indicated a large number of small particles. These measurements, coupled with model sensitivity studies, point out the importance of obtaining a more detailed understanding of the fate of sulfur in the HSCT exhaust. Uncertainties in the current understanding of the processes important for determining the overall effects of HSCTs on the atmosphere are discussed and partially quantified. Research directions are identified to improve the quantification of uncertainties and to reduce their magnitude.

  20. CO2, H2O, and chlorophyll fluorescence retrieved from OCO-2 measurements using a fast radiative transfer model approximating multiple scattering effects

    NASA Astrophysics Data System (ADS)

    Reuter, Maximilian; Bovensmann, Heinrich; Buchwitz, Michael; Burrows, John P.; Heymann, Jens; Noël, Stefan; Rozanov, Vladimir; Schneising, Oliver

    2017-04-01

    Carbon dioxide is the most important anthropogenic greenhouse gas. Its increasing global concentration in the Earth's atmosphere is the main driver of global climate change. In spite of its importance, there are still large uncertainties in its global sources and sinks. Satellite measurements have the potential to reduce these surface flux uncertainties. However, the demanding accuracy requirements usually involve the need for precise radiative transfer calculations in a scattering atmosphere. These can be computationally so expensive that hundreds or thousands of CPU cores are needed to keep up with the data stream of an instrument like OCO-2. Future instruments will further increase the number of soundings by at least an order of magnitude. A radiative transfer model has been developed that approximates scattering effects by multiple scattering at an optically thin scattering layer, reducing the computational costs by several orders of magnitude. The model can be used to simulate the radiance in all three OCO-2 spectral bands, allowing the simultaneous retrieval of CO2, H2O, and chlorophyll fluorescence. First retrieval results for OCO-2 data will be presented.
